[SOURCE: https://en.wikipedia.org/wiki/Buggane] | [TOKENS: 828] |
In Manx folklore, a buggane (or boagane) was a huge ogre-like creature native to the Isle of Man. Some[who?] have considered them akin to the Scandinavian troll. Manx folklore A shapeshifter, the buggane is generally described as a malevolent being that can appear as a large black calf or as a human with the ears or hooves of a horse. It was large enough to tear the roof off a church. Its natural form is described as "covered with a mane of coarse, black hair; it had eyes like torches, and glittering sharp tusks". Another tale describes it as a huge man with bull's horns, glowing eyes and large teeth. As magical creatures, bugganes were unable to cross water or stand on hallowed ground. The most famous story recounts a buggane who found himself an inadvertent stowaway on a ship bound for Ireland. Determined to return to the Isle of Man, he caused a storm and guided the ship towards the rocky coast of Contrary Head. His plan was thwarted by the intervention of St. Trinian. Invoked by the captain with a promise to build a chapel in his honour, the saint guided the ship safely into Peel Harbour. Incensed, the buggane screamed, "St. Trinian should never have a whole church in Ellan Vannin." When the chapel came to be built, three times the local people put a roof on it, and three times the buggane tore it off. The Buggane ny Hushtey lived in a large cave near the sea and was known for having no liking for lazy people. However, it should not be confused with the Cabbyl-ushtey, the Manx water horse. Bugganes were occasionally called upon by the fairies to punish people who had offended them. The buggane of Glen Maye would have pitched a lazy housewife into a waterfall for putting off baking until after sunset, had she not cut loose the strings of her apron to escape. The buggane from Gob-na-Scuit was known for tearing the thatch off the haystacks, puffing the smoke down chimneys, and pushing sheep over the edge of the brooghs (steep banks or grassy cliffs). In Manx legend, the Irish giant Fionn mac Cumhaill (Finn MacCool) crossed over to Mann and settled near Cregneash. The buggane from Barrule came to do battle, but Fionn did not want to fight. Fionn's wife, Oonagh, disguised Fionn as a baby and tucked him into a cradle. When the buggane saw the size of the 'baby', he thought that its father, Fionn, must be a giant among giants, and so he left. They eventually met near Kirk Christ Rushen and fought from sunrise to sunset. Fionn had one foot in the Big Sound, and so made the channel between the Calf of Man and Kitterland, and the other foot in the Little Sound, and so made the narrow channel between Kitterland and the main island. The buggane was standing at Port Erin. The buggane came off victorious and slashed Fionn so badly that he had to run to Ireland. Fionn could walk on the sea, but the buggane could not, so he tore out a tooth and threw it at Fionn. It hit him on the back of the head, then fell into the sea and became what is now called Chicken Rock. Fionn turned round and roared a mighty curse: "My seven swearings of a curse on it! Let it lie there for a vexation to the sons of men while water runs and grass grows!" And so it has. The Irish version of the story makes Fionn's adversary a giant from Scotland. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Judeo-Tat_literature] | [TOKENS: 3527] |
Judeo-Tat literature is the literature of the Mountain Jews in the Juhuri language. History Judeo-Tat literature is rich in folklore. The most popular narrators of folklore at the beginning of the 20th century were Mordecai ben Avshalom (1860–1925), Shaul Simandu (1856–1939), Khizgil Dadashev (1860–1945) and Aibolo of Tarki. In 1904, Rabbi Yeshayahu Rabinovich was among the first to create literary works in the Judeo-Tat language for a Judeo-Tat theatre group in the city of Derbent. In the 1920s, theatre was the main form of Judeo-Tat literature. Playwrights who wrote for the first Mountain Jewish amateur theatrical troupes include Yakov Agarunov (1907–1992), with (Juhuri:Падшох, рабби ва ошир) - "Tsar, rabbi and the rich man"; Herzl Gorsky (Ravvinovich) (1904–1937?), with (Juhuri:Бахар дас баба-дадай) - "The fruits of the hands of the father and mother"; P. Shcherbatov, with (Juhuri:Кук савдогар-революционер) - "The merchant's son is a revolutionary"; and Yuno Semyonov (1899–1961), who wrote the plays (Juhuri:Амалданэ илчи) - "The wise matchmaker", 1924, (Juhuri:Дю алатфуруххо) - "Two junkies", 1924, and (Juhuri:Махсюм) - "Makhsum", 1927. Since the appearance in Derbent on 3 June 1928 of the Judeo-Tat-language newspaper Захметкеш - "The Toiler", whose editor-in-chief was Asail Binaev (1882–1958), one of the first Mountain Jewish professional literati, poems in the Judeo-Tat language were published regularly. All the Mountain Jewish poets of the 1920s - Ekhiil Matatov (1888–1943), Rachamim Ruvinov (1893–1955), Yakov Agarunov, Boris Gavrilov (1908–1990), Neten Solomonov and Z. Nabinovich - wrote civic poetry. The theme of women's equality recurs throughout the poetry of Yakov Agarunov ((Juhuri:Духдар доги) - "Mountain Girl", 1928) and Iskhog Khanukhov (1903–1973) ((Juhuri:Джофокашэ дадай) - "Mother-toiler" and (Juhuri:Ай зан Мизрах) - "About the Woman of the East", both written in 1928), as well as in a series of poems by Ekhiil Matatov and Boris Gavrilov. The formation and development of Mountain Jewish artistic prose started by the end of the 1920s. One of its founders was Yuno Semyonov. His biggest story was (Juhuri:Ошнахой ан раби Хасдил) - "Familiar people of Rabbi Hasdil", 1928–29. In the early 1930s a Mountain Jewish literary circle was formed in Moscow, headed by I. Ben-Ami (Benyaminov) (d. ca. 1937?). The poet, playwright and prose writer Mishi Bakhshiev (1910–1972), the poets Manuvakh Dadashev (1913–1943) and Daniil Atnilov (1913–1968), the first professional literary translator Zovolun Bakhshiev (1896–1968) and others quickly took the leading place in Judeo-Tat literature. In the mid-1930s, this literary circle in Moscow ceased functioning. In 1932, the poet Mishi Bakhshiev wrote his first book, Komsomol, whose main theme was the social disintegration of the Mountain Jews. Another theme in his work was the involvement of Mountain Jewish women in Soviet reality: (Juhuri: Ма‘ни духдару) - "Song of a Girl", 1933, (Juhuri:Рапорт) - "Report", 1933, and (Juhuri:Хумор) - "Gamble", 1933–34. In the second half of the 1930s the playwright Mishi Bakhshiev wrote a play, (Juhuri:Бесгуни игидхо) - "Victory of the Heroes" (1936), about the civil war in Dagestan. It was the first heroic drama in the Judeo-Tat language. 
Later Mishi Bakhshiev wrote (Juhuri:Хори) - "Earth", 1939, and in 1940 he created a play in verse on folklore motifs: (Juhuri:Шох угли, шох Аббас ва хомбол Хасан) - "Shah's son, Shah Abbas and loader Hasan". Bakhshiev's first novel was (Juhuri:Э пушорехьи тозе зиндегуни) - "Towards a New Life", 1932, in which he followed Azerbaijani narrative models. His second and biggest story was (Juhuri:Ватагачихо) - "Fishermen", 1933, about the life of the Mountain Jewish fishermen of Derbent. From the end of 1934 until the termination of publishing and cultural activities in the Judeo-Tat language in Azerbaijan in 1938, a Judeo-Tat literary circle existed in Baku under the newspaper Kommunist (editor-in-chief Yakov Agarunov) and the Mountain Jewish department of the Azerbaijan State Publishing House, which was headed by Yakov Agarunov and Yuno Semyonov. The poet Dubiya Bakhshiev (1914–1993), in his poem (Juhuri:Занхо а колхоз) - "Woman in the collective farm", 1933, combines the theme of women with the theme of the creation of the Mountain Jewish collective farms. Yuno Semyonov continued to play a significant role in the dramaturgy of the 1930s, writing his drama (Juhuri:Дю бирор) - "Two brothers". In the late 1930s, the novelist, poet and playwright Hizgil Avshalumov (1913–2001) published a large story, (Juhuri:Басгуни джовонхо) - "The Victory of the Young", 1940, which appeared together with essays and feuilletons. Avshalumov dedicated a number of his works to the modern hero of the Mountain Jewish village: (Juhuri:Маслахат на хингар) - "Council and Khinkal", (Juhuri:Аджал занхо) - "Death to wives", (Juhuri:Шюваран дю хову) - "Bigamist", essays about the Hero of Socialist Labour Gyulboor Davydova (1892–1983) and Solomon (Shelmun) Rabaev (1916–1963), and others. The story (Juhuri:Занбирор) - "Sister-in-law" is about the life of the Mountain Jewish social elite in Derbent on the eve of and during the revolution and in the first years of Soviet power. In his novel (Juhuri:Кук гудил) - "The son of the mummer", 1974, Avshalumov gave a detailed description of the Mountain Jewish farmer and his centuries-old traditional way of life. Later, Hizgil Avshalumov created the folklore image of the witty (Juhuri:Шими Дербенди) - "Shimi from Derbent" (a Mountain Jewish analogue of Hershel of Ostropol). The Great Purge of 1936–38 dealt a cruel blow to Judeo-Tat literature. Herzl Gorsky (Ravvinovich), Ekhiil Matatov, I. Ben-Ami (Benyaminov) and Asail Binaev were arrested. With the exception of Asail Binaev, they all died in Soviet prisons and gulags. During the years of the Soviet Union's war with Germany in World War II (1941–45), most figures of Judeo-Tat literature were drafted into the army. The poet Manuvakh Dadashev was killed in the war. During the four years of the war, not a single literary or artistic book in the Judeo-Tat language was published. In the 1940s, the authorities closed the Mountain Jewish newspaper Vatan in Derbent. From 1946 to the end of 1953, Judeo-Tat literature existed only implicitly. Throughout these years the Mountain Jewish section of the writers' organization did not function, and the creative concerns of Judeo-Tat literature disappeared from the agenda of the Dagestan Writers' Union. Only at the end of 1953 did the publication of a small collection of poems by Daniil Atnilov, (Juhuri:Чихрат вахд) - "The Image of Time", renew the functioning of Judeo-Tat literature as one of the literatures of Dagestan. Since the 1950s, prose has been predominant in Judeo-Tat literature. 
The leading role in it belongs to Mishi Bakhshiev and Hizgil Avshalumov. In 1955 the Judeo-Tat-language almanac (Juhuri:Ватан советиму) - "The Soviet Homeland" began to appear. From 1946, the circle of readers of Judeo-Tat literature in Dagestan was constantly narrowing due to the termination of school education in the Judeo-Tat language. Mishi Bakhshiev originally published his works in Russian ("Stories about My Countrymen", 1956; a collection of essays and short stories, "Simple People", 1958; and "Noisy Gardens", 1962). In these books, the author spoke not so much as a specifically Mountain Jewish writer, but as a Dagestani writer in general. In 1963 Mishi Bakhshiev published a novel, (Juhuri:Хушахой онгур) - "Bunches of grapes". In 1972, Mikhail Dadashev (b. 1936) published (Juhuri:Хьэлоле мерд) - "Noble Man", a collection of humorous stories. In 1977, he released (Juhuri:Тубономе) - "Confession", a collection of satirical stories. In 1980, he published (Juhuri:Бироргьо) - "Brothers", a novel, and in 1983, (Juhuri:Гьисмет) - "Fate", a short story. The Judeo-Tat children's writer of the post-Stalin period was Amaldan Kukullu (1935–2000). He released a collection of stories, (Juhuri:Синемиши) - "Testing", 1968, among others. Poetry in the Judeo-Tat literature of the 1950s–70s drew mostly on the achievements of the 1930s. The most prolific and famous poet of that period was Daniil Atnilov, who lived permanently in Moscow, in isolation from the everyday element of the Judeo-Tat language. His collection (Juhuri:Гюлхой инсони) - "The Color of Mankind", 1971, which summarized his work of the 1950s and 1960s, was published posthumously. A number of poets of the 20th century created their works in the Judeo-Tat language, such as Sergey Izgiyayev (1922–1972), who created poems and plays: (Juhuri: Иму гъэлхэнд шолуминим) - "We are the defenders of the World" (1952), (Juhuri:Фикиргьой шогьир) - "Thoughts of the Poet" (1966), (Juhuri:Муьгьбет ве гьисмет) - "The fate and love" (1972) and a number of other works. Shimshun Safonov, in 1968, created a collection of poetry, (Juhuri:Парза, ма‘ни ма) - "Fly, my verse". The poet Zoya Semenduev (1929–2020) released a collection, (Juhuri:Войгей дуьл) - "The Command of the Heart". In 2007, she published the book (Juhuri:Духдер эн дуь бебе) - "Daughter of two fathers", which includes the play of the same name and fairy tales. At the end of the 20th century, a number of Mountain Jewish writers wrote only in Russian, such as the poet Lazar Amirov (1936–2007), the novelist Felix Bakhshiev (b. 1937), the literary critic and novelist Manashir Azizov (1936–2011), and Asaf Mushailov (b. 1952). For some Mountain Jews, Israel became not only their new homeland, but also a source of inspiration for their literary creativity. Initially, Mountain Jews in Israel wrote in the languages of the countries they came from. However, a new generation of writers and poets began publishing exclusively in Hebrew, such as the children's author Arnold Ikhaev (Hebrew: ארנולד איחייב) (b. 1981). He published fairy tales in verse, such as (Hebrew: המכחול שצייר את הכל) - The Brush That Painted Everything, (Hebrew: הפרפר והפר) - The Butterfly and the Bull, (Hebrew: הפוני של נוני) - The pony of Noni, and (Hebrew: סוד המפתח והכי גדול זה להיות קטן) - The mystery of the key, and the greatest thing is to be small. Among the Mountain Jews who continued to write in languages other than Hebrew while living in Israel was Eldar Gurshumov (1940–2024), a poet, singer, and bard. 
He authored several collections of poems in Judeo-Tat and Azerbaijani, including (Juhuri:Рубаи) - Ruba'i, (Juhuri:Йэ бэндэюм) - I am a human, (Juhuri:Чор-джэргэи и ду-джэргэи) - Quatrain and Couplet, and (Juhuri:Газельхо) - Ghazal, among others. Bat-Zion Abramova (b. 1953) is a poet. She wrote poems in the Judeo-Tat language: (Juhuri:Tazə џərüs) - Young Bride, (Juhuri:Dü dədəy) - Two homelands, and (Juhuri:Cuhurbirə џəsont nisdi) - It's Not Easy to Be Jews. Raya-Rakhil Razilova (b. 1975) published works in both Judeo-Tat and Hebrew. Her writings include Rakhil, or the Whole Life on the Shelf and the poem (Juhuri:Эри Дедейме) - Mother. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Tonian] | [TOKENS: 392] |
The Tonian (from Ancient Greek: τόνος, romanized: tónos, meaning "stretch") is the first geologic period of the Neoproterozoic Era. It lasted from 1000 to 720 Mya (million years ago). Instead of being based on stratigraphy, these dates are defined by the ICS based on radiometric chronometry. The Tonian is preceded by the Stenian Period of the Mesoproterozoic Era and followed by the Cryogenian. Rifting leading to the breakup of the supercontinent Rodinia, which had formed in the mid-Stenian, occurred during this period, starting from 900 to 850 Mya. Biology The first putative metazoan (animal) fossils are dated to the middle to late Tonian (c. 890–800 Mya). The fossils of Otavia antiqua, which has been described as a primitive sponge by its discoverers and numerous other scholars, date back to about 800 Mya. Even earlier sponge-like fossils have been reported in reefs dating back to 890 million years before the present, but their identity is highly debated. This dating is consistent with molecular data recovered through genetic studies on modern metazoan species; more recent studies have concluded that the base of the animal phylogenetic tree is in the Tonian. Tonian rocks preserve some of the earliest fossils of macroalgae, such as the benthic macroalgae from the Longfengshan biota of the Luotuoling Formation or the green algae from the Dolores Creek Formation. The first large evolutionary radiation of acritarchs occurred during the Tonian. Vase-shaped microfossils abound in late Tonian sediments and represent the earliest testate amoebozoans. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Middle_Eastern_Empires] | [TOKENS: 10179] |
Middle East empires have existed in the Middle East region at various periods between 3000 BCE and 1924 CE; they have been instrumental in the spreading of ideas, technology, and religions within Middle East territories and to outlying territories. Since the 7th century CE, all Middle East empires, with the exception of the Byzantine Empire, were Islamic, and some of them claimed the title of an Islamic caliphate. The last major empire based in the region was the Ottoman Empire. 3000–2000 BCE: Ancient Middle East The rich fertile lands of the Fertile Crescent gave birth to some of the oldest sedentary civilizations, including the Egyptians and Sumerians, who contributed to later societies and are credited with several important innovations, such as writing, boats, the first temples, and the wheel. The Fertile Crescent saw the rise and fall of many great civilizations that made the region one of the most vibrant and colorful in history, including empires like those of the Assyrians and Babylonians, and influential trade kingdoms, such as the Lydians and Phoenicians. In Anatolia, the Hittites were probably the first people to use iron weapons. In the southwest was Egypt, a land with rich resources that sustained a thriving culture. Ebla was an important center throughout the 3rd millennium BCE and in the first half of the 2nd millennium BCE. Its discovery proved the Levant was a center of ancient, centralized civilization equal to Egypt and Mesopotamia and ruled out the view that the latter two were the only important centers in the Near East during the Early Bronze Age. The first Eblaite kingdom has been described as the first recorded world power. Starting as a small settlement in the Early Bronze Age (c. 3500 BCE), Ebla developed into a trading empire and later into an expansionist power that imposed its hegemony over much of northern and eastern Syria. Ebla was destroyed during the 23rd century BCE; it was then rebuilt and was mentioned in the records of the Third Dynasty of Ur. The second Ebla was a continuation of the first, ruled by a new royal dynasty. It was destroyed at the end of the 3rd millennium BCE, which paved the way for the Amorite tribes to settle in the city, forming the third Ebla. The third kingdom also flourished as a trade center; it became a subject and an ally of Yamhad (modern-day Aleppo) until its final destruction by the Hittite king Mursili I in c. 1600 BCE. The Akkadian Empire was the first ancient empire of Mesopotamia, after the long-lived civilization of Sumer. It was centered in the city of Akkad and its surrounding region. The empire united Akkadian (Assyrian and Babylonian) and Sumerian speakers under one rule. The Akkadian Empire exercised influence across Mesopotamia, the Levant, and Anatolia, sending military expeditions as far south as Dilmun and Magan (modern Bahrain and Oman) in the Arabian Peninsula. Mesopotamian city-states, both Sumerian and East Semitic, had a legacy of intercity warfare, and the tools of these wars have been found in graves, such as copper axes and blades. The first chariots were used extensively, and the Sumerians possessed a dynamic and innovative military. Early cavalry were employed as shock troops, needed to punch holes into the enemy lines to allow infantry to penetrate them, isolate pockets and eliminate them. They were also used to harass enemy flanks, and sometimes to outflank enemies, and most armies trembled at the sight of a chariot force. 
As infantry, the Sumerians used a heavy infantry phalanx, depicted on the Stele of the Vultures, which commemorates the victory of Lagash over Umma in 2525 BCE. These were very similar to the later Macedonian phalanx, although the weaponry was less advanced. They carried spears and uncomfortable armor. Sumerian armies also made great use of skirmishers to harass an opponent. The empire's most remarkable ruler was undoubtedly Sargon the Great (of Akkad), who reigned from 2334 to 2279 BCE and numbers among the first great Middle Eastern rulers, as well as being a great military tactician and strategist. He is credited as the first general to use amphibious warfare in recorded history. After some years of peace, Sargon waged wars against his rival Elam, and then launched a separate attack on Syria and Lebanon. The key to Sargon's victories was his coordination of army movements, his ability to improvise tactics, his combined-arms strategy, and his skill at siege warfare, as well as his use of intelligence, always relying on heavy reconnaissance. After Sargon's conquest of Sumer, the area enjoyed a relatively peaceful and prosperous era – perhaps its golden age. International trade flourished as the merchants went from Sumer to the expanses of the east, and also to the vast resources of the west. Goods from Egypt, Anatolia, Iran and elsewhere flowed into Sargon's gargantuan kingdom. Sargon's legacy was one of trade and of forming the standing army, which later rulers would use offensively. When Sargon died, Rimush, his son, inherited the empire. However, he was plagued by constant uprisings. After his death, his brother took the throne. He too was plagued by constant rebellions and was later usurped by Naram-Sin. Naram-Sin quickly destroyed and dispersed the Sumerian rebels and also went on a vast campaign of conquest, taking his armies to Lebanon, Syria and Israel, and then to Egypt. However, after Naram-Sin, the dynasty went into decline and soon fell altogether. The Third Dynasty of Ur, also called the Neo-Sumerian Empire, refers to a 22nd to 21st century BCE (middle chronology) ruling dynasty based in the city of Ur and a short-lived territorial-political state in Mesopotamia which some historians consider to have been a nascent empire. The Third Dynasty of Ur is commonly abbreviated as Ur III by historians studying the period. It is numbered in reference to previous dynasties, as they appeared in some of the better-preserved editions of the Sumerian King List, although it seems the once supposed Second Dynasty of Ur never existed. It began after several centuries of control by Akkadian and Gutian kings. It controlled the cities of Isin, Larsa, and Eshnunna and extended as far north as Upper Mesopotamia. 1800–1200 BCE: the Babylonian, Mitanni, Egyptian, and Hittite Empires The city of Babylon makes its first appearance in historical sources after the fall of the Third Dynasty of Ur, which had ruled the city-states of the alluvial plain between the rivers Euphrates and Tigris for more than a century. An agricultural crisis meant the end of this centralized state, and several more or less nomadic tribes settled in southern Mesopotamia. One of these was the nation of the Amorites ("westerners"), which took over Isin, Larsa, and Babylon. Their kings are known as the First Dynasty of Babylon. The area was reunited by Hammurabi, a king of Babylon of Amorite descent. 
From his reign on, the alluvial plain of southern Iraq was called, with a deliberate archaism, Mât Akkadî, "the country of Akkad", after the city that had united the region centuries before, but it is known to us as Babylonia. It was one of the most fertile and rich parts of the ancient world. Babylon and its ally Larsa fought a defensive war against Elam, the archenemy of Akkad. After this war had been brought to a successful end, Hammurabi turned against Larsa and defeated its king Rim-Sin. This scenario was repeated: together with King Zimri-Lim of Mari, Hammurabi waged war against Aššur, and after success had been achieved, the Babylonians attacked their ally and Mari was sacked. Other wars were fought against Yamhad (Aleppo), Elam, Eshnunna, and the mountain tribes in the Zagros. Babylon was the capital of the entire region between Harran in the northwest and the Persian Gulf in the southeast. Hammurabi's successes became the problems of his successors. After the annexation of Mari in the northwest and Eshnunna in the east, there was no buffer against the increasing power of the Hittite Empire and the Kassite tribes in the Zagros. It was impossible for the successors of Hammurabi to fight against all these enemies at the same time, and they started to lose their grip. These enemies sometimes invaded Babylonia, and in 1595 BCE the Hittite king Mursili I advanced along the Euphrates, sacked Babylon, and even took away the statue of the supreme god of Babylonia, Marduk, from its temple, the Esagila. With the fall of the Assyrian empire (612 BCE), the Babylonian Empire was the most powerful state in the ancient world. Even after the Babylonian Empire had been overthrown by the Persian king Cyrus the Great (539 BCE), the city itself remained an important cultural center and the ultimate prize in the eyes of aspiring conquerors. Mitanni was the most powerful Hurrian-speaking kingdom in the region. It came to dominate northern Syria, northern Mesopotamia and southeast Anatolia. Shaushtatar, king of Mitanni, sacked the Assyrian capital of Assur some time in the 15th century BCE during the reign of Nur-ili, and took the silver and golden doors of the royal palace to Washukanni. This is known from a later Hittite document, the Suppiluliuma-Shattiwaza treaty. After the sack of Assur, Assyria may have paid tribute to Mitanni up to the time of Eriba-Adad I (1390–1366 BCE). The Mitanni kingdom would go on to fight full-scale wars, and occasionally form alliances, with the Egyptians, Assyrians and Hittites. The Hittites would ultimately destroy the kingdom after conquering its capital. From 1560 to 1080 BCE, the Egyptian Empire reached its zenith as the dominant power in the Middle East. When Rome was still a marsh and the Acropolis was an empty rock, Egypt was already one thousand years old. Although the period of the pyramid-builders was long over, Egypt lay on the threshold of its greatest age. The New Kingdom would be an empire forged by conquest, maintained by intimidation and diplomacy, and remembered long after its demise.[citation needed] By 1400 BCE, the Egyptian Empire stretched from northern Syria to the Sudan in Africa, under the rule of Amenhotep III. It was a golden age of wealth, power, and prosperity, and remarkable diplomacy was used to keep the empire's rivals at bay. 
Art, technology and new ideas flourished and Egyptian rulers were seen as gods.[citation needed] The peak of Egyptian imperial expansion came when, threatened from abroad, Ramesses II led an army to the north to fight the Hittites at Kadesh. The battle was his crowning achievement and the basis for a new period of stability and wealth. Resources flooded into Egypt. However, foreign powers once again threatened it, and some provinces wavered in their allegiance.[citation needed] After the long reign of Ramesses II, the great tombs were systematically looted and a civil war ensued. Though Egypt was once again divided, carved up among foreign powers, the period left a rich legacy.[citation needed] The Hittite empire is often confused with that of the Chaldean/Babylonians[citation needed] and Greek historians of the period rarely mention it. The Egyptian documents that mention the eponymous Hatti region of the Hittites are the war annals of Thutmoses III and of Seti and Ramses II. The El Amarna letters, written in cuneiform, refer frequently to Hatti. This period in the conventional chronology covers the time from about 1500 to 1250 BCE. Merneptah, who followed Ramses II, said that Hatti was pacified. Ramses III, supposedly of about 1200–1180 BCE, wrote that Hatti was already crushed or wasted. A Babylonian chronicle mentions the Hatti in connection with an invasion of Babylon at the close of the ancient dynasty of Hammurabi, supposedly in the 17th or 16th century BCE. 1200 BCE – 1100 BCE: Elamite Empire Under the Shutrukids (c. 1210 – 1100 BCE), the Elamite empire reached the height of its power. Shutruk-Nakhkhunte and his three sons, Kutir-Nakhkhunte II, Shilhak-In-Shushinak, and Khutelutush-In-Shushinak, were capable of frequent military campaigns into Kassite Babylonia (which was also being ravaged by the empire of Assyria during this period), and at the same time were exhibiting vigorous construction activity, building and restoring luxurious temples in Susa and across their Empire. Shutruk-Nakhkhunte raided Babylonia, carrying home to Susa trophies like the statues of Marduk and Manishtushu, the Manishtushu Obelisk, the Stele of Hammurabi and the stele of Naram-Sin. In 1158 BCE, after much of Babylonia had been annexed by Ashur-Dan I of Assyria and Shutruk-Nakhkhunte, the Elamites defeated the Kassites permanently, killing the Kassite king of Babylon, Zababa-shuma-iddin, and replacing him with Shutruk-Nakhkhunte's eldest son, Kutir-Nakhkhunte, who held the throne for no more than three years before being ejected by the native Akkadian-speaking Babylonians. The Elamites then briefly came into conflict with Assyria, managing to take the Assyrian city of Arrapha (modern Kirkuk) before being ultimately defeated and having a treaty forced upon them by Ashur-Dan I. Kutir-Nakhkhunte's son Khutelutush-In-Shushinak was probably the product of an incestuous relationship between Kutir-Nakhkhunte and his own daughter, Nakhkhunte-utu.[citation needed] He was defeated by Nebuchadnezzar I of Babylon, who sacked Susa and returned the statue of Marduk, but who was then himself defeated by the Assyrian king Ashur-resh-ishi I. He fled to Anshan, but later returned to Susa, and his brother Shilhana-Hamru-Lagamar may have succeeded him as the last king of the Shutrukid dynasty. Following Khutelutush-In-Shushinak, the power of the Elamite empire began to wane seriously, for after the death of this ruler, Elam disappeared into obscurity for more than three centuries. 
1000 BCE – 550 BCE: the Neo-Assyrian, Phoenician, Median, Chaldean and Lydian Empires Following the conquests of Adad-nirari II in the late 10th century BCE, Assyria emerged as the most powerful state in the world at the time, coming to dominate the Ancient Near East, East Mediterranean, Asia Minor, Caucasus, and parts of the Arabian Peninsula and North Africa, eclipsing and conquering rivals such as Babylonia, Elam, Persia, Urartu, Lydia, the Medes, Phrygians, Cimmerians, Israel, Judah, Phoenicia, Chaldea, Canaan, the Kushite Empire, the Arabs, and Egypt. The Neo-Assyrian Empire succeeded the Old Assyrian Empire (c. 2025–1378 BCE) and the Middle Assyrian Empire (1365–934 BCE) of the Late Bronze Age. During this period, Aramaic was also made an official language of the empire, alongside Akkadian. The Assyrian army is said to have included as many as 300,000 soldiers at its peak. The Phoenicians were the first people to establish a maritime empire, with colonies as far away as the extremities of North Africa and Iberia. To facilitate their commercial ventures, the Phoenicians established numerous colonies and trading posts along the coasts of the Mediterranean. Phoenician city-states generally lacked the numbers or even the desire to expand their territory overseas. Few colonies had more than 1,000 inhabitants; only Carthage and some nearby settlements in the western Mediterranean would grow larger. A major motivating factor was competition with the Greeks, who began expanding across the Mediterranean during the same period. Though a largely peaceful rivalry, their respective settlements in Crete and Sicily did clash intermittently. The earliest Phoenician settlements outside the Levant were on Cyprus and Crete, gradually moving westward towards Corsica, the Balearic Islands, Sardinia, and Sicily, as well as to the European mainland at Genoa and Marseilles. The first Phoenician colonies in the western Mediterranean were along the northwest African coast and on Sicily, Sardinia and the Balearic Islands. Tyre led the way in settling or controlling coastal areas. One of the earliest Phoenician inscriptions is the Nora Stone, found on the south coast of Sardinia in 1773; it is dated to the 9th century BCE (c. 825–780 BCE). The inscription is most likely understood to be about a battle at Tarshish in which the forces of Pygmalion of Tyre (Pumayyaton) participated. In this rendering, Frank Moore Cross has restored the missing top of the tablet (estimated at two lines), based on the content of the rest of the inscription, as referring to a battle that had been fought and won. Alternatively, "the text honours a god, most probably in thanks for the traveller's safe arrival after a storm", observes Robin Lane Fox. According to Cross, the stone was erected by a general, Milkaton, son of Shubna, victor against the Sardinians at the site of TRSS, surely Tarshish. Cross conjectures that Tarshish here "is most easily understood as the name of a refinery town in Sardinia, presumably Nora or an ancient site nearby." Cross's interpretation of the Nora Stone provides additional evidence that in the late 9th century BCE, Tyre was involved in colonizing the western Mediterranean, lending credence to the establishment of a colony at Carthage in that time frame. Phoenician colonies were fairly autonomous. At most, they were expected to send annual tribute to their mother city, usually in the context of a religious offering. 
However, in the seventh century BCE the western colonies came under the control of Carthage, which was exercised directly through appointed magistrates. Carthage continued to send annual tribute to Tyre for some time after its independence. The Median empire was the first Iranian dynasty, corresponding to the northeastern section of present-day Iran, Northern-Khvarvarana and Asuristan, and southern and eastern Anatolia. The inhabitants, who were known as Medes, and their neighbors, the Persians, spoke Median languages that were closely related to Aryan (Old Persian). Historians know very little about the Iranian culture under the Median dynasty, except that Zoroastrianism, as well as a polytheistic religion, was practiced, and a priestly caste called the Magi existed. Traditionally, the creator of the Median kingdom was one Deioces who, according to Herodotus, reigned from 728 to 675 BCE and founded the Median capital Ecbatana (Hâgmatâna, or modern Hamadan). Attempts have been made to associate Daiaukku, a local Zagros king mentioned in a cuneiform text as one of the captives deported to Assyria by Sargon II in 714 BCE, with the Deioces of Herodotus, but such an association is highly unlikely. To judge from the Assyrian sources, no Median kingdom such as Herodotus describes for the reign of Deioces existed in the early 7th century BCE; at best, he is reporting a Median legend of the founding of their kingdom. The Medes gained control over the lands in eastern Anatolia that had once been part of Urartu and eventually became embroiled in a war with the Lydians, the dominant political power in western Asia Minor. In 585 BCE, probably through the mediation of the Babylonians, peace was established between Media and Lydia, and the Halys (Kizil) River was fixed as the boundary between the two kingdoms. Thus, a new balance of power was established in the Middle East among Medes, Lydians, Babylonians, and, far to the south, Egyptians. At his death, Cyaxares controlled vast territories: all of Anatolia to the Halys, the whole of western Iran eastward, perhaps as far as the area of modern Tehran, and all of south-western Iran, including Fars. Whether or not it is appropriate to call these holdings a kingdom is debatable; one suspects that authority over the various peoples, Iranian and non-Iranian, who occupied these territories was exerted in the form of a confederation, such as is implied by the ancient Iranian royal title, king of kings. Astyages followed his father, Cyaxares, on the Median throne (585–550 BCE). Comparatively little is known of his reign. All was not well with the alliance with Babylon, and there is some evidence to suggest that Babylonia may have feared Median power. The latter, however, was soon in no position to threaten others, for Astyages was himself under attack. Indeed, Astyages and the Medes were soon overthrown by the rise to power in the Iranian world of Cyrus II the Great. While the Median kingdom controlled the highland region, the Chaldeans, with their capital at Babylon, were masters of the Fertile Crescent. Nebuchadnezzar, becoming king of the Chaldeans in 604 BCE, raised Babylonia to another epoch of brilliance after more than a thousand years of eclipse. By defeating the Egyptians in Syria, Nebuchadnezzar ended their hopes of re-creating their empire. He destroyed Jerusalem in 586 BCE and carried thousands of Jews captive to Babylonia. Nebuchadnezzar reconstructed Babylon, making it the largest and most impressive city of its day. 
The tremendous city walls were wide enough at the top to have rows of small houses on either side. In the center of Babylon ran the famous Procession Street, which passed through the Ishtar Gate. This arch, which was adorned with brilliant tile animals, is the best remaining example of Babylonian architecture. The immense palace of Nebuchadnezzar towered terrace upon terrace, each resplendent with masses of ferns, flowers, and trees. These roof gardens, the famous Hanging Gardens of Babylon, were so beautiful that they were regarded by the Greeks as one of the seven wonders of the world. Nebuchadnezzar also rebuilt the great temple-tower or ziggurat, the Biblical "Tower of Babel," which the Greek historian Herodotus viewed a century later and described as a tower of solid masonry, 220 yards in length and breadth, upon which was raised a second tower, and on that a third, and so on up to eight. Nebuchadnezzar was the last great Mesopotamian ruler, and Chaldean power quickly crumbled after his death in 562 BCE. The Chaldean priests, whose interest in astrology so greatly added to the fund of Babylonian astronomical knowledge that the word "Chaldean" came to mean astronomer, continually undermined the monarchy. Finally, in 539 BCE, they opened the gates of Babylon to Cyrus the Persian, thus fulfilling Daniel's message of doom upon the notorious Belshazzar, the last Chaldean ruler: "You have been weighed in the balances and found wanting" (Dan. 5:27). The Kingdom of Lydia entered the historical record in 660 BCE, when the Assyrian king Ashurbanipal demanded tribute from the Lydian king, "Gyges of Luddi." The grandson of Gyges, Alyattes, built the Lydian Empire during his fifty-seven-year reign. Alyattes captured Smyrna, the greatest port of the Asian coast, and one by one added Greek coastal towns to his domain. Although he let the Greek cities retain their customs and institutions, their taxes, along with Lydian gold, made the Lydian monarchs the richest kings since Solomon. Croesus was the son and heir of Alyattes and the most important Lydian king in relation to the Bible. He was fabulously wealthy, spawning the simile: "as rich as Croesus." The undoing of Croesus and the Lydian empire came when they attacked Cyrus the Great. Victorious over the Cappadocians, Croesus was filled with confidence. The benevolent Cyrus offered Croesus his throne and kingdom if the latter would recognize Persian sovereignty. Croesus replied that the Persians would be slaves of the Lydians. Therefore, Cyrus immediately attacked Croesus. After two indecisive engagements, Croesus was driven from the field of battle. He begged Egypt, Greece, and Babylon for help, but his pleas fell on deaf ears. The Lydian capital of Sardis fell and Croesus was taken prisoner. Though, as was his custom, Cyrus dealt kindly with Croesus, the once very wealthy Lydian empire became a Persian satrapy called Saparda (Sardis). 550 BCE – 330 BCE: the Persian Empire Following the overthrow of the Medes by the Persians, the latter inherited the former's territories but significantly expanded them. Eventually, this First Persian Empire (better known as the Achaemenid Empire) would stretch across three continents, namely Europe, Asia and Africa, encompassing 8 million square kilometers, and would become the first world empire and the largest empire the ancient world had yet seen. At its peak, it would stretch from Macedon and Paeonia-Bulgaria in the west to the Indus Valley in the far east. 
Founded by Cyrus the Great, it was notable for embracing various civilizations and becoming the largest empire of ancient history; for its successful model of a centralized, bureaucratic administration (through satraps under a king) and a government working to the profit of its subjects; for building infrastructure, such as a postal system and road systems, the use of an official language across its territories, and a large professional army and civil service (inspiring similar systems in later empires); and for emancipating slaves, including the Jewish exiles in Babylon; and it is noted in Western history as the antagonist of the Greek city-states during the Greco-Persian Wars. With an estimated population of 50 million in 480 BCE, the Achaemenid Empire, at its peak, ruled over 44% of the world's population, the highest such figure for any empire in history. The Greco-Persian Wars eventually culminated with the independence of Persia's westernmost territories (comprising Macedon, Thrace, and Paeonia) and the definitive withdrawal from the Balkans and Eastern Europe proper. In 331 BCE, following the Battle of Gaugamela, the Empire was overthrown and incorporated by Alexander the Great, starting a new period in Middle Eastern history, one marked by the emergence of Hellenistic and Greco-Persian culture, as well as dynasties (e.g. the Kingdom of Pontus). 323 BCE – 64 BCE: Alexander's Hellenistic Empire The king of Macedon, Alexander III, to be known as Alexander the Great, came to the throne in October 336 BCE, aged 20. He would soon take control of the Persian empire and cover all the territories of the ancient world, as far as India. Alexander was a remarkable person who combined the military genius and political vision of his father Philip II of Macedon with a literary bent, romanticism and a taste for adventure. In less than two years, Alexander secured the Greek and Thracian borders and gathered an army of 50,000 men for the assault on Asia. In his early campaigns, he always maintained a considerable fleet of warships and supplies for his soldiers. With him there were many scholars who recorded Alexander's discoveries and achievements far in the east. In 334 BCE, Alexander fought the battle that would make his name, opposed by an army of Persians holding an advantageous position on the steep banks of the river Granicus. The unfamiliar tactics and brute strength of the highly disciplined Macedonian phalanx army, advancing with their heavy weapons, inflicted a crushing defeat on the Persian army, prompting the disgraced Persian commander to commit suicide. Barely six months passed before, one by one, all of the cities on the west coast of Anatolia were taken by Alexander. As winter came on, Alexander headed for Lycia in southern Anatolia, where he annexed all of the cities he went through. Amazingly, the Persians, who until that time had enjoyed a largely unchallenged dominance over the region, put up little resistance. Alexander left trusted lieutenants, as well as former Persian satraps, to rule his new conquests, as he continued on his relentless thrust to the very edge of the known world. Alexander's conquest of Persia replaced the Achaemenids with the Seleucids, but the absence of a clear successor after his untimely death and the in-fighting that inevitably followed meant that his empire would not long outlive him. The Seleucids and the Ptolemaic dynasty of Egypt quarreled over control of territory Alexander had conquered previously, mostly in the Middle East. 
Eventually, the Seleucids won their prize of controlling the Levant, Mesopotamia, Iran and parts of Anatolia, later adopting the title "Kings of Syria", while the Ptolemies established their stronghold in Egypt, promoted a blend of Greco-Egyptian culture, and adopted the title "Pharaoh". 88 BCE – 330 CE: the Roman, Armenian, Parthian and Palmyrene Empires The wars between Rome and the Parthian Empire, which took place roughly from 53 BCE to 217 CE, were a unique episode in classical antiquity. Although Rome conquered nearly the entire civilized world around the Mediterranean, the Parthians were a constant thorn in the Roman side. In 270, Palmyrene queen Zenobia would rebel against Roman authority and establish her rule over all of the Eastern provinces located in modern-day Egypt, the Levant, and Anatolia. When Roman expansion reached Mesopotamia, the Parthian Empire had already been prospering as a major power whose outskirts reached far into the east and trade routes ran deep into China. When Roman and Parthian borders finally met, the centuries that followed were a time of diplomacy and war between two empires of distinct cultures and methods of war. Roman–Parthian relations dominated international policy in the classical Near East. As opposed to less organized tribes on Rome's European borders, the Parthians were a sophisticated culture of commerce and empire. The Parthians garnered significant wealth from their trade routes and their cities stood as some of the largest in the world. The Armenian Empire was a short-lived state that rose to predominance under Tigranes the Great, who conquered the entire Middle East with the exception of central and southern Arabia and western Anatolia. For a short time he controlled the most powerful state on the planet. The founding of Rome goes back to the very early days of Western civilization; so old is it that it is today known as 'the eternal city'. The Romans believed that their city was founded in 753 BCE. Modern historians, though, believe it was 625 BCE. In the 1st century BCE, the expanding Roman Republic absorbed the whole Eastern Mediterranean area, and under the Roman Empire the region was united with most of Europe and North Africa in a single political and economic unit. This unity facilitated the spread of Christianity, and by the 5th century, the whole region was Christian. After the empire became divided into its western and eastern parts, the Emperors of the East ruled from Constantinople over the lands of the Middle East as far east as the Euphrates and over the Balkans. This empire was a Greek-speaking, Christian empire, and became known to historians as the Byzantine Empire (from the earlier name of its capital city). The Parthians ruled Persia parallel to the Han dynasty, and around this time the Roman Empire reached the peak of its power. In this flourishing time and the next, Persia served as the link between Rome and China and was seen by the Romans as of pivotal strategic importance. Around 300 BCE, the Parthians, an Iranian tribe, invaded West Asia from Central Asia. Like the Scythians, and like the Persians when they first came to West Asia, the Parthians were a nomadic people. They traveled around Central Asia with their horses and their cattle, grazing them on the expansive grasslands there. The Parthians soon headed south into Alexander's empire. 
The recent death of Alexander the Great had heralded the beginning of the disintegration of his vast empire, and the Parthians would be one of the main beneficiaries. The Parthians immediately succeeded in taking over the middle part of Alexander's empire (roughly modern Iran). This split the Seleucid empire in half, leaving the Macedonian colonies in Bactria (modern Afghanistan) isolated. They stayed there for about 200 years, gradually assimilating the culture of West Asia. By around 100 BCE, with the Seleucids increasingly powerless, the Parthians started to take over parts of the eastern Seleucid territories. At the same time, the Romans started to take over parts of the western Seleucid territories. Eventually, the Romans and the Parthians met in the middle. At the Battle of Carrhae, in the year 53 BCE, the outnumbered Parthians won a decisive victory, and the Roman general Crassus was killed. In 116 CE, the Roman emperor Trajan invaded the Parthian empire and conquered Babylon. The Parthians were in disarray at this time, due to civil wars, and unable to offer much resistance. But in 117, just a year later, Trajan's successor Hadrian gave up most of the land that Trajan had conquered. Eventually, however, these internal weaknesses caused the Parthian Empire to collapse, and the Sassanid dynasty rose. Zenobia started an expedition against the Tanukhids in the spring of 270, during the reign of Emperor Claudius Gothicus, aided by her generals Septimius Zabbai (a general of the army) and Septimius Zabdas (the chief general of the army). Zabdas sacked Bosra, killed the Roman governor, and marched south, securing Roman Arabia. According to the Persian geographer Ibn Khordadbeh, Zenobia herself attacked Dumat Al-Jandal but could not conquer its castle. However, Ibn Khordadbeh is confusing Zenobia with al-Zabbā, a semi-legendary Arab queen whose story is often confused with Zenobia's story. In October 270, a Palmyrene army of 70,000 invaded Egypt and declared Zenobia queen of Egypt. The Roman general Tenagino Probus was able to regain Alexandria in November, but was defeated and escaped to the fortress of Babylon, where he was besieged and killed by Zabdas, who continued his march south and secured Egypt. Afterward, in 271, Zabbai started the operations in Asia Minor, and was joined by Zabdas in the spring of that year. The Palmyrenes subdued Galatia, and occupied Ankara, marking the greatest extent of the Palmyrene expansion. However, the attempts to conquer Chalcedon were unsuccessful. The Palmyrene conquests were made under a protective show of subordination to Rome. Zenobia issued coinage in the name of Claudius' successor Aurelian with Vaballathus depicted as king,[note 1] while the emperor allowed the Palmyrene coinage and conferred the Palmyrene royal titles. However, toward the end of 271, Vaballathus took the title of Augustus (emperor) along with his mother. 330 CE – 632 CE: the Eastern Roman Empire, the Ghassanids, the Sassanids, and the Lakhmids Constantinople, situated on the Bosporus Straits at the mouth of the Black Sea, became the capital of the Roman Empire in 330 CE after Constantine the Great founded the city on the site of the city of Byzantium. The city's status as residence of the Eastern Roman Emperor made it into the premier city in all of the Eastern Roman colonies in the Balkans, Syria, Jordan, Israel, Lebanon, Cyprus, Egypt, and part of present-day Libya. The sack of Rome led to the fall of the Western Roman Empire. 
While the Roman polity survived in the East, its ongoing evolution led historians, from the 16th century onward, to use the term Byzantine Empire to distinguish it from the unified Roman Empire (notwithstanding the period of the Tetrarchy). The Eastern Roman Empire reached its greatest extent in the 6th century under the emperor Justinian. The use of new military tactics and strategy, alliances, mercenary forces, and reforms in governance contributed to the survival of the Eastern Empire for further centuries, despite being greatly diminished in size after the Muslim conquest of the Levant. Surrounded by huge walls, Constantinople would be besieged repeatedly without success until the Fourth Crusade, after which the Empire would never recover, finally succumbing to the Ottoman Empire in 1453. The Sassanid era, encompassing the length of the Late Antiquity period, is considered to be one of the most important and influential historical periods in Iran. In many ways, the Sassanid period witnessed the highest achievement of Persian civilization and constituted the last great Iranian Empire before the Muslim conquest and adoption of Islam. Whereas the Romans were seen as the main aggressors against the Parthians, these roles were very much reversed by the Sassanids in their aggressiveness against the Romans and later the Byzantines. The Sassanids came to power on a wave of nationalism and pride. The first Shah of the Sassanid dynasty, Ardashir, promised to destroy the Hellenistic influence in Persia, avenge Darius III against the heirs of Alexander, and reconquer all the territories once held by the Achaemenid kings. The Shah saw the Romans as Persia's main enemy, and in the wars that ensued, the Sassanids almost upheld the promises of Ardashir. Ardashir began his reign by conquering the few lands left under Parthian control as well as invading Armenia. He blamed the Romans for aiding the Armenians, who were a close ally of Rome, and in 230 invaded Mesopotamia and besieged Nisibis, though unsuccessfully, while his cavalry threatened Cappadocia and Syria. The Romans were shocked when they heard the Persians had invaded. They still thought the Sassanids no different from the Parthians; however, the Sassanids were far more aggressive and filled with nationalistic zeal, as the Romans would soon realize. The Romans sent a delegation to ask for Persian withdrawal, noting the past defeats of the Parthians by the Romans as a warning. Ardashir refused, and in 231 Rome mobilized for war under Severus Alexander, drawing troops from Egypt to the Black Sea to form three massive armies. Rome's forces, under Emperor Alexander, split up into three columns: one which went to Armenia (the left column), one which went to the Euphrates (the right column), and one that stayed in Mesopotamia, led by the emperor himself. Ardashir engaged the right column in battle, defeated it, and on this note, Alexander decided to end the war and retreated, although a peace treaty was never signed. In 233, after winning his wars in the east, Ardashir again invaded Roman territory, this time capturing Nisibis and Carrhae. Ardashir extended the Persian Empire to the Oxus in the north-east and to the Euphrates in the west, and on his deathbed in 241 he passed on his crown to Shapur, who would carry the war further into Roman territory. The Sassanid dynasty revived the old Achaemenid traditions, including Zoroastrianism, as Ardashir had promised. 
However, exhausting wars with Byzantium left the empire unready to face the Muslim armies from Arabia. The Ghassanids were Arab Christians who were established in Hauran, southern Syria. The term Ghassan refers to the kingdom of the Ghassanids, and supposedly means "a spring of water". The Ghassanid state was founded after King Jaffna bin ‘Amr emigrated north with his family and retinue and settled in Hauran (south of Damascus). The Ghassanid kingdom was an ally of the Byzantine Empire. More accurately, the kings can be described as phylarchs, native rulers of subject frontier states. The capital was at Jabiyah in the Golan Heights. Geographically, it occupied much of Syria, Palestine and the northern Hijaz as far south as Yathrib (Medina). It acted as guardian of trade routes, policed Bedouin tribes and was a source of troops for the Byzantine army. The Ghassanid king al-Harith ibn Jabalah (reigned 529–569) supported the Byzantines against Sassanid Persia and was given the title patricius in 529 by the emperor Justinian I. Al-Harith was a Monophysite Christian; he helped to revive the Syrian Monophysite (Jacobite) Church and supported Monophysite development despite Orthodox Byzantium regarding it as heretical. Later Byzantine mistrust and persecution of such religious unorthodoxy brought down his successors, al-Mundhir (reigned 569–582) and Nu'man. The Ghassanids, who had successfully opposed the Persian-allied Lakhmids of al-Hirah (southern Iraq and northern Arabia), prospered economically and engaged in much religious and public building; they also patronized the arts and at one time entertained the poets Nabighah adh-Dhubyani and Hassan ibn Thabit at their courts. Ghassan remained a Byzantine vassal state until its rulers were overthrown by the Muslims in the 7th century, following the Battle of Yarmuk. It was at this battle that some 12,000 Ghassanid Arabs defected to the Muslim side because the Muslims offered to pay their arrears in wages. Their real power, however, had been destroyed by the Persian invasion in 614. Imru' al-Qais dreamt of a unified and independent Arab kingdom and, following that dream, he seized many cities in the Arabian Peninsula. He then formed a large army and developed the Kingdom as a naval power, which consisted of a fleet of ships operating along the Bahraini coast. From this position he attacked the coastal cities of Iran – which at that time was in civil war, due to a dispute as to the succession – even raiding the birthplace of the Sasanian kings, Fars province. In 325, the Persians, led by Shapur II, began a campaign against the Arab kingdoms. When Imru' al-Qais realised that a mighty Persian army composed of 60,000 warriors was approaching his kingdom, he asked for the assistance of the Roman Empire. Constantine promised to assist him but was unable to provide that help when it was needed. The Persians advanced toward Hira and a series of vicious battles took place around and in Hira and the surrounding cities. Shapur II's army defeated the Lakhmid army and captured Hira. In this, the young Shapur acted much more violently and slaughtered all the Arab men of the city and took the Arab women and children as slaves.[citation needed] He then installed Aws ibn Qallam and withdrew his army. Imru' al-Qais escaped to Bahrain, taking his dream of a unified Arab nation with him, and then to Syria, seeking the promised assistance from Constantius II, which never materialized, so he stayed there until he died. 
When he died, he was entombed at al-Nimarah in the Syrian desert. Imru' al-Qais' funerary inscription is written in an extremely difficult type of script. Recently there has been a revival of interest in the inscription, and controversy has arisen over its precise implications. It is now certain that Imru' al-Qais claimed the title "King of all the Arabs" and also claimed in the inscription to have campaigned successfully over the entire north and centre of the peninsula, as far as the border of Najran. Two years after his death, in 330, a revolt took place in which Aws ibn Qallam was killed and succeeded by Imru' al-Qais's son 'Amr. Thereafter, the Lakhmids' main rivals were the Ghassanids, who were vassals of the Sassanians' arch-enemy, the Roman Empire. The Lakhmid kingdom could have been a major centre of the Church of the East, which was nurtured by the Sassanians, as it opposed the Chalcedonian Christianity of the Romans. The Lakhmids remained influential throughout the sixth century. Nevertheless, in 602, the last Lakhmid king, al-Nu'man III ibn al-Mundhir, was put to death by the Sasanian emperor Khosrow II because of a false suspicion of treason, and the Lakhmid kingdom was annexed. It is now widely believed that the annexation of the Lakhmid kingdom was one of the main factors behind the fall of the Sasanian Empire and the Muslim conquest of Persia, as the Sassanians were defeated in the Battle of Hira by Khalid ibn al-Walid.[clarification needed] At that point, the city was abandoned and its materials were used to reconstruct Kufa, its exhausted twin city. According to the Arab historian Abu ʿUbaidah (d. 824), Khosrow II was angry with the king, al-Nu'man III ibn al-Mundhir, for refusing to give him his daughter in marriage, and therefore imprisoned him. Subsequently, Khosrow sent troops to recover the Nu'man family armor, but Hani ibn Mas'ud (Nu'man's friend) refused, and the Arab forces of the Sasanian Empire were annihilated at the Battle of Dhi Qar, near al-Hirah, the capital of the Lakhmids, in 609. Hira stood just south of what is now the Iraqi city of Kufa. 632 CE – 1922 CE: Islamic Empires and other Caliphates According to Sunni Muslims, the first caliph was Abu Bakr Siddique, followed by Umar ibn al-Khattāb, who was the first caliph to be called Amir al-Mu'minin and the second of the Four Rightly Guided Caliphs. Uthman ibn Affan and Ali ibn Abi Talib were also called by the same title, while the Shi'a consider Ali to have been the first truly legitimate caliph, although they concede that Ali accepted his predecessors because he eventually gave his allegiance to Abu Bakr. The rulers succeeding these first four did not receive the title by consensus; instead, the caliphate was turned into a monarchy. After the first four caliphs, the Caliphate was claimed by dynasties such as the Umayyads, the Abbasids, and the Ottomans, and for relatively short periods by other, competing dynasties in al-Andalus, North Africa, and Egypt. Mustafa Kemal Atatürk, who founded the Republic of Turkey in 1923, officially abolished the last caliphate, that of the Ottomans, in 1924. The Kings of Morocco still label themselves with the title Amir al-Mu'minin for the Moroccans, but lay no claim to the Caliphate. See also Notes References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-BillingtonRidge2001j2-106] | [TOKENS: 17273] |
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. 
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. 
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Nederland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. 
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. His resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. 
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. 
The United States entered World War I alongside the Allies in 1917 helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. totally withdrawing in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The Fall of Communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. 
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean. 
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change in the country, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered as the most attractive to the population are the most vulnerable. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024[update]. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. 
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons, the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. 
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, operating at every level from local to national. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. State police departments have authority within their respective states, while federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing the rulings of U.S. federal courts and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite these disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world (531 people per 100,000 inhabitants) and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed that U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. 
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. 
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuel, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. 
It also has the highest number of nuclear power reactors of any country. From 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. The U.S. is among the top ten countries with the highest vehicle ownership per capita (850 vehicles per 1,000 people) in 2022. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
population had a net gain of one person every 16 seconds, or about 5400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. De facto, English is the official language of the United States, and in 2025, Executive Order 14224 declared English official. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications.
In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. As for public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than all other nations in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs being in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization.
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851).
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks.
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the most commercially successful and most ticket-selling movies in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, and potatoes, along with turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine.
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are among the most-watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country.
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. See also Notes References This article incorporates text from a free content work. Licensed under CC BY-SA IGO 3.0 (license statement/permission). Text taken from World Food and Agriculture – Statistical Yearbook 2023, FAO. External links 40°N 100°W (United States of America) |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Myth] | [TOKENS: 7161] |
Contents Myth Myth is a genre of folklore consisting primarily of narratives that play a fundamental role in a society. For scholars, this is very different from the ordinary sense of the term myth as a belief that is not true, since the veracity of a piece of folklore is irrelevant to whether it constitutes a myth. Myths are often endorsed by religious and secular authorities, and may be natural or supernatural in character. Many societies group their myths, legends, and history together, considering myths and legends to be factual accounts of their remote past. In particular, creation myths take place in a primordial age when the world had not achieved its later form. Origin myths explain how a society's customs, institutions, and taboos were established and sanctified. National myths are narratives about a nation's past that symbolize the nation's values. There is a complex relationship between recital of myths and the enactment of rituals. Etymology The word myth comes from Ancient Greek μῦθος (mȳthos), meaning 'speech', 'narrative', or 'fiction'. In turn, Ancient Greek μυθολογία (mythología 'story', 'legends', or 'story-telling') combines the word mȳthos with the suffix -λογία (-logia 'study'). Accordingly, Plato used mythología as a general term for fiction or story-telling of any kind. The word myth itself was adopted into English and other European languages in the early 19th century, in a much narrower sense, as a scholarly term for "[a] traditional story, especially one concerning the early history of a people or explaining a natural or social phenomenon, and typically involving supernatural beings or events." Earlier, the Greek term mythología had been borrowed into Late Latin, occurring in the title of Latin author Fabius Planciades Fulgentius' 5th-century Mythologiæ to denote what is now referred to as classical mythology—i.e., Greco-Roman etiological stories involving their gods. Fulgentius's Mythologiæ explicitly treated its subject matter as allegories requiring interpretation and not as true events. The Latin term was then adopted in Middle French as mythologie. Whether from French or Latin usage, English adopted the word mythology in the 15th century, initially meaning 'the exposition of a myth or myths', 'the interpretation of fables', or 'a book of such expositions'. The word is first attested in John Lydgate's Troy Book (c. 1425). From Lydgate until the 17th or 18th century, mythology meant a moral, a fable, an allegory, or a parable, or a collection of traditional stories, understood to be false. It came eventually to be applied to similar bodies of traditional stories among other polytheistic cultures around the world. Thus mythology entered the English language before myth. Johnson's Dictionary, for example, has an entry for mythology, but not for myth. Indeed, the Greek loanword mythos (pl. mythoi) and Latinate mythus (pl. mythi) both appeared in English before the first example of myth in 1830. Protagonists and structure The main characters in myths are usually non-humans, such as gods, demigods, and other supernatural figures. Others also include humans, animals, or combinations of these in their classification of myth. Stories of everyday humans, although often of leaders of some type, are usually contained in legends, as opposed to myths. Myths are sometimes distinguished from legends in that myths deal with gods, usually have no historical basis, and are set in a world of the remote past, very different from that of the present.
Definitions Definitions of myth vary to some extent among scholars, though Finnish folklorist Lauri Honko offers a widely-cited definition: Myth, a story of the gods, a religious account of the beginning of the world, the creation, fundamental events, the exemplary deeds of the gods as a result of which the world, nature and culture were created together with all parts thereof and given their order, which still obtains. A myth expresses and confirms society's religious values and norms, it provides a pattern of behavior to be imitated, testifies to the efficacy of ritual with its practical ends and establishes the sanctity of cult. Another definition of myth comes from myth criticism theorist and professor José Manuel Losada. According to Cultural Myth Criticism, the studies of myth must explain and understand "myth from inside", that is, only "as a myth". Losada defines myth as "a functional, symbolic and thematic narrative of one or several extraordinary events with a transcendent, sacred and supernatural referent; that lacks, in principle, historical testimony; and that refers to an individual or collective, but always absolute, cosmogony or eschatology". According to the holistic myth research by assyriologist Annette Zgoll and classical philologist Christian Zgoll, "A myth can be defined as an Erzählstoff [narrative material] which is polymorphic through its variants and – depending on the variant – polystratic; an Erzählstoff in which transcending interpretations of what can be experienced are combined into a hyleme sequence with an implicit claim to relevance for the interpretation and mastering of the human condition." Scholars in other fields use the term myth in varied ways. In a broad sense, the word can refer to any traditional story, popular misconception or imaginary entity. Though myth and other folklore genres may overlap, myth is often thought to differ from genres such as legend and folktale in that neither are considered to be sacred narratives. Some kinds of folktales, such as fairy stories, are not considered true by anyone, and may be seen as distinct from myths for this reason. Main characters in myths are usually gods, demigods or supernatural humans, while legends generally feature humans as their main characters. Many exceptions and combinations exist, as in the Iliad, Odyssey and Aeneid. Moreover, as stories spread between cultures or as faiths change, myths can come to be considered folktales, their divine characters recast either as humans or as demihumans such as giants, elves and faeries. Conversely, historical and literary material may acquire mythological qualities over time. For example, the Matter of Britain (the legendary history of Great Britain, especially the legends focused on King Arthur and the knights of the Round Table) and the Matter of France, seem distantly to originate in historical events of the 5th and 8th centuries, respectively, and became mythologised over the following centuries. In colloquial use, myth can also be used of a collectively held belief that has no basis in fact, or any false story. This usage, which is often pejorative, arose from labelling the religious myths and beliefs of other cultures as incorrect, but it has spread to cover non-religious beliefs as well. As commonly used by folklorists and academics in other relevant fields, such as anthropology, myth has no implication whether the narrative may be understood as true or otherwise.
Among biblical scholars of both the Old and New Testament, the word myth has a technical meaning, in that it is usually used to "describe the actions of the other-worldly in terms of this world" such as the Creation and the Fall. Since myth is popularly used to describe stories that are not objectively true, the identification of a narrative as a myth can be highly controversial. Many religious adherents believe that the narratives told in their respective religious traditions are historical without question, and so object to their identification as myths while labelling traditional narratives from other religions as such. Hence, some scholars may label all religious narratives as "myths" for practical reasons, such as to avoid depreciating any one tradition relative to another. Other scholars may abstain from using the term myth altogether for purposes of avoiding placing pejorative overtones on sacred narratives. In present use, mythology usually refers to the collection of myths of a group of people. For example, Greek mythology, Roman mythology, Celtic mythology and Hittite mythology all describe the body of myths retold among those cultures. "Mythology" can also refer to the study of myths and mythologies.[citation needed] The compilation or description of myths is sometimes known as "mythography", a term also used for a scholarly anthology of myths or of the study of myths generally. Several key mythographers worked in the Classical tradition. Other prominent mythographies include the thirteenth-century Prose Edda attributed to the Icelander Snorri Sturluson, which is the main surviving survey of Norse mythology from the Middle Ages. Jeffrey G. Snodgrass (professor of anthropology at Colorado State University) has termed India's Bhats as mythographers. Myth criticism is a system of anthropological interpretation of culture created by French philosopher Gilbert Durand. Scholars have used myth criticism to explain the mythical roots of contemporary fiction, which means that modern myth criticism needs to be interdisciplinary. Traditional myth critics like Georges Dumézil, Hans Blumenberg, Kurt Hübner, and Pierre Brunel have created paradigmatic systems and clarified the meanings of myths within their original sources, evolution, and contexts. "This traditional approach to myth criticism proves effective when examining myths in pre-modern literature, identifying literary sources, tracing linguistic evolution, exploring intertextual relationships, and understanding social, psychological, and anthropological dimensions". Cultural Myth Criticism is a recent theoretical proposal in myth criticism studies, set out in José Manuel Losada's Mitocrítica cultural. Una definición del mito (2022) – a book in which he presents his own methodological, hermeneutic, and epistemological approach to myth. Drawing on mythopoetic perspectives, Cultural Myth Criticism takes a step further, incorporating the study of myth's transcendent dimension (its function and its disappearance) in order to assess myth's role as a mirror of contemporary culture. It examines the functions of myth and the factors that shape its study, and it develops the criteria for analysing it.
The approach also takes into account the contemporary factors that affect the interpretation of mythology, such as globalisation, relativism, and immanence: Without abandoning symbolic analysis, Cultural Myth Criticism extends to all cultural manifestations and addresses the difficulties of understanding myth today. It studies mythical manifestations in fields such as literature, film and television, theater, sculpture, painting, video games, music, dancing, the Internet and other artistic fields. The approach is intended to adapt to the contemporary world. "[Cultural] Myth criticism, a discipline that studies myths (mythology contains them, like a pantheon its statues), is by nature interdisciplinary: it combines the contributions of literary theory, the history of literature, the fine arts and the new ways of dissemination in the age of communication. Likewise, it undertakes its object of study from its interrelation with other human and social sciences, in particular sociology, anthropology and economics. The need for an approach, for a methodology that allows us to understand the complexity of the myth and its manifestations in contemporary times, is justified." Cultural Myth Criticism should rest on five main pillars. The difference between classical myth criticism and Cultural Myth Criticism lies in their methodologies. The latter focuses on how contemporary factors shape our perception, seeking to "synthesise the evolution of myths within the complex contemporary context – tracking their origins, development, and processes of demystification and remythification". Losada therefore sets out to design a rigorous methodology for Cultural Myth Criticism that makes the process as inquisitive as possible. The aim is not only to analyse myth from a contemporary perspective, but also to analyse the contemporary world through myth – a bidirectional approach. Because myth is sometimes used in a pejorative sense, some scholars have opted for mythos instead. "Mythos" now more commonly refers to its Aristotelian sense as a "plot point" or to a body of interconnected myths or stories, especially those belonging to a particular religious or cultural tradition. It is sometimes used specifically for modern, fictional mythologies, such as the world building of H. P. Lovecraft. Mythopoeia (mytho- + -poeia, 'I make myth') was termed by J. R. R. Tolkien, amongst others, to refer to the "conscious generation" of mythology. It was notoriously also suggested, separately, by Nazi ideologist Alfred Rosenberg.[citation needed] Mythemes can be defined as themes whose transcendent or supernatural dimension allows them to interact with other mythemes to form a myth. For a myth to take shape, at least two mythemes must combine in some way. For example, the legend of Melusine from Melusine; or, The Noble History of Lusignan by Jean d'Arras, consists of four mythemes. Myths often share mythemes, but their distribution and position differ. If two stories have exactly the same configuration of mythemes, they are considered the same myth in different variants. The term should not be confused with the storyline of a myth. A storyline may include elements that are not essential to the central structure, whereas a theme is necessary to it. To illustrate this, José Manuel Losada gives the example of the myth of Oedipus: the plague may be regarded as a plot event, while incest is a mytheme – an essential component of the myth.
Prosopomyth is a neologism coined by José Manuel Losada which may be used interchangeably with the term mythological character. It designates "the character born as a myth in its essence". The term helps to explain why we use expressions such as "the Zeus myth" or "the Antigone myth" – the characters themselves have become myths in our minds. Prosopomyths (or mythological characters) may be divided into several groups. Myth narrates extraordinary events that occur in time, but outside the framework of modern historiography. These events unfold in a fictional realm that cannot be fixed within historical coordinates. This gap between myth and history often leads people to attribute mythic qualities to historical or literary figures. Hence the need to analyse the process of pseudo-mythification. As explained by José Manuel Losada, pseudo-mythification refers to a process in which a historical figure, a literary character, an animal, an object, or even an idea, is mistakenly perceived as having the sacred transcendence characteristic of myth. That means that we may "mythify" the phenomenon of Taylor Swift's fame or the character of James Bond. However, neither James Bond nor Taylor Swift is a myth — both are pseudomythified literary or historical figures. For historical figures, the process means that they are stripped of their historical references (de-historicised) — the actual facts about them become forgotten. Through oral stories and tales, people create "myths", for example a "myth" of Alexander the Great. In Cultural Myth Criticism, myth and religion are closely related but not interchangeable: both refer to a sacred transcendence, yet myth criticism approaches divinities and models of sanctity only as narrative figures within a symbolic discourse, without professing any canonical faith. From this perspective, calling a sacred narrative a 'myth' is not intended as a judgement about its historical truth, but as a descriptive, analytical label for its structure and function. Interpretations Comparative mythology is a systematic comparison of myths from different cultures. It seeks to discover underlying themes that are common to the myths of multiple cultures. In some cases, comparative mythologists use the similarities between separate mythologies to argue that those mythologies have a common source. This source may inspire myths or provide a common "protomythology" that diverged into the mythologies of each culture. A number of commentators have argued that myths function to form and shape society and social behaviour. Eliade argued that one of the foremost functions of myth is to establish models for behavior and that myths may provide a religious experience. By telling or reenacting myths, members of traditional societies detach themselves from the present, returning to the mythical age, thereby coming closer to the divine. Honko asserted that, in some cases, a society reenacts a myth in an attempt to reproduce the conditions of the mythical age. For example, it might reenact the healing performed by a god at the beginning of time in order to heal someone in the present. Similarly, Barthes argued that modern culture explores religious experience. Since it is not the job of science to define human morality, a religious experience is an attempt to connect with a perceived moral past, which is in contrast with the technological present. Pattanaik defines mythology as "the subjective truth of people communicated through stories, symbols and rituals." He says, "Facts are everybody's truth.
Fiction is nobody's truth. Myths are somebody's truth." One theory claims that myths are distorted accounts of historical events. According to this theory, storytellers repeatedly elaborate upon historical accounts until the figures in those accounts gain the status of gods. For example, the myth of the wind-god Aeolus may have evolved from a historical account of a king who taught his people to use sails and interpret the winds. Herodotus (fifth-century BCE) and Prodicus made claims of this kind. This theory is named euhemerism after mythologist Euhemerus (c. 320 BCE), who suggested that Greek gods developed from legends about humans. Some theories propose that myths began as allegories for natural phenomena: Apollo represents the sun, Poseidon represents water, and so on. According to another theory, myths began as allegories for philosophical or spiritual concepts: Athena represents wise judgment, Aphrodite romantic desire, and so on. Müller supported an allegorical theory of myth. He believed myths began as allegorical descriptions of nature and gradually came to be interpreted literally. For example, a poetic description of the sea as "raging" was eventually taken literally and the sea was then thought of as a raging god. Some thinkers claimed that myths result from the personification of objects and forces. According to these thinkers, the ancients worshiped natural phenomena, such as fire and air, gradually deifying them. For example, according to this theory, ancients tended to view things as gods, not as mere objects. Thus, they described natural events as acts of personal gods, giving rise to myths. According to the myth-ritual theory, myth is tied to ritual. In its most extreme form, this theory claims myths arose to explain rituals. This claim was first put forward by Smith, who argued that people begin performing rituals for reasons not related to myth. Forgetting the original reason for a ritual, they account for it by inventing a myth and claiming the ritual commemorates the events described in that myth. James George Frazer—author of The Golden Bough, a book on the comparative study of mythology and religion—argued that humans started out with a belief in magical rituals; later, they began to lose faith in magic and invented myths about gods, reinterpreting their rituals as religious rituals intended to appease the gods. Academic discipline history Historically, important approaches to the study of mythology have included those of Vico, Schelling, Schiller, Jung, Freud, Lévy-Bruhl, Lévi-Strauss, Frye, the Soviet school, and the Myth and Ritual School. The critical interpretation of myth began with the Presocratics. Euhemerus was one of the most important pre-modern mythologists. He interpreted myths as accounts of actual historical events, though distorted over many retellings. Sallustius divided myths into five categories: Plato condemned poetic myth when discussing education in the Republic. His critique was primarily on the grounds that the uneducated might take the stories of gods and heroes literally. Nevertheless, he constantly referred to myths throughout his writings. As Platonism developed in the phases commonly called Middle Platonism and neoplatonism, writers such as Plutarch, Porphyry, Proclus, Olympiodorus, and Damascius wrote explicitly about the symbolic interpretation of traditional and Orphic myths. Mythological themes were consciously employed in literature, beginning with Homer. 
The resulting work may expressly refer to a mythological background without itself becoming part of a body of myths (Cupid and Psyche). Medieval romance in particular plays with this process of turning myth into literature. Euhemerism, as stated earlier, refers to the rationalization of myths, putting themes formerly imbued with mythological qualities into pragmatic contexts. An example of this would be following a cultural or religious paradigm shift (notably the re-interpretation of pagan mythology following Christianization). Interest in polytheistic mythology revived during the Renaissance, with early works of mythography appearing in the sixteenth century, among them the Theologia Mythologica (1532). The first modern, Western scholarly theories of myth appeared during the second half of the 19th century—at the same time as "myth" was adopted as a scholarly term in European languages. They were driven partly by a new interest in Europe's ancient past and vernacular culture, associated with Romantic Nationalism and epitomised by the research of Jacob Grimm (1785–1863). This movement drew European scholars' attention not only to Classical myths, but also material now associated with Norse mythology, Finnish mythology, and so forth. Western theories were also partly driven by Europeans' efforts to comprehend and control the cultures, stories and religions they were encountering through colonialism. These encounters included both extremely old texts such as the Sanskrit Rigveda and the Sumerian Epic of Gilgamesh, and current oral narratives such as mythologies of the indigenous peoples of the Americas or stories told in traditional African religions. The intellectual context for nineteenth-century scholars was profoundly shaped by emerging ideas about evolution. These ideas included the recognition that many Eurasian languages—and therefore, conceivably, stories—were all descended from a lost common ancestor (the Indo-European language) which could rationally be reconstructed through the comparison of its descendant languages. They also included the idea that cultures might evolve in ways comparable to species. In general, 19th-century theories framed myth as a failed or obsolete mode of thought, often by interpreting myth as the primitive counterpart of modern science within a unilineal framework that imagined that human cultures are travelling, at different speeds, along a linear path of cultural development. One of the dominant mythological theories of the latter 19th century was nature mythology, the foremost exponents of which included Max Müller and Edward Burnett Tylor. This theory posited that "primitive man" was primarily concerned with the natural world. It tended to interpret myths that seemed distasteful to European Victorians—such as tales about sex, incest, or cannibalism—as metaphors for natural phenomena like agricultural fertility. Unable to conceive impersonal natural laws, early humans tried to explain natural phenomena by attributing souls to inanimate objects, thus giving rise to animism. According to Tylor, human thought evolved through stages, starting with mythological ideas and gradually progressing to scientific ideas. Müller also saw myth as originating from language, even calling myth a "disease of language". He speculated that myths arose due to the lack of abstract nouns and neuter gender in ancient languages. 
Anthropomorphic figures of speech, necessary in such languages, were eventually taken literally, leading to the idea that natural phenomena were in actuality conscious or divine. Not all scholars, not even all 19th-century scholars, accepted this view. Lucien Lévy-Bruhl claimed that "the primitive mentality is a condition of the human mind and not a stage in its historical development." Recent scholarship, noting the fundamental lack of evidence for "nature mythology" interpretations among people who actually circulated myths, has likewise abandoned the key ideas of "nature mythology". Frazer saw myths as a misinterpretation of magical rituals, which were themselves based on a mistaken idea of natural law. This idea was central to the "myth and ritual" school of thought. According to Frazer, humans begin with an unfounded belief in impersonal magical laws. When they realize applications of these laws do not work, they give up their belief in natural law in favor of a belief in personal gods controlling nature, thus giving rise to religious myths. Meanwhile, humans continue practicing formerly magical rituals through force of habit, reinterpreting them as reenactments of mythical events. Finally, humans come to realize nature follows natural laws, and they discover their true nature through science. Here again, science makes myth obsolete as humans progress "from magic through religion to science." Segal asserted that by pitting mythical thought against modern scientific thought, such theories imply modern humans must abandon myth. The earlier 20th century saw major work developing psychoanalytical approaches to interpreting myth, led by Sigmund Freud, who, drawing inspiration from Classical myth, began developing the concept of the Oedipus complex in his 1899 The Interpretation of Dreams. Jung likewise tried to understand the psychology behind world myths. Jung asserted that all humans share certain innate unconscious psychological forces, which he called archetypes. He believed similarities between the myths of different cultures reveal the existence of these universal archetypes. The mid-20th century saw the influential development of a structuralist theory of mythology, led by Lévi-Strauss. Lévi-Strauss argued that myths reflect patterns in the mind and interpreted those patterns more as fixed mental structures, specifically pairs of opposites (good/evil, compassionate/callous), rather than unconscious feelings or urges. Meanwhile, Bronislaw Malinowski developed analyses of myths focusing on their social functions in the real world. He is associated with the idea that myths such as origin stories might provide a "mythic charter"—a legitimisation—for cultural norms and social institutions. Thus, following the Structuralist Era (c. 1960s–1980s), the predominant anthropological and sociological approaches to myth increasingly treated myth as a form of narrative that can be studied, interpreted, and analyzed like ideology, history, and culture. In other words, myth is a form of understanding and telling stories that are connected to power, political structures, and political and economic interests.[citation needed] These approaches contrast with those of Joseph Campbell and Eliade, which hold that myth has some type of essential connection to ultimate sacred meanings that transcend cultural specifics. In particular, myth was studied in relation to history from diverse social sciences.
Most of these studies share the assumption that history and myth are not distinct in the sense that history is factual, real, accurate, and true, while myth is the opposite.[citation needed] In the 1950s, Barthes published a series of essays examining modern myths and the process of their creation in his book Mythologies, an early work in the emerging post-structuralist approach to mythology that recognised myths' existence in the modern world and in popular culture. The 20th century saw rapid secularization in Western culture. This made Western scholars more willing to analyse narratives in the Abrahamic religions as myths; theologians such as Rudolf Bultmann argued that a modern Christianity needed to demythologize; and other religious scholars embraced the idea that the mythical status of Abrahamic narratives was a legitimate feature of their importance. Thus, in his appendix to Myths, Dreams and Mysteries, and in The Myth of the Eternal Return, Eliade attributed modern humans' anxieties to their rejection of myths and the sense of the sacred.[citation needed] The Christian theologian Conrad Hyers wrote: [M]yth today has come to have negative connotations which are the complete opposite of its meaning in a religious context... In a religious context, myths are storied vehicles of supreme truth, the most basic and important truths of all. By them, people regulate and interpret their lives and find worth and purpose in their existence. Myths put one in touch with sacred realities, the fundamental sources of being, power, and truth. They are seen not only as being the opposite of error but also as being clearly distinguishable from stories told for entertainment and from the workaday, domestic, practical language of a people. They provide answers to the mysteries of being and becoming, mysteries which, as mysteries, are hidden, yet mysteries which are revealed through story and ritual. Myths deal not only with truth but with ultimate truth. Both in 19th-century research, which tended to see existing records of stories and folklore as imperfect fragments of partially lost myths, and in 20th-century structuralist work, which sought to identify underlying patterns and structures in often diverse versions of a given myth, there had been a tendency to synthesise sources to attempt to reconstruct what scholars supposed to be more perfect or underlying forms of myths. From the late 20th century, researchers influenced by postmodernism tended instead to argue that each account of a given myth has its own cultural significance and meaning, and argued that rather than representing degradation from a once more perfect form, myths are inherently plastic and variable. There is, consequently, no such thing as the 'original version' or 'original form' of a myth. One prominent example of this movement was A. K. Ramanujan's essay "Three Hundred Ramayanas". Correspondingly, scholars challenged the precedence that had once been given to texts as a medium for mythology, arguing that other media, such as the visual arts or even landscape and place-naming, could be as or more important. Myths are not texts, but narrative materials (Erzählstoffe) that can be adapted in various media (such as epics, hymns, handbooks, movies, dances, etc.). In contrast to other academic approaches, which primarily focus on the (social) function of myths, hylistic myth research aims to understand myths and their nature on their own terms.
As part of the Göttingen myth research, Annette and Christian Zgoll developed the method of hylistics (narrative material research) to extract mythical materials from their media and make possible a transmedial comparison. The content of the medium is broken down into the smallest possible plot components (hylemes), which are listed in standardized form (so-called hyleme analysis). Inconsistencies in content can indicate stratification, i.e. the overlapping of several materials, narrative variants and edition layers within the same medial concretion. To a certain extent, this can also be used to reconstruct earlier and alternative variants of the same material that were in competition and/or were combined with each other. The juxtaposition of hyleme sequences enables the systematic comparison of different variants of the same material or several different materials that are related or structurally similar to each other. In his overall presentation of the hundred-year history of myth research, the classical philologist and myth researcher Udo Reinhardt mentions Christian Zgoll's basic work Tractatus mythologicus as "the latest handbook on myth theory" with "outstanding significance" for modern myth research. Modernity Scholars in the field of cultural studies research how myth has worked itself into modern discourses. Mythological discourse can reach greater audiences than ever before via digital media. Various mythic elements appear in popular culture, as well as television, cinema and video games. Although myth was traditionally transmitted through the oral tradition on a small scale, the film industry has enabled filmmakers to transmit myths to large audiences via film. In Jungian psychology, myths are the expression of a culture or society's goals, fears, ambitions and dreams. The basis of modern visual storytelling is rooted in the mythological tradition. Many contemporary films rely on ancient myths to construct narratives. The Walt Disney Company is well known among cultural study scholars for "reinventing" traditional childhood myths. While few films are as obvious as Disney fairy tales, the plots of many films are based on the rough structure of myths. Mythological archetypes, such as the cautionary tale regarding the abuse of technology, battles between gods and creation stories, are often the subject of major film productions. These films are often created under the guise of cyberpunk action films, fantasy, dramas and apocalyptic tales. 21st-century films such as Clash of the Titans, Immortals and Thor continue the trend of using traditional mythology to frame modern plots. Authors use mythology as a basis for their books, such as Rick Riordan, whose Percy Jackson and the Olympians series is situated in a modern-day world where the Greek deities are manifest. Scholars, particularly those within the field of fan studies, and fans of popular culture have also noted a connection between fan fiction and myth. Ika Willis identified three models of this: fan fiction as a reclaiming of popular stories from corporations, myth as a means of critiquing or dismantling hegemonic power, and myth as "a commons of story and a universal story world". Willis supports the third model, a universal story world, and argues that fanfiction can be seen as mythic due to its hyperseriality—a term invented by Sarah Iles Johnston to describe a hyperconnected universe in which characters and stories are interwoven. 
In an interview for the New York Times, Henry Jenkins stated that fanfiction 'is a way of the culture repairing the damage done in a system where contemporary myths are owned by corporations instead of owned by the folk.' The impact of contemporary factors on myths is recognised by Cultural Myth Criticism. The approach not only examines myths within today's world, it also uses them to study contemporary culture. It identifies globalisation as a key modern factor influencing the study of myth. Since globalisation tends towards unification, it poses a challenge to myths, which are rooted in specific cultural circles. Although mythological exchange between cultures has long existed, it was comparatively limited in relation to globalisation. As myths are increasingly incorporated into audiovisual productions and therefore shared with vast numbers of users, their largely uncontrolled circulation across cultures may distort them and, in some cases, render them unrecognisable. Another modern factor highlighted by the approach is the doxa of relativism. This belief holds that truths are relative and dependent on one's background. According to José Manuel Losada, it challenges "the traditional and absolute frameworks of myth, which often postulate universal truths and principles that transcend individual and cultural variations". As a result, the stability of traditional mythic meanings may be endangered by modern philosophical currents that tend to recontextualise the narratives. A final modern factor is immanence. It complicates the interpretation of myths by filtering them through the material and emotional horizons of the present. The modern world often brackets or denies myth's inherent sacred transcendence, favouring immanent readings that reduce these narratives to what is empirically verifiable or psychologically explainable. See also Notes Sources External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/History_of_Iran] | [TOKENS: 23608] |
Contents History of Iran The history of Iran (also known as Persia) is intertwined with Greater Iran, which is a region encompassing all of the areas that have witnessed significant settlement or influence by the Iranian peoples and the Iranian languages – chiefly the Persians and the Persian language. Central to this region is the Iranian plateau, now covered by modern Iran. The most pronounced impact of Iranian history can be seen stretching from Anatolia in the west to the Indus Valley in the east, including the Levant, Mesopotamia, the Caucasus, and the majority of Central Asia. It also stands in connection with the histories of many other major civilizations, such as India, China, Greece, Rome, and Egypt. Tradition recorded in The Story of Civilization notes that the name Airyanem Vaejah was used about 12,000 years ago, and that Iran was originally known by this ancient designation, reflecting the deep antiquity of its cultural identity. Iran is home to one of the world's oldest continuous major civilizations, with historical and urban settlements dating back to the 5th millennium BC. The Iranian plateau's western regions were home to the Elamites (in Ilam and Khuzestan), the Kassites (in Kuhdesht), the Gutians (in Luristan), and later to other peoples like the Urartians (in Oshnavieh and Sardasht) near Lake Urmia and the Mannaeans (in Piranshahr, Saqqez and Bukan) in Kurdistan. German philosopher Georg Wilhelm Friedrich Hegel called the Persians the "first Historical People" in his Lectures on the Philosophy of World History. The sustained Iranian empire is understood to have begun with the rise of the Medes during the Iron Age, when Iran was unified as a nation under the Median kingdom in the 7th century BC. By 550 BC, the Medes were sidelined by the conquests of Cyrus the Great, who brought the Persians to power with the establishment of the Achaemenid Empire. Cyrus' ensuing campaigns enabled the Persian realm's expansion across most of West Asia and much of Central Asia, and his successors would eventually conquer parts of Southeast Europe and North Africa to preside over the largest empire the world had yet seen. In the 4th century BC, the Achaemenid Empire was conquered by the Macedonian Empire of Alexander the Great, whose death led to the establishment of the Seleucid Empire over the bulk of former Achaemenid territory. In the following century, Greek rule of the Iranian plateau came to an end with the rise of the Parthian Empire, which also conquered large parts of the Seleucids' Anatolian, Mesopotamian, and Central Asian holdings. While the Parthians were succeeded by the Sasanian Empire in the 3rd century, Iran remained a leading power for the next millennium, although the majority of this period was marked by the Roman–Persian Wars. In the 7th century, the Muslim conquest of Iran resulted in the Sasanian Empire's annexation by the Rashidun Caliphate and the beginning of the Islamization of Iran. In spite of invasions by foreign powers, such as the Greeks, Arabs, Turks, and Mongols, among others, the Iranian national identity was repeatedly asserted and preserved, allowing it to develop as a distinct political and cultural entity and massively influencing its invaders. While the early Muslim conquests had caused the decline of Zoroastrianism, which had been Iran's majority and official religion up to that point, the achievements of prior Iranian civilizations were absorbed into the nascent Islamic empires and expanded upon during the Islamic Golden Age. 
Nomadic Turkic tribes overran parts of the Iranian plateau during the Late Middle Ages and into the early modern period, negatively impacting the region. By 1501, however, the nation was reunified by the Safavid dynasty, which initiated Iranian history's most momentous religious change since the original Muslim conquest by converting Iran to Shia Islam. Iran again emerged as a leading world power, especially in rivalry with the Turkish-ruled Ottoman Empire. In the 19th century, Iran came into conflict with the Russian Empire, which annexed the South Caucasus by the end of the Russo-Persian Wars. The Safavid era (1501–1736) is becoming more recognized as an important time in Iran's history by scholars in both Iran and the West. In 1501, the Safavid dynasty became the first local dynasty to rule all of Iran since the Arabs overthrew the Sasanian Empire in the 7th century. For eight and a half centuries, Iran was mostly just a geographical area with no independent government, ruled by various foreign powers—Arabs, Turks, Mongols, and Tartars. The Mongol invasions in the 13th century were a turning point in Iran's history and in Islam. The Mongols destroyed the historical caliphate, which had been a symbol of unity for the Islamic world for 600 years. During the long foreign rule, Iranians kept their unique culture and national identity, and they used this chance to regain their political independence. In the 1940s, there were hopes that Iran could become a constitutional monarchy, but a 1953 coup aided by the U.S. and the U.K. removed the elected prime minister, and Iran was ruled as an autocracy under the Shah with American support from that time until the revolution. The Iranian monarchy lasted until the Islamic Revolution in 1979, when the country was officially declared an Islamic republic. Since then, it has experienced significant political, social, and economic changes. The establishment of an Islamic republic led to a major restructuring of the country's political system. Iran's foreign relations have been shaped by regional conflicts, beginning with the Iran–Iraq War and persisting in disputes with many Arab countries; ongoing tensions with Israel, the United States, and the Western world; and the Iranian nuclear program, which has been a point of contention in international diplomacy. Despite international sanctions and internal challenges, Iran remains a key player in regional and global geopolitics. Prehistory The earliest archaeological artifacts in Iran were found at the Kashafrud and Ganj Par sites, which are thought to date back 100,000 years to the Middle Paleolithic. Mousterian stone tools made by Neanderthals have also been found. There are more cultural remains of Neanderthals dating back to the Middle Paleolithic period, which mainly have been found in the Zagros region and fewer in central Iran at sites such as Kobeh, Kunji, Bisitun Cave, Tamtama, Warwasi, and Yafteh Cave. In 1949, a Neanderthal radius was discovered by Carleton S. Coon in Bisitun Cave. Evidence for the Upper Paleolithic and Epipaleolithic periods is known mainly from the Zagros Mountains in the caves of Kermanshah and Khorramabad and a few sites in Piranshahr, Alborz and Central Iran. During this time, people began creating rock art in Iran. Early agricultural communities such as Chogha Golan in the 11th millennium BC along with settlements such as Chogha Bonut (the earliest village in Elam) in the 9th millennium BC began to flourish in and around the Zagros Mountains. 
Around the same time, the earliest-known clay vessels and modelled human and animal terracotta figurines were produced at Ganj Dareh. There are 10,000-year-old human and animal figurines from Tepe Sarab in Kermanshah province among many other ancient artefacts. The south-western part of Iran was part of the Fertile Crescent where most of humanity's first major crops were grown, in villages such as Susa (where a settlement was first founded possibly as early as 4395 BC) and settlements such as Chogha Mish, dating back to 6800 BC; there are 7,000-year-old jars of wine excavated in the Zagros Mountains (now on display at the University of Pennsylvania) and ruins of 7,000-year-old settlements such as Tepe Sialk are further testament to that. The two main Neolithic Iranian settlements were Ganj Dareh and the hypothetical Zayandeh River Culture. The Kura–Araxes culture (circa 3400 BC—ca. 2000 BC) stretched from northwestern Iran up into the neighbouring regions of the Caucasus and Anatolia. Susa is one of the oldest-known settlements of Iran and the world. The general perception among archaeologists is that Susa was an extension of the Sumerian city-state of Uruk, hence incorporating many aspects of Mesopotamian culture.[page needed] In its later history, Susa became the capital of Elam, which emerged as a state founded around 4000 BC. There are also dozens of prehistoric sites across the Iranian plateau pointing to the existence of ancient cultures and urban settlements in the fourth millennium BC. One of the earliest civilizations on the Iranian plateau was the Jiroft culture in southeastern Iran in the province of Kerman. Iran is one of the most artefact-rich archaeological regions in the Middle East. Archaeological excavations in Jiroft led to the discovery of several objects belonging to the 4th millennium BC. There is a large quantity of objects decorated with highly distinctive engravings of animals, mythological figures, and architectural motifs. The objects and their iconography are considered unique. Many are made from chlorite, a grey-green soft stone; others are in copper, bronze, terracotta, and even lapis lazuli. Recent excavations at the sites have produced the world's earliest inscription, which pre-dates Mesopotamian inscriptions. There are records of numerous other ancient civilizations on the Iranian plateau before the emergence of Iranian peoples during the Early Iron Age. The Early Bronze Age saw the rise of urbanization into organized city-states and the invention of writing (the Uruk period) in the Near East. While Bronze Age Elam made use of writing from an early time, the Proto-Elamite script remains undeciphered, and records from Sumer pertaining to Elam are scarce. Russian historian Igor M. Diakonoff states that the modern inhabitants of Iran are descendants of mainly non-Indo-European groups, more specifically of pre-Iranic inhabitants of the Iranian Plateau.[a] Records become more tangible with the rise of the Neo-Assyrian Empire and its records of incursions from the Iranian plateau. As early as the 20th century BC, tribes came to the Iranian plateau from the Pontic–Caspian steppe. The arrival of Iranians on the Iranian plateau forced the Elamites to relinquish one area of their empire after another and to take refuge in Elam, Khuzestan and the nearby area, which only then became coterminous with Elam. Bahman Firuzmandi says that the southern Iranians might be intermixed with the Elamite peoples living on the plateau. 
By the mid-1st millennium BC, Medes, Persians, and Parthians populated the Iranian plateau. Until the rise of the Medes, they all remained under Assyrian domination, like the rest of the Near East. In the first half of the 1st millennium BC, parts of what is now Iranian Azerbaijan were incorporated into Urartu. Classical antiquity In 646 BC, Assyrian king Ashurbanipal sacked Susa, which ended Elamite supremacy in the region. For over 150 years Assyrian kings of nearby northern Mesopotamia had sought to conquer the Median tribes of western Iran. Under pressure from Assyria, the small kingdoms of the western Iranian plateau coalesced into increasingly larger and more centralized states. In the second half of the 7th century BC, the Medes gained their independence and were united by Deioces. In 612 BC, Cyaxares, Deioces' grandson, and the Babylonian king Nabopolassar invaded Assyria and laid siege to and eventually destroyed Nineveh, the Assyrian capital, which led to the fall of the Neo-Assyrian Empire. Urartu was later conquered and dissolved by the Medes as well. The Medes are credited with founding Iran as a nation and empire, and established the first Iranian empire, the largest of its day until Cyrus the Great established a unified empire of the Medes and Persians, leading to the Achaemenid Empire (c. 550–330 BC). Cyrus the Great overthrew, in turn, the Median, Lydian, and Neo-Babylonian empires, creating an empire far larger than Assyria. He was better able, through more benign policies, to reconcile his subjects to Persian rule; the longevity of his empire was one result. The Persian king, like the Assyrian, was also "King of Kings", xšāyaθiya xšāyaθiyānām (shāhanshāh in modern Persian) – "great king", Megas Basileus, as known by the Greeks. Cyrus's son, Cambyses II, conquered the last major power of the region, ancient Egypt, causing the collapse of the Twenty-sixth Dynasty of Egypt. Since he became ill and died before, or while, leaving Egypt, stories developed, as related by Herodotus, that he was struck down for impiety against the ancient Egyptian deities. After the death of Cambyses II, Darius ascended the throne by overthrowing the legitimate Achaemenid monarch Bardiya, and then quelling rebellions throughout his kingdom. As the victor, Darius based his claim on membership in a collateral line of the Achaemenid dynasty. Darius' first capital was at Susa, and he started the building program at Persepolis. He rebuilt a canal between the Nile and the Red Sea, a forerunner of the modern Suez Canal. He improved the extensive road system, and it is during his reign that mentions are first made of the Royal Road, a great highway stretching all the way from Susa to Sardis with posting stations at regular intervals. Major reforms took place under Darius. Coinage, in the form of the daric (gold coin) and the shekel (silver coin), was standardized (coinage had been invented over a century before in Lydia c. 660 BC but not standardized), and administrative efficiency increased. The Old Persian language appears in royal inscriptions, written in a specially adapted version of the cuneiform script. Under Cyrus the Great and Darius, the Persian Empire eventually became the largest empire in human history up until that point, ruling and administering over most of the known world, as well as spanning the continents of Europe, Asia, and Africa. The greatest achievement was the empire itself. 
The Persian Empire represented the world's first superpower that was based on a model of tolerance and respect for other cultures and religions. In the late 6th century BC, Darius launched his European campaign, in which he defeated the Paeonians, conquered Thrace, and subdued all coastal Greek cities, as well as defeating the European Scythians around the Danube river. In 512/511 BC, Macedon became a vassal kingdom of Persia. In 499 BC, Athens lent support to a revolt in Miletus, which resulted in the sacking of Sardis. This led to an Achaemenid campaign against mainland Greece known as the Greco-Persian Wars, which lasted the first half of the 5th century BC and is known as one of the most important wars in European history. In the First Persian invasion of Greece, the Persian general Mardonius re-subjugated Thrace and made Macedon a full part of Persia. The invasion eventually ended in defeat, however. Darius' successor Xerxes I launched the Second Persian invasion of Greece. At a crucial moment in the war, about half of mainland Greece was overrun by the Persians, including all territories to the north of the Isthmus of Corinth; however, this too ended in a Greek victory, following the battles of Plataea and Salamis, by which Persia lost its footholds in Europe and eventually withdrew from it. During the Greco-Persian wars, the Persians gained major territorial advantages. They captured and razed Athens twice, once in 480 BC and again in 479 BC. However, after a string of Greek victories the Persians were forced to withdraw, thus losing control of Macedonia, Thrace and Ionia. Fighting continued for several decades after the Greeks successfully repelled the second invasion, with numerous Greek city-states under Athens' newly formed Delian League, and eventually concluded with the Peace of Callias in 449 BC, ending the Greco-Persian Wars. In 404 BC, following the death of Darius II, Egypt rebelled under Amyrtaeus. Later pharaohs successfully resisted Persian attempts to reconquer Egypt until 343 BC, when Egypt was reconquered by Artaxerxes III. From 334 BC to 331 BC, Alexander the Great defeated Darius III in the battles of Granicus, Issus and Gaugamela, swiftly conquering the Achaemenid Empire by 331 BC. Alexander's empire broke up shortly after his death, and Alexander's general, Seleucus I Nicator, tried to take control of Iran, Mesopotamia, and later Syria and Anatolia. His empire was the Seleucid Empire. He was killed in 281 BC by Ptolemy Keraunos. The Parthian Empire—ruled by the Parthians, a group of northwestern Iranian people—was the realm of the Arsacid dynasty, which reunited and governed the Iranian plateau after the Parni conquest of Parthia and the defeat of the Seleucid Empire in the late 3rd century BC. It intermittently controlled Mesopotamia between c. 150 BC and 224 AD and absorbed Eastern Arabia. Parthia was the eastern arch-enemy of the Roman Empire, and it limited Rome's expansion beyond Cappadocia (central Anatolia). The Parthian armies included two types of cavalry: the heavily armed and armored cataphracts and the lightly armed but highly mobile mounted archers. For the Romans, who relied on heavy infantry, the Parthians were too hard to defeat, as both types of cavalry were much faster and more mobile than foot soldiers. The Parthian shot, used by the Parthian cavalry, was especially feared by Roman soldiers and proved pivotal in the crushing Roman defeat at the Battle of Carrhae. 
On the other hand, the Parthians found it difficult to occupy conquered areas as they were unskilled in siege warfare. Because of these weaknesses, neither the Romans nor the Parthians were able to annex each other's territory completely. The Parthian empire subsisted for five centuries, longer than most Eastern Empires. The end of this empire came at last in 224 AD, when the empire's organization had loosened and the last king was defeated by one of the empire's vassal peoples, the Persians under the Sasanians. However, the Arsacid dynasty continued to exist for centuries onwards in Armenia, Iberia, and Caucasian Albania, which were all ruled by eponymous branches of the dynasty. The first shah of the Sasanian Empire, Ardashir I, started reforming the country economically and militarily. For a period of more than 400 years, Iran was once again one of the leading powers in the world, alongside its neighbouring rivals, the Roman and later Byzantine Empires. The empire's territory, at its height, encompassed all of today's Iran, Iraq, Azerbaijan, Armenia, Georgia, Abkhazia, Dagestan, Lebanon, Jordan, Palestine, Israel, parts of Afghanistan, Turkey, Syria, parts of Pakistan, Central Asia, Eastern Arabia, and parts of Egypt. Most of the Sasanian Empire's lifespan was overshadowed by the frequent Byzantine–Sasanian wars, a continuation of the Roman–Parthian Wars and part of the all-encompassing Roman–Persian Wars, the longest-lasting conflict in human history. Begun in the first century BC by the Romans and the Sasanians' predecessors, the Parthians, the last Roman–Persian war was fought in the seventh century. The Persians defeated the Romans at the Battle of Edessa in 260 and took emperor Valerian prisoner for the remainder of his life. Eastern Arabia was conquered early on. During Khosrow II's rule in 590–628, Egypt, Jordan, Palestine and Lebanon were also annexed to the Empire. The Sassanians called their empire Erânshahr ("Dominion of the Aryans", i.e., of Iranians). A new chapter of Iran's history followed after roughly 600 years of conflict with the Roman Empire. During this time, the Sassanian and Romano-Byzantine armies clashed for influence in Anatolia, the western Caucasus (mainly Lazica and the Kingdom of Iberia; modern-day Georgia and Abkhazia), Mesopotamia, Armenia and the Levant. Under Justinian I, the war came to an uneasy peace with payment of tribute to the Sassanians. However, the Sasanians used the deposition of the Byzantine emperor Maurice as a casus belli to attack the Empire. After many gains, the Sassanians were defeated at Issus, Constantinople, and finally Nineveh, resulting in peace. With the conclusion of the Roman–Persian Wars, which had lasted over 700 years, through the climactic Byzantine–Sasanian War of 602–628, which included a siege of the Byzantine capital of Constantinople, the war-exhausted Persians lost the Battle of al-Qādisiyyah (636) in Hilla (present-day Iraq) to the invading Muslim forces. The Sasanian era, encompassing the length of Late Antiquity, is considered to be one of the most important and influential historical periods in Iran, and had a major impact on the world. In many ways, the Sassanian period witnessed the highest achievement of Persian civilization and constitutes the last great Iranian Empire before the adoption of Islam. 
Persia influenced Roman civilization considerably during Sassanian times,[incomplete short citation] their cultural influence extending far beyond the empire's territorial borders, reaching as far as Western Europe,[incomplete short citation] Africa, China and India and also playing a prominent role in the formation of both European and Asiatic medieval art. This influence carried forward to the Muslim world. The dynasty's unique and aristocratic culture transformed the Islamic conquest and destruction of Iran into a Persian Renaissance.[incomplete short citation] Much of what later became known as Islamic culture, architecture, writing, and other contributions to civilization was taken from the Sassanian Persians into the broader Muslim world. Medieval period In 633, immediately following a civil war and during the reign of the Sasanian king Yazdegerd III, the Muslims under Umar invaded Iran. Several Iranian nobles and families, such as king Dinar of the House of Karen and later the Kanarangiyans of Khorasan, mutinied against their Sasanian overlords. Although the House of Mihran had claimed the Sasanian throne under the two prominent generals Bahram Chobin and Shahrbaraz, it remained loyal to the Sasanians during its struggle against the Arabs, but the Mihrans were eventually betrayed and defeated by their own kinsmen, the House of Ispahbudhan, under their leader Farrukhzad, who had mutinied against Yazdegerd III. Yazdegerd III fled from one district to another until a local miller killed him for his purse at Merv in 651. By 674, Muslims had conquered Khorasan (which included Khorasan province and modern Afghanistan and parts of Transoxiana). The Muslim conquest of Persia ended the Sasanian Empire and led to the eventual decline of the Zoroastrian religion in Persia. Over time, the majority of Iranians converted to Islam. Most of the aspects of the previous Persian civilizations were not discarded but were absorbed by the new Islamic polity. As Bernard Lewis has commented: These events have been variously seen in Iran: by some as a blessing, the advent of the true faith, the end of the age of ignorance and heathenism; by others as a humiliating national defeat, the conquest and subjugation of the country by foreign invaders. Both perceptions are of course valid, depending on one's angle of vision. After the fall of the Sasanian Empire in 651, the Arabs of the Umayyad Caliphate adopted many Persian customs, especially the administrative and the court mannerisms. Arab provincial governors were undoubtedly either Persianized Arameans or ethnic Persians; certainly Persian remained the language of official business of the caliphate until the adoption of Arabic toward the end of the 7th century, when in 692 minting began at the capital Damascus. The Islamic coins evolved from imitations of Sasanian coins (as well as Byzantine), and the Pahlavi script on the coinage was replaced with the Arabic alphabet. During the Umayyad Caliphate, the Arab conquerors imposed Arabic as the primary language of the subject peoples throughout their empire. Al-Hajjaj ibn Yusuf, who was not happy with the prevalence of the Persian language in the divan, ordered the official language of the conquered lands to be replaced by Arabic, sometimes by force. 
In al-Biruni's The Remaining Signs of Past Centuries, for example, it is written: When Qutaibah bin Muslim under the command of Al-Hajjaj bin Yousef was sent to Khwarazmia with a military expedition and conquered it for the second time, he swiftly killed whoever wrote the Khwarazmian native language that knew of the Khwarazmian heritage, history, and culture. He then killed all their Zoroastrian priests and burned and wasted their books, until gradually the illiterate only remained, who knew nothing of writing, and hence their history was mostly forgotten. Several historians see the rule of the Umayyads as setting up the "dhimmah" system to increase taxes from the dhimmis, benefiting the Muslim Arab community financially and discouraging conversion. Governors lodged complaints with the caliph when he enacted laws that made conversion easier, depriving the provinces of revenues. In the 7th century, when many non-Arabs such as Persians entered Islam, they were recognized as mawali ("clients") and treated as second-class citizens by the ruling Arab elite until the end of the Umayyad Caliphate. During this era, Islam was initially associated with Arab ethnic identity and required formal association with an Arab tribe and the adoption of the client status of mawali. The half-hearted policies of the late Umayyads to tolerate non-Arab Muslims and Shias had failed to quell unrest among these minorities. However, not all of Iran was yet under Arab control; the region of Daylam was under the control of the Daylamites, while Tabaristan was under Dabuyid and Paduspanid control, and the Mount Damavand region was under the Masmughans of Damavand. The Arabs had invaded these regions several times but achieved no decisive result because of the inaccessible terrain. The most prominent ruler of the Dabuyids, known as Farrukhan the Great (r. 712–728), managed to hold his domains during his long struggle against the Arab general Yazid ibn al-Muhallab, who was defeated by a combined Dailamite-Dabuyid army and was forced to retreat from Tabaristan. With the death of the Umayyad Caliph Hisham ibn Abd al-Malik in 743, the Islamic world was launched into civil war. Abu Muslim was sent to Khorasan by the Abbasid Caliphate initially as a propagandist and then to revolt on their behalf. He took Merv, defeating the Umayyad governor Nasr ibn Sayyar, and became the de facto Abbasid governor of Khurasan. During the same period, the Dabuyid ruler Khurshid declared independence from the Umayyads but was shortly forced to recognize Abbasid authority. In 750, Abu Muslim became the leader of the Abbasid army and defeated the Umayyads at the Battle of the Zab. Abu Muslim stormed Damascus later that year. The Abbasid army consisted primarily of Khorasanians and was led by Abu Muslim. It contained both Iranian and Arab elements, and the Abbasids enjoyed both Iranian and Arab support. The Abbasids overthrew the Umayyads in 750. According to Amir Arjomand, the Abbasid Revolution essentially marked the end of the Arab empire and the beginning of a more inclusive, multi-ethnic state in the Middle East. One of the first changes the Abbasids made after taking power from the Umayyads was to move the empire's capital to Iraq. The latter region was influenced by Persian history and culture, and moving the capital was partly a response to the Persian mawali demand for a reduction of Arab influence in the empire. The city of Baghdad was constructed on the Tigris River, in 762, to serve as the Abbasid capital. 
The Abbasids established in their administration the position of vizier, held by families such as the Barmakids, which was the equivalent of a "vice-caliph", or second-in-command. Eventually, this change meant that many caliphs under the Abbasids ended up in a much more ceremonial role than ever before, with the vizier holding real power. A new Persian bureaucracy began to replace the old Arab aristocracy, and the entire administration reflected these changes, demonstrating that the new dynasty was different in many ways from the Umayyads. By the 9th century, Abbasid control began to wane as regional leaders sprang up in the far corners of the empire to challenge the central authority of the Abbasid caliphate. The Abbasid caliphs began enlisting mamluks, Turkic-speaking warriors who had been moving out of Central Asia into Transoxiana, as slave warriors as early as the 9th century. Shortly thereafter the real power of the Abbasid caliphs began to wane; eventually, they became religious figureheads while the warrior slaves ruled. The 9th century also saw a revolt by native Zoroastrians, known as the Khurramites, against oppressive Arab rule. The movement was led by the Persian freedom fighter Babak Khorramdin. Babak's Iranianizing rebellion,[b] from its base in Azerbaijan in northwestern Iran,[c] called for a return of the political glories of the Iranian[d] past. The Khorramdin rebellion of Babak spread to the western and central parts of Iran and lasted more than 20 years before it was defeated when Babak was betrayed by Afshin, a senior general of the Abbasid Caliphate. As the power of the Abbasid caliphs diminished, a series of dynasties rose in various parts of Iran, some with considerable influence and power. Among the most important of these overlapping dynasties were the Tahirids in Khorasan (821–873); the Saffarids in Sistan (861–1003, whose rule lasted as maliks of Sistan until 1537); and the Samanids (819–1005), originally at Bukhara. The Samanids eventually ruled an area from central Iran to Pakistan. By the early 10th century, the Abbasids almost lost control to the growing Persian faction known as the Buyid dynasty (934–1062). Since much of the Abbasid administration had been Persian anyway, the Buyids were quietly able to assume real power in Baghdad. The Buyids were defeated in the mid-11th century by the Seljuq Turks, who continued to exert influence over the Abbasids while publicly pledging allegiance to them. The balance of power in Baghdad remained as such – with the Abbasids in power in name only – until the Mongol invasion of 1258 sacked the city and definitively ended the Abbasid dynasty. During the Abbasid period the mawali were enfranchised, and the political conception shifted from that of a primarily Arab empire to one of a Muslim empire;[incomplete short citation] around 930 a requirement was enacted for all bureaucrats of the empire to be Muslim. Islamization was a long process by which Islam was gradually adopted by the majority population of Iran. Richard Bulliet's "conversion curve" indicates that only about 10% of Iran converted to Islam during the relatively Arab-centric Umayyad period. Beginning in the Abbasid period, with its mix of Persian as well as Arab rulers, the Muslim percentage of the population rose. 
As Persian Muslims consolidated their rule of the country, the Muslim population rose from approximately 40% in the mid-9th century to close to 90% by the end of the 11th century.[incomplete short citation] Seyyed Hossein Nasr suggests that the rapid increase in conversion was aided by the Persian nationality of the rulers. Although Persians adopted the religion of their conquerors, over the centuries they worked to protect and revive their distinctive language and culture, a process known as Persianization. Arabs and Turks participated in this attempt.[e][incomplete short citation] In the 9th and 10th centuries, non-Arab subjects of the Ummah created a movement called Shu'ubiyyah in response to the privileged status of Arabs. Most of those behind the movement were Persian, but references to Egyptians, Berbers and Aramaeans are attested. Citing as its basis Islamic notions of equality of races and nations, the movement was primarily concerned with preserving Persian culture and protecting Persian identity, though within a Muslim context. The Samanid dynasty led the revival of Persian culture, and the first important Persian poet after the arrival of Islam, Rudaki, was born during this era and was praised by Samanid kings. The Samanids also revived many ancient Persian festivals. Their successors, the Ghaznavids, who were of non-Iranian Turkic origin, also became instrumental in the revival of Persian culture. The culmination of the Persianization movement was the Shahnameh, the national epic of Iran, written almost entirely in Persian. This voluminous work reflects Iran's ancient history, its unique cultural values, its pre-Islamic Zoroastrian religion, and its sense of nationhood. According to Bernard Lewis: Iran was indeed Islamized, but it was not Arabized. Persians remained Persians. And after an interval of silence, Iran re-emerged as a separate, different and distinctive element within Islam, eventually adding a new element even to Islam itself. Culturally, politically, and most remarkable of all even religiously, the Iranian contribution to this new Islamic civilization is of immense importance. The work of Iranians can be seen in every field of cultural endeavour, including Arabic poetry, to which poets of Iranian origin composing their poems in Arabic made a very significant contribution. In a sense, Iranian Islam is a second advent of Islam itself, a new Islam sometimes referred to as Islam-i Ajam. It was this Persian Islam, rather than the original Arab Islam, that was brought to new areas and new peoples: to the Turks, first in Central Asia and then in the Middle East in the country which came to be called Turkey, and of course to India. The Ottoman Turks brought a form of Iranian civilization to the walls of Vienna... The Islamization of Iran was to yield deep transformations within the cultural, scientific, and political structure of Iran's society: the blossoming of Persian literature, philosophy, medicine and art became major elements of the newly forming Muslim civilization. Inheriting thousands of years of civilization and sitting at the "crossroads of the major cultural highways", Persia contributed to what culminated in the "Islamic Golden Age". During this period, hundreds of scholars and scientists vastly contributed to technology, science and medicine, later influencing the rise of European science during the Renaissance. 
The most important scholars of almost all of the Islamic sects and schools of thought were Persian or lived in Iran, including the most notable and reliable Hadith collectors of Shia and Sunni like Shaikh Saduq, Shaikh Kulainy, Hakim al-Nishaburi, Imam Muslim and Imam Bukhari, the greatest theologians of Shia and Sunni like Shaykh Tusi, Imam Ghazali, Imam Fakhr al-Razi and Al-Zamakhshari, the greatest physicians, astronomers, logicians, mathematicians, metaphysicians, philosophers and scientists like Avicenna and Nasīr al-Dīn al-Tūsī, and the greatest shaykhs of Sufism like Rumi and Abdul-Qadir Gilani. In 977, a Turkic governor of the Samanids, Sabuktigin, conquered Ghazna (in present-day Afghanistan) and established a dynasty, the Ghaznavids, that lasted to 1186. The Ghaznavid empire grew by taking all of the Samanid territories south of the Amu Darya in the last decade of the 10th century, and eventually occupied parts of Eastern Iran, Afghanistan, Pakistan and north-west India. The Ghaznavids are generally credited with launching Islam into a mainly Hindu India. The invasion of India was undertaken in 1000 by the Ghaznavid ruler Mahmud and continued for several years. They were unable to hold power for long, however, particularly after the death of Mahmud in 1030. By 1040 the Seljuqs had taken over the Ghaznavid lands in Iran. The Seljuqs, who like the Ghaznavids were Persianate in nature and of Turkic origin, slowly conquered Iran over the course of the 11th century. The dynasty had its origins in the Turcoman tribal confederations of Central Asia and marked the beginning of Turkic power in the Middle East. They established a Sunni Muslim rule over parts of Central Asia and the Middle East from the 11th to 14th centuries. They established the Seljuq Empire, which stretched from Anatolia in the west to western Afghanistan in the east and the western borders of modern-day China in the north-east, and which was the target of the First Crusade. Today they are regarded as the cultural ancestors of the Western Turks, the present-day inhabitants of Turkey, Azerbaijan and Turkmenistan, and they are remembered as great patrons of Persian culture, art, literature, and language.[e] The founder of the dynasty, Tughril Beg, turned his army against the Ghaznavids in Khorasan. He moved south and then west, conquering but not wasting the cities in his path. In 1055 the caliph in Baghdad gave Tughril Beg robes, gifts, and the title King of the East. Under the later sultan Malik Shah (1072–1092), Iran enjoyed a cultural and scientific renaissance, largely attributed to his brilliant Iranian vizier, Nizam al-Mulk. These leaders established the observatory where Omar Khayyám did much of his experimentation for a new calendar, and they built religious schools in all the major towns. They brought Abu Hamid Ghazali, one of the greatest Islamic theologians, and other eminent scholars to the Seljuq capital at Baghdad and encouraged and supported their work. When Malik Shah I died in 1092, the empire split as his brother and four sons quarreled over the apportioning of the empire among themselves. In Anatolia, Malik Shah I was succeeded by Kilij Arslan I, who founded the Sultanate of Rûm, and in Syria by his brother Tutush I. In Persia he was succeeded by his son Mahmud I, whose reign was contested by his other three brothers: Barkiyaruq in Iraq, Muhammad I in Baghdad, and Ahmad Sanjar in Khorasan. 
As Seljuq power in Iran weakened, other dynasties began to step up in its place, including a resurgent Abbasid caliphate and the Khwarezmshahs. The Khwarezmid Empire was a Sunni Muslim Persianate dynasty, of East Turkic origin, that ruled in Central Asia. Originally vassals of the Seljuqs, they took advantage of the decline of the Seljuqs to expand into Iran. In 1194 the Khwarezmshah Ala ad-Din Tekish defeated the Seljuq sultan Toghrul III in battle and the Seljuq empire in Iran collapsed. Of the former Seljuq Empire, only the Sultanate of Rum in Anatolia remained. A serious internal threat to the Seljuqs during their reign came from the Nizari Ismailis, a secret sect with headquarters at Alamut Castle between Rasht and Tehran. They controlled the immediate area for more than 150 years and sporadically sent out adherents to strengthen their rule by murdering important officials. Several of the various theories on the etymology of the word assassin derive from these killers. Parts of northwestern Iran were conquered in the early 13th century AD by the Kingdom of Georgia, led by Tamar the Great. The Khwarazmian dynasty only lasted for a few decades, until the arrival of the Mongols. Genghis Khan had unified the Mongols, and under him the Mongol Empire quickly expanded in several directions. In 1218, it bordered Khwarezm. At that time, the Khwarazmian Empire was ruled by Ala ad-Din Muhammad (1200–1220). Muhammad, like Genghis, was intent on expanding his lands and had gained the submission of most of Iran. He declared himself shah and demanded formal recognition from the Abbasid caliph Al-Nasir. When the caliph rejected his claim, Ala ad-Din Muhammad proclaimed one of his nobles caliph and unsuccessfully tried to depose an-Nasir. The Mongol invasion of Iran began in 1219, after two diplomatic missions to Khwarezm sent by Genghis Khan had been massacred. During 1220–21 Bukhara, Samarkand, Herat, Tus and Nishapur were razed, and the populations were slaughtered. The Khwarezm-Shah fled, to die on an island off the Caspian coast. During the invasion of Transoxiana in 1219, along with the main Mongol force, Genghis Khan used a Chinese specialist catapult unit in battle; they were used again in 1220 in Transoxania. The Chinese may have used the catapults to hurl gunpowder bombs, since they already had them by this time. While Genghis Khan was conquering Transoxania and Persia, several Chinese who were familiar with gunpowder were serving in Genghis's army. "Whole regiments" entirely made out of Chinese were used by the Mongols to command bomb hurling trebuchets during the invasion of Iran. Historians have suggested that the Mongol invasion had brought Chinese gunpowder weapons to Central Asia. One of these was the huochong, a Chinese mortar. Books written around the area afterward depicted gunpowder weapons which resembled those of China. Before his death in 1227, Genghis had reached western Azerbaijan, pillaging and burning many cities along the way after entering into Iran from its north east. The Mongol invasion was by and large disastrous to the Iranians. Although the Mongol invaders eventually converted to Islam and accepted the culture of Iran, the Mongol destruction in Iran and other regions of the Islamic heartland (particularly the historical Khorasan region, mainly in Central Asia) marked a major change of direction for the region. 
Much of the six centuries of Islamic scholarship, culture, and infrastructure was destroyed as the invaders leveled cities, burned libraries, and in some cases replaced mosques with Buddhist temples. The Mongols killed many Iranian civilians. Destruction of the qanat irrigation systems in the north-east of Iran broke up the pattern of relatively continuous settlement, leaving many abandoned towns in areas that had previously been well served by irrigation and agriculture.[page needed] In 1221, Genghis Khan destroyed the city of Gurganj. Most if not all of the ancient Iranic Khwarazmian people were killed or pushed out, paving the way for the Turkification of Khwarazm. After Genghis's death, Iran was ruled by several Mongol commanders. Genghis' grandson, Hulagu Khan, was tasked with the westward expansion of Mongol dominion. However, by the time he ascended to power, the Mongol Empire had already dissolved, dividing into different factions. Arriving with an army, he established himself in the region and founded the Ilkhanate, a breakaway state of the Mongol Empire, which would rule Iran for the next 80 years and become Persian in the process. Hulagu Khan seized Baghdad in 1258 and put the last Abbasid caliph to death. The westward advance of his forces was stopped by the Mamelukes, however, at the Battle of Ain Jalut in Palestine in 1260. Hulagu's campaigns against the Muslims also enraged Berke, khan of the Golden Horde and a convert to Islam. Hulagu and Berke fought against each other, demonstrating the weakening unity of the Mongol empire. The rule of Hulagu's great-grandson, Ghazan (1295–1304), saw the establishment of Islam as the state religion of the Ilkhanate. Ghazan and his famous Iranian vizier, Rashid al-Din, brought Iran a partial and brief economic revival. The Mongols lowered taxes for artisans, encouraged agriculture, rebuilt and extended irrigation works, and improved the safety of the trade routes. As a result, commerce increased dramatically. Items from India, China, and Iran passed easily across the Asian steppes, and these contacts culturally enriched Iran. For example, Iranians developed a style of painting based on a unique fusion of solid, two-dimensional Mesopotamian painting with the feathery, light brush strokes and other motifs characteristic of China. After Ghazan's nephew Abu Said died in 1335, the Ilkhanate lapsed into civil war and was divided between several petty dynasties – most prominently the Jalayirids, Muzaffarids, Sarbadars and Kartids. The mid-14th-century Black Death killed about 30% of the country's population. Prior to the rise of the Safavid Empire, Sunni Islam was the dominant religion, accounting for around 90% of the population at the time. According to Mortaza Motahhari, the majority of Iranian scholars and masses remained Sunni until the time of the Safavids. The domination of Sunnis did not mean the Shia were rootless in Iran. The writers of the Four Books of Shia Islam were Iranian, as were many other great Shia scholars. The domination of the Sunni creed during the first nine Islamic centuries characterized the religious history of Iran during this period. There were, however, some exceptions to this general domination, which emerged in the form of the Zaydīs of Tabaristan (see Alid dynasties of northern Iran), the Buyids, the Kakuyids, the rule of Sultan Muhammad Khudabandah (r. Shawwal 703-Shawwal 716/1304–1316) and the Sarbedaran. 
Apart from this domination there existed, firstly, throughout these nine centuries, Shia inclinations among many Sunnis of this land and, secondly, a prevalence of original Imami Shiism as well as Zaydī Shiism in some parts of Iran. During this period, Shia in Iran were nourished from Kufah, Baghdad and later from Najaf and Hillah. Shiism was the dominant sect in Tabaristan, Qom, Kashan, Avaj and Sabzevar. In many other areas, mixed populations of Shia and Sunni lived together.[citation needed] During the 10th and 11th centuries, the Fatimids sent Ismaili da'is (missionaries) to Iran as well as other Muslim lands. When the Ismailis divided into two sects, the Nizaris established their base in Iran. Hassan-i Sabbah conquered fortresses and captured Alamut in 1090 AD. The Nizaris used this fortress until the Mongol raid of 1256.[citation needed] After the Mongol raid and the fall of the Abbasids, Sunni hierarchies faltered. Not only did they lose the caliphate but also the status of official madhhab. Their loss was the gain of the Shia, whose centre was not in Iran at that time. Several local Shia dynasties like the Sarbadars were established during this time.[citation needed] The main change occurred at the beginning of the 16th century, when Ismail I founded the Safavid dynasty and initiated a religious policy to recognize Shi'a Islam as the official religion of the Safavid Empire, and the fact that modern Iran remains an officially Shi'ite state is a direct result of Ismail's actions.[citation needed] Iran remained divided until the arrival of Timur, the Turco-Mongol founder of the Timurid dynasty. Like its predecessors, the Timurid Empire was also part of the Persianate world. After establishing a power base in Transoxiana, Timur invaded Iran in 1381 and eventually conquered most of it. Timur's campaigns were known for their brutality; many people were slaughtered and several cities were destroyed. His regime was characterized by tyranny and bloodshed, but also by its inclusion of Iranians in administrative roles and its promotion of architecture and poetry. His successors, the Timurids, maintained a hold on most of Iran until 1452, when they lost the bulk of it to the Black Sheep Turkmen. The Black Sheep Turkmen were conquered by the White Sheep Turkmen under Uzun Hasan in 1468; Uzun Hasan and his successors were the masters of Iran until the rise of the Safavids. The Sufi poet Hafez's popularity became firmly established in the Timurid era, which saw the compilation and widespread copying of his divan. Sufis were often persecuted by orthodox Muslims, who considered their teachings blasphemous. Sufism developed a symbolic language rich with metaphors to obscure poetic references to provocative philosophical teachings. Hafez concealed his own Sufi faith, even as he employed the secret language of Sufism (developed over hundreds of years) in his own work, and he is sometimes credited with having "brought it to perfection". His work was imitated by Jami, whose own popularity grew to spread across the full breadth of the Persianate world. The Kara Koyunlu were a Turkmen[f] tribal federation that ruled over northwestern Iran and surrounding areas from 1374 to 1468. The Kara Koyunlu expanded their conquests to Baghdad; however, internal fighting, defeats by the Timurids, rebellions by the Armenians in response to their persecution, and failed struggles with the Aq Qoyunlu led to their eventual demise. 
The Aq Qoyunlu were a Turkmen tribal federation of Sunni Muslims, led by the Bayandur tribe, who ruled over most of Iran and large parts of surrounding areas from 1378 to 1501 CE. The Aq Qoyunlu emerged when Timur granted them all of Diyar Bakr, in present-day Turkey. Afterward, they struggled with their rival Oghuz Turks, the Qara Qoyunlu. While the Aq Qoyunlu were successful in defeating the Kara Koyunlu, their struggle with the emerging Safavid dynasty led to their downfall. Early modern period Persia underwent a revival under the Safavid dynasty (1501–1736), the most prominent figure of which was Shah Abbas I. Some historians credit the Safavid dynasty with founding the modern nation-state of Iran. Iran's contemporary Shia character and significant segments of Iran's current borders take their origin from this era (e.g. the Treaty of Zuhab). The Safavid dynasty was one of the most significant ruling dynasties of Iran and "is often considered the beginning of modern Persian history". They ruled one of the greatest Iranian empires after the Muslim conquest of Persia and established the Twelver school of Shi'a Islam as the official religion of their empire, marking one of the most important turning points in Muslim history. The Safavids ruled from 1501 to 1722 (experiencing a brief restoration from 1729 to 1736) and at their height, they controlled all of modern Iran, Azerbaijan and Armenia, most of Georgia, the North Caucasus, Iraq, Kuwait and Afghanistan, as well as parts of Turkey, Syria, Pakistan, Turkmenistan and Uzbekistan. Safavid Iran was one of the Islamic "gunpowder empires", along with its neighbours, its archrival and principal enemy the Ottoman Empire, and to the east, the Mughal Empire. The Safavid ruling dynasty was founded by Ismāil, who styled himself Shāh Ismāil I. Practically worshipped by his Qizilbāsh followers, Ismāil invaded Shirvan to avenge the death of his father, Shaykh Haydar, who had been killed during his siege of Derbent, in Dagestan. Afterwards he went on a campaign of conquest, and following the capture of Tabriz in July 1501, he enthroned himself as the Shāh of Iran, minted coins in this name, and proclaimed Shi'ism the official religion of his domain. Although initially the masters of Azerbaijan and southern Dagestan only, the Safavids had, in fact, won the struggle for power in Iran which had been going on for nearly a century between various dynasties and political forces following the fragmentation of the Kara Koyunlu and the Aq Qoyunlu. A year after his victory in Tabriz, Ismāil proclaimed most of Iran as his domain, and quickly conquered and unified Iran under his rule. Soon afterwards, the new Safavid Empire rapidly conquered regions, nations, and peoples in all directions, including Armenia, Azerbaijan, parts of Georgia, Mesopotamia (Iraq), Kuwait, Syria, Dagestan, large parts of what is now Afghanistan, parts of Turkmenistan, and large chunks of Anatolia, laying the foundation of its multi-ethnic character which would heavily influence the empire itself (most notably the Caucasus and its peoples). Tahmasp I, the son and successor of Ismail I, carried out multiple invasions in the Caucasus, which had been incorporated into the Safavid empire since Shah Ismail I and would remain so for many centuries afterwards, and began the trend of deporting and moving hundreds of thousands of Circassians, Georgians, and Armenians to Iran's heartlands. 
Although these Caucasians were initially placed only in the royal harems, the royal guards, and a few other minor sections of the Empire, Tahmasp believed he could eventually reduce the power of the Qizilbash by creating and fully integrating a new layer into Iranian society. As the Encyclopædia Iranica states, for Tahmasp the problem centred on the military tribal elite of the empire, the Qizilbash, who believed that physical proximity to and control of a member of the immediate Safavid family guaranteed spiritual advantages, political fortune, and material advancement. With this new Caucasian layer in Iranian society, the undisputed might of the Qizilbash (who functioned much like the ghazis of the neighbouring Ottoman Empire) would be questioned and fully diminished as society became fully meritocratic. Shah Abbas I and his successors would significantly expand this policy and plan initiated by Tahmasp; during Abbas's reign alone, around 200,000 Georgians, 300,000 Armenians and 100,000–150,000 Circassians were deported to Iran, completing the foundation of a new layer in Iranian society. With this, and the complete systematic disorganisation of the Qizilbash by his personal orders, he eventually fully succeeded in replacing the power of the Qizilbash with that of the Caucasian ghulams. These new Caucasian elements (the so-called ghilman / غِلْمَان / "servants"), almost always converted to Shi'ism depending on their given function, would be, unlike the Qizilbash, fully loyal only to the Shah. The other masses of Caucasians were deployed in all the other functions and positions available in the empire, as well as in the harem and the regular military, and as craftsmen, farmers, and so on. This system of mass use of Caucasian subjects remained in place until the fall of the Qajar dynasty. The greatest of the Safavid monarchs, Shah Abbas I the Great (1587–1629), came to power in 1587, aged 16. Abbas I first fought the Uzbeks, recapturing Herat and Mashhad in 1598, which had been lost by his predecessor Mohammad Khodabanda during the Ottoman–Safavid War (1578–1590). Then he turned against the Ottomans, the archrivals of the Safavids, recapturing Baghdad, eastern Iraq, the Caucasian provinces, and beyond by 1618. Between 1616 and 1618, following the disobedience of his most loyal Georgian subjects Teimuraz I and Luarsab II, Abbas carried out a punitive campaign in his territories of Georgia, devastating Kakheti and Tbilisi and carrying away 130,000–200,000 Georgian captives towards mainland Iran. His new army, which had been dramatically improved with the arrival of Robert Shirley and his brothers following the first diplomatic mission to Europe, won the first crushing victory over the Safavids' archrivals, the Ottomans, in the Ottoman–Safavid War of 1603–1618, and would come to surpass the Ottomans in military strength. He also used his new force to dislodge the Portuguese from Bahrain (1602) and, with the aid of the English navy, from Hormuz (1622) in the Persian Gulf. He expanded commercial links with the Dutch East India Company and established firm links with the European royal houses, contacts which had been initiated earlier by Ismail I through the Habsburg–Persian alliance. Thus Abbas I was able to break the dependence on the Qizilbash for military might and therefore was able to centralize control. The Safavid dynasty had already established itself under Shah Ismail I, but under Abbas I it truly became a major power in the world along with its archrival the Ottoman Empire, with which it was now able to compete on an equal footing.
The Safavid era also saw the beginnings of the promotion of tourism in Iran. Under Safavid rule Persian architecture flourished again and saw many new monuments in various Iranian cities, of which Isfahan is the most notable example. Except for Shah Abbas the Great, Shah Ismail I, Shah Tahmasp I, and Shah Abbas II, many of the Safavid rulers were ineffectual, often being more interested in their women, alcohol and other leisure activities. The end of Abbas II's reign in 1666 marked the beginning of the end of the Safavid dynasty. Despite falling revenues and military threats, many of the later shahs had lavish lifestyles. Shah Soltan Hoseyn (1694–1722) in particular was known for his love of wine and lack of interest in governance. The declining country was repeatedly raided on its frontiers. Finally, a Ghilzai Pashtun chieftain named Mir Wais Khan began a rebellion in Kandahar and defeated the Safavid army under Gurgin Khan, the Georgian governor of the region on behalf of Iran. In 1722, Peter the Great of neighbouring Imperial Russia launched the Russo-Persian War (1722–1723), capturing many of Iran's Caucasian territories, including Derbent, Shaki and Baku, as well as Gilan, Mazandaran and Astrabad. In the midst of this chaos, in the same year of 1722, an Afghan army led by Mir Wais's son Mahmud marched across eastern Iran and besieged and took Isfahan. Mahmud proclaimed himself 'Shah' of Persia. Meanwhile, Persia's imperial rivals, the Ottomans and the Russians, took advantage of the chaos in the country to seize more territory for themselves. With these events, the Safavid dynasty had effectively ended. In 1724, in accordance with the Treaty of Constantinople, the Ottomans and the Russians agreed to divide between themselves the large portions of Iran that they had conquered. Iran's territorial integrity was restored by a native Iranian Turkic Afshar warlord from Khorasan, Nader Shah. He defeated and banished the Afghans, defeated the Ottomans, reinstalled the Safavids on the throne, and negotiated the Russian withdrawal from Iran's Caucasian territories through the Treaty of Resht and the Treaty of Ganja. By 1736, Nader had become so powerful he was able to depose the Safavids and have himself crowned shah. Nader was one of the last great conquerors of Asia and briefly presided over what was probably the most powerful military force in the world. To finance his wars against Iran's arch-rival, the Ottoman Empire, he set his sights on the weak but rich Mughal Empire to the east. In 1739, accompanied by his loyal Caucasian subjects, including Erekle II, he invaded Mughal India, defeated a numerically superior Mughal army in less than three hours, and completely sacked and looted Delhi, bringing back immense wealth to Iran. On his way back, he also conquered all the Uzbek khanates – except for Kokand – and made the Uzbeks his vassals. He also firmly re-established Iranian rule over the entire Caucasus and Bahrain, as well as large parts of Anatolia and Mesopotamia. Undefeated for years, Nader suffered a defeat in Dagestan which, coming after guerrilla rebellions by the Lezgins and the assassination attempt on him near Mazandaran, is often considered the turning point in his impressive career. To his frustration, the Dagestanis resorted to guerrilla warfare, and Nader with his conventional army could make little headway against them.
At the Battle of Andalal and the Battle of Avaria, Nader's army was crushingly defeated and he lost half of his entire force, forcing him to flee to the mountains.[better source needed] Though Nader managed to take most of Dagestan during his campaign, the effective guerrilla warfare waged by the Lezgins, as well as the Avars and Laks, made the Iranian re-conquest of this particular North Caucasian region a short-lived one; several years later, Nader was forced to withdraw. Around the same time, an assassination attempt was made on him near Mazandaran, which accelerated his descent into paranoia and megalomania. He blinded his sons, whom he suspected of the assassination attempt, and showed increasing cruelty towards his subjects and officers. In his later years, this eventually provoked multiple revolts and, ultimately, his assassination in 1747. Nader Shah's death was followed by a period of anarchy as rival army commanders fought for power. Nader's own family, the Afsharids, were soon reduced to holding on to a small domain in Khorasan. Many of the Caucasian territories broke away into various Caucasian khanates. The Ottomans regained lost territories in Anatolia and Mesopotamia. Oman and the Uzbek khanates of Bukhara and Khiva regained independence. Ahmad Shah Durrani, one of Nader's officers, founded an independent state which eventually became modern Afghanistan. Erekle II and Teimuraz II, who in 1744 had been made the kings of Kakheti and Kartli respectively by Nader for their loyal service, capitalized on the eruption of instability and declared de facto independence. Erekle II assumed control over Kartli after Teimuraz II's death, thus unifying the two as the Kingdom of Kartli-Kakheti and becoming the first Georgian ruler in three centuries to preside over a politically unified eastern Georgia. Due to the frantic turn of events in mainland Iran, he was able to remain de facto autonomous through the Zand era. From his capital Shiraz, Karim Khan of the Zand dynasty ruled "an island of relative calm and peace in an otherwise bloody and destructive period"; however, the extent of Zand power was confined to contemporary Iran and parts of the Caucasus. Karim Khan's death in 1779 led to yet another civil war, in which the Qajar dynasty eventually triumphed and took the throne of Iran. During the civil war, Iran permanently lost Basra, which had been captured during the Ottoman–Persian War (1775–1776), back to the Ottomans in 1779, and lost Bahrain to the House of Khalifa after the Bani Utbah invasion in 1783.[citation needed] Late modern period Agha Mohammad Khan emerged victorious out of the civil war that commenced with the death of the last Zand king. His reign is noted for the reemergence of a centrally led and united Iran. After the death of Nader Shah and the last of the Zands, most of Iran's Caucasian territories had broken away into various Caucasian khanates. Agha Mohammad Khan, like the Safavid kings and Nader Shah before him, viewed the region as no different from the territories in mainland Iran. Therefore, his first objective after having secured mainland Iran was to reincorporate the Caucasus region into Iran. Georgia was seen as one of the most integral territories. For Agha Mohammad Khan, the resubjugation and reintegration of Georgia into the Iranian Empire was part of the same process that had brought Shiraz, Isfahan, and Tabriz under his rule.
As the Cambridge History of Iran states, Georgia's permanent secession was inconceivable and had to be resisted in the same way as one would resist an attempt at the separation of Fars or Gilan. It was therefore natural for Agha Mohammad Khan to use whatever means necessary in the Caucasus in order to subdue and reincorporate the regions recently lost following Nader Shah's death and the demise of the Zands, including putting down what in Iranian eyes was seen as treason on the part of Erekle II. Agha Mohammad Khan subsequently demanded that Erekle renounce his 1783 treaty with Russia and submit again to Iranian suzerainty, in return for peace and the security of his kingdom. The Ottomans, Iran's neighboring rival, recognized the latter's rights over Kartli and Kakheti for the first time in four centuries. Heraclius then appealed to his theoretical protector, Empress Catherine II of Russia, pleading for at least 3,000 Russian troops, but he was ignored, leaving Georgia to fend off the Persian threat alone. Nevertheless, Heraclius II still rejected the Khan's ultimatum. In response, Agha Mohammad Khan crossed the Aras river and invaded the Caucasus region; while on his way to Georgia, he re-subjugated Iran's territories of the Erivan, Shirvan, Nakhchivan, Ganja, Derbent, Baku, Talysh, Shaki and Karabakh khanates, which comprise modern-day Armenia, Azerbaijan, Dagestan, and Igdir. Having reached Georgia with his large army, he prevailed in the Battle of Krtsanisi, which resulted in the capture and sack of Tbilisi, as well as the effective resubjugation of Georgia.[g] Upon his return from the successful campaign in Tbilisi, in effective control over Georgia and with some 15,000 Georgian captives who were moved back to mainland Iran, Agha Mohammad was formally crowned Shah in 1796 in the Mughan plain, just as his predecessor Nader Shah had been about sixty years earlier. Agha Mohammad Shah was assassinated in 1797 in Shusha (now part of the Republic of Azerbaijan) while preparing a second expedition against Georgia and its King Heraclius II. The reassertion of Iranian hegemony over Georgia did not last long; in 1799 the Russians marched into Tbilisi. The Russians had already been actively pursuing an expansionist policy towards the neighbouring empires to their south, namely the Ottoman Empire and the successive Iranian kingdoms, since the late 17th and early 18th centuries. The next two years following Russia's entrance into Tbilisi were a time of confusion, and the weakened and devastated Georgian kingdom, with its capital half in ruins, was easily absorbed by Russia in 1801. As Iran could not permit or allow the cession of Transcaucasia and Dagestan, which had been an integral part of Iran for centuries, this led directly to the wars of several years later, namely the Russo-Persian Wars of 1804–1813 and 1826–1828. The outcome of these two wars (the Treaty of Gulistan and the Treaty of Turkmenchay, respectively) was the irrevocable forced cession and loss of what is now eastern Georgia, Dagestan, Armenia, and Azerbaijan to Imperial Russia. The areas to the north of the river Aras, including the territory of the contemporary Republic of Azerbaijan, eastern Georgia, Dagestan, and Armenia, were Iranian territory until they were occupied by Russia in the course of the 19th century. Following the official loss of vast territories in the Caucasus, major demographic shifts were bound to take place.
Following the 1804–1813 war, and again after the 1826–1828 war which ceded the last territories, large migrations of so-called Caucasian Muhajirs set off for mainland Iran. Some of these groups included the Ayrums, Qarapapaqs, Circassians, Shia Lezgins, and other Transcaucasian Muslims. After the Battle of Ganja of 1804, many thousands of Ayrums and Qarapapaqs were settled in Tabriz. During the remaining part of the 1804–1813 war, as well as through the 1826–1828 war, a large number of the Ayrums and Qarapapaqs still remaining in the newly conquered Russian territories were settled in, or migrated to, Solduz (in modern-day Iran's West Azerbaijan province). As the Cambridge History of Iran states: "The steady encroachment of Russian troops along the frontier in the Caucasus, General Yermolov's brutal punitive expeditions and misgovernment, drove large numbers of Muslims, and even some Georgian Christians, into exile in Iran." From 1864 until the early 20th century, another mass expulsion of Caucasian Muslims took place as a result of the Russian victory in the Caucasian War. Others simply voluntarily refused to live under Christian Russian rule, and thus departed for Turkey or Iran. These migrations towards Iran once again included masses of Caucasian Azerbaijanis and other Transcaucasian Muslims, as well as many North Caucasian Muslims, such as Circassians, Shia Lezgins and Laks. Many of these migrants would prove to play a pivotal role in further Iranian history, as they formed most of the ranks of the Persian Cossack Brigade, which was established in the late 19th century. The initial ranks of the brigade would be entirely composed of Circassians and other Caucasian Muhajirs. This brigade would prove decisive in the following decades in Qajar history. Furthermore, the 1828 Treaty of Turkmenchay gave the Russian Empire the official right to encourage the settling of Armenians from Iran in the newly conquered Russian territories.[h] At the close of the fourteenth century, after Timur's campaigns, the Timurid Renaissance flourished and Islam had become the dominant faith; Armenians had by then become a minority in Eastern Armenia. After centuries of constant warfare on the Armenian plateau, many Armenians chose to emigrate and settle elsewhere. Following Shah Abbas I's massive relocation of Armenians and Muslims in 1604–05, their numbers dwindled even further. At the time of the Russian invasion of Iran, some 80% of the population of Iranian Armenia were Muslims (Persians, Turkic peoples, and Kurds), whereas Christian Armenians constituted a minority of about 20%. As a result of the Treaty of Gulistan (1813) and the Treaty of Turkmenchay (1828), Iran was forced to cede Iranian Armenia (which also constituted present-day Armenia) to the Russians. After the Russian administration took hold of Iranian Armenia, the ethnic make-up shifted, and thus for the first time in more than four centuries, ethnic Armenians started to form a majority once again in one part of historic Armenia. The new Russian administration encouraged the settling of ethnic Armenians from Iran proper and Ottoman Turkey. As a result, by 1832, the number of ethnic Armenians had matched that of the Muslims. It would be only after the Crimean War and the Russo-Turkish War of 1877–1878, which brought another influx of Turkish Armenians, that ethnic Armenians once again established a solid majority in Eastern Armenia. Nevertheless, the city of Erivan retained a Muslim majority up to the twentieth century.
According to the traveller H. F. B. Lynch, the city of Erivan was about 50% Armenian and 50% Muslim (Tatars[i] i.e. Azeris and Persians) in the early 1890s. Fath Ali Shah's reign saw increased diplomatic contacts with the West and the beginning of intense European diplomatic rivalries over Iran. His grandson Mohammad Shah, who succeeded him in 1834, fell under the Russian influence and made two unsuccessful attempts to capture Herat. When Mohammad Shah died in 1848 the succession passed to his son Naser al-Din Shah Qajar, who proved to be the ablest and most successful of the Qajar sovereigns. He founded the first modern hospital in Iran. The Great Persian Famine of 1870–1871 is believed to have caused the death of two million people. A new era in the history of Iran dawned with the Persian Constitutional Revolution against the shah in the late 19th and early 20th centuries. The shah managed to remain in power, granting a limited constitution in 1906 (making the country a constitutional monarchy). The first Majlis (parliament) was convened on 7 October 1906. The discovery of petroleum in 1908 by the British in Khuzestan spawned intense renewed interest in Persia by the British Empire (see William Knox D'Arcy and Anglo-Iranian Oil Company, now BP). Britain's influence was solidified by the establishment of the Indo-European Telegraph Department in the 1860s and the Imperial Bank of Persia in 1889. By the end of the 19th century, European interference became so pronounced that Iran's central government required Anglo-Russian approval for ministerial appointments. Control of Persia remained contested between the United Kingdom and Russia, in what became known as The Great Game, and codified in the Anglo-Russian Convention of 1907, which divided Iran into spheres of influence, regardless of her national sovereignty. During World War I, the country was occupied by British, Ottoman and Russian forces but was essentially neutral (see Persian Campaign). In 1919, after the Russian Revolution and their withdrawal, Britain attempted to establish a protectorate in Iran, which was unsuccessful. The Constitutionalist movement of Gilan and the central power vacuum caused by the instability of the Qajar government resulted in the rise of Reza Khan, later Reza Shah Pahlavi, who established the Pahlavi dynasty in 1925. In 1921, Reza Khan, an officer of the Persian Cossack Brigade, (along with Seyyed Zia'eddin Tabatabai) led a military coup against governing officials (leaving the Qajar monarchy nominally head of state).[j] In 1925, after being prime minister for two years, Reza Khan did depose the Qajar dynasty and became the first shah of the Pahlavi dynasty. Reza Shah ruled for almost 16 years until 16 September 1941, when he was forced to abdicate by the Anglo-Soviet invasion of Iran. He established an authoritarian government that valued nationalism, militarism, secularism and anti-communism combined with strict censorship and state propaganda. Reza Shah introduced many socio-economic reforms, reorganizing the army, government administration, and finances. To his supporters, his reign brought "law and order, discipline, central authority, and modern amenities – schools, trains, buses, radios, cinemas, and telephones". However, his attempts of modernisation have been criticised for being "too fast" and "superficial", and his reign a time of "oppression, corruption, taxation, lack of authenticity" with "security typical of police states." 
Many of the new laws and regulations created resentment among devout Muslims and the clergy. For example, mosques were required to use chairs; most men were required to wear western clothing, including a hat with a brim; women were encouraged to discard the hijab—hijab was eventually banned in 1936; men and women were allowed to congregate freely, violating Islamic mixing of the sexes. Tensions boiled over in 1935, when bazaaris and villagers rose up in rebellion at the Imam Reza shrine in Mashhad to protest against plans for the hijab ban, chanting slogans such as 'The Shah is a new Yezid.' Dozens were killed and hundreds were injured when troops finally quelled the unrest. While German armies were highly successful against the Soviet Union, the Iranian government expected Germany to win the war and establish a powerful force on its borders. It rejected British and Soviet demands to expel German residents from Iran. In response, the two Allies invaded in August 1941 and easily overwhelmed the weak Iranian army in Operation Countenance. Iran became the major conduit of Allied Lend-Lease aid to the Soviet Union. The purpose was to secure Iranian oil fields and ensure Allied supply lines (see Persian Corridor). Iran remained officially neutral. Its monarch Rezā Shāh was deposed during the subsequent occupation and replaced with his young son Mohammad Reza Pahlavi. At the Tehran Conference of 1943, the Allies issued the Tehran Declaration which guaranteed the post-war independence and boundaries of Iran. However, when the war actually ended, Soviet troops stationed in northwestern Iran not only refused to withdraw but backed revolts that established short-lived, pro-Soviet separatist national states in the northern regions of Azerbaijan and Iranian Kurdistan, the Azerbaijan People's Government and the Republic of Kurdistan respectively, in late 1945. Soviet troops did not withdraw from Iran proper until May 1946 after receiving a promise of oil concessions. The Soviet republics in the north were soon overthrown and the oil concessions were revoked. Initially there were hopes that post-occupation Iran could become a constitutional monarchy. The new, young Shah Mohammad Reza Shah Pahlavi initially took a very hands-off role in government, and allowed parliament to hold a lot of power. Some elections were held in the first shaky years, although they remained mired in corruption. Parliament became chronically unstable, and from the 1947 to 1951 period Iran saw the rise and fall of six different prime ministers. Pahlavi increased his political power by convening the Iran Constituent Assembly, 1949, which finally formed the Senate of Iran—a legislative upper house allowed for in the 1906 constitution but never brought into being. The new senators were largely supportive of Pahlavi, as he had intended. In 1951 Prime Minister Mohammed Mosaddeq received the vote required from the parliament to nationalize the British-owned oil industry, in a situation known as the Abadan Crisis. Despite British pressure, including an economic blockade, the nationalization continued. Mosaddeq was briefly removed from power in 1952 but was quickly re-appointed by the Shah, due to a popular uprising in support of the premier, and he, in turn, forced the Shah into a brief exile in August 1953 after a failed military coup by Imperial Guard Colonel Nematollah Nassiri. 
Shortly thereafter, on 19 August, a successful coup was headed by retired army general Fazlollah Zahedi, aided by the United States (CIA) with the active support of the British (MI6), and known as Operation Ajax and Operation Boot to the respective agencies. The coup, accompanied by a black propaganda campaign designed to turn the population against Mosaddeq, forced Mosaddeq from office. Mosaddeq was arrested and tried for treason. He was found guilty, and his sentence was reduced to house arrest on his family estate, while his foreign minister, Hossein Fatemi, was executed. Zahedi succeeded him as prime minister and suppressed opposition to the Shah, specifically the National Front and the Communist Tudeh Party. Iran was ruled as an autocracy under the Shah with American support from that time until the revolution. The Iranian government entered into an agreement with an international consortium of foreign companies which ran the Iranian oil facilities for the next 25 years, splitting profits fifty-fifty with Iran but not allowing Iran to audit its accounts or have members on its board of directors. In 1957 martial law was ended after 16 years and Iran became closer to the West, joining the Baghdad Pact and receiving military and economic aid from the US. In 1961, Iran initiated a series of economic, social, agrarian and administrative reforms to modernize the country that became known as the Shah's White Revolution. The core of this program was land reform. Modernization and economic growth proceeded at an unprecedented rate, fueled by Iran's vast petroleum reserves, the third-largest in the world. However, the reforms, including the White Revolution, did not greatly improve economic conditions, and the liberal pro-Western policies alienated certain Islamic religious and political groups. In early June 1963 several days of massive rioting occurred in support of Ayatollah Ruhollah Khomeini following the cleric's arrest for a speech attacking the Shah. Two years later, premier Hassan Ali Mansur was assassinated and the internal security service, SAVAK, became more violently active. In the 1970s, leftist guerrilla groups such as the Mujaheddin-e-Khalq (MEK) emerged and contributed to the overthrow of the Shah in the 1979 Iranian Revolution. Nearly a hundred Iranian political prisoners were killed by SAVAK during the decade before the revolution, and many more were arrested and tortured. The Islamic clergy, headed by the Ayatollah Ruhollah Khomeini (who had been exiled in 1964), were becoming increasingly vociferous. Iran greatly increased its defense budget and by the early 1970s was the region's strongest military power. Bilateral relations with Iraq were not good, mainly due to a dispute over the Shatt al-Arab waterway. Following a number of clashes in April 1969, Iran abrogated the 1937 accord and demanded a renegotiation. In November 1971, Iranian forces seized control of three islands at the mouth of the Persian Gulf; in response, Iraq expelled thousands of Iranian nationals. In mid-1973, the Shah returned the oil industry to national control. Following the Arab-Israeli War of October 1973, Iran did not join the Arab oil embargo against the West and Israel. Instead, it used the situation to raise oil prices, using the money gained for modernisation and to increase defense spending. A border dispute between Iraq and Iran was resolved with the signing of the Algiers Accord on 6 March 1975.
Contemporary period The Iranian Revolution, also known as the Islamic Revolution, was the revolution that transformed Iran from an absolute monarchy under Shah Mohammad Reza Pahlavi to an Islamic republic under Ayatollah Ruhollah Khomeini, one of the leaders of the revolution and founder of the Islamic Republic. Its time span can be said to have begun in January 1978 with the first major demonstrations, and concluded with the approval of the new theocratic Constitution—whereby Ayatollah Khomeini became Supreme Leader of the country—in December 1979. In between, Mohammad Reza Pahlavi left the country for exile in January 1979 after strikes and demonstrations paralyzed the country, and on 1 February 1979, Ayatollah Khomeini returned to Tehran. The final collapse of the Pahlavi dynasty occurred shortly after, on 11 February, when Iran's military declared itself "neutral" after guerrillas and rebel troops overwhelmed troops loyal to the Shah in armed street fighting. Iran officially became an Islamic Republic on 1 April 1979, after Iranians overwhelmingly approved a national referendum to make it so a day before. The ideology of the revolutionary government was populist, nationalist and most of all Shi'a Islamic. Its unique constitution is based on the concept of velayat-e faqih, the idea advanced by Khomeini that Muslims – in fact everyone – require "guardianship", in the form of rule or supervision by the leading Islamic jurist or jurists. Khomeini served as this ruling jurist, or supreme leader, until his death in 1989. Iran's rapidly modernising, capitalist economy was replaced by populist and Islamic economic and cultural policies. Much industry was nationalized, laws and schools Islamicized, and Western influences banned. The Islamic revolution also had a great impact around the world. In the non-Muslim world it has changed the image of Islam, generating much interest in the politics and spirituality of Islam, along with "fear and distrust towards Islam" and particularly the Islamic Republic and its founder. Khomeini served as leader of the revolution, and then as Supreme Leader of Iran, from 1979 to his death on 3 June 1989. This era was dominated by the consolidation of the revolution into a theocratic republic under Khomeini, and by the costly and bloody war with Iraq. Revolutionary factions disagreed on the shape of the new Iran. Those who thought the Shah would be replaced by a democratic government soon found Khomeini disagreed. In early March 1979, he announced, "do not use this term, 'democratic.' That is the Western style." In succession, the National Democratic Front was banned in August 1979, the provisional government was disempowered in November, the Muslim People's Republican Party was banned in January 1980, the People's Mujahedin of Iran (MEK) and its supporters came under attack between 1979 and 1981, a purge of the universities was begun in March 1980, and leftist President Abolhassan Banisadr was impeached in June 1981. The consolidation lasted until 1982–83, as Iran coped with the damage to its economy, military, and apparatus of government, and protests and uprisings by secularists, leftists, and more traditional Muslims—formerly allied revolutionaries but now rivals—were effectively suppressed. Many political opponents were executed by the new regime. Following the events of the revolution, Marxist guerrillas and federalist parties revolted in regions comprising Khuzistan, Kurdistan and Gonbad-e Qabus, resulting in severe fighting between rebels and revolutionary forces.
These revolts began in April 1979 and lasted between several months to over a year, depending on the region. The Kurdish uprising, led by the KDPI, was the most violent, lasting until 1983 and resulting in 10,000 casualties. In the summer of 1979 a new constitution giving Khomeini a powerful post as guardian jurist Supreme Leader and a clerical Council of Guardians power over legislation and elections, was drawn up by an Assembly of Experts for Constitution. The new constitution was approved by referendum in December 1979. An early event in the history of the Islamic republic that had a long-term impact was the Iran hostage crisis. Following the admitting of the former Shah of Iran into the United States for cancer treatment, on 4 November 1979, Iranian students seized US embassy personnel, labeling the embassy a "den of spies." Fifty-two hostages were held for 444 days until January 1981. An American military attempt to rescue the hostages failed. The takeover was enormously popular in Iran, where thousands gathered in support of the hostage takers, and it is thought to have strengthened the prestige of the Ayatollah Khomeini and consolidated the hold of anti-Americanism. It was at this time that Khomeini began referring to America as the "Great Satan." In America, where it was considered a violation of the long-standing principle of international law that diplomats may be expelled but not held captive, it created a powerful anti-Iranian backlash. Relations between the two countries have remained deeply antagonistic and American international sanctions have hurt Iran's economy. During this political and social crisis, Iraqi leader Saddam Hussein attempted to take advantage of the disorder of the Revolution, the weakness of the Iranian military and the revolution's antagonism with Western governments. The once-strong Iranian military had been disbanded during the revolution, and with the Shah ousted, Hussein had ambitions to position himself as the new strong man of the Middle East. He sought to expand Iraq's access to the Persian Gulf by acquiring territories that Iraq had claimed earlier from Iran during the Shah's rule. Of chief importance to Iraq was Khuzestan which not only boasted a substantial Arab population, but rich oil fields as well. On the unilateral behalf of the United Arab Emirates, the islands of Abu Musa and the Greater and Lesser Tunbs became objectives as well. With these ambitions in mind, Hussein planned a full-scale assault on Iran, boasting that his forces could reach the capital within three days. On 22 September 1980, the Iraqi army invaded Iran at Khuzestan, precipitating the Iran–Iraq War. The attack took revolutionary Iran completely by surprise. Although Saddam Hussein's forces made several early advances, Iranian forces had pushed the Iraqi army back into Iraq by 1982. Khomeini sought to export his Islamic revolution westward into Iraq, especially on the majority Shi'a Arabs living in the country. The war then continued for six more years until 1988, when Khomeini, in his words, "drank the cup of poison" and accepted a truce mediated by the United Nations. Tens of thousands of Iranian civilians and military personnel were killed when Iraq used chemical weapons in its warfare. Iraq was financially backed by Egypt, the Arab countries of the Persian Gulf, the Soviet Union and the Warsaw Pact states, the United States (beginning in 1983), France, the United Kingdom, Germany, Brazil, and the People's Republic of China (which also sold weapons to Iran). 
There were more than 182,000 Kurdish victims of Iraq's chemical weapons during the eight-year war. The total Iranian casualties of the war were estimated to be between 500,000 and 1,000,000. Almost all relevant international agencies have confirmed that Saddam engaged in chemical warfare to blunt Iranian human wave attacks; these agencies unanimously confirmed that Iran never used chemical weapons during the war. Starting on 19 July 1988 and lasting for about five months the government systematically executed thousands of political prisoners across Iran. This is commonly referred to as the 1988 executions of Iranian political prisoners or the 1988 Iranian Massacre. The main target was the membership of the People's Mojahedin Organization of Iran (PMOI), although a lesser number of political prisoners from other leftist groups were also included such as the Tudeh Party of Iran (Communist Party). Estimates of the number executed vary from 1,400 to 30,000. On his deathbed in 1989, Khomeini appointed a 25-man Constitutional Reform Council which named then president Ali Khamenei as the next Supreme Leader, and made a number of changes to Iran's constitution. A smooth transition followed Khomeini's death on 3 June 1989. While Khamenei lacked Khomeini's "charisma and clerical standing", he developed a network of supporters within Iran's armed forces and its economically powerful religious foundations. Under his reign Iran's regime is said – by at least one observer – to resemble more "a clerical oligarchy ... than an autocracy." Ali-Akbar Hashemi Rafsanjani succeeded Khamenei as president on 3 August 1989, as a pragmatic conservative who served two four-year terms and focused his efforts on rebuilding the country's economy and infrastructure damaged by war, though hampered by low oil prices. Rafsanjani sought to restore confidence in the government among the general population by privatizing the companies that had been nationalized in the first few years of the Islamic Republic, as well as by bringing in qualified technocrats to manage the economy. The state of their economy also influenced the government to move towards ending their diplomatic isolation. This was achieved through the reestablishment of normalized relations with neighbors such as Saudi Arabia and an attempt to improve its reputation in the region with assertions that its revolution was not exportable to other states. During the Persian Gulf War in 1991 the country remained neutral, restricting its action to the condemnation of the U.S. and allowing fleeing Iraqi aircraft and refugees into the country.[citation needed] Iran in the 1990s had a greater secular behavior and admiration for Western popular culture than in the previous decades. This admiration had become a way in which the urban population expressed their resentment at the invasive Islamic policies of the government. The pressures from the population placed on the new Supreme Leader Ayatollah Ali Khamenei led to an uneasy alliance between him and President Akbar Hashemi Rafsanjani. Through this alliance they attempted to hinder the ulama's ability to gain further control of the state. In 1989, they created a sequence of constitutional amendments that removed the office of prime minister and increased the scope of presidential power. 
However, these new amendments did not curtail the powers of the Supreme Leader of Iran in any way; this position still held ultimate authority over the armed forces, the making of war and peace, and the final say in foreign policy, together with the right to intervene in the legislative process whenever he deemed it necessary. President Rafsanjani's economic policies led to stronger relations with the outside world. But his government's relaxation of the enforcement of certain regulations on social behavior was met with widespread disenchantment among the general population with the ulama as rulers of the country. This led to the defeat of the government's candidate for president in 1997, who had the backing of the supreme Islamic jurist. He was beaten by an independent candidate from the Reformists, Mohammad Khatami, who received 69% of the vote and enjoyed particular support from two groups of the population that had felt ostracized by the practices of the state: women and youth. The younger generations in the country had been too young to experience the shah's regime or the revolution that ended it, and now they resented the restrictions placed on their daily lives under the Islamic Republic. Mohammad Khatami's presidency was soon marked by tensions between the reform-minded government and an increasingly conservative and vocal clergy. This rift reached a climax in July 1999 when massive anti-government protests erupted in the streets of Tehran. The disturbances lasted over a week before police and pro-government vigilantes dispersed the crowds. During his first term, President Khatami oversaw Iran’s second five-year development plan and introduced a new plan for 2000–2004 focused on economic reconstruction alongside social and political reforms. The plan aimed for privatization, job creation, and reduced subsidies but fell short on employment targets. Despite this, Iran saw improved economic indicators: real GDP growth rose to nearly 6 percent, unemployment and inflation declined, external debt dropped significantly, and the government authorized private banks for the first time since 1979. Poverty levels also decreased modestly. In the Majlis elections of 2000, liberals and Khatami’s supporters gained parliamentary control from conservatives for the first time. That same year, following the adoption of a new press law, authorities banned the publication of 16 reformist newspapers. Khatami was re-elected in June 2001, but his efforts were repeatedly blocked by the conservatives in the parliament. Conservative elements within Iran's government moved to undermine the reformist movement, banning liberal newspapers and disqualifying candidates for parliamentary elections. This clampdown on dissent, combined with the failure of Khatami to reform the government, led to growing political apathy among Iran's youth. Following the September 11 attacks in 2001, Iran was initially sympathetic towards the United States. However, relations deteriorated sharply after President George W. Bush labeled Iran part of the "Axis of Evil" in 2002, accusing the country of pursuing weapons of mass destruction that posed a threat to the U.S. Despite firm U.S. opposition, in 2002 Russian teams commenced work on Iran’s first nuclear reactor at Bushehr. In June 2003, anti-government protests by several thousand students took place in Tehran. Shirin Ebadi, a lawyer and human rights advocate, became the first Iranian to win the Nobel Peace Prize in 2003.
She had been the country's first female judge until being forced to step down after the 1979 revolution. The response to the award in Iran was mixed—enthusiastic supporters greeted her at the airport upon her return, the conservative media underplayed it, and Khatami criticized it as political. A violent earthquake struck the Kerman province of southeastern Iran in December 2003. The earthquake was particularly destructive in Bam, with the death toll amounting to at least 34,000 people and injuring up to 200,000. After the hardline Council of Guardians disqualified thousands of reformist candidates, conservatives regained control of parliament in the elections of 2004. In the 2005 Iranian presidential election, Mahmoud Ahmadinejad, mayor of Tehran, became the sixth president of Iran, after winning 62 percent of the vote in the run-off poll, against former president Ali-Akbar Hashemi Rafsanjani. During the authorization ceremony he kissed Khamenei's hand in demonstration of his loyalty to him. During this time, the American invasion of Iraq, the overthrow of Saddam Hussein's regime and empowerment of its Shi'a majority, all strengthened Iran's position in the region particularly in the mainly Shi'a south of Iraq, where a top Shia leader in the week of 3 September 2006 renewed demands for an autonomous Shi'a region. At least one commentator (former U.S. Defense Secretary William S. Cohen) has stated that as of 2009 Iran's growing power has eclipsed anti-Zionism as the major foreign policy issue in the Middle East. During 2005 and 2006, there were claims that the United States and Israel were planning to attack Iran, with the most cited reason being Iran's civilian nuclear energy program which the United States and some other states fear could lead to a nuclear weapons program. China and Russia opposed military action of any sort and opposed economic sanctions. Khamenei issued a fatwa forbidding the production, stockpiling and use of nuclear weapons. The fatwa was cited in an official statement by the Iranian government at an August 2005 meeting of the International Atomic Energy Agency (IAEA) in Vienna. However, The IAEA reported in 2008 that Iran’s suspected nuclear weapons research remained “a matter of serious concern,” prompting European Union countries to agree on new sanctions. Additional U.N. sanctions followed in 2010. In 2011, Iran announced that the Bushehr Nuclear Power Plant had been connected to the national electricity grid for the first time. Eventually, the sanctions severely impacted Iran’s economy, contributing to a dramatic depreciation of the rial, which reportedly fell to a record low of 35,000 to the US dollar—an 80% drop since late 2011. In 2007, a diplomatic standoff erupted between Iran and the UK after Iranian forces detained 15 British sailors and marines near the Shatt al-Arab waterway, which forms part of the Iran-Iraq border. In 2009, Ahmadinejad's reelection was hotly disputed and marred by large protests that formed the "greatest domestic challenge" to the leadership of the Islamic Republic "in 30 years". The resulting social unrest is widely known as the Iranian Green Movement. Reformist opponent Mir-Hossein Mousavi and his supporters alleged voting irregularities and by 1 July 2009, 1000 people had been arrested and 20 killed in street demonstrations. Supreme Leader Ali Khamenei and other Islamic officials blamed foreign powers for fomenting the protest. In 2010, Stuxnet was reportedly found in the Natanz Nuclear Facility. 
Stuxnet is a malicious computer worm thought to have been in development since at least 2005. Stuxnet targets supervisory control and data acquisition (SCADA) systems and is believed to be responsible for causing substantial damage to the Iran nuclear program. Although neither the United States nor Israel has openly admitted responsibility, multiple independent news organizations claim Stuxnet to be a cyberweapon built jointly by the two countries in a collaborative effort known as Operation Olympic Games. The program, started during the Bush administration, was rapidly expanded within the first months of Barack Obama's presidency. On 14 February 2011, widespread protests erupted in Tehran as thousands gathered in response to opposition calls, expressing solidarity with pro-democracy movements in the region and reviving dissent over the contested 2009 presidential election. Security forces quickly suppressed the demonstrations, resulting in two deaths and numerous injuries. Further protests followed, including on 20 February and 1 March, when the opposition reported around 200 arrests. Authorities subsequently managed to prevent large-scale demonstrations. Reports of growing tensions between Ahmadinejad and Khamenei emerged during this period. In the 2012 parliamentary elections, Ahmadinejad’s allies lost ground to factions loyal to Khamenei, while the opposition Green Movement remained banned. Its leaders, Mehdi Karroubi and Mir-Hossein Mousavi, were placed under house arrest in early 2011 and have remained out of public view, with some government supporters demanding their execution. On 15 June 2013, Hassan Rouhani won the presidential election in Iran, with a total number of 36,704,156 ballots cast; Rouhani won 18,613,329 votes. In his press conference one day after election day, Rouhani reiterated his promise to recalibrate Iran's relations with the world. On 14 July 2015, after years of negotiations, Iran and the P5+1 group of world powers (China, France, Russia, the United Kingdom, the United States, plus Germany) together with the European Union finalized the Joint Comprehensive Plan of Action (JCPOA), commonly known as the Iran nuclear deal. The agreement aimed to limit Iran’s nuclear program in exchange for relief from international sanctions. It followed the 2013 Joint Plan of Action, an interim deal that opened formal negotiations. By April 2015, negotiators had agreed on a framework that set the stage for the final accord in Vienna. Under the JCPOA, Iran agreed to significant restrictions on its nuclear activities, including limits on uranium enrichment levels, the number and type of operating centrifuges, and the size of its enriched uranium stockpile. Key facilities at Fordow, Natanz, and Arak were to be repurposed for civilian research and medical uses. Iran also accepted more intrusive inspections by the International Atomic Energy Agency to verify compliance. In return, it received relief from nuclear-related sanctions imposed by the United Nations, the European Union, and the United States, although many other U.S. sanctions remained in place, especially those targeting Iran’s missile program and regional activities. Beginning on 28 December 2017, protests known as the Dey protests spread across Iran, starting over economic grievances in Mashhad but quickly expanding to political opposition to Supreme Leader Ali Khamenei and the theocratic system. 
Marking the most serious unrest since 2009, the largely leaderless protests featured anti-regime chants and attacks on government sites, with at least twenty-one protesters and two security personnel killed, and around 3,700 arrested by early January 2018. In response, thousands of government supporters held pro-government rallies in multiple cities. In May 2018, Donald Trump decided to pull out of the JCPOA, announcing he would reimpose economic sanctions on Iran effective from 4 November that year. This marked the beginning of the Trump administration's maximum pressure campaign, an effort to force Iran to renegotiate the nuclear agreement by imposing intensified sanctions. On 22 September 2018, the Ahvaz military parade was attacked by gunmen in the southwestern Iranian city of Ahvaz. The shooters killed 25 people, including soldiers of the Islamic Revolutionary Guard Corps (IRGC) and civilian bystanders. The Islamic State claimed responsibility for the attack. Iran blamed "militants in Syria" and claimed the "U.S. and the Gulf states enabled the attack" and vowed revenge. The U.S., Saudi Arabia and the United Arab Emirates rejected the accusation. From mid-March to April 2019 widespread flash flooding affected large parts of Iran, most severely in Golestan, Fars, Khuzestan, Lorestan, and other provinces. Iran was hit by three major waves of rain and flooding over the course of two weeks which led to flooding in at least 26 of Iran's 31 provinces. At least 70 people died nationwide. The 2019–2020 Iranian protests began in response to a 50–200% fuel price increase and quickly spread to 21 cities, becoming the most violent unrest since the 1979 revolution. Security forces reportedly shot protesters from rooftops, helicopters, and at close range, killing around 1,500 people according to U.S. sources, while Amnesty International described efforts to cover up the scale of the violence. Protesters attacked 731 banks, 50 military bases, and nine religious centers, prompting the government to impose a near-total internet blackout for six days. The uprising was crushed within three days, though sporadic protests continued. On 3 January 2020, the United States military executed a drone strike at Baghdad Airport, killing Qasem Soleimani, the leader of the Quds Force, an elite branch of the Iranian Islamic Revolutionary Guard Corps (IRGC). The assassination sharply increased tensions between the two countries. Iran vowed retaliation, and on 8 January launched missile attacks on U.S. forces based in Iraq, marking the first direct military exchange between Iran and the U.S. since 1988. The same day, the IRGC mistakenly shot down Ukraine International Airlines Flight 752. Following these events, no further military escalation occurred. The 2020 parliamentary elections in Iran were marked by historically low voter turnout, officially reported at 42.6%—the lowest since the 1979 revolution. The elections took place in the wake of widespread public disillusionment following the violent crackdown on protests in late 2019, which severely damaged the credibility of President Hassan Rouhani and the reformist camp. As a result, conservative candidates won a dominant majority in the parliament, securing 221 out of 290 seats, while reformists managed to win only a small fraction. The outcome was widely seen as a significant blow to Rouhani ahead of the end of his term in 2021. The COVID-19 pandemic in Iran led to 7,627,863 confirmed cases of COVID-19 and 146,837 deaths. 
The first cases were reported in Qom on 19 February 2020. The government responded by cancelling public events, closing institutions and shrines, and requesting a $5 billion emergency loan from the IMF. Initial resistance to quarantines and travel restrictions contributed to the virus’s spread before a ban on intercity travel was implemented. After restrictions eased in April, cases surged again, peaking in June and July. Despite these rising case numbers, the government had no option but to keep the economy open, as it was already under strain from U.S. sanctions and had suffered a further 15% GDP decline due to the pandemic by June 2020. Estimates of deaths have varied widely, with some leaked data suggesting a much higher toll than official figures, and the government faced allegations of mismanagement and censorship. The virus also impacted Iran’s leadership, infecting 23 MPs by early March and killing at least 17 officials by late March. On 3 August 2021, Ebrahim Raisi was formally endorsed as the eighth President of Iran, having won the presidential election that June. On 16 September 2022, Mahsa Amini, a 22-year-old Iranian woman, died in a hospital in Tehran under suspicious circumstances. The Guidance Patrol, the religious morality police of Iran's government, had arrested Amini for allegedly not wearing the hijab in accordance with government standards. The Law Enforcement Command of the Islamic Republic of Iran stated that she had a heart attack at a police station, collapsed, and fell into a coma before being transferred to a hospital. However, eyewitnesses reported that she was severely beaten and that she died as a result of police brutality, which was denied by the Iranian authorities. Amini's death resulted in a series of protests described as larger and more widespread than previous waves of protest. Iran Human Rights reported that by December 2022 at least 476 people had been killed by security forces attacking protests across the country. By spring 2023, the protests had largely subsided, ultimately leaving the political leadership unchanged and firmly entrenched in power. In October 2023, an IAEA report estimated that Iran had increased its uranium stockpile to 22 times the limit agreed in the 2015 JCPOA. On 1 April 2024, Israel's air strike on an Iranian consulate building in the Syrian capital Damascus killed a senior commander of the Islamic Revolutionary Guards Corps (IRGC), Brig Gen Mohammad Reza Zahedi. In retaliation for the Israeli strike, Iran attacked Israel with over 300 drones and missiles on 13 April. However, the Iranian attack was mostly intercepted, either outside Israeli airspace or over the country itself. It was the biggest missile attack in Iranian history, and Iran's first ever direct attack on Israel. It was followed by a retaliatory missile strike by Israel on Isfahan, Iran, on 19 April. On 19 May 2024, Ebrahim Raisi died in a helicopter crash in the country’s East Azerbaijan province. First Vice President Mohammad Mokhber was then appointed acting president. On 28 July 2024, Masoud Pezeshkian was formally endorsed as Iran's new president by Iran's supreme leader, Ayatollah Ali Khamenei. Pezeshkian, a reformist, had won the presidential election runoff on 5 July. Three days later, Ismail Haniyeh, political chief of the Palestinian political and military organisation Hamas, was assassinated in Iran’s capital, Tehran, where he had attended the inauguration ceremony of Iran’s President Masoud Pezeshkian.
On 1 October 2024, Iran launched about 180 ballistic missiles at Israel in retaliation for the assassinations of Haniyeh, Hassan Nasrallah and Abbas Nilforoushan. On 27 October, Israel responded to that attack with strikes on a missile defence system in the Iranian region of Isfahan. In December 2024, the fall of the Assad regime in Syria, a close ally of Iran, was a severe setback for the political influence of Iran in the region. In early 2025, Iran was enriching substantial quantities of uranium to 60% purity, close to weapons-grade. Analysts warned that such activity exceeded any plausible civilian justification. Beginning in April 2025, Iran and the United States entered negotiations for a new nuclear agreement, but progress stalled as Iran's leaders refused to stop enriching uranium. Among the main points of disagreement were the conditions for lifting sanctions against Iran. In June 2025, the IAEA found Iran non-compliant with its nuclear obligations for the first time in two decades. In response, Iran announced the activation of a new enrichment facility and began installing additional advanced centrifuges. On 13 June 2025, Israel launched coordinated strikes across Iran, targeting nuclear facilities and eliminating top members of Iran's military leadership. This was the beginning of the Iran–Israel war. Iran retaliated with waves of missile and drone strikes against Israeli cities and military sites. United States strikes on Iranian nuclear sites occurred on 22 June 2025. On 24 June, Israel and Iran agreed to a ceasefire after insistence from the US. Beginning on 28 December 2025, mass demonstrations erupted across multiple cities in Iran amid widespread dissatisfaction with the Islamic Republic government and a deepening economic crisis. The movement quickly became the largest outbreak of unrest in Iran since the 2022–2023 protests following the death of Mahsa Amini. The ensuing crackdown, carried out under orders from Ali Khamenei and senior officials to use live fire on protesters, resulted in massacres that left thousands of protesters dead. The Iranian government faced accusations of committing crimes against humanity. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-402] | [TOKENS: 12858] |
Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. The game was originally created by Markus "Notch" Persson using the Java programming language; following its full release, control of its development was handed to Jens "Jeb" Bergensten. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios holds the publishing rights for the Bedrock Edition, the unified cross-platform version that evolved from the Pocket Edition codebase and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history, with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of a third-person perspective. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Only a few blocks are affected by gravity; the rest keep their position in the grid even when unsupported in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. Players may also freely craft helpful blocks—such as furnaces, which can cook food and smelt ores, and torches, which produce light—or trade with villagers (non-player characters), exchanging emeralds for various goods and vice versa.
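As an illustration of the block-grid idea described above, the following Java snippet sketches a sparse, coordinate-keyed voxel store with place and mine operations. It is only a minimal, hypothetical example—BlockGrid and BlockType are invented names, not Mojang's actual classes—and it ignores chunking, lighting, physics, and everything else a real engine needs.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: a sparse voxel store keyed by block coordinates.
// BlockGrid and BlockType are hypothetical names, not Minecraft's real classes.
public class BlockGrid {
    public enum BlockType { AIR, DIRT, STONE, WOOD }

    // Sparse map: absent keys are treated as AIR.
    private final Map<Long, BlockType> blocks = new HashMap<>();

    // Pack integer block coordinates into a single map key (26 + 26 + 12 bits).
    private static long key(int x, int y, int z) {
        return ((long) (x & 0x3FFFFFF) << 38) | ((long) (z & 0x3FFFFFF) << 12) | (y & 0xFFF);
    }

    public BlockType get(int x, int y, int z) {
        return blocks.getOrDefault(key(x, y, z), BlockType.AIR);
    }

    // "Placing" a block writes a material into a grid cell.
    public void place(int x, int y, int z, BlockType type) {
        blocks.put(key(x, y, z), type);
    }

    // "Mining" a block clears the cell back to air and returns what was there.
    public BlockType mine(int x, int y, int z) {
        BlockType removed = get(x, y, z);
        blocks.remove(key(x, y, z));
        return removed;
    }

    public static void main(String[] args) {
        BlockGrid world = new BlockGrid();
        world.place(0, 64, 0, BlockType.STONE);
        System.out.println(world.get(0, 64, 0));  // STONE
        System.out.println(world.mine(0, 64, 0)); // STONE
        System.out.println(world.get(0, 64, 0));  // AIR
    }
}
```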
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve or Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities) including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentionally and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. 
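Because terrain is derived deterministically from the map seed, two worlds created with the same seed produce identical terrain at the same coordinates. The Java sketch below illustrates only that reproducibility property, using a toy hash-based height function; Minecraft's actual generator layers gradient noise, biome rules, and structure placement and is far more elaborate, and the names here (SeededTerrainSketch, columnHeight) are invented for the example.

```java
import java.util.Random;

// Illustrative sketch: deterministic terrain heights derived from a world seed.
// This is NOT Minecraft's actual algorithm; it only demonstrates that
// (seed, x, z) -> height is reproducible across worlds sharing a seed.
public class SeededTerrainSketch {
    private final long seed;

    public SeededTerrainSketch(long seed) {
        this.seed = seed;
    }

    // Mix the seed with the column coordinates, then draw a height
    // between 60 and 90 from a Random seeded by that mix.
    public int columnHeight(int x, int z) {
        long mixed = seed ^ (x * 341873128712L) ^ (z * 132897987541L);
        Random r = new Random(mixed);
        return 60 + r.nextInt(31);
    }

    public static void main(String[] args) {
        SeededTerrainSketch worldA = new SeededTerrainSketch(42L);
        SeededTerrainSketch worldB = new SeededTerrainSketch(42L);
        // Same seed and coordinates give the same terrain column every time.
        System.out.println(worldA.columnHeight(100, -250) == worldB.columnHeight(100, -250)); // true
    }
}
```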
The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal, which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work by Irish novelist Julian Gough that takes about nine minutes to scroll past and is the game's only narrative text, as well as the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar or continuously on peaceful. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then respawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn after 5 minutes. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, animal breeding, and cooking food. Experience can then be spent on enchanting tools, armor and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update, and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to ensure that players experience the map as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage nor are affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
Multiplayer in Minecraft enables multiple players to interact and communicate with each other on a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a Realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server, as illustrated in the sketch following this paragraph. Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Bedrock Edition Realms server owners can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to follow in 2017, along with support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018. The modding community consists of fans, users and third-party programmers. Using a variety of application programming interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods, from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add to the game elements from other video games and media. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements including textures and sounds.
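The username and IP restrictions that server operators can impose amount to a simple membership check when a player tries to connect. The Java sketch below is a hypothetical illustration of that logic; the JoinPolicy class and its fields are invented for the example, whereas real server software reads comparable allow and ban lists from its configuration files.

```java
import java.util.Set;

// Illustrative sketch of operator-style join restrictions: allowed usernames
// and banned IP addresses. Names are hypothetical, not an actual server API.
public class JoinPolicy {
    private final Set<String> allowedUsernames;
    private final Set<String> bannedIps;
    private final boolean allowlistEnabled;

    public JoinPolicy(Set<String> allowedUsernames, Set<String> bannedIps, boolean allowlistEnabled) {
        this.allowedUsernames = allowedUsernames;
        this.bannedIps = bannedIps;
        this.allowlistEnabled = allowlistEnabled;
    }

    // A player may join only if their IP is not banned and, when the
    // allowlist is enabled, their username appears on it.
    public boolean mayJoin(String username, String ip) {
        if (bannedIps.contains(ip)) {
            return false;
        }
        return !allowlistEnabled || allowedUsernames.contains(username);
    }

    public static void main(String[] args) {
        JoinPolicy policy = new JoinPolicy(Set.of("alice", "bob"), Set.of("203.0.113.7"), true);
        System.out.println(policy.mayJoin("alice", "198.51.100.2"));   // true
        System.out.println(policy.mayJoin("mallory", "198.51.100.2")); // false (not allowlisted)
        System.out.println(policy.mayJoin("alice", "203.0.113.7"));    // false (banned IP)
    }
}
```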
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013, and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another based on Fallout was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue, and released a statement stating that "the code would not be run or read by the game itself", and would run only when the image containing the skin itself was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue. Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. 
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the style of gameplay, including bringing back the first-person mode, the "blocky" visual style and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson decided to release a full 1.0 version—the second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the past three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies, including Activision Blizzard and Electronic Arts. The deal with Microsoft was finalized on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions usually received major annual updates—free to players who had purchased the game—each primarily centered on a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically-based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs. It cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need for RTX-capable hardware. Vibrant Visuals was released as part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned to come to Java Edition at a later date. Development of the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—began in May 2009; on 13 May, Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though the acquisition later became controversial and its legitimacy was questioned due to CraftBukkit's open-source nature and licensing under the GNU General Public License and Lesser General Public License. In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011.
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay to other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One, and was renamed to the Bedrock Edition. The console versions of Minecraft debuted with the Xbox 360 edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, and a physical copy available on a later date. The game is compatible only with the New Nintendo 3DS or New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems. On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. The Bedrock Edition released a native version for PlayStation 5 on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well. An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows. 
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in Autumn 2018. It was released to the App Store on 6 September 2018. On 27 March 2019, it was announced that it would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store compatible Chromebooks. The full game was released to the Google Play Store for Chromebooks on 7 August 2020. On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023. The Windows version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release implemented new features to this version of Minecraft, such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other version. Both game versions would otherwise remain separate. Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after the character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive. Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the processes involved, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced about creating the in-game sound for grass blocks, stating, "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborated, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of Rosenfeld's sound design decisions were made accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled, "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering, "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose the music of Minecraft, Rosenfeld used the software package Ableton Live, along with several additional plug-ins. Speaking of them, Rosenfeld said, "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music that was added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015.
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", introducing pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine remaining as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the game's mini games from the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record as of then had tallied up to be longer than the previous two albums combined, which in total clocks in at over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has since not seen release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether or not there was still a third volume of his music intended for release. Rosenfeld responded, saying, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know." Reception Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment in Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has been generally received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed about the troublesome steps needed to set up multiplayer servers, calling it a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version. 
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they acclaimed the port's addition of a tutorial and in-game tips and crafting recipes, saying that they make the game more user-friendly. The Xbox One Edition was one of the best received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best received port to date, being praised for having 36 times larger worlds than the PlayStation 3 edition and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds. Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content. Minecraft surpassed over a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and has never been commercially advertised except through word of mouth, and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game, and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014[update], the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies. The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke the Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day. 
As of 4 April 2014[update], the Xbox 360 version has sold 12 million copies. In addition, Minecraft: Pocket Edition has reached a figure of 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft were sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version has sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms with over 126 million monthly active players. By April 2021, the number of active monthly users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth best game of the year as well as the eighth best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At Game Developers Choice Awards 2011, Minecraft won awards in the categories for Best Debut Game, Best Downloadable Game and Innovation Award, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category, and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game Of The Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list. 
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for the Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2022. The game also garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award - PC and Console. Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. Persson supported the change, citing emails he had received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Persson's decision to sell Mojang. In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and how account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated. In June 2022, Mojang added a player-reporting feature in Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language, substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four. The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts; initially, the winning mob was to be implemented in a future update, while the losing mobs were scrapped, though after the first mob vote this was changed, and losing mobs would now have a chance to come to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning the vote. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced the Mob Vote would be retired. Cultural impact In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model to draw in sales prior to its full release version to help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development. Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos began to gain influence on YouTube, often made by commentators. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game would go on to be a prominent fixture within YouTube's gaming scene during the entire 2010s; in 2014, it was the second-most searched term on the entire platform.
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has also created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot in an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the website had exceeded one trillion. Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character with a moveset including references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age. The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering using Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap. In September 2012, Mojang began the Block by Block project in cooperation with UN Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood.
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed Minecraft building community, FyreUK, to help render the environments into Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having a training in architecture. The ideas presented by the citizens were a template for political decisions. In April 2014, the Danish Geodata Agency generated all of Denmark in fullscale in Minecraft based on their own geodata. This is possible because Denmark is one of the flattest countries with the highest point at 171 meters (ranking as the country with the 30th smallest elevation span), where the limit in default Minecraft was around 192 meters above in-game sea level when the project was completed. Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people. Despite its unpredictable nature, Minecraft speedrunning, where players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way. Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone blocks enabled the construction of functional virtual machines such as a hard drive and an 8-bit computer. Mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources. 
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft." Following the initial surge in popularity of Minecraft in 2010, other video games were criticised for having various similarities to Minecraft, and some were described as being "clones", often due to a direct inspiration from Minecraft, or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and their Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). Despite this, the fears of fans were unfounded, with official Minecraft releases on Nintendo consoles eventually resuming. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team. In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging the game was infringing on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright claiming service. The DMCA was later withdrawn. Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in-person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded to "Minecraft Live", included the mob/biome votes, and announcements of new game updates. 
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-Lin,_An-115] | [TOKENS: 11899] |
Contents Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere; in the low Martian gravity it is picked up and spread even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north–south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground; it also hosts many enormous extinct volcanoes (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), and a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago. During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, dominates the geological processes still shaping the planet. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, becoming the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. 
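A back-of-the-envelope consistency check on the figures above (not from the source): with a sol of about 24.66 hours, a Martian year of 687 Earth days corresponds to roughly 687 × 24 / 24.66 ≈ 669 sols, so Mars turns about 669 times on its axis in one of its years.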
Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of runaway accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System. Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary periods are the Noachian, the Hesperian, and the Amazonian, described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. 
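The roughly 38% surface gravity figure follows directly from the mass and size ratios quoted above; as a rough check (using g ∝ M/R², with Mars at about 11% of Earth's mass and about 53% of its diameter): 0.11 / 0.53² ≈ 0.39, consistent with the stated value.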
Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than in surrounding depth intervals. The mantle appears to be rigid down to a depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 ± 67 kilometres (381 ± 42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7 and contains 0.6% perchlorate by weight, a concentration that is toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. 
The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day (or 22 millirads per day) experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation, at about 0.342 millisieverts per day; it features lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. 
After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can[discuss] have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, possibly making Mars a planet with a two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. "Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. 
The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Compared to Earth, its higher concentration of atmospheric CO2 and lower surface pressure may be why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of the planet Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons, alternating between its northern and southern hemispheres, as on Earth. Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity and approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area, to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. The seasons also produce deposits of carbon dioxide dry ice that cover the polar ice caps. 
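A minimal sketch of where the scale height quoted earlier in this section comes from (assuming an isothermal atmosphere at roughly 210 K and a mean molar mass of about 0.044 kg/mol for CO2, values not given above): H = RT/(Mg) ≈ 8.314 × 210 / (0.044 × 3.71) ≈ 10,700 m, close to the 10.8 km figure; the low Martian surface gravity in the denominator is the main reason H exceeds Earth's value.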
Hydrology While Mars contains large amounts of water, most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet to a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods. Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with snow of carbon dioxide dry ice. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. 
The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10^−4) is five to seven times the amount on Earth (D/H = 1.56 × 10^−4), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. 
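The "five to seven times" range for the deuterium ratios quoted above is a direct consequence of the stated values: (9.3 − 1.7) / 1.56 ≈ 4.9 and (9.3 + 1.7) / 1.56 ≈ 7.1 (a simple consistency check, not an additional measurement).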
Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Mars and Earth, is the second-lowest of any planet relative to Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth around opposition, with a synodic period of 779.94 days. Opposition should not be confused with Mars conjunction, when Earth and Mars are on opposite sides of the Sun and form a straight line crossing it. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. 
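The 780-day figure quoted above is what the standard synodic-period relation gives for the orbital periods stated in this section (a simple check, not from the source): 1/S = 1/365.25 − 1/687 ≈ 0.00128 per day, so S ≈ 780 days, or about 2.1 years between oppositions.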
Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent of Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from those of Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More-recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. Analysis of rocks recording tidal processes on the planet suggests that these tides may have been regulated by a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars, when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. 
During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, "the fiery one"); more commonly, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a ten-year study of the orbit of Mars, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei became the first to observe Mars through a telescope. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun–Earth distance; this was first done by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only occultation of Mars by Venus observed was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave names of famous rivers on Earth. 
His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous concepts of Mars were radically revised. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. Between the shutdown of Viking 1 in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without contact (Phobos 1, 1988; Mars Observer, 1993) and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), started an uninterrupted active robotic presence at Mars that has lasted to the present day. Mars Global Surveyor produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA, the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the different elements of the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023[update], Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. 
NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Further missions to Mars are planned. As of February 2024[update], debris from Mars missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894, W. W. Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, setting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, poor insulation against bombardment by the solar wind due to the absence of a magnetosphere, and insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. 
A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed by meteor impacts, which on Earth can preserve signs of life, has also been found on the surfaces of impact craters on Mars; such glass could likewise have preserved signs of life there, if life existed at the site. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. Although the rock is highly intriguing, no definitive determination of a biological or abiotic origin can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, in 2021 China announced that it was planning to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisions the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth and in situ resource utilization on Mars, until the Mars colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. 
Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. See also Notes References Further reading External links Solar System → Local Interstellar Cloud → Local Bubble → Gould Belt → Orion Arm → Milky Way → Milky Way subgroup → Local Group → Local Sheet → Local Volume → Virgo Supercluster → Laniakea Supercluster → Pisces–Cetus Supercluster Complex → Local Hole → Observable universe → UniverseEach arrow (→) may be read as "within" or "part of". |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Pre-Islamic_Arabia] | [TOKENS: 14124] |
Contents Pre-Islamic Arabia The era of pre-Islamic Arabia encompasses human history in all parts of the Arabian Peninsula before the spread of Islam beginning in 610 CE. During the prehistoric period, humans first migrated to and settled in the peninsula. In the early first millennium BC, writing and recorded history were introduced into the Peninsula, along with the rise of the first kingdoms in the south. In the early seventh century, the pre-Islamic period quickly came to a close, from the beginning of Muhammad's preaching of Islam, to his establishment of the first Islamic state in 622 in Medina, and the subsequent conquest and political unification of the peninsula shortly after Muhammad's death, in the 630s. Some strands of Islamic tradition interpret the pre-Islamic period as a barbaric, morally unenlightened period known as the "Jahiliyyah" (Arabic: جَاهِلِيَّة), but historians have not adopted this convention. Pre-Islamic Arabia's demographics included both nomadic and settled populations, the latter of which eventually developed into distinctive civilizations. Eastern Arabia was home to the region's earliest civilizations, such as Dilmun, which is attested as a prominent trade partner of Mesopotamia during the Bronze Age; and its later pre-Islamic history is marked by the reign of consecutive Iranian empires, including those of the Parthians and the Sasanians. From the early 1st millennium BCE onward, South Arabia became home to a number of kingdoms, such as Sheba and Ma'in; while part of North Arabia became home to the Nabataean Kingdom, which was conquered and annexed by the Roman Empire in 106, becoming the province of Arabia Petraea (Roman Arabia) and starting a period of Roman influence. Arabian tribes and the southern kingdoms structured much of pre-Islamic society, and memory of these societies is filtered today through Islamic literature and pre-Islamic poetry. Pre-Islamic tribes engaged in warfare and formed alliances, and for most of their history practiced Arabian religions. Religion in pre-Islamic Arabia was diverse. Polytheism was prevalent for most of the region's history, with beliefs and practices having a common origin in ancient Semitic religion. Christianity, Judaism, and other forms of monotheism became common in the region in the fourth century, a trend driven by Christian proselytization from the Eastern Roman Empire and the Kingdom of Aksum, as well as the conversion to monotheism and Judaism by the elite of the Himyarite Kingdom. Territory The Arabian Peninsula is a region of great ecological and environmental diversity and gave rise to distinct forms of human occupation throughout the region. It has an area of 2.5 million km2 and includes the modern-day regions of Saudi Arabia, Yemen, Oman, the United Arab Emirates, Qatar, Bahrain, Kuwait, and parts of Jordan. The Peninsula has 7,000 km of coastline, and most of the interior is covered by vast desert wastelands, much of it dune fields. Before Islam, the territory implied by the word Arabia varied across the surviving sources, and it was not a synonym for the Arabian Peninsula. Instead, in the earliest sources, it encompassed both the peninsula and the steppe and desert wastes on the borders of Egypt and the Fertile Crescent. For Herodotus, an ancient Greek historian of the 5th century BC, "Arabia" referred to areas as far afield as eastern Egypt, the Sinai Peninsula, and the Negev. The Arabâya mentioned in Persian administrative sources includes the territory described by Herodotus, in addition to the areas of the Syrian desert. 
For Pliny the Elder, the Syrian desert (Arabia Deserta) itself was the territory of the "Arabia of the nomads". Prehistoric Arabia Prehistoric Arabia is the era of the history of the Arabian peninsula before its earliest documented civilizations. Early human migration into Arabia took place during the Paleolithic period. Human occupation was not continuous, but punctuated, heavily influenced by changing patterns of rainfall and precipitation, resulting in expansions, contractions, and migrations of early Arabian populations of humans. The earliest human settlements that have been found date back to 240,000–190,000 years ago, and the oldest human fossils known from Arabia are over 80,000 years old. The earliest human populations likely migrated into Arabia from Africa, settling along the eastern coastline. In the Neolithic period, Arabia witnessed a large demographic expansion, and humans began to widely settle the south and inland regions of Arabia. Eventually, by 6,000 years ago, the Arabian economy transitioned into one of nomadic pastoralism, but it continues to be debated whether this practice spread into Arabia through the migration of Levantine populations where it had already been established, or whether it was an internal development that may have come about from trade with the Levant.[page needed] Eastern Arabia Eastern Arabia is a geographic region that generally refers to the territories covered by modern-day Kuwait, Bahrain, Qatar, the east coast of Saudi Arabia, the United Arab Emirates, and Oman. The main languages in this region among sedentary peoples were Aramaic, Arabic, and, to some degree, Persian. The Syriac language also came to be spoken as a liturgical language. Many religions were practiced in the area. Practitioners included Arab Christians (including the tribe of Abd al-Qays), Aramean Christians, Persian-speaking Zoroastrians and Jewish agriculturalists. One hypothesis holds that the contemporary Baharna are descendants of Arameans, Jews, and Persians from the area. Zoroastrianism was also present in Eastern Arabia. The Zoroastrians of Eastern Arabia were known as "Majoos" in pre-Islamic times. The sedentary dialects of Eastern Arabia, including Bahrani Arabic, were influenced by the Akkadian, Aramaic and Syriac languages. During the Bronze Age, most of Eastern Arabia was part of the land of Dilmun, including modern-day Kuwait, Bahrain, Qatar and the adjacent coast of Saudi Arabia. Its capital was located in Bahrain. Dilmun is the earliest recorded civilization from Eastern Arabia, mentioned in written records in the 3rd millennium BCE, with archaeological evidence indicating activity from the fourth to first millennia BCE, its importance faltering after 1800 BCE. Dilmun is regarded as one of the oldest ancient civilizations in the Middle East in general. The Dilmun civilization was an important trading center which at the height of its power controlled the Persian Gulf trading routes. This was enabled by the region's natural advantages, including its abundant underground water supplies and easy anchorages for ships. It became a center for long-distance trade, and all types of commodities passed through it (trade extending to areas as far as the Indus Valley), including a variety of exotic goods. As a result, Dilmun became legendary in Mesopotamian literature. The Sumerians regarded Dilmun as a holy land and described it as a paradise garden in the Epic of Gilgamesh. 
The Sumerian tale of the garden paradise of Dilmun may have been an inspiration for the Garden of Eden story. Dilmun, sometimes described as "the place where the sun rises" and "the Land of the Living", is the scene of some versions of the Eridu Genesis, and the place where the deified Sumerian hero of the flood, Utnapishtim (Ziusudra), was taken by the gods to live forever. Thorkild Jacobsen's translation of the Eridu Genesis calls it "Mount Dilmun", which he locates as a "faraway, half-mythical place". Dilmun was mentioned in two letters dated to the reign of Burna-Buriash II (c. 1370 BCE) recovered from Nippur, during the Kassite dynasty of Babylon. These letters were from a provincial official, Ilī-ippašra, in Dilmun to his friend Enlil-kidinni in Mesopotamia. The names referred to are Akkadian. These letters and other documents hint at an administrative relationship between Dilmun and Babylon at that time. Following the collapse of the Kassite dynasty, Mesopotamian documents make no mention of Dilmun, with the exception of Assyrian inscriptions dated to 1250 BCE which proclaimed the Assyrian king to be king of Dilmun and Meluhha. Assyrian inscriptions recorded tribute from Dilmun. There are other Assyrian inscriptions from the first millennium BCE indicating Assyrian sovereignty over Dilmun. Dilmun was also at one point controlled by the Kassite dynasty in Mesopotamia. In the 19th and early 20th centuries, some historians believed that the Phoenicians originated from Eastern Arabia, particularly Dilmun. However, this theory has since been abandoned. Upon the Late Bronze Age collapse, the major powers across the Near East lost much of their strength, allowing many smaller and far-flung states to become more independent and a number of minor states to emerge. In addition, new scripts and alphabets were developed, which were of great utility to merchants for producing contracts and conducting related economic affairs. In turn, the Middle Assyrian Empire became much more prominent and undertook an aggressively expansionist policy. In the second half of the 13th century BCE, Tukulti-Ninurta I took on the title "king of Dilmun and Meluhha". Dilmun then disappears from Assyrian sources until the reign of Sargon II, king of the Neo-Assyrian Empire. Here and across the next century, Dilmun appears as a polity that was not directly ruled by Assyria, although it sent tribute to the Assyrian rulers in exchange for peace and independence. When the Neo-Babylonian Empire overthrew the Assyrian Empire, its influence on Dilmun is also attested. Finally, Dilmun fell under the sway of the Persians after the rise of the Achaemenid Empire, which replaced the Babylonian one. After Alexander the Great returned from his conquests that reached India and settled in Babylonia, the historian Arrian reports that he was turning his attention towards an invasion of the Arabian peninsula. Although he died too soon for this campaign to take place, Alexander did dispatch three intelligence-gathering missions, and it was these missions that greatly enhanced the Hellenistic world's knowledge of the peninsula. Alexander, and his Seleucid successors who took control of the relevant region near Arabia after his death, took an interest in Arabia because of the trade in luxury products happening there. In this context, Gerrha was an ancient city of great importance in Eastern Arabia. It was located on the west side of the Persian Gulf. 
Soon after the conquests of Alexander the Great, it became the most important center of trade for the Hellenistic world in the Gulf region, known for its transport of Arabian aromatics and of goods brought from as far away as India. It retained its prominence until the first centuries of the common era. While there is no certainty as to which archaeological site the Gerrha of Greek sources can be identified with, the most prominent candidates have been Thaj and Hagar (modern-day Hofuf). The Greeks also described Bahrain in their writings, referring to it by their exonym, Tylos. Tylos was a center of pearl trading when Nearchus, serving under Alexander the Great, came to discover it. From the 6th to 3rd century BCE Bahrain was part of the Achaemenid Empire, an Iranian dynasty. The Greek admiral Nearchus is believed to have been the first of Alexander's commanders to visit these islands, and he found a verdant land that was part of a wide trading network; he recorded: "That in the island of Tylos, situated in the Persian Gulf, are large plantations of cotton tree, from which are manufactured clothes called sindones, of very different degrees of value, some being costly, others less expensive. The use of these is not confined to India, but extends to Arabia." The Greek author Theophrastus states that much of the islands were covered in these cotton trees and that Tylos was famous for exporting walking canes engraved with emblems that were customarily carried in Babylon. It is not known whether Bahrain was part of the Seleucid Empire, although the archaeological site at Qalat Al Bahrain has been proposed as a Seleucid base in the Persian Gulf. Tylos was well integrated into the Hellenised world: the language of the upper classes was Greek (although Aramaic was in everyday use), while Zeus was worshipped in the form of the Arabian sun-god Shams. Tylos even became the site of Greek athletic contests. Another important player in this time was the Parthian Empire, which emerged from northeastern Iran and wrested a significant amount of territory from the eastern borders of the Seleucids. The Seleucids would eventually be vanquished by the Roman Empire over the course of the 1st century BCE, leading to the Romans and Parthians being the main power contenders in the region, separated by the Syrian desert. Parthian influence extended over the Persian Gulf and reached as far as Oman. Garrisons were established at the southern coast of the Gulf to help extend their power. A major shift in dynamics accompanied the replacement of the Parthian dynasty in Persian lands by the Sasanians, leading to the rise of the Sasanian Empire around 240 CE. Seizing the moment, Ardashir I, the first ruler of this dynasty, marched down the Persian Gulf to Oman and Bahrain and defeated Sanatruq II (or Satiran), probably the Parthian governor of Eastern Arabia. He appointed his son Shapur I as governor of Eastern Arabia. Shapur constructed a new city there and named it Vahman Ardashir after his father. At this time, Eastern Arabia was incorporated into the southern Sasanian province covering the Persian Gulf's southern shore plus the archipelago of Bahrain. The southern province of the Sasanians was subdivided into the three districts of Haggar (Hofuf, Saudi Arabia), Vahman Ardashir (al-Qatif province, Saudi Arabia), and Mishmahig (Muharraq, Bahrain; also referred to as Samahij), which included the Bahrain archipelago that was earlier called Aval. 
Sasanian interests in the region largely lay in controlling traffic through the Persian Gulf, but land incursions into the peninsula were occasionally undertaken, such as when the inhabitants of Eastern Arabia invaded southern Iran during the reign of Shapur II in the fourth century CE. During the political struggles of the sixth century, Khosrow I began to exert more direct rule over Eastern Arabia, including the direct appointment of regional governors. Another development of the Sasanian period is the rise of Christianity in Eastern Arabia. In antiquity, Syriac Christians referred to a region in northeast Arabia as Beth Qatraye, or the "region of the Qataris". This region encompassed a territory that, today, includes Bahrain, Tarout Island, Al-Khatt, Al-Hasa, Qatar, and possibly the United Arab Emirates. The region was also sometimes called "the Isles". By the 5th century, Beth Qatraye was a major centre for Nestorian Christianity, which had come to dominate the southern shores of the Persian Gulf. As a sect, the Nestorians were often persecuted as heretics by the Byzantine Empire, but eastern Arabia was outside the Empire's control, offering some safety. Several notable Nestorian writers originated from Beth Qatraye, including Isaac of Nineveh, Dadisho Qatraya, Gabriel of Qatar and Ahob of Qatar. Christianity's significance was diminished by the arrival of Islam in Eastern Arabia by 628. In 676, the bishops of Beth Qatraye stopped attending synods, although the practice of Christianity persisted in the region until the late 9th century. The dioceses of Beth Qatraye did not form an ecclesiastical province, except for a short period during the mid-to-late seventh century. They were instead subject to the Metropolitan of Fars. Oman and the United Arab Emirates comprised the ecclesiastical province known as Beth Mazunaye. The name was derived from 'Mazun', the Persian name for Oman and the United Arab Emirates. South Arabia South Arabia roughly corresponds to modern-day Yemen, with Oman being designated as part of Eastern Arabia. In contrast to the rest of Arabia, South Arabia is a self-contained cultural area that retained the independence of its cultural, political, and linguistic dynamics from the rise of its first known kingdoms until the end of Late Antiquity. The rise of the South Arabian kingdoms owed much to the construction of irrigation complexes that captured the precipitation of the biannual monsoon rains, enabling agriculture, and to the trade routes that carried incense and other spices, giving rise to tales of legendary wealth about the region among Greco-Roman observers. The South Arabian kingdoms emerged in the early first millennium BCE, and they included the Kingdoms of Ma'in, Saba, Hadhramaut, Awsan, and Qataban. Until recently, very little was known about human activity in South Arabia prior to the first millennium BCE. However, recent decades of archaeological work have begun to rapidly change this situation. This has helped increase the prominence of the discipline, though it has not yet become a mainstream topic in Near Eastern archaeology. In the third century CE, the Kingdom of Himyar emerged and conquered its neighbours to exert complete political domination over South Arabia. This situation persisted for several centuries, until the Himyarite polity unravelled over the course of the sixth century CE and experienced a societal collapse. The collapse had no single cause; instead, a number of coinciding events contributed to this situation. 
First, a rapid series of turbulent political events took place: the violent coup of Dhu Nuwas, the massacre of the Christian community of Najran, the Aksumite conquest of Himyar, and the rebellion of the Ethiopian soldiers in South Arabia against Aksum. Epidemic and climatological factors also contributed: one inscription (CIH 541) from the 550s indicates that the Plague of Justinian struck South Arabia. Severe droughts took place from 500 to 530 CE, and around the mid-6th century, there was the Late Antique Little Ice Age. Across the Arabian Peninsula, the effects of these factors were most severe in the South, and especially the Southwest. By the 550s and 560s, Himyar's decline was complete, as it faced military incursions from Central and Northwest Arabia, and local insurrections. In 559 CE, the final Himyarite inscription was recorded. The collapse of the traditional order is indicated by the breakdown of the Marib Dam over the course of the 570s. Growing influence from the Persian Sasanian Empire is evident towards the end of the 6th century. The Kingdom of Saba (or Sheba) was regarded by the South Arabian and earliest Ethiopian kingdoms as the locus for the birth of South Arabian civilization. The kingdom spoke Sabaic and constructed many impressive architectural complexes, such as the Marib Dam, which helped capture the monsoon rains through an irrigation network that laid the foundation for the emergence of a civilization, and many temples, including the Temple of Awwam for their national god Almaqah, where hundreds of inscriptions have been discovered. The Sabaeans had close contact with the cultures of the Horn of Africa. They went on to conquer both Eritrea and northern Ethiopia to establish the Kingdom of Dʿmt, where a hybrid Ethiosemitic script emerged. Under the leadership of Karib'il Watar, Saba dominated most of modern-day Yemen, a feat that would not be accomplished again in the region until the Himyarite Kingdom a thousand years later. Their legacy was remembered in both biblical and Islamic tradition, especially in the legendary story of the Queen of Sheba. The Sabaean kingdom emerged some time around the turn of the 1st millennium BCE. By the time the formative period of Sabaean history was complete, a fully developed alphabetic script was available, as well as the technological prowess to construct cities and other architectural complexes. There is some debate as to the degree to which the movement out of the formative phase was channeled by endogenous processes, or by the transfer of technologies from other centers, perhaps via trade and immigration. The first major phase of Sabaean civilization lasted from the 8th to the 1st centuries BCE. Rulers referred to themselves by the title Mukarrib ("federator") as testimony to the hegemony they exerted over neighbouring polities. The period was dominated by a caravan economy that had market ties with the rest of the Near East. Its first major trading partners were at Khindanu and the Middle Euphrates. Later, this moved to Gaza during the Persian period, and finally, to Petra in Hellenistic times. The South Arabian deserts gave rise to important aromatics which were exported in trade, especially frankincense and myrrh. Saba also acted as an intermediary for overland trade with neighbours in Africa and, further afield, India. Saba was a theocratic monarchy with a common cult surrounding their national god, Almaqah. 
Four other deities were also worshipped: Athtar, Haubas, Dhat-Himyam, and Dhat-Badan. The first Sabaean period came to a close as the Roman Republic expanded to conquer Syria and Egypt in 63 and 30 BCE, respectively. The Romans diverted the overland trade route through the Sabaean kingdom into a maritime trade route that went through the Hadhramaut port city of Qani. They even attempted to besiege Marib, the Sabaean capital, but were unsuccessful. Greatly weakened, Saba was annexed by the neighbouring Himyarite Kingdom. Saba was able to regain its independence around 100 CE, beginning a second period of its civilization. Notably, power dynamics had shifted from oasis cities like Marib and Sirwah to groups occupying the highlands. Ultimately, Himyar permanently re-annexed Saba around 275 CE. Awsan was a South Arabian kingdom that lasted from the 8th to 7th centuries BCE, with a brief resurgence in the 2nd or 1st century BCE. Awsan was centered on a wadi called the Wadi Markha. The name of the capital of Awsan is unknown, but it is assumed to be the tell that is today known as Hagar Yahirr, the largest settlement in the wadi. The territory under its control was sizable enough that it was a powerful contender in local power politics. In the late 7th century BCE, under the reign of its ruler Murattaʿ, Awsan entered a military conflict with the Kingdom of Saba that brought about its demise. The Sabaean king, Karib'il Watar, defeated Awsan and proceeded to obliterate it. An inscription left behind claims that Karib'il killed over 16,000 people and took 40,000 more as prisoners. After this event, the wadi was left abandoned, and Awsan disappeared from the historical record for the time being. Saba divided Awsan's territory between its then-allies, Qataban and Hadhramaut. Half a millennium later, when Qataban's control over the Wadi Markha was declining, Awsan was able to briefly re-emerge, in the 2nd or 1st centuries BCE. This final phase of the Awsanite kingdom is the only period in South Arabian history where kings were deified. The Minaeans, or the inhabitants of the Kingdom of Ma'in, had their capital at Qarna (modern-day Sa'dah). Another important city was Yathill (now known as Baraqish). The Minaean Kingdom was centered in northwestern Yemen, with most of its cities lying along Wādī Madhab. Ma'in was responsible for managing an international frankincense trade, and it set up a number of colonies across Arabia and the Mediterranean to manage it. For this reason, Minaic inscriptions have been found far afield of the Kingdom of Ma'in, as far away as al-'Ula in northwestern Saudi Arabia and even on the island of Delos and in Egypt. Qataban was one of the ancient Yemeni kingdoms which thrived in the Beihan valley. Like the other South Arabian kingdoms, it gained great wealth from the trade of frankincense and myrrh incense, which were burned at altars. The capital of Qataban was named Timna and was located on the trade route which passed through the other kingdoms of Hadhramaut, Saba and Ma'in. The chief deity of the Qatabanians was Amm, or "Uncle", and the people called themselves the "children of Amm". Hadhramaut is first mentioned, as an ally, in a 7th-century BCE inscription of the Sabaean king Karib'il Watar. For commercial reasons, Hadhramaut became one of the confederates of Ma'in when the Minaeans took control of the caravan routes. After the fall of Ma'in, it experienced a period of independence. 
Hadhramaut had to repel attacks by Himyar in the 1st century BCE, and managed to annex Qataban in the 2nd century CE, when it reached its greatest size. Ultimately, the kingdom fell to an invasion by the Himyarite king Shammar Yahri'sh in the 3rd century CE, making it the last of the South Arabian kingdoms to fall to Himyar. Himyar was a polity in the southern highlands of Yemen, as well as the name of the region which it claimed. Until 110 BCE, it was integrated into the Qatabanian kingdom, afterwards being recognized as an independent kingdom. According to classical sources, their capital was the ancient city of Zafar, relatively near the modern-day city of Sana'a. Himyarite power eventually shifted to Sana'a as the population increased in the fifth century. After the establishment of their kingdom, it was ruled by kings from the dhū-Raydān tribe, and the kingdom was named Raydān. The kingdom conquered neighbouring Saba' in c. 25 BCE (for the first time), Qataban in c. 200 CE, and Haḍramaut c. 300 CE. Its political fortunes relative to Saba' changed frequently until it finally conquered the Sabaean Kingdom around 280. After successive invasions and Arabization, the kingdom collapsed in the early sixth century, when the Kingdom of Aksum conquered it in 530 CE. The Himyarites originally worshiped most of the South Arabian pantheon, including Wadd, ʿAthtar, 'Amm and Almaqah. Since at least the reign of Malkikarib Yuhamin (c. 375–400 CE), Judaism was adopted as the de facto state religion. The religion may have been adopted to some extent as much as two centuries earlier, but inscriptions to polytheistic deities ceased after this date. It was embraced initially by the upper classes, and possibly a large proportion of the general population over time. Native Christian kings also ruled Himyar from around 500 CE until 521–522 CE; Christianity itself became the main religion after the Aksumite conquest in 530 CE. In response to the massacre of the Christian community of Najran under the reign of the Jewish king Dhu Nuwas, Kaleb, the Christian king of the Kingdom of Aksum, invaded and annexed Himyar. In the second half of the sixth century, the Sasanian Empire conquered the Himyarite Kingdom and ended Aksumite occupation of South Arabia. This event is not mentioned in Sasanian sources and is noted only in passing in Byzantine sources. The bulk of what has been written about the period comes from Arabic sources, most famously that of Al-Tabari in his History of the Prophets and Kings, relying on an earlier account by Ibn Ishaq. However, there are six major Arabic accounts describing the Sasanian conquest of South Arabia, and they differ over a range of major and minor details, including who the key actors were and their relative roles, the religious identities of some of the authors, the sizes of the armies, and so forth. In Al-Tabari's reporting, the Persian king Khosrau I sent troops under the command of Vahriz (Persian: اسپهبد وهرز), who helped the semi-legendary Sayf ibn Dhi Yazan to drive the Aksumites out of Yemen. Southern Arabia became a Persian dominion under a Yemenite vassal and thus came within the sphere of influence of the Sasanian Empire. After the demise of the Lakhmids, another army was sent to Yemen, making it a province of the Sasanian Empire under a Persian satrap. Following the death of Khosrau II in 628, the Persian governor in Southern Arabia, Badhan, converted to Islam and Yemen followed the new religion. 
Western Arabia (Hejaz) The history of the Hejaz has often centred around its major oasis cities, especially Yathrib (Medina), Fadak, Khaybar, Taymah, and Al-Ula. These cities benefited from regular access to water, and became major trading cities as early as the Bronze Age, especially with the domestication of the dromedary camel that allowed for long-distance trade, and the rise of the incense trade that demanded the movement of incense, spices and other luxury goods into the Eastern Mediterranean from South Arabia, passing through the Hejaz along the way. The Thamud (Arabic: ثمود) was an ancient civilization in the northwestern Hejaz with their center at Hegra from the eighth century BCE until the fifth century CE. They are attested in Mesopotamian, Classical, and Arabian sources. They are famously remembered in pre-Islamic poetry and the Quran. The Quran mentions them 26 times, as a polytheistic people destroyed by God for their rejection of the prophet Salih. In Quranic history, Thamud is part of a grander pattern of rebellion and destruction of past groups of people that did not heed the warnings of various prophets sent to them by God, along with others like the peoples of Ad, Lot, and Noah. When Salih calls Thamud to serve one God, they demand a sign from him. He presents them with a miraculous she-camel. Thamud, unconvinced, injures the camel; for this God destroys them, except Salih and his followers. The Islamic exegetical tradition embellishes the story with more details. In Islamic genealogy, Thamud is among the true Arab tribes (as opposed to the "Arabicized Arabs"). Lihyan, also called Dedan, was a northwestern kingdom whose language was Dadanitic. The kingdom existed sometime between the 5th and 1st centuries BCE, but the end-date is uncertain. It is unclear if Lihyan was conquered by the Nabataeans or if the Nabataeans captured the territory after Lihyan had already fallen. North Arabia The Kingdom of Qedar, the most organized of the North Arabian tribes, spanned at the height of its rule in the 6th century BCE a large area between the Persian Gulf and the Sinai. An influential force between the 8th and 4th centuries BCE, Qedarite monarchs are first mentioned in inscriptions from the Assyrian Empire. Some early Qedarite rulers were vassals of that empire, with revolts against Assyria becoming more common in the 7th century BCE. It is thought that the Qedarites were eventually subsumed into the Nabataean state after their rise to prominence in the 2nd century CE. Achaemenid Arabia corresponded to the lands between the Nile Delta (Egypt) and Mesopotamia, later known to Romans as Arabia Petraea. According to Herodotus, Cambyses did not subdue the Arabs when he attacked Egypt in 525 BCE. His successor Darius the Great does not mention the Arabs in the Behistun inscription from the first years of his reign, but does mention them in later texts. This suggests that Darius might have conquered this part of Arabia, or that it was originally part of another province, perhaps Achaemenid Babylonia, but later became its own province. Arabs were not considered subjects of the Achaemenids, as other peoples were, and were exempt from taxation. Instead, they simply provided 1,000 talents of frankincense a year. They participated in the Second Persian invasion of Greece (480–479 BCE) while also helping the Achaemenids invade Egypt by providing water skins to the troops crossing the desert. 
The Nabataeans are first mentioned as inhabiting the area east of the Syro-African rift between the Dead Sea and the Red Sea, that is, in the land that had once been Edom. Although the first sure reference to them dates from 312 BCE, it is possible that they were present much earlier. Josephus, writing in the Roman era (Jewish Antiquities 1.12.4), described the descendants of Ishmael as Arabs, linking them with the Nabataeans, the tribe of Nebaioth: twelve sons in all were born to Ishmael, Nabaioth(es), Kedar, Abdeêl, Massam, Masma, Idum(as), Masmes, Chodam, Thaiman, Jetur, Naphais, Kadmas. These occupied the whole country extending from the Euphrates to the Red Sea and called it Nabatene. And it is these who conferred their names on the Arabian nation (to tōn Arabōn ethnos) and its tribes. The identification of the Arabs as Ishmaelites has also been expressed by Apollonius Molon and Origen, and was later adopted by Eusebius and Jerome. Classical Arab historians sometimes name Nebaioth as an ancestor of Muhammad. However, the majority of traditions point to Kedar, another son of Ishmael, as his ancestor. Petra (from the Greek petra, meaning 'rock') lies in the Jordan Rift Valley, east of Wadi 'Araba in Jordan, about 80 km (50 mi) south of the Dead Sea. It came into prominence in the late 1st century BCE through the success of the spice trade. The city was the principal city of ancient Nabataea and was famous above all for two things: its trade and its hydraulic engineering systems. It was locally autonomous until the reign of Trajan, but it flourished under Roman rule. The town grew up around its Colonnaded Street in the 1st century, and by the middle of the 1st century had witnessed rapid urbanization. The quarries were probably opened in this period, and there followed virtually continuous building through the 1st and 2nd centuries CE. The Kingdom of Hatra (Arabic: مملكة الحضر, romanized: Mamlakat al-Ḥażr), also called the Kingdom of Arabaya and Araba, was a 2nd-century Arab kingdom centered on the city of Hatra, located between the Roman and the Parthian empires, mostly under Parthian suzerainty, in modern-day northern Iraq. In the first and second centuries, Hatra was ruled by a dynasty of Arab princes. Its capital rose to prominence and became an important religious center as a result of its strategic position along caravan trade routes. Hatra is one of the first Arab states to be established outside of the Arabian Peninsula. Hatra withstood repeated sieges: in the 2nd century by the Roman emperors Trajan and Septimius Severus, and in the 220s by the Sasanian king Ardashir I. The kingdom was finally conquered after the 240/41 capture of its capital by the Sasanians under Shapur I, who destroyed the city. Osroene, or Edessa, was one of several states that acquired independence from the collapsing Seleucid Empire through the Abgarid dynasty, established by the Osrhoeni, a nomadic Nabataean tribe from Southern Canaan and North Arabia, beginning in 136 BCE. Osroene's name either derives from the name of this tribe, or from Orhay (Urhay), the original Aramaic name of Edessa. Arab influence had been strong in the region. In his writings, Pliny the Elder refers to the natives of Osroene and Commagene as Arabs and the region as Arabia. Abgar II is called "an Arab phylarch" by Plutarch, while Abgar V is described as "king of the Arabs" by Tacitus. The Edessene onomasticon contains many Arabic names. 
The most common one in the ruling dynasty of Edessa was Abgar, a well-attested name among Arab groups of antiquity. Some members of the dynasty bore Iranian names, while others had Arabic names. Judah Segal notes that the names ending in "-u" are "undoubtedly Nabatean". The Abgarid dynasts spoke "a form of Aramaic". Abgar V is a legendary king who ruled Osroene at the time of Jesus, and is said to have been the first king to embrace Christianity. There is no doubt that Christianity came early to Osroene and was widely embraced by the reign of Abgar VIII the Great (177–212), who was either Christian himself or not at all hostile to Christians. The Christian writer Sextus Julius Africanus (c. 160 – c. 240) stayed at Abgar the Great's court in 195, and a Christian inscription was produced in Edessa from the same period as, or a few decades later than, the Inscription of Abercius from 216. It is estimated that Christianity was preached in Edessa from 160–170 onwards, and a flood in 201 destroyed "the temple of the church of the Christians", indicating a community large enough to have had a building of notable importance to the city at the time. Osroene endured for four centuries, with twenty-eight rulers occasionally named "king" on their coins. Most of the kings of Osroene were called Abgar or Manu and settled in urban centers. The Lakhmid kingdom was founded in the late third century. Spanning Eastern Arabia and Southern Mesopotamia, it existed as a dependency of the Sasanian Empire, though the Lakhmids held al-Hira as their own capital city and governed from there independently. For the Sasanians, the Lakhmids served as a buffer state to protect themselves from nomadic Arab invasions, and to project their own power over Arab territories. The Lakhmids were also rivals of the Ghassanids, another major Arab tribal confederation, which existed as a client state of the Byzantine Empire. The Lakhmids and Ghassanids fought proxy wars on behalf of the two empires, and assisted the empires when they entered more dire conflict. Under the leadership of Al-Mundhir III, the power of the Lakhmids reached its height, and they inflicted a devastating defeat on the Romans and Ghassanids at the Battle of Callinicum, to the point that the Romans paid them tribute to avoid invasion. However, Lakhmid power declined after the death of Al-Mundhir III. The Lakhmids became weak in the second half of the sixth century, leading to friction with the Persians. The final Lakhmid king, Al-Nu'man III, converted to Christianity as the religion grew at Al-Hira. Years later, as the Sasanians sought to take direct control over their borders with Arab groups, Al-Nu'man III was deposed by the Sasanians themselves around 602 CE, bringing an end to the Lakhmid kingdom. The Ghassanid kingdom was a major Arab tribal confederation founded in the early third century. Early on, the Ghassanids converted to Christianity and formed a cliental relationship with the Roman Empire, and later, the Eastern Roman (Byzantine) Empire. They served as rivals of the Lakhmid kingdom, the vassal of the Persian Sasanian Empire. Like the Lakhmids, they acted as a buffer state, preventing nomadic Arab incursions into Byzantine territory and projecting the power of the empire into Arabian lands. The Byzantines ultimately deposed them in the late sixth century, and they ceased to be an entity during the early Muslim conquests. 
Central Arabia Kinda was an Arab kingdom established by the Kinda tribe; the tribe's existence dates back to the second century BCE. The Kindites established a kingdom in Najd in Central Arabia; unlike the organized states of Yemen, its kings exercised influence over a number of associated tribes more by personal prestige than by coercive settled authority. Their first capital was Qaryat al-Faw, then known as Qaryat Dhāt Kāhil. According to Islamic tradition, Kindite supremacy over Central Arabia collapsed after the First Battle of Kulab. Ancient South Arabian inscriptions mention a tribe settling in Najd called kdt, who had a king called rbˁt (Rabi'ah) from ḏw ṯwr-m (the people of Thawr), who had sworn allegiance to the king of Saba' and Dhū Raydān. Since later Arab genealogists trace Kinda back to a person called Thawr ibn 'Uqayr, modern historians have concluded that this rbˁt ḏw ṯwrm (Rabī'ah of the People of Thawr) must have been a king of Kinda (kdt); the Musnad inscriptions mention that he was king both of kdt (Kinda) and qhtn (Qaḥṭān). They played a major role in the Himyarite-Ḥaḑramite war. Following the Himyarite victory, a branch of Kinda established themselves in the Marib region, while the majority of Kinda remained in their lands in central Arabia. In the mid-sixth century, the Byzantine emperor Justinian wanted to spread his influence over the Arabian Peninsula, in competition with the growing role of the Sasanians in the region. Justinian sent the ambassador Nonnosus to meet with the leadership of Central Arabia, a king named Kaïsos (Greek: Καισος, Arabic: Qays), the nephew of Aretha (Greek: Άρεθα, Arabic: Ḥārith), who is said to have ruled over both the Khindynoi (Greek: Χινδηνοι), or Kinda, and the Maadynoi (Greek: Μααδηνοι), or the Ma'add, the two most important tribes in the area in terms of territory and number. Kinda's leadership met Justinian at the Byzantine royal court and agreed to become his clients, receiving the title of phylarch, like that of the other rulers of Byzantium's Arab client-states, such as the Ghassanids. Ma'add was a confederation of nomadic and semi-nomadic groups occupying central Arabia, beyond the territorial domain of the major powers of its day: north of the direct territorial control of the Himyarite Kingdom, and south of that of the Lakhmids. The Ma'addites maintained independence from the empires and kingdoms to their north and south by living in remote areas and organizing militarized societies. In the 4th–6th centuries, their center was at a site called Ma'sal al‐Jumh in the Najd. In Islamic times, Ma'add was transformed into a folkloric ancestor for all Arabs. As time passed, Arab genealogy expanded, and Ma'add was reduced to being an ancestor of some of the "northern" Arabs. Economy and trade The economy and trade of pre-Islamic Arabia refer to the land- and sea-trade networks used by the inhabitants of pre-Islamic Arabia, both inter-regionally (between different regions of Arabia) and internationally. Famously, the Arabian Peninsula, situated between the Levant, Mesopotamia, Persia, Egypt, and the Horn of Africa, is known for its role in the ancient incense trade route, which saw the movement of spices from regions as distant as India to Europe. The documentation of trade in the region goes back to the 3rd millennium BCE, when Dilmun, a civilization covering most of Eastern Arabia, was known to Mesopotamian traders as a legendary source of wealth and goods in the Bronze Age. 
Interactions with foreign civilizations Shortly after the Roman annexation of Egypt in the aftermath of the Battle of Alexandria (30 BCE), the Roman emperor Augustus (27 BCE – 14 CE) set his eyes on the Arabian Peninsula and ordered the Prefect of Egypt, Aelius Gallus, on an expedition. Gallus invaded the south and captured major cities including Najran and Baraqish, but during his siege of Marib he ultimately had to turn back due to a growing number of problems besetting the expedition, especially water shortages. In the early second century CE, in the year 106, the Romans conquered the Nabataean Kingdom, whose territory spanned parts of Syria, Jordan, and the northwest of the Peninsula, with its capital at Petra. This conquest led to the creation of the new province of Arabia Petraea, sometimes known as Roman Arabia. The desert frontier of Arabia Petraea, representing the border between Roman and Arabian territory, was called by the Romans the Limes Arabicus. At the height of the Roman encroachment into the peninsula, regions of the northern Hejaz, particularly the city of Hegra (Mada'in Saleh), came under Roman occupation in the second, third, and early fourth centuries CE, attested by the Ruwafa inscriptions and archaeological finds. From the fourth century onwards, the Romans pulled back to Arabia Petraea, preferring indirect rule through proxy Arab tribal confederations, like the Ghassanids. In the sixth century, guided by the expansionist policies of the Byzantine emperor Justinian and to counter Sasanian influence over Eastern Arabia and the Persian Gulf, the Romans began to more directly re-assert themselves in the Arabian Peninsula. Tiran Island, part of modern-day Saudi Arabia and located between the Arabian and Sinai peninsulas, came under direct Roman rule. Within a few years, Justinian established a client network across the coast of Western Arabia and Central Arabia, and attempted to bring Himyar (the South Arabian kingdom) and Aksum (the Ethiopian kingdom) under his sway. Justinian sent an ambassador and diplomat, Nonnosus, to Kaisos, the joint ruler over the Kingdom of Kinda and Ma'add (a major tribal confederation), which were the main powers in Central Arabia. Nonnosus convinced Kaisos to come to the Byzantine capital, and there, Justinian persuaded Kaisos to agree to become a phylarch over Palestinian territories, with his brothers becoming Justinian's client rulers over Central Arabia. Justinian is also said to have been given rule over the "Palm Grove" (either Hegra or Tayma) by its leader Abu Karib, whom Justinian in turn made phylarch over the region. Another emperor, whose identity is not clear, had (according to Islamic tradition) the tribal confederation of Mudar, which ruled over the Hejaz, as one of his clients, and participated in joint military ventures with it. Mudar helped the Byzantines fight the Sasanians, while the Byzantines helped Mudar capture Mecca. The Persian empires, during both their Parthian and Sasanian phases, had a long history and presence in the Arabian Peninsula, especially in Eastern Arabia, located immediately across the Persian Gulf from Iran, and in Oman. In the third century, the Lakhmid kingdom emerged and became the main Arab client-state of the Sasanians for extending their hegemony into these areas of the peninsula. 
This allowed the Persians to exert their power without a direct presence, although sometimes they also operated directly in the peninsula; in 2019, a late antique Sasanian fort was discovered in the Batinah Plain of Oman at Fulayj. In the late sixth century, the Sasanian strategy changed as Himyar, the long-dominant power over South Arabia, began to crumble: through the Aksumite–Persian wars, the Sasanian Empire came to rule over South Arabia as well. In Arabic memory, the Battle of Dhu Qar in the early years of the seventh century is usually seen as the turning point, after which Arab tribes were no longer militarily subordinate to the Sasanians. The Hejaz, without being annexed into Sasanian territory, was often subject to Sasanian soft power exercised through economic and political means via the Lakhmids, ever since that kingdom's establishment in the third century. The Šahrestānīhā ī Ērānšahr, a Middle Persian document, places the region containing both Mecca and Medina in the domain of the Iranian empire during the third century. Medina may have been directly controlled, at one point, by Khosrow I, who is said to have appointed the Lakhmid king Al-Mundhir III (r. 569–581) over all Arabs living between, on the one side, Oman, Bahrain, and Al-Yamama, and, on the other side, Al-Ta'if and the rest of the Hejaz. A Sasanian governor, whose main seat was on the coast of the Persian Gulf, is said to have indirectly ruled Medina and the Tihama, where he was represented by an official called an ʿamīl. Banu Qurayza and Banu Nadir, two Jewish tribes, were said to have exacted tribute from two other tribes, Banu Aws and Banu Khazraj, on behalf of the Sasanians. Some sources also suggest a Sasanian presence in the Najd and Yemen to extract mineral resources, which would have involved Sasanian servicemen and laborers operating in the region with local involvement. This is evidenced by the Antiquities of South Arabia of Al-Hamdani, which transmits family names from these regions, some of which are Middle Persian. Earlier, Kavad I is said to have attempted to promote Mazdakism in the same area. In the second half of the sixth century, it is said that the Lakhmid ruler Al-Nu'man III (r. 582–602) appointed a king, ʿAmr b. al-Itnaba from the Khazraj, over Medina. Later, the Sasanians conquered South Arabia, replacing rule by the Kingdom of Aksum. The Sasanian emperor at this time, Khosrow II, may have considered Mecca to be part of his domain, as he is said to have sent a governor from Yamama to Mecca to collect taxes. Ethiopia, and the Horn of Africa at large, lies south-west of the Arabian Peninsula, separated from it only by the Red Sea. Contact, trade, and even warfare between Arabia and the civilizations in the Horn (including parts of modern-day Ethiopia, Eritrea, and Somalia) dates at least to the early 1st millennium BCE, when the Sabaean Kingdom may have established a state in parts of modern-day Eritrea and Ethiopia, called Da'amat, although the precise nature of the Sabaean cultural presence in this area is debated. After the collapse of Da'amat, contact between South Arabia and Ethiopia declined. This was revitalized after the establishment of the Kingdom of Aksum, a new and powerful Ethiopian kingdom centered in Eritrea and Ethiopia that emerged in the 1st century CE. Contact, and occasional conflict, continued between the two for centuries. Ethiopia maintained a policy of irredentism, believing that the southern territories of the peninsula rightfully belonged under its own rule. 
The Kingdom of Aksum conquered South Arabia in the early 6th century, and Ethiopian rule over South Arabia reached its height during the reign of Abraha, who conquered most of the peninsula. Control was finally lost to the Sasanian Empire during the Aksumite–Persian wars. Genealogical tradition Arab traditions relating to the origins and classification of the Arabian tribes are based on biblical genealogy. The general consensus among 14th-century Arab genealogists was that Arabs were of three kinds. Modern historians believe that these distinctions were created during the Umayyad period, to support the cause of different political factions. Religion Religion in pre-Islamic Arabia included pre-Islamic Arabian polytheism, ancient Semitic religions, and Abrahamic religions such as Judaism and Christianity. Other religions that may have existed in pre-Islamic Arabia are Samaritanism, Mandaeism, and Iranian religions like Zoroastrianism and Manichaeism. Arabian polytheism was, according to Islamic tradition, the dominant form of religion in pre-Islamic Arabia, based on veneration of deities and spirits. Worship was directed to various gods and goddesses, including Hubal and the goddesses al-Lāt, Al-'Uzzá and Manāt, at local shrines and temples, possibly including the Kaaba in Mecca. Deities were venerated and invoked through a variety of rituals, including pilgrimages and divination, as well as ritual sacrifice. Different theories have been proposed regarding the role of Allah in Meccan religion. Many of the physical descriptions of the pre-Islamic gods are traced to idols, especially near the Kaaba, which is said in Islamic tradition to have contained up to 360 of them. Other religions were represented to varying, lesser degrees. The influence of the adjacent Roman and Aksumite empires resulted in Christian communities in the northwest, northeast and south of Arabia. Christianity made a lesser impact, but secured some conversions, in the remainder of the peninsula. With the exception of Nestorianism in the northeast and the Persian Gulf, the dominant form of Christianity was Miaphysitism. The peninsula had been a destination for Jewish migration since pre-Roman times, which had resulted in a diaspora community supplemented by local converts. Additionally, the influence of the Sasanian Empire resulted in Iranian religions being present in the peninsula. While Zoroastrianism existed in eastern and southern Arabia, Manichaeism does not appear to have been present in Mecca. From the fourth century onwards, monotheism became increasingly prevalent in pre-Islamic Arabia, as is attested in texts like the inscriptions from Jabal Dabub, Ri al-Zallalah, and the Abd Shams inscription. Culture Michael C. A. MacDonald classifies societies as literate or non-literate based on the role played by writing in that society. Writing may be widespread, but if it is not involved in the administration of a society (contracts, treaties, letters and diplomacy, monumental inscriptions, etc.), the society is considered non-literate. By contrast, a literate society relies on writing for its administration. Contrary to myth, the ability to read and write was common in pre-Islamic Arabia, and pre-Islamic Arabia at large represented a literate society. Writing was used to different extents in different regions of the Peninsula. 
South Arabia was a literate society to the most extensive degree: not only are thousands of graffiti known, suggesting commoners knew how to write at a basic level, but thousands of public inscriptions also show that writing was used to run South Arabian societies across a wide range of purposes: legal, commemorative, and dedicatory, as well as for issuing public decrees and documenting history. The major oasis towns in North and West Arabia were also literate societies. The northern nomadic Arabs were non-literate societies, but the tens of thousands of pre-Islamic Arabian inscriptions found even among these societies suggest that the ability to write was common, and was mainly used for entertainment and passing time. No native literature survives from pre-Islamic Arabia; however, over 65,000 pre-Islamic inscriptions have been found on stone, metal, pottery, wood, and other surfaces, and published. These inscriptions suggest a copious literature once existed in the area, but it has not survived, likely because it was written on perishable materials. Most of these inscriptions are from North Arabia, where 50,000 inscriptions are known. The remaining 15,000 are from South Arabia. The Arabian corpus of inscriptions is more extensive than those of Ugaritic, Phoenician and Punic, Aramaic, and Hebrew. It is second in size only to Akkadian, but lags behind in the field of Semitic studies due to a lack of accessible tools. While not written down in the pre-Islamic period, a vibrant poetic culture may have existed at the time. Pre-Islamic Arabia was Hellenized, a process by which local cultures mixed with the Greco-Roman culture spread by Alexander the Great's conquests. After Alexander's death, and before reaching the Arabian Peninsula, Hellenistic and later Roman rule were imposed for centuries on Arabic-speaking populations in Syria, Jordan, and Palestine. Hellenization first reached the peninsula in the 3rd century BCE, in Eastern Arabia, as shown by amphorae from Rhodes and Chios found in that region. In South Arabia, Hellenization began in the 2nd or 1st centuries BCE, around the time of the last of the kings of Qataban. In the resurgent period of the South Arabian Kingdom of Awsan, during the 2nd or 1st centuries BCE, an evolution of the iconography of the kings is seen: they are transformed from wearing traditional South Arabian clothing to being shown dressed as a Roman citizen would be, with curly hair and wearing a toga. At the Kingdom of Saba, the traditional Near Eastern norms of religious iconography gave way to Roman and Hellenistic anthropomorphic styles around the turn of the Christian era. In Central Arabia, statues of Greek deities like Artemis, Heracles, and Harpocrates have been found at Qaryat al-Faw, the former capital of the Kingdom of Kinda. Roman military presence, coins, and Greek and Latin inscriptions have been documented in many sites in Saudi Arabia, including invocations of Roman gods in the northwestern Hejaz. Many Arabic inscriptions mention the emperor (as Caesar) and Rome or Romans, citing them with the words rm, ʾl rm, or hrm. One Arabic inscription accompanies a rock drawing of a Roman soldier in a plumed helmet and lamellar armor on horseback. In 106 CE, the Roman Empire conquered the Nabataean Kingdom and set up a province called Arabia Petraea (Roman Arabia) encompassing both northern Arabia and the northwest Hejaz. 
Roman military encampments were set up at Hegra (Mada'in Salih) in the Medina Province, at Ruwafa, and as far south as the Farasan Islands. The Limes Arabicus was the desert frontier that separated the Roman Empire from the rest of the Arabian Peninsula. Christianity expanded into pre-Islamic Arabia. The Letter of the Archimandrites, dating to 569/570, composed in Greek but preserved in Syriac, demonstrates a dense network of churches and monasteries in Roman Arabia. Arabic-speaking tribes were gradually converting to Christianity or becoming foederati of the emperor, resulting in increasing integration into the Roman world over time. In the mid-sixth century, for example, Justinian I was closely allied with the Ghassanids, a Hellenized Christian Arab kingdom. Hellenistic influences are also seen in the corpus of pre-Islamic Arabic poetry. The art is similar to that of neighbouring cultures. Pre-Islamic Yemen produced stylized heads in alabaster (the most common material for sculpture) of great aesthetic and historic charm. Late Antiquity The early 7th century in Arabia began with the longest and most destructive period of the Byzantine–Sasanian Wars. It left both the Byzantine and Sasanian empires exhausted and susceptible to third-party attacks, particularly from nomadic Arabs united under a newly formed religion. According to historian George Liska, the "unnecessarily prolonged Byzantine–Persian conflict opened the way for Islam". The demographic situation also favoured Arab expansion: overpopulation and lack of resources encouraged Arabs to migrate out of Arabia. Before the Byzantine–Sasanian War of 602–628, the Plague of Justinian had erupted (541–542), spreading through Persia and into Byzantine territory. The Byzantine historian Procopius, who witnessed the plague, documented that citizens died at a rate of 10,000 per day in Constantinople. The exact number, however, is often disputed by contemporary historians. Both empires were permanently weakened by the pandemic as their citizens struggled to deal with death as well as heavy taxation, which increased as each empire campaigned for more territory. Despite almost succumbing to the plague, the Byzantine emperor Justinian I (reigned 527–565) attempted to resurrect the might of the Roman Empire by expanding into Arabia. The Arabian Peninsula had a long coastline for merchant ships and bordered an area of lush vegetation known as the Fertile Crescent, which could help fund his expansion into Europe and North Africa. The drive into Persian territory would also put an end to tribute payments to the Sasanians, which had resulted from an agreement to give 11,000 lb (5,000 kg) of tribute to the Persians annually in exchange for a ceasefire. However, Justinian could not afford further losses in Arabia. The Byzantines and the Sasanians sponsored powerful nomadic mercenaries from the desert, powerful enough to deter the possibility of aggression in Arabia. Justinian viewed his mercenaries as so valuable for preventing conflict that he awarded their chief the titles of patrician, phylarch, and king, the highest honours that he could bestow on anyone. By the late 6th century, an uneasy peace held until disagreements erupted between the mercenaries and their patron empires. The Byzantines' ally was a Christian Arab tribe from the frontiers of the desert known as the Ghassanids. The Sasanians' ally, the Lakhmids, were also Christian Arabs, but from what is now Iraq. However, denominational disagreements about the nature of Christ forced a schism in the alliances. 
The Byzantines' official religion was Orthodox (Chalcedonian) Christianity, which held that Jesus Christ united two natures, divine and human, within one person. The Ghassanids, as Monophysite Christians, believed that Christ had only one nature. This disagreement proved irreconcilable and resulted in a permanent break in the alliance. Meanwhile, the Sasanian Empire broke its alliance with the Lakhmids due to false accusations that the Lakhmids' leader had committed treason; the Sasanians annexed the Lakhmid kingdom in 602. The fertile lands and important trade routes of Iraq were now open ground for upheaval. When the military stalemate was finally broken and it seemed that Byzantium had gained the upper hand in battle, nomadic Arabs invaded from the desert frontiers, bringing with them a new social order that emphasized religious devotion over tribal membership. The political apparatus created by Muhammad (d. 632) was able to conquer Arabia within a few years of his death. Afterwards, this group invaded the Near East, entering both Sasanian and Byzantine territory. Within a few decades, the Sasanian empire had fallen entirely, and Byzantine territories in the Levant, the Caucasus, Egypt, Syria and North Africa had also been taken. By the end of the seventh century, an empire stretching from the Pyrenees Mountains in Europe to the Indus River valley in South Asia had been established. Sources of information Detailed narrative literature that records or summarizes the historical past is absent from pre-Islamic Arabia. "There is no Arabian Tacitus or Josephus to furnish us with a grand narrative." Contemporary information comes from the archaeology of the Arabian Peninsula, from pre-Islamic Arabian inscriptions, and from literary accounts by observers outside the peninsula (including Assyrians, Babylonians, Israelites, Greeks, Romans, and Persians). Texts specifically related to Arabia in the pre-Islamic period include the Periplus of the Erythraean Sea, parts of the 16th book of Strabo's Geography, the Book of the Himyarites, Jacob of Serugh's Letter to the Himyarites, the Letter of the Archimandrites of Arabia, the Martyrdom of Arethas, and the Martyrdom of Azqir. At the turn of the Islamic era, the Quran and the Constitution of Medina can act as helpful literary sources for learning about pre-Islamic Arabia. During the Islamic era, many scholars collected or wrote about pre-Islamic times. For example, many efforts were made beginning in the eighth century to compile pre-Islamic Arabic poetry, some of which is considered authentic by historians. One narrative genre, the Days of the Arabs, consists of later attempts to document the notable episodes of warfare in pre-Islamic Arabia. Another literary genre focused on the genealogies of peoples, tribes, and kingdoms in pre-Islamic Arabia. A number of Muslim scholars wrote what would become, in Islamic memory, major sources for understanding the pre-Islamic Arabian past, although they are permeated by large amounts of legendary material. Some of these include the Book of Idols by Ibn al-Kalbi, the History of the Prophets and Kings by Al-Tabari, Al-Iklil by al-Hamdani, The Book of Crowns on the Kings of Himyar by Ibn Hisham, the History of al-Ya'qubi by Al-Ya'qubi, the Life of the Prophet by Ibn Ishaq, the Kitab al-Ma'arif of Ibn Qutaybah, and the Book of the Conquests of Egypt of Ibn ʿAbd al-Ḥakam. 
A few Persian sources for pre-Islamic Arabia, likewise written down during the Islamic period, also exist, such as the Bundahišn, the Shahnameh by Ferdowsi, and the Šahrestānīhā ī Ērānšahr. Systematic archaeology in the region is ongoing but began only recently; for this reason, no firm chronology has yet been established for Arabian material culture. Numismatics, the study of coins, has also shed light on coin legends, iconography, and the history of rulership. The archaeology of Eastern Arabia is the most advanced of any single region so far; Eastern Arabia is also the region with the earliest documented literary history, attested in Mesopotamian sources as far back as 2500 BCE. Documentation of North and South Arabia begins around 900 BCE. No single source of information is perfect: sources can be late, incomplete, or biased. See also Notes Bibliography Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Single-player_video_game] | [TOKENS: 1077] |
Contents Single-player video game A single-player video game is a video game where input from only one player is expected throughout the gameplay. Video games in general can feature several game modes, including single-player modes designed to be played by a single player in addition to multi-player modes. Most modern console games, PC games and arcade games are designed so that they can be played by a single player; although many of these games have modes that allow two or more players to play (not necessarily simultaneously), very few actually require more than one player for the game to be played. The Unreal Tournament series is one such example. History The earliest video games, such as Tennis for Two (1958), Spacewar! (1962), and Pong (1972), were symmetrical games designed to be played by two players. Single-player games gained popularity only after this, with early titles such as Speed Race (1974) and Space Invaders (1978). The reason for this, according to Raph Koster, comes down to a combination of several factors: increasingly sophisticated computers and interfaces that enabled asymmetric gameplay, cooperative gameplay and story delivery within a gaming framework, coupled with the fact that the majority of early game players had introverted personality types (according to the Myers–Briggs Type Indicator). Although most modern games incorporate a single-player element either as the core or as one of several game modes, single-player gaming had been viewed by the video game industry as peripheral to the future of gaming, with Electronic Arts vice president Frank Gibeau stating in 2012 that he had not approved a single game to be developed as a single-player experience. The question of the financial viability of single-player AAA games was raised following the closure of Visceral Games by Electronic Arts (EA) in October 2017. Visceral had been a studio that established itself on a strong narrative single-player focus with Dead Space, and had been working on a single-player, linear narrative Star Wars game at the time of the closure; EA announced following this that it would be taking the game in a different direction, specifically "a broader experience that allows for more variety and player agency". Many commentators felt that EA made the change because it did not have confidence that a studio with an AAA-scale budget could produce a viable single-player game based on the popular Star Wars franchise. This, alongside the relatively poor sales in the preceding year of games that were principally single-player AAA titles (Resident Evil 7, Prey, Dishonored 2, and Deus Ex: Mankind Divided) compared with financially successful multiplayer games and those offering a games-as-a-service model (Overwatch, Destiny 2, and Star Wars Battlefront 2), was an indicator to many that the single-player model for AAA games was waning. Manveer Heir, who had left EA after finishing his gameplay design work for Mass Effect Andromeda, acknowledged that the culture within EA was against the development of single-player games, and with Visceral's closure, "that the linear single-player triple-A game at EA is dead for the time being". On December 7, 2017, Bethesda collaborated with Lynda Carter to launch a public service announcement aimed at saving single-player gaming. A few years later, in 2021, EA was reported to have revived its interest in single-player games, following the successful launch of Star Wars Jedi: Fallen Order in 2019. 
The company still planned on releasing live service games with multiplayer components, but began evaluating its IP catalog for more single-player titles to revive, such as a remake of the Dead Space franchise. Around the same time, the head of Xbox Game Studios, Phil Spencer, said that they still saw a place for narrative-driven single-player games even though the financial drivers of the market tended to be live service games. Spencer said that developing such games with AAA-scale budgets can be risky, but that with the availability of services like cloud gaming and subscription services, they can gauge audience reaction to these games early on and reduce the risk involved before release. Game elements As the narrative and conflict in single-player gameplay are created by a computer rather than a human opponent, single-player games are able to deliver certain gaming experiences that are typically absent, or de-emphasised, in multiplayer games. Single-player games rely more heavily on compelling stories to draw the player into the experience and to create a sense of investment. Humans are unpredictable, so human players, whether allies or enemies, cannot be relied upon to carry a narrative in a particular direction, and so multiplayer games tend not to focus heavily on a linear narrative. By contrast, many single-player games are built around a compelling story. While a multiplayer game relies upon human-to-human interaction for its conflict, and often for its sense of camaraderie, a single-player game must build these things artificially. As such, single-player games require deeper characterisation of their non-player characters in order to create connections between the player and the sympathetic characters, and to develop deeper antipathy towards the game's antagonists. This is typically true of role-playing games (RPGs), such as Dragon Quest and the Final Fantasy series, which are primarily character-driven, each with its own distinct setting. These game elements are not firm, fixed rules; single-player puzzle games such as Tetris, or racing games, focus squarely on gameplay. See also References Bibliography |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Silver_bullet] | [TOKENS: 1039] |
Contents Silver bullet In folklore, a silver bullet is often one of the few weapons that are effective against a werewolf, vampire, witch, or other supernatural being. The term silver bullet is also a metaphor for a simple, seemingly magical, solution to a difficult problem: for example, penicillin c. 1930 was a "silver bullet" or magic bullet that allowed doctors to treat and successfully cure many bacterial infections. In folklore Some authors have asserted that the idea of the werewolf's supposed vulnerability to bullets cast from silver dates back to the Beast of Gévaudan, a man-eating animal killed by the hunter Jean Chastel in the year 1767. However, the claim that Chastel used a gun loaded with silver bullets derives from a distorted detail drawn primarily from Henri Pourrat's Histoire fidèle de la bête en Gévaudan (1946). In this novel, the French writer imagines that the beast was shot thanks to medals of the Virgin Mary, worn by Jean Chastel in his hat and then melted down to make bullets. An account from a resident of Jämtland, Sweden, describes silver bullets being used as the means to kill were-bears in 1936. A great bear that is invulnerable except to silver bullets also features in the 1891 novel Gösta Berling's Saga. Swedish folklore also describes silver bullets as effective against the skogsrå, and against the beguiling mermaid (sjöjungfru), as also told in Swedish-speaking parts of Finland. A Swedish superstition held that carrying a human leg taken from the churchyard made one invulnerable except to a silver bullet. In the Brothers Grimm fairy tale The Two Brothers, a bullet-proof witch is shot down by silver buttons fired from a gun. In some epic folk songs about the Bulgarian rebel leader Delyo, he is described as invulnerable to normal weapons, driving his enemies to cast a silver bullet in order to murder him. In popular culture A number of fictional Wild West heroes used silver bullets as weapons, to symbolize their purity of heart. The best known of these was the Lone Ranger in all his incarnations: after solving the problem of the week, he would leave a silver bullet behind as his mark. Clayton Moore, who played the Ranger in the television series, was known to give away silver bullet props, made from aluminum, to fans in the 1950s. Fantasy-horror has continued the use of silver bullets as monster-slayers. In an untitled early Batman serial from 1939, written by Gardner Fox for Detective Comics issues 31 and 32, Batman declares that "only a silver bullet may kill a vampire," and swiftly forges such a weapon to defeat the coven of vampires who kidnapped Bruce Wayne's fiancée. The 1941 film The Wolf Man, and its sequels and spinoffs, codified silver (whether in bullet form or otherwise) as the definitive death-dealer for werewolves, to the point where this weakness is often regarded as exclusive to lycanthropes. Notable film examples are Silver Bullet (1985) and Cursed (2005), with the latter being in part a self-referential spoof of the 1941 film. The Strain novels by Guillermo del Toro and Chuck Hogan return the silver bullet to its earlier status as a weapon against the strigoi, who are broadly analogous to vampires. Ballistic effectiveness Silver bullets differ from lead bullets in several respects. Lead is about 8% denser than silver (11.34 versus 10.49 g/cm³), so a silver bullet will have a little less mass than a lead bullet of identical dimensions. 
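The mass difference claimed above follows directly from the two metals' handbook densities. The short Python sketch below works through the arithmetic; the bullet volume used is an arbitrary illustrative figure rather than a specific cartridge.

```python
# Compare the mass of silver and lead bullets of identical dimensions.
# Densities are standard handbook values in g/cm^3; the volume is an
# arbitrary illustrative figure, not a real cartridge specification.
DENSITY_LEAD = 11.34
DENSITY_SILVER = 10.49

bullet_volume_cm3 = 1.5  # hypothetical bullet volume

mass_lead = DENSITY_LEAD * bullet_volume_cm3
mass_silver = DENSITY_SILVER * bullet_volume_cm3

print(f"lead bullet:   {mass_lead:.2f} g")
print(f"silver bullet: {mass_silver:.2f} g")
print(f"silver is {(1 - mass_silver / mass_lead) * 100:.1f}% lighter")
# Output shows the silver bullet coming out roughly 7-8% lighter.
```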
Pure silver is less malleable than lead and falls between lead and copper in both hardness (lead 1.5, silver 2.5, copper 3.0 on the Mohs scale) and shear modulus (lead 5.6, silver 30, copper 48 GPa). A silver bullet still accepts the rifling of a gun barrel. The terminal impact is somewhat speculative and will depend on a variety of factors including bullet size and shape, flight distance, and target material. At short ranges, the silver bullet will most likely give better penetration due to its higher shear modulus, and will not deform as much as a lead bullet. A 2007 episode of MythBusters demonstrated a greater penetration depth for lead bullets than for silver bullets; the experiment used a 250-grain (16 g) lead slug in a .45 Long Colt shell versus a lighter, 190-grain (12 g) silver slug fired at closer range. Another MythBusters episode, from 2012, showed that silver bullets are less accurate than lead bullets when fired from an M1 Garand. Michael Briggs also carried out experiments comparing silver bullets with lead bullets. After making a custom mold to ensure that the silver bullets were comparable in size to the lead bullets, he fired them and found that the silver bullets were slightly slower and less accurate than the lead bullets. See also Explanatory notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-403] | [TOKENS: 12858] |
Contents Minecraft Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. Originally created by Markus "Notch" Persson using the Java programming language, Jens "Jeb" Bergensten was handed control over the game's development following its full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios hold the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of third-person perspectives. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity, instead maintaining their voxel position in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. They may also freely craft helpful blocks—such as furnaces which can cook food and smelt ores, and torches that produce light—or exchange items with villagers (NPC) through trading emeralds for different goods and vice versa. 
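As a rough illustration of the block-and-voxel model described above, the Python sketch below stores only the non-air blocks of a world in a dictionary keyed by integer coordinates. It is a hypothetical toy model, not Mojang's actual storage format, which groups blocks into fixed-size chunks.

```python
# Toy sparse voxel store: only non-air blocks are kept, keyed by integer
# (x, y, z) coordinates. A hypothetical illustration of the block model
# described above, not Mojang's actual chunk-based storage format.
from typing import Dict, Tuple

Coord = Tuple[int, int, int]
AIR = "air"

class VoxelWorld:
    def __init__(self) -> None:
        self._blocks: Dict[Coord, str] = {}

    def place(self, pos: Coord, block: str) -> None:
        """Place a block at pos, mirroring the game's core 'place' action."""
        self._blocks[pos] = block

    def mine(self, pos: Coord) -> str:
        """Remove and return the block at pos; empty positions read as air."""
        return self._blocks.pop(pos, AIR)

    def block_at(self, pos: Coord) -> str:
        return self._blocks.get(pos, AIR)

world = VoxelWorld()
world.place((0, 64, 0), "dirt")
world.place((0, 65, 0), "torch")
print(world.block_at((0, 64, 0)))  # dirt
print(world.mine((0, 65, 0)))      # torch; the position becomes air again
print(world.block_at((0, 65, 0)))  # air
```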
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve or Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities) including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentionally and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. 
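To make the role of the map seed described above concrete, the sketch below derives a repeatable surface height for each column of blocks from the seed alone. It is only a hypothetical stand-in for the game's real generator, which layers smooth noise, biomes, caves, and structures, but it shows the key property: the same seed always reproduces the same terrain.

```python
# Minimal illustration of seed-driven, deterministic terrain generation.
# Not the game's actual algorithm: a real generator layers noise functions,
# biomes, caves, and structures. This only demonstrates reproducibility.
import hashlib

def column_height(seed: int, x: int, z: int, base: int = 64, spread: int = 16) -> int:
    """Derive a repeatable surface height for the column at (x, z)."""
    digest = hashlib.sha256(f"{seed}:{x}:{z}".encode()).digest()
    return base + digest[0] % spread  # same seed and coordinates -> same height

seed = 1234567890
row_a = [column_height(seed, x, 0) for x in range(8)]
row_b = [column_height(seed, x, 0) for x in range(8)]
assert row_a == row_b  # regenerating with the same seed reproduces the world
print(row_a)
```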
The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal, which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough. The poem takes about nine minutes to scroll past and is the game's only narrative text, as well as the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar, or continuously on peaceful difficulty. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then respawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn after 5 minutes. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, animal breeding, and cooking food. Experience can then be spent on enchanting tools, armor and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing the player to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update, and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to let players experience the map as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance. 
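The Survival-mode rules above can be summarized as a small state machine: hunger drains over time, health regenerates while the hunger bar is full, and an empty hunger bar causes starvation damage. The sketch below is a deliberately simplified toy model; the thresholds, rates, and difficulty interactions in the actual game differ.

```python
# Toy model of the Survival-mode health/hunger loop described above.
# Values and thresholds are illustrative only; the game's real rules
# (saturation, difficulty caps, regeneration rates) are more involved.
from dataclasses import dataclass

@dataclass
class SurvivalPlayer:
    health: int = 20  # full health bar
    hunger: int = 20  # full hunger bar

    def eat(self, food_points: int) -> None:
        self.hunger = min(20, self.hunger + food_points)

    def tick(self, peaceful: bool = False) -> None:
        if peaceful or self.hunger == 20:
            self.health = min(20, self.health + 1)   # regenerate when well fed
        elif self.hunger == 0:
            self.health = max(0, self.health - 1)    # starvation damage
        if not peaceful:
            self.hunger = max(0, self.hunger - 1)    # activity slowly drains hunger

    @property
    def dead(self) -> bool:
        return self.health <= 0

player = SurvivalPlayer(health=10)
player.tick()   # hunger is full, so one point of health is regained
player.eat(1)   # eating tops the hunger bar back up
print(player.health, player.hunger, player.dead)  # 11 20 False
```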
Multiplayer in Minecraft enables multiple players to interact and communicate with each other in a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, by using a host provider, or by hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers offer a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run multiplayer games easily and safely without having to set up their own server. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Owners of Realms servers for the Bedrock Edition can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support, along with support for virtual reality devices, to come later in 2017. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018. 
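For the player-hosted servers described above, a dedicated Java Edition server is configured chiefly through a plain-text server.properties file of key=value pairs. The sketch below writes a minimal, hypothetical configuration; the keys shown (motd, max-players, pvp, white-list, level-seed, difficulty) follow the commonly documented format, but the full option set varies by game version and should be checked against the server documentation for the version actually being run.

```python
# Write a minimal, hypothetical server.properties for a Java Edition server.
# Key names follow the commonly documented key=value format; verify them
# against the documentation for the specific server version in use.
from pathlib import Path

settings = {
    "motd": "A small private survival server",  # name shown in the server list
    "max-players": 10,                           # cap on simultaneous players
    "pvp": "true",                               # allow player-versus-player combat
    "white-list": "true",                        # only whitelisted usernames may join
    "level-seed": "1234567890",                  # fixes the world's map seed
    "difficulty": "normal",
}

lines = [f"{key}={value}" for key, value in settings.items()]
Path("server.properties").write_text("\n".join(lines) + "\n", encoding="utf-8")
print(Path("server.properties").read_text(encoding="utf-8"))
```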
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specifically for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013, and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another, based on Fallout, was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and that when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue and released a statement explaining that "the code would not be run or read by the game itself" and would execute only when the image file containing the skin was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue. Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. 
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the game's style, including the return of the first-person mode, the "blocky" visual aesthetic, and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson decided to release a full 1.0 version—the second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies including Activision Blizzard and Electronic Arts. The deal with Microsoft was finalized on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions usually received annual major updates—free to players who had purchased the game—each primarily centered on a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in its update strategy; rather than releasing large updates annually, it opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically-based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020. 
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs. It cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need for RTX-capable hardware. Vibrant Visuals was released as a part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned to release on Java Edition at a later date. Development of the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—began in May 2009; on 13 May, Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though this apparent acquisition later became controversial, and its legitimacy was questioned due to CraftBukkit's open-source nature and licensing under the GNU General Public License and Lesser General Public License. In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011. 
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay to other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One, and was renamed to the Bedrock Edition. The console versions of Minecraft debuted with the Xbox 360 edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, and a physical copy available on a later date. The game is compatible only with the New Nintendo 3DS or New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems. On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. The Bedrock Edition released a native version for PlayStation 5 on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well. An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows. 
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in Autumn 2018. It was released to the App Store on 6 September 2018. On 27 March 2019, it was announced that it would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store compatible Chromebooks. The full game was released to the Google Play Store for Chromebooks on 7 August 2020. On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023. The Windows version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added new features to this version of Minecraft, such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other version. Both game versions would otherwise remain separate. Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after a character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive. Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month. 
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the processes for the game, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced on creating the in-game sound for grass blocks, stating "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborates, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of the sound design decisions by Rosenfeld were done accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose the music of Minecraft, Rosenfeld used the package from Ableton Live, along with several additional plug-ins. Speaking on them, Rosenfeld said "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music that was added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015. 
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", introducing pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine remaining as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the game's mini games from the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record as of then had tallied up to be longer than the previous two albums combined, which in total clocks in at over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has since not seen release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether or not there was still a third volume of his music intended for release. Rosenfeld responded, saying, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know." Reception Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment in Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has been generally received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed about the troublesome steps needed to set up multiplayer servers, calling it a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version. 
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they acclaimed the port's addition of a tutorial and in-game tips and crafting recipes, saying that they make the game more user-friendly. The Xbox One Edition was one of the best received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best received port to date, being praised for having 36 times larger worlds than the PlayStation 3 edition and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds. Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content. Minecraft surpassed over a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and has never been commercially advertised except through word of mouth, and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game, and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014[update], the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies. The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke the Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day. 
As of 4 April 2014[update], the Xbox 360 version has sold 12 million copies. In addition, Minecraft: Pocket Edition has reached a figure of 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft were sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version has sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms with over 126 million monthly active players. By April 2021, the number of active monthly users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth best game of the year as well as the eighth best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At Game Developers Choice Awards 2011, Minecraft won awards in the categories for Best Debut Game, Best Downloadable Game and Innovation Award, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category, and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game Of The Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list. 
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for the Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award - PC and Console. Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, citing emails he received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang claimed, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Notch's decision to sell Mojang. In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated. In June 2022, Mojang added a player-reporting feature in Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones. 
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four. The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts; initially, the winning mob was to be implemented in a future update while the losing mobs were scrapped, though after the first vote this was changed so that losing mobs would still have a chance to come to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning the vote. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced the Mob Vote would be retired. Cultural impact In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model to draw in sales prior to its full release and help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development. Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos, often made by commentators, began to gain influence on YouTube. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game would go on to be a prominent fixture within YouTube's gaming scene during the entire 2010s; in 2014, it was the second-most searched term on the platform. 
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has also created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot in an older version of the game. YouTube announced on 14 December 2021 that the total number of Minecraft-related views on the website had exceeded one trillion. Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character whose moveset includes references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age. The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering using Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap. In September 2012, Mojang began the Block by Block project in cooperation with UN Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood. 
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments into Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements, and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture. The ideas presented by the citizens were a template for political decisions. In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft based on its own geodata. This was possible because Denmark is one of the flattest countries, with its highest point at 171 meters (ranking as the country with the 30th-smallest elevation span), while the build limit in default Minecraft was around 192 meters above in-game sea level when the project was completed. Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people. Despite its unpredictable nature, Minecraft speedrunning, where players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way. Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone blocks enabled the construction of functional virtual machines such as a hard drive and an 8-bit computer. Mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources. 
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft." Following the initial surge in popularity of Minecraft in 2010, other video games were criticised for having various similarities to Minecraft, and some were described as being "clones", often due to a direct inspiration from Minecraft, or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and their Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). Despite this, the fears of fans were unfounded, with official Minecraft releases on Nintendo consoles eventually resuming. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team. In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging the game was infringing on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright claiming service. The DMCA was later withdrawn. Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in-person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded to "Minecraft Live", included the mob/biome votes, and announcements of new game updates. 
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/POV-Ray] | [TOKENS: 1685] |
Contents POV-Ray The Persistence of Vision Ray Tracer, most commonly abbreviated as POV-Ray, is a cross-platform ray-tracing program that generates images from a text-based scene description. It was originally based on DKBTrace, written by David Kirk Buck and Aaron A. Collins for Amiga computers. There are also influences from the earlier Polyray raytracer because of contributions from its author, Alexander Enzmann. POV-Ray is free and open-source software, with the source code available under the AGPL-3.0-or-later license. History Sometime in the 1980s, David Kirk Buck downloaded the source code for a Unix ray tracer[which?] to his Amiga. He experimented with it for a while and eventually decided to write his own ray tracer, named DKBTrace after his initials. He posted it to the "You Can Call Me Ray" bulletin board system (BBS) in Chicago, thinking others might be interested in it. In 1987, Aaron A. Collins downloaded DKBTrace and began working on an x86 port of it. He and David Buck collaborated to add several more features. POV-Ray also has similarities with (and borrows from) Rayshade, another BBS-era raytracer, and the accompanying Rayshade book. When the program proved to be more popular than anticipated, they could not keep up with demand for more features. Thus, in July 1991, David turned over the project to a team of programmers working in the "GraphDev" forum on CompuServe. At the same time, David felt that it was inappropriate to use his initials on a program he no longer maintained. The name "STAR-Light" (Software Taskforce on Animation and Rendering) was initially used, but eventually the name became "PV-Ray", and then ultimately "POV-Ray" (Persistence of Vision Ray Tracer), a name inspired by Dalí's painting, The Persistence of Memory. Features of the application, and a summary of its history, are discussed in a February 2008 interview with David Kirk Buck and Chris Cason on episode 24 of FLOSS Weekly. Features POV-Ray has matured substantially since it was created. Recent versions of the software include the following features: One of POV-Ray's main attractions is its large collection of third-party-made assets and tools. A large number of tools, textures, models, scenes, and tutorials can be found on the web. It is also a useful reference for those wanting to learn how ray tracing and related 3D geometry and computer graphics algorithms work. The current official version of POV-Ray is 3.7. This version introduces: Some of the main introduced features of the previous release (3.6) are: In July 2006, Intel Corporation started using the beta version of 3.7 to demonstrate their new dual-core Conroe processor due to the efficiency of the SMP (symmetric multiprocessing) implementation. POV-Ray, in addition to standard 3D geometric shapes like tori, spheres, and heightfields, supports mathematically defined primitives such as the isosurface (a finite approximation of an arbitrary function), the polynomial primitive (an infinite object defined by a 15th-order or lower polynomial), the julia fractal (a 3-dimensional slice of a 4-dimensional fractal), the superquadric ellipsoid (an intermediate between a sphere and a cube), and the parametric primitive (using equations that represent its surface, rather than its interior). POV-Ray internally represents objects using their mathematical definitions; all POV-Ray primitive objects can be described by mathematical functions. 
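To make the idea of mathematically defined primitives concrete, the following is a minimal illustrative sketch, not taken from the article or from the official documentation, of a complete scene that renders the unit sphere, defined implicitly by x*x + y*y + z*z - 1 = 0, as an isosurface rather than as a triangle mesh; the camera position, container box and max_gradient value are arbitrary choices for this example.

```povray
// Illustrative sketch: the unit sphere described as an implicit function.
// The isosurface is evaluated directly from the formula at render time,
// so the surface is exactly smooth rather than approximated by polygons.
camera { location <0, 0, -4> look_at <0, 0, 0> }
light_source { <10, 10, -10> color rgb 1 }

isosurface {
  function { x*x + y*y + z*z - 1 }                    // f(x,y,z) = 0 defines the surface
  contained_by { box { <-1.2, -1.2, -1.2>, <1.2, 1.2, 1.2> } }
  max_gradient 2.5                                     // upper bound on |grad f| inside the container
  pigment { color rgb <0.9, 0.6, 0.2> }
}
```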
This is different from many computer programs that include 3D models, which typically use triangle meshes to compose all the objects in a scene. This fact provides POV-Ray with several advantages and disadvantages over other rendering and modeling systems; POV-Ray primitives are more accurate than their polygonal counterparts: objects that can be described in terms of spheres, planar surfaces, cylinders, tori, and the like, are perfectly smooth and mathematically accurate in POV-Ray renderings, whereas polygonal artifacts may be visible in mesh-based modeling software. POV-Ray primitives are also simpler to define than most of their polygonal counterparts, e.g., in POV-Ray, a sphere is described simply by its center and radius; in a mesh-based environment, a sphere must be described by a multitude of small connected polygons (usually quads or triangles). On the other hand, script-based primitive modeling is not always a practical method to create certain objects, such as realistic characters or complex man-made artifacts like cars. Those objects can be created first in mesh-based modeling applications such as Wings 3D and Blender, and then they can be converted to POV-Ray's own mesh format. An example of the scene description language used by POV-Ray to describe a scene to render appears in the sketches below: the first demonstrates the use of a background colour, camera, lights, and a simple box shape having a surface normal and finish, together with the transforming effect of rotation; the second script fragment shows the use of variable declaration, assignment, comparison and the while loop construct. Modeling The POV-Ray program itself does not include a modeling feature; it is essentially a pure renderer with a sophisticated model description language. To accompany this feature set, third parties have developed a large variety of modeling software, some specialized for POV-Ray, others supporting import and export of its data structures, including the free and open-source 3D creation suite Blender. A number of additional POV-Ray-compatible modelers are listed on the Modelling Programs page at Povray.org. In 2007, POV-Ray acquired the rights to Moray, an interactive 3-D modeling program long used with POV-Ray. However, as of December 2016, Moray development is stalled. Software Official modifications to the POV-Ray source tree are done and/or approved by the POV-Team. Most patch submission and/or bug reporting is done in the POV-Ray newsgroups on the news.povray.org news server (with a Web interface also available). Since POV-Ray's source is available, there are unofficial forks and patched versions of POV-Ray available from third parties; however, these are not officially supported by the POV-Team. Official POV-Ray versions currently do not support shader plug-ins. Some features, like radiosity and splines, are still in development and may be subject to syntactical change. POV-Ray 3.6 is distributed in compiled format for Mac, Windows and Linux. Support for Intel Macs is not available in the Mac version, but since Mac OS X is a version of Unix, the Linux version can be compiled on it. The 3.7 versions with SMP support are officially supported for Windows and Linux. Unofficial Mac versions for v3.7 can be found. POV-Ray can be ported to any platform which has a compatible C++ compiler. 
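The two sketches below are minimal illustrations of the constructs described above, not the article's own listings. The first describes a small scene with a background colour, a camera, a single light source, and a box carrying a surface normal, a finish and a rotation; it assumes the standard colors.inc include file for the named colours.

```povray
// Illustrative scene sketch: background, camera, light, and a rotated box.
#include "colors.inc"            // standard named colours (Cyan, White, Red)

background { color Cyan }

camera {
  location <0, 2, -3>
  look_at  <0, 1, 2>
}

light_source { <2, 4, -3> color White }

box {
  <-1, 0, -1>, <1, 1, 1>         // two opposite corners of the box
  texture {
    pigment { color Red }
    normal  { bumps 0.4 scale 0.2 }   // perturb the surface normal
    finish  { phong 0.8 }             // add a specular highlight
  }
  rotate <0, 30, 0>              // rotate 30 degrees about the y-axis
}
```

The second fragment illustrates #declare, reassignment, a comparison and the #while loop; the row of spheres and the cylinder marker are arbitrary choices for this sketch.

```povray
// Illustrative script fragment: declaration, assignment, comparison, loop.
#declare Count = 0;
#while (Count < 10)
  sphere {
    <Count, 0, 0>, 0.4
    pigment { color rgb <Count / 10, 0.2, 1 - Count / 10> }
  }
  #if (Count = 5)                 // comparison: mark the sixth sphere
    cylinder { <Count, -1, 0>, <Count, 1, 0>, 0.1 pigment { color rgb 1 } }
  #end
  #declare Count = Count + 1;     // assignment advances the loop variable
#end
```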
Originally, POV-Ray was distributed under its own POV-Ray License; namely, the POV-Ray 3.6 Distribution License and the POV-Ray 3.6 Source License, which permitted free distribution of the program source code and binaries, while restricting commercial distribution and the creation of derivative works other than fully functional versions of POV-Ray. Although the source code of older versions is available for modification, because of the restrictions in the 3.6 and earlier licenses it was not open source or free software according to the OSI or FSF definitions of those terms. This was a problem, as source-code exchange with the greater FOSS ecosystem was impossible due to incompatibility with copyleft licenses. One of the reasons that POV-Ray was not originally licensed under the free software GNU General Public License (GPL), or other open source licenses, is that POV-Ray was developed before the GPL-style licenses became widely used; the developers wrote their own license for the release of POV-Ray, and contributors to the software worked under the assumption that their contributions would be licensed under the POV-Ray 3.6 Licenses. In 2013, with version 3.7, POV-Ray was relicensed under the GNU Affero General Public License version 3 (or later). Since then, POV-Ray has been free software according to the FSF definition and open-source software according to the Open Source Definition. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_note-ieee-118] | [TOKENS: 10628] |
Contents Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the "modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to approximately c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630, by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. 
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could solve real and complex roots of polynomials, which were published in 1901 by the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine, he announced his invention in 1822, in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables". The difference engine was also designed to aid in navigational calculations; in 1833, he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. 
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design for a machine capable of calculating formulas like $a^{x}(y-z)^{2}$ for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (slide rule) and aircraft (control systems).[citation needed] Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. 
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as the first company with the sole purpose of developing computers in Berlin. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (The Mk I was converted to a Mk II making ten machines in total). 
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs. Changing its function required the re-wiring and re-structuring of the machine. With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948. 
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947, the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than vacuum tubes, so they give off less heat. Junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, and much lower power consumption and higher density than bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers. 
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl Frosch and Lincoln Derick work on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. System on a Chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory. 
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing device on the market. These are powered by System on a Chip (SoCs), which are complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways, including: A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphic cards, sound cards, memory (RAM), motherboard, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits. Input devices are the means by which the operations of a computer are controlled and it is provided with data. Examples include: Output devices are the means by which a computer provides the results of its calculations in a human-accessible form. 
Examples include: The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows— this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU: Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometry functions such as sine, cosine, etc., and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number. 
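As a toy illustration of the ideas just described, the program counter, jump instructions, and memory as a row of numbered cells each holding one number, the following Python sketch is purely schematic; the opcode numbers and instruction layout are invented for illustration and correspond to no real CPU.

    # Toy sketch: memory as numbered cells, each holding one number, with a program
    # counter stepping through instructions stored two cells at a time (opcode, operand).
    LOAD, ADD, JUMP, HALT = 1, 2, 3, 4

    memory = [LOAD, 5,      # cells 0-1: put 5 in the accumulator
              ADD, 7,       # cells 2-3: add 7 to it
              JUMP, 8,      # cells 4-5: jump over the next instruction
              ADD, 99,      # cells 6-7: (skipped by the jump)
              HALT, 0]      # cells 8-9: stop
    acc = 0                 # a single register (accumulator)
    pc = 0                  # program counter: address of the next instruction

    while memory[pc] != HALT:
        opcode, operand = memory[pc], memory[pc + 1]
        pc += 2                         # step to the following instruction by default
        if opcode == LOAD:
            acc = operand
        elif opcode == ADD:
            acc += operand
        elif opcode == JUMP:
            pc = operand                # a jump simply overwrites the program counter
    print(acc)                          # 12: the "ADD 99" at cells 6-7 was never executed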
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256); either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, therefore the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer. 
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics.[citation needed] Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn. Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks. 
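Returning to the time-slicing scheme described above, the following Python sketch is a rough illustration of round-robin switching between two programs; it is only a schematic model, not how any real operating system schedules work.

    # Minimal sketch of cooperative time-sharing: each "program" yields control at the
    # end of its time slice, and a tiny scheduler switches between them in turn.
    def count_program(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield                      # give up the CPU; the scheduler resumes us later

    def scheduler(programs):
        ready = list(programs)
        while ready:
            prog = ready.pop(0)        # take the next program in round-robin order
            try:
                next(prog)             # run it for one time slice
                ready.append(prog)     # still has work to do: back of the queue
            except StopIteration:
                pass                   # program finished; drop it

    scheduler([count_program("A", 3), count_program("B", 2)])
    # The output interleaves A and B, even though only one runs at any instant.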
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. 
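As an illustration of how short such a program can be (the language here is Python, chosen only for readability), a minimal sketch of the repetitive addition:

    # Add together all of the numbers from 1 to 1,000 with a simple loop.
    total = 0
    number = 1
    while number <= 1000:
        total += number        # add the current number to the running sum
        number += 1            # move on to the next number
    print(total)               # 500500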
On a real machine the same loop would be expressed in the processor's own instruction set, for example in the MIPS assembly language. Once told to run such a program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages—some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU). 
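To make the mnemonic-to-opcode translation concrete, here is a toy "assembler" sketch in Python; the opcode numbers are invented for illustration (the same made-up numbering as in the earlier sketch) and belong to no real instruction set.

    # A toy assembler: translate mnemonics into numeric "machine code".
    OPCODES = {"LOAD": 1, "ADD": 2, "JUMP": 3, "HALT": 4, "STORE": 5}

    def assemble(source_lines):
        machine_code = []
        for line in source_lines:
            mnemonic, *operands = line.split()
            machine_code.append(OPCODES[mnemonic])           # the operation code
            machine_code.extend(int(x) for x in operands)    # any numeric operands
        return machine_code

    print(assemble(["LOAD 10", "ADD 20", "STORE 30", "HALT"]))
    # [1, 10, 2, 20, 5, 30, 4] : the program is now just a list of numbers

A real assembler, of course, emits the opcodes of one particular processor, which is why programs written at this level are tied to a single architecture.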
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of line of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited for having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S. 
military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity. In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data. The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans. Professions and organizations As the use of computers has spread throughout society, there are an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature. See also Notes References Sources External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Water] | [TOKENS: 15420] |
Contents Water Water is an inorganic compound with the chemical formula H2O. It is a transparent, tasteless, odorless,[c] and nearly colorless chemical substance. It is the main constituent of Earth's streams, lakes, and oceans and the fluids of all known living organisms, in which it acts as a solvent. Water, being a polar molecule, undergoes strong intermolecular hydrogen bonding which is a large contributor to its physical and chemical properties. It is vital for all known forms of life, despite not providing food energy or being an organic micronutrient. Due to its presence in all organisms, its chemical stability, its worldwide abundance, and its strong polarity relative to its small molecular size, water is often referred to as the "universal solvent". Because Earth's surface temperature is relatively close to water's triple point, water exists on Earth as a solid, a liquid, and a gas. It forms precipitation in the form of rain and aerosols in the form of fog. Clouds consist of suspended droplets of water and ice, its solid state. When finely divided, crystalline ice may precipitate in the form of snow. The gaseous state of water is steam or water vapor. Water covers about 71.0% of the Earth's surface, with seas and oceans making up most of the water volume (about 96.5%). Small portions of water occur as groundwater (1.7%), in the glaciers and the ice caps of Antarctica and Greenland (1.7%), and in the air as vapor, clouds (consisting of ice and liquid water suspended in air), and precipitation (0.001%). Water moves continually through the water cycle of evaporation, transpiration (evapotranspiration), condensation, precipitation, and runoff, usually reaching the sea. Water plays an important role in the world economy. Approximately 70% of the fresh water used by humans goes to agriculture. Fishing in salt and fresh water bodies has been, and continues to be, a major source of food for many parts of the world, providing 6.5% of global protein. Much of the long-distance trade of commodities (such as oil, natural gas, and manufactured products) is transported by boats through seas, rivers, lakes, and canals. Large quantities of water, ice, and steam are used for cooling and heating in industry and homes. Water is an excellent solvent for a wide variety of substances, both mineral and organic; as such, it is widely used in industrial processes and in cooking and washing. Water, ice, and snow are also central to many sports and other forms of entertainment, such as swimming, pleasure boating, boat racing, surfing, sport fishing, diving, ice skating, snowboarding, and skiing. Etymology The word water comes from Old English wæter, from Proto-Germanic *watar (source also of Old Saxon watar, Old Frisian wetir, Dutch water, Old High German wazzar, German Wasser, vatn, Gothic 𐍅𐌰𐍄𐍉 (wato)), from Proto-Indo-European *wod-or, suffixed form of root *wed- ('water'; 'wet'). Also cognate, through the Indo-European root, with Greek ύδωρ (ýdor; from Ancient Greek ὕδωρ (hýdōr), whence English 'hydro-'), Russian вода́ (vodá), Irish uisce, and Albanian ujë. History One factor in estimating when water appeared on Earth is that water is continually being lost to space. H2O molecules in the atmosphere are broken up by photolysis, and the resulting free hydrogen atoms can sometimes escape Earth's gravitational pull. When the Earth was younger and less massive, water would have been lost to space more easily. 
Lighter elements like hydrogen and helium are expected to leak from the atmosphere continually, but isotopic ratios of heavier noble gases in the modern atmosphere suggest that even the heavier elements in the early atmosphere were subject to significant losses. In particular, xenon is useful for calculations of water loss over time. Not only is it a noble gas (and therefore is not removed from the atmosphere through chemical reactions with other elements), but comparisons between the abundances of its nine stable isotopes in the modern atmosphere reveal that the Earth lost at least one ocean of water, a volume of water approximately equal to modern ocean volume, early in its history. This is likely to have occurred between the Hadean and Archean eons in cataclysmic events such as the moon forming impact. Any water on Earth during the latter part of its accretion would have been disrupted by the Moon-forming impact (~4.5 billion years ago), which likely vaporized much of Earth's crust and upper mantle and created a rock-vapor atmosphere around the young planet. The rock vapor would have condensed within two thousand years, leaving behind hot volatiles which probably resulted in a majority carbon dioxide atmosphere with hydrogen and water vapor. Afterward, liquid water oceans may have existed despite the surface temperature of 230 °C (446 °F) due to the increased atmospheric pressure of the CO2 atmosphere. As the cooling continued, most CO2 was removed from the atmosphere by subduction and dissolution in ocean water, but levels oscillated wildly as new surface and mantle cycles appeared. Geological evidence also helps constrain the time frame for liquid water existing on Earth. A sample of pillow basalt (a type of rock formed during an underwater eruption) was recovered from the Isua Greenstone Belt and provides evidence that water existed on Earth 3.8 billion years ago. In the Nuvvuagittuq Greenstone Belt, Quebec, Canada, rocks dated at 3.8 billion years old by one study and 4.28 billion years old by another show evidence of the presence of water at these ages. If oceans existed earlier than this, any geological evidence has yet to be discovered (which may be because such potential evidence has been destroyed by geological processes like crustal recycling). More recently, in August 2020, researchers reported that sufficient water to fill the oceans may have always been on the Earth since the beginning of the planet's formation. Unlike rocks, minerals called zircons are highly resistant to weathering and geological processes and so are used to understand conditions on the very early Earth. Mineralogical evidence from zircons has shown that liquid water and an atmosphere must have existed 4.404 ± 0.008 billion years ago, very soon after the formation of Earth. This presents somewhat of a paradox, as the cool early Earth hypothesis suggests temperatures were cold enough to freeze water between about 4.4 billion and 4.0 billion years ago. Other studies of zircons found in Australian Hadean rock point to the existence of plate tectonics as early as 4 billion years ago. If true, that implies that rather than a hot, molten surface and an atmosphere full of carbon dioxide, early Earth's surface was much as it is today (in terms of thermal insulation). The action of plate tectonics traps vast amounts of CO2, thereby reducing greenhouse effects, leading to a much lower surface temperature and the formation of solid rock and liquid water. Properties Water (H2O) is a polar inorganic compound. 
At room temperature it is a tasteless and odorless liquid, nearly colorless with a hint of blue. The simplest hydrogen chalcogenide, it is by far the most studied chemical compound and is sometimes described as the "universal solvent" for its ability to dissolve more substances than any other liquid, though it is poor at dissolving nonpolar substances. This allows it to be the "solvent of life": indeed, water as found in nature almost always includes various dissolved substances, and special steps are required to obtain chemically pure water. Water is the only common substance to exist as a solid, liquid, and gas in normal terrestrial conditions. Along with oxidane, water is one of the two official names for the chemical compound H2O; it is also the liquid phase of H2O. The other two common states of matter of water are the solid phase, which is ice, and the gaseous phase, water vapor or steam. The addition or removal of heat can cause phase transitions: freezing (water to ice), melting (ice to water), vaporization (water to vapor), condensation (vapor to water), sublimation (ice to vapor) and deposition (vapor to ice). Water is one of only a few common naturally occurring substances which, for some temperature ranges, become less dense as they cool; it is the only known naturally occurring substance which does so while liquid. In addition, it is unusual because it becomes significantly less dense as it freezes, though it is not unique in that respect.[d] At 1 atm pressure, it reaches its maximum density of 999.972 kg/m3 (62.4262 lb/cu ft) at 3.98 °C (39.16 °F). Below that temperature, but above the freezing point of 0 °C (32 °F), water expands (becoming less dense) until it reaches the freezing point, at which its density in the liquid phase is 999.8 kg/m3 (62.4155 lb/cu ft). As it freezes and becomes ice, water expands by about 9%, reaching a density of 917 kg/m3 (57.25 lb/cu ft). This expansion can exert enormous pressure, bursting pipes and cracking rocks. As a solid, it displays the usual behavior of contracting and becoming more dense as it cools. These unusual thermal properties have important consequences for life on earth. In a lake or ocean, water at 4 °C (39 °F) sinks to the bottom, and ice forms on the surface, floating on the liquid water. This ice insulates the water below, preventing it from freezing solid. Without this protection, most aquatic organisms residing in lakes would perish during the winter. In addition, this anomalous behavior is an important part of the thermohaline circulation which distributes heat around the planet's oceans. Water is a diamagnetic material. Though interaction is weak, with superconducting magnets it can attain a notable interaction. At a pressure of one atmosphere (atm), ice melts or water freezes (solidifies) at 0 °C (32 °F) and water boils or vapor condenses at 100 °C (212 °F). However, even below the boiling point, water can change to vapor at its surface by evaporation (vaporization throughout the liquid is known as boiling). Sublimation and deposition also occur on surfaces. For example, frost is deposited on cold surfaces while snowflakes form by deposition on an aerosol particle or ice nucleus. In the process of freeze-drying, a food is frozen and then stored at low pressure so the ice on its surface sublimates. The melting and boiling points depend on pressure. 
A good approximation for the rate of change of the melting temperature with pressure is given by the Clausius–Clapeyron relation: $\frac{dT}{dP} = \frac{T\,(v_{\mathrm{L}} - v_{\mathrm{S}})}{L_{\mathrm{f}}}$, where $v_{\mathrm{L}}$ and $v_{\mathrm{S}}$ are the molar volumes of the liquid and solid phases, and $L_{\mathrm{f}}$ is the molar latent heat of melting. In most substances, the volume increases when melting occurs, so the melting temperature increases with pressure. However, because ice is less dense than water, the melting temperature decreases. In glaciers, pressure melting can occur under sufficiently thick volumes of ice, resulting in subglacial lakes. The Clausius–Clapeyron relation also applies to the boiling point, but with the liquid/gas transition the vapor phase has a much lower density than the liquid phase, so the boiling point increases with pressure. Water can remain in a liquid state at high temperatures in the deep ocean or underground. For example, temperatures exceed 205 °C (401 °F) in Old Faithful, a geyser in Yellowstone National Park. In hydrothermal vents, the temperature can exceed 400 °C (752 °F). At sea level, the boiling point of water is 100 °C (212 °F). As atmospheric pressure decreases with altitude, the boiling point decreases by 1 °C every 274 meters. High-altitude cooking takes longer than sea-level cooking. For example, at 1,524 metres (5,000 ft), cooking time must be increased by a fourth to achieve the desired result. Conversely, a pressure cooker can be used to decrease cooking times by raising the boiling temperature. In a vacuum, water will boil at room temperature. On a pressure/temperature phase diagram, there are curves separating solid from vapor, vapor from liquid, and liquid from solid. These meet at a single point called the triple point, where all three phases can coexist. The triple point is at a temperature of 273.16 K (0.01 °C; 32.02 °F) and a pressure of 611.657 pascals (0.00604 atm; 0.0887 psi); it is the lowest pressure at which liquid water can exist. Until 2019, the triple point was used to define the Kelvin temperature scale. The water/vapor phase curve terminates at 647.096 K (373.946 °C; 705.103 °F) and 22.064 megapascals (3,200.1 psi; 217.75 atm). This is known as the critical point. At higher temperatures and pressures the liquid and vapor phases form a continuous phase called a supercritical fluid. It can be gradually compressed or expanded between gas-like and liquid-like densities; its properties (which are quite different from those of ambient water) are sensitive to density. For example, for suitable pressures and temperatures it can mix freely with nonpolar compounds, including most organic compounds. This makes it useful in a variety of applications including high-temperature electrochemistry and as an ecologically benign solvent or catalyst in chemical reactions involving organic compounds. In Earth's mantle, it acts as a solvent during mineral formation, dissolution and deposition. The normal form of ice on the surface of Earth is ice Ih, a phase that forms crystals with hexagonal symmetry. Another with cubic crystalline symmetry, ice Ic, can occur in the upper atmosphere. As the pressure increases, ice forms other crystal structures. As of 2024, twenty have been experimentally confirmed and several more are predicted theoretically. 
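Returning to the Clausius–Clapeyron relation above, a quick numerical check (sketched in Python) using the densities and heat of fusion quoted in this article, together with water's molar mass of 18.02 g/mol and the standard atmosphere of 101,325 Pa as the only outside values, gives the familiar result that pressure lowers the melting point of ice only very slightly.

    # Estimate dT/dP for melting ice from the Clausius-Clapeyron relation,
    # using approximate values quoted in the text (densities, heat of fusion).
    M = 18.02e-3             # kg/mol, molar mass of water (outside value)
    T = 273.15               # K, melting point at 1 atm
    v_liquid = M / 999.8     # m^3/mol, molar volume of cold liquid water
    v_solid = M / 917.0      # m^3/mol, molar volume of ice
    L_f = 333e3 * M          # J/mol, molar latent heat of melting (~333 J/g)

    dT_dP = T * (v_liquid - v_solid) / L_f
    print(dT_dP)             # about -7.4e-8 K per pascal
    print(dT_dP * 101325)    # about -0.0075 K per atmosphere of added pressure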
The eighteenth form of ice, ice XVIII, a face-centred-cubic, superionic ice phase, was discovered when a droplet of water was subject to a shock wave that raised the water's pressure to millions of atmospheres and its temperature to thousands of degrees, resulting in a structure of rigid oxygen atoms in which hydrogen atoms flowed freely. When sandwiched between layers of graphene, ice forms a square lattice. The details of the chemical nature of liquid water are not well understood; some theories suggest that its unusual behavior is due to the existence of two liquid states. Pure water is usually described as tasteless and odorless, although humans have specific sensors that can feel the presence of water in their mouths, and frogs are known to be able to smell it. However, water from ordinary sources (including mineral water) usually has many dissolved substances that may give it varying tastes and odors. Humans and other animals have developed senses that enable them to evaluate the potability of water to avoid water that is too salty or putrid. Pure water is visibly blue due to absorption of light in the region c. 600–800 nm. The color can be easily observed in a glass of tap-water placed against a pure white background, in daylight. The principal absorption bands responsible for the color are overtones of the O–H stretching vibrations. The apparent intensity of the color increases with the depth of the water column, following Beer's law. This also applies, for example, with a swimming pool when the light source is sunlight reflected from the pool's white tiles. In nature, the color may also be modified from blue to green due to the presence of suspended solids or algae. In industry, near-infrared spectroscopy is used with aqueous solutions as the greater intensity of the lower overtones of water means that glass cuvettes with short path-length may be employed. To observe the fundamental stretching absorption spectrum of water or of an aqueous solution in the region around 3,500 cm−1 (2.85 μm) a path length of about 25 μm is needed. Also, the cuvette must be both transparent around 3500 cm−1 and insoluble in water; calcium fluoride is one material that is in common use for the cuvette windows with aqueous solutions. The Raman-active fundamental vibrations may be observed with, for example, a 1 cm sample cell. Aquatic plants, algae, and other photosynthetic organisms can live in water up to hundreds of meters deep, because sunlight can reach them. Practically no sunlight reaches the parts of the oceans below 1,000 metres (3,300 ft) of depth. The refractive index of liquid water (1.333 at 20 °C (68 °F)) is much higher than that of air (1.0), similar to those of alkanes and ethanol, but lower than those of glycerol (1.473), benzene (1.501), carbon disulfide (1.627), and common types of glass (1.4 to 1.6). The refraction index of ice (1.31) is lower than that of liquid water. In a water molecule, the hydrogen atoms form a 104.5° angle with the oxygen atom. The hydrogen atoms are close to two corners of a tetrahedron centered on the oxygen. At the other two corners are lone pairs of valence electrons that do not participate in the bonding. In a perfect tetrahedron, the atoms would form a 109.5° angle, but the repulsion between the lone pairs is greater than the repulsion between the hydrogen atoms. The O–H bond length is about 0.096 nm. Other substances have a tetrahedral molecular structure, for example methane (CH4) and hydrogen sulfide (H2S). 
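As a small worked example of the molecular geometry just described, the distance between the two hydrogen atoms follows from the quoted bond length and bond angle by elementary trigonometry (a sketch in Python; the input values are the approximate figures given above).

    import math

    # Distance between the two hydrogen atoms in a water molecule.
    bond_length = 0.096            # nm, O-H bond length
    angle = math.radians(104.5)    # H-O-H bond angle

    h_h_distance = 2 * bond_length * math.sin(angle / 2)
    print(round(h_h_distance, 3))  # about 0.152 nm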
However, oxygen is more electronegative than most other elements, so the oxygen atom has a negative partial charge while the hydrogen atoms are partially positively charged. Along with the bent structure, this gives the molecule an electrical dipole moment and it is classified as a polar molecule. Water is a good polar solvent, dissolving many salts and hydrophilic organic molecules such as sugars and simple alcohols such as ethanol. Water also dissolves many gases, such as oxygen and carbon dioxide – the latter giving the fizz of carbonated beverages, sparkling wines and beers. In addition, many substances in living organisms, such as proteins, DNA and polysaccharides, are dissolved in water. The interactions between water and the subunits of these biomacromolecules shape protein folding, DNA base pairing, and other phenomena crucial to life (hydrophobic effect). Many organic substances (such as fats and oils and alkanes) are hydrophobic, that is, insoluble in water. Many inorganic substances are insoluble too, including most metal oxides, sulfides, and silicates. Because of its polarity, a molecule of water in the liquid or solid state can form up to four hydrogen bonds with neighboring molecules. Hydrogen bonds are about ten times as strong as the Van der Waals force that attracts molecules to each other in most liquids. This is the reason why the melting and boiling points of water are much higher than those of other analogous compounds like hydrogen sulfide. They also explain its exceptionally high specific heat capacity (about 4.2 J/(g·K)), heat of fusion (about 333 J/g), heat of vaporization (2257 J/g), and thermal conductivity (between 0.561 and 0.679 W/(m·K)). These properties make water more effective at moderating Earth's climate, by storing heat and transporting it between the oceans and the atmosphere. The hydrogen bonds of water are around 23 kJ/mol (compared to a covalent O–H bond at 492 kJ/mol). Of this, it is estimated that 90% is attributable to electrostatics, while the remaining 10% is partially covalent. These bonds are the cause of water's high surface tension and capillary forces. Capillary action refers to the tendency of water to move up a narrow tube against the force of gravity. This property is relied upon by all vascular plants, such as trees. Water is a weak solution of hydronium hydroxide – there is an equilibrium 2H2O ⇌ H3O+ + OH−, in combination with solvation of the resulting hydronium and hydroxide ions. Pure water has a low electrical conductivity, which increases with the dissolution of a small amount of ionic material such as common salt. Liquid water can be split into the elements hydrogen and oxygen by passing an electric current through it—a process called electrolysis. The decomposition requires more energy input than the heat released by the inverse process (285.8 kJ/mol, or 15.9 MJ/kg). Liquid water can be assumed to be incompressible for most purposes: its compressibility ranges from 4.4 to 5.1×10^−10 Pa^−1 in ordinary conditions. Even in oceans at 4 km depth, where the pressure is 400 atm, water suffers only a 1.8% decrease in volume. The viscosity of water is about 10^−3 Pa·s or 0.01 poise at 20 °C (68 °F), and the speed of sound in liquid water ranges between 1,400 and 1,540 metres per second (4,600 and 5,100 ft/s) depending on temperature. 
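Using the constants quoted above, a rough back-of-the-envelope sketch in Python shows how much energy the high specific heat and heat of vaporization imply, here for one kilogram of water heated from 20 °C to the boiling point and then boiled away (the mild temperature dependence of the specific heat is ignored).

    # Energy to heat 1 kg of water from 20 C to 100 C and then vaporize it,
    # using the approximate constants quoted above.
    mass = 1000.0                  # g
    specific_heat = 4.2            # J/(g*K)
    heat_of_vaporization = 2257.0  # J/g

    heating = mass * specific_heat * (100 - 20)   # warm the water by 80 K
    boiling = mass * heat_of_vaporization         # turn it all into steam
    print(heating / 1000, "kJ to heat")           # 336.0 kJ
    print(boiling / 1000, "kJ to vaporize")       # 2257.0 kJ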
Sound travels long distances in water with little attenuation, especially at low frequencies (roughly 0.03 dB/km for 1 kHz), a property that is exploited by cetaceans and humans for communication and environment sensing (sonar). Metallic elements which are more electropositive than hydrogen, particularly the alkali metals and, to a lesser extent, alkaline earth metals, displace hydrogen from water, forming hydroxides and releasing hydrogen. At high temperatures, coke, a form of carbon, reacts with steam to form carbon monoxide and hydrogen. On Earth Hydrology is the study of the movement, distribution, and quality of water throughout the Earth. The study of the distribution of water is hydrography. The study of the distribution and movement of groundwater is hydrogeology, of glaciers is glaciology, of inland waters is limnology and of the oceans is oceanography. Ecological processes involving hydrology are the focus of ecohydrology. The collective mass of water found on, under, and over the surface of a planet is called the hydrosphere. Earth's approximate water volume (the total water supply of the world) is 1.386 billion cubic kilometres (333 million cubic miles). Liquid water is found in bodies of water, such as an ocean, sea, lake, river, stream, canal, pond, or puddle. The majority of water on Earth is seawater. Water is also present in the atmosphere in solid, liquid, and vapor states. It also exists as groundwater in aquifers. Water is important in many geological processes. Groundwater is present in most rocks, and the pressure of this groundwater affects patterns of faulting. Water in the mantle is responsible for the melt that produces volcanoes at subduction zones. On the surface of the Earth, water is important in both chemical and physical weathering processes. Water, and to a lesser but still significant extent, ice, are also responsible for a large amount of sediment transport that occurs on the surface of the earth. Deposition of transported sediment forms many types of sedimentary rocks, which make up the geologic record of Earth history. The water cycle (known scientifically as the hydrologic cycle) is the continuous exchange of water within the hydrosphere, between the atmosphere, soil water, surface water, groundwater, and plants. Water moves perpetually through each of these regions in the water cycle, which consists of the transfer processes of evaporation, transpiration, condensation, precipitation, and runoff. Most water vapor over the ocean returns to it, but winds carry water vapor over land at the same rate as runoff into the sea, about 47 Tt per year, while evaporation and transpiration over land masses contribute another 72 Tt per year. Precipitation, at a rate of 119 Tt per year over land, has several forms: most commonly rain, snow, and hail, with some contribution from fog and dew. Dew is small drops of water that are condensed when a high density of water vapor meets a cool surface. Dew usually forms in the morning when the temperature is the lowest, just before sunrise and when the temperature of the earth's surface starts to increase. Condensed water in the air may also refract sunlight to produce rainbows. Water runoff often collects over watersheds flowing into rivers. Through erosion, runoff shapes the environment creating river valleys and deltas which provide rich soil and level ground for the establishment of population centers. A flood occurs when an area of land, usually low-lying, is covered with water; this occurs when a river overflows its banks or a storm surge happens. 
On the other hand, drought is an extended period of months or years when a region notes a deficiency in its water supply. This occurs when a region receives consistently below average precipitation either due to its topography or due to its location in terms of latitude. Water resources are natural resources of water that are potentially useful for humans, for example as a source of drinking water supply or irrigation water. Water occurs as both "stocks" and "flows". Water can be stored as lakes, water vapor, groundwater or aquifers, and ice and snow. Of the total volume of global freshwater, an estimated 69 percent is stored in glaciers and permanent snow cover; 30 percent is in groundwater; and the remaining 1 percent in lakes, rivers, the atmosphere, and biota. The length of time water remains in storage is highly variable: some aquifers consist of water stored over thousands of years, but lake volumes may fluctuate on a seasonal basis, decreasing during dry periods and increasing during wet ones. A substantial fraction of the water supply for some regions consists of water extracted from water stored in stocks, and when withdrawals exceed recharge, stocks decrease. By some estimates, as much as 30 percent of total water used for irrigation comes from unsustainable withdrawals of groundwater, causing groundwater depletion. Seawater contains about 3.5% sodium chloride on average, plus smaller amounts of other substances. The physical properties of seawater differ from fresh water in some important respects. It freezes at a lower temperature (about −1.9 °C (28.6 °F)) and its density increases with decreasing temperature to the freezing point, instead of reaching maximum density at a temperature above freezing. The salinity of water in major seas varies from about 0.7% in the Baltic Sea to 4.0% in the Red Sea. (The Dead Sea, known for its ultra-high salinity levels of between 30 and 40%, is really a salt lake.) Tides are the cyclic rising and falling of local sea levels caused by the tidal forces of the Moon and the Sun acting on the oceans. Tides cause changes in the depth of the marine and estuarine water bodies and produce oscillating currents known as tidal streams. The changing tide produced at a given location is the result of the changing positions of the Moon and Sun relative to the Earth coupled with the effects of Earth rotation and the local bathymetry. The strip of seashore that is submerged at high tide and exposed at low tide, the intertidal zone, is an important ecological product of ocean tides. Effects on life From a biological standpoint, water has many distinct properties that are critical for the proliferation of life. It carries out this role by allowing organic compounds to react in ways that ultimately allow replication. All known forms of life depend on water. Water is vital both as a solvent in which many of the body's solutes dissolve and as an essential part of many metabolic processes within the body. Metabolism is the sum total of anabolism and catabolism. In anabolism, water is removed from molecules (through energy requiring enzymatic chemical reactions) to grow larger molecules (e.g., starches, triglycerides, and proteins for storage of fuels and information). In catabolism, water is used to break bonds to generate smaller molecules (e.g., glucose, fatty acids, and amino acids to be used for fuels for energy use or other purposes). Without water, these particular metabolic processes could not exist. 
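To put the freshwater percentages above into absolute figures, the short Python sketch below applies them to an assumed global freshwater volume of roughly 35 million cubic kilometres; that total is a commonly cited outside figure, not one stated in this article.

    # Rough split of global freshwater using the percentages quoted above and an
    # assumed total of ~35 million km^3 (the total itself is an outside assumption).
    total_freshwater_km3 = 35e6

    shares = {
        "glaciers and permanent snow cover": 0.69,
        "groundwater": 0.30,
        "lakes, rivers, atmosphere and biota": 0.01,
    }
    for reservoir, fraction in shares.items():
        volume = fraction * total_freshwater_km3 / 1e6
        print(f"{reservoir}: about {volume:.1f} million km^3")
    # glaciers ~24.2, groundwater ~10.5, surface and other ~0.4 million km^3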
Water is fundamental to both photosynthesis and respiration. Photosynthetic cells use the sun's energy to split off water's hydrogen from oxygen. In the presence of sunlight, hydrogen is combined with CO2 (absorbed from air or water) to form glucose and release oxygen. All living cells use such fuels and oxidize the hydrogen and carbon to capture the sun's energy and reform water and CO2 in the process (cellular respiration). Water is also central to acid-base neutrality and enzyme function. An acid, a hydrogen ion (H+, that is, a proton) donor, can be neutralized by a base, a proton acceptor such as a hydroxide ion (OH−) to form water. Water is considered to be neutral, with a pH (the negative log of the hydrogen ion concentration) of 7 in an ideal state. Acids have pH values less than 7 while bases have values greater than 7. Earth's surface waters are filled with life. The earliest life forms appeared in water; nearly all fish live exclusively in water, and there are many types of marine mammals, such as dolphins and whales. Some kinds of animals, such as amphibians, spend portions of their lives in water and portions on land. Plants such as kelp and algae grow in the water and are the basis for some underwater ecosystems. Plankton is generally the foundation of the ocean food chain. Aquatic vertebrates must obtain oxygen to survive, and they do so in various ways. Fish have gills instead of lungs, although some species of fish, such as the lungfish, have both. Marine mammals, such as dolphins, whales, otters, and seals need to surface periodically to breathe air. Some amphibians are able to absorb oxygen through their skin. Invertebrates exhibit a wide range of modifications to survive in poorly oxygenated waters including breathing tubes (see insect and mollusc siphons) and gills (Carcinus). However, as invertebrate life evolved in an aquatic habitat most have little or no specialization for respiration in water. Effects on human civilization Civilization has historically flourished around rivers and major waterways; Mesopotamia, one of the so-called cradles of civilization, was situated between the major rivers Tigris and Euphrates; the ancient society of the Egyptians depended entirely upon the Nile. The early Indus Valley civilization (c. 3300 BCE – c. 1300 BCE) developed along the Indus River and tributaries that flowed out of the Himalayas. Rome was also founded on the banks of the Italian river Tiber. Large metropolises like Rotterdam, London, Montreal, Paris, New York City, Buenos Aires, Shanghai, Tokyo, Chicago, and Hong Kong owe their success in part to their easy accessibility via water and the resultant expansion of trade. Islands with safe water ports, like Singapore, have flourished for the same reason. In places such as North Africa and the Middle East, where water is more scarce, access to clean drinking water was and is a major factor in human development. Water fit for human consumption is called drinking water or potable water. Water that is not potable may be made potable by filtration or distillation, or by a range of other methods. More than 660 million people do not have access to safe drinking water. Water that is not fit for drinking but is not harmful to humans when used for swimming or bathing is called by various names other than potable or drinking water, and is sometimes called safe water, or "safe for bathing". Chlorine is a skin and mucous membrane irritant that is used to make water safe for bathing or drinking. 
Its use is highly technical and is usually monitored by government regulations (typically 1 part per million (ppm) for drinking water, and 1–2 ppm of chlorine not yet reacted with impurities for bathing water). Water for bathing may be maintained in satisfactory microbiological condition using chemical disinfectants such as chlorine or ozone or by the use of ultraviolet light. Water reclamation is the process of converting wastewater (most commonly sewage, also called municipal wastewater) into water that can be reused for other purposes. There are 2.3 billion people who reside in nations with water scarcities, which means that each individual receives less than 1,700 cubic metres (60,000 cu ft) of water annually. 380 billion cubic metres (13×10^12 cu ft) of municipal wastewater are produced globally each year. Freshwater is a renewable resource, recirculated by the natural hydrologic cycle, but pressures over access to it result from the naturally uneven distribution in space and time, growing economic demands by agriculture and industry, and rising populations. Currently, nearly a billion people around the world lack access to safe, affordable water. In 2000, the United Nations established the Millennium Development Goals for water to halve by 2015 the proportion of people worldwide without access to safe water and sanitation. Progress toward that goal was uneven, and in 2015 the UN committed to the Sustainable Development Goals of achieving universal access to safe and affordable water and sanitation by 2030. Poor water quality and bad sanitation are deadly; some five million deaths a year are caused by water-related diseases. The World Health Organization estimates that safe water could prevent 1.4 million child deaths from diarrhea each year. In developing countries, 90% of all municipal wastewater still goes untreated into local rivers and streams. Some 50 countries, with roughly a third of the world's population, also suffer from medium or high water scarcity and 17 of these extract more water annually than is recharged through their natural water cycles. The strain not only affects surface freshwater bodies like rivers and lakes, but it also degrades groundwater resources. The most substantial human use of water is for agriculture, including irrigated agriculture, which accounts for as much as 80 to 90 percent of total human water consumption. In the United States, 42% of freshwater withdrawn for use is for irrigation, but the vast majority of water "consumed" (used and not returned to the environment) goes to agriculture. Access to fresh water is often taken for granted, especially in developed countries that have built sophisticated water systems for collecting, purifying, and delivering water, and removing wastewater. But growing economic, demographic, and climatic pressures are increasing concerns about water issues, leading to increasing competition for fixed water resources, giving rise to the concept of peak water. As populations and economies continue to grow, consumption of water-thirsty meat expands, and new demands rise for biofuels or new water-intensive industries, new water challenges are likely. An assessment of water management in agriculture was conducted in 2007 by the International Water Management Institute in Sri Lanka to see if the world had sufficient water to provide food for its growing population. It assessed the current availability of water for agriculture on a global scale and mapped out locations suffering from water scarcity. 
It found that a fifth of the world's people, more than 1.2 billion, live in areas of physical water scarcity, where there is not enough water to meet all demands. A further 1.6 billion people live in areas experiencing economic water scarcity, where the lack of investment in water or insufficient human capacity make it impossible for authorities to satisfy the demand for water. The report found that it would be possible to produce the food required in the future, but that continuation of today's food production and environmental trends would lead to crises in many parts of the world. To avoid a global water crisis, farmers will have to strive to increase productivity to meet growing demands for food, while industries and cities find ways to use water more efficiently. Water scarcity is also caused by production of water intensive products. For example, cotton: 1 kg of cotton—equivalent of a pair of jeans—requires 10.9 cubic metres (380 cu ft) water to produce. While cotton accounts for 2.4% of world water use, the water is consumed in regions that are already at a risk of water shortage. Significant environmental damage has been caused: for example, the diversion of water by the former Soviet Union from the Amu Darya and Syr Darya rivers to produce cotton was largely responsible for the disappearance of the Aral Sea. On 7 April 1795, the gram was defined in France to be equal to "the absolute weight of a volume of pure water equal to a cube of one-hundredth of a meter, and at the temperature of melting ice". For practical purposes though, a metallic reference standard was required, one thousand times more massive, the kilogram. Work was therefore commissioned to determine precisely the mass of one liter of water. In spite of the fact that the decreed definition of the gram specified water at 0 °C (32 °F)—a highly reproducible temperature—the scientists chose to redefine the standard and to perform their measurements at the temperature of highest water density, which was measured at the time as 4 °C (39 °F). The Kelvin temperature scale of the SI system was based on the triple point of water, defined as exactly 273.16 K (0.01 °C; 32.02 °F), but as of May 2019 is based on the Boltzmann constant instead. The scale is an absolute temperature scale with the same increment as the Celsius temperature scale, which was originally defined according to the boiling point (set to 100 °C (212 °F)) and melting point (set to 0 °C (32 °F)) of water. Natural water consists mainly of the isotopes hydrogen-1 and oxygen-16, but there is also a small quantity of heavier isotopes oxygen-18, oxygen-17, and hydrogen-2 (deuterium). The percentage of the heavier isotopes is very small, but it still affects the properties of water. Water from rivers and lakes tends to contain less heavy isotopes than seawater. Therefore, standard water is defined in the Vienna Standard Mean Ocean Water specification. The human body contains, on average, 50–60% water, depending on age, gender and body size, although individuals may have anywhere between 45% and 75%. The U.S. National Academies of Sciences, Engineering, and Medicine recommends a daily intake of 3.7 liters (0.98 U.S. gallons) of water for adult men and 2.7 L (0.71 U.S. gal) for women. The precise amount depends on the level of activity, temperature, humidity, and other factors. Most of this is ingested through foods or beverages other than drinking straight water. 
Medical literature favors a lower consumption, typically 1 liter of water for an average male, excluding extra requirements due to fluid loss from exercise or warm weather. Healthy kidneys can excrete 0.8 to 1 liter of water per hour, but stress such as exercise can reduce this amount. People can drink far more water than necessary while exercising, putting them at risk of water intoxication (hyperhydration), which can be fatal. The popular claim that "a person should consume eight glasses of water per day" seems to have no real basis in science. Studies have shown that extra water intake, especially up to 500 millilitres (18 imp fl oz; 17 US fl oz) at mealtime, was associated with weight loss. Adequate fluid intake is helpful in preventing constipation. An original recommendation for water intake in 1945 by the Food and Nutrition Board of the U.S. National Research Council read: "An ordinary standard for diverse persons is 1 milliliter for each calorie of food. Most of this quantity is contained in prepared foods." The latest dietary reference intake report by the U.S. National Research Council in general recommended, based on the median total water intake from US survey data (including food sources): 3.7 litres (0.81 imp gal; 0.98 US gal) for men and 2.7 litres (0.59 imp gal; 0.71 US gal) of water total for women, noting that water contained in food provided approximately 19% of total water intake in the survey. Specifically, pregnant and breastfeeding women need additional fluids to stay hydrated. The US Institute of Medicine recommends that, on average, men consume 3 litres (0.66 imp gal; 0.79 US gal) and women 2.2 litres (0.48 imp gal; 0.58 US gal); pregnant women should increase intake to 2.4 litres (0.53 imp gal; 0.63 US gal) and breastfeeding women should get 3 liters (12 cups), since an especially large amount of fluid is lost during nursing. Also noted is that normally, about 20% of water intake comes from food, while the rest comes from drinking water and beverages (caffeinated included). Water is excreted from the body in multiple forms; through urine and feces, through sweating, and by exhalation of water vapor in the breath. With physical exertion and heat exposure, water loss will increase and daily fluid needs may increase as well. Humans require water with few impurities. Common impurities include metals like copper and lead; chemical compounds such as pesticides, PFAS, or bleach; and harmful bacteria, such as Campylobacter, E. coli O157, and Vibrio. Some solutes are acceptable and even desirable for taste enhancement and to provide needed electrolytes. The single largest (by volume) freshwater resource suitable for drinking is Lake Baikal in Siberia. Washing is a method of cleaning, usually with water and soap or detergent. Regularly washing and then rinsing both body and clothing is an essential part of good hygiene and health. Often people use soaps and detergents to assist in the emulsification of oils and dirt particles so they can be washed away. The soap can be applied directly, or with the aid of a washcloth or assisted with sponges or similar cleaning tools. In social contexts, washing refers to the act of bathing, or washing different parts of the body, such as hands, hair, or faces. Excessive washing may damage the hair, causing dandruff, or cause rough skin/skin lesions. Some washing of the body is done ritually in religions like Christianity and Judaism, as an act of purification. Washing can also refer to washing objects. 
Examples include the washing of clothing and other cloth items, like bedsheets, as well as the washing of dishes and cookware. Keeping objects clean, especially if they interact with food or the skin, can help with sanitation. Other kinds of washing focus on maintaining the cleanliness and durability of objects that get dirty, such as washing one's car by lathering the exterior with car soap, or washing tools used in a dirty process. Maritime transport (or ocean transport), or more generally waterborne transport, is the transport of people (passengers) or goods (cargo) via waterways. Freight transport by watercraft has been widely used throughout recorded history, as it provides a higher-capacity mode of transportation for passengers and cargo than land transport, the latter typically being more costly per unit payload because it is affected by terrain conditions and road/rail infrastructure. The advent of aviation during the 20th century has diminished the importance of sea travel for passengers, though it is still popular for short trips and pleasure cruises. Transport by watercraft is much cheaper than transport by aircraft or land vehicles (both road and rail), but is significantly slower for longer journeys and heavily dependent on adequate port facilities. Maritime transport accounts for roughly 80% of international trade, according to UNCTAD in 2020. Maritime transport can be realized over any distance as long as there are connecting bodies of water navigable to boats, ships or barges, such as oceans, lakes, rivers and canals. Shipping may be for commerce, recreation, or military purposes, and has been an important aspect of logistics in human societies since early shipbuilding and river engineering were developed, leading to canal ages in various civilizations. While extensive inland shipping is less critical today, the major waterways of the world, including many canals, are still very important and are integral parts of worldwide economies. Virtually any material can be moved by water; however, water transport becomes impractical when delivery is time-critical, as with various types of perishable produce. Still, water transport is highly cost-effective for regular, schedulable cargoes, such as trans-oceanic shipping of consumer products, and especially for heavy loads or bulk cargoes such as coal, coke, ores or grains. Arguably, the Industrial Revolution had its first impacts where cheap water transport by canal, navigations, or shipping by all types of watercraft on natural waterways supported cost-effective bulk transport. Containerization revolutionized maritime transport starting in the 1970s. "General cargo" includes goods packaged in boxes, cases, pallets, and barrels. When a cargo is carried in more than one mode, it is intermodal or co-modal. Water is widely used in chemical reactions as a solvent or reactant and less commonly as a solute or catalyst. In inorganic reactions, water is a common solvent, dissolving many ionic compounds, as well as other polar compounds such as ammonia and compounds closely related to water. In organic reactions, it is not usually used as a reaction solvent, because it does not dissolve the reactants well and is amphoteric (acidic and basic) and nucleophilic. Nevertheless, these properties are sometimes desirable. Also, acceleration of Diels–Alder reactions by water has been observed. Supercritical water has recently been a topic of research. Oxygen-saturated supercritical water combusts organic pollutants efficiently.
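The amphoteric character noted above, together with the neutral pH of 7 cited earlier in the article, can be illustrated by water's self-ionization. The following is a brief worked example using standard textbook values rather than figures taken from this article:

\[ \mathrm{2\,H_2O \rightleftharpoons H_3O^+ + OH^-}, \qquad K_w = [\mathrm{H_3O^+}][\mathrm{OH^-}] \approx 1.0 \times 10^{-14} \ \text{at } 25\ ^{\circ}\mathrm{C} \]

In pure water the two ion concentrations are equal, so \([\mathrm{H_3O^+}] = \sqrt{K_w} \approx 1.0 \times 10^{-7}\ \mathrm{mol/L}\) and \(\mathrm{pH} = -\log_{10}[\mathrm{H_3O^+}] = 7\), which is why neutral water is assigned a pH of 7 under ideal conditions.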
Water and steam are common heat-exchange fluids, used for both cooling and heating, because of their availability and high heat capacity. Cool water may even be naturally available from a lake or the sea. Water is especially effective at transporting heat through vaporization and condensation because of its large latent heat of vaporization (a worked comparison is given below). A disadvantage is that metals commonly used in industry, such as steel and copper, are oxidized faster by untreated water and steam. In almost all thermal power stations, water is used as the working fluid (used in a closed loop between boiler, steam turbine, and condenser) and as the coolant (used to transfer the waste heat to a water body or to carry it away by evaporation in a cooling tower). In the United States, the cooling of power plants is the largest use of water. In the nuclear power industry, water can also be used as a neutron moderator. In most nuclear reactors, water is both a coolant and a moderator. This provides something of a passive safety measure, as removing the water from the reactor also slows the nuclear reaction down. However, other methods are favored for stopping a reaction, and it is preferred to keep the nuclear core covered with water so as to ensure adequate cooling. Water has a high heat of vaporization and is relatively inert, which makes it a good fire extinguishing fluid. The evaporation of water carries heat away from the fire. It is dangerous to use water on fires involving oils and organic solvents because many organic materials float on water and the water tends to spread the burning liquid. Use of water in fire fighting should also take into account the hazards of a steam explosion, which may occur when water is used on very hot fires in confined spaces, and of a hydrogen explosion, when substances that react with water, such as certain metals or hot carbon (coal, charcoal, coke, or graphite), decompose the water, producing water gas. The power of such explosions was seen in the Chernobyl disaster, although the water involved in that case did not come from fire-fighting but from the reactor's own water cooling system. A steam explosion occurred when the extreme overheating of the core caused water to flash into steam. A hydrogen explosion may have occurred as a result of a reaction between steam and hot zirconium. Some metallic oxides, most notably those of alkali metals and alkaline earth metals, produce so much heat in reaction with water that a fire hazard can develop. The alkaline earth oxide quicklime, also known as calcium oxide, is a mass-produced substance that is often transported in paper bags. If these are soaked through, they may ignite as their contents react with water. Humans use water for many recreational purposes, as well as for exercising and for sports. Some of these include swimming, waterskiing, boating, surfing and diving. In addition, some sports, like ice hockey and ice skating, are played on ice. Lakesides, beaches and water parks are popular places for people to go to relax and enjoy recreation. Many find the sound and appearance of flowing water to be calming, and fountains and other flowing water structures are popular decorations. Some keep fish and other flora and fauna inside aquariums or ponds for show, fun, and companionship. Humans also use water for snow sports such as skiing, sledding, snowmobiling or snowboarding, which require the water to be frozen, either as ice or crystallized into snow.
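The worked comparison referred to above uses standard handbook values for water's specific heat capacity and latent heat of vaporization; the particular numbers below are common reference figures, not values quoted in this article:

\[ Q_{\text{sensible}} = m\,c\,\Delta T = (1\ \mathrm{kg})(4.18\ \mathrm{kJ\,kg^{-1}\,K^{-1}})(10\ \mathrm{K}) \approx 42\ \mathrm{kJ} \]
\[ Q_{\text{latent}} = m\,L_v \approx (1\ \mathrm{kg})(2257\ \mathrm{kJ\,kg^{-1}}) \approx 2.3 \times 10^{3}\ \mathrm{kJ} \]

Evaporating a kilogram of water therefore transfers roughly fifty times as much heat as warming the same kilogram by 10 °C, which is why steam heating systems, condensers and evaporative cooling towers exploit the liquid–vapour phase change rather than relying on temperature change alone.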
The water industry provides drinking water and wastewater services (including sewage treatment) to households and industry. Water supply facilities include water wells, cisterns for rainwater harvesting, water supply networks, and water purification facilities, water tanks, water towers, water pipes including old aqueducts. Atmospheric water generators are in development. Drinking water is often collected at springs, extracted from artificial borings (wells) in the ground, or pumped from lakes and rivers. Building more wells in adequate places is thus a possible way to produce more water, assuming the aquifers can supply an adequate flow. Other water sources include rainwater collection. Water may require purification for human consumption. This may involve the removal of undissolved substances, dissolved substances and harmful microbes. Popular methods are filtering with sand which only removes undissolved material, while chlorination and boiling kill harmful microbes. Distillation does all three functions. More advanced techniques exist, such as reverse osmosis. Desalination of abundant seawater is a more expensive solution used in coastal arid climates. The distribution of drinking water is done through municipal water systems, tanker delivery or as bottled water. Governments in many countries have programs to distribute water to the needy at no charge. Reducing usage by using drinking (potable) water only for human consumption is another option. In some cities such as Hong Kong, seawater is extensively used for flushing toilets citywide to conserve freshwater resources. Polluting water may be the biggest single misuse of water; to the extent that a pollutant limits other uses of the water, it becomes a waste of the resource, regardless of benefits to the polluter. Like other types of pollution, this does not enter standard accounting of market costs, being conceived as externalities for which the market cannot account. Thus other people pay the price of water pollution, while the private firms' profits are not redistributed to the local population, victims of this pollution. Pharmaceuticals consumed by humans often end up in the waterways and can have detrimental effects on aquatic life if they bioaccumulate and if they are not biodegradable. Municipal and industrial wastewater are typically treated at wastewater treatment plants. Mitigation of polluted surface runoff is addressed through a variety of prevention and treatment techniques. Many industrial processes rely on reactions using chemicals dissolved in water, suspension of solids in water slurries or using water to dissolve and extract substances, or to wash products or process equipment. Processes such as mining, chemical pulping, pulp bleaching, paper manufacturing, textile production, dyeing, printing, and cooling of power plants use large amounts of water, requiring a dedicated water source, and often cause significant water pollution. Water is used in power generation. Hydroelectricity is electricity obtained from hydropower. Hydroelectric power comes from water driving a water turbine connected to a generator. Hydroelectricity is a low-cost, non-polluting, renewable energy source. The energy is supplied by the motion of water. Typically a dam is constructed on a river, creating an artificial lake behind it. Water flowing out of the lake is forced through turbines that turn generators. Pressurized water is used in water blasting and water jet cutters. High pressure water guns are used for precise cutting. 
It works very well, is relatively safe, and is not harmful to the environment. Water is also used to cool machinery, preventing overheating and keeping saw blades from overheating. Water is also used in many industrial processes and machines, such as the steam turbine and heat exchanger, in addition to its use as a chemical solvent. Discharge of untreated water from industrial uses is a form of pollution. Pollution includes discharged solutes (chemical pollution) and discharged coolant water (thermal pollution). Industry requires pure water for many applications and uses a variety of purification techniques both in water supply and discharge. The digital sector, especially artificial intelligence, uses large amounts of water, so AI expansion can "threaten global and national water security". Boiling, steaming, and simmering are popular cooking methods that often require immersing food in water or its gaseous state, steam. Water is also used for dishwashing. Water also plays many critical roles within the field of food science. Solutes such as salts and sugars found in water affect the physical properties of water. The boiling and freezing points of water are affected by solutes, as well as by air pressure, which is in turn affected by altitude. Water boils at lower temperatures with the lower air pressure that occurs at higher elevations. One mole of sucrose (sugar) per kilogram of water raises the boiling point of water by 0.51 °C (0.918 °F), and one mole of salt per kg raises the boiling point by 1.02 °C (1.836 °F); similarly, increasing the number of dissolved particles lowers water's freezing point. Solutes in water also affect water activity, which in turn affects many chemical reactions and the growth of microbes in food. Water activity can be described as the ratio of the vapor pressure of water in a solution to the vapor pressure of pure water. Solutes in water lower water activity; this is important because most bacterial growth ceases at low levels of water activity. Microbial growth affects not only the safety of food but also its preservation and shelf life. Water hardness is also a critical factor in food processing and may be altered or treated by using a chemical ion exchange system. It can dramatically affect the quality of a product, as well as play a role in sanitation. Water hardness is classified based on the concentration of calcium carbonate the water contains. Water is classified as soft if it contains less than 100 mg/L (UK) or less than 60 mg/L (US). According to a report published by the Water Footprint organization in 2010, a single kilogram of beef requires 15 thousand litres (3.3×10^3 imp gal; 4.0×10^3 US gal) of water; however, the authors also make clear that this is a global average and circumstantial factors determine the amount of water used in beef production. Water for injection is on the World Health Organization's list of essential medicines. Distribution in nature Much of the universe's water is produced as a byproduct of star formation. The formation of stars is accompanied by a strong outward wind of gas and dust. When this outflow of material eventually impacts the surrounding gas, the shock waves that are created compress and heat the gas. The water observed is quickly produced in this warm dense gas. On 22 July 2011, a report described the discovery of a gigantic cloud of water vapor containing "140 trillion times more water than all of Earth's oceans combined" around a quasar located 12 billion light years from Earth.
According to the researchers, the "discovery shows that water has been prevalent in the universe for nearly its entire existence". Water has been detected in interstellar clouds within the Milky Way. Water probably exists in abundance in other galaxies, too, because its components, hydrogen and oxygen, are among the most abundant elements in the universe. Based on models of the formation and evolution of the Solar System and of other star systems, most other planetary systems are likely to have similar ingredients. Water vapor is present in the atmospheres of many bodies in the Solar System. Liquid water is present on Earth, covering 71% of its surface. Liquid water is also occasionally present in small amounts on Mars. Scientists believe liquid water is present in the Saturnian moons Enceladus, as a 10-kilometre-thick ocean approximately 30–40 kilometres below Enceladus' south polar surface, and Titan, as a subsurface layer possibly mixed with ammonia. Jupiter's moon Europa has surface characteristics which suggest a subsurface liquid water ocean. Liquid water may also exist on Jupiter's moon Ganymede as a layer sandwiched between high-pressure ice and rock. Water ice is present on many other Solar System bodies, and is likely present on others still. Water and other volatiles probably comprise much of the internal structures of Uranus and Neptune, and the water in the deeper layers may be in the form of ionic water, in which the molecules break down into a soup of hydrogen and oxygen ions, and deeper still as superionic water, in which the oxygen crystallizes but the hydrogen ions float about freely within the oxygen lattice. The existence of liquid water, and to a lesser extent its gaseous and solid forms, on Earth is vital to the existence of life on Earth as we know it. The Earth is located in the habitable zone of the Solar System; if it were slightly closer to or farther from the Sun (about 5%, or about 8 million kilometers), the conditions which allow the three forms to be present simultaneously would be far less likely to exist. Earth's size also plays a role: its gravity allows it to hold an atmosphere, including air moisture. Smaller planets like Mars have extremely thin or no atmospheres. Water vapor and carbon dioxide in the atmosphere provide a temperature buffer (greenhouse effect) which helps maintain a relatively steady surface temperature.[citation needed] The surface temperature of Earth has been relatively constant through geologic time despite varying levels of incoming solar radiation (insolation), indicating that a dynamic process governs Earth's temperature via a combination of greenhouse gases and surface or atmospheric albedo. This proposal is known as the Gaia hypothesis.[citation needed] The state of water on a planet depends on ambient pressure, which is determined by the planet's gravity. If a planet is sufficiently massive, the water on it may be solid even at high temperatures, because of the high pressure caused by gravity, as was observed on the exoplanets Gliese 436 b and GJ 1214 b. Law, politics, and crisis Water politics is politics affected by water and water resources. Water, particularly fresh water, is a strategic resource across the world and an important element in many political conflicts. It causes health impacts and damage to biodiversity. Access to safe drinking water has improved over the last decades in almost every part of the world, but approximately one billion people still lack access to safe water and over 2.5 billion lack access to adequate sanitation.
However, some observers have estimated that by 2025 more than half of the world population will be facing water-based vulnerability. A report, issued in November 2009, suggests that by 2030, in some developing regions of the world, water demand will exceed supply by 50%. 1.6 billion people have gained access to a safe water source since 1990. The proportion of people in developing countries with access to safe water is calculated to have improved from 30% in 1970 to 71% in 1990, 79% in 2000, and 84% in 2004. A 2006 United Nations report stated that "there is enough water for everyone", but that access to it is hampered by mismanagement and corruption. In addition, global initiatives to improve the efficiency of aid delivery, such as the Paris Declaration on Aid Effectiveness, have not been taken up by water sector donors as effectively as they have in education and health, potentially leaving multiple donors working on overlapping projects and recipient governments without empowerment to act. The authors of the 2007 Comprehensive Assessment of Water Management in Agriculture cited poor governance as one reason for some forms of water scarcity. Water governance is the set of formal and informal processes through which decisions related to water management are made. Good water governance is primarily about knowing what processes work best in a particular physical and socioeconomic context. Mistakes have sometimes been made by trying to apply 'blueprints' that work in the developed world to developing world locations and contexts. The Mekong river is one example; a review by the International Water Management Institute of policies in six countries that rely on the Mekong river for water found that thorough and transparent cost-benefit analyses and environmental impact assessments were rarely undertaken. They also discovered that Cambodia's draft water law was much more complex than it needed to be. In 2004, the UK charity WaterAid reported that a child dies every 15 seconds from easily preventable water-related diseases, which are often tied to a lack of adequate sanitation. Since 2003, the UN World Water Development Report, produced by the UNESCO World Water Assessment Programme, has provided decision-makers with tools for developing sustainable water policies. The 2023 report states that two billion people (26% of the population) do not have access to drinking water and 3.6 billion (46%) lack access to safely managed sanitation. People in urban areas (2.4 billion) will face water scarcity by 2050. Water scarcity has been described as endemic, due to overconsumption and pollution. The report states that 10% of the world's population lives in countries with high or critical water stress. Yet over the past 40 years, water consumption has increased by around 1% per year, and is expected to grow at the same rate until 2050. Since 2000, flooding in the tropics has quadrupled, while flooding in northern mid-latitudes has increased by a factor of 2.5. The cost of these floods between 2000 and 2019 was 100,000 deaths and $650 million. Organizations concerned with water protection include the International Water Association (IWA), WaterAid, Water 1st, and the American Water Resources Association. The International Water Management Institute undertakes projects with the aim of using effective water management to reduce poverty. 
Water-related conventions include the United Nations Convention to Combat Desertification (UNCCD), the International Convention for the Prevention of Pollution from Ships, the United Nations Convention on the Law of the Sea, and the Ramsar Convention. World Day for Water takes place on 22 March and World Oceans Day on 8 June. In 2026 the United Nations declared that humanity has entered an era of "water bankruptcy", because more than 50% of large lakes have declined since the 1990s, 35% of wetlands have disappeared since 1970, 75% of the world's population lives in water-insecure countries, 4 billion people experience water scarcity for at least one month per year, and droughts cost $307 billion per year. The declaration calls for policies that match present realities rather than past norms. In culture Water is considered a purifier in most religions. Faiths that incorporate ritual washing (ablution) include Christianity, Hinduism, Islam, Judaism, the Rastafari movement, Shinto, Taoism, and Wicca. Immersion (or aspersion or affusion) of a person in water is a central sacrament of Christianity (where it is called baptism); it is also a part of the practice of other religions, including Islam (Ghusl), Judaism (mikvah) and Sikhism (Amrit Sanskar). In addition, a ritual bath in pure water is performed for the dead in many religions, including Islam and Judaism. In Islam, the five daily prayers can be done in most cases after washing certain parts of the body using clean water (wudu), unless water is unavailable (see Tayammum). In Shinto, water is used in almost all rituals to cleanse a person or an area (e.g., in the ritual of misogi). In Christianity, holy water is water that has been sanctified by a priest for the purpose of baptism, the blessing of persons, places, and objects, or as a means of repelling evil. In Zoroastrianism, water (āb) is respected as the source of life. The Ancient Greek philosopher Empedocles saw water as one of the four classical elements (along with fire, earth, and air), and regarded it as the ylem, or basic substance of the universe. Thales, whom Aristotle portrayed as an astronomer and an engineer, theorized that the earth, which is denser than water, emerged from the water. Thales, a monist, believed further that all things are made from water. Plato believed that the shape of water is an icosahedron – flowing easily compared to the cube-shaped earth. The theory of the four bodily humors associated water with phlegm, as being cold and moist. The classical element of water was also one of the five elements in traditional Chinese philosophy (along with earth, fire, wood, and metal). Some traditional and popular Asian philosophical systems take water as a role model. James Legge's 1891 translation of the Dao De Jing states, "The highest excellence is like (that of) water. The excellence of water appears in its benefiting all things, and in its occupying, without striving (to the contrary), the low place which all men dislike. Hence (its way) is near to (that of) the Tao" and "There is nothing in the world more soft and weak than water, and yet for attacking things that are firm and strong there is nothing that can take precedence of it—for there is nothing (so effectual) for which it can be changed." The Guanzi, in its "Shui di" 水地 chapter, further elaborates on the symbolism of water, proclaiming that "man is water" and attributing natural qualities of the people of different Chinese regions to the character of local water resources.
"Living water" features in Germanic and Slavic folktales as a means of bringing the dead back to life. Note the Grimm fairy-tale ("The Water of Life") and the Russian dichotomy of living [ru] and dead water [ru]. The Fountain of Youth represents a related concept of magical waters allegedly preventing aging. In the significant modernist novel Ulysses (1922) by Irish writer James Joyce, the chapter "Ithaca" takes the form of a catechism of 309 questions and answers, one of which is known as the "water hymn".: 91 According to Richard E. Madtes, the hymn is not merely a "monotonous string of facts", rather, its phrases, like their subject, "ebb and flow, heave and swell, gather and break, until they subside into the calm quiescence of the concluding 'pestilential fens, faded flowerwater, stagnant pools in the waning moon.'": 79 The hymn is considered one of the most remarkable passages in Ithaca, and according to literary critic Hugh Kenner, achieves "the improbable feat of raising to poetry all the clutter of footling information that has accumulated in schoolbooks.": 91 The literary motif of water represents the novel's theme of "everlasting, everchanging life," and the hymn represents the culmination of the motif in the novel.: 91 The following is the hymn quoted in full. What in water did Bloom, waterlover, drawer of water, watercarrier returning to the range, admire?Its universality: its democratic equality and constancy to its nature in seeking its own level: its vastness in the ocean of Mercator’s projection: its unplumbed profundity in the Sundam trench of the Pacific exceeding 8,000 fathoms: the restlessness of its waves and surface particles visiting in turn all points of its seaboard: the independence of its units: the variability of states of sea: its hydrostatic quiescence in calm: its hydrokinetic turgidity in neap and spring tides: its subsidence after devastation: its sterility in the circumpolar icecaps, arctic and antarctic: its climatic and commercial significance: its preponderance of 3 to 1 over the dry land of the globe: its indisputable hegemony extending in square leagues over all the region below the subequatorial tropic of Capricorn: the multisecular stability of its primeval basin: its luteofulvous bed: its capacity to dissolve and hold in solution all soluble substances including millions of tons of the most precious metals: its slow erosions of peninsulas and downwardtending promontories: its alluvial deposits: its weight and volume and density: its imperturbability in lagoons and highland tarns: its gradation of colours in the torrid and temperate and frigid zones: its vehicular ramifications in continental lakecontained streams and confluent oceanflowing rivers with their tributaries and transoceanic currents: gulfstream, north and south equatorial courses: its violence in seaquakes, waterspouts, artesian wells, eruptions, torrents, eddies, freshets, spates, groundswells, watersheds, waterpartings, geysers, cataracts, whirlpools, maelstroms, inundations, deluges, cloudbursts: its vast circumterrestrial ahorizontal curve: its secrecy in springs, and latent humidity, revealed by rhabdomantic or hygrometric instruments and exemplified by the well by the hole in the wall at Ashtown gate, saturation of air, distillation of dew: the simplicity of its composition, two constituent parts of hydrogen with one constituent part of oxygen: its healing virtues: its buoyancy in the waters of the Dead Sea: its persevering penetrativeness in runnels, gullies, inadequate dams, 
leaks on shipboard: its properties for cleansing, quenching thirst and fire, nourishing vegetation: its infallibility as paradigm and paragon: its metamorphoses as vapour, mist, cloud, rain, sleet, snow, hail: its strength in rigid hydrants: its variety of forms in loughs and bays and gulfs and bights and guts and lagoons and atolls and archipelagos and sounds and fjords and minches and tidal estuaries and arms of sea: its solidity in glaciers, icebergs, icefloes: its docility in working hydraulic millwheels, turbines, dynamos, electric power stations, bleachworks, tanneries, scutchmills: its utility in canals, rivers, if navigable, floating and graving docks: its potentiality derivable from harnessed tides or watercourses falling from level to level: its submarine fauna and flora (anacoustic, photophobe) numerically, if not literally, the inhabitants of the globe: its ubiquity as constituting 90% of the human body: the noxiousness of its effluvia in lacustrine marshes, pestilential fens, faded flowerwater, stagnant pools in the waning moon. Painter and activist Fredericka Foster curated The Value of Water, at the Cathedral of St. John the Divine in New York City, which anchored a year-long initiative by the Cathedral on our dependence on water. The largest exhibition to ever appear at the Cathedral, it featured over forty artists, including Jenny Holzer, Robert Longo, Mark Rothko, William Kentridge, April Gornik, Kiki Smith, Pat Steir, Alice Dalton Brown, Teresita Fernandez and Bill Viola. Foster created Think About Water,[full citation needed] an ecological collective of artists who use water as their subject or medium. Members include Basia Irland,[full citation needed] Aviva Rahmani, Betsy Damon, Diane Burko, Leila Daw, Stacy Levy, Charlotte Coté, Meridel Rubenstein, and Anna Macleod. To mark the 10th anniversary of access to water and sanitation being declared a human right by the UN, the charity WaterAid commissioned ten visual artists to show the impact of clean water on people's lives. 'Dihydrogen monoxide' is a technically correct but rarely used chemical name of water. This name has been used in a series of hoaxes and pranks that mock scientific illiteracy. This began in 1983, when an April Fools' Day article appeared in a newspaper in Durand, Michigan. The false story consisted of safety concerns about the substance. The word "Water" has been used by many Florida based rappers as a sort of catchphrase or adlib. Rappers who have done this include BLP Kosher and Ski Mask the Slump God. To go even further some rappers have made whole songs dedicated to the water in Florida, such as the 2023 Danny Towers song "Florida Water". Others have made whole songs dedicated to water as a whole, such as XXXTentacion, and Ski Mask the Slump God with their hit song "H2O". See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Lod#cite_ref-:6_37-0] | [TOKENS: 4733] |
Contents Lod Lod (Hebrew: לוד, fully vocalized: לֹד), also known as Lydda (Ancient Greek: Λύδδα) and Lidd (Arabic: اللِّدّ, romanized: al-Lidd, or اللُّدّ, al-Ludd), is a city 15 km (9+1⁄2 mi) southeast of Tel Aviv and 40 km (25 mi) northwest of Jerusalem in the Central District of Israel. It is situated between the lower Shephelah on the east and the coastal plain on the west. The city had a population of 90,814 in 2023. Lod has been inhabited since at least the Neolithic period. It is mentioned a few times in the Hebrew Bible and in the New Testament. Between the 5th century BCE and up until the late Roman period, it was a prominent center for Jewish scholarship and trade. Around 200 CE, the city became a Roman colony and was renamed Diospolis (Ancient Greek: Διόσπολις, lit. 'city of Zeus'). Tradition identifies Lod as the 4th century martyrdom site of Saint George; the Church of Saint George and Mosque of Al-Khadr located in the city is believed to have housed his remains. Following the Arab conquest of the Levant, Lod served as the capital of Jund Filastin; however, a few decades later, the seat of power was transferred to Ramla, and Lod slipped in importance. Under Crusader rule, the city was a Catholic diocese of the Latin Church and it remains a titular see to this day.[citation needed] Lod underwent a major change in its population in the mid-20th century. Exclusively Palestinian Arab in 1947, Lod was part of the area designated for an Arab state in the United Nations Partition Plan for Palestine; however, in July 1948, the city was occupied by the Israel Defense Forces, and most of its Arab inhabitants were expelled in the Palestinian expulsion from Lydda and Ramle. The city was largely resettled by Jewish immigrants, most of them expelled from Arab countries. Today, Lod is one of Israel's mixed cities, with an Arab population of 30%. Lod is one of Israel's major transportation hubs. The main international airport, Ben Gurion Airport, is located 8 km (5 miles) north of the city. The city is also a major railway and road junction. Religious references The Hebrew name Lod appears in the Hebrew Bible as a town of Benjamin, founded along with Ono by Shamed or Shamer (1 Chronicles 8:12; Ezra 2:33; Nehemiah 7:37; 11:35). In Ezra 2:33, it is mentioned as one of the cities whose inhabitants returned after the Babylonian captivity. Lod is not mentioned among the towns allocated to the tribe of Benjamin in Joshua 18:11–28. The name Lod derives from a tri-consonantal root not extant in Northwest Semitic, but only in Arabic (“to quarrel; withhold, hinder”). An Arabic etymology of such an ancient name is unlikely (the earliest attestation is from the Achaemenid period). In the New Testament, the town appears in its Greek form, Lydda, as the site of Peter's healing of Aeneas in Acts 9:32–38. The city is also mentioned in an Islamic hadith as the location of the battlefield where the false messiah (al-Masih ad-Dajjal) will be slain before the Day of Judgment. History The first occupation dates to the Neolithic in the Near East and is associated with the Lodian culture. Occupation continued in the Levant Chalcolithic. Pottery finds have dated the initial settlement in the area now occupied by the town to 5600–5250 BCE. In the Early Bronze, it was an important settlement in the central coastal plain between the Judean Shephelah and the Mediterranean coast, along Nahal Ayalon. Other important nearby sites were Tel Dalit, Tel Bareqet, Khirbat Abu Hamid (Shoham North), Tel Afeq, Azor and Jaffa. 
Two architectural phases belong to the late EB I in Area B. The first phase had a mudbrick wall, while the late phase included a circular stone structure. Later excavations have exposed a further occupation layer, Stratum IV. It consists of two phases: Stratum IVb had a mudbrick wall on stone foundations with rounded exterior corners, while in Stratum IVa there was a mudbrick wall without stone foundations, together with imported Egyptian pottery and local imitations. Another excavation revealed nine occupation strata. Strata VI–III belonged to Early Bronze IB. The material culture showed Egyptian imports in strata V and IV. Occupation continued into Early Bronze II with four strata (V–II). There was continuity in the material culture and indications of centralized urban planning. North of the tell were scattered MB II burials. The earliest written record is in a list of Canaanite towns drawn up by the Egyptian pharaoh Thutmose III at Karnak in 1465 BCE. From the fifth century BCE until the Roman period, the city was a centre of Jewish scholarship and commerce. According to British historian Martin Gilbert, during the Hasmonean period, Jonathan Maccabee and his brother, Simon Maccabaeus, enlarged the area under Jewish control, which included conquering the city. The Jewish community in Lod during the Mishnah and Talmud era is described in a significant number of sources, including information on its institutions, demographics, and way of life. The city reached its height as a Jewish center between the First Jewish-Roman War and the Bar Kokhba revolt, and again in the days of Judah ha-Nasi and the start of the Amoraim period. The city was then the site of numerous public institutions, including schools, study houses, and synagogues. In 43 BC, Cassius, the Roman governor of Syria, sold the inhabitants of Lod into slavery, but they were set free two years later by Mark Antony. During the First Jewish–Roman War, the Roman proconsul of Syria, Cestius Gallus, razed the town on his way to Jerusalem in Tishrei 66 CE. According to Josephus, "[he] found the city deserted, for the entire population had gone up to Jerusalem for the Feast of Tabernacles. He killed fifty people whom he found, burned the town and marched on". Lydda was occupied by Emperor Vespasian in 68 CE. In the period following the destruction of Jerusalem in 70 CE, Rabbi Tarfon, who appears in many Tannaitic and Jewish legal discussions, served as a rabbinic authority in Lod. During the Kitos War, 115–117 CE, the Roman army laid siege to Lod, where the rebel Jews had gathered under the leadership of Julian and Pappos. Torah study was outlawed by the Romans and pursued mostly underground. The distress became so great that the patriarch Rabban Gamaliel II, who was shut up there and died soon afterwards, permitted fasting on Ḥanukkah. Other rabbis disagreed with this ruling. Lydda was next taken and many of the Jews were executed; the "slain of Lydda" are often mentioned in words of reverential praise in the Talmud. In 200 CE, the emperor Septimius Severus elevated the town to the status of a city, calling it Colonia Lucia Septimia Severa Diospolis. The name Diospolis ("City of Zeus") may have been bestowed earlier, possibly by Hadrian. At that point, most of its inhabitants were Christian. The earliest known bishop is Aëtius, a friend of Arius. During the following century (200–300 CE), Joshua ben Levi is said to have founded a yeshiva in Lod. In December 415, the Council of Diospolis was held in the city to try Pelagius; he was acquitted.
In the sixth century, the city was renamed Georgiopolis after St. George, who was born there between 256 and 285 CE and served as a soldier in the guard of the emperor Diocletian. The Church of Saint George and Mosque of Al-Khadr is named for him. The 6th-century Madaba map shows Lydda as an unwalled city with a cluster of buildings under a black inscription reading "Lod, also Lydea, also Diospolis". An isolated large building with a semicircular colonnaded plaza in front of it might represent the St George shrine. After the Muslim conquest of Palestine by Amr ibn al-'As in 636 CE, Lod, referred to in Arabic as "al-Ludd", served as the capital of Jund Filastin ("Military District of Palaestina") before the seat of power was moved to nearby Ramla during the reign of the Umayyad Caliph Suleiman ibn Abd al-Malik in 715–716. The population of al-Ludd was relocated to Ramla as well. With the relocation of its inhabitants and the construction of the White Mosque in Ramla, al-Ludd lost its importance and fell into decay. The city was visited by the local Arab geographer al-Muqaddasi in 985, when it was under the Fatimid Caliphate, and was noted for its Great Mosque, which served the residents of al-Ludd, Ramla, and the nearby villages. He also wrote of the city's "wonderful church (of St. George) at the gate of which Christ will slay the Antichrist." The Crusaders occupied the city in 1099 and named it St Jorge de Lidde. It was briefly conquered by Saladin, but retaken by the Crusaders in 1191. For the English Crusaders, it was a place of great significance as the birthplace of Saint George. The Crusaders made it the seat of a Latin Church diocese, and it remains a titular see. It owed the service of 10 knights and 20 sergeants, and it had its own burgess court during this era. In 1226, the Ayyubid-era Syrian geographer Yaqut al-Hamawi visited al-Ludd and stated that it was part of the Jerusalem District under Ayyubid rule. Sultan Baybars brought Lydda again under Muslim control by 1267–68. According to Qalqashandi, Lydda was an administrative centre of a wilaya during the fourteenth and fifteenth centuries in the Mamluk empire. Mujir al-Din described it as a pleasant village with an active Friday mosque. During this time, Lydda was a station on the postal route between Cairo and Damascus. In 1517, Lydda was incorporated into the Ottoman Empire as part of the Damascus Eyalet, and in the 1550s the revenues of Lydda were designated for the new waqf of Hasseki Sultan Imaret in Jerusalem, established by Hasseki Hurrem Sultan (Roxelana), the wife of Suleiman the Magnificent. By 1596 Lydda was part of the nahiya ("subdistrict") of Ramla, which was under the administration of the liwa ("district") of Gaza. It had a population of 241 Muslim households and 14 Muslim bachelors, along with 233 Christian households. They paid a fixed tax rate of 33.3% on agricultural products, including wheat, barley, summer crops, vineyards, fruit trees, sesame, special products ("dawalib", spinning wheels), goats and beehives, in addition to occasional revenues and a market toll, for a total of 45,000 akçe. All of the revenue went to the waqf. In 1051 AH (1641/2 CE), the Bedouin tribe of al-Sawālima from around Jaffa attacked the villages of Subṭāra, Bayt Dajan, al-Sāfiriya, Jindās, Lydda and Yāzūr belonging to the Waqf Haseki Sultan. The village appeared as Lydda, though misplaced, on the map of Pierre Jacotin compiled in 1799. Missionary William M.
Thomson visited Lydda in the mid-19th century, describing it as a "flourishing village of some 2,000 inhabitants, imbosomed in noble orchards of olive, fig, pomegranate, mulberry, sycamore, and other trees, surrounded every way by a very fertile neighbourhood. The inhabitants are evidently industrious and thriving, and the whole country between this and Ramleh is fast being filled up with their flourishing orchards. Rarely have I beheld a rural scene more delightful than this presented in early harvest ... It must be seen, heard, and enjoyed to be appreciated." In 1869, the population of Ludd was given as 55 Catholics, 1,940 "Greeks", 5 Protestants and 4,850 Muslims. In 1870, the Church of Saint George was rebuilt. In 1892, the first railway station in the entire region was established in the city. In the second half of the 19th century, Jewish merchants migrated to the city, but left after the 1921 Jaffa riots. In 1882, the Palestine Exploration Fund's Survey of Western Palestine described Lod as "A small town, standing among enclosure of prickly pear, and having fine olive groves around it, especially to the south. The minaret of the mosque is a very conspicuous object over the whole of the plain. The inhabitants are principally Moslim, though the place is the seat of a Greek bishop resident of Jerusalem. The Crusading church has lately been restored, and is used by the Greeks. Wells are found in the gardens...." From 1918, Lydda was under the administration of the British Mandate in Palestine, as per a League of Nations decree that followed the Great War. During the Second World War, the British set up supply posts in and around Lydda and its railway station, also building an airport that was renamed Ben Gurion Airport after the death of Israel's first prime minister in 1973. At the time of the 1922 census of Palestine, Lydda had a population of 8,103 inhabitants (7,166 Muslims, 926 Christians, and 11 Jews); the Christians comprised 921 Orthodox, 4 Roman Catholics and 1 Melkite. This had increased by the 1931 census to 11,250 (10,002 Muslims, 1,210 Christians, 28 Jews, and 10 Bahai), in a total of 2,475 residential houses. In 1938, Lydda had a population of 12,750. In 1945, Lydda had a population of 16,780 (14,910 Muslims, 1,840 Christians, 20 Jews and 10 "other"). Until 1948, Lydda was an Arab town with a population of around 20,000: 18,500 Muslims and 1,500 Christians. In 1947, the United Nations proposed dividing Mandatory Palestine into two states, one Jewish and one Arab; Lydda was to form part of the proposed Arab state. In the ensuing war, Israel captured Arab towns outside the area the UN had allotted it, including Lydda. In December 1947, thirteen Jewish passengers in a seven-car convoy to Ben Shemen Youth Village were ambushed and murdered. In a separate incident, three Jewish youths, two men and a woman, were captured, then raped and murdered in a neighbouring village. Their bodies were paraded in Lydda's principal street. The Israel Defense Forces entered Lydda on 11 July 1948. The following day, under the impression that it was under attack, the 3rd Battalion was ordered to shoot anyone "seen on the streets". According to Israel, 250 Arabs were killed. Other estimates are higher: Arab historian Aref al Aref estimated 400, and Nimr al Khatib 1,700. In 1948, the population rose to 50,000 during the Nakba, as Arab refugees fleeing other areas made their way there.
A key event was the Palestinian expulsion from Lydda and Ramle, in which the Israel Defense Forces expelled 50,000–70,000 Palestinians from the two towns. All but 700 to 1,056 were expelled by order of the Israeli high command and forced to walk 17 km (10.5 mi) to the Jordanian Arab Legion lines. Estimates of those who died from exhaustion and dehydration vary from a handful to 355. The town was subsequently sacked by the Israeli army. Some scholars, including Ilan Pappé, characterize this as ethnic cleansing. The few hundred Arabs who remained in the city were soon outnumbered by the influx of Jews who immigrated to Lod from August 1948 onward, most of them from Arab countries. As a result, Lod became a predominantly Jewish town. After the establishment of the state, the biblical name Lod was readopted. The Jewish immigrants who settled Lod came in waves, first from Morocco and Tunisia, later from Ethiopia, and then from the former Soviet Union. Since 2008, many urban development projects have been undertaken to improve the image of the city. Upscale neighbourhoods have been built, among them Ganei Ya'ar and Ahisemah, expanding the city to the east. According to a 2010 report in the Economist, a three-meter-high wall was built between Jewish and Arab neighbourhoods, and construction in Jewish areas was given priority over construction in Arab neighbourhoods. The newspaper says that violent crime in the Arab sector revolves mainly around family feuds over turf and honour crimes. In 2010, the Lod Community Foundation organised an event for representatives of bicultural youth movements, volunteer aid organisations, educational start-ups, businessmen, sports organizations, and conservationists working on programmes to better the city. In the 2021 Israel–Palestine crisis, a state of emergency was declared in Lod after Arab rioting led to the death of an Israeli Jew. The Mayor of Lod, Yair Revivio, urged Prime Minister of Israel Benjamin Netanyahu to deploy the Israel Border Police to restore order in the city. This was the first time since 1966 that Israel had declared this kind of emergency lockdown. International media noted that both Jewish and Palestinian mobs were active in Lod, but the "crackdown came for one side" only. Demographics From the 19th century until the Lydda Death March, Lod was an exclusively Muslim and Christian town, with an estimated 6,850 inhabitants, of whom approximately 2,000 (29%) were Christian. According to the Israel Central Bureau of Statistics (CBS), the population of Lod in 2010 was 69,500 people. According to the 2019 census, the population of Lod was 77,223, of whom 53,581 people (69.4% of the city's population) were classified as "Jews and Others" and 23,642 (30.6%) as "Arab". Education According to the CBS, the city has 38 schools with 13,188 pupils: 26 elementary schools with 8,325 elementary school pupils, and 13 high schools with 4,863 high school pupils. About 52.5% of 12th-grade pupils were entitled to a matriculation certificate in 2001.[citation needed] Economy The airport and related industries are a major source of employment for the residents of Lod. Other important factories in the city are the communication equipment company "Talard"; "Cafe-Co", a subsidiary of the Strauss Group; and "Kashev", the computer center of Bank Leumi. A Jewish Agency Absorption Centre is also located in Lod. According to CBS figures for 2000, 23,032 people were salaried workers and 1,405 were self-employed.
The mean monthly wage for a salaried worker was NIS 4,754, a real change of 2.9% over the course of 2000. Salaried men had a mean monthly wage of NIS 5,821 (a real change of 1.4%) versus NIS 3,547 for women (a real change of 4.6%). The mean income for the self-employed was NIS 4,991. About 1,275 people were receiving unemployment benefits and 7,145 were receiving an income supplement. Art and culture In 2009-2010, Dor Guez held an exhibit, Georgeopolis, at the Petach Tikva art museum that focuses on Lod. Archaeology A well-preserved mosaic floor dating to the Roman period was excavated in 1996 as part of a salvage dig conducted on behalf of the Israel Antiquities Authority and the Municipality of Lod, prior to widening HeHalutz Street. According to Jacob Fisch, executive director of the Friends of the Israel Antiquities Authority, a worker at the construction site noticed the tail of a tiger and halted work. The mosaic was initially covered over with soil at the conclusion of the excavation for lack of funds to conserve and develop the site. The mosaic is now part of the Lod Mosaic Archaeological Center. The floor, with its colorful display of birds, fish, exotic animals and merchant ships, is believed to have been commissioned by a wealthy resident of the city for his private home. The Lod Community Archaeology Program, which operates in ten Lod schools, five Jewish and five Israeli Arab, combines archaeological studies with participation in digs in Lod. Sports The city's major football club, Hapoel Bnei Lod, plays in Liga Leumit (the second division). Its home is at the Lod Municipal Stadium. The club was formed by a merger of Bnei Lod and Rakevet Lod in the 1980s. Two other clubs in the city play in the regional leagues: Hapoel MS Ortodoxim Lod in Liga Bet and Maccabi Lod in Liga Gimel. Hapoel Lod played in the top division during the 1960s and 1980s, and won the State Cup in 1984. The club folded in 2002. A new club, Hapoel Maxim Lod (named after former mayor Maxim Levy) was established soon after, but folded in 2007. Notable people Twin towns-sister cities Lod is twinned with: See also References Bibliography External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Maximum_degree] | [TOKENS: 63] |
Glossary of graph theory This is a glossary of graph theory. Graph theory is the study of graphs, systems of nodes or vertices connected in pairs by lines or edges. See also References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Powerhouse_(programming_language)] | [TOKENS: 1318] |
Contents PowerHouse (programming language) PowerHouse is a byte-compiled fourth-generation programming language (or 4GL) originally produced by Quasar Corporation (later renamed Cognos Incorporated) for the Hewlett-Packard HP3000 mini-computer, as well as Data General and DEC VAX/VMS systems. It was initially composed of five components. History PowerHouse was introduced in 1982, bundling into a single product Quiz and Quick/QDesign, both of which had previously been available separately, together with a new batch processor, QTP. In 1983, Quasar changed its name to Cognos Corporation and began porting their application development tools to other platforms, notably Digital Equipment Corporation's VMS, Data General's AOS/VS II, and IBM's OS/400, along with the UNIX platforms from these vendors. Cognos also began extending their product line with add-ons to PowerHouse (for example, Architect) and end-user applications written in PowerHouse (for example, MultiView).[citation needed] Subsequent development of the product added support for platform-specific relational databases, such as HP's Allbase/SQL, DEC's Rdb, and Microsoft's SQL Server, as well as cross-platform relational databases such as Oracle, Sybase, and IBM's DB2. The PowerHouse language represented a considerable achievement.[according to whom?] Compared with languages like COBOL, Pascal and PL/1, PowerHouse substantially cut the amount of labour required to produce useful applications on its chosen platforms. It achieved this through the use of a central data dictionary, a compiled file that extended the attributes of data fields natively available in the DBMS with frequently used programming idioms. In order to support the data dictionary, PowerHouse was tightly coupled to the underlying database management system and/or file system on each of the target platforms. In the case of the HP3000 this was the IMAGE shallow-network DBMS and the KSAM indexed file system, and the entire PowerHouse language reflected its origins. Once attributes were described in the data dictionary, there was no further need to describe them in any of the applications unless there was a need to change them on the fly, for example to change the size of an item so that it fit within the constraints of a defined item. Simple QUICK screens could be generated in as few as four lines of source code (a reconstruction of such a program appears after this passage). <screenname> was the name of the screen that the programmer assigned to the program, and <filename> was the file name to be accessed in the data dictionary. Whether the items in the file would all fit on the screen depended on how many there were and on their sizes. If they did not all fit, the program had to be modified to eliminate unneeded items, resize items, and so on. But for a file with only a couple of items in it, it was quick and easy to generate a screen for data entry, deletion, or simply looking up data by an index. Simple QUIZ reports were almost as easy; a one-file report was nearly as short (see the sketch below). All items in the file would be sent to the screen, perhaps not in the most desirable layout, but with no more effort than that. Since QTP programs usually involved adding, deleting, or modifying data, there was little call for such trivially simple programs; more care was exercised because a whole file (or several files) of data could be wiped out rather easily. Any QUICK, QUIZ, or QTP program could be run compiled (converted to machine language) or uncompiled (from source code).
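The short source listings that the passage above refers to did not survive extraction, so the following is a plausible reconstruction rather than a quotation from the original article; it assumes standard PowerHouse syntax, and <screenname> and <filename> are the placeholders described in the text. A four-line QDesign definition of a simple QUICK screen might read:

    SCREEN <screenname>
    FILE <filename>
    GENERATE
    GO

and a one-file QUIZ report that sends every item in the file to the screen might read:

    ACCESS <filename>
    REPORT ALL
    GO

GENERATE is assumed here to lay the screen out automatically from the data dictionary, and GO to run the program, matching the behaviour the text describes.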
Compiled programs generally ran faster, but there had to be a good method of tracking modifications to the source code so as not to corrupt the object code. QUICK screens were used primarily for data entry, and could call other QUICK screens, QUIZ reports, or QTP applications to update data. Also, there were a few things that had to be done to the source code in order to generate compiled code. For example, the GO command to run the source code was equivalent to the BUILD command to generate the compiled code. Commands could be abbreviated to the first three characters when writing source code: ACCESS could be abbreviated to ACC, and likewise DEFine, REPort, SCReen, and so on. QUIZ reports could be routed to a printer, screen, or ASCII text files with the SET REPORT DEVICE <printer> <screen> <disk> command (a sketch of an abbreviated report routed to disk follows at the end of this passage). Given the right access and commands, a novice could write simple report programs. Just as dangerously, though, the same novice could easily destroy data, as there was no security controlling which of the interpreters a user could call up: if you had access to QUIZ, you also had access to QTP and QUICK. Like all virtual machine languages, PowerHouse is CPU-intensive.[citation needed] This sometimes produced a visibly negative impact on overall transaction performance, necessitating hardware upgrades. Cognos' practice of tying license fees to hardware performance metrics resulted in high licensing costs for PowerHouse users.[citation needed] Migration to the PC Cognos initially attempted to move to the Intel platform in 1988 with the MS-DOS-based PowerHouse PC. While the product was used by a number of partners to build bespoke applications for small to medium-sized customers, it met with only limited success at that time. However, Cognos eventually produced Axiant (c.1995), which ported PowerHouse-like syntax to an Intel-based Microsoft Windows visual development environment and linked it to SQL-aware DBMSs running on these machines. The radical changes wrought by the PC revolution, which began just at the time PowerHouse was introduced, eventually brought down the cost of host computers to such an extent that high-priced software development tools such as PowerHouse became unattractive to customers.[citation needed] PowerHouse in the 21st Century Around 1999, PowerHouse Web was released to support the development of web-aware applications.[citation needed] Products like Business Intelligence and Financial Performance Management that run on commodity architectures and high-end UNIX servers now form the core of the Cognos product line. Cognos was acquired by IBM on January 30, 2008. The PowerHouse Application Development Tools, including PowerHouse server, Axiant 4GL and PowerHouse Web, were acquired from IBM by UNICOM Systems, a division of UNICOM Global, on December 31, 2013, and UNICOM continues to support the worldwide customer base for the products.
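To illustrate the three-character abbreviations and output routing described above, a hypothetical variant of the same one-file QUIZ report, abbreviated and routed to an ASCII file on disk, might read:

    ACC <filename>
    REP ALL
    SET REPORT DEVICE DISK
    GO

Here ACC and REP stand for ACCESS and REPORT, <filename> remains a data-dictionary placeholder, and the exact spelling and placement of the SET REPORT DEVICE statement are assumptions based on the description in the text rather than on PowerHouse documentation. References External links |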
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-FOOTNOTELeigh2018189-156] | [TOKENS: 10728] |
Contents PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions following thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo was derived from both his admiration of the Famicom and conviction in video game consoles becoming the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". 
The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them with the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research, but decided to develop the work it had begun with Nintendo and Sega into a console based on the SNES.
Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992 attended by Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, there remained opposition from a majority present at the meeting. Older Sony executives, who saw Nintendo and Sega as "toy" manufacturers, also opposed it. The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development as the process of manufacturing games on CD-ROM format was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise Red Book audio from the CD-ROM format in its games alongside high quality visuals and gameplay.
Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European division and North American division, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995. The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. The console was not marketed with Sony's name in contrast to Nintendo's consoles. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. Recent consoles like the Atari Jaguar and 3DO suffered low sales due to a lack of developer support, prompting Sony to redouble their efforts in gaining the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for PlayStation since Namco rivalled Sega in the arcade market. Attaining these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995), Ridge Racer being one of the most popular arcade games at the time, and it was already confirmed behind closed doors that it would be the PlayStation's first game by December 1993, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shegeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of their own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. 
In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon their plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and a debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2 and was bought out by Sony in 2005. Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Sony did not favour their own over non-Sony products, unlike Nintendo; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs was beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should developers decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted due to the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction. 
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system to be balancing the conflicting goals of high performance, low cost, and being easy to program for, and felt he and his team were successful in this regard. Its technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began as a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important PlayStation had become for Sony when friends and relatives begged for consoles for their children. PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units were sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. One retailer recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At their keynote presentation, Sega of America CEO Tom Kalinske revealed that their Saturn console would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the president and CEO of Sony Interactive Entertainment, summoned Steve Race, the president of SCEA, to the conference stage; Race simply said "$299" and left the stage to a round of applause. Attention to the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games.
The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million. Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of sold games and consoles was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched in a test market during 1999–2000 across Sony showrooms, selling 100 units. Sony finally launched the console (the PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, for example, the registration of the trademark by a third company meant that the console could not be released officially; the market was at first dominated by the officially distributed Sega Saturn, but as the Sega console was withdrawn, PlayStation imports and widespread piracy increased. In China, the most popular 32-bit console had been the Sega Saturn, but after it left the market the PlayStation grew to a base of some 300,000 users by January 2000, even though Sony China had no plans to release it. The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans stylised as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (red E). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit.
Let me show you how ready I am.'" As the console's appeal grew, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate PlayStation's emerging identity. Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where demonstrations of selected games could be played. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast as a last-ditch attempt to stay in the industry. Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers.
The PlayStation continued to sell strongly at the turn of the new millennium: in July 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell all other consoles in that year, including the PlayStation 2. In 2005, PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001, and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006—over eleven years after its release, and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the necessary speed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels and offers a sampling rate of up to 44.1 kHz and music sequencing. It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, MDEC, which is integrated into the CPU and allows for the presentation of full motion video at a higher quality than other consoles of its generation. Unusual for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. While running, the GPU can also generate a total of 4,000 sprites and draw 180,000 texture-mapped and light-sourced polygons per second, in addition to 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors from the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model that had an S-Video port, as it was removed from the next model. Subsequent models saw a reduction in the number of parallel ports, with the final version only retaining one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries. The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was only available to buy through an ordering service, and came with the documentation and software necessary to program PlayStation games and applications using C compilers.
On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles—including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo Pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first controller, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and a pink square. Rather than labelling its buttons with the traditionally used letters or numbers, the PlayStation controller established a visual trademark which would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view and the square is equated to a sheet of paper to be used to access menus. The European and North American models of the original PlayStation controller are roughly 10% larger than their Japanese counterpart, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue controller, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used for instances when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue gamepad, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking in the analogue sticks), the Dual Analog controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, its analogue sticks feature textured rubber grips, longer handles, slightly different shoulder buttons and has rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but the release was cancelled, despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without either inserting a game or closing the CD tray, thereby accessing a graphical user interface (GUI) for the PlayStation BIOS. The GUI for the PS One and PlayStation differ depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator which was released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that was not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSs on a Sega console. Bleem! 
were subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could have left games vulnerable to piracy, due to the growing popularity of CD-R and optical disc drives with burning capability. To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberate irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, the disc drive could not detect the wobble frequency (so duplicated discs omitted it), since the laser pick-up system of any optical disc drive would interpret this wobble as an oscillation of the disc surface and compensate for it in the reading process. Early PlayStations, particularly early 1000 models, experience skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit up slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction. The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled will become so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises.
Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! heralded as an ancestor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, this being the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel, rivalling the hardware of Sega and Nintendo.
Famicom Tsūshin scored the console a 19 out of 40, lower than the Saturn's 24 out of 40, in May 1995. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985 and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony became a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-highest number of games ever produced for a console. Its success resulted in a significant financial boon for Sony, as profits from their video game division came to account for roughly 23% of the company's total. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as one of the key factors in gaining mass success, and lauding it as a "game-changer in every sense possible".
In 2009, IGN ranked the PlayStation the seventh best console in their list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising to bring a fully routed version with multilayer routing as well as documentation and design files in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the proprietary cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk quantities at a much faster rate than ROM cartridges, a week compared to two to three months. Further, the per-unit cost of production was far lower, allowing Sony to offer games to the user at about 40% lower cost than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty on the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second-parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games; the games run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival offerings, the Nintendo Entertainment System Classic Edition and the Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly. See also Notes References
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Giza_pyramid_complex] | [TOKENS: 4526] |
Contents Giza pyramid complex The Giza pyramid complex (also called the Giza necropolis) in Egypt is home to the Great Pyramid, the pyramid of Khafre, and the pyramid of Menkaure, along with their associated pyramid complexes and the Great Sphinx. All were built during the Fourth Dynasty of the Old Kingdom of ancient Egypt, between c. 2600 – c. 2500 BC. The site also includes several temples, cemeteries, and the remains of a workers' village. The site is at the edge of the Western Desert, approximately 9 km (5.6 mi) west of the Nile River in the city of Giza, and about 13 km (8.1 mi) southwest of the city centre of Cairo. It forms the northernmost part of the 16,000 ha (160 km2; 62 sq mi) Pyramid Fields of the Memphis and its Necropolis UNESCO World Heritage Site, inscribed in 1979. The pyramid fields include the Abusir, Saqqara, and Dahshur pyramid complexes, which were all built in the vicinity of Egypt's ancient capital of Memphis. Further Old Kingdom pyramid fields were located at the sites Abu Rawash, Zawyet El Aryan, and Meidum. Most of the limestone used to build the pyramids originates from the underlying Mokattam Formation. The Great Pyramid and the Pyramid of Khafre are the largest pyramids built in ancient Egypt, and they have historically been common as emblems of Ancient Egypt in the Western imagination. They were popularised in Hellenistic times, when the Great Pyramid was listed by Antipater of Sidon as one of the Seven Wonders of the World. It is by far the oldest of the Ancient Wonders and the only one still in existence. Maadi settlements The earliest settlement of the Giza plateau predates the pyramid complexes. Four jars from the Maadi culture were found at the foot of the Great Pyramid, likely from a disturbed earlier settlement. Further Maadi settlement near the site was uncovered during work on the Greater Cairo Wastewater Project. Recent reassessment of the radiocarbon dating puts the Maadi culture's eponymous settlement to c. 3800 – c. 3400 BC, which is also the likely maximum possible range for the Giza remains. Pyramids and Sphinx The Giza pyramid complex consists of the Great Pyramid (also known as the Pyramid of Cheops or Khufu and constructed c. 2580 – c. 2560 BC), the slightly smaller Pyramid of Khafre (or Chephren) a few hundred metres to the south-west, and the relatively modest-sized Pyramid of Menkaure (or Mykerinos) a few hundred metres farther south-west. The Great Sphinx lies on the east side of the complex. Consensus among Egyptologists is that the head of the Great Sphinx is that of Khafre. Along with these major monuments are a number of smaller satellite edifices, known as "queens" pyramids, causeways, and temples. Besides the archaeological structures, the ancient landscape has also been investigated. Khufu's pyramid complex consists of a valley temple, now buried beneath the village of Nazlet el-Samman; diabase paving and nummulitic limestone walls have been found but the site has not been excavated. The valley temple was connected to a causeway that was largely destroyed when the village was constructed. The causeway led to the Mortuary Temple of Khufu, which was connected to the pyramid. Of this temple, the basalt pavement is the only thing that remains. The king's pyramid has three smaller queens' pyramids (G1-a, G1-b, and G1-c) associated with it and three boat pits.: 11–19 The boat pits contained a ship, and the two pits on the south side of the pyramid contained intact ships when excavated. 
One of these ships, the Khufu ship, has been restored and was originally displayed at the Giza Solar boat museum, then subsequently moved to the Grand Egyptian Museum. Khufu's pyramid still has a limited number of casing stones at its base. These casing stones were made of fine white limestone quarried at Tura. Khafre's pyramid complex consists of a valley temple, the Sphinx temple, a causeway, a mortuary temple, and the king's pyramid. The valley temple yielded several statues of Khafre. Several were found in a well in the floor of the temple by Mariette in 1860. Others were found during successive excavations by Sieglin (1909–1910), Junker, Reisner, and Hassan. Khafre's complex contained five boat-pits and a subsidiary pyramid with a serdab.: 19–26 Khafre's pyramid appears larger than the adjacent Khufu Pyramid by virtue of its more elevated location, and the steeper angle of inclination of its construction—it is, in fact, smaller in both height and volume. Khafre's pyramid retains a prominent display of casing stones at its apex. Menkaure's pyramid complex consists of a valley temple, a causeway, a mortuary temple, and the king's pyramid. The valley temple once contained several statues of Menkaure. During the 5th Dynasty, a smaller ante-temple was added on to the valley temple. The mortuary temple also yielded several statues of Menkaure. The king's pyramid, completed c. 2510 BC, has three subsidiary or queen's pyramids.: 26–35 Of the four major monuments, only Menkaure's pyramid is seen today without any of its original polished limestone casing. The Sphinx dates from the reign of king Khafre. During the New Kingdom, Amenhotep II dedicated a new temple to Hauron-Haremakhet and this structure was added onto by later rulers.: 39–40 Khentkaus I was buried in Giza. Her tomb is known as LG 100 and G 8400 and is located in the Central Field, near the valley temple of Menkaure. The pyramid complex of Queen Khentkaus includes her pyramid, a boat pit, a valley temple, and a pyramid town.: 288–289 Construction Most construction theories are based on the idea that the pyramids were built by moving huge stones from a quarry and dragging and lifting them into place. Disagreements arise over the feasibility of the different proposed methods by which the stones were conveyed and placed. In building the pyramids, the architects might have developed their techniques over time. They would select a site on a relatively flat area of bedrock—not sand—which provided a stable foundation. After carefully surveying the site and laying down the first level of stones, they constructed the pyramids in horizontal levels, one on top of the other. For the Great Pyramid, most of the stone for the interior seems to have been quarried immediately to the south of the construction site. The smooth exterior of the pyramid was made of a fine grade of white limestone that was quarried across the Nile. These exterior blocks had to be carefully cut, transported by river barge to Giza, and dragged up ramps to the construction site. Only a few exterior blocks remain in place at the bottom of the Great Pyramid. During the Middle Ages (5th century to 15th century), people may have taken the rest away for building projects in the city of Cairo. To ensure that the pyramid remained symmetrical, the exterior casing stones all had to be equal in height and width. Workers might have marked all the blocks to indicate the angle of the pyramid wall and trimmed the surfaces carefully so that the blocks fit together. 
During construction, the outer surface of the stone was smooth limestone; excess stone has eroded over time. New insights into the closing stages of the Great Pyramid building were provided by the 2013 find of Wadi el-Jarf papyri, especially the diary of inspector Merer, whose team was assigned to deliver the white limestone from Tura quarries to Giza. The journal was already published, as well as a popular account of the importance of this discovery. The pyramids of Giza and others are thought to have been constructed to house the remains of the deceased pharaohs who ruled Ancient Egypt. A portion of the pharaoh's spirit called his ka was believed to remain with his corpse. Proper care of the remains was necessary in order for the "former Pharaoh to perform his new duties as king of the dead". It is theorized the pyramid not only served as a tomb for the pharaoh, but also as a storage pit for various items he would need in the afterlife. The people of Ancient Egypt believed that death on Earth was the start of a journey to the next world. The embalmed body of the king was entombed underneath or within the pyramid to protect it and allow his transformation and ascension to the afterlife. The sides of all three of the Giza pyramids were astronomically oriented to the cardinal directions within a small fraction of a degree. According to the disputed Orion correlation theory, the arrangement of the pyramids is a representation of the constellation Orion. Workers' village The work of quarrying, moving, setting, and sculpting the huge amount of stone used to build the pyramids might have been accomplished by several thousand skilled workers, unskilled laborers and supporting workers. Bakers, carpenters, water carriers, and others were also needed for the project. Along with the methods used to construct the pyramids, there is also wide speculation regarding the exact number of workers needed for a building project of this magnitude. When Greek historian Herodotus visited Giza in 450 BC, he was told by Egyptian priests that "the Great Pyramid had taken 400,000 men 20 years to build, working in three-month shifts 100,000 men at a time." Evidence from the tombs indicates that a workforce of 10,000 laborers working in three-month shifts took around 30 years to build a pyramid. The Giza pyramid complex is surrounded by a large stone wall, outside which Mark Lehner and his team discovered a town where the pyramid workers were housed. The village is located to the southeast of the Khafre and Menkaure complexes. Among the discoveries at the workers' village are communal sleeping quarters, bakeries, breweries, and kitchens (with evidence showing that bread, beef, and fish were dietary staples), a copper workshop, a hospital, and a cemetery (where some of the skeletons were found with signs of trauma associated with accidents on a building site). The metal processed at the site was the so-called arsenical copper. The same material was also identified among the copper artefacts from the "Kromer" site, from the reigns of Khufu and Khafre. The workers' town appears to date from the middle 4th Dynasty (2520–2472 BC), after the accepted time of Khufu and completion of the Great Pyramid. According to Lehner and the AERA team: Using pottery shards, seal impressions, and stratigraphy to date the site, the team further concludes: Radiocarbon data for the Old Kingdom Giza plateau and the workers' settlement were published in 2006, and then re-evaluated in 2011. 
Cemeteries As the pyramids were constructed, the mastabas for lesser royals were constructed around them. Near the pyramid of Khufu, the main cemetery is G 7000, which lies in the East Field located to the east of the main pyramid and next to the Queen's pyramids. These cemeteries around the pyramids were arranged along streets and avenues. Cemetery G 7000 was one of the earliest and contained tombs of wives, sons, and daughters of these 4th Dynasty rulers. On the other side of the pyramid in the West Field, the royals' sons Wepemnofret and Hemiunu were buried in Cemetery G 1200 and Cemetery G 4000, respectively. These cemeteries were further expanded during the 5th and 6th Dynasties. The West Field is located to the west of Khufu's pyramid. It is divided into smaller areas such as the cemeteries referred to as the Abu Bakr Excavations (1949–1950, 1950–1951, 1952, and 1953), and several cemeteries named based on the mastaba numbers such as Cemetery G 1000, Cemetery G 1100, etc. The West Field contains Cemetery G1000 – Cemetery G1600, and Cemetery G 1900. Further cemeteries in this field are: Cemeteries G 2000, G 2200, G 2500, G 3000, G 4000, and G 6000. Three other cemeteries are named after their excavators: Junker Cemetery West, Junker Cemetery East, and Steindorff Cemetery.: 100–122 The East Field is located to the east of Khufu's pyramid and contains cemetery G 7000. This cemetery was a burial place for some of the family members of Khufu. The cemetery also includes mastabas from tenants and priests of the pyramids dated to the 5th Dynasty and 6th Dynasty.: 179–216 This cemetery dates from the time of Menkaure (Junker) or earlier (Reisner), and contains several stone-built mastabas dating from as late as the 6th Dynasty. Tombs from the time of Menkaure include the mastabas of the royal chamberlain Khaemnefert, the King's son Khufudjedef (master of the royal largesse), and an official named Niankhre.: 216–228 The Central Field contains several burials of royal family members. The tombs range in date from the end of the 4th Dynasty to the 5th Dynasty or even later.: 230–293 Tombs dating from the Saite and later period were found near the causeway of Khafre and the Great Sphinx. These tombs include the tomb of a commander of the army named Ahmose and his mother Queen Nakhtubasterau, who was the wife of Pharaoh Amasis II.: 289–290 The South Field includes mastabas dating from the 1st Dynasty to 3rd Dynasty as well as later burials. Of the more significant of these early dynastic tombs are one referred to as "Covington's tomb", otherwise known as Mastaba T, and the large Mastaba V which contained artifacts naming the 1st Dynasty pharaoh Djet. Other tombs date from the late Old Kingdom (5th and 6th Dynasty). The south section of the field contains several tombs dating from the Saite period and later.: 294–297 In 1990, tombs belonging to the pyramid workers were discovered alongside the pyramids, with an additional burial site found nearby in 2009. Although not mummified, they had been buried in mudbrick tombs with beer and bread to support them in the afterlife. The tombs' proximity to the pyramids and the manner of burial supports the theory that they were paid laborers who took pride in their work and were not slaves, as was previously thought. Evidence from the tombs indicates that a workforce of 10,000 laborers working in three-month shifts took around 30 years to build a pyramid. Most of the workers appear to have come from poor families. 
Specialists such as architects, masons, metalworkers, and carpenters were permanently employed by the king to fill positions that required the most skill. Shafts There are multiple burial-shafts and various unfinished shafts and tunnels located in the Giza complex that were discovered and mentioned prominently by Selim Hassan in his report Excavations at Giza 1933–1934. He states: "Very few of the Saitic [referring to the Saite Period] shafts have been thoroughly examined, for the reason that most of them are flooded.": 193 The Osiris Shaft is a narrow burial-shaft leading to three levels for a tomb and below it a flooded area. It was mentioned by Hassan, and a thorough excavation was conducted by a team led by Hawass in 1999. It was opened to tourists in November 2017. New Kingdom and Late Period During the New Kingdom Giza was still an active site. A brick-built chapel was constructed near the Sphinx during the early 18th Dynasty, probably by King Thutmose I. Amenhotep II built a temple dedicated to Hauron-Haremakhet near the Sphinx. As a prince, the future pharaoh Thutmose IV visited the pyramids and the Sphinx; he reported being told in a dream that if he cleared the sand that had built up around the Sphinx, he would be rewarded with kingship. This event is recorded in the Dream Stele, which he had installed between the Sphinx's front legs. During the early years of his reign, Thutmose IV, together with his wife Queen Nefertari, had stelae erected at Giza. Pharaoh Tutankhamun had a structure built, which is now referred to as the king's resthouse. During the 19th Dynasty, Seti I added to the temple of Hauron-Haremakhet, and his son Ramesses II erected a stela in the chapel before the Sphinx and usurped the resthouse of Tutankhamun.: 39–47 During the 21st Dynasty, the Temple of Isis Mistress-of-the-Pyramids was reconstructed. During the 26th Dynasty, a stela made in this time mentions Khufu and his Queen Henutsen.: 18 Division of the 1903–1905 excavation of the Giza Necropolis In 1903, rights to excavate the West Field and Pyramids of the Giza Necropolis were divided by three institutions from Italy, Germany, and the United States of America. Prior to the division of the Giza Plateau into three institutional concessions in 1903, amateur and private excavations at the Giza Necropolis had been permitted to operate. The work of these amateur archaeologists failed to meet high scientific standards. Montague Ballard, for instance, excavated in the Western Cemetery (with the hesitant permission of the Egyptian Antiquities Service) and neither kept records of his finds nor published them. In 1902, the Egyptian Antiquities Service under Gaston Maspero resolved to issue permits exclusively to authorized individuals representing public institutions. In November of that year, the Service awarded three scholars with concessions on the Giza Necropolis. They were the Italian Ernesto Schiaparelli from the Turin Museum, the German Georg Steindorff from the University of Leipzig who had funding from Wilhelm Pelizaeus, and the American George Reisner from the Hearst Expedition. Within a matter of months, the site had been divided between the concessionaires following a meeting at the Mena House Hotel involving Schiaparelli, Ludwig Borchardt (Steindorff's representative in Egypt), and Reisner. 
By the turn of the 20th century, the three largest pyramids on the Giza plateau were considered mostly exhausted by previous excavations, so the Western Cemetery and its collection of private mastaba tombs were thought to represent the richest unexcavated part of the plateau. George Reisner's wife, Mary, drew names from a hat to assign three long east-west plots of the necropolis among the Italian, German, and American missions. Schiaparelli was assigned the southernmost strip, Borchardt the center, and Reisner the northernmost. Rights to excavate the Pyramids were then also negotiated between Schiaparelli, Borchardt, and Reisner. Schiaparelli gained rights to excavate the Great Pyramid of Khufu along with its three associated queens' pyramids and most of its Eastern Cemetery. Borchardt received Khafre's pyramid, its causeway, the Sphinx, and the Sphinx's associated temples. Reisner claimed Menkaure's pyramid as well as its associated queens' pyramids and pyramid temple, along with a portion of Schiaparelli's Eastern Cemetery. Any future disputes were to be resolved by Inspector James Quibell, as per a letter from Borchardt to Maspero. This arrangement lasted until 1905, when, under the supervision of Schiaparelli and Francesco Ballerini, the Italian excavations ceased at Giza. As the Italians were more interested in sites which might yield more papyri, they turned their concession of the southern strip of the Western Cemetery over to the Americans under Reisner. Modern usage In 1978, the Grateful Dead played a series of concerts later released as Rocking the Cradle: Egypt 1978. In 2007, Colombian singer Shakira performed at the complex to a crowd of approximately 100,000 people. The complex was used for the final draw of the 2019 Africa Cup of Nations and the 2021 World Men's Handball Championship. Egypt's Minister of Tourism unveiled plans for a €17,000,000 revamp of the complex by the end of 2021, in order to boost tourism in Egypt as well as make the site more accessible and tourist-friendly. According to Lonely Planet, the refurbishment includes a new visitors' centre, an environmentally-friendly electric bus, a restaurant (the 9 Pyramids Lounge), as well as a cinema, public toilets, site-wide signage, food trucks, photo booths, and free Wi-Fi. The new facility is part of a wider plan to renovate the 4,500 year old site. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/New_Game_Plus] | [TOKENS: 609] |
Contents New Game Plus New Game Plus, also known as New Game+, NG+, or hero mode, is an unlockable game mode available in some video game titles that allows the player to restart the game's story with all or some of the items or experience gained in the first playthrough. New Game Plus modes are typically unlocked after completing the game's story at least once and sometimes contain certain features not normally available in the initial playthrough, such as increased difficulty, altered combat or encounters, and more. Origin The term was coined in the 1995 role-playing video game Chrono Trigger, but examples can be found in earlier games, such as Digital Devil Story: Megami Tensei, The Legend of Zelda and Ghosts 'n Goblins.[citation needed] This play mode is most often found in role-playing video games, where starting a New Game Plus will usually have the player characters start the new game with the statistics and equipment with which they ended the last game. Key items that are related to the story are normally removed so they cannot ruin the game's progression, and are given back to the player at the time they are needed; likewise, characters that the player acquires throughout the story will also not appear until their scheduled place and time, but will get the enhanced stats from the previous playthrough. Examples Games with multiple endings, such as Chrono Trigger, may feature a New Game Plus mode which allows the player to explore alternate endings. Many games increase the difficulty in a New Game Plus mode, such as those in the Mega Man Battle Network series and Borderlands series. Others use the feature to advance the plot. In Astro Boy: Omega Factor, the player uses the game's Stage Select mechanism, explained in-story as a form of time travel, to avert disaster, while in Eternal Darkness, the player defeats three different final bosses, one in each playthrough, to access the true ending. Some New Game Plus variations alter established gameplay. This includes unlocking new characters, such as in Castlevania: Symphony of the Night; new areas, such as in Parasite Eve; new items, such as in the Metal Gear series; new challenges, such as in the .hack series; or new weapon and armor upgrades, as in God of War II's Bonus Mode and God of War's New Game+ mode. Games that connect to online marketplaces may require the player to complete a New Game Plus game to obtain certain achievements, such as the "Calamity Kid" achievement in the game Bastion. Others may require, or be required by, additional purchases in the form of downloadable content, such as in Yakuza: Like a Dragon. Final Fantasy XIV is an MMORPG with an extensive story line called the "main scenario quest", or colloquially the "MSQ". A New Game Plus option was introduced during the Shadowbringers expansion patch cycle, which allows players to replay the main scenario and selected other scenarios, either in their entirety or as individual expansions' stories. See also References
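The carry-over rules described above (keep statistics and equipment, strip story-critical key items until they are needed again, and re-gate recruitable characters until their scheduled appearance) amount to a simple filter over the end-of-game save. A minimal sketch in Python, using hypothetical field and item names rather than any specific game's save format:

from dataclasses import dataclass

@dataclass
class SaveData:
    level: int
    equipment: list
    inventory: list
    party: list    # characters recruited over the course of the story

# Hypothetical story-critical items that must not carry over early.
STORY_KEY_ITEMS = {"ancient_seal", "throne_room_key"}

def start_new_game_plus(cleared_save):
    # Keep stats and gear, drop story key items (handed back when the plot needs them),
    # and empty the party so characters rejoin at their scripted points with boosted stats.
    return SaveData(
        level=cleared_save.level,
        equipment=list(cleared_save.equipment),
        inventory=[item for item in cleared_save.inventory if item not in STORY_KEY_ITEMS],
        party=[],
    )

ng_plus = start_new_game_plus(SaveData(level=70, equipment=["flame_sword"],
                                       inventory=["potion", "ancient_seal"], party=["robo"]))
print(ng_plus)   # level and gear kept, key item removed, party reset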
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Gromia_sphaerica] | [TOKENS: 489] |
Contents Gromia sphaerica Gromia sphaerica is a large spherical testate amoeba, a single-celled eukaryotic organism and the largest of its genus, Gromia. The genus itself contains about 13 known species, 3 of which were discovered as late as 2005. It was discovered in 2000, along the Oman margin of the Arabian Sea, at depths around 1,163 to 1,194 meters (3,816 to 3,917 ft). Specimens range in size from 4.7 to 38 millimeters (0.19 to 1.50 inches) in diameter. The test (organic shell) is usually spherical in shape and honeycombed with pores. There are filaments on the bottom of the organism, where it is in contact with the seafloor, and it is mostly filled with stercomata (waste pellets). In 2008, 30-millimeter (1.2-inch) specimens were found off the coast of Little San Salvador in the Bahamas by researchers from the University of Texas. These Gromia were discovered to make mud trails as much as 50 centimeters (20 inches) in length. It was previously thought that single-celled organisms were incapable of making such trails, and their cause had long been a source of speculation. The mud trails made by the Bahamian Gromia appear to match prehistoric mud trails from the Precambrian, including 1.8-billion-year-old fossil trails in the Stirling formation in Australia. Because the tracks of Gromia resemble those 1.8-billion-year-old traces, which had been attributed to complex bilaterian worms, the ancient traces could instead have been made by similarly giant single-celled organisms rather than by complex animals. Description Gromia sphaerica mainly resembles a grape in size and body appearance. When the sediment was removed from one of the specimens, its skin proved to be similar to that of a grape, but much softer to the touch. Tracks The tracks that G. sphaerica makes on the muddy sea floor are similar to the tracks of animals from the Ediacaran period. In some of the photographs, the tracks can be seen to curve. References
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Planetary_system] | [TOKENS: 4155] |
Contents Planetary system A planetary system consists of a set of non-stellar bodies which are gravitationally bound to and in orbit around a star or star system. Generally speaking, such systems will include planets, and may include other objects such as dwarf planets, asteroids, natural satellites, meteoroids, comets, planetesimals, and circumstellar disks. The Solar System is an example of a planetary system, in which Earth, seven other planets, and other celestial objects are bound to and revolve around the Sun. The term exoplanetary system is sometimes used in reference to planetary systems other than the Solar System. By convention planetary systems are named after their host, or parent, star, as is the case with the Solar System being named after "Sol" (Latin for sun). As of 30 October 2025, there are 6,128 confirmed exoplanets in 4,584 planetary systems, with 1,017 systems having more than one planet. Debris disks are known to be common while other objects are more difficult to observe. Of particular interest to astrobiology is the habitable zone of planetary systems where planets could have surface liquid water, and thus, the capacity to support Earth-like life. Definition The International Astronomical Union (IAU) has described a planetary system as the system of planets orbiting one or more stars, brown dwarfs or stellar remnants. The IAU and NASA consider the Solar System a planetary system, including its star the Sun, its planets, and all other bodies orbiting the Sun. Other definitions of planetary system explicitly include all bodies gravitationally bound to one or more stars. History Heliocentrism is a planetary model that places the Sun at the center of the universe, as opposed to geocentrism (placing Earth at the center of the universe). The idea was first proposed in Western philosophy and Greek astronomy as early as the 3rd century BC by Aristarchus of Samos, but received no support from most other ancient astronomers. Some also interpret Aryabhatta's writings in Āryabhaṭīya as implicitly heliocentric, although this has also been rebutted. De revolutionibus orbium coelestium by Nicolaus Copernicus, published in 1543, presented the first mathematically predictive heliocentric model of a planetary system. 17th-century successors Galileo Galilei, Johannes Kepler, and Sir Isaac Newton developed an understanding of physics which led to the gradual acceptance of the idea that the Earth moves around the Sun and that the planets are governed by the same physical laws that governed Earth. In the 16th century the Italian philosopher Giordano Bruno, an early supporter of the Copernican theory that Earth and other planets orbit the Sun, put forward the view that the fixed stars are similar to the Sun and are likewise accompanied by planets. In the 18th century, the same possibility was mentioned by Sir Isaac Newton in the "General Scholium" that concludes his Principia. Making a comparison to the Sun's planets, he wrote "And if the fixed stars are the centres of similar systems, they will all be constructed according to a similar design and subject to the dominion of One." His theories gained popularity through the 19th and 20th centuries despite a lack of supporting evidence. Long before their confirmation by astronomers, conjecture on the nature of planetary systems had been a focus of the search for extraterrestrial intelligence and has been a prevalent theme in fiction, particularly science fiction.
The first confirmed detection of an exoplanet was in 1992, with the discovery of several terrestrial-mass planets orbiting the pulsar PSR B1257+12. The first confirmed detection of exoplanets of a main-sequence star was made in 1995, when a giant planet, 51 Pegasi b, was found in a four-day orbit around the nearby G-type star 51 Pegasi. The frequency of detections has increased since then, particularly through advancements in methods of detecting extrasolar planets and dedicated planet-finding programs such as the Kepler mission. Origin and evolution Planetary systems come from protoplanetary disks that form around stars as part of the process of star formation. During formation of a system, much material is gravitationally scattered into distant orbits, and some planets are ejected completely from the system, becoming rogue planets. Planets orbiting pulsars have been discovered. Pulsars are the remnants of the supernova explosions of high-mass stars, but a planetary system that existed before the supernova would likely be mostly destroyed. Planets would either evaporate, be pushed off their orbits by the masses of gas from the exploding star, or escape the gravitational hold of the star after the sudden loss of most of the central star's mass; in some cases the supernova would kick the pulsar itself out of the system at high velocity, so any planets that had survived the explosion would be left behind as free-floating objects. Planets found around pulsars may have formed as a result of pre-existing stellar companions that were almost entirely evaporated by the supernova blast, leaving behind planet-sized bodies. Alternatively, planets may form in an accretion disk of fallback matter surrounding a pulsar. Fallback disks of matter that failed to escape orbit during a supernova may also form planets around black holes. Many low-mass stars are expected to have rocky planets, with their planetary systems primarily consisting of rock- and ice-based bodies. This is because low-mass stars have less material in their planetary disks, making it unlikely that the planetesimals within will reach the critical mass necessary to form gas giants. The planetary systems of low-mass stars also tend to be compact, as such stars tend to have lower temperatures, resulting in the formation of protoplanets closer to the star. As stars evolve and turn into red giants, asymptotic giant branch stars, and eventually planetary nebulae, they engulf the inner planets, evaporating or partially evaporating them depending on how massive they are. As the star loses mass, planets that are not engulfed move further out from the star. If an evolved star is in a binary or multiple system, then the mass it loses can transfer to another star, forming new protoplanetary disks and second- and third-generation planets which may differ in composition from the original planets, which may also be affected by the mass transfer. Free-floating planets in open clusters have similar velocities to the stars and so can be recaptured. They are typically captured into wide orbits between 100 and 10^5 AU. The capture efficiency decreases with increasing cluster size, and for a given cluster size it increases with the host/primary[clarification needed] mass. It is almost independent of the planetary mass. Single and multiple planets could be captured into arbitrary unaligned orbits, non-coplanar with each other, with the stellar host's spin, or with a pre-existing planetary system.
Some planet–host metallicity correlation may still exist due to the common origin of the stars from the same cluster. Planets would be unlikely to be captured around neutron stars because these are likely to be ejected from the cluster by a pulsar kick when they form. Planets could even be captured around other planets to form free-floating planet binaries. After the cluster has dispersed, some of the captured planets with orbits larger than 10^6 AU would be slowly disrupted by the galactic tide and likely become free-floating again through encounters with other field stars or giant molecular clouds. System architectures The Solar System consists of an inner region of small rocky planets and an outer region of large giant planets. However, other planetary systems can have quite different architectures. At present,[when?] few systems have been found to be analogous to the Solar System with small terrestrial planets in the inner region, as well as a gas giant with a relatively circular orbit, which suggests that this configuration is uncommon. More commonly, systems consisting of multiple Super-Earths have been detected. These Super-Earths are usually very close to their star, with orbits smaller than that of Mercury. Other systems have been found to have a hot Jupiter gas giant very close to the star. Theories such as planetary migration or scattering have been proposed to explain the formation of large planets close to their parent stars. Overall, studies suggest that architectures of planetary systems are dependent on the conditions of their initial formation. Planetary system architectures may be partitioned into four classes based on how the mass of the planets is distributed around the host star: In Similar systems, the masses of all the planets are similar to each other. This architecture class is the most commonly observed in our galaxy. TRAPPIST-1 is an example of a Similar system. Planets in Similar systems are said to be like 'peas in a pod', and the phrase now refers to a set of specific configuration characteristics. A 'peas in a pod' system will have planets that are similar or ordered in size, similar and ordered in mass, and tend to display "packing". Packing refers to the tendency of smaller planets to be closer together, and of larger planets to have larger orbital spacing. Lastly, 'peas in a pod' systems tend to display similar spacing between a pair of adjacent planets and the next pair of adjacent planets. Mixed systems are planetary systems in which the masses of the planets show larger increasing or decreasing variations. Gliese 876 and Kepler-89 are examples of Mixed systems. Anti-Ordered systems have their massive planets close to the host star and the smaller planets further away. There are currently no known examples of this architecture class. Ordered systems have their planets ordered such that the less massive ones are closer to the star and the more massive planets are further from the star, with the mass of each planet increasing with distance from the star. The Solar System, with small rocky planets in the inner part and giant planets in the outer part, is a type of Ordered system. Most known exoplanets orbit stars roughly similar to the Sun: that is, main-sequence stars of spectral categories F, G, or K. One reason is that planet-search programs have tended to concentrate on such stars.
In addition, statistical analyses indicate that lower-mass stars (red dwarfs, of spectral category M) are less likely to have planets massive enough to be detected by the radial-velocity method. Nevertheless, several tens of planets around red dwarfs have been discovered by the Kepler space telescope by the transit method, which can detect smaller planets. Exoplanetary systems may also feature planets extremely different from those in the Solar System, such as Hot Jupiters, Hot Neptunes, and Super-Earths. Hot Jupiters and Hot Neptunes are gas giants, like their namesakes, but orbit close to their stars and have orbital periods on the order of a few days. Super-Earths are planets that have a mass between that of Earth and planets like Neptune and Uranus, and can be made of rock and gas. There is a lot of variety among Super-Earths, with planets ranging from water worlds to mini-Neptunes. After planets, circumstellar disks are one of the most commonly observed properties of planetary systems, particularly of young stars. The Solar System possesses at least four major circumstellar disks (the asteroid belt, Kuiper belt, scattered disc, and Oort cloud), and clearly observable disks have been detected around nearby solar analogs including Epsilon Eridani and Tau Ceti. Based on observations of numerous similar disks, they are assumed to be quite common attributes of stars on the main sequence. Interplanetary dust clouds have been studied in the Solar System, and analogs are believed to be present in other planetary systems. Exozodiacal dust, an exoplanetary analog of the zodiacal dust (the 1–100 micrometre-sized grains of amorphous carbon and silicate dust that fill the plane of the Solar System), has been detected around the 51 Ophiuchi, Fomalhaut, Tau Ceti, and Vega systems. As of November 2014[update] there are 5,253 known Solar System comets, and they are thought to be common components of planetary systems. The first exocomets were detected in 1987 around Beta Pictoris, a very young A-type main-sequence star. There are now a total of 11 stars around which the presence of exocomets has been observed or suspected. All discovered exocometary systems (Beta Pictoris, HR 10, 51 Ophiuchi, HR 2174, 49 Ceti, 5 Vulpeculae, 2 Andromedae, HD 21620, HD 42111, HD 110411, and more recently HD 172555) are around very young A-type stars. Computer modelling of an impact detected in 2013 around the star NGC 2547-ID8 by the Spitzer Space Telescope, and confirmed by ground observations, suggests the involvement of large asteroids or protoplanets similar to the events believed to have led to the formation of terrestrial planets like the Earth. Based on observations of the Solar System's large collection of natural satellites, such moons are believed to be common components of planetary systems; however, the existence of exomoons has not yet been confirmed. The star 1SWASP J140747.93-394542.6, in the constellation Centaurus, is a strong candidate for hosting a natural satellite. Indications suggest that the confirmed extrasolar planet WASP-12b also has at least one satellite. Unlike the Solar System, which has orbits that are nearly circular, many of the known planetary systems display much higher orbital eccentricity. An example of such a system is 16 Cygni. The mutual inclination between two planets is the angle between their orbital planes.
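Given each orbit's inclination and longitude of ascending node, the mutual inclination follows from the spherical law of cosines. A small Python sketch of that standard relation (the input angles below are arbitrary placeholders, not measured values for any real system):

import math

def mutual_inclination(i1_deg, node1_deg, i2_deg, node2_deg):
    # Angle between two orbital planes, from each orbit's inclination i and
    # longitude of ascending node (all angles in degrees).
    i1, o1, i2, o2 = map(math.radians, (i1_deg, node1_deg, i2_deg, node2_deg))
    cos_rel = math.cos(i1) * math.cos(i2) + math.sin(i1) * math.sin(i2) * math.cos(o1 - o2)
    cos_rel = max(-1.0, min(1.0, cos_rel))   # guard against floating-point round-off
    return math.degrees(math.acos(cos_rel))

print(mutual_inclination(30.0, 50.0, 30.0, 50.0))   # 0.0 -> coplanar orbits
print(mutual_inclination(10.0, 40.0, 25.0, 310.0))  # about 27 degrees for these placeholder angles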
Many compact systems with multiple close-in planets interior to the equivalent orbit of Venus are expected to have very low mutual inclinations, so the system (at least the close-in part) would be even flatter than the Solar System. Captured planets could end up at any arbitrary angle to the rest of the system. As of 2016[update] there are only a few systems where mutual inclinations have actually been measured. One example is the Upsilon Andromedae system: the planets c and d have a mutual inclination of about 30 degrees. Planetary systems can be categorized according to their orbital dynamics as resonant, non-resonant-interacting, hierarchical, or some combination of these. In resonant systems the orbital periods of the planets are in integer ratios. The Kepler-223 system contains four planets in an 8:6:4:3 orbital resonance. Giant planets are found in mean-motion resonances more often than smaller planets. In interacting systems, the planets' orbits are close enough together that they perturb the orbital parameters. The Solar System could be described as weakly interacting, as opposed to strongly interacting systems, in which Kepler's laws do not hold. In hierarchical systems the planets are arranged so that the system can be gravitationally considered as a nested system of two bodies; e.g. in a star with a close-in hot Jupiter and another gas giant much further out, the star and hot Jupiter form a pair that appears as a single object to another planet that is far enough out. Other, as yet unobserved, orbital possibilities include: double planets, various co-orbital planets such as quasi-satellites, trojans, and exchange orbits, and interlocking orbits maintained by precessing orbital planes. Zones The habitable zone around a star is the region where the temperature range allows for liquid water to exist on a planet; that is, not so close to the star that the water evaporates and not so far away from the star that the water freezes. The heat produced by stars varies depending on the size and age of the star; this means the habitable zone will also vary accordingly. Also, the atmospheric conditions on the planet influence the planet's ability to retain heat, so that the location of the habitable zone is also specific to each type of planet. Habitable zones have usually been defined in terms of surface temperature; however, over half of Earth's biomass is from subsurface microbes, and temperature increases with depth underground, so the subsurface can be conducive to life when the surface is frozen; if this is considered, the habitable zone extends much further from the star. Studies in 2013 indicate that an estimated 22±8% of Sun-like[a] stars have an Earth-sized[b] planet in the habitable[c] zone. The Venus zone is the region around a star where a terrestrial planet would have runaway greenhouse conditions like Venus, but not so near the star that the atmosphere completely escapes. As with the habitable zone, the location of the Venus zone depends on several factors, including the type of star and properties of the planets such as mass, rotation rate, and atmospheric clouds. Studies of the Kepler spacecraft data indicate that 32% of red dwarfs have potentially Venus-like planets based on planet size and distance from the star, increasing to 45% for K-type and G-type stars.[d] Several candidates have been identified, but spectroscopic follow-up studies of their atmospheres are required to determine whether they are like Venus.
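Because the zone boundaries track the star's energy output, rough habitable-zone limits are often estimated by scaling Solar System reference distances by the square root of the stellar luminosity (a constant-flux assumption). The sketch below uses illustrative inner and outer boundaries of roughly 0.95 and 1.4 AU for the Sun; detailed models that account for stellar spectrum and planetary atmosphere shift these limits.

import math

def habitable_zone_au(luminosity_solar, inner_sun_au=0.95, outer_sun_au=1.4):
    # Scale Solar System reference distances by sqrt(L) so the stellar flux at the
    # boundaries matches the flux at the reference distances around the Sun.
    scale = math.sqrt(luminosity_solar)
    return inner_sun_au * scale, outer_sun_au * scale

print(habitable_zone_au(1.0))    # Sun-like star: roughly (0.95, 1.4) AU
print(habitable_zone_au(0.05))   # red dwarf at 5% solar luminosity: roughly (0.21, 0.31) AU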
Galactic distribution of planets The Milky Way is 100,000 light-years across, but 90% of planets with known distances are within about 2000 light years of Earth, as of July 2014. One method that can detect planets much further away is microlensing. The upcoming Nancy Grace Roman Space Telescope could use microlensing to measure the relative frequency of planets in the galactic bulge versus the galactic disk. So far, the indications are that planets are more common in the disk than the bulge. Estimating the distance of microlensing events is difficult: the first planet considered highly likely to be in the bulge is MOA-2011-BLG-293Lb, at a distance of 7.7 kiloparsecs (about 25,000 light years). Population I, or metal-rich stars, are those young stars whose metallicity is highest. The high metallicity of population I stars makes them more likely to possess planetary systems than older populations, because planets form by the accretion of metals.[citation needed] The Sun is an example of a metal-rich star. These are common in the disks of galaxies. Generally, the youngest stars, the extreme population I, are found farther in, and intermediate population I stars are farther out, etc. The Sun is considered an intermediate population I star. Population I stars have regular elliptical orbits around the Galactic Center, with a low relative velocity. Population II, or metal-poor stars, are those with relatively low metallicity, which can be hundreds (e.g. BD +17° 3248) or thousands (e.g. Sneden's Star) of times lower than that of the Sun. These objects formed during an earlier time of the universe. Intermediate population II stars are common in the bulge near the center of the Milky Way,[citation needed] whereas Population II stars found in the galactic halo are older and thus more metal-poor.[citation needed] Globular clusters also contain high numbers of population II stars. In 2014, the first planets around a halo star were announced, orbiting Kapteyn's star, the nearest halo star to Earth, around 13 light years away. However, later research suggests that Kapteyn b is just an artefact of stellar activity and that Kapteyn c needs more study to be confirmed. The metallicity of Kapteyn's star is estimated to be about 8[e] times less than that of the Sun. Different types of galaxies have different histories of star formation and hence planet formation. Planet formation is affected by the ages, metallicities, and orbits of stellar populations within a galaxy. Distribution of stellar populations within a galaxy varies between the different types of galaxies. Stars in elliptical galaxies are much older than stars in spiral galaxies. Most elliptical galaxies contain mainly low-mass stars, with minimal star-formation activity. The distribution of the different types of galaxies in the universe depends on their location within galaxy clusters, with elliptical galaxies found mostly close to their centers. See also Notes References Further reading
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/List_of_color_palettes] | [TOKENS: 1369] |
Contents List of color palettes The following is a list that contains color palettes for notable computer graphics, terminals and video game consoles. Only a simulated image using a palette and its name are given here. Main articles are linked from the name of each palette and include test charts, sample colours, simulated images, and further technical details (including references). During older eras of computing, manufacturers developed many different display systems, often on a competitive, non-collaborative basis (with a few exceptions, such as the VESA consortium), creating many proprietary, non-standard instances of display hardware. Often, as with early personal and home computers, a given machine employed its own unique display subsystem, with its own unique color palette. Furthermore, software developers made use of the color abilities of distinct display systems in many different ways. The result is that there is no single common standard nomenclature or classification taxonomy which can encompass every computer color palette. In order to organize the material, color palettes have been grouped following certain criteria. First, generic monochrome and full RGB repertories common to various computer display systems are listed. Then come the usual color repertories used for display systems that employ indexed color techniques. And finally, specific manufacturers' color palettes implemented in many representative early personal computers and video game consoles of various brands. The list for personal computer palettes is split into two categories: 8-bit and 16-bit machines. This is not intended as a strict categorization of such machines, because mixed architectures also exist (16-bit processors with an 8-bit data bus or 32-bit processors with a 16-bit data bus, among others). The distinction is based more on broad 8-bit and 16-bit computer ages or generations (around 1975–1985 and 1985–1995, respectively) and their associated state of the art in color display capabilities. The following is the common color test chart and sample image used to render each palette in this list: See further details in the summary paragraph of the corresponding article. List of monochrome and RGB palettes In this article, the term monochrome palette means a set of intensities for a monochrome display, and the term RGB palette is defined as the complete set of combinations a given RGB display can offer by mixing all the possible intensities of the red, green, and blue primaries available in its hardware. These are generic complete repertories of colors to produce black and white and RGB color pictures by the display hardware, not necessarily the total number of such colors that can be simultaneously displayed in a given text or graphic mode of any machine. RGB is the most common method to produce colors for displays; so these complete RGB color repertories have every possible combination of R-G-B triplets within any given maximum number of levels per component. For specific hardware and for methods of producing colors other than RGB, see the List of computer hardware palettes and the List of video game consoles sections. For various software arrangements and sorts of colors, including other possible full RGB arrangements within 8-bit depth displays, see the List of software palettes section. The monochrome palettes contain only shades of gray. Other palettes mix every combination of a permuted pair of the red, green, and blue channels (a 16-bit color palette with 65,536 colors); for example, "additive red green" has zero blue and "subtractive red green" has full blue.
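The two definitions just given (a monochrome palette as a set of intensities, and a two-channel palette as every combination of a pair of primaries) can be made concrete with a short sketch; the function names are illustrative only, not from any particular graphics library:

def monochrome_palette(levels=256):
    # All intensities of a single gray channel, expressed as (R, G, B) triplets.
    return [(v, v, v) for v in range(levels)]

def additive_red_green_palette(levels=256):
    # Every combination of red and green at 8 bits each, with blue held at zero.
    return [(r, g, 0) for r in range(levels) for g in range(levels)]

print(len(monochrome_palette()))           # 256 gray levels
print(len(additive_red_green_palette()))   # 65536 colors (256 * 256)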
These full RGB palettes employ the same number of bits to store the relative intensity for the red, green and blue components of every image's pixel color. Thus, they have the same number of levels per channel, and the total number of possible colors is always the cube of a power of two. It should be understood that, when developed, many of these formats were directly related to the size of some host computer's natural word length in bytes—the amount of memory in bits held by a single memory address such that the CPU can read or write it in one operation. These are also RGB palettes, in the sense defined above (except for 4-bit RGBI, which has an intensity bit that affects all channels at once), but either they do not have the same number of levels for each primary channel, or the numbers are not powers of two, so they are not represented as separate bit fields. All of these have been used in popular personal computers. List of software palettes Systems that use a 4-bit or 8-bit pixel depth can display up to 16 or 256 colors simultaneously. Many personal computers in the later 1980s and early 1990s displayed at most 256 different colors, freely selected by software (either by the user or by a program) from their wider hardware color palette. Usual selections of colors in limited subsets (generally 16 or 256) of the full palette include some RGB level arrangements commonly used with the 8 bpp palettes as master palettes or universal palettes (i.e., palettes for multipurpose uses). These are some representative software palettes, but any selection can be made in such types of systems. These are selections of colors officially employed as system palettes in some popular operating systems for personal computers that feature 8-bit displays. These are selections of colors based on evenly ordered RGB levels, mainly used as master palettes to display any kind of image within the limitations of the 8-bit pixel depth. List of computer hardware palettes In old personal computers and terminals that offered color displays, some color palettes were chosen algorithmically to provide the most diverse set of colors for a given palette size, and others were chosen to assure the availability of certain colors. In many early home computers, especially when the palette choices were determined at the hardware level by resistor combinations, the palette was determined by the manufacturer. Many early models output composite video colors. When seen on TV devices, the perceived colors may not correspond to the nominal color values employed (most noticeably with the NTSC TV color system). For current RGB display systems for PCs (Super VGA, etc.), see 16-bit RGB and 24-bit RGB for the High Color (thousands) and True Color (millions of colors) modes. For video game consoles, see the List of video game consoles section. For every model, its main graphical color modes are listed, based exclusively on the way they handle colors on screen, not all their different screen modes. The list is organized roughly historically by video hardware, not by brand. Systems are listed according to the original model of each system, which means that extended versions, clones, and compatibles also support the original palette. Color palettes of some of the most popular video game consoles. The criteria are the same as those of the List of computer hardware palettes section. See also References
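Returning to the counts discussed above, the "cube of a power of two" rule and the 4-bit RGBI exception can both be illustrated with a short sketch. The RGBI expansion below is a naive mapping (each primary off or on at two-thirds intensity, plus a shared intensity boost) and deliberately omits hardware quirks such as CGA's darkened brown, so it is illustrative rather than a faithful emulation of any specific machine:

from itertools import product

def full_rgb_palette(bits_per_channel):
    # Every R-G-B triplet at the given bit depth; the palette size is (2**bits) cubed.
    levels = range(2 ** bits_per_channel)
    return list(product(levels, levels, levels))

print(len(full_rgb_palette(1)))   # 8 colors (3-bit RGB)
print(len(full_rgb_palette(2)))   # 64 colors (6-bit RGB)
print((2 ** 8) ** 3)              # 16777216 colors (24-bit True Color), computed without materializing

def rgbi_color(r, g, b, i):
    # Naive 4-bit RGBI: each primary is off (0) or on at 2/3 intensity (0xAA),
    # and the shared intensity bit adds the remaining 1/3 (0x55) to all channels.
    boost = 0x55 if i else 0x00
    return tuple(0xAA * bit + boost for bit in (r, g, b))

print(rgbi_color(1, 0, 0, 0))     # (170, 0, 0)   dark red
print(rgbi_color(1, 0, 0, 1))     # (255, 85, 85) bright red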
======================================== |