The Crusades were a series of military campaigns launched by the papacy between 1095 and 1291 against Muslim rulers for the recovery and defence of the Holy Land, encouraged by promises of spiritual reward. The First Crusade was proclaimed by Pope Urban II at the Council of Clermont in November 1095—a call to arms for Christians to reconquer Jerusalem from the Muslims. By this time, the papacy's position as head of the Catholic Church had strengthened, and earlier conflicts with secular rulers and wars on the frontiers of Western Christendom had prepared it for the direction of armed force in religious causes. The successes of the First Crusade led to the establishment of four Crusader states in the Levant, whose defence required further expeditions from Catholic Europe.[note 1] The organisation of such large-scale campaigns demanded complex religious, social, and economic institutions, including crusade indulgences, military orders, and the taxation of clerical income. Over time, the crusading movement expanded to include campaigns against pagans, Christian dissidents, and other enemies of the papacy, promoted with similar spiritual rewards and continuing into the 18th century.[note 2] The Crusade of 1101, the earliest papally sanctioned expedition inspired by the First Crusade, ended in disastrous defeats. For several decades thereafter, only smaller expeditions reached the Holy Land, yet their role in consolidating and expanding the Crusader states was pivotal. The fall of Edessa, the capital of the first Crusader state, prompted the Second Crusade, which failed in 1148. Its failure reduced support for crusading across Latin Christendom, leaving the Crusader states unable to resist Saladin's expansion. Having united Egypt and Muslim Syria under his rule, Saladin destroyed their combined armies at the Battle of Hattin in 1187. The Crusader states survived largely owing to the Third Crusade, a major campaign against Saladin, though Jerusalem remained under Muslim control. Initially directed against Egypt, the Fourth Crusade was diverted to the Byzantine Empire, culminating in the Sack of Constantinople and the establishment of the Latin Empire in 1204. The Fifth Crusade again targeted Egypt but failed to conquer it in 1219–21. By this period, crusade indulgences could also be obtained through other campaigns—such as the Iberian, Albigensian, and Northern Crusades—thereby diminishing enthusiasm for expeditions in the eastern Mediterranean. Jerusalem was regained through negotiation during the Sixth Crusade in 1229, and in 1239–41 the Barons' Crusade restored much of the territory the Crusader states had lost. However, the Sack of Jerusalem by Muslim freebooters soon ended Crusader rule in the Holy City. Louis IX of France launched two major campaigns—the Seventh Crusade against Egypt in 1248–51 and the Eighth Crusade against Tunis in 1270—both of which ended in failure. In place of the large-scale passagium generale, the smaller passagium particulare became the predominant form of crusading campaign in the late 13th century. The Crusader states, however, were unable to withstand the advance of the Mamluks.
Having reunited Egypt and Muslim Syria by 1260, they went on to attack the Crusader states, capturing the Crusaders' last mainland strongholds in 1291. Although plans for the reconquest of the Holy Land continued to be made in the following decades, only the Alexandrian Crusade briefly revived crusading activity in the region in 1365.

Terminology

The Crusades were military campaigns undertaken by Western Christians to reclaim the Holy Land, or Palestine, from Muslim control between the 11th and 13th centuries. Launched by the papacy with promises of spiritual reward, they were occasionally accompanied by unauthorised movements—driven by popular zeal—commonly referred to as popular crusades. In scholarly usage, the term is frequently applied more broadly to include papally authorised conflicts in other regions, conducted within the wider framework of the crusading movement, including the Iberian, Northern and Albigensian Crusades. Terminology evolved gradually, primarily reflecting the close association between the Crusades and Christian pilgrimage. Early usage favoured terms denoting mobility—iter ('journey'), expeditio ('expedition'), passagium ('passage')—typically accompanied by references to the intended destination, such as the Church of the Holy Sepulchre in Jerusalem. Other early expressions invoked the cross (crux), and by around 1250, canon lawyers were distinguishing between campaigns in the Holy Land—crux transmarina ('the cross overseas')—and those within Europe—crux cismarina ('the cross this side of the sea'). Participants, who traditionally sewed a cross onto their garments, came to be known as crucesignati ('those signed with the cross').[note 3] Vernacular terminology reflected the ritual of "taking the cross". The earliest attested form, crozada, appeared in Spain in 1212. The Middle English croiserie, derived from Old French, emerged in the 13th–14th centuries, later supplanted by forms such as croisade and crusado, both influenced by Spanish through French. The modern term crusade was established by 1706. The medievalist Thomas Asbridge notes that the term's conventional use by historians imposes "a somewhat misleading aura of coherence and conformity" on the earliest crusading efforts.

Background

Sites linked to Jesus's ministry became popular pilgrimage destinations in Roman Palestine. Christian emperors built churches at these locations, including the Church of the Holy Sepulchre, marking Jesus's crucifixion and resurrection in Jerusalem. In 395, the Roman Empire split into eastern and western halves. The Western Roman Empire had fragmented into smaller kingdoms by 476, while the Eastern Roman (Byzantine) Empire persisted, though it lost vast territories to the rising Islamic Caliphate in the 7th century. Jerusalem fell to Caliph Umar in 638. Islamic expansion, motivated by jihad (holy war), reached Western Europe with the Muslim conquest of much of the Iberian Peninsula after 711. Christians under Muslim rule were dhimmi—legally protected but socially subordinate. Islam's ideological unity fractured over disputes about leadership. The Shi'a believed authority belonged to the descendants of Muhammad's cousin and son-in-law, Ali, while the Sunni majority rejected the Alids' hereditary claim. By the mid-10th century, three rival caliphates had emerged: the Umayyads in al-Andalus (Muslim Spain), the Shi'ite Fatimids in Egypt, and the Abbasids in the Middle East.
To Muslim observers, such as Ibn Khordadbeh, the remote and less developed Western Europe was merely a source of slaves and raw materials. However, between c. 950 and c. 1070, drought and cold spells across North Africa, the Middle East, and Central Asia led to famine and migration. Interfaith tensions escalated, culminating in the temporary destruction of the Holy Sepulchre in 1009. From the 1040s, nomadic Turkomans disrupted the Middle East. In 1055, their leader Tughril I of the Seljuk clan assumed authority within the Abbasid Caliphate with Caliph Al-Qa'im's consent. Tughril's nephew Alp Arslan crushed the Byzantines at the Battle of Manzikert in 1071, opening Anatolia to Turkoman migration. The Seljuk Empire emerged as a loose federation of provinces ruled by Seljuk princes, Turkoman warlords and Arab emirs. As Byzantine control collapsed, Armenian and Greek strongmen took over frontier cities and fortresses. From the mid-9th century, central authority in Western Europe weakened, and local lords gained power, commanding heavily armoured knights and holding castles. Their territorial disputes made warfare a regular feature across regions. To protect church property and unarmed groups, church leaders launched the Peace of God movement, threatening offenders with excommunication. As sins permeated daily life, Christians feared damnation. Sinners were expected to confess and undertake penance prescribed by a priest. Thousands made the penitential journey to Jerusalem, though attacks on pilgrims became increasingly frequent. From c. 1000, the Medieval Warm Period favoured Western Europe, spurring economic and population growth. Within a century, Italian merchants supplanted their Muslim and Jewish rivals as the leading force in Mediterranean trade. In 1031, al-Andalus fragmented into smaller kingdoms known as taifas, which could not resist the Reconquista—the expansion of the northern Christian states—prompting intervention by the radical Almoravids from the Maghreb. In southern Italy, Norman warriors from northern France founded principalities and completed the conquest of Muslim Sicily by 1091. In the mid-11th century, clerics promoting the "liberty of the Church" rose to power in Rome, banning simony and clerical marriage. The popes, regarded as successors to Saint Peter in Rome, claimed supremacy over Christendom, but Eastern Christian leaders rejected this. Combined with long-standing liturgical and theological differences, this led to mutual excommunications in 1054 and ultimately the division between the Catholic West and Orthodox East. Reformist clerics' rejection of lay control triggered the Investiture Controversy with secular powers. Popes had already courted allies by offering spiritual rewards, and the Controversy revived interest in the theology of just war, first articulated by Augustine in the 5th century. Theologians, under Pope Gregory VII's auspices, concluded that dying in a just war equated to martyrdom. Still, the idea of penitential warfare drew sharp criticism from anti-papal figures like Sigebert of Gembloux.

First Crusade

By the late 11th century, the development of Christian just war theory, increasing aristocratic piety, and the popularity of penitential journeys to the Holy Land created a context for armed pilgrimages. Strengthened by the church reforms, the papacy was well positioned to channel anxiety over sin and hopes of remission into a papally orchestrated war.
In 1074, Gregory VII was the first pope to plan a campaign against the Turkomans, though it was never launched. In March 1095, his successor, Urban II, received envoys from Emperor Alexios I Komnenos, who requested military aid at the Council of Piacenza. By this time, the Seljuk Empire had descended into civil war following the deaths of Vizier Nizam al-Mulk and Sultan Malik-Shah I in 1092. Malik-Shah's brother Tutush I contested the succession of Malik-Shah's son Berkyaruq. Although Tutush was killed in battle in 1095, his sons Ridwan and Duqaq seized control of the Syrian cities of Aleppo and Damascus, respectively, while Tutush's former mamluk (slave soldier), Yaghi-Siyan, maintained his rule over Antioch. In Anatolia, the breakaway Seljuk prince Kilij Arslan I founded the independent Sultanate of Rum, while an autonomous Turkoman clan, the Danishmendids, seized control of the north. Meanwhile, Fatimid Egypt faced its own succession crisis after the deaths of Caliph al-Mustansir and his vizier al-Jamali. Al-Jamali's son and successor al-Afdal Shahanshah installed al-Mustansir's youngest son al-Musta'li as caliph, bypassing the eldest son, Nizar. Although Nizar was murdered, his supporters rejected al-Musta'li's legitimacy and established a new branch of radical Shi'a Islam—the Nizaris, also known as the Assassins. In July 1095, Pope Urban began a tour of France to negotiate with local elites, ending with the Council of Clermont. Here, on 27 November, he announced a military campaign against the Turkomans. According to most accounts, he urged military support for eastern Christians, promising spiritual rewards and condemning knightly violence. Accounts differ on whether he promised reduced penance or full remission of sin. Urban's appeal reportedly prompted the crowd to cry Deus vult! ('God wills it!'). The ritual of "taking the cross" was introduced on the spot, with Bishop Adhemar of Le Puy setting the precedent. He was soon appointed papal legate. Urban held further councils in France and set 15 August—two weeks after the harvest began—as the campaign's start date. His message spread mainly through those present at Clermont, leaving much of Western Europe unaware of the crusade. He also urged Catalan counts not to join, granting them equal spiritual rewards for fighting the Almoravids, marking an early instance of crusading in Iberia. Pope Urban sought to restrict enlistment to trained warriors, but popular enthusiasm proved uncontrollable. The charismatic Peter the Hermit preached in regions Urban had avoided, reportedly bearing a heavenly letter urging the expulsion of "pagans" from the Holy Land. He attracted thousands of peasants and townsfolk, alongside some nobles such as Walter Sans Avoir. In Germany, the preachers Folkmar and Gottschalk assembled similar groups. Several contingents departed before the harvest, from March 1096. Walter and Peter each led forces of 10,000–15,000. While travelling, Peter threatened Jewish communities in pursuit of provisions. King Coloman of Hungary granted market access, but during their passage the host, by then c. 20,000 people, plundered the border town of Zemun. After they entered Byzantine territory in June, their continued looting provoked imperial raids, causing severe losses. Meanwhile, Folkmar and Gottschalk's 15,000-strong host was destroyed by Coloman on Hungary's western frontier in July. A parallel rising under the Swabian count Emicho launched the anti-Jewish Rhineland massacres in western Germany, beginning at Speyer on 3 May 1096.
Despite episcopal efforts at protection, his force spread anti-semitic violence until Hungarian troops dispersed it in mid-July. Walter reached Constantinople on 20 July, Peter on 1 August. Distrustful of their disorder, Emperor Alexios shipped them across the Bosporus to Anatolia. The German contingent captured the Seljuk fortress of Xerigordos, but it was retaken by the Turkomans on 29 September. Kilij Arslan destroyed the crusaders at Civetot on 21 October; Peter survived with a few followers. No crowned ruler joined the First Crusade, largely because of tensions with the Church. The first major noble to depart was Hugh of Vermandois, brother of King Philip I of France. Godfrey of Bouillon, Duke of Lower Lorraine, set off in August 1096, followed by Bohemond of Taranto, a veteran of anti-Byzantine campaigns, in October, and Raymond of Saint-Gilles, Count of Toulouse, who led the largest force. Other leaders included Robert Curthose, Duke of Normandy; Stephen of Blois; and Robert II of Flanders. Their armies, as the historian Thomas Madden notes, were "a curious mix of rich and poor, saints and sinners", motivated by both faith and gain. As a knight's participation could cost four years' income, it was often financed through loans or donations; the less wealthy joined noble retinues. At Constantinople, tensions with the Byzantines resulted in skirmishes. Emperor Alexios demanded oaths from the crusader leaders to return former Byzantine lands before allowing passage into Anatolia. The crusading host numbered 60,000–100,000, including 30,000 non-combatants and up to 7,000 knights. Exploiting Seljuk distractions, the crusaders and Byzantines captured Nicaea in June 1097 and advanced toward Antioch, once a Byzantine provincial capital in Syria. They repelled Kilij Arslan's lightly armoured cavalry at the Battle of Dorylaeum. After a gruelling march, c. 40,000 crusaders reached Antioch and began the city's prolonged siege in October 1097. During this time, Baldwin of Boulogne—Godfrey's brother—left with 100 knights and, with Armenian support, seized fortresses and the city of Edessa, founding the first Crusader state, the County of Edessa, in March 1098. The Seljuk general Kerbogha assembled a 40,000-strong army in Iraq, but arrived in June after Bohemond had secured Antioch through collusion with a guard. The crusaders massacred the Muslim inhabitants and some of the native Christians. Despite famine, disease, and desertion, they—encouraged by the mystic Peter Bartholomew—defeated Kerbogha at the Battle of Antioch on 28 June 1098. The march on Jerusalem was halted due to intense summer heat and a plague that claimed Adhemar of Le Puy's life. In the Byzantines' absence, Bohemond persuaded the other leaders to recognise his rule over Antioch, establishing a new Crusader state, the Principality of Antioch. The crusade resumed under pressure from the common soldiers in November. After massacring the defenders of Ma'arra, the crusaders were granted safe passage by local Muslim rulers. They reached Jerusalem, then held by a Fatimid governor, on 7 June 1099. The siege stalled until Genoese craftsmen arrived with supplies. Their siege towers enabled the crusaders to conquer the city on 15 July. Over the next two days, they slaughtered the population and looted the city. Godfrey was elected Jerusalem's first Western ruler, while Arnulf of Chocques, a Norman priest, was named the first Latin patriarch.
Meanwhile, al-Afdal mobilised c. 20,000 Egyptian troops to retake the city, but the crusaders—roughly 9,000 infantry and 1,200 knights—defeated his army at the Battle of Ascalon on 12 August. With their vow fulfilled, most crusaders returned home, leaving Godfrey with just 300 knights and 2,000 foot soldiers.

Conquest, consolidation and defence

The historian Malcolm Barber notes that the Crusader states' creation "committed western Europeans to crusading for the foreseeable future". In the century after the First Crusade, the resurgence of Muslim unity shaped Middle Eastern history. During the first half of this period, the Franks sought Western military aid only four times; between 1149 and 1186, they made at least sixteen such appeals. The Italian merchant republics pledged naval aid for the crusade but needed time to prepare. The Pisan fleet of 120 ships arrived under Archbishop Daimbert in September 1099. As papal legate, he deposed Arnulf and was installed patriarch on Christmas Day, with Godfrey and Bohemond doing homage to him. Meanwhile, Tancred, Bohemond's nephew, completed the conquest of Galilee. Vitale I Michiel, Doge of Venice, soon arrived with over 200 ships. After Godfrey's unexpected death on 18 July 1100, the Venetians helped Tancred take Haifa. Daimbert, seeking to make Jerusalem an ecclesiastical lordship, lost support when Bohemond was captured by the Danishmendid Gazi Gümüshtigin in August. Meanwhile, Godfrey's followers invited Baldwin of Boulogne to succeed him. Before going to Jerusalem, Baldwin granted Edessa to his cousin Baldwin of Bourcq, then seized Jerusalem and forced Daimbert to crown him king on Christmas Day. Within nine months, he captured Arsuf and Caesarea with Genoese aid, and defeated a superior Egyptian force at the First Battle of Ramla. After Antioch's capture, crusader leaders wrote to senior Catholic clerics urging them to rally oath-breakers. In December 1100, Pope Urban's successor Paschal II launched a new crusade. Nicknamed the "Crusade of the Faint-Hearted", it included deserters such as Stephen of Blois and Hugh of Vermandois. The first contingent, led by Anselm, Archbishop of Milan, and Albert of Biandrate, left Lombardy in September 1100. The Lombards reportedly aimed at Baghdad or Egypt, and even attacked the Blachernae Palace in Constantinople before being ferried to Anatolia in early 1101. They were soon joined by French and German forces led by William IX of Aquitaine, William II of Nevers, Welf I of Bavaria, the widowed Marchioness Ida of Austria, and Archbishop Thiemo of Salzburg. Reaching Constantinople in June, they met Raymond of Saint-Gilles. Ignoring his and Stephen's warnings, the Lombards pressed to free Bohemond. Joined by other crusaders, they advanced into eastern Anatolia, but were crushed at the Battle of Mersivan in August by a coalition of Turkoman rulers. William of Nevers' army, heading south, was almost destroyed at Heraclea, where a third, mainly German, force was also routed. Ida vanished, later giving rise to tales that she had become the mother of the powerful Turkoman ruler Zengi. Few crusaders survived. The failure of the 1101 Crusade shattered the aura of crusader invincibility, and Westerners chiefly blamed the Byzantines. William of Aquitaine, Welf, and Stephen regrouped at Antioch, aiding Raymond and Genoese allies in capturing Tortosa. Some, including Stephen, reached the Holy Land, where he died at the Second Battle of Ramla on 17 May 1102. On that occasion the Egyptians caught the crusaders by surprise, but survivors redeemed themselves at the Battle of Jaffa ten days later.
Bohemond of Antioch secured his release by ransom, exploiting Danishmendid–Seljuk conflict. He supported Baldwin II of Edessa in an attack on Harran, but in May 1104, Jikirmish, atabeg (governor) of Mosul, defeated them at the Battle of Harran. Jikirmish's victory allowed Ridwan to retake border fortresses, while the Byzantines expelled Antiochene garrisons from Cilicia. Seeking support in the West, Bohemond left Tancred in charge of Antioch in autumn 1104. Pope Paschal named Bishop Bruno of Segni as papal legate to promote a crusade for Jerusalem in France. Though highly regarded, Bohemond persuaded only lesser nobles like Hugh of Le Puiset and Robert of Vieux-Pont to take the cross. He then chose to invade the Byzantine Empire from Italy, accusing the Byzantines of heresy. In October 1107, he besieged the fortress of Dyrrachium, but Alexios had reinforced its defences, allied with the Venetians, and, with Turkoman mercenaries, blockaded Bohemond's army. Bohemond had to withdraw and accept Byzantine suzerainty over Antioch in the 1108 Treaty of Devol, but Tancred did not implement the treaty. King Baldwin I of Jerusalem expanded his realm to secure its defence and attract knights with rewards. Naval aid for coastal conquests came from the Pisans, Genoese, and Venetians, compensated with trade privileges. He captured Acre in 1104 and Beirut and Sidon in 1110. Sigurd I of Norway, the first crowned monarch on crusade, assisted at Sidon. Baldwin's position was strengthened by Duqaq of Damascus's death. Though Duqaq's successor Toghtekin joined an Egyptian invasion, the Muslim coalition was defeated at the Third Battle of Ramla in 1105. Around the same time, the Damascene scholar Ali ibn Tahir al-Sulami urged Muslim unity in jihad against the Franks. Raymond of Saint-Gilles began the Siege of Tripoli in 1103 but died within two years. A dispute between his son Bertrand of Toulouse and cousin William Jordan was resolved by King Baldwin at the Council of Tripoli. Soon after, Frankish forces with Genoese aid seized the city in July 1109. William Jordan was killed, leaving Bertrand sole ruler of the County of Tripoli. Tripoli's fall alarmed the Muslim world. The Seljuk sultan Muhammad I Tapar ordered Mawdud, atabeg of Mosul, to invade, but his campaigns of 1110–13 failed amid desertions. In 1115 his successor Aqsunqur also failed at Edessa. That year Toghtekin sheltered his kinsman Ilghazi, angering the Sultan. Toghtekin allied with Tancred's successor in Antioch, Roger, who defeated the Sultan's army at the Battle of Sarmin on 14 September 1115. Meanwhile, Ridwan's death in 1113 sparked a succession crisis in Aleppo, enabling Roger to exact tribute from the city. Baldwin I of Jerusalem died of illness during a campaign against Egypt on 2 April 1118. He was succeeded by Baldwin of Bourcq, who ceded Edessa to his kinsman Joscelin I. Facing Roger of Antioch's repeated demands for tribute, the Aleppans appealed to Ilghazi, who with Toghtekin's aid invaded Antiochene lands. They defeated Roger at the Battle of the Field of Blood on 28 June 1119, where Roger perished along with some 700 knights and 3,000 infantry. Antioch was saved by Baldwin II of Jerusalem, who became regent for the absent Bohemond II, son of Bohemond I. Amid famine and military disaster, Jerusalem's leaders met at the Council of Nablus in 1120, issuing decrees against sexual offences such as sodomy and relations with Muslims.
Patriarch Warmund approved Hugues de Payens' knightly confraternity, whose members vowed poverty, chastity, obedience, and the protection of pilgrims. This marked the birth of the military orders. Baldwin II installed them in the former Al-Aqsa Mosque, identified by the Franks as Solomon's Temple, whence their name Knights Templar. Seeking aid, Baldwin II sent envoys to the West. Pope Callixtus II urged the Venetian doge Domenico Michiel to lead a naval expedition. As regent Baldwin prioritised Antioch's defence, though this was unpopular in Jerusalem. After Ilghazi's death, his nephew Belek Ghazi captured Joscelin and, in April 1123, Baldwin himself. In his absence Patriarch Warmund concluded the Pactum Warmundi with Venice, securing the conquest of Tyre on 7 July 1124. Baldwin returned to Jerusalem in April 1125. Aqsunqur united Aleppo and Mosul, recovering much territory from the Franks before his assassination in 1126. That year Bohemond II assumed power in Antioch, but his conflict with Joscelin I of Edessa prevented him from exploiting unrest in Aleppo. In 1127 the Turkoman commander Zengi became atabeg of Mosul. In preparation for a major offensive against Damascus, Baldwin II of Jerusalem sent envoys to Europe to raise troops and arrange the marriage of his heir, Melisende. Her betrothal to Fulk V of Anjou included the promise of joint succession. In May 1128 Toghtekin of Damascus died, succeeded by his son Buri, while Zengi reunited Aleppo with Mosul. Fulk arrived in May 1129 and married Melisende. Though lacking papal sanction, the Crusade of 1129 drew some 60,000 warriors. The Franks invaded Damascene territory in November, but a sortie routed their foragers. On hearing of this, the main force withdrew, perhaps also driven by a violent storm. In February 1130 Bohemond II of Antioch was killed in a skirmish. His widow Alice—daughter of Baldwin II of Jerusalem—sought power with Zengi's support, but Baldwin assumed the regency for her daughter by Bohemond, Constance. When Baldwin died on 21 August 1131, Fulk and Melisende succeeded him in Jerusalem, while Fulk secured the regency in Antioch by defeating Alice's allies Pons of Tripoli and Joscelin II of Edessa. Muslim pressure mounted: Zengi plundered Antioch and Edessa, and Buri's successor, Ismail of Damascus, raided Jerusalemite and Tripolitan lands, causing Pons's death. In 1136, Fulk arranged Constance's marriage to the French Raymond of Poitiers. The next year Raymond did homage to the Byzantine emperor John II Komnenos, but John's campaigns against Aleppo and Shaizar failed. Zengi took Homs, but his assault on Damascus was repelled by the city's new ruler Unur, allied with Fulk. Fulk died in a hunting accident on 10 November 1143. His reign saw the Hospitallers evolve from a nursing confraternity into a military order. The widowed Melisende resisted sharing power with their son Baldwin III. In the early 1140s Zengi sought dominance over Muslim rivals, notably the Artuqids in Iraq. Kara Arslan, an Artuqid prince, sought aid from Joscelin II of Edessa, offering land in exchange. Joscelin accepted, provoking Zengi to besiege Edessa. When the city fell on 26 December 1144, most of its Frankish population was killed or enslaved. Zengi was assassinated in 1146, but when Joscelin briefly regained Edessa, Zengi's son Nur al-Din expelled him and the Turkomans massacred the fleeing Christians. Nur al-Din destroyed the city's fortifications, making its reconquest futile. He secured a marriage alliance with Unur of Damascus.
News of Edessa's fall reached Pope Eugenius III through Bishop Hugh of Jabala and Armenian clergy. He responded with the bull Quantum praedecessores on 1 December 1145, granting remission of sins, protection of property, and debt suspension to those who took the cross—establishing the model for later crusade bulls. Louis VII of France, troubled by guilt over a massacre in a church, declared his intention to lead a crusade. At Vézelay in 1146 the Cistercian abbot Bernard of Clairvaux persuaded many French nobles to join. Bernard then continued preaching across France and Germany. In the Rhineland, anti-semitic pogroms incited by the monk Radulf ended only after Bernard recalled him. In a Christmas sermon at Speyer, Bernard persuaded Conrad III of Germany to take the cross. When Saxon lords resisted abandoning their war against the pagan Wends, he convinced Pope Eugenius to issue the bull Divina dispensatione in April 1147, extending crusade indulgences to the Wendish campaign, later seen as the first Northern Crusade. The Pope also named Iberia as a crusading target. A critic of the Wendish campaign, Helmold of Bosau, later described the Second Crusade as fought in three theatres—the Holy Land, the Baltic and Iberia. Despite leadership by nobles such as the Saxon duke Henry the Lion, the crusaders failed against the Wendish prince Niklot. The crusaders departed for the Holy Land in May and June 1147. A distinctive feature was the prominent presence of women: Louis VII was joined by his wife Eleanor of Aquitaine and her household, while regulations for the crusader fleet mention wives. The fleet of 150 ships carried about 10,000 crusaders from northwestern Europe. They aided Afonso I of Portugal in his successful Siege of Lisbon in October 1147 and Ramon Berenguer IV of Barcelona in capturing Tortosa in December 1148, but only a small contingent reached the Holy Land. The German army, with many pilgrims, retraced the First Crusade's route through Hungary and the Balkans. Emperor Manuel I Komnenos, fearing attack, made peace with Mesud I, Sultan of Rum. Roger II of Sicily invaded the Balkans, heightening Byzantine suspicion of a coordinated western action. After clashes at Constantinople, the Germans crossed into Anatolia without waiting for the French. On 25 October 1147 Mesud's forces crushed them at the Battle of Dorylaeum; many died, but Conrad, wounded, escaped into Byzantine territory. The French reached Constantinople in October 1147. Clashes followed, and Bishop Godefroy of Langres urged Louis VII to seize the city, but he advanced into Anatolia. The crusaders endured shortages, desertions and raids while wintering at Ephesus. At Antalya, Louis and his knights sailed for Syria on Byzantine ships; most of those left behind perished, deserted or were enslaved. Louis reached Antioch on 19 March 1148. Raymond of Poitiers urged an attack on Aleppo and Shaizar, but Louis pressed on to Jerusalem, despite Eleanor—Raymond's niece—supporting her uncle. At Acre he joined Conrad, who had arrived by sea from Constantinople. The Council of Acre resolved to besiege Damascus, beginning on 24 July. Though Conrad repelled attacks, Damascene raids and news of Nur al-Din's approaching reinforcements soon forced the crusaders to abandon the siege. A plan to attack Ascalon, the last Fatimid port, also collapsed, and the crusaders withdrew from the Holy Land. The failure gravely weakened crusading fervour in Europe. Conrad blamed Jerusalem's leaders, while others, including Bernard of Clairvaux, accused the Byzantines.
Muslim forces pressed the northern Crusader states. Raymond of Poitiers was killed at the Battle of Inab on 29 June 1149; Nur al-Din seized Antiochene fortresses and destroyed Tortosa, while the Artuqids and Seljuks of Rum attacked the ruined County of Edessa. Joscelin II of Edessa was captured, and in 1150 his wife Beatrice sold the remnants of his county to Byzantium. Unur's death ended the Aleppo–Damascus alliance, as his successor Abaq allied with the Franks. In 1151 Assassins murdered Raymond II of Tripoli, and his son Raymond III succeeded him. The following year, Baldwin III of Jerusalem deposed his mother Melisende. He captured Ascalon in 1153, completing the conquest of the coast. He arranged the marriage of the French crusader Raynald of Châtillon to Constance of Antioch. Between 1154 and 1157, Nur al-Din blockaded Damascus, forced Abaq to withdraw, and took Shaizar, uniting Muslim Syria. The Franks failed to retake Shaizar despite the support of Thierry of Flanders, a veteran of the Second Crusade. In 1159 Emperor Manuel I invaded Syria, halting Nur al-Din's advance, but Raynald was captured by Turkomans in 1160/61. The childless Baldwin III died of illness on 10 February 1163. His brother Amalric's succession was made conditional on the annulment of his marriage to Agnes of Courtenay. Their children, Sibylla and Baldwin, were nevertheless recognised as legitimate. In Antioch, Bohemond III, son of Constance and Raymond of Poitiers, took power and expelled his mother. Under Amalric, the wealthy but divided Egypt became the main battleground with Nur al-Din. Between 1163 and 1169, Amalric launched five campaigns, but Nur al-Din's forces blocked his conquest. In early 1169, the Fatimid caliph al-Adid appointed Nur al-Din's Kurdish general Shirkuh as vizier; on Shirkuh's death shortly afterwards, his nephew Saladin succeeded him. Amalric renewed the Byzantine alliance, but their joint invasion of Egypt failed. In September 1171, Saladin abolished the Fatimid caliphate, but soon quarrelled with Nur al-Din. In response to Amalric's appeals, Louis VII of France imposed a levy—one penny on every pound of property and income—for the Holy Land over five years. His initiative was soon matched by Henry II of England. In 1174, Nur al-Din and Amalric both died, leaving underage heirs: as-Salih and the leper Baldwin IV. In his final years, Nur al-Din had made the conquest of Jerusalem the chief aim of jihad, inspiring a new Muslim literary genre, the Merits of Jerusalem. The struggle for his legacy was won by Saladin, who took Damascus in 1174, Aleppo in 1183, and compelled the Zengid ruler of Mosul, Izz al-Din, to submit in 1186. As early as 1176, the Abbasid caliph al-Mustadi urged Saladin to renew the jihad against the Franks, but he instead fought his Muslim rivals. Once he had secured much of the Near East, however, he needed a new target to furnish his troops with plunder. Jerusalem's leaders sought Western support by marrying Baldwin's heir Sibylla to William of Montferrat, a kinsman of German and French royalty, but he died in 1177. That year Philip I of Flanders led a futile armed pilgrimage to the Holy Land, and a Byzantine–Frankish invasion of Egypt failed amid disputes over its future. Before his death Baldwin designated Sibylla's posthumous son by William, Baldwin V, as his successor. On the child's death in 1186, Sibylla and her second husband Guy of Lusignan seized power with the support of leading figures, including Raynald of Châtillon, by then lord of Transjordan.
Their rival, Raymond III, allied with Saladin, granting his troops free passage through Galilee.

Fall and recovery

Disillusioned by the failure of the Second Crusade, Western rulers were unwilling to launch another expedition to the Holy Land, despite the threat from Saladin. Criticism of crusading intensified, recorded in the 1187 Military Affairs by the historian Ralph Niger, who questioned the efficacy of crusading indulgences without corresponding spiritual renewal. In this climate only a major defeat in the East could revive crusading zeal. The Byzantine Empire, a traditional ally of Jerusalem, was destabilised by coups in 1183 and 1185, while the massacre of Italian merchants deepened its isolation from the West. In 1185 Emperor Isaac II Angelos concluded an anti-Seljuk alliance with Saladin, recognising his claim to Syria except Antioch. Although a truce signed in 1185 was still in force, Raynald of Châtillon attacked a Muslim caravan in Transjordan in early 1187, prompting Saladin to muster troops across his empire. Guy of Jerusalem and Raymond III of Tripoli were reconciled, but the Jerusalemite field army, exhausted by a long march, was crushed at the Battle of Hattin on 4 July 1187. Raymond fled while others were killed or captured. Saladin executed Raynald and the captured Templars and Hospitallers, but spared other leaders, including Guy. The kingdom lay defenceless—after a 12-day siege the city of Jerusalem surrendered to Saladin on 2 October. Tyre resisted under the newly arrived crusader Conrad of Montferrat, who sent Archbishop Joscius west for aid. Saladin's siege of Tyre was lifted on 1 January 1188. The first reports of the disaster reached Italy through Genoese merchants. William II of Sicily dispatched c. 50 ships and 200 knights, and his fleet's support strengthened the defence of Antioch, Tripoli, and Tyre. Raymond III died of illness, and the County of Tripoli was seized by Bohemond IV, son of Bohemond III of Antioch. Pope Gregory VIII launched the new crusade with the bull Audita tremendi on 29 October 1187. The English prince Richard was the first to take the cross. Pope Gregory appointed Joscius of Tyre to preach in France and Henry of Albano in Germany. On 22 January 1188 Joscius reconciled Philip II of France and Henry II of England at Gisors, where both kings and many nobles took the cross. Troubadours such as Conon of Béthune also spread the message of the bull. To fund the crusade, the "Saladin tithe"—a levy of 10% on income and movable goods—was imposed in England and France. On 27 March 1188 Emperor Frederick I swore his oath at the Curia Christi ('Court of Christ') in Mainz. The English, French, and part of the German host chose the sea route, but Frederick resolved to march overland. Frederick set out in May 1189 with c. 15,000 troops. By then Frankish control was reduced to Tyre, Antioch, Tripoli, and the fortresses of Beaufort, Margat, and Krak des Chevaliers. Saladin had freed Guy in May 1188, but Conrad barred him from Tyre. Gathering c. 9,000 men, Guy laid siege to Acre in August 1189 with Pisan naval support, his army reinforced by arriving western contingents. Fearing a German–Seljuk alliance, Emperor Isaac II denied Frederick safe passage. Frederick retaliated by attacking Byzantine towns, forcing Isaac in March 1190 to allow transport into Anatolia on Genoese and Pisan ships. Despite Turkoman raids and scarce supplies, the Germans briefly took Konya, capital of Rum, but the crusade collapsed when Frederick drowned in the river Saleph on 10 June 1190.
His son Frederick of Swabia failed to sustain morale: many deserted or died, and only remnants reached Acre in October. Franco-English tensions persisted until Henry II's death in July 1189. Richard I succeeded and swiftly prepared for the crusade, raising further funds by exacting a taillage from the Jews. He met with Philip II at Vézelay on 4 July 1190 before departing. Richard's host numbered c. 17,000, while the French force was smaller, as many had already left under Henry of Champagne. Richard hired ships in Marseille, Philip in Genoa, and both sailed to Sicily. There Richard seized Messina, compelling the new Sicilian king Tancred of Lecce to pay a substantial sum. From Sicily the French sailed directly to Acre, arriving on 20 April. A storm drove several English ships onto Cyprus, where the local Byzantine ruler Isaac Komnenos seized the wrecks and captives. Richard conquered the island before reaching Acre on 6 June 1191. Meanwhile, the long siege caused a deadly plague at Acre, killing Queen Sibylla. As Guy's kingship relied on her, her death voided his claim. Supported by French and German crusaders and the papal legate Ubaldo of Pisa, Conrad married Sibylla's half-sister Isabella on 24 November 1190. Guy refused to abdicate and sought Richard's backing. The siege intensified with the arrival of two royal armies, and on 12 July 1191 the defenders surrendered without Saladin's approval, under safe-conduct terms. Richard's and Philip's banners rose on Acre's walls, but when Leopold V of Austria raised his flag, Richard tore it down. Stricken by illness, Philip II soon withdrew from the crusade. Acre's surrender required Saladin to free 1,600 Frankish prisoners and return the True Cross within a month; when he failed, Richard ordered 2,700–3,000 Muslim captives executed. From Acre, Richard advanced south, defeated Saladin at the Battle of Arsuf, and secured Jaffa. News that Richard's brother John was attempting to seize England reached the Holy Land, prompting Richard to plan his return. On 20 April 1192 he recognised Conrad's claim to the remnant Kingdom of Jerusalem and granted Cyprus to Guy as compensation. Conrad was assassinated eight days later, and his pregnant widow Isabella soon married Henry of Champagne, a kinsman of both the French and English kings. In June Richard advanced towards Jerusalem, but the crusaders halted at Bayt Nuba, 13 miles (21 km) away, fearing defeat, and withdrew to the coast. Saladin counterattacked at Jaffa, but Richard relieved the town. Peace talks begun the previous year led to the Treaty of Jaffa on 2 September, a three-year truce confirming Frankish control of the coast between Tyre and Jaffa and allowing Christian pilgrims access to the holy sites of Palestine. Richard left Palestine on 9 October 1192 but was captured in Austria by Leopold V. In 1193 he was handed to Emperor Henry VI, who freed him for a ransom of 100,000 marks. Saladin died of illness on 4 March 1193. His empire soon collapsed, as his eldest son and designated heir, al-Afdal, proved unable to restrain the ambitions of his many Ayyubid kinsmen. Of these, Saladin's brother al-Adil was the most astute, securing control of Damascus in 1196. The Third Crusade, with its heavy naval use, set a model for later expeditions: sea travel limited non-combatants and eased army supply. Though no campaign matched its scale, new plans arose. Emperor Henry VI, after taking the Kingdom of Sicily from Tancred, revived Norman ambitions in the eastern Mediterranean. 
He took the cross in April 1195, and Pope Celestine III authorised preaching a new crusade in Germany. By then Leo I of Cilician Armenia and Aimery of Lusignan, Guy's successor in Cyprus, had recognised Henry's suzerainty. Henry planned to recruit 3,000 mercenaries and demanded tribute from the new Byzantine emperor Alexios III Angelos to fund the venture. Alexios levied the heavy Alamanikon ('German tax'), raising over 7,000 pounds of silver, but payment ended when Henry died of illness on 28 September 1197. Earlier, the ailing emperor had named his marshal Henry of Kalden and the imperial chancellor, Bishop Conrad of Hildesheim, to lead the crusade. German forces sailed from southern Italian ports between March and September. Meanwhile al-Adil captured Jaffa, but the crusaders took Botrun, Sidon and Beirut before abandoning the campaign when news of Henry's death reached Palestine in February 1198. During the crusade, Aimery of Cyprus and Leo I of Cilician Armenia were crowned kings by imperial envoys. After marrying the widowed Isabella I of Jerusalem, Aimery was also crowned king of Jerusalem in January 1198. He soon prolonged the truce with the Ayyubids until 1204. The same year the German nursing confraternity that had run a hospital at Acre since the Third Crusade assumed military functions, forming the Teutonic Knights. Pope Celestine III died in 1198 and was succeeded by Innocent III, a learned theologian and jurist. That year he proclaimed a new crusade, but the Anglo–French war and the German throne dispute between Philip of Swabia and Otto of Brunswick blocked any large-scale campaign. Markward von Annweiler, a veteran of the Third Crusade, rejected Innocent's claim to act as regent in Sicily for the child Frederick, son of Emperor Henry VI. Innocent accused him of endangering the Holy Land and extended the crusading indulgence to those fighting him, though only Walter of Brienne, a French claimant to southern Italian fiefs, joined this first "political crusade". Innocent pressed on with plans for a crusade to the Holy Land. He sent his legate Peter Capuano to mediate peace between England and France, but talks ended when Richard I died in April 1199. By then Innocent had tasked the preacher Fulk of Neuilly with promoting the crusade in France. To fund it, he imposed a 2.5% extraordinary levy on clerical income. Theobald III of Champagne was the first to take the cross on 28 November 1199, followed by his cousin Louis of Blois and, in February 1200, his brother-in-law Baldwin IX of Flanders. They secretly agreed to strike Egypt first, concealing the plan to avoid rank-and-file opposition. Six envoys, including Geoffrey of Villehardouin—later the crusade's chronicler—were appointed to hire a fleet. They agreed with Doge Enrico Dandolo that Venice would build, by June 1202, a fleet for 33,500 crusaders for 85,000 marks (over 20 tons of silver). After Theobald's unexpected death in May 1201, Boniface of Montferrat, linked to royal houses in East and West, became leader. The crusade faltered when only a third of the expected force gathered at Venice; many embarked elsewhere or failed to keep their vows. The Venetians had invested heavily, but the crusaders could not pay the agreed sum. To recover the losses, Dandolo proposed attacking Zara, a Christian city in Dalmatia under King Emeric of Hungary, himself a sworn crusader. Despite papal prohibition and protests from some, including Simon de Montfort, the leaders agreed and captured Zara for Venice in November 1202.
While wintering there, Alexios Angelos, son of the deposed Emperor Isaac II, offered to reunite the Byzantine Church with Rome, pay 200,000 marks, and supply 10,000 troops if restored to Constantinople. Though only recently absolved for attacking a Christian city, the leaders accepted and diverted the expedition, prompting several hundred dissenters to quit or sail directly to the Holy Land. The army reached Constantinople in June 1203 and began the siege. Their first assault in July forced Emperor Alexios III to flee; Isaac II was restored and his son crowned co-emperor as Alexios IV. Alexios raised only 100,000 marks and promised more if the crusaders stayed until March, which a parlement attended by both commanders and knights accepted. As he failed to pay, the crusaders began plundering. Losing support, Alexios IV and Isaac were deposed by the aristocrat Alexios Doukas, crowned Alexios V in February 1204. Lacking supplies, the crusader leaders resolved to attack Constantinople after agreeing on how to divide its spoils and partition the empire. Their first assault failed, but the clergy kept up morale with sermons branding the Byzantines schismatics "worse than the Jews". The second attack, on 12 April, succeeded and the Sack of Constantinople lasted for days. The crusaders massacred thousands, desecrated holy sites and seized the city's movable wealth. Relics were taken in great numbers to Western churches. The brutality shocked contemporaries, including the Pope and the Muslim scholar Ibn al-Athir. The Byzantine historian Nicetas Choniates contrasted Saladin's clemency in Jerusalem with the crusaders' slaughter of Orthodox Christians in Constantinople. A committee of six Venetian and six French crusaders elected Baldwin of Flanders as the first Latin Emperor. Boniface of Montferrat received Macedonia and Thessaly, founding the Kingdom of Thessalonica; his vassals created the Duchy of Athens in Attica and the Principality of Achaea in the Peloponnese. Venice gained many Aegean islands, including Crete, and thereafter a Venetian cleric was appointed as Latin Patriarch of Constantinople. Frankish control of former Byzantine lands proved precarious. Baldwin died in Bulgarian captivity after defeat at the Battle of Adrianople in 1205, and Boniface was killed fighting the Bulgarians in 1207. Greek resistance centred on three Byzantine successor states: Epirus, Nicaea, and Trebizond. From the Crusader states' perspective, the Fourth Crusade was almost a failure: only about a fifth of those who took the cross around 1200 reached the Holy Land—enabling Aimery of Jerusalem to extend the 1198 truce for six years in 1204—while most participants in Constantinople's sack returned home without going east. After the Fourth Crusade's collapse, Pope Innocent III considered a new eastern campaign. Yet large-scale plans had little chance amid the prolonged German throne dispute and renewed war between France and England. The Crusader states faced no immediate danger because of divisions within the Ayyubids. In 1212 John of Brienne, the new king of Jerusalem, concluded a five-year truce with al-Adil, by then ruler of Egypt and Damascus, and soon asked Innocent to call a crusade once it expired. John had gained the throne by marrying Queen Isabella's daughter and heir, Maria of Montferrat; after his wife's death, he ruled with their infant daughter Isabella II. The medievalist Andrew Jotischky sees Innocent's crusade policy as "pragmatic reactions to problems".
One challenge was Catharism, a dualist religious movement in southern France. Innocent launched the Albigensian Crusade against the Cathars in 1208, denouncing them as "more evil" than Muslims. Popular zeal for crusading persisted, though recent failures drew criticism of noble-led campaigns. Processions petitioning for Iberian Christians resisting the Muslim revivalist Almohads, together with preaching against the Cathars, stirred fervour in central France and the Rhineland in the early 1210s. In 1212 this produced popular movements later called the "Children's Crusade". Sources conflict and mix myth with moral tales, but agree that the participants were children and youths seeking to retake Jerusalem; none reached the Holy Land. Unlike in the Levant, crusading in Europe was succeeding. In Iberia the Reconquista struck a decisive blow against the Almohads at the Battle of Las Navas de Tolosa in July 1212. That year Simon de Montfort, now leader of the Albigensian Crusade, completed the conquest of much of southern France. These victories, together with the spontaneous zeal of the Children's Crusade, encouraged Pope Innocent III to plan a new Levantine crusade. He proclaimed it in the bull Quia maior, citing a new Muslim fort on Mount Tabor as a pretext. According to Madden, this "impressive document represents the full maturation of the crusading idea". The Fourth Crusade had shown the ruinous effect of poor organisation, and Innocent concluded that only papal direction could ensure success. He also broke with the tradition of appealing solely to the military class: he granted full indulgence to those who funded a warrior's journey if unable to go themselves, and partial indulgence to donors. The expedition's terms were set at the Fourth Lateran Council in November 1215. Crusaders were to gather at Brindisi or Messina in southern Italy by 1 June 1217, when the 1212 truce ended. A three-year levy of 5% on clerical income across Europe was imposed, and Innocent pledged 30,000 pounds of silver. The appeal failed in France, preoccupied with the Albigensian Crusade, but found support elsewhere. Andrew II of Hungary and Leopold VI of Austria took the cross. Frederick II, Innocent's protégé in the German throne dispute, also vowed to join, though he had not yet defeated Otto of Brunswick. Oliver of Paderborn, a crusade preacher, toured the Low Countries recounting visions such as three crosses in the sky, while Jacques de Vitry won over Genoese patricians through their wives. During the preparations Innocent died on 16 July 1216, but his successor, Honorius III, carried on his policy. For the first time in crusading history, the crusade was also preached in the Crusader states and in Cyprus. Hungarian and Austrian crusaders embarked at the Dalmatian port of Spalato on Venetian ships rather than at the more distant southern Italian harbours. By late September they reached Acre, where John of Brienne, Hugh I of Cyprus, and Bohemond IV of Antioch joined them. After a failed siege of Mount Tabor, Andrew II deemed his vow fulfilled and, with most of the Hungarians, withdrew. Frisian, German, and Italian forces then joined, and in May 1218 the army advanced on Damietta, a thriving Nile Delta port. The city's subsequent siege saw a shifting host as Western contingents arrived and others departed. John of Brienne was chosen commander but was soon challenged by the papal legate Pelagius. In late August the crusaders seized the Tower of the Chain, guarding the Delta.
Al-Adil reportedly died of shock; his son al-Kamil offered to restore the pre-1187 borders of the Kingdom of Jerusalem (excluding Transjordan) in return for the crusaders' withdrawal. On learning of the offer, al-Kamil's brother al-Mu'azzam dismantled Jerusalem's walls, but the crusaders rejected the terms, as the kingdom would be indefensible without the fortresses beyond the Jordan. In August 1219, the mystic Francis of Assisi met al-Kamil in an unsuccessful attempt to convert him to Christianity. Prophecies, fuelled by distorted reports of the Mongol conquests in Central Asia, promised victory and aid from the mythical Prester John. Damietta fell to the crusaders in November 1219, but its possession soon sparked renewed conflict between John of Brienne and Pelagius. A year later Frederick reaffirmed his crusading vow at his imperial coronation in Rome. The first German forces arrived under Louis I of Bavaria in 1221. That July, ignoring Frederick's orders, Louis and Pelagius advanced towards Cairo, but al-Kamil, aided by his brothers al-Mu'azzam and al-Ashraf, forced a northward retreat. With the Nile in flood, he opened the sluices, flooding their route. Trapped, the crusaders accepted terms: Damietta was surrendered in return for safe conduct and an eight-year truce. Al-Kamil re-entered the city in September as the crusaders withdrew. The sudden collapse shocked Western Christendom. Many blamed Pelagius for the disastrous final campaign, while others—including returning crusaders and Honorius—condemned Frederick for failing to honour his vow. By 1218 Frederick II had secured his authority in Germany, but the union of Sicily with the Holy Roman Empire under his rule threatened the papacy. Yet relations with Pope Honorius III stayed cordial, aided by mediators such as Hermann of Salza, Grand Master of the Teutonic Knights, and Thomas of Capua, head of the papal penitentiary. Frederick renewed his crusading vow in May 1223, setting June 1225 for departure, and agreed to marry Isabella II of Jerusalem in the presence of her father, John of Brienne. With little response to the crusade call, he renewed the vow again in March 1225, pledging under threat of excommunication to depart in August 1227. In November 1225 he married Isabella and exacted oaths of fealty from the barons of the Kingdom of Jerusalem, despite earlier assurances that he would allow John to rule. In 1226 tensions between al-Kamil and al-Mu'azzam grew so severe that al-Kamil sent an envoy to Frederick, offering Jerusalem's return to the Christians in exchange for aid against his rival. In March 1227 Pope Honorius died, and the energetic Gregory IX succeeded him. He soon clashed with Frederick over papal rights in Sicily, though preparations for the crusade continued. To win Lombard support Frederick used force, yet many eagerly joined, including the German noble Louis IV of Thuringia, the Italian aristocrat Thomas of Acerra, and the English bishop Peter des Roches. Several crusader contingents sailed from Brindisi on 15 August 1227. Frederick followed on 8 September with c. 800 knights and 10,000 infantry, but fell ill and returned to southern Italy. Enraged, Pope Gregory excommunicated him before the end of the month. Learning of Frederick's illness and excommunication, many crusaders in the Holy Land abandoned the campaign; the rest repaired coastal fortifications. They also seized Sidon and built Montfort Castle near Acre after al-Mu'azzam died in November 1227.
Disregarding papal demands to seek absolution before resuming the crusade, Frederick resolved to lead an expedition to the Holy Land. As Isabella died shortly after giving birth to their son, Conrad, he departed only in late June 1228. Reaching Cyprus, an imperial fief, he deposed John of Ibelin, regent for the underage King Henry I, and demanded fealty from Bohemond IV of Antioch and Tripoli, who refused. Frederick landed at Acre on 7 September. As the Hospitallers, the Templars, and devout crusaders would not follow an excommunicated leader, he used intermediaries to issue orders. He renewed talks with al-Kamil, displaying tolerance toward Islam and notable learning. On 18 February 1229 the Treaty of Jaffa ceded Jerusalem, Bethlehem, and other key cities to the Christians, while preserving the Temple Mount, the Dome of the Rock, and the al-Aqsa Mosque as Muslim places of worship; it also established a ten-year truce, excluding Antioch, Tripoli, and the Hospitaller and Templar lands. Though it gained more than any crusade since the First, the treaty drew sharp criticism. Visiting Jerusalem, Frederick entered Muslim shrines and on 18 May crowned himself king in the Church of the Holy Sepulchre. Meanwhile, with papal backing, John of Brienne invaded southern Italy, compelling Frederick to abandon his eastern campaign in May. He landed at Brindisi in June and, by the end of October, had driven his former father-in-law back into papal territory. By the time Emperor Frederick II returned from his eastern campaign, the Treaty of Paris of 12 April 1229 had ended the Albigensian Crusade. The period also saw crusader successes in Iberia: James I of Aragon conquered the Balearic Islands and Valencia by 1238, while Ferdinand III of Castile took Córdoba in 1236 after a successful siege. In the Baltic, the Teutonic Knights assumed command of the crusade against the pagan Prussians in 1230. Pope Gregory IX meanwhile launched the Drenther and Stedinger Crusades against peasant rebels and the Bosnian Crusade against dissident Christians. Frederick reconciled with the papacy in May 1230. The following year Frederick appointed Richard Filangieri as bailli (deputy) in the Kingdom of Jerusalem. Supported by the Teutonic Knights, Pisans, and some local nobles, Filangieri seized Tyre, but most Jerusalemite barons, led by John of Ibelin and backed by the Genoese and Henry I of Cyprus, resisted. Upon John's death in 1236, his son Balian of Beirut assumed command of the resistance. Pope Gregory called for a new crusade in separate encyclicals to the English and the French in 1234. The expedition was to depart for the Holy Land when the 1229 truce expired in 1239. He ordered the taxation of clerical income and promoted the commutation of crusading vows for cash. He also proposed the establishment of a garrison in Palestine, to be maintained for ten years and financed by lay contributions in return for partial crusade indulgences. Mendicant friars preached the crusade, but bishops held the funds, which were distributed by papal authorisation to aristocrats who had taken the cross. In France, Theobald IV of Champagne (also king of Navarre), Hugh IV of Burgundy, and Peter of Dreux were among the nobles who joined. Most had earlier rebelled against Blanche of Castile, regent for King Louis IX of France, and by taking the cross gained church protection. Louis aided them with gifts and loans, and authorised them to fight under the royal banner.
In England, several nobles hostile to royal authority enlisted, including Richard of Cornwall—one of Europe's wealthiest men—and his brother-in-law Gilbert Marshal; the army also attracted former enemies such as Simon de Montfort and Richard Siward. In the late 1230s the Crusader states faced little threat from their Muslim neighbours. After al-Kamil's death in 1238, a two-year struggle followed before his son Ayyub secured Egypt. He recruited new mamluk troops and stationed them on a Nile island, forming the Bahri ("river") mamluks. By contrast, the Latin Empire came under pressure from a Bulgarian–Nicaean alliance. Pope Gregory tried to divert crusaders to Constantinople, but only a few—among them Humbert V de Beaujeu and Thomas of Marle—agreed. By the time they arrived in 1239, the anti-Latin coalition had collapsed, and the crusaders mounted only minor raids in Thrace.

The French crusaders offered command to Frederick II, who promised that he or his son Conrad would join. Yet his bid to assert power in Lombardy led to renewed conflict with Pope Gregory, who excommunicated him in March 1239. The French reached Acre that September. Leadership was divided, and in November an Egyptian force routed a contingent at the Battle of Gaza. The defeat emboldened Dawud, the Ayyubid emir of Kerak, to sack Jerusalem and dismantle its walls. The divided Ayyubids failed to exploit their success; Ismail of Damascus, Ayyub's uncle, even offered to cede Beaufort, Tiberias, and Safed, then held by Dawud. It is unclear whether the offer was accepted, as Theobald of Champagne and Peter of Dreux abandoned the crusade after a pilgrimage to Jerusalem in September 1240.

The English force, c. 600–800 knights and additional troops, arrived in October 1240. Richard of Cornwall, Frederick II's brother-in-law, sided with the pro-imperialist faction in Jerusalem, which favoured alliance with Egypt over Damascus. Ayyub, via his ally Dawud, offered to restore Jerusalem to the Franks and release the prisoners taken at Gaza. Richard accepted the proposal, which also upheld Ismail's earlier concessions, expanding the kingdom to its widest extent since 1187. With the agreement secured, the crusade ended in May 1241.

Fall of the Crusader states

The final phase of the Levantine Crusades was marked by Mongol intervention in Middle Eastern politics and the restoration of Muslim unity. Earlier, the Mongols had invaded Hungary and Poland, prompting Pope Gregory IX in June 1241 to call for a crusade against them, but the German host soon dispersed. The invasion ended unexpectedly later that year when the Great Khan, Ögödei, died, compelling the Mongols to withdraw.

In his letters, Emperor Frederick II portrayed Richard of Cornwall as acting on his behalf in concluding the treaty with Ayyub. In 1242 the Templars, Frederick's foes, plundered Nablus in defiance of the treaty, provoking an Egyptian counterattack. The following year Frederick's son, Conrad—the absent king of Jerusalem—came of age, ending his father's claim to the regency. The barons appointed Conrad's heir presumptive, Alice of Champagne, as regent and seized Tyre, seat of Frederick's lieutenant, Filangieri. This break with Frederick gave Ayyub a pretext to reject papal proposals to renew the truce of 1229. The Mongols secured their position in Middle Eastern politics by defeating Kaykhusraw II, Sultan of Rum, at the Battle of Köse Dağ in June 1243.
The Seljuks of Rum, the Ayyubids of Aleppo, and the Cilician Armenians soon accepted Mongol suzerainty, while Bohemond V of Antioch refused and warned Emperor Frederick and the newly elected Pope Innocent IV of the growing threat. Meanwhile, Ayyub of Egypt allied with fugitive Khwarazmian soldiers who had settled in Anatolia and Iraq. His rival, Ismail of Damascus, aligned with the Franks and sent forces to Gaza, prompting some 10,000 Khwarazmian horsemen to join Ayyub. Advancing south, they massacred thousands of Christians and sacked Jerusalem in July 1244. Shortly after, they joined the Egyptian army and crushed the Damascene–Frankish force at the Battle of La Forbie on 17 October. Thousands of Frankish troops were killed, leaving the kingdom virtually defenceless.

Reports of Jerusalem's sack had scarcely reached Europe when Louis IX of France took the cross in December 1244. Recently recovered from severe illness, he was, according to contemporary accounts, inspired by a visionary experience to make the vow. Within two months Pope Innocent issued a new crusading bull and tasked Cardinal Odo of Châteauroux with preaching the crusade in France. Promising spiritual rewards, Odo urged the nobility to emulate the heroism of their forebears in the First and Third Crusades. Among the earliest to take the cross were Louis's brothers Robert of Artois, Alphonse of Poitiers, and Charles of Anjou. Elsewhere in Europe support was weaker: Henry III of England, recently defeated by Louis, barred Bishop Galeran of Beirut from preaching in his realm, while Haakon IV of Norway declined Louis's invitation despite having earlier taken the cross.

Exiled in Lyon following his conflict with Emperor Frederick, Pope Innocent convened the First Council of Lyon in summer 1245 to plan a new crusade and to address the state of the Latin Empire and the Mongol threat. Funds came from indulgence sales, donations, and a clerical tax; Louis added royal taxes and enforced Jewish contributions, the whole totalling over 1,500,000 livres tournois in official accounts. At the council Frederick was deposed, but Louis forbade the preaching of a crusade against him in France. Meanwhile, war in the Levant continued: Ayyub expelled Ismail from Damascus in late 1245 and seized Galilee from the Franks in 1246. That year Alice died and was succeeded as regent by her son Henry I of Cyprus.

In preparation for the crusade, Louis stockpiled food and wine in Cyprus, largely imported from Apulia with Emperor Frederick's approval. The fleet was supplied by Genoa and Marseilles, with extra ships built even in distant Scottish ports. It assembled at the new harbour of Aigues-Mortes, where thousands of volunteers—archers and foot soldiers—sought to enlist, but Louis refused them. The fleet sailed on 25 August 1248. The crusaders wintered in Cyprus, where plague killed many, though reinforcements, including Geoffrey II of Achaea with 400 knights, strengthened their ranks. They left Cyprus on 30 May 1249 and captured Damietta a week later. By then Ayyub was in poor health, yet the crusaders delayed their advance for months, fearing the Nile floods and awaiting reinforcements. They moved towards Cairo only days before Ayyub died on 23 November 1249. His death was concealed while envoys summoned his heir Turanshah from Iraq. Using Greek fire, the Egyptians blocked the crusaders from crossing the Nile to Al Mansurah, a key garrison town. On 8 February 1250 Robert of Artois led a surprise assault across a ford.
The Egyptian commander Fakhr al-Din was killed, but Robert's forces were trapped in Al Mansurah's narrow streets and almost all were slain despite Louis's attempt to aid them. On 24 February Turanshah took power, blockading the crusaders' camp and unleashing famine and plague. Too weakened to retreat to Damietta, Louis surrendered on 6 April. Turanshah ordered the slaughter of the poor and sick but released the rest, including Louis, in return for 800,000 bezants and Damietta's surrender. Before this could be completed, he was assassinated by the Bahri mamluks, who feared replacement by his own followers. With their backing Shajar al-Durr, Ayyub's widow, assumed power, and Turanshah's treaty remained in force. With Templar help, half the ransom (400,000 bezants) was paid and Damietta evacuated before Louis and much of his army sailed for Acre on 6 May. The remainder was secured by hostages.

When news of Louis's defeat reached France, a charismatic preacher known as the "master of Hungary" proclaimed a new crusade. Claiming to bear a letter from the Virgin Mary to shepherds, he rallied followers from northern France and Flanders to free the Holy Land. This movement, later called the Shepherds' Crusade, attacked Jews in central France before royal forces dispersed it in June 1251.

After the Egyptian defeat, most French crusaders, including Louis's surviving brothers, abandoned the campaign, but Louis remained in Palestine with c. 1,000 troops. He rebuilt Caesarea, Jaffa and Sidon, and fortified Acre's suburb using revenue from a crusader tax on the French clergy. Though Louis held no legal claim to the Kingdom of Jerusalem, still nominally ruled by the absent Conrad, his authority went unchallenged. After Emperor Frederick II died in December 1250, Conrad succeeded him in Sicily and Germany but was soon targeted by the political crusade proclaimed against his father. Amid struggles between mamluk factions and Ayyubid princes, Louis opened negotiations with the Egyptian leaders in early 1252. They pledged to free the hostages and restore much of the Kingdom of Jerusalem, but the pact failed that April when they made peace with An-Nasir, the Ayyubid ruler of Aleppo and Damascus. As unrest grew in France, Louis resolved to return. Before leaving the Holy Land on 24 April 1254, he secured a ten-year truce with An-Nasir and left a garrison of 100 knights at Acre under the distinguished soldier Geoffrey of Sergines.

Shortly after Louis IX left Egypt for Acre, Shajar al-Durr married the Bahri commander Aybak, who became the first Mamluk sultan, founding a regime that ruled Egypt for over 250 years. Unlike the hereditary Ayyubids, the Mamluks chose rulers from the military elite and vigorously waged jihad to expel the Franks from the Levant. In the 1250s the Bahri commander Baybars was exiled amid factional rivalries, while his rival Qutuz seized power in Egypt. Louis's return to France in 1254 left a power vacuum in Jerusalem. The regent Henry I had died the previous year and been succeeded in Cyprus by his infant son Hugh II. Soon after Louis's departure King Conrad died; his two-year-old son Conradin, though residing in Bavaria, was recognised as king of Jerusalem. In 1255 the Jerusalemite barons agreed a ten-year truce with Egypt, but the next year rivalry between Venice and Genoa sparked the War of Saint Sabas, dividing the merchant communities, military orders and aristocracy of the Crusader states. In 1258 the child Hugh II was named regent for Conradin, with his mother Plaisance of Antioch acting for him.
As some Mongols followed the Eastern Syriac (Nestorian) Church, hopes of an alliance led popes and Louis IX to send envoys to the Great Khans, who instead demanded submission. In 1258 Hulegu, the Mongol il-khan, sacked Baghdad and ended the Abbasid Caliphate. Bohemond VI of Antioch–Tripoli accepted Mongol suzerainty and joined their army to seize Damascus in 1260. Jerusalem's leaders, distrusting the Mongols, persuaded Pope Alexander IV to excommunicate Bohemond and proclaim an anti-Mongol crusade. Qutuz executed Hulegu's envoys, prompting a Mongol advance under the Christian commander Kitbuqa. Qutuz and Baybars reconciled and routed the Mongols at the Battle of Ain Jalut in 1260, then occupied Muslim Syria. Baybars murdered Qutuz in October and assumed power in Egypt.

Baybars allied with Hulegu's rival Berke Khan of the Golden Horde and from 1261 raided the Crusader states. He sacked Saint Symeon in 1262 and in 1263 destroyed the Church of the Annunciation in Nazareth and Acre's suburbs. After Hulegu's death in 1265 he launched a systematic conquest: seizing and razing Caesarea, Haifa and Arsuf, capturing Safed, and taking Jaffa, Beaufort and Antioch by 1268. His campaigns often included massacres of Christians.

Meanwhile, as the historian Jean Richard notes, "those who wanted to earn the crusade indulgence did not lack opportunities" in Europe. The papacy, regarding Emperor Frederick II's heirs as its main foes, used crusade proclamations to tax clerical revenues. After King Conrad's death his half-brother Manfred of Sicily became the target of a crusade. In northern Italy the Ghibelline (anti-papal) brothers Ezzelino and Alberico da Romano were crushed. The Nicaean reconquest of Constantinople prompted a crusade against Emperor Michael VIII Palaiologos, but the exiled Latin emperor Baldwin II failed to gain support. Only small contingents reached the Holy Land, among them French troops under Olivier de Termes, who replaced Geoffrey of Sergines in 1264, and Odo of Nevers with fifty knights in 1265. In 1264 Pope Urban IV granted Sicily to Charles of Anjou, who secured it by defeating Manfred at the Battle of Benevento in February 1266. Conradin—Manfred's nephew—tried to recover southern Italy but was crushed by a crusade at the Battle of Tagliacozzo in August 1268 and executed. Two rivals then claimed the throne of Jerusalem: Hugh III, successor to Hugh II in Cyprus, and his aunt Maria of Antioch. The Jerusalemite barons backed Hugh, but Maria maintained her claim.

Baybars's conquests revived crusading zeal in Europe. Pope Urban IV had called for a new crusade shortly before his death in 1264, but planning advanced mainly under his successor Clement IV after Charles of Anjou's triumph in southern Italy. Clement first proposed a passagium particulare—a modest, quickly raised force—to sail by April 1267, but postponed it when Louis IX of France again took the cross on 25 March 1267. Abaqa, the new Mongol il-khan, offered an alliance against the Mamluks, but war with his rival Baraq Khan prevented any Levantine campaign. Funding came from a three-year, 10% clerical levy, legacies, indulgence sales and exactions from Jewish bankers. Louis financed his vassals with gifts and loans. He persuaded the English crown prince Edward to take the cross despite Henry III's objections, and won James I of Aragon's pledge by diplomacy. Louis's leadership strengthened during the long sede vacante that followed Clement's death in November 1268. He hired Genoese ships and mediated peace between Genoa and Venice.
James sailed first from Barcelona on 4 September 1269, but a storm scattered his fleet and he soon abandoned the crusade. His two illegitimate sons, Fernando Sánchez and Pedro Fernández, reached Acre with fewer than 200 knights that October. They joined the French garrison but were ambushed by Baybars. Over 10,000 French crusaders sailed from Aigues-Mortes on 2 July 1270. Louis chose to attack Tunis, capital of the new Hafsid Caliphate in North Africa. Some historians, including Peter Lock, attribute this decision to his brother Charles of Anjou, while others, such as Christopher Tyerman, see it as Louis's own plan to secure a base for invading Egypt. The crusaders reached North Africa on 18 July and captured Carthage, but plague ravaged the camp. Louis died on 25 August, the day Charles arrived. Charles assumed command and on 1 November made peace with the caliph Muhammad: the crusaders withdrew in return for 210,000 gold ounces and guarantees of Christian worship and proselytism. English crusaders landed at Tunis as the French departed on 10 November. The fleets regrouped at Trapani, but a storm wrecked most of the ships. Philip III of France, Louis's successor, and Charles abandoned the crusade, but the English pressed on with Frisians and other small groups. The latter sailed directly for Acre, while the English wintered in Sicily, reaching Acre only on 9 May 1271. Meanwhile, Baybars seized Chastel Blanc from the Templars and Krak des Chevaliers and Gibelacar from the Hospitallers. Edward mounted no major operations; in June Baybars took Montfort Castle, the Franks' last inland stronghold, from the Teutonic Knights. On 12 May 1272 Baybars accepted an almost eleven-year truce, and Edward—now king of England—sailed from Acre in late September.

The Italian cardinal Tedaldo Visconti was at Acre when elected pope as Gregory X. Convinced of a divine mission to recover the Holy Land, he worked with Philip III of France to send small knightly expeditions as a prelude to a major crusade. In 1272–1273 he commissioned reports for an ecumenical council. One, the Collectio scandalis Ecclesiae ('Collection of Church Abuses') by a Franciscan, condemned clerical taxation, the redemption of crusader vows and secular feuds. The Dominican Humbert of Romans urged preaching as Christianity's principal means of conversion but upheld crusading as a duty to defend the faithful. The Franciscan Fidentius of Padua deemed crusades essential against Muslim obstinacy; the Dominican William of Tripoli preferred peaceful proselytism.

The Second Council of Lyon opened on 7 May 1274. Gregory X proclaimed a new crusade, setting 1278 as the departure date and funding it by taxing clerical income for six years. The plan drew criticism: the troubadour Folquet de Lunel accused the pope of seeking to divert crusading zeal against Christian foes. Threatened by Charles of Anjou's ambitions, Emperor Michael VIII acknowledged papal supremacy but failed to enforce church union at home. Gregory ordered Charles to renew his truce with Byzantium, and talks began on Byzantine participation in the crusade. In 1275 Philip III, the new German king Rudolf of Habsburg and his rival Ottokar II of Bohemia took the cross, but preparations collapsed with Gregory's death the following year. Yet the crusade tax remained in force. The subsequent period, as the historian Norman Housley observes, "was dominated by the ambitions of Charles of Anjou". In 1277 Charles purchased Maria of Antioch's claim to Jerusalem.
Her rival Hugh III had withdrawn to Cyprus, and the barons of Jerusalem did homage to Charles. In 1279 the Bahri veteran Qalawun seized power in Mamluk Egypt. Facing revolts and a Mongol invasion of Syria, he renewed the truce with the Crusader states in 1281. That year Charles, backed by Pope Martin IV after Michael VIII failed to implement the 1274 church union, pursued an anti-Byzantine policy. Martin granted crusade indulgences for Charles's planned campaign, but the Sicilian Vespers—a popular uprising—forced him to recall his troops in 1282. As Peter III of Aragon backed the rebels, Martin proclaimed a crusade against Aragon, diverting funds raised for the Holy Land. The withdrawal of Charles's troops allowed Hugh III to recover Tyre and Beirut. Meanwhile, Qalawun defeated the Mongols at the Second Battle of Homs and resumed the jihad against the Franks, seizing the Hospitallers' last fortress, Margat, taking Latakia and capturing Tripoli by 1289.

Tripoli's fall shocked the West. Pope Nicholas IV sent 4,000 livres tournois and 13 galleys to Acre; Edward I of England dispatched troops to strengthen its defences. A new crusade was preached, prompting a passagium particulare of 3,540 Italian infantry. In August 1290 they attacked Muslims at Acre despite the truce. In retaliation Qalawun's son Khalil besieged and took Acre on 28 May 1291. The last Frankish mainland strongholds soon fell, the final one, Château Pèlerin, being abandoned on 14 August. The conquests brought massacres and enslavement, and only a few escaped to Cyprus.

Aftermath

The fall of the Frankish East caused dismay rather than shock in the West, and Western Christians mostly blamed the Franks' alleged immorality. Pope Nicholas IV appealed to Edward I of England to lead a new crusade and imposed a tax to fund it, but the Gascon War with Philip IV of France ended English plans in 1294. Schemes to recover the Holy Land inspired treatises: Fidentius of Padua and Charles II of Naples urged a blockade of Egypt; James of Molay, the Templar Grand Master, called for a large crusade; and the Armenian prince-turned-monk Hayton of Corycus proposed a two-stage expedition. Amid frequent wars among the Catholic powers, the crusade declined into a mainly political instrument. In 1297 Pope Boniface VIII proclaimed crusades against his enemies, the Colonna cardinals and Frederick III of Sicily. His taxation of the clergy provoked conflict with France, leading to the pope's seizure by French troops in 1303. French influence grew, and the papal court moved to Avignon. In 1307 French officers arrested the Templars in France on charges of corruption and heresy; Pope Clement V could not prevent the dissolution of their Order in 1312. The Templar trials prompted new crusade plans, notably by the Frenchmen William of Nogaret and Pierre Dubois. In 1314 Philip IV took the cross with his sons and his son-in-law Edward II of England, but his death the next year ended the plan. Philip VI of France, the first Valois king, revived Levantine crusade proposals, but the Hundred Years' War halted preparations in 1336. The final crusade aimed at the Holy Land, the Alexandrian Crusade, was led by Peter I of Cyprus: in October 1365 the crusaders sacked the Egyptian port of Alexandria but withdrew after a week. Elsewhere crusading endured. In 1391 Duke Philip the Bold of Burgundy proposed a new crusade, and in 1396 a French and Burgundian army joined the Kingdom of Hungary in an invasion of the Ottoman Empire. The crusading army was crushed at Nicopolis.
In Iberia, crusading ended in 1492 with the fall of the Muslim Emirate of Granada to Castile and Aragon. In the Baltic, the Teutonic Knights waged anti-pagan wars, drawing crusaders from France, Germany and England until the 1410s. After the Reformation the papacy occasionally granted indulgences against Protestants, but more often Catholic powers formed papally sponsored Holy Leagues against the Ottoman Empire into the 17th century.

Legacy

The Crusades created national mythologies, tales of heroism, and a few place names. Historical parallelism and the tradition of drawing inspiration from the Middle Ages have become keystones of political Islam, encouraging ideas of a modern jihad and of a centuries-long struggle against Christian states, while secular Arab nationalism highlights the role of Western imperialism. Modern Muslim thinkers, politicians and historians have drawn parallels between the Crusades and political developments such as the establishment of Israel in 1948. Right-wing circles in the Western world have drawn opposing parallels, considering Christianity to be under an Islamic religious and demographic threat analogous to the situation at the time of the Crusades. Crusader symbols and anti-Islamic rhetoric are presented as an appropriate response, providing religious justification and inspiration for a struggle against a religious enemy.

Historiography

The historiography of the Crusades is the "history of the histories" written about them, from the Crusader period onwards. The subject is a complex one, with overviews provided in Select Bibliography of the Crusades, Modern Historiography, and Crusades (Bibliography and Sources). The histories describing the Crusades are broadly of three types: (1) primary sources, which include works written in the medieval period, generally by participants in a crusade or contemporaries of the events, together with letters and documents in archives and archaeological studies; (2) secondary sources, beginning with early consolidated works in the 16th century and continuing to modern times; and (3) tertiary sources, primarily encyclopedias, bibliographies and genealogies.

The primary sources for the Crusades are generally presented in the individual articles on each crusade and summarised in the list of sources for the Crusades. For the First Crusade, these include the original Latin chronicles, among them the Gesta Francorum and works by Albert of Aachen and Fulcher of Chartres; the Alexiad of the Byzantine princess Anna Komnene; The Complete History of the Muslim historian Ali ibn al-Athir; and the Chronicle of the Armenian historian Matthew of Edessa. Many of these and related texts are found in the collections Recueil des historiens des croisades (RHC) and Crusade Texts in Translation. The work of William of Tyre, Historia Rerum in Partibus Transmarinis Gestarum, and its continuations by later historians complete the foundational sources for the traditional Crusades; some of these works also provide insight into the later Crusades and the Crusader states. After the fall of Acre, the Crusades continued through the 16th century, and the principal references on this later period are the Wisconsin Collaborative History of the Crusades and Norman Housley's The Later Crusades, 1274–1580: From Lyons to Alcazar.
Complete bibliographies are also given in these works. The secondary sources on the Crusades began in the 16th century; one of the first uses of the term "crusades" came from the 17th-century French historian Louis Maimbourg in his Histoire des Croisades pour la délivrance de la Terre Sainte. Eighteenth-century works include Voltaire's Histoire des Croisades and Edward Gibbon's Decline and Fall of the Roman Empire, excerpted as The Crusades, A.D. 1095–1261; this edition also includes an essay on chivalry by Walter Scott, whose works helped popularise the Crusades. Early in the 19th century the French historian Joseph François Michaud published the monumental Histoire des Croisades, a major new narrative based on original sources. These histories have provided evolving views of the Crusades, discussed in detail in the Historiography section of Crusading movement. Modern works that serve as secondary source material include Louis Bréhier's numerous articles on the Crusades in the Catholic Encyclopedia; the work of Ernest Barker in the Encyclopædia Britannica (11th edition), later expanded into a separate publication; and The Crusades: An Encyclopedia (2006), edited by the historian Alan V. Murray.
========================================
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-44] | [TOKENS: 10728]
Contents PlayStation (console)

The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively known as the PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, in North America on 9 September 1995, in Europe on 29 September 1995, and in other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn.

Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced to the United Kingdom. 3D polygon graphics were placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, attracting the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006, over eleven years after it had been released and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units.

The PlayStation signalled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS One.

History

The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. He convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo stemmed from both his admiration of the Famicom and his conviction that video game consoles would become the main home entertainment systems. Although Kutaragi was nearly fired because he had worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé.

The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD".
The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony wanted to use its experience in consumer electronics to produce its own video game hardware. Although the initial agreement between Nintendo and Sony concerned a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible, Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving it a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to the music and film software that it had been aggressively pursuing as a secondary application.

The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising that it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over its licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail.

On the first day of the CES, Sony announced its partnership with Nintendo and their new console, the Play Station. At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop its own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as the company had broken an "unwritten law" against native companies turning on each other in favour of foreign ones.

Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them with the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony briefly halted its research but decided to build on the work begun with Nintendo and Sega, developing it into a console based on the SNES.
Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would retain a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony should not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992.

To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992 attended by Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. He was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Although Kutaragi won Ohga's enthusiasm, a majority of those present remained opposed, as did older Sony executives, who saw Nintendo and Sega as "toy" manufacturers. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded Ohga of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's most staunch supporters.

Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, both to keep the project alive and to maintain relationships with Philips for the MMCD development project. SMEJ's involvement proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.

According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that games with 3D imagery were possible. Maruyama said that Sony also wanted to emphasise the new console's ability to use Red Book audio from the CD-ROM format alongside high-quality visuals and gameplay.
Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback about the "PlayStation" name in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast to Nintendo's consoles, the console was not marketed under Sony's name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy".

Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic/Sony visited more than a hundred companies throughout Japan in May 1993 in the hope of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since it rivalled Sega in the arcade market. Attracting these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games of the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, and Namco subsequently based its Namco System 11 arcade board on PlayStation hardware and developed Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken following in September 1994.

Despite securing the support of various Japanese studios, Sony still had no development teams of its own while the PlayStation was in development. This changed in 1993, when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. Ian Hetherington, Psygnosis's co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced.
In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as the studio played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked the Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, the owners of SN Systems, had previously supplied development hardware for other platforms such as the Mega Drive, Atari ST, and SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented a prototype of their condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method of designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, a linker, and a debugger. SN Systems went on to produce development kits for future PlayStation systems, including the PlayStation 2, and was bought by Sony in 2005.

Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE offices in London, California, and Tokyo housed technical support teams that could work closely with third-party developers when needed. Unlike Nintendo, Sony did not favour its own products over those of other developers; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, whereas inexpensive compact disc manufacturing took place at dozens of locations around the world.

The PlayStation's architecture and its interconnectability with PCs were beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded software compatibility should further hardware revisions be made. Despite this inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect of development given the 3.5 megabyte restriction.
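Within such a hard memory budget, console programmers of the era commonly fell back on simple, deterministic allocation schemes rather than a general-purpose heap. The C sketch below is purely illustrative, with invented names and an arbitrary 512 KB figure; it is not drawn from any Sony library. It shows a fixed-capacity arena that hands out memory linearly and is reset wholesale, so fragmentation can never erode a tight budget:

#include <stddef.h>
#include <stdint.h>

#define ARENA_SIZE (512 * 1024)  /* hypothetical 512 KB budget for level data */

typedef struct {
    uint8_t buffer[ARENA_SIZE];  /* fixed backing store, no heap growth */
    size_t  used;                /* bytes handed out so far */
} Arena;

/* Hand out `size` bytes, rounded up to 4-byte alignment; return NULL
   when the budget is exhausted so overruns surface immediately. */
static void *arena_alloc(Arena *a, size_t size)
{
    size_t aligned = (size + 3u) & ~(size_t)3u;
    if (aligned > ARENA_SIZE - a->used)
        return NULL;
    void *p = &a->buffer[a->used];
    a->used += aligned;
    return p;
}

/* Release everything at once, e.g. between levels; there is no per-object free. */
static void arena_reset(Arena *a)
{
    a->used = 0;
}

Because allocation is a pointer bump and release is a single reset, the worst-case footprint is visible at a glance, which is exactly the property a hard 3.5 MB ceiling made valuable.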
Kutaragi said that while it would have been easy to double the amount of RAM in the PlayStation, the development team refrained from doing so to keep the retail cost down. He saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt that he and his team succeeded in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and the final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed.

Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with what was described as "stunning" success, accompanied by long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged him for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold it in the first few weeks owing to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan, compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700.

"When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock."

Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of Sony Interactive Entertainment, summoned SCEA head Steve Race to the conference stage; Race simply said "$299" and left to a round of applause. Attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely.

The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared with the Saturn's six launch games.
The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season against Sega's £4 million. Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent high street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of games to consoles sold was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games for the Saturn and 60 for the Nintendo 64.

In India, the PlayStation was given a limited test launch during 1999–2000 through Sony showrooms, selling 100 units. Sony launched the console countrywide (as the PS One model) on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, the registration of the trademark by a third company meant the console could not be launched officially, and the officially distributed Sega Saturn initially dominated the market; as the Saturn withdrew, however, PlayStation imports and large-scale piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation's user base grew to some 300,000 by January 2000, even though Sony China had no plans to release it there.

The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people growing into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans in which certain letters were replaced by the four geometric shapes from the controller's face buttons, rendered here as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E", the latter printed with a red "E" and read as "you are not ready". Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit.
Let me show you how ready I am.'" As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to target younger children as well.

Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo's and Sega's reliance on television campaigns, Glendenning theorised that young adults moving on from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence that early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity. Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be tested. The Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis's Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly spent at least £100,000 a year from a slush fund on impromptu marketing.

In 1996, Sony expanded its CD production facilities in the United States owing to the high demand for PlayStation games, increasing monthly output from 4 million to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead increased dramatically when both consoles dropped in price to $199 that year. The PlayStation outsold the Saturn at a similar ratio in Europe during 1996. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them remained "very close", with neither console leading in sales for any meaningful length of time.

In 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry. Although its launch was successful, the technically superior 128-bit console was unable to break Sony's dominance: Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in its new console was undermined when Japanese sales came in lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers.
The PlayStation continued to sell strongly at the turn of the millennium: in July 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2004, the PlayStation became the first console to ship 100 million units, a milestone the PlayStation 2 later reached even faster. The combined successes of the two consoles led Sega to retire the Dreamcast in 2001 and abandon the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3.

Hardware

The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix-maths coprocessor on the same die to provide the speed needed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, offers sampling rates of up to 44.1 kHz, and provides music sequencing. The console features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. It can output composite, S-Video or RGB video signals through its AV Multi connector (older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels; different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or link multiple consoles together; these were later removed due to lack of use. The PlayStation uses a proprietary video decompression unit, the MDEC, which is integrated into the CPU and allows the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transformation Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can generate up to 4,000 sprites per frame and render 180,000 textured polygons per second, in addition to 360,000 flat-shaded polygons per second.

The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units: the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model. Subsequent models saw the remaining rear connectors pared back further, with the final version retaining only one serial port.

Sony also marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries. The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software necessary to program PlayStation games and applications in C.
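Programming the console in C, whether on a Net Yaroze or a professional kit, meant working with the integer-only 3D maths of the hardware described above: the GTE computes in fixed point rather than floating point. The C fragment below is a purely illustrative software analogue of the kind of rotate-and-translate step the coprocessor performs in hardware; the type names and the 4.12 fixed-point layout are assumptions made for the example, not Sony's actual libraries or register interface.

#include <stdint.h>

#define FX_SHIFT 12                 /* 4.12 fixed point: 1.0 == 4096 */

typedef struct { int16_t x, y, z; } Vec3;   /* model-space vertex */
typedef struct {
    int16_t m[3][3];                /* 3x3 rotation matrix, 4.12 fixed point */
    int32_t t[3];                   /* translation vector, whole units */
} Mat3;

/* Rotate and translate one vertex: accumulate the fixed-point products
   in 64 bits, shift back down to whole units, then add the translation. */
static Vec3 rotate_translate(const Mat3 *mt, Vec3 v)
{
    const int32_t in[3] = { v.x, v.y, v.z };
    Vec3 out;
    int32_t res[3];
    for (int i = 0; i < 3; i++) {
        int64_t acc = (int64_t)mt->m[i][0] * in[0]
                    + (int64_t)mt->m[i][1] * in[1]
                    + (int64_t)mt->m[i][2] * in[2];
        res[i] = (int32_t)(acc >> FX_SHIFT) + mt->t[i];
    }
    out.x = (int16_t)res[0];
    out.y = (int16_t)res[1];
    out.z = (int16_t)res[2];
    return out;
}

On the real console the equivalent hardware operation completed in far fewer cycles than compiled C could manage, which is why the CPU "relies heavily" on the coprocessor for 3D work, as noted in the hardware description above.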
On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles—including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo Pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo Pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first controller, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square (△, ○, ✕, □). Rather than marking its buttons with traditionally used letters or numbers, the PlayStation controller established a set of trademark symbols which would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper to be used to access menus. The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used for instances when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking in the sticks), the Dual Analog Controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo had taken legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, its analogue sticks feature textured rubber grips, longer handles, and slightly different shoulder buttons, and rumble feedback is included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without inserting a game or without closing the CD tray, thereby accessing a graphical user interface (GUI) for the PlayStation BIOS. The GUI differs between the PlayStation and the PS One depending on the firmware version: the original PlayStation GUI has a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI has a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing the use of PlayStation BIOS files on a Sega console. Bleem!
was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R discs and optical drives with burning capability. To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector; the same system was also used to encode discs' regional lockouts (a conceptual sketch of this check appears at the end of this section). The signal was within Red Book CD tolerances, so the actual content of PlayStation discs could still be read by a conventional disc drive; however, such a drive could not detect the wobble frequency, and so duplicated discs omitted it, since the laser pick-up system of any optical disc drive interprets the wobble as an oscillation of the disc surface and compensates for it during reading. Early PlayStations, particularly early SCPH-1000 models, are prone to skipping during full-motion video or to emitting physical "ticking" noises. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction. The placement of the laser unit close to the power supply accelerates wear, as the additional heat makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions.
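The disc-authentication scheme described earlier in this section reduces to a simple boot decision: boot only if the pick-up decodes a wobble signature, and only if that signature matches the console's region. The C sketch below is a conceptual model of that logic, not Sony's firmware; the region strings ("SCEI", "SCEA", "SCEE") are the values widely reported in later analyses of the system, and every function name here is invented for illustration.

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Conceptual model of PlayStation disc authentication, not actual
   firmware. A pressed disc carries a wobble that decodes to a short
   region code; a burned copy reproduces the data but not the wobble,
   so decoding yields nothing. */

typedef enum { REGION_JAPAN, REGION_AMERICA, REGION_EUROPE } Region;

static const char *region_code(Region r) {
    switch (r) {
    case REGION_JAPAN:   return "SCEI";
    case REGION_AMERICA: return "SCEA";
    default:             return "SCEE";
    }
}

/* decoded: the string recovered from the wobble, or NULL when the
   pick-up found no modulation at all (i.e. a CD-R copy). */
static bool disc_boots(const char *decoded, Region console) {
    if (decoded == NULL)
        return false;                                /* copy: no wobble */
    return strcmp(decoded, region_code(console)) == 0;  /* region lock  */
}

int main(void) {
    printf("%d\n", disc_boots("SCEA", REGION_AMERICA)); /* 1: boots     */
    printf("%d\n", disc_boots("SCEE", REGION_AMERICA)); /* 0: wrong region */
    printf("%d\n", disc_boots(NULL,   REGION_AMERICA)); /* 0: burned copy  */
    return 0;
}
```

The elegance of the real scheme is that the check happens below the level a CD burner can even observe, which is why early piracy relied on modchips that injected the expected signal rather than on altering the disc.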
Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment stood at 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (released in the West as Battle Arena Toshinden), and Kileak: The Blood. The first two games available at the later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives of this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the hardware of Sega and Nintendo.
In May 1995, Famicom Tsūshin scored the console 19 out of 40, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities and Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily because third-party developers almost unanimously favoured it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success resulted in a significant financial boon for Sony, with profits from their video game division coming to contribute 23% of the company's total operating profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as a key factor in its mass success, and lauding it as a "game-changer in every sense possible".
In 2009, IGN ranked the PlayStation the seventh best console on its list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse-engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising to deliver a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and it ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared to two to three months. Further, the per-unit cost of production was far lower, allowing Sony to offer games at about 40% lower cost to the user compared to ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64: Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Grey_alien#cite_note-BiteTheDust-19] | [TOKENS: 2835]
Contents Grey alien Grey aliens, also referred to as Zeta Reticulans, Roswell Greys, or simply Greys,[a] are purported extraterrestrial beings. They are frequently featured in claims of close encounters and alien abduction. Greys are typically described as having small, humanoid bodies; smooth, grey skin; disproportionately large, hairless heads; and large, black, almond-shaped eyes. The 1961 Barney and Betty Hill abduction claim was key to the popularization of Grey aliens. Precursor figures had been described in science fiction, and similar descriptions appeared in later accounts of the 1947 Roswell UFO incident and early accounts of the 1948 Aztec UFO hoax. The Grey alien is cited as an archetypal image of an intelligent non-human creature and of extraterrestrial life in general, as well as an iconic trope of popular culture in the age of space exploration. Description Greys are typically depicted as grey-skinned, diminutive humanoid beings that possess reduced forms of, or completely lack, external human body parts such as noses, ears, or sex organs. Their bodies are usually depicted as elongated, with a small chest and lacking muscular definition and visible skeletal structure. Their legs are depicted as shorter and jointed differently from those of humans, with limbs proportionally different from a human's. Greys are depicted as having unusually large heads in proportion to their bodies, no hair, no noticeable outer ears or noses, and only small orifices for ears, nostrils, and mouths. In drawings, Greys are almost always shown with very large, opaque, black eyes without eye whites. They are frequently described as shorter than average adult humans. The association between Grey aliens and Zeta Reticuli originated with the interpretation of a map drawn by Betty Hill, carried out by a schoolteacher named Marjorie Fish sometime in 1969. Betty Hill, under hypnosis, had claimed to have been shown a map that displayed the aliens' home system and nearby stars. Upon learning of this, Fish attempted to create a model from a drawing produced by Hill, eventually determining that the stars marked as the aliens' home were Zeta Reticuli, a binary star system. History In literature, descriptions of beings similar to Grey aliens predate claims of supposed encounters with them. In 1893, H. G. Wells presented a description of humanity's future appearance in the article "The Man of the Year Million", describing humans as having no mouths, noses, or hair, and with large heads. In 1895, Wells also depicted the Eloi, a successor species to humanity, in similar terms in the novel The Time Machine. Both share many characteristics with later perceptions of Greys. As early as 1917, the occultist Aleister Crowley described a meeting with a "preternatural entity" named Lam that was similar in appearance to a modern Grey. Crowley claimed to have contacted Lam through a process called the "Amalantrah Workings", which he believed allowed humans to contact beings from outer space and across dimensions. Other occultists and ufologists, many of whom have retroactively linked Lam to later Grey encounters, have since described their own visitations from him, with one describing the being as a "cold, computer-like intelligence" utterly beyond human comprehension. ...the creatures did not resemble any race of humans. They were short, shorter than the average Japanese, and their heads were big and bald, with strong, square foreheads, and very small noses and mouths, and weak chins.
What was most extraordinary about them were the eyes—large, dark, gleaming, with a sharp gaze. They wore clothes made of soft grey fabric, and their limbs seemed to be similar to those of humans. In 1933, the Swedish novelist Gustav Sandgren, using the pen name Gabriel Linde, published a science fiction novel called Den okända faran (The Unknown Danger), in which he describes a race of extraterrestrials who wore clothes made of soft grey fabric and were short, with big bald heads and large, dark, gleaming eyes. The novel, aimed at young readers, included illustrations of the imagined aliens. This description would become the template upon which the popular image of Grey aliens is based. The conception remained a niche one until 1965, when newspaper reports of the Betty and Barney Hill abduction made the archetype famous. The alleged abductees, Betty and Barney Hill, claimed that in 1961, humanoid alien beings with greyish skin had abducted them and taken them to a flying saucer. In his 1990 article "Entirely Unpredisposed", Martin Kottmeyer suggested that Barney's memories revealed under hypnosis might have been influenced by an episode of the science-fiction television show The Outer Limits titled "The Bellero Shield", which was broadcast 12 days before Barney's first hypnotic session. The episode featured an extraterrestrial with large eyes, who says, "In all the universes, in all the unities beyond the universes, all who have eyes have eyes that speak." The report from the regression featured a scenario that was in some respects similar to the television show. In part, Kottmeyer wrote: Wraparound eyes are an extreme rarity in science fiction films. I know of only one instance. They appeared on the alien of an episode of an old TV series The Outer Limits entitled "The Bellero Shield." A person familiar with Barney's sketch in "The Interrupted Journey" and the sketch done in collaboration with the artist David Baker will find a "frisson" of "déjà vu" creeping up his spine when seeing this episode. The resemblance is much abetted by an absence of ears, hair, and nose on both aliens. Could it be by chance? Consider this: Barney first described and drew the wraparound eyes during the hypnosis session dated 22 February 1964. "The Bellero Shield" was first broadcast on 10 February 1964. Only twelve days separate the two instances. If the identification is admitted, the commonness of wraparound eyes in the abduction literature falls to cultural forces. — Martin Kottmeyer, Entirely Unpredisposed: The Cultural Background of UFO Reports Carl Sagan echoed Kottmeyer's suspicions in his 1995 book The Demon-Haunted World: Science as a Candle in the Dark, where Invaders from Mars was cited as another potential inspiration. After the Hills' encounter, Greys would go on to become an integral part of ufology and other extraterrestrial-related folklore. This is particularly true in the case of the United States: according to journalist C. D. B. Bryan, 73% of all reported alien encounters in the United States describe Grey aliens, a significantly higher proportion than in other countries.: 68 During the early 1980s, Greys were linked to the alleged crash-landing of a flying saucer in Roswell, New Mexico, in 1947. A number of publications contained statements from individuals who claimed to have seen the U.S. military handling a number of unusually proportioned, bald, child-sized beings.
These individuals claimed, during and after the incident, that the beings had oversized heads and slanted eyes, but scant other distinguishable facial features. In 1987, novelist Whitley Strieber published the book Communion, which, unlike his previous works, was categorized as non-fiction, and in which he describes a number of close encounters he alleges to have experienced with Greys and other extraterrestrial beings. The book became a New York Times bestseller, and New Line Cinema released a 1989 film adaptation starring Christopher Walken as Strieber. In 1988, Christophe Dechavanne interviewed the French science-fiction writer and ufologist Jimmy Guieu on TF1's Ciel, mon mardi !. Besides mentioning Majestic 12, Guieu described the existence of what he called "the little greys", which later became better known in French as les Petits-Gris. Guieu went on to write two docudramas whose plot drew on the Grey aliens/Majestic 12 conspiracy theory as described by John Lear and Milton William Cooper, forming the series "E.B.E." (for "Extraterrestrial Biological Entity"): E.B.E.: Alerte rouge (first part, 1990) and E.B.E.: L'entité noire d'Andamooka (second part, 1991).[citation needed] Greys have since become the subject of many conspiracy theories. Many conspiracy theorists believe that Greys represent part of a government-led disinformation or plausible-deniability campaign, or that they are a product of government mind-control experiments. During the 1990s, popular culture also began to increasingly link Greys to a number of military-industrial complex and New World Order conspiracy theories. In 1995, filmmaker Ray Santilli claimed to have obtained 22 reels of 16 mm film that depicted the autopsy of a "real" Grey supposedly recovered from the site of the 1947 incident in Roswell. In 2006, though, Santilli announced that the film was not original, but was instead a "reconstruction" created after the original film was found to have degraded. He maintained that a real Grey had been found and autopsied on camera in 1947, and that the footage released to the public contained a percentage of that original footage. Analysis Greys are often involved in alien abduction claims. Among reports of alien encounters, Greys make up about 50% in Australia, 73% in the United States, 48% in continental Europe, and around 12% in the United Kingdom.: 68 These reports include two distinct groups of Greys that differ in height.: 74 Abduction claims are often described as extremely traumatic, similar to an abduction by humans or even a sexual assault in the level of trauma and distress. The emotional impact of perceived abductions can be as great as that of combat, sexual abuse, and other traumatic events. The eyes are often a focus of abduction claims, which often describe a Grey staring into the eyes of an abductee when conducting mental procedures. This staring is claimed to induce hallucinogenic states or directly provoke different emotions. Neurologist Steven Novella proposes that Grey aliens are a byproduct of the human imagination, with the Greys' most distinctive features representing everything that modern humans traditionally link with intelligence: "The aliens, however, do not just appear as humans, they appear like humans with those traits we psychologically associate with intelligence." In 2005, Frederick V. Malmstrom, writing in Skeptic magazine (volume 11, issue 4), presented his idea that Greys are actually residual memories of early childhood development.
Malmstrom reconstructs the face of a Grey by transforming a mother's face according to our best understanding of early-childhood sensation and perception. Malmstrom's study thus offers an alternative explanation for the image of the Greys, for the intense instinctive response many people experience when presented with an image of a Grey, and for the common themes that emerge when regression hypnosis and recovered-memory therapy are used to "recover" memories of alien abduction experiences. According to biologist Jack Cohen, the typical image of a Grey, assuming that it would have evolved on a world with environmental and ecological conditions different from Earth's, is too physiologically similar to a human to be credible as a representation of an alien. The interdimensional hypothesis, the cryptoterrestrial hypothesis, and the time-traveller hypothesis attempt to provide alternative explanations for the humanoid anatomy and behavior of these alleged beings. In popular culture Depictions of Grey aliens have gone on to appear in a number of films and television shows, supplanting the previously popular little green men. As early as 1966, for example, the superhero character Ultraman was explicitly based on them, and in 1977 they were featured in Close Encounters of the Third Kind. Greys have also been worked into space opera and other interstellar settings: in Babylon 5, the Greys are referred to as the "Vree" and are depicted as allies and trade partners of 23rd-century Earth, while in the Stargate franchise they are called the "Asgard" and depicted as ancient astronauts allied with modern-day Earth.[citation needed] South Park refers to them as "visitors". During the 1990s, plotlines wherein Greys were linked to conspiracy theories became common. A well-known example is the Fox television series The X-Files, which first aired in 1993. It combined the quest to find proof of the existence of Grey-like extraterrestrials with a number of UFO conspiracy theory subplots to form its primary story arc. Other notable examples include the XCOM video game franchise (where they are called "Sectoids"); Dark Skies, first broadcast in 1996, which expanded upon the MJ-12 conspiracy;[citation needed] and American Dad!, which features a Grey-like alien named Roger, whose backstory draws from both the Roswell incident and Area 51 conspiracy theories. The 2011 film Paul tells the story of a Grey named Paul who attributes the Greys' frequent presence in science-fiction pop culture to the US government deliberately inserting the stereotypical Grey alien image into mainstream media, so that if humanity came into contact with Paul's species, no immediate shock would occur at their appearance. Child abduction by Greys is a key plot point in the 2013 film Dark Skies. Greys appear in Syfy's 2021 science fiction dramedy series Resident Alien. The Greys appear as the main antagonistic faction in the 2023 independent game Greyhill Incident.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Automata] | [TOKENS: 5062]
Contents Automaton An automaton (/ɔːˈtɒmətən/ ⓘ; pl.: automata or automatons) is a relatively self-operating machine or control mechanism designed to automatically follow a sequence of operations or respond to predetermined instructions. Some automata, such as bellstrikers in mechanical clocks, are designed to give the casual observer the illusion that they are operating under their own power or will, like a mechanical robot. The term has long been commonly associated with automated puppets that resemble moving humans or animals, built to impress and/or entertain people. Animatronics are a modern type of automata with electronics, often used for the portrayal of characters or creatures in films and in theme park attractions. Etymology The word automaton is the latinization of the Ancient Greek automaton (αὐτόματον), which means "acting of one's own will". It was first used by Homer to describe the automatic opening of doors and the automatic movement of wheeled tripods. It is more often used to describe non-electronic moving machines, especially those that have been made to resemble human or animal actions, such as the jacks on old public striking clocks, or the cuckoo and any other animated figures on a cuckoo clock. History There are many examples of automata in Greek mythology: Hephaestus created automata for his workshop; Talos was an artificial man of bronze; King Alkinous of the Phaiakians employed gold and silver watchdogs. According to Aristotle, Daedalus used quicksilver to make his wooden statue of Aphrodite move; in other Greek legends he used quicksilver to give voice to his moving statues. The automata of the Hellenistic world were intended as tools, toys, religious spectacles, or prototypes for demonstrating basic scientific principles. Numerous water-powered automata were built by Ktesibios, a Greek inventor and the first head of the Great Library of Alexandria; for example, he "used water to sound a whistle and make a model owl move. He had invented the world's first 'cuckoo clock'".[a] This tradition continued in Alexandria with inventors such as the Greek mathematician Hero of Alexandria (sometimes known as Heron), whose writings on hydraulics, pneumatics, and mechanics described siphons, a fire engine, a water organ, the aeolipile, and a programmable cart. Philo of Byzantium was likewise famous for his inventions. Complex mechanical devices are known to have existed in Hellenistic Greece, though the only surviving example is the Antikythera mechanism, the earliest known analog computer. The clockwork was long thought to have come originally from Rhodes, where there was apparently a tradition of mechanical engineering and whose automata are celebrated in Pindar's seventh Olympic Ode. However, information gleaned from recent scans of the fragments indicates that the mechanism may instead have come from the colonies of Corinth in Sicily, which implies a connection with Archimedes. According to Jewish legend, King Solomon used his wisdom to design a throne with mechanical animals which hailed him as king when he ascended it; upon his sitting down, an eagle would place a crown upon his head, and a dove would bring him a Torah scroll. It is also said that when King Solomon stepped upon the throne, a mechanism was set in motion. As soon as he stepped upon the first step, a golden ox and a golden lion each stretched out one foot to support him and help him rise to the next step. On each side, the animals helped the King up until he was comfortably seated upon the throne.
In ancient China, a curious account of automata is found in the Lie Zi text, believed to have originated around 400 BCE and been compiled around the fourth century CE. Within it there is a description of a much earlier encounter between King Mu of Zhou (1023–957 BCE) and a mechanical engineer known as Yan Shi, an 'artificer'. The latter proudly presented the king with a very realistic and detailed life-size, human-shaped figure of his mechanical handiwork: The king stared at the figure in astonishment. It walked with rapid strides, moving its head up and down, so that anyone would have taken it for a live human being. The artificer touched its chin, and it began singing, perfectly in tune. He touched its hand, and it began posturing, keeping perfect time...As the performance was drawing to an end, the robot winked its eye and made advances to the ladies in attendance, whereupon the king became incensed and would have had Yen Shih [Yan Shi] executed on the spot had not the latter, in mortal fear, instantly taken the robot to pieces to let him see what it really was. And, indeed, it turned out to be only a construction of leather, wood, glue and lacquer, variously coloured white, black, red and blue. Examining it closely, the king found all the internal organs complete—liver, gall, heart, lungs, spleen, kidneys, stomach and intestines; and over these again, muscles, bones and limbs with their joints, skin, teeth and hair, all of them artificial...The king tried the effect of taking away the heart, and found that the mouth could no longer speak; he took away the liver and the eyes could no longer see; he took away the kidneys and the legs lost their power of locomotion. The king was delighted. Other notable examples of automata include Archytas' dove, mentioned by Aulus Gellius. Similar Chinese accounts of flying automata are written of the 5th-century BC Mohist philosopher Mozi and his contemporary Lu Ban, who, according to the Han Fei Zi and other texts, made artificial wooden birds (mu yuan) that could successfully fly. The manufacturing tradition of automata continued in the Greek world well into the Middle Ages. On his visit to Constantinople in 949, the ambassador Liutprand of Cremona described automata in the emperor Theophilos' palace, including "lions, made either of bronze or wood covered with gold, which struck the ground with their tails and roared with open mouth and quivering tongue," "a tree of gilded bronze, its branches filled with birds, likewise made of bronze gilded over, and these emitted cries appropriate to their species" and "the emperor's throne" itself, which "was made in such a cunning manner that at one moment it was down on the ground, while at another it rose higher and was to be seen up in the air." Similar automata in the throne room (singing birds, roaring and moving lions) were described by Liutprand's contemporary, the Byzantine emperor Constantine Porphyrogenitus, in his book De Ceremoniis (Perì tês Basileíou Tákseōs). In the mid-8th century, the first wind-powered automata were built: "statues that turned with the wind over the domes of the four gates and the palace complex of the Round City of Baghdad". The "public spectacle of wind-powered statues had its private counterpart in the 'Abbasid palaces where automata of various types were predominantly displayed."
Also in the 8th century, the Muslim alchemist Jābir ibn Hayyān (Geber) included recipes for constructing artificial snakes, scorpions, and humans that would be subject to their creator's control in his coded Book of Stones. In 827, the Abbasid caliph al-Ma'mun had a silver and golden tree in his palace in Baghdad, which had the features of an automatic machine: on its swinging branches sat metal birds, built by Muslim inventors and engineers, that sang automatically.[page needed] The Abbasid caliph al-Muqtadir likewise had a silver and golden tree in his palace in Baghdad in 917, with birds on it flapping their wings and singing. In the 9th century, the Banū Mūsā brothers invented a programmable automatic flute player, which they described in their Book of Ingenious Devices. Al-Jazari described complex programmable humanoid automata, among other machines he designed and constructed, in the Book of Knowledge of Ingenious Mechanical Devices in 1206. One of his automata was a boat with four automatic musicians that floated on a lake to entertain guests at royal drinking parties. Its mechanism included a programmable drum machine with pegs (cams) that bump into little levers that operate the percussion; the drummer could be made to play different rhythms and drum patterns by moving the pegs around (a conceptual sketch of this idea follows below). Al-Jazari also constructed a hand-washing automaton that was the first to employ the flush mechanism now used in modern toilets: it features a female automaton standing by a basin filled with water, and when the user pulls the lever, the water drains and the automaton refills the basin. His "peacock fountain" was another, more sophisticated hand-washing device featuring humanoid automata as servants who offer soap and towels. Mark E. Rosheim describes it as follows: "Pulling a plug on the peacock's tail releases water out of the beak; as the dirty water from the basin fills the hollow base a float rises and actuates a linkage which makes a servant figure appear from behind a door under the peacock and offer soap. When more water is used, a second float at a higher level trips and causes the appearance of a second servant figure—with a towel!" Al-Jazari thus appears to have been the first inventor to display an interest in creating human-like machines for practical purposes such as manipulating the environment for human comfort. Lamia Balafrej has also pointed out the prevalence of the figure of the automated slave in al-Jazari's treatise. Automated slaves were a frequent motif in ancient and medieval literature, but it was not so common to find them described in a technical book. Balafrej has also written about automated female slaves, which appeared in timekeepers and as liquid-serving devices in medieval Arabic sources, suggesting a link between feminized forms of labor like housekeeping, medieval slavery, and the imaginary of automation. In 1066, the Chinese inventor Su Song built a water clock in the form of a tower, which featured mechanical figurines that chimed the hours. Samarangana Sutradhara, a Sanskrit treatise by Bhoja (11th century), includes a chapter about the construction of mechanical contrivances (automata), including mechanical bees and birds, fountains shaped like humans and animals, and male and female dolls that refilled oil lamps, danced, played instruments, and re-enacted scenes from Hindu mythology.[better source needed]
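Al-Jazari's peg drum, described above, separates a fixed playing mechanism from an interchangeable "program": the peg pattern. The short C sketch below models that idea, with the pegs as a data table and one revolution of the drum as a loop; the names and the particular rhythm are invented for illustration and are not taken from al-Jazari's treatise.

```c
#include <stdio.h>

/* A minimal model of al-Jazari's programmable peg drum: the "program"
   is the peg pattern, and the playing mechanism never changes. */

#define STEPS  8   /* positions around the drum's circumference */
#define LEVERS 3   /* percussion levers the pegs can strike     */

/* 1 = peg present: as the drum turns, the peg trips that lever. */
static const int pegs[LEVERS][STEPS] = {
    {1, 0, 0, 0, 1, 0, 0, 0},   /* low drum  */
    {0, 0, 1, 0, 0, 0, 1, 0},   /* high drum */
    {1, 1, 1, 1, 1, 1, 1, 1},   /* cymbal    */
};

int main(void) {
    /* One full revolution of the drum: each column is one beat. */
    for (int step = 0; step < STEPS; step++) {
        printf("beat %d:", step + 1);
        for (int lever = 0; lever < LEVERS; lever++)
            if (pegs[lever][step])
                printf(" lever%d", lever);
        printf("\n");
    }
    return 0;   /* changing the rhythm means changing only pegs[] */
}
```

The design point is the same one historians credit to the device: behaviour is stored in a replaceable configuration rather than in the machine itself, anticipating punched cards and, much later, stored programs.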
Villard de Honnecourt, in his 1230s sketchbook, depicted an early escapement mechanism in a drawing titled How to make an angel keep pointing his finger toward the Sun, showing an angel that would perpetually turn to face the sun. He also drew an automaton of a bird with jointed wings, a design that was later implemented in clocks. At the end of the thirteenth century, Robert II, Count of Artois, built a pleasure garden at his castle at Hesdin that incorporated several automata as entertainment in the walled park. The work was conducted by local workmen and overseen by the Italian knight Renaud Coignet. It included monkey marionettes, a sundial supported by lions and "wild men", mechanized birds, mechanized fountains and a bellows-operated organ. The park was famed for its automata well into the fifteenth century, before it was destroyed by English soldiers in the sixteenth century. The Chinese author Xiao Xun wrote that when the Ming dynasty founder Hongwu (r. 1368–1398) was destroying the palaces of Khanbaliq belonging to the previous Yuan dynasty, automata in the shape of tigers were found among many other mechanical devices. The Renaissance witnessed a considerable revival of interest in automata. Hero's treatises were edited and translated into Latin and Italian. Hydraulic and pneumatic automata, similar to those described by Hero, were created for garden grottoes. In 1420, Giovanni Fontana, a Paduan engineer, developed the Bellicorum instrumentorum liber,[b] which includes a puppet of a camelid driven by a clothed primate twice the height of a human being, and an automaton of Mary Magdalene. He also created mechanical devils and rocket-propelled animal automata. While functional, early clocks were also often designed as novelties and spectacles which integrated features of automata. Many big and complex clocks with automated figures were built as public spectacles in European town centres. One of the earliest of these large clocks was the Strasbourg astronomical clock, built in the 14th century, which takes up an entire side of a cathedral wall. It contained an astronomical calendar and automata depicting animals, saints and the life of Christ. The mechanical rooster of the Strasbourg clock was active from 1352 to 1789. The clock still functions to this day, but has undergone several restorations since its initial construction. The Prague astronomical clock was built in 1410; animated figures were added from the 17th century onwards. Numerous clockwork automata were manufactured in the 16th century, principally by the goldsmiths of the Free Imperial Cities of central Europe. These wondrous devices found a home in the cabinets of curiosities, or Wunderkammern, of the princely courts of Europe. In 1454, Duke Philip the Good of Burgundy staged an entertainment known as the extravagant Feast of the Pheasant, which was intended to influence the Duke's peers to participate in a crusade against the Ottomans but ended up being a grand display of automata, giants, and dwarves. A banquet in Camilla of Aragon's honor in Italy, 1475, featured a lifelike automated camel; the spectacle was part of a larger parade which continued over several days. Leonardo da Vinci sketched a complex mechanical knight, which he may have built and exhibited at a celebration hosted by Ludovico Sforza at the court of Milan around 1495. The design of Leonardo's robot was not rediscovered until the 1950s. A functional replica was later built that could move its arms, twist its head, and sit up.
Da Vinci is frequently credited with constructing a mechanical lion, which he presented to King Francis I in Lyon in 1515. Although no record of the device's original design remains, a recreation of the piece is housed at the Château du Clos Lucé. The Smithsonian Institution has in its collection a clockwork monk, about 15 in (380 mm) high, possibly dating as early as 1560. The monk is driven by a key-wound spring and walks the path of a square, striking his chest with his right arm while raising and lowering a small wooden cross and rosary in his left hand, turning and nodding his head, rolling his eyes, and mouthing silent obsequies. From time to time, he brings the cross to his lips and kisses it. It is believed that the monk was manufactured by Juanelo Turriano, mechanician to the Holy Roman Emperor Charles V. The first description of a modern cuckoo clock was by the Augsburg nobleman Philipp Hainhofer in 1629; the clock belonged to Prince Elector August von Sachsen. By 1650, the workings of mechanical cuckoos were understood and were widely disseminated through Athanasius Kircher's handbook on music, Musurgia Universalis, which contains the first documented description of how a mechanical cuckoo works, as part of a mechanical organ with several automated figures. In 18th-century Germany, clockmakers began making cuckoo clocks for sale, and clock shops selling cuckoo clocks became commonplace in the Black Forest region by the middle of the 18th century. Japan adopted clockwork automata in the early 17th century as "karakuri" puppets. In 1662, Takeda Omi completed his first butai karakuri and then built several of these large puppets for theatrical exhibitions. Karakuri puppets went through a golden age during the Edo period (1603–1867). A new attitude towards automata is to be found in René Descartes, who suggested that the bodies of animals are nothing more than complex machines – the bones, muscles and organs could be replaced with cogs, pistons, and cams. Thus mechanism became the standard to which Nature and the organism were compared. France in the 17th century was the birthplace of those ingenious mechanical toys that were to become prototypes for the engines of the Industrial Revolution. Thus, in 1649, when Louis XIV was still a child, François-Joseph de Camus designed for him a miniature coach, complete with horses and footmen, a page, and a lady within the coach; all these figures exhibited perfect movement. According to Labat, General de Gennes constructed in 1688, in addition to machines for gunnery and navigation, a peacock that walked and ate. Athanasius Kircher produced many automata to create Jesuit shows, including a statue which spoke and listened via a speaking tube. The world's first successfully built biomechanical automaton is considered to be The Flute Player, created by the French engineer Jacques de Vaucanson in 1737, which could play twelve songs. He also constructed The Tambourine Player and the Digesting Duck, a mechanical duck that – apart from quacking and flapping its wings – gave the illusion of eating and defecating, seeming to endorse Cartesian ideas that animals are no more than machines of flesh. In 1769, a chess-playing machine called the Turk, created by Wolfgang von Kempelen, made the rounds of the courts of Europe purporting to be an automaton.: 34 The Turk beat Benjamin Franklin in a game of chess when Franklin was ambassador to France.: 34–35 The Turk was actually operated from inside by a hidden human operator, and was not a true automaton.
Other 18th-century automaton makers include the prolific Swiss Pierre Jaquet-Droz (see Jaquet-Droz automata), his son Henri-Louis Jaquet-Droz, and their contemporary Henri Maillardet. Maillardet, a Swiss mechanic, created an automaton capable of drawing four pictures and writing three poems; Maillardet's Automaton is now part of the collections at the Franklin Institute Science Museum in Philadelphia. Belgian-born John Joseph Merlin created the mechanism of the Silver Swan automaton, now at the Bowes Museum. A musical elephant made by the French clockmaker Hubert Martinet in 1774 is one of the highlights of Waddesdon Manor. Tipu's Tiger is another late-18th-century example of automata, made for Tipu Sultan, featuring a European soldier being mauled by a tiger. Catherine the Great of Russia was gifted a very large and elaborate Peacock Clock created by James Cox in 1781; it is now on display in the Hermitage Museum in Saint Petersburg. According to philosopher Michel Foucault, Frederick the Great, king of Prussia from 1740 to 1786, was "obsessed" with automata. According to Manuel de Landa, "he put together his armies as a well-oiled clockwork mechanism whose components were robot-like warriors". In 1801, Joseph Marie Jacquard built his automated loom, which was controlled autonomously with punched cards. Automata, particularly watches and clocks, were popular in China during the 18th and 19th centuries, and items were produced for the Chinese market. Strong interest from Chinese collectors in the 21st century brought many interesting items to market, where they have realized dramatic prices. The famous magician Jean-Eugène Robert-Houdin (1805–1871) was known for creating automata for his stage shows.: 33 Automata that acted according to a set of preset instructions were popular with magicians during this time.: 33 In 1840, Italian inventor Innocenzo Manzetti constructed a flute-playing automaton in the shape of a man, life-size, seated on a chair. Hidden inside the chair were levers, connecting rods and compressed-air tubes, which made the automaton's lips and fingers move on the flute according to a program recorded on a cylinder similar to those used in player pianos. The automaton was powered by clockwork and could perform 12 different arias. As part of the performance, it would rise from the chair, bow its head, and roll its eyes. The period between 1860 and 1910 is known as "The Golden Age of Automata". Mechanical coin-operated fortune tellers were introduced to boardwalks in Britain and America. In Paris during this period, many small family-based companies of automata makers thrived; from their workshops they exported thousands of clockwork automata and mechanical singing birds around the world. Although now rare and expensive, these French automata attract collectors worldwide. The main French makers were Bontems, Lambert, Phalibois, Renou, Roullet & Decamps, Theroude and Vichy. Abstract automata theory started in the mid-20th century with finite automata; it is applied in branches of formal and natural science including computer science, physics, and biology, as well as linguistics. Contemporary automata continue this tradition with an emphasis on art rather than technological sophistication. Contemporary automata are represented by the works of Cabaret Mechanical Theatre in the United Kingdom, Thomas Kuntz, Arthur Ganson, Joe Jones and Le Défenseur du Temps by French artist Jacques Monestier.
Since 1990, the Dutch artist Theo Jansen has been building large automated PVC structures called strandbeest (beach animals) that can walk on wind power or compressed air. Jansen claims that he intends them to evolve automatically and develop artificial intelligence, with herds roaming freely over the beach. The British sculptor Sam Smith (1908–1983) was a well-known maker of automata. In 2016, the NASA Innovative Advanced Concepts program studied a rover, the Automaton Rover for Extreme Environments (AREE), designed to survive for an extended time in Venus's environmental conditions. Unlike other modern rovers, AREE is an automaton rather than a robot for practical reasons: Venus's harsh conditions, particularly its surface temperature of 462 °C (864 °F), make operating electronics there for any significant time impossible. It would be controlled by a mechanical computer and driven by wind power.

Clocks

Automaton clocks are clocks which feature automata within or around the housing and typically activate around the beginning of each hour, at each half hour, or at each quarter hour. They were largely produced from the 1st century BC to the end of the Victorian era in Europe. Older clocks typically featured religious figures or other mythical characters such as Death or Father Time. As time progressed, however, automaton clocks began to feature characters influential at the time of creation, such as kings, famous composers, or industrialists. Examples of automaton clocks include chariot clocks and cuckoo clocks. The Cuckooland Museum exhibits automaton clocks. While automaton clocks are largely perceived as belonging to medieval Europe, they are largely produced in Japan today.

In automata theory, clocks are regarded as timed automata, a type of finite automaton. That automaton clocks are finite means that they have a fixed number of states in which they can exist: the number of combinations possible on a clock with hour, minute, and second hands, which is 12 × 60 × 60 = 43,200. The label "timed automaton" indicates that the automaton changes state at a set rate, which for clocks is one state change every second. A clock automaton takes as input only the time displayed by the previous state, and uses this input to produce the next state, a display of time one second later. Clock automata often also use the previous state's input to 'decide' whether the next state requires merely moving the hands on the clock, or whether a special function is required, such as a mechanical bird popping out of a house as in cuckoo clocks. This choice is evaluated through the position of complex gears, cams, axles, and other mechanical devices within the automaton.
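A minimal sketch of this state-machine view, in Python: the state is the triple of hand positions, a transition function advances it by one second, and an output predicate decides when the "special function" (the cuckoo) fires. The names and the on-the-hour trigger are illustrative assumptions, not a description of any particular clock's mechanism.

```python
# Model of an automaton clock as a finite, timed automaton:
# 12 * 60 * 60 = 43,200 possible states, one transition per second.

from typing import NamedTuple

class ClockState(NamedTuple):
    hour: int    # 1..12
    minute: int  # 0..59
    second: int  # 0..59

def step(state: ClockState) -> ClockState:
    """Transition function: take the previous state as input and
    produce the state one second later, wrapping at each limit."""
    h, m, s = state
    s += 1
    if s == 60:
        s, m = 0, m + 1
    if m == 60:
        m, h = 0, h + 1
    if h == 13:
        h = 1
    return ClockState(h, m, s)

def special_function(state: ClockState) -> bool:
    """Decide whether this state needs more than moving the hands:
    here, the mechanical bird pops out on the hour (an assumption)."""
    return state.minute == 0 and state.second == 0

# The state space really is 12 * 60 * 60 = 43,200 states.
assert 12 * 60 * 60 == 43_200

# Walk the automaton across an hour boundary to see the cuckoo fire.
state = ClockState(hour=12, minute=59, second=58)
for _ in range(3):
    state = step(state)
    if special_function(state):
        print(f"{state.hour:02d}:{state.minute:02d}:{state.second:02d} - cuckoo!")
```

Run from 12:59:58, the sketch steps across the hour boundary and prints a single cuckoo at 01:00:00; in a physical clock the same "choice" is made by the positions of gears and cams rather than by a predicate.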
========================================
[SOURCE: https://en.wikipedia.org/wiki/Saladin] | [TOKENS: 15674]
Saladin

Salah ad-Din Yusuf ibn Ayyub[a] (c. 1137 – 4 March 1193), commonly known as Saladin,[b] was a Kurdish commander and political leader. He was the founder of the Ayyubid dynasty and the first sultan of both Egypt and Syria. An important figure of the Third Crusade, he spearheaded the Muslim military effort against the Crusader states in the Levant. At the height of his power, the Ayyubid realm spanned Egypt, Syria, Upper Mesopotamia, the Hejaz, Yemen, and Nubia.

Alongside his uncle Shirkuh, a Kurdish mercenary commander in service of the Zengid dynasty, Saladin was sent to Fatimid Egypt in 1164 on the orders of the Zengid ruler Nur ad-Din. With their original purpose being to help restore Shawar as vizier to the teenage Fatimid caliph al-Adid, a power struggle ensued between Shirkuh and Shawar after the latter was reinstated. Saladin, meanwhile, climbed the ranks of the Fatimid government by virtue of his military successes against Crusader assaults and his personal closeness to al-Adid. After Shawar was assassinated and Shirkuh died in 1169, al-Adid appointed Saladin as vizier. During his tenure, Saladin, a Sunni Muslim, began to undermine the Fatimid establishment; following al-Adid's death in 1171, he abolished the Cairo-based Fatimid Caliphate, which was Isma'ili (a branch within Shia Islam), and realigned Egypt with the Baghdad-based Sunni Abbasid Caliphate. In the following years, Saladin led forays against the Crusaders in Palestine, commissioned the successful conquest of Yemen, and staved off pro-Fatimid rebellions in Egypt.

Not long after Nur ad-Din died in 1174, Saladin launched his conquest of Syria, peacefully entering Damascus at the request of its governor. By mid-1175, Saladin had conquered Hama and Homs, inviting the animosity of other Zengid lords, the official rulers of Syria's principalities; he subsequently defeated the Zengids at the Battle of the Horns of Hama in 1175 and was thereafter proclaimed the Sultan of Egypt and Syria by the Abbasid caliph al-Mustadi. Saladin launched further conquests in northern Syria and Upper Mesopotamia, escaping two attempts on his life by the Order of Assassins, before returning to Egypt in 1177 to address local issues there. By 1182, Saladin had completed the conquest of Syria after capturing Aleppo, but failed to take over the Zengid stronghold of Mosul.

Under Saladin's command, the Ayyubid army defeated the Crusaders at the decisive Battle of Hattin in 1187, capturing Jerusalem and re-establishing Muslim military dominance in the Levant. Although the Crusaders' Kingdom of Jerusalem persisted until the late 13th century, the defeat in 1187 marked a turning point in the Christian military effort against Muslim powers in the region. Saladin died in Damascus in 1193, having given away much of his personal wealth to his subjects; he is buried in a mausoleum adjacent to the Umayyad Mosque. Alongside his significance to Muslim culture, Saladin is revered prominently in Kurdish, Turkic, and Arab culture. He has frequently been described as the most famous Kurdish figure in history.

Early life

Saladin was born in Tikrit in present-day Iraq. His personal name was "Yusuf"; "Salah ad-Din" is a laqab, an honorific epithet meaning "Righteousness of the Faith". His parents were of Kurdish ancestry and had originated from the village of Ajdanakan near the city of Dvin in central Armenia. He was the son of the Kurdish mercenary Najm ad-Din Ayyub.
The Rawadiya tribe he hailed from had been partially assimilated into the Arabic-speaking world by this time. In Saladin's era, no scholar had more influence than Sheikh Abdul Qadir Gilani, and Saladin was strongly influenced and aided by him and his pupils. In 1132, the defeated army of Zengi, Atabeg of Mosul, found their retreat blocked by the Tigris River opposite the fortress of Tikrit, where Saladin's father, Najm ad-Din Ayyub, served as the warden. Ayyub provided ferries for the army and gave them refuge in Tikrit. Mujahid ad-Din Bihruz, a former Greek slave who had been appointed military governor of northern Mesopotamia for his service to the Seljuks, reprimanded Ayyub for giving Zengi refuge, and in 1137 banished Ayyub from Tikrit after his brother Asad ad-Din Shirkuh killed a friend of Bihruz. According to Baha ad-Din ibn Shaddad, Saladin was born the same night his family left Tikrit. In 1139, Ayyub and his family moved to Mosul, where Imad ad-Din Zengi acknowledged his debt and appointed Ayyub commander of his fortress in Baalbek. After the death of Zengi in 1146, his son, Nur ad-Din, became the regent of Aleppo and the leader of the Zengids.

Saladin, who now lived in Damascus, was reportedly fond of the city, but information on his early childhood is scarce. Of education, Saladin wrote, "Children are brought up in the way in which their elders were brought up". According to his biographers, Anne-Marie Eddé and al-Wahrani, Saladin was able to answer questions on Euclid, the Almagest, arithmetic, and law, but this was an academic ideal. It was his knowledge of the Qur'an and the "sciences of religion" that linked him to his contemporaries; several sources claim that during his studies he was more interested in religious studies than in joining the military. Another factor which may have affected his interest in religion was that, during the First Crusade, Jerusalem had been taken by the Christians. In addition to Islam, Saladin knew the genealogies, biographies, and histories of the Arabs, as well as the bloodlines of Arabian horses. More significantly, he knew the Hamasah of Abu Tammam by heart. He spoke Kurdish and Arabic, and knew Turkish and Persian.

Personality and religious leanings

According to Baha ad-Din ibn Shaddad (one of Saladin's contemporary biographers), Saladin was a pious Muslim—he loved hearing Qur'an recitals, prayed punctually, and "hated the philosophers, those that denied God's attributes, the materialists and those who stubbornly rejected the Holy Law." He was also a supporter of Sufism and a patron of khanqahs (Sufi hostels) in Egypt and Syria, in addition to madrasas that provided orthodox Sunni teachings. Above all else he was a devotee of jihad:

The sacred works [Koran, hadith, etc.] are full of passages referring to the jihad. Saladin was more assiduous and zealous in this than in anything else.... Jihad and the suffering involved in it weighed heavily on his heart and his whole being in every limb; he spoke of nothing else, thought only about equipment for the fight, was interested only in those who had taken up arms, had little sympathy with anyone who spoke of anything else or encouraged any other activity.

In 1174, Saladin ordered the arrest of a Sufi mystic, Qadid al-Qaffas (Arabic: قديد القفاص), in Alexandria. In 1191, he ordered his son to execute the Sufi philosopher Yahya al-Suhrawardi, the founder of the Illuminationist current in Muslim philosophy, in Aleppo.
Ibn Shaddad, who describes this event as part of his chapter on the sultan's piety, states that al-Suhrawardi was said to have "rejected the Holy Law and declared it invalid." After consulting with some of the ulama (religious scholars), Saladin ordered al-Suhrawardi's execution. Saladin also opposed the Order of Assassins, an extremist Isma'ili Shi'i sect in Iran and Syria, seeing them as heretics and as too close to the Crusaders. Saladin welcomed Asiatic Sufis to Egypt, and he and his followers founded and endowed many khanqahs and zawiyas, of which al-Maqrizi gives a long list. However, it is not yet clear what Saladin's interests in the khanqah actually were and why he specifically wanted Sufis from outside Egypt. The answers to these questions lie in the kinds of Sufis he wished to attract. In addition to requiring that the Sufis come from outside Egypt, the waqfiyya seems to have specified that they be of a very particular type:

The inhabitants of the khanqah were known for religious knowledge and piety and their baraka (blessings) was sought after... The founder stipulated that the khanqah be endowed for the Sufis as a group, those coming from abroad and settling in Cairo and Fustat. If those could not be found, then it would be for the poor jurists, either Shafi'i or Maliki, and Ash'ari in their creed.

Early expeditions

Saladin's military career began under the tutelage of his paternal uncle Shirkuh, a prominent military commander under Nur ad-Din, the Zengid emir of Damascus and Aleppo and the most influential teacher of Saladin. In 1163, Shawar, the vizier to al-Adid, the Fatimid caliph at the time, had been driven out of Egypt by his rival Dirgham, a member of the powerful Banu Ruzzaik tribe. He asked for military backing from Nur ad-Din, who complied, and in 1164 sent Shirkuh to aid Shawar in his expedition against Dirgham. Saladin, at age 26, went along with them. After Shawar was successfully reinstated as vizier, he demanded that Shirkuh withdraw his army from Egypt for a sum of 30,000 gold dinars, but Shirkuh refused, insisting it was Nur ad-Din's will that he remain. Saladin's role in this expedition was minor, and it is known that he was ordered by Shirkuh to collect stores from Bilbais prior to its siege by a combined force of Crusaders and Shawar's troops.

After the sacking of Bilbais, the Crusader–Egyptian force and Shirkuh's army engaged in the Battle of al-Babein on the desert border of the Nile, just west of Giza. Saladin played a major role, commanding the right wing of the Zengid army, while a force of Kurds commanded the left and Shirkuh was stationed in the centre. Muslim sources at the time, however, put Saladin in the "baggage of the centre" with orders to lure the enemy into a trap by staging a feigned retreat. The Crusader force enjoyed early success against Shirkuh's troops, but the terrain was too steep and sandy for their horses, and commander Hugh of Caesarea was captured while attacking Saladin's unit. After scattered fighting in little valleys to the south of the main position, the Zengid central force returned to the offensive; Saladin joined in from the rear. The battle ended in a Zengid victory, and Saladin is credited with having helped Shirkuh in one of the "most remarkable victories in recorded history", according to Ibn al-Athir, although more of Shirkuh's men were killed and the battle is considered by most sources as not a total victory.
Saladin and Shirkuh moved towards Alexandria, where they were welcomed, given money and arms, and provided a base. Faced by a superior Crusader–Egyptian force attempting to besiege the city, Shirkuh split his army. He and the bulk of his force withdrew from Alexandria, while Saladin was left with the task of guarding the city, where he was besieged.

In Egypt

Shirkuh was in a power struggle over Egypt with Shawar and Amalric I of Jerusalem, in which Shawar requested Amalric's assistance. In 1169, Shawar was reportedly assassinated by Saladin, and Shirkuh died later that year. Following his death, a number of candidates were considered for the role of vizier to al-Adid, most of whom were ethnic Kurds; their ethnic solidarity came to shape the Ayyubid family's actions in their political career. Saladin and his close associates were wary of Turkish influence. On one occasion, Isa al-Hakkari, a Kurdish lieutenant of Saladin, urged a candidate for the viziership, Emir Qutb ad-Din al-Hadhbani, to step aside by arguing that "both you and Saladin are Kurds and you will not let the power pass into the hands of the Turks". Nur ad-Din chose a successor for Shirkuh, but al-Adid appointed Saladin to replace Shawar as vizier.

The reasoning behind the Shia caliph al-Adid's selection of Saladin, a Sunni, varies. Ibn al-Athir claims that the caliph chose him after being told by his advisers that "there is no one weaker or younger" than Saladin, and "not one of the emirs [commanders] obeyed him or served him". However, according to this version, after some bargaining he was eventually accepted by the majority of the emirs. Al-Adid's advisers were also suspected of promoting Saladin in an attempt to split the Syria-based Zengids. Al-Wahrani wrote that Saladin was selected because of the reputation of his family for "generosity and military prowess". Imad ad-Din wrote that after the brief mourning period for Shirkuh, during which "opinions differed", the Zengid emirs decided upon Saladin and forced the caliph to "invest him as vizier". Although positions were complicated by rival Muslim leaders, the bulk of the Syrian commanders supported Saladin because of his role in the Egyptian expedition, in which he had gained a record of military qualifications.

Inaugurated as vizier on 26 March, Saladin repented "wine-drinking and turned from frivolity to assume the dress of religion", according to Arabic sources of the time. Having gained more power and independence than ever before in his career, he still faced the issue of ultimate loyalty between al-Adid and Nur ad-Din. Later in the year, a group of Egyptian soldiers and emirs attempted to assassinate Saladin, but having already known of their intentions thanks to his intelligence chief Ali ibn Safyan, he had the chief conspirator, Naji, Mu'tamin al-Khilafa—the civilian controller of the Fatimid Palace—arrested and killed. The day after, 50,000 Black African soldiers from the regiments of the Fatimid army opposed to Saladin's rule, along with Egyptian emirs and commoners, staged a revolt. By 23 August, Saladin had decisively quelled the uprising, and never again had to face a military challenge from Cairo. Towards the end of 1169, Saladin, with reinforcements from Nur ad-Din, defeated a massive Crusader–Byzantine force near Damietta.
Afterwards, in the spring of 1170, Nur ad-Din sent Saladin's father to Egypt in compliance with Saladin's request, as well as encouragement from the Baghdad-based Abbasid caliph, al-Mustanjid, who aimed to pressure Saladin into deposing his rival caliph, al-Adid. Saladin himself had been strengthening his hold on Egypt and widening his support base there. He began granting his family members high-ranking positions in the region; he ordered the construction of a college for the Maliki branch of Sunni Islam in the city, as well as one for the Shafi'i denomination, to which he belonged, in al-Fustat.

After establishing himself in Egypt, Saladin launched a campaign against the Crusaders, besieging Darum in 1170. Amalric withdrew his Knights Templar garrison from Gaza to assist him in defending Darum, but Saladin evaded their force and captured Gaza in 1170. In 1191 Saladin destroyed the fortifications in Gaza built by King Baldwin III for the Knights Templar. It is unclear exactly when, but during that same year he attacked and captured the Crusader castle of Eilat, built on an island off the head of the Gulf of Aqaba. It did not pose a threat to the passage of the Muslim navy but could harass smaller parties of Muslim ships, and Saladin decided to clear it from his path.

According to Imad ad-Din, Nur ad-Din wrote to Saladin in June 1171, telling him to reestablish the Abbasid caliphate in Egypt, which Saladin coordinated two months later after additional encouragement by Najm ad-Din al-Khabushani, the Shafi'i faqih, who vehemently opposed Shia rule in the country. Several Egyptian emirs were thus killed, but al-Adid was told that they were killed for rebelling against him. He then fell ill, or was poisoned according to one account. While ill, he asked Saladin to pay him a visit to request that he take care of his young children, but Saladin refused, fearing treachery against the Abbasids, and is said to have regretted his action after realizing what al-Adid had wanted. He died on 13 September, and five days later the Abbasid khutba was pronounced in Cairo and al-Fustat, proclaiming al-Mustadi as caliph.

On 25 September, Saladin left Cairo to take part in a joint attack on Kerak and Montréal, the desert castles of the Kingdom of Jerusalem, with Nur ad-Din, who would attack from Syria. Before arriving at Montréal, however, Saladin withdrew to Cairo upon receiving reports that in his absence the Crusader leaders had increased their support to conspirators inside Egypt, who sought to attack Saladin from within and lessen his power, especially Fatimid loyalists who had begun plotting to restore their past glory. Because of this, Nur ad-Din went on alone.

During the summer of 1172, a Nubian army along with a contingent of Armenian former Fatimid troops were reported on the Egyptian border, preparing for a siege against Aswan. The emir of the city had requested Saladin's assistance and was given reinforcements under Turan-Shah, Saladin's brother. Consequently, the Nubians departed, but they returned in 1173 and were again driven off. This time, Egyptian forces advanced from Aswan and captured the Nubian town of Ibrim. Saladin sent Nur ad-Din, who had been his friend and teacher, a gift of 60,000 dinars, "wonderful manufactured goods", some jewels, and an elephant. While transporting these goods to Damascus, Saladin took the opportunity to ravage the Crusader countryside.
He did not press an attack against the desert castles, but attempted to drive out the Muslim Bedouins who lived in Crusader territory, with the aim of depriving the Franks of guides. On 31 July 1173, Saladin's father Ayyub was wounded in a horse-riding accident, ultimately causing his death on 9 August. In 1174, Saladin sent Turan-Shah to conquer Yemen and allocate it and its port Aden to the territories of the Ayyubid dynasty.

Conquest of Syria

In the early summer of 1174, Nur ad-Din was mustering an army, sending summons to Mosul, Diyar Bakr, and the Jazira, in apparent preparation for an attack against Saladin's Egypt. The Ayyubids held a council upon the revelation of these preparations to discuss the possible threat, and Saladin collected his own troops outside Cairo. On 15 May, Nur ad-Din died after falling ill the previous week, and his power was handed to his eleven-year-old son as-Salih Ismail al-Malik. His death left Saladin with political independence, and in a letter to as-Salih he promised to "act as a sword" against his enemies, referring to the death of his father as an "earthquake shock".

In the wake of Nur ad-Din's death, Saladin faced a difficult decision; he could move his army against the Crusaders from Egypt or wait until invited by as-Salih in Syria to come to his aid and launch a war from there. He could also take it upon himself to annex Syria before it could possibly fall into the hands of a rival, but he feared that attacking a land that formerly belonged to his master—forbidden in the Muslim principles in which he believed—could portray him as hypocritical, thus making him unsuitable for leading the war against the Crusaders. Saladin saw that in order to acquire Syria, he needed either an invitation from as-Salih or to warn him that potential anarchy could give rise to danger from the Crusaders.

When as-Salih was removed to Aleppo in August, Gumushtigin, the emir of the city and a captain of Nur ad-Din's veterans, assumed guardianship over him. The emir prepared to unseat all his rivals in Syria and the Jazira, beginning with Damascus. In this emergency, the emir of Damascus appealed to Saif ad-Din of Mosul (a cousin of Gumushtigin) for assistance against Aleppo, but he refused, forcing the Syrians to request the aid of Saladin, who complied. Saladin rode across the desert with 700 picked horsemen, passing through al-Kerak and then reaching Bosra. According to his own account, he was joined by "emirs, soldiers, and Bedouins—the emotions of their hearts to be seen on their faces." On 23 November, he arrived in Damascus amid general acclamation and rested at his father's old home there, until the gates of the Citadel of Damascus, whose commander Raihan initially refused to surrender, were opened to Saladin four days later, after a brief siege by his brother Tughtakin ibn Ayyub. He installed himself in the castle and received the homage and salutations of the inhabitants.

Leaving his brother Tughtakin ibn Ayyub as Governor of Damascus, Saladin proceeded to reduce other cities that had belonged to Nur ad-Din but were now practically independent. His army conquered Hama with relative ease, but avoided attacking Homs because of the strength of its citadel. Saladin moved north towards Aleppo, besieging it on 30 December after Gumushtigin refused to abdicate his throne. As-Salih, fearing capture by Saladin, came out of his palace and appealed to the inhabitants not to surrender him and the city to the invading force.
One of Saladin's chroniclers claimed "the people came under his spell". Gumushtigin asked Rashid ad-Din Sinan, chief da'i of the Order of Assassins of Syria, who were already at odds with Saladin since he had replaced the Fatimids of Egypt, to have Saladin assassinated in his camp. On 11 May 1175, a group of thirteen Assassins easily gained admission into Saladin's camp, but were detected immediately before they carried out their attack by Nasih ad-Din Khumartekin of Abu Qubays. One was killed by one of Saladin's generals and the others were slain while trying to escape.

To deter Saladin's progress, Raymond of Tripoli gathered his forces by Nahr al-Kabir, where they were well placed for an attack on Muslim territory. Saladin later moved toward Homs instead, but retreated after being told a relief force was being sent to the city by Saif ad-Din. Meanwhile, Saladin's rivals in Syria and the Jazira waged a propaganda war against him, claiming he had "forgotten his own condition [servant of Nur ad-Din]" and had shown no gratitude for his old master by besieging his son, rising "in rebellion against his Lord". Saladin aimed to counter this propaganda by ending the siege, claiming that he was defending Islam from the Crusaders; his army returned to Hama to engage a Crusader force there. The Crusaders withdrew beforehand and Saladin proclaimed it "a victory opening the gates of men's hearts". Soon after, Saladin entered Homs and captured its citadel in March 1175, after stubborn resistance from its defenders.

Saladin's successes alarmed Saif ad-Din. As head of the Zengids, including Gumushtigin, he regarded Syria and Mesopotamia as his family estate and was angered when Saladin attempted to usurp his dynasty's holdings. Saif ad-Din mustered a large army and dispatched it to Aleppo, whose defenders had anxiously awaited it. The combined forces of Mosul and Aleppo marched against Saladin in Hama. Heavily outnumbered, Saladin initially attempted to make terms with the Zengids by abandoning all conquests north of the Damascus province, but they refused, insisting he return to Egypt. Seeing that confrontation was unavoidable, Saladin prepared for battle, taking up a superior position at the Horns of Hama, hills by the gorge of the Orontes River. On 13 April 1175, the Zengid troops marched to attack his forces, but soon found themselves surrounded by Saladin's Ayyubid veterans, who crushed them. The battle ended in a decisive victory for Saladin, who pursued the Zengid fugitives to the gates of Aleppo, forcing as-Salih's advisers to recognize Saladin's control of the provinces of Damascus, Homs, and Hama, as well as a number of towns outside Aleppo such as Ma'arat al-Numan.

After his victory against the Zengids, Saladin proclaimed himself king and suppressed the name of as-Salih in Friday prayers and on Muslim coinage. From then on, he ordered prayers in all the mosques of Syria and Egypt as the sovereign king, and he issued at the Cairo mint gold coins bearing his official title—al-Malik an-Nasir Yusuf Ayyub, ala ghaya "the King Strong to Aid, Joseph son of Job; exalted be the standard." The Abbasid caliph in Baghdad graciously welcomed Saladin's assumption of power and declared him "Sultan of Egypt and Syria". The Battle of Hama did not end the contest for power between the Ayyubids and the Zengids, with the final confrontation occurring in the spring of 1176. Saladin had gathered massive reinforcements from Egypt while Saif ad-Din was levying troops among the minor states of Diyarbakir and al-Jazira.
When Saladin crossed the Orontes, leaving Hama, the sun was eclipsed. He viewed this as an omen, but continued his march north. He reached the Sultan's Mound, roughly 25 km (16 mi) from Aleppo, where his forces encountered Saif ad-Din's army. A hand-to-hand fight ensued; the Zengids ploughed through Saladin's left wing, driving it before them, until Saladin himself charged at the head of his guard. The Zengid forces panicked and most of Saif ad-Din's officers ended up being killed or captured—Saif ad-Din narrowly escaped. The Zengid army's camp, horses, baggage, tents, and stores were seized by the Ayyubids. The Zengid prisoners of war, however, were given gifts and freed. All of the booty from the Ayyubid victory was accorded to the army, Saladin keeping nothing for himself.

He continued towards Aleppo, which still closed its gates to him, halting before the city. On the way, his army took Buza'a and then captured Manbij. From there, they headed west to besiege the fortress of A'zaz on 15 May. Several days later, while Saladin was resting in one of his captains' tents, an Assassin rushed forward at him and struck at his head with a knife. The cap of his head armour was not penetrated and he managed to grip the Assassin's hand—the dagger only slashing his gambeson—and the assailant was soon killed. Saladin was unnerved by the attempt on his life, which he accused Gumushtigin and the Assassins of plotting, and so increased his efforts in the siege.

A'zaz capitulated on 21 June, and Saladin then hurried his forces to Aleppo to punish Gumushtigin. His assaults were again resisted, but he managed to secure not only a truce but a mutual alliance with Aleppo, in which Gumushtigin and as-Salih were allowed to continue their hold on the city and, in return, recognized Saladin as the sovereign over all of the dominions he had conquered. The emirs of Mardin and Keyfa, the Muslim allies of Aleppo, also recognised Saladin as the King of Syria. When the treaty was concluded, the younger sister of as-Salih came to Saladin and requested the return of the Fortress of A'zaz; he complied and escorted her back to the gates of Aleppo with numerous presents.

Saladin had by now agreed to truces with his Zengid rivals and the Kingdom of Jerusalem (the latter occurred in the summer of 1175), but faced a threat from the Order of Assassins, led by Rashid ad-Din Sinan. Based in the an-Nusayriyah Mountains, they commanded nine fortresses, all built on high elevations. As soon as he dispatched the bulk of his troops to Egypt, Saladin led his army into the an-Nusayriyah range in August 1176. He retreated the same month, after laying waste to the countryside but failing to conquer any of the forts. Most Muslim historians claim that Saladin's uncle, the governor of Hama, mediated a peace agreement between him and Sinan.

Saladin had his guards supplied with link lights and had chalk and cinders strewn around his tent outside Masyaf—which he was besieging—to detect any footsteps by the Assassins. According to this version, one night Saladin's guards noticed a spark glowing down the hill of Masyaf and then vanishing among the Ayyubid tents. Presently, Saladin awoke to find a figure leaving the tent. He saw that the lamps were displaced and that beside his bed lay hot scones of the shape peculiar to the Assassins, with a note at the top pinned by a poisoned dagger. The note threatened that he would be killed if he did not withdraw from his assault.
Saladin gave a loud cry, exclaiming that Sinan himself was the figure that had left the tent. Another version claims that Saladin hastily withdrew his troops from Masyaf because they were urgently needed to fend off a Crusader force in the vicinity of Mount Lebanon. In reality, Saladin sought to form an alliance with Sinan and his Assassins, consequently depriving the Crusaders of a potent ally against him. Viewing the expulsion of the Crusaders as a mutual benefit and priority, Saladin and Sinan maintained cooperative relations afterwards, the latter dispatching contingents of his forces to bolster Saladin's army in a number of decisive subsequent battlefronts.

Return to Cairo and forays in Palestine

After leaving the an-Nusayriyah Mountains, Saladin returned to Damascus and had his Syrian soldiers return home. He left Turan Shah in command of Syria and left for Egypt with only his personal followers, reaching Cairo on 22 September. Having been absent for roughly two years, he had much to organize and supervise in Egypt, namely fortifying and reconstructing Cairo. The city walls were repaired and their extensions laid out, while the construction of the Cairo Citadel was commenced. The 280 feet (85 m) deep Bir Yusuf ("Joseph's Well") was built on Saladin's orders. The chief public work he commissioned outside of Cairo was the large bridge at Giza, which was intended to form an outwork of defence against a potential Moorish invasion. Saladin remained in Cairo supervising its improvements, building colleges such as the Madrasa of the Sword Makers and ordering the internal administration of the country.

In November 1177, he set out upon a raid into Palestine; the Crusaders had recently forayed into the territory of Damascus, so Saladin saw the truce as no longer worth preserving. The Christians sent a large portion of their army to besiege the fortress of Harim north of Aleppo, so southern Palestine bore few defenders. Saladin found the situation ripe and marched to Ascalon, which he referred to as the "Bride of Syria". William of Tyre recorded that the Ayyubid army consisted of 26,000 soldiers, of which 8,000 were elite forces and 18,000 were black soldiers from Sudan. This army proceeded to raid the countryside, sack Ramla and Lod, and disperse themselves as far as the Gates of Jerusalem.

The Ayyubids allowed Baldwin IV of Jerusalem to enter Ascalon with his famous Gaza-based Knights Templar without taking any precautions against a sudden attack. Although the Crusader force consisted of only 375 knights, Saladin hesitated to ambush them because of the presence of highly skilled Templar generals. On 25 November, while the greater part of the Ayyubid army was absent, Saladin and his men were surprised near Ramla in the Battle of Montgisard (possibly at Gezer, also known as Tell Jezar). Before they could form up, the Templar force hacked the Ayyubid army down in close combat. Initially, Saladin attempted to organize his men into battle order, but as his bodyguards were being killed, he saw that defeat was inevitable, and so with a small remnant of his troops he mounted a swift camel and rode all the way to the territories of Egypt.

Not discouraged by his defeat at Montgisard, Saladin was prepared to fight the Crusaders once again. In the spring of 1178, he was encamped under the walls of Homs, and a few skirmishes occurred between his generals and the Crusader army.
His forces in Hama won a victory over their enemy and brought the spoils, together with many prisoners of war, to Saladin, who ordered the captives to be beheaded for "plundering and laying waste the lands of the Faithful". He spent the rest of the year in Syria without a confrontation with his enemies.

Saladin's intelligence services reported to him that the Crusaders were planning a raid into Syria. He ordered one of his generals, Farrukh-Shah, to guard the Damascus frontier with a thousand of his men to watch for an attack, then to retire, avoiding battle, and to light warning beacons on the hills, after which Saladin would march out. In April 1179, the Crusaders and Templars led by King Baldwin expected no resistance and waited to launch a surprise attack on Muslim herders grazing their herds and flocks east of the Golan Heights. Baldwin advanced too rashly in pursuit of Farrukh-Shah's force, which was concentrated southeast of Quneitra, and was subsequently defeated by the Ayyubids. With this victory, Saladin decided to call in more troops from Egypt; he requested al-Adil to dispatch 1,500 horsemen.

In the summer of 1179, King Baldwin had set up an outpost on the road to Damascus and aimed to fortify a passage over the Jordan River, known as Jacob's Ford, that commanded the approach to the Banias plain (the plain was divided between the Muslims and the Christians). Saladin had offered 100,000 gold pieces to Baldwin to abandon the project, which was particularly offensive to the Muslims, but to no avail. He then resolved to destroy the fortress, called "Chastellet" and defended by Templar knights, moving his headquarters to Banias. As the Crusaders hurried down to attack the Muslim forces, they fell into disorder, with the infantry falling behind. Despite early success, they pursued the Muslims far enough to become scattered, and Saladin took advantage by rallying his troops and charging at the Crusaders. The engagement ended in a decisive Ayyubid victory, and many high-ranking knights were captured. Saladin then moved to besiege the fortress, which fell on 30 August 1179.

In the spring of 1180, while Saladin was in the area of Safad, anxious to commence a vigorous campaign against the Kingdom of Jerusalem, King Baldwin sent messengers to him with proposals of peace. Because droughts and bad harvests hampered his commissariat, Saladin agreed to a truce. Raymond of Tripoli denounced the truce but was compelled to accept after an Ayyubid raid on his territory in May and upon the appearance of Saladin's naval fleet off the port of Tartus.

Domestic affairs

In June 1180, Saladin hosted a reception for Nur ad-Din Muhammad, the Artuqid emir of Keyfa, at Geuk Su, at which he presented him and his brother Abu Bakr with gifts valued at over 100,000 dinars, according to Imad ad-Din. This was intended to cement an alliance with the Artuqids and to impress other emirs in Mesopotamia and Anatolia. Previously, Saladin had offered to mediate relations between Nur ad-Din Muhammad and Kilij Arslan II—the Seljuk sultan of Rûm—after the two came into conflict. The latter demanded that Nur ad-Din return the lands given to him as a dowry for marrying his daughter when he received reports that she was being abused and used to gain Seljuk territory. Nur ad-Din asked Saladin to mediate the issue, but Arslan refused. After Nur ad-Din and Saladin met at Geuk Su, the top Seljuk emir, Ikhtiyar ad-Din al-Hasan, confirmed Arslan's submission, after which an agreement was drawn up.
Saladin was later enraged when he received a message from Arslan accusing Nur ad-Din of further abuses against his daughter. He threatened to attack the city of Malatya, saying, "it is two days march for me and I shall not dismount [my horse] until I am in the city." Alarmed at the threat, the Seljuks pushed for negotiations. Saladin felt that Arslan was correct to care for his daughter, but Nur ad-Din had taken refuge with him, and therefore he could not betray his trust. It was finally agreed that Arslan's daughter would be sent away for a year, and that if Nur ad-Din failed to comply, Saladin would abandon his support for him.

Leaving Farrukh-Shah in charge of Syria, Saladin returned to Cairo at the beginning of 1181. According to Abu Shama, he intended to spend the fast of Ramadan in Egypt and then make the hajj pilgrimage to Mecca in the summer. For an unknown reason he apparently changed his plans regarding the pilgrimage and was seen inspecting the Nile River banks in June. He was again embroiled with the Bedouin; he removed two-thirds of their fiefs to use as compensation for the fief-holders at Fayyum. The Bedouin were also accused of trading with the Crusaders and, consequently, their grain was confiscated and they were forced to migrate westward. Later, Ayyubid warships were deployed against Bedouin river pirates, who were plundering the shores of Lake Tanis.

In the summer of 1181, Saladin's former palace administrator Baha ad-Din Qaraqush led a force to arrest Majd ad-Din—a former deputy of Turan-Shah in the Yemeni town of Zabid—while he was entertaining Imad ad-Din al-Ishfahani at his estate in Cairo. Saladin's intimates accused Majd ad-Din of misappropriating the revenues of Zabid, but Saladin himself believed there was no evidence to back the allegations. He had Majd ad-Din released in return for a payment of 80,000 dinars. In addition, other sums were to be paid to Saladin's brothers al-Adil and Taj al-Muluk Buri. The controversial detainment of Majd ad-Din was part of the larger discontent associated with the aftermath of Turan-Shah's departure from Yemen. Although his deputies continued to send him revenues from the province, centralized authority was lacking, and an internal quarrel arose between Izz ad-Din Uthman of Aden and Hittan of Zabid. Saladin wrote in a letter to al-Adil: "this Yemen is a treasure house ... We conquered it, but up to this day we have had no return and no advantage from it. There have been only innumerable expenses, the sending out of troops ... and expectations which did not produce what was hoped for in the end."

Imperial expansions

Saif ad-Din had died earlier, in June 1181, and his brother Izz ad-Din inherited the leadership of Mosul. On 4 December, the crown prince of the Zengids, as-Salih, died in Aleppo. Prior to his death, he had his chief officers swear an oath of loyalty to Izz ad-Din, as he was the only Zengid ruler strong enough to oppose Saladin. Izz ad-Din was welcomed in Aleppo, but possessing it and Mosul put too great a strain on his abilities. He thus handed Aleppo to his brother Imad ad-Din Zangi in exchange for Sinjar. Saladin offered no opposition to these transactions in order to respect the treaty he had previously made with the Zengids. On 11 May 1182, Saladin, along with half of the Egyptian Ayyubid army and numerous non-combatants, left Cairo for Syria.
On the evening before he departed, he sat with his companions, and the tutor of one of his sons quoted a line of poetry: "enjoy the scent of the ox-eye plant of Najd, for after this evening it will come no more". Saladin took this as an evil omen, and he never saw Egypt again. Knowing that Crusader forces were massed upon the frontier to intercept him, he took the desert route across the Sinai Peninsula to Ailah at the head of the Gulf of Aqaba. Meeting no opposition, Saladin ravaged the countryside of Montreal, whilst Baldwin's forces watched on, refusing to intervene. He arrived in Damascus in June to learn that Farrukh-Shah had attacked the Galilee, sacking Daburiyya and capturing Habis Jaldek, a fortress of great importance to the Crusaders. In July, Saladin led his army across the Jordan and into Galilee, where he marched south to sack Bethsan. He was met by a substantial Crusader force in an inconclusive battle near Belvoir Castle; unable to destroy the Christian army and no longer able to sustain his own logistically, he withdrew across the river. In August, he passed through the Beqaa Valley to Beirut, where he rendezvoused with the Egyptian fleet and laid siege to the city. Failing to make any headway, he withdrew after a few days to deal with matters in Mesopotamia.

Kukbary (Muzaffar ad-Din Gökböri), the emir of Harran, invited Saladin to occupy the Jazira region, which makes up northern Mesopotamia. He complied, and the truce between him and the Zengids officially ended in September 1182. Prior to his march to the Jazira, tensions had grown between the Zengid rulers of the region, primarily concerning their unwillingness to pay deference to Mosul. Before he crossed the Euphrates, Saladin besieged Aleppo for three days, signaling that the truce was over. Once he reached Bira, near the river, he was joined by Kukbary and Nur ad-Din of Hisn Kayfa, and the combined forces captured the cities of the Jazira, one after the other. First Edessa fell, followed by Saruj, then Raqqa, Qirqesiya, and Nusaybin.

Raqqa was an important crossing point, held by Qutb ad-Din Inal, who had lost Manbij to Saladin in 1176. Upon seeing the large size of Saladin's army, he made little effort to resist and surrendered on the condition that he would retain his property. Saladin promptly impressed the inhabitants of the town by publishing a decree that ordered a number of taxes to be canceled and erased all mention of them from treasury records, stating "the most miserable rulers are those whose purses are fat and their people thin". From Raqqa, he moved to conquer al-Fudain, al-Husain, Maksim, Durain, 'Araban, and Khabur—all of which swore allegiance to him. Saladin proceeded to take Nusaybin, which offered no resistance. A medium-sized town, Nusaybin was not of great importance, but it was located in a strategic position between Mardin and Mosul and within easy reach of Diyarbakir.

In the midst of these victories, Saladin received word that the Crusaders were raiding the villages of Damascus. He replied, "Let them... whilst they knock down villages, we are taking cities; when we come back, we shall have all the more strength to fight them." Meanwhile, in Aleppo, the city's emir Zangi raided Saladin's cities to the north and east, such as Balis, Manbij, Saruj, Buza'a, and al-Karzain. He also destroyed his own citadel at A'zaz to prevent it from being used by the Ayyubids if they were to conquer it. As Saladin approached Mosul, he faced the issue of taking over a large city and justifying the action.
The Zengids of Mosul appealed to an-Nasir, the Abbasid caliph at Baghdad, whose vizier favored them. An-Nasir sent Badr al-Badr (a high-ranking religious figure) to mediate between the two sides. Saladin arrived at the city on 10 November 1182. Izz ad-Din would not accept his terms because he considered them disingenuous and extensive, and Saladin immediately laid siege to the heavily fortified city. After several minor skirmishes and a stalemate in the siege that was initiated by the caliph, Saladin intended to find a way to withdraw without damage to his reputation while still keeping up some military pressure. He decided to attack Sinjar, which was held by Izz ad-Din's brother Sharaf ad-Din. It fell after a 15-day siege on 30 December. Saladin's soldiers broke their discipline, plundering the city; Saladin managed to protect the governor and his officers only by sending them to Mosul. After establishing a garrison at Sinjar, he awaited a coalition assembled by Izz ad-Din, consisting of his forces and those from Aleppo, Mardin, and Armenia. Saladin and his army met the coalition at Harran in February 1183, but on hearing of his approach, the latter sent messengers to Saladin asking for peace. Each force returned to their cities, and al-Fadil wrote: "They [Izz ad-Din's coalition] advanced like men, like women they vanished."

On 2 March, al-Adil wrote from Egypt to Saladin that the Crusaders had struck the "heart of Islam". Raynald of Châtillon had sent ships to the Gulf of Aqaba to raid towns and villages off the coast of the Red Sea. It was not an attempt to extend Crusader influence into that sea or to capture its trade routes, but merely a piratical move. Nonetheless, Imad ad-Din writes that the raid was alarming to the Muslims because they were not accustomed to attacks on that sea, and Ibn al-Athir adds that the inhabitants had no experience with the Crusaders either as fighters or traders. Ibn Jubair was told that sixteen Muslim ships were burnt by the Crusaders, who then captured a pilgrim ship and caravan at Aidab. He also reported that they intended to attack Medina and remove Muhammad's body. Al-Maqrizi added to the rumor by claiming Muhammad's tomb was going to be relocated to Crusader territory so Muslims would make pilgrimages there. Al-Adil had his warships moved from Fustat and Alexandria to the Red Sea under the command of an Armenian mercenary, Lu'lu. They broke the Crusader blockade, destroyed most of their ships, and pursued and captured those who anchored and fled into the desert. Saladin ordered the roughly 170 surviving Crusaders to be executed in various Muslim cities.

From Saladin's point of view, in terms of territory, the war against Mosul was going well, but he still failed to achieve his objectives and his army was shrinking; Taqi ad-Din took his men back to Hama, while Nasir ad-Din Muhammad and his forces had left. This encouraged Izz ad-Din and his allies to take the offensive. The previous coalition regrouped at Harzam, some 140 km from Harran. In early April, without waiting for Nasir ad-Din, Saladin and Taqi ad-Din commenced their advance against the coalition, marching eastward to Ras al-Ein unhindered. By late April, after three days of "actual fighting", according to Saladin, the Ayyubids had captured Amid. He handed the city to Nur ad-Din Muhammad together with its stores, which consisted of 80,000 candles, a tower full of arrowheads, and 1,040,000 books.
In return for a diploma granting him the city, Nur ad-Din swore allegiance to Saladin, promising to follow him in every expedition in the war against the Crusaders and to repair the damage done to the city. The fall of Amid, in addition to territory, convinced Il-Ghazi of Mardin to enter the service of Saladin, weakening Izz ad-Din's coalition.

Saladin attempted to gain the Caliph an-Nasir's support against Izz ad-Din by sending him a letter requesting a document that would give him legal justification for taking over Mosul and its territories. Saladin aimed to persuade the caliph, claiming that while he had conquered Egypt and Yemen under the flag of the Abbasids, the Zengids of Mosul openly supported the Seljuks (rivals of the caliphate) and only came to the caliph when in need. He also accused Izz ad-Din's forces of disrupting the Muslim "Holy War" against the Crusaders, stating "they are not content not to fight, but they prevent those who can". Saladin defended his own conduct, claiming that he had come to Syria to fight the Crusaders, end the heresy of the Assassins, and stop the wrong-doing of the Muslims. He also promised that if Mosul were given to him, it would lead to the capture of Jerusalem, Constantinople, Georgia, and the lands of the Almohads in the Maghreb, "until the word of God is supreme and the Abbasid caliphate has wiped the world clean, turning the churches into mosques". Saladin stressed that all this would happen by the will of God, and instead of asking for financial or military support from the caliph, he would capture and give the caliph the territories of Tikrit, Daquq, Khuzestan, Kish Island, and Oman.

Saladin turned his attention from Mosul to Aleppo, sending his brother Taj al-Muluk Buri to capture Tell Khalid, 130 km northeast of the city. Preparations for a siege were made, but the governor of Tell Khalid surrendered upon the arrival of Saladin himself on 17 May, before any siege could take place. According to Imad ad-Din, after Tell Khalid, Saladin took a detour northwards to Aintab, but he gained possession of it when his army turned towards it, allowing him to move quickly back another c. 100 km towards Aleppo. On 21 May, he camped outside the city, positioning himself east of the Citadel of Aleppo, while his forces encircled the suburb of Banaqusa to the northeast and Bab Janan to the west. He stationed his men dangerously close to the city, hoping for an early success.

Zangi did not offer long resistance. He was unpopular with his subjects and wished to return to Sinjar, the city he had governed previously. An exchange was negotiated whereby Zangi would hand over Aleppo to Saladin in return for the restoration of his control of Sinjar, Nusaybin, and Raqqa, holding these territories as Saladin's vassal and owing him military service. On 12 June, Aleppo was formally placed in Ayyubid hands. The people of Aleppo had not known about these negotiations and were taken by surprise when Saladin's standard was hoisted over the citadel. Two emirs, including an old friend of Saladin, Izz ad-Din Jurduk, welcomed and pledged their service to him. Saladin replaced the Hanafi courts with a Shafi'i administration, despite a promise that he would not interfere in the religious leadership of the city. Although he was short of money, Saladin also allowed the departing Zangi to take all the stores of the citadel that he could travel with and to sell the remainder—which Saladin purchased himself.
In spite of his earlier hesitation to go through with the exchange, he had no doubts about his success, stating that Aleppo was "the key to the lands" and "this city is the eye of Syria and the citadel is its pupil". For Saladin, the capture of the city marked the end of over eight years of waiting since he had told Farrukh-Shah that "we have only to do the milking and Aleppo will be ours".

After spending one night in Aleppo's citadel, Saladin marched to Harim, near the Crusader-held Antioch. The city was held by Surhak, a "minor mamluk". Saladin offered him the city of Busra and property in Damascus in exchange for Harim, but when Surhak asked for more, his own garrison in Harim forced him out. He was arrested by Saladin's deputy Taqi ad-Din on allegations that he was planning to cede Harim to Bohemond III of Antioch. When Saladin received its surrender, he proceeded to arrange the defense of Harim from the Crusaders. He reported to the caliph and his own subordinates in Yemen and Baalbek that he was going to attack the Armenians. Before he could move, however, there were a number of administrative details to be settled. Saladin agreed to a truce with Bohemond in return for Muslim prisoners being held by him, and then he gave A'zaz to Alam ad-Din Suleiman and Aleppo to Saif ad-Din al-Yazkuj—the former was an emir of Aleppo who had joined Saladin, and the latter a former mamluk of Shirkuh who had helped rescue him from the assassination attempt at A'zaz.

Wars against Crusaders

Crusader attacks provoked further responses by Saladin. Raynald of Châtillon, in particular, harassed Muslim trading and pilgrimage routes with a fleet on the Red Sea, a water route that Saladin needed to keep open. Raynald threatened to attack the holy cities of Mecca and Medina. On 29 September 1183, Saladin crossed the Jordan River to attack Beisan, which was found to be empty. The next day his forces sacked and burned the town and moved westwards. They intercepted Crusader reinforcements from Karak and Shaubak along the Nablus road and took prisoners. Meanwhile, the main Crusader force under Guy of Lusignan moved from Sepphoris to al-Fula. Saladin sent out 500 skirmishers to harass their forces, and he himself marched to Ain Jalut. When the Crusader force—reckoned to be the largest the kingdom ever produced from its own resources, but still outmatched by the Muslims—advanced, the Ayyubids unexpectedly moved down the stream of Ain Jalut. After a few Ayyubid raids—including attacks on Zir'in, Forbelet, and Mount Tabor—the Crusaders still were not tempted to attack their main force, and Saladin led his men back across the river once provisions and supplies ran low.

Saladin still had to exact retribution on Raynald, so he twice besieged Kerak, Raynald's fortress in Oultrejordain. The first siege came in 1183, following his unsuccessful campaign into Galilee, but a relief force caused him to withdraw. He opened his campaign of 1184 with a second siege of Kerak, hoping this time to draw the Crusader army into battle on open ground, but they outmaneuvered him and successfully relieved the fortress. Following the failure of his Kerak sieges, Saladin temporarily turned his attention back to another long-term project and resumed attacks on the territory of Izz ad-Din (Mas'ud ibn Mawdud ibn Zangi) around Mosul, which he had begun with some success in 1182.
However, since then Mas'ud had allied himself with the powerful governor of Azerbaijan and Jibal, who in 1185 began moving his troops across the Zagros Mountains, causing Saladin to hesitate in his attacks. The defenders of Mosul, when they became aware that help was on the way, increased their efforts, and Saladin subsequently fell ill, so in March 1186 a peace treaty was signed.

On hearing that Raynald had attacked a Muslim caravan in breach of the truce, Saladin vowed that he would personally slay him, a vow he would keep. The outrage also led Saladin to resolve to dispense with half-measures to rein in the unruly lord of Kerak, and instead to topple the entire edifice of the Christian Kingdom of Jerusalem, thus precipitating the invasion of the summer of 1187. On 4 July 1187, Saladin faced the combined forces of Guy of Lusignan, King Consort of Jerusalem, and Raymond III of Tripoli at the Battle of Hattin. In this battle the Crusader force was largely annihilated by Saladin's determined army. It was a major disaster for the Crusaders and a turning point in the history of the Crusades. Saladin captured Raynald and was personally responsible for his execution, in retaliation for his attacks against Muslim caravans. The members of these caravans had, in vain, besought his mercy by reciting the truce between the Muslims and the Crusaders, but Raynald ignored this and insulted the Muslim prophet Muhammad before torturing and murdering some of them. Upon hearing this, Saladin had sworn an oath to personally execute Raynald. Guy of Lusignan was also captured. Seeing the execution of Raynald, he feared he would be next. However, his life was spared by Saladin, who said of Raynald, "[i]t is not the wont of kings, to kill kings; but that man had transgressed all bounds, and therefore did I treat him thus."

Saladin had captured almost every Crusader city. He preferred to take Jerusalem without bloodshed and offered generous terms, but those inside refused to leave their holy city, vowing to destroy it in a fight to the death rather than see it handed over peacefully. Jerusalem capitulated to his forces on Friday, 2 October 1187, after a siege. When the siege had started, Saladin was unwilling to promise terms of quarter to the Frankish inhabitants of Jerusalem. Balian of Ibelin threatened to kill every Muslim hostage, estimated at 5,000, and to destroy Islam's holy shrines of the Dome of the Rock and the al-Aqsa Mosque if such quarter were not provided. Saladin consulted his council and the terms were accepted. The agreement was read out through the streets of Jerusalem so that everyone might within forty days provide for himself and pay to Saladin the agreed tribute for his freedom. An unusually low ransom was to be paid for each Frank in the city, whether man, woman, or child, but Saladin, against the wishes of his treasurers, allowed many families who could not afford the ransom to leave. Patriarch Heraclius of Jerusalem organised and contributed to a collection that paid the ransoms for about 18,000 of the poorer citizens, leaving another 15,000 to be enslaved. Saladin's brother al-Adil "asked Saladin for a thousand of them for his own use and then released them on the spot." Most of the foot soldiers were sold into slavery. Upon the capture of Jerusalem, Saladin summoned the Jews and permitted them to resettle in the city. In particular, the residents of Ascalon, a large Jewish settlement, responded to his request.
Tyre, on the coast of modern-day Lebanon, was the last major Crusader city that was not captured by Muslim forces. Strategically, it would have made more sense for Saladin to capture Tyre before Jerusalem; Saladin, however, chose to pursue Jerusalem first because of the importance of the city to Islam. Tyre was commanded by Conrad of Montferrat, who strengthened its defences and withstood two sieges by Saladin. In 1188, at Tortosa, Saladin released Guy of Lusignan and returned him to his wife, Sibylla of Jerusalem. They went first to Tripoli, then to Antioch. In 1189, they sought to reclaim Tyre for their kingdom but were refused admission by Conrad, who did not recognize Guy as king. Guy then set about besieging Acre. Saladin was on friendly terms with Queen Tamar of Georgia. Saladin's biographer Baha ad-Din ibn Shaddad reports that, after Saladin's conquest of Jerusalem, the Georgian queen sent envoys to the sultan to request the return of confiscated possessions of the Georgian monasteries in Jerusalem. Saladin's response is not recorded, but the queen's efforts seem to have been successful, as Jacques de Vitry, the Bishop of Acre, reports that the Georgians, in contrast to the other Christian pilgrims, were allowed free passage into the city with their banners unfurled. Ibn Shaddad furthermore claims that Queen Tamar outbid the Byzantine emperor in her efforts to obtain the relics of the True Cross, offering 200,000 gold pieces to Saladin, who had taken the relics as booty at the Battle of Hattin, but to no avail. According to Baha ad-Din, after these victories Saladin spoke of invading Europe, saying: "I think that when God grants me victory over the rest of Palestine I shall divide my territories, make a will stating my wishes, then set sail on this sea for their far-off lands and pursue the Franks there, so as to free the earth of anyone who does not believe in God, or die in the attempt." The French historian René Grousset wrote of him: "It is equally true that his generosity, his piety, devoid of fanaticism, that flower of liberality and courtesy which had been the model of our old chroniclers, won him no less popularity in Frankish Syria than in the lands of Islam." Hattin and the fall of Jerusalem prompted the Third Crusade (1189–1192), which was partially financed by a special "Saladin tithe" in 1188. King Richard I led Guy's siege of Acre, conquered the city, and executed almost 3,000 Muslim prisoners of war. Baha ad-Din wrote: "The motives of this massacre are differently told; according to some, the captives were slain by way of reprisal for the death of those Christians whom the Musulmans had slain. Others again say that the king of England, on deciding to attempt the conquest of Ascalon, thought it unwise to leave so many prisoners in the town after his departure. God alone knows what the real reason was." The armies of Saladin engaged in combat with the army of King Richard at the Battle of Arsuf on 7 September 1191, at which Saladin's forces suffered heavy losses and were forced to withdraw. After the Battle of Arsuf, Richard occupied Jaffa, restoring the city's fortifications. Meanwhile, Saladin moved south, where he dismantled the fortifications of Ascalon to prevent this strategically important city, which lay at the junction between Egypt and Palestine, from falling into Crusader hands. In October 1191, Richard began restoring the inland castles on the coastal plain beyond Jaffa in preparation for an advance on Jerusalem.
During this period, Richard and Saladin passed envoys back and forth, negotiating the possibility of a truce. Richard proposed that his sister Joan should marry Saladin's brother and that Jerusalem could be their wedding gift. However, Saladin rejected this idea when Richard insisted that Saladin's brother convert to Christianity. Richard then suggested that his niece Eleanor, Fair Maid of Brittany, be the bride instead, an idea that Saladin also rejected. In January 1192, Richard's army occupied Beit Nuba, just twelve miles from Jerusalem, but withdrew without attacking the Holy City. Instead, Richard advanced south on Ascalon, where he restored the fortifications. In July 1192, Saladin tried to threaten Richard's command of the coast by attacking Jaffa. The city was besieged, and Saladin very nearly captured it; however, Richard arrived a few days later and defeated Saladin's army in a battle outside the city. The Battle of Jaffa (1192) proved to be the last military engagement of the Third Crusade. After Richard reoccupied Jaffa and restored its fortifications, he and Saladin again discussed terms. At last, Richard agreed to demolish the fortifications of Ascalon, while Saladin agreed to recognize Crusader control of the Palestinian coast from Tyre to Jaffa. The Christians would be allowed to travel as unarmed pilgrims to Jerusalem, and Saladin's kingdom would be at peace with the Crusader states for the following three years.

Death

Saladin died of a fever on 4 March 1193 (27 Safar 589 AH) at Damascus, not long after King Richard's departure. In Saladin's possession at the time of his death were one piece of gold and forty pieces of silver. He had given away his great wealth to his poor subjects, leaving nothing to pay for his funeral. He was buried in a mausoleum in the garden outside the Umayyad Mosque in Damascus, Syria. Originally the tomb was part of a complex that also included a school, Madrassah al-Aziziah, of which little remains except a few columns and an internal arch. Seven centuries later, Emperor Wilhelm II of Germany donated a new marble sarcophagus to the mausoleum. However, the original sarcophagus was not replaced; instead, the mausoleum, which is open to visitors, now has two sarcophagi: the marble one placed on the side and the original wooden one, which covers Saladin's tomb.

Family

Imad ad-Din al-Isfahani compiled a list of Saladin's sons along with their dates of birth, according to information provided by Saladin late in his reign. The sons listed by Imad number fifteen, but elsewhere he writes that Saladin was survived by seventeen sons and one daughter. Saladin's daughter is said to have married her cousin al-Kamil Muhammad ibn Adil. Saladin may also have had other children who died before him. One son, Al-Zahir Dawud, whom Imad listed eighth, is recorded as being Saladin's twelfth son in a letter written by his minister. Not much is known of Saladin's wives or slave-women. He married Ismat ad-Din Khatun, the widow of Nur ad-Din Zengi, in 1176. She did not have children. One of his wives, Shamsah, is buried with her son al-Aziz in the tomb of al-Shafi'i.

Recognition and legacy

Saladin has become a prominent figure in Muslim, Arab, Turkish and Kurdish culture, and he has been described as the most famous Kurd in history. The historian Ibn Munqidh described him as the one who revived the rule of the Rashidun caliphs.
The Turkish writer Mehmet Akif Ersoy called him the most beloved sultan of the Orient. In 1898, Wilhelm II, the German emperor, visited Saladin's tomb to pay his respects. The visit, coupled with anti-imperialist sentiments, encouraged the image in the Arab world of Saladin as a hero of the struggle against the West, building on the romantic image created by Walter Scott and other Europeans in the West at the time. Saladin's reputation had previously been largely forgotten in the Muslim world, eclipsed by more successful figures such as Baybars of Egypt. Modern Arab states have sought to commemorate Saladin through various measures, often based on the image of him created in the 19th-century West. A governorate centered on Tikrit and Samarra in modern-day Iraq is named after him, as is Salahaddin University in Erbil, the largest city of Iraqi Kurdistan. A suburban community of Erbil, Masif Salahaddin, is also named after him. Few structures associated with Saladin survive within modern cities. Saladin first fortified the Citadel of Cairo (1175–1183), which had been a domed pleasure pavilion with a fine view in more peaceful times. In Syria, even the smallest city is centred on a defensible citadel, and Saladin introduced this essential feature to Egypt. Although the Ayyubid dynasty that he founded would outlive him by only 57 years, the legacy of Saladin within the Arab world continues to this day. With the rise of Arab nationalism in the 20th century, particularly with regard to the Arab–Israeli conflict, Saladin's heroism and leadership gained a new significance. Saladin's recapture of Palestine from the European Crusaders is considered an inspiration for modern-day Arabs' opposition to Zionism. Moreover, the glory and comparative unity of the Arab world under Saladin was seen as the perfect symbol for the new unity sought by Arab nationalists, such as Gamal Abdel Nasser. For this reason, the Eagle of Saladin became the symbol of revolutionary Egypt, and was subsequently adopted by several other Arab states (the United Arab Republic, Iraq, Libya, the State of Palestine, and Yemen). Among Egyptian Shias, Saladin is dubbed "Kharab ad-Din", the destroyer of religion—a derisive play on the name "Saladin". Saladin was widely renowned in medieval Europe as a model of kingship, and in particular of the courtly virtue of regal generosity. As early as 1202/03, Walther von der Vogelweide urged the German king Philip of Swabia to be more like Saladin, who believed that a king's hands should have holes to let the gold fall through. By the 1270s, Jans der Enikel was spreading the fictitious but approving story of Saladin's table, which presented him as both pious and wise to religious diversity. In The Divine Comedy (1308–1320), Dante mentions him as one of the virtuous non-Christians in limbo, and he is also depicted favorably in Boccaccio's The Decameron (1349–53). Although Saladin faded into history after the Middle Ages, he appears in a sympathetic light in modern literature, first in Lessing's play Nathan the Wise (1779), which transfers the central idea of "Saladin's table" to the post-medieval world. He is a central character in Sir Walter Scott's novel The Talisman (1825), which more than any other single text influenced the romantic view of Saladin. Scott presented Saladin as a "modern [19th-century] liberal European gentlemen, beside whom medieval Westerners would always have made a poor showing".
20th-century French author Albert Champdor described him as "Le plus pur héros de l'Islam" ("The purest hero of Islam"). Despite the Crusaders' slaughter when they originally conquered Jerusalem in 1099, Saladin granted amnesty and free passage to all common Catholics and even to the defeated Christian army, as long as they were able to pay the aforementioned ransom (the Greek Orthodox Christians were treated even better, because they often opposed the western Crusaders). Notwithstanding the differences in beliefs, the Muslim Saladin was respected by Christian lords, Richard especially. Richard once praised Saladin as a great prince, saying that he was, without doubt, the greatest and most powerful leader in the Muslim world. Saladin, in turn, stated that there was not a more honorable Christian lord than Richard. After the treaty, Saladin and Richard sent each other many gifts as tokens of respect but never met face to face. In April 1191, a Frankish woman's three-month-old baby had been stolen from her camp and sold on the market. The Franks urged her to approach Saladin herself with her grievance. According to Ibn Shaddad, Saladin used his own money to buy the child back: "He gave it to the mother and she took it; with tears streaming down her face, she hugged the baby to her chest. The people were watching her and weeping and I (Ibn Shaddad) was standing amongst them. She suckled it for some time and then Saladin ordered a horse to be fetched for her and she went back to camp." Mark Cartwright, the publishing director of World History Encyclopedia, writes: "Indeed, it is somewhat ironic that the Muslim leader became one of the great exemplars of chivalry in 13th century European literature. Much has been written about the sultan during his own lifetime and since, but the fact that an appreciation for his diplomacy and leadership skills can be found in both contemporary Muslim and Christian sources would suggest that Saladin is indeed worthy of his position as one of the great medieval leaders."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Evolving_network] | [TOKENS: 1646]
Evolving network

Evolving networks are networks that change as a function of time. They are a natural extension of network science, since almost all real world networks evolve over time, either by adding or removing nodes or links. Often all of these processes occur simultaneously, such as in social networks, where people make and lose friends over time, thereby creating and destroying edges, and where some people join new social networks or leave their networks, changing the nodes in the network. Evolving network concepts build on established network theory and are now being introduced into the study of networks in many diverse fields.

Network theory background

The study of networks traces its foundations to the development of graph theory, which was first analyzed by Leonhard Euler in 1736 when he wrote the famous Seven Bridges of Königsberg paper. Probabilistic network theory then developed with the help of eight famous papers studying random graphs written by Paul Erdős and Alfréd Rényi. The Erdős–Rényi model (ER) supposes that a graph is composed of N labeled nodes where each pair of nodes is connected with a preset probability p. While the ER model's simplicity has helped it find many applications, it does not accurately describe many real world networks. The ER model fails to generate local clustering and triadic closures as often as they are found in real world networks. Therefore, the Watts and Strogatz model was proposed, whereby a network is constructed as a regular ring lattice and then edges are rewired according to some probability β. This produces a locally clustered network and dramatically reduces the average path length, creating networks which represent the small world phenomenon observed in many real world networks. Despite this achievement, both the ER and the Watts and Strogatz models fail to account for the formation of hubs as observed in many real world networks. The degree distribution in the ER model follows a Poisson distribution, while the Watts and Strogatz model produces graphs that are homogeneous in degree. Many networks are instead scale free, meaning that their degree distribution follows a power law of the form

P(k) ∝ k^(−γ)

The exponent γ turns out to be approximately 3 for many real world networks; however, it is not a universal constant and depends continuously on the network's parameters.

First evolving network model – scale-free networks

The Barabási–Albert (BA) model was the first widely accepted model to produce scale-free networks. This was accomplished by incorporating preferential attachment and growth, where nodes are added to the network over time and are more likely to link to other nodes that already have high degrees. The BA model was first applied to degree distributions on the web, where both of these effects can be clearly seen. New web pages are added over time, and each new page is more likely to link to highly visible hubs like Google, which have very high degrees, than to nodes with only a few links. Formally, this preferential attachment is

Π(k_i) = k_i / Σ_j k_j

the probability that a new node links to an existing node i of degree k_i.

Additions to BA model

The BA model was the first model to derive the network topology from the way the network was constructed, with nodes and links being added over time. However, the model makes only the simplest assumptions necessary for a scale-free network to emerge, namely that there is linear growth and linear preferential attachment.
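As an illustration, the linear preferential-attachment rule Π(k_i) = k_i / Σ_j k_j can be simulated in a few lines. The following Python sketch is illustrative rather than canonical (the function name and the choice of a complete graph as the seed are assumptions); it uses the common trick of keeping a list with one entry per edge endpoint, so that sampling uniformly from that list is automatically degree-proportional:

```python
import random

def barabasi_albert(n, m):
    """Grow an n-node graph in which each new node attaches to m
    existing nodes with probability proportional to their degree."""
    # Seed with a small complete graph on m + 1 nodes.
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # One list entry per edge endpoint: sampling uniformly from this
    # list is equivalent to sampling nodes proportionally to degree.
    endpoints = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:  # m distinct, degree-biased targets
            targets.add(random.choice(endpoints))
        for t in targets:
            edges.append((new, t))
            endpoints += [new, t]
    return edges
```

For comparison, the networkx library ships the same construction as networkx.barabasi_albert_graph(n, m), which is convenient when the resulting degree distribution is to be inspected directly.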
This minimal model does not capture variations in the shape of the degree distribution, variations in the degree exponent, or the size-independent clustering coefficient. Therefore, the original model has since been modified to more fully capture the properties of evolving networks by introducing a few new properties. One concern with the BA model is that the degree of each node experiences strong positive feedback, whereby the earliest nodes to acquire high degrees continue to dominate the network indefinitely. However, this can be alleviated by introducing a fitness for each node, which modifies the probability of new links being created with that node, or even of links to that node being removed. In order to preserve the preferential attachment from the BA model, this fitness is multiplied by the degree-based attachment term to give the probability that a new link connects to node i:

Π(k_i) = η_i k_i / Σ_j η_j k_j

where η_i is the fitness of node i, which may also depend on time. A decay of fitness with respect to time may occur and can be formalized as

Π(k_i) ∝ k_i (t − t_i)^(−ν)

where t_i is the time at which node i entered the network; the resulting degree exponent γ increases with ν. Further complications arise because nodes may be removed from the network with some probability. Additionally, existing links may be destroyed and new links between existing nodes may be created. The probability of these actions occurring may depend on time and may also be related to the node's fitness. Probabilities can be assigned to these events by studying the characteristics of the network in question, in order to grow a model network with identical properties. Growth would then take place with one of the following actions occurring at each time step (a code sketch of such a loop follows the Applications discussion below):

With probability p: add an internal link.
With probability q: delete a link.
With probability r: delete a node.
With probability 1 − p − q − r: add a node.

Other ways of characterizing evolving networks

In addition to growing network models as described above, there may be times when other methods are more useful or convenient for characterizing certain properties of evolving networks. In networked systems where competitive decision making takes place, game theory is often used to model system dynamics, and convergence towards equilibria can be considered as a driver of topological evolution. For example, Kasthurirathna and Piraveenan have shown that when individuals in a system display varying levels of rationality, improving the overall system rationality might be an evolutionary reason for the emergence of scale-free networks. They demonstrated this by applying evolutionary pressure on an initially random network which simulates a range of classic games, so that the network converges towards Nash equilibria while being allowed to re-wire. The networks become increasingly scale-free during this process. The most common way to view evolving networks is by considering them as successive static networks. This could be conceptualized as the individual still images that compose a motion picture. Many simple parameters exist to describe a static network (number of nodes, edges, path length, connected components), or to describe specific nodes in the graph, such as the number of links or the clustering coefficient. These properties can then individually be studied as a time series using signal processing notions. For example, we can track the number of links established to a server per minute by looking at the successive snapshots of the network and counting these links in each snapshot.
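To make the snapshot view concrete, here is a minimal Python sketch under stated assumptions: the snapshots are assumed to be available as networkx graphs, and the chosen descriptors are merely examples. It reduces a sequence of snapshots to per-step time series that can then be analyzed like any other signal:

```python
import networkx as nx

def snapshot_series(snapshots):
    """Reduce a list of static snapshots (one nx.Graph per time step)
    to time series of simple per-snapshot descriptors."""
    series = []
    for g in snapshots:
        series.append({
            "nodes": g.number_of_nodes(),
            "edges": g.number_of_edges(),
            # Average clustering is undefined on an empty graph.
            "clustering": nx.average_clustering(g) if g.number_of_nodes() else 0.0,
            "components": nx.number_connected_components(g),
        })
    return series

# Example: links added per time step, as in the server illustration above.
# new_links = [b["edges"] - a["edges"] for a, b in zip(series, series[1:])]
```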
Unfortunately, the analogy of snapshots to a motion picture also reveals the main difficulty with this approach: the time steps employed are very rarely suggested by the network and are instead arbitrary. Using extremely small time steps between each snapshot preserves resolution, but may actually obscure wider trends which only become visible over longer timescales. Conversely, using larger timescales loses the temporal order of events within each snapshot. Therefore, it may be difficult to find the appropriate timescale for dividing the evolution of a network into static snapshots. It may also be important to look at properties which cannot be directly observed by treating evolving networks as a sequence of snapshots, such as the duration of contacts between nodes. Other similar properties can be defined, and it is then possible to track these properties through the evolution of a network and visualize them directly. Another issue with using successive snapshots is that even slight changes in network topology can have large effects on the outcome of algorithms designed to find communities. Therefore, it is necessary to use a non-classical definition of communities which permits following the evolution of the community through a set of rules such as birth, death, merge, split, growth, and contraction.

Applications

Almost all real world networks are evolving networks, since they are constructed over time. By varying the respective probabilities described above, it is possible to use the expanded BA model to construct a network with nearly identical properties to those of many observed networks. Moreover, the concept of scale-free networks shows us that time evolution is a necessary part of understanding a network's properties, and that it is difficult to model an existing network as having been created instantaneously. Real evolving networks which are currently being studied include social networks, communications networks, the internet, the movie actor network, the World Wide Web, and transportation networks.
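The growth loop with the probabilities p, q, and r referred to above might look like the following Python sketch. It is a minimal illustration under stated assumptions, not a canonical implementation: the dict-of-sets adjacency representation, the uniform choice of deletion targets, the fitness-weighted attachment rule Π_i ∝ η_i k_i, and drawing a newcomer's fitness uniformly at random are all modeling choices of this sketch.

```python
import random

def preferential_pick(adj, fitness):
    # Pick a node with probability proportional to fitness * degree.
    # Degree-zero nodes get weight fitness * 1 so they can still be
    # chosen (a pragmatic tweak, not part of the model in the text).
    nodes = list(adj)
    weights = [fitness[v] * max(len(adj[v]), 1) for v in nodes]
    return random.choices(nodes, weights=weights)[0]

def evolve_step(adj, fitness, p, q, r):
    """One time step: add an internal link (prob p), delete a link
    (prob q), delete a node (prob r), or otherwise add a new node
    that attaches by fitness-weighted preferential attachment."""
    roll = random.random()
    if roll < p and len(adj) > 1:
        a = preferential_pick(adj, fitness)
        b = preferential_pick(adj, fitness)
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    elif roll < p + q:
        linked = [v for v in adj if adj[v]]
        if linked:
            a = random.choice(linked)
            b = random.choice(sorted(adj[a]))
            adj[a].discard(b)
            adj[b].discard(a)
    elif roll < p + q + r and adj:
        v = random.choice(list(adj))
        for u in adj.pop(v):
            adj[u].discard(v)
        del fitness[v]
    else:
        new = max(adj, default=-1) + 1
        target = preferential_pick(adj, fitness) if adj else None
        adj[new] = set()
        fitness[new] = random.random()  # newcomer's fitness
        if target is not None:
            adj[new].add(target)
            adj[target].add(new)
```

Iterating evolve_step from a small seed graph, with p, q, and r tuned against measurements of a real network, is the calibration idea described in the text above.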
========================================
[SOURCE: https://en.wikipedia.org/wiki/History_of_Mars_observation] | [TOKENS: 6105]
History of Mars observation

The history of Mars observation covers the recorded history of observing the planet Mars. Some of the early records of Mars's observation date back to the era of the ancient Egyptian astronomers in the 2nd millennium BCE. Chinese records about the motions of Mars appeared before the founding of the Zhou dynasty (1045 BCE). Detailed observations of the position of Mars were made by Babylonian astronomers, who developed arithmetic techniques to predict the future position of the planet. The ancient Greek philosophers and Hellenistic astronomers developed a geocentric model to explain the planet's motions. Measurements of Mars's angular diameter can be found in ancient Greek and Indian texts. In the 16th century, Nicolaus Copernicus proposed a heliocentric model for the Solar System in which the planets follow circular orbits about the Sun. This was revised by Johannes Kepler, yielding an elliptic orbit for Mars that more accurately fitted the observational data. The first telescopic observation of Mars was by Galileo Galilei in 1610. Within a century, astronomers discovered distinct albedo features on the planet, including the dark patch Syrtis Major Planum and polar ice caps. They were able to determine the planet's rotation period and axial tilt. These observations were primarily made during the time intervals when the planet was at opposition to the Sun, the points at which Mars made its closest approaches to the Earth. Better telescopes developed early in the 19th century allowed permanent Martian albedo features to be mapped in detail. The first crude map of Mars was published in 1840, followed by more refined maps from 1877 onward. When astronomers mistakenly thought they had detected the spectroscopic signature of water in the Martian atmosphere, the idea of life on Mars became popularized among the public. Percival Lowell believed he could see a network of artificial canals on Mars. These linear features later proved to be an optical illusion, and the atmosphere was found to be too thin to support an Earth-like environment. Yellow clouds on Mars have been observed since the 1870s, which Eugène M. Antoniadi suggested were windblown sand or dust. During the 1920s, the range of Martian surface temperature was measured; it ranged from −85 to 7 °C (−121 to 45 °F). The planetary atmosphere was found to be arid, with only trace amounts of oxygen and water. In 1947, Gerard Kuiper showed that the thin Martian atmosphere contained extensive carbon dioxide, roughly double the quantity found above a comparable area on Earth. The first standard nomenclature for Mars albedo features was adopted in 1960 by the International Astronomical Union. Since the 1960s, multiple robotic spacecraft have been sent to explore Mars from orbit and the surface. The planet has remained under observation by ground- and space-based instruments across a broad range of the electromagnetic spectrum. The discovery of meteorites on Earth that originated on Mars has allowed laboratory examination of the chemical conditions on the planet.

Earliest records

The existence of Mars as a wandering object in the night sky was recorded by ancient Egyptian astronomers. By the 2nd millennium BCE they were familiar with the apparent retrograde motion of the planet, in which it appears to move in the opposite direction across the sky from its normal progression. Mars was portrayed on the ceiling of the tomb of Seti I, on the Ramesseum ceiling, and in the Senenmut star map.
The last is the oldest known star map, being dated to 1534 BCE based on the position of the planets. By the period of the Neo-Babylonian Empire, Babylonian astronomers were making systematic observations of the positions and behavior of the planets. For Mars, they knew, for example, that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. The Babylonians invented arithmetic methods for making minor corrections to the predicted positions of the planets. This technique was primarily derived from timing measurements—such as when Mars rose above the horizon—rather than from the less accurately known position of the planet on the celestial sphere. Chinese records of the appearances and motions of Mars appear before the founding of the Zhou dynasty (1045 BCE), and by the Qin dynasty (221 BCE) astronomers maintained close records of planetary conjunctions, including those of Mars. Occultations of Mars by Venus were noted in 368, 375, and 405 CE. The period and motion of the planet's orbit were known in detail during the Tang dynasty (618 CE). The early astronomy of ancient Greece was influenced by knowledge transmitted from the Mesopotamian culture. The Babylonians associated Mars with Nergal, their god of war and pestilence, and the Greeks, following them, connected the planet with their own god of war, Ares. During this period, the motions of the planets were of little interest to the Greeks; Hesiod's Works and Days (c. 650 BCE) makes no mention of the planets.

Orbital models

The Greeks used the word planēton to refer to the seven celestial bodies that moved with respect to the background stars, and they held a geocentric view that these bodies moved about the Earth. In his work The Republic (X.616E–617B), the Greek philosopher Plato provided the oldest known statement defining the order of the planets in the Greek astronomical tradition. His list, in order of the nearest to the most distant from the Earth, was as follows: the Moon, Sun, Venus, Mercury, Mars, Jupiter, Saturn, and the fixed stars. In his dialogue Timaeus, Plato proposed that the progression of these objects across the skies depended on their distance, so that the most distant object moved the slowest. Aristotle, a student of Plato, observed an occultation of Mars by the Moon on 4 May 357 BCE. From this he concluded that Mars must lie further from the Earth than the Moon. He noted that other such occultations of stars and planets had been observed by the Egyptians and Babylonians. Aristotle used this observational evidence to support the Greek sequencing of the planets. His work De Caelo presented a model of the universe in which the Sun, Moon, and planets circle about the Earth at fixed distances. A more sophisticated version of the geocentric model was developed by the Greek astronomer Hipparchus when he proposed that Mars moved along a circular track called the epicycle that, in turn, orbited about the Earth along a larger circle called the deferent. In Roman Egypt during the 2nd century CE, Claudius Ptolemaeus (Ptolemy) attempted to address the problem of the orbital motion of Mars. Observations of Mars had shown that the planet appeared to move 40% faster on one side of its orbit than the other, in conflict with the Aristotelian model of uniform motion. Ptolemy modified the model of planetary motion by adding a point offset from the center of the planet's circular orbit about which the planet moves at a uniform rate of rotation.
He proposed that the order of the planets, by increasing distance, was: the Moon, Mercury, Venus, Sun, Mars, Jupiter, Saturn, and the fixed stars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection Almagest, which became the authoritative treatise on Western astronomy for the next fourteen centuries. In 1543, Nicolaus Copernicus published a heliocentric model in his work De revolutionibus orbium coelestium. This approach placed the Earth in an orbit around the Sun between the circular orbits of Venus and Mars. His model successfully explained why the planets Mars, Jupiter and Saturn were on the opposite side of the sky from the Sun whenever they were in the middle of their retrograde motions. Copernicus was able to sort the planets into their correct heliocentric order based solely on the period of their orbits about the Sun. His theory gradually gained acceptance among European astronomers, particularly after the publication of the Prutenic Tables by the German astronomer Erasmus Reinhold in 1551, which were computed using the Copernican model. On October 13, 1590, the German astronomer Michael Maestlin observed an occultation of Mars by Venus. One of his students, Johannes Kepler, quickly became an adherent to the Copernican system. After the completion of his education, Kepler became an assistant to the Danish nobleman and astronomer Tycho Brahe. With access granted to Tycho's detailed observations of Mars, Kepler was set to work mathematically assembling a replacement to the Prutenic Tables. After repeatedly failing to fit the motion of Mars into a circular orbit as required under Copernicanism, he succeeded in matching Tycho's observations by assuming the orbit was an ellipse with the Sun located at one of the foci. His model became the basis for Kepler's laws of planetary motion, which were published in his multi-volume work Epitome Astronomiae Copernicanae (Epitome of Copernican Astronomy) between 1615 and 1621.

Early telescope observations

At its closest approach, the angular size of Mars is 25 arcseconds (an arcsecond is 1/3,600 of a degree); this is much too small for the naked eye to resolve. Hence, prior to the invention of the telescope, nothing was known about the planet besides its red hue and its position on the sky. Telescopes appeared in 1608, and in September 1610 the Italian scientist Galileo Galilei indicated in his records that he had begun observing Mars through a telescope. This instrument was too primitive to display any surface detail on the planet, so he set the goal of seeing whether Mars exhibited phases of partial darkness similar to Venus or the Moon. Although uncertain of his success, by December he did note that Mars had shrunk in angular size. In 1636, Francesco Fontana produced the first drawing of Mars made at the telescope, though it recorded no actual surface features. In 1644, the Italian Jesuit Daniello Bartoli reported seeing two darker patches on Mars, and in 1645 the Polish astronomer Johannes Hevelius succeeded in observing a phase of Mars. During the oppositions of 1651, 1653 and 1655, when the planet made its closest approaches to the Earth, the Italian astronomer Giovanni Battista Riccioli and his student Francesco Maria Grimaldi noted patches of differing reflectivity on Mars. The first person to draw Mars showing actual observed features was the Dutch astronomer Christiaan Huygens. On November 28, 1659, he made an illustration of Mars that showed the distinct dark region now known as Syrtis Major Planum, and possibly one of the polar ice caps.
The same year, he succeeded in measuring the rotation period of the planet, giving it as approximately 24 hours. He made a rough estimate of the diameter of Mars, guessing it to be about 60% of the size of the Earth, which compares well with the modern value of about 53%. Perhaps the first definitive mention of Mars's southern polar ice cap was by the Italian astronomer Giovanni Domenico Cassini, in 1666. That same year, he used observations of the surface markings on Mars to determine a rotation period of 24h 40m. This differs from the currently accepted value by less than three minutes. In 1672, Huygens noticed a fuzzy white cap at the north pole. After Cassini became the first director of the Paris Observatory in 1671, he tackled the problem of the physical scale of the Solar System. The relative size of the planetary orbits was known from Kepler's third law, so what was needed was the actual size of one of the planets' orbits. For this purpose, the position of Mars was measured against the background stars from different points on the Earth, thereby measuring the diurnal parallax of the planet. During this year, the planet was moving past the point along its orbit where it was nearest to the Sun (a perihelic opposition), which made this a particularly close approach to the Earth. Cassini and Jean Picard determined the position of Mars from Paris, while the French astronomer Jean Richer made measurements from Cayenne, South America. Although these observations were hampered by the quality of the instruments, the parallax computed by Cassini came within 10% of the correct value. The English astronomer John Flamsteed made comparable measurement attempts and had similar results. In 1704, Italian astronomer Jacques Philippe Maraldi made a systematic study of the southern cap and observed that it underwent variations as the planet rotated. This indicated that the cap was not centered on the pole. He also observed that the size of the cap varied over time. The German-born British astronomer Sir William Herschel began making observations of the planet Mars in 1777, particularly of the planet's polar caps. In 1781, he noted that the south cap appeared "extremely large", which he ascribed to that pole being in darkness for the past twelve months. By 1784, the southern cap appeared much smaller, thereby suggesting that the caps vary with the planet's seasons and thus were made of ice. In 1781, he estimated the rotation period of Mars as 24h 39m 21.67s and measured the axial tilt of the planet's poles to the orbital plane as 28.5°. He noted that Mars had a "considerable but moderate atmosphere, so that its inhabitants probably enjoy a situation in many respects similar to ours". Between 1796 and 1809, the French astronomer Honoré Flaugergues noticed obscurations of Mars, suggesting "ochre-colored veils" covered the surface. This may be the earliest report of yellow clouds or storms on Mars.

Geographical period

At the start of the 19th century, improvements in the size and quality of telescope optics proved a significant advance in observation capability. Most notable among these enhancements was the two-component achromatic lens of the German optician Joseph von Fraunhofer, which essentially eliminated coma—an optical effect that can distort the outer edge of the image. By 1812, Fraunhofer had succeeded in creating an achromatic objective lens 190 mm (7.5 in) in diameter. The size of this primary lens is the main factor in determining the light gathering ability and resolution of a refracting telescope.
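To put rough numbers on that claim, the diffraction limit of a telescope can be estimated with the Rayleigh criterion, θ ≈ 1.22 λ/D. The short Python sketch below is only a back-of-envelope illustration (the function name, the 550 nm choice of wavelength, and the sample apertures are assumptions); it compares apertures of the period against Mars's roughly 25-arcsecond disk at a close opposition:

```python
import math

def rayleigh_limit_arcsec(aperture_m, wavelength_m=550e-9):
    """Diffraction-limited angular resolution (Rayleigh criterion),
    theta = 1.22 * lambda / D, converted from radians to arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

MARS_DISK_ARCSEC = 25.0  # approximate disk size at a favorable opposition

for d in (0.095, 0.190):  # 95 mm and 190 mm apertures
    limit = rayleigh_limit_arcsec(d)
    print(f"{d * 1000:.0f} mm lens: {limit:.2f} arcsec limit, "
          f"~{MARS_DISK_ARCSEC / limit:.0f} resolution elements across Mars")
```

On this estimate a 190 mm objective can, in principle, separate features a few tens of times smaller than the Martian disk, which is consistent with the detailed mapping work described next.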
During the opposition of Mars in 1830, the German astronomers Johann Heinrich Mädler and Wilhelm Beer used a 95 mm (3.7 in) Fraunhofer refracting telescope to launch an extensive study of the planet. They chose a feature located 8° south of the equator as their point of reference. (This was later named the Sinus Meridiani, and it would become the zero meridian of Mars.) During their observations, they established that most of Mars's surface features were permanent, and more precisely determined the planet's rotation period. In 1840, Mädler combined ten years of observations to draw the first map of Mars. Rather than giving names to the various markings, Beer and Mädler simply designated them with letters; thus Meridian Bay (Sinus Meridiani) was feature "a". Working at the Vatican Observatory during the opposition of Mars in 1858, Italian astronomer Angelo Secchi noticed a large blue triangular feature, which he named the "Blue Scorpion". This same seasonal cloud-like formation was seen by English astronomer J. Norman Lockyer in 1862, and it has been viewed by other observers. During the 1862 opposition, Dutch astronomer Frederik Kaiser produced drawings of Mars. By comparing his illustrations to those of Huygens and the English natural philosopher Robert Hooke, he was able to further refine the rotation period of Mars. His value of 24h 37m 22.6s is accurate to within a tenth of a second. Father Secchi produced some of the first color illustrations of Mars in 1863. He used the names of famous explorers for the distinct features. In 1869, he observed two dark linear features on the surface that he referred to as canali, which is Italian for 'channels' or 'grooves'. In 1867, English astronomer Richard A. Proctor created a more detailed map of Mars based on the 1864 drawings of English astronomer William R. Dawes. Proctor named the various lighter or darker features after astronomers, past and present, who had contributed to the observations of Mars. During the same decade, comparable maps and nomenclature were produced by the French astronomer Camille Flammarion and the English astronomer Nathan Green. At the University of Leipzig in 1862–64, German astronomer Johann K. F. Zöllner developed a custom photometer to measure the reflectivity of the Moon, planets and bright stars. For Mars, he derived an albedo of 0.27. Between 1877 and 1893, German astronomers Gustav Müller and Paul Kempf observed Mars using Zöllner's photometer. They found a small phase coefficient—the variation in reflectivity with angle—indicating that the surface of Mars is smooth and without large irregularities. In 1867, French astronomer Pierre Janssen and British astronomer William Huggins used spectroscopes to examine the atmosphere of Mars. Both compared the optical spectrum of Mars to that of the Moon. As the spectrum of the latter did not display absorption lines of water, they believed they had detected the presence of water vapor in the atmosphere of Mars. This result was confirmed by German astronomer Herman C. Vogel in 1872 and English astronomer Edward W. Maunder in 1875, but would later come into question. In 1882, an article appeared in Scientific American discussing snow on the polar regions of Mars and speculation on the probability of ocean currents. A particularly favorable perihelic opposition occurred in 1877. The English astronomer David Gill used this opportunity to measure the diurnal parallax of Mars from Ascension Island, which led to a parallax estimate of 8.78 ± 0.01 arcseconds. 
Using this result, he was able to more accurately determine the distance of the Earth from the Sun, based upon the relative size of the orbits of Mars and the Earth. He noted that the edge of the disk of Mars appeared fuzzy because of its atmosphere, which limited the precision he could obtain for the planet's position. In August 1877, the American astronomer Asaph Hall discovered the two moons of Mars using a 660 mm (26 in) telescope at the U.S. Naval Observatory. The names of the two satellites, Phobos and Deimos, were chosen by Hall based upon a suggestion by Henry Madan, a science instructor at Eton College in England.

Martian canals

During the 1877 opposition, Italian astronomer Giovanni Schiaparelli used a 22 cm (8.7 in) telescope to help produce the first detailed map of Mars. These maps notably contained features he called canali, which were later shown to be an optical illusion. These canali were supposedly long straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. His term canali was popularly mistranslated in English as canals. In 1886, the English astronomer William F. Denning observed that these linear features were irregular in nature and showed concentrations and interruptions. By 1895, English astronomer Edward Maunder had become convinced that the linear features were merely the summation of many smaller details. In his 1892 work La planète Mars et ses conditions d'habitabilité, Camille Flammarion wrote about how these channels resembled man-made canals, which an intelligent race could use to redistribute water across a dying Martian world. He advocated for the existence of such inhabitants, and suggested they might be more advanced than humans. Influenced by the observations of Schiaparelli, Percival Lowell founded an observatory with 30 and 45 cm (12 and 18 in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894 and the following less favorable oppositions. He published books on Mars and life on the planet, which had a great influence on the public. The canali were also reported by other astronomers, such as Henri Joseph Perrotin and Louis Thollon, using a 38 cm (15 in) refractor at the Nice Observatory in France, one of the largest telescopes of that time. Beginning in 1901, American astronomer A. E. Douglass attempted to photograph the canal features of Mars. These efforts appeared to succeed when American astronomer Carl O. Lampland published photographs of the supposed canals in 1905. Although these results were widely accepted, they became contested by Greek astronomer Eugène M. Antoniadi, English naturalist Alfred Russel Wallace and others as merely imagined features. As bigger telescopes were used, fewer long, straight canali were observed. During an observation in 1909 by Flammarion with an 84 cm (33 in) telescope, irregular patterns were observed, but no canali were seen. Starting in 1909, Eugène Antoniadi helped disprove the theory of Martian canali by viewing through the great refractor of Meudon, the Grande Lunette (83 cm lens). Three observational factors worked together: he was viewing through the third-largest refractor in the world, Mars was at opposition, and the weather was exceptionally clear. The canali dissolved before Antoniadi's eyes into various "spots and blotches" on the surface of Mars.

Refining planetary parameters

Surface obscuration caused by yellow clouds had been noted in the 1870s, when they were observed by Schiaparelli.
Evidence for such clouds was observed during the oppositions of 1892 and 1907. In 1909, Antoniadi noted that the presence of yellow clouds was associated with the obscuration of albedo features. He discovered that Mars appeared more yellow during oppositions when the planet was closest to the Sun and was receiving more energy. He suggested windblown sand or dust as the cause of the clouds. In 1894, American astronomer William W. Campbell found that the spectrum of Mars was identical to the spectrum of the Moon, throwing doubt on the burgeoning theory that the atmosphere of Mars is similar to that of the Earth. Previous detections of water in the atmosphere of Mars were explained by unfavorable conditions, and Campbell determined that the water signature came entirely from the Earth's atmosphere. Although he agreed that the ice caps did indicate there was water in the atmosphere, he did not believe the caps were sufficiently large to allow the water vapor to be detected. At the time, Campbell's results were considered controversial and were criticized by members of the astronomical community, but they were confirmed by American astronomer Walter S. Adams in 1925. Baltic German astronomer Hermann Struve used the observed changes in the orbits of the Martian moons to determine the gravitational influence of the planet's oblate shape. In 1895, he used this data to estimate that the equatorial diameter was 1/190 larger than the polar diameter. In 1911, he refined the value to 1/192. This result was confirmed by American meteorologist Edgar W. Woolard in 1944. Using a vacuum thermocouple attached to the 2.54 m (100 in) Hooker Telescope at Mount Wilson Observatory, in 1924 the American astronomers Seth Barnes Nicholson and Edison Pettit were able to measure the thermal energy being radiated by the surface of Mars. They determined that the temperature ranged from −68 °C (−90 °F) at the pole up to 7 °C (45 °F) at the midpoint of the disk (corresponding to the equator). Beginning in the same year, radiated energy measurements of Mars were made by American physicist William Coblentz and American astronomer Carl Otto Lampland. The results showed that the night-time temperature on Mars dropped to −85 °C (−121 °F), indicating an "enormous diurnal fluctuation" in temperatures. The temperature of Martian clouds was measured as −30 °C (−22 °F). In 1926, by measuring spectral lines that were redshifted by the orbital motions of Mars and Earth, American astronomer Walter Sydney Adams was able to directly measure the amount of oxygen and water vapor in the atmosphere of Mars. He determined that "extreme desert conditions" were prevalent on Mars. In 1934, Adams and American astronomer Theodore Dunham Jr. found that the amount of oxygen in the atmosphere of Mars was less than one percent of the amount over a comparable area on Earth. In 1927, Dutch graduate student Cyprianus Annius van den Bosch made a determination of the mass of Mars based upon the motions of the Martian moons, with an accuracy of 0.2%. This result was confirmed by the Dutch astronomer Willem de Sitter and published posthumously in 1938. Using observations of the near-Earth asteroid Eros from 1926 to 1945, German-American astronomer Eugene K. Rabe was able to make an independent estimate of the mass of Mars, as well as of the other planets in the inner Solar System, from the planet's gravitational perturbations of the asteroid.
His estimated margin of error was 0.05%, but subsequent checks suggested his result was poorly determined compared to other methods. During the 1920s, French astronomer Bernard Lyot used a polarimeter to study the surface properties of the Moon and planets. In 1929, he noted that the polarized light reflected from the Martian surface is very similar to that reflected from the Moon, although he speculated that his observations could be explained by frost and possibly vegetation. Based on the amount of sunlight scattered by the Martian atmosphere, he set an upper limit on its thickness of 1/15 that of the Earth's atmosphere. This restricted the surface pressure to no greater than 2.4 kPa (24 mbar). Using infrared spectrometry, in 1947 the Dutch-American astronomer Gerard Kuiper detected carbon dioxide in the Martian atmosphere. He was able to estimate that the amount of carbon dioxide over a given area of the surface is double that on the Earth. However, because he overestimated the surface pressure on Mars, Kuiper concluded erroneously that the ice caps could not be composed of frozen carbon dioxide. In 1948, American meteorologist Seymour L. Hess determined that the formation of the thin Martian clouds would only require 4 mm (0.16 in) of water precipitation and a vapor pressure of 0.1 kPa (1.0 mbar). The first standard nomenclature for Martian albedo features was introduced by the International Astronomical Union (IAU) when in 1960 it adopted 128 names from Antoniadi's 1929 map, named La Planète Mars. The Working Group for Planetary System Nomenclature (WGPSN) was established by the IAU in 1973 to standardize the naming scheme for Mars and other bodies.

Remote sensing

The International Planetary Patrol Program was formed in 1969 as a consortium to continually monitor planetary changes. This worldwide group focused on observing dust storms on Mars. Their images allow Martian seasonal patterns to be studied globally, and they showed that most Martian dust storms occur when the planet is closest to the Sun. Since the 1960s, robotic spacecraft have been sent to explore Mars from orbit and the surface in extensive detail. In addition, remote sensing of Mars from Earth by ground-based and orbiting telescopes has continued across much of the electromagnetic spectrum. These observations include infrared measurements to determine the composition of the surface, ultraviolet and submillimeter observations of the atmospheric composition, and radio measurements of wind velocities. The Hubble Space Telescope (HST) has been used to perform systematic studies of Mars and has taken the highest-resolution images of Mars ever captured from Earth. This telescope can produce useful images of the planet when it is at an angular distance of at least 50° from the Sun. The HST can take images of a hemisphere, which yields views of entire weather systems. Earth-based telescopes equipped with charge-coupled devices can produce useful images of Mars, allowing for regular monitoring of the planet's weather during oppositions. X-ray emission from Mars was first observed by astronomers in 2001 using the Chandra X-ray Observatory, and in 2003 it was shown to have two components. The first component is caused by X-rays from the Sun scattering off the upper Martian atmosphere; the second comes from interactions between ions that result in an exchange of charges. The emission from the latter source has been observed out to eight times the radius of Mars by the XMM-Newton orbiting observatory.
In 1983, the analysis of the shergottite, nakhlite, and chassignite (SNC) group of meteorites showed that they may have originated on Mars. The Allan Hills 84001 meteorite, discovered in Antarctica in 1984, is believed to have originated on Mars but has an entirely different composition from the SNC group. In 1996, it was announced that this meteorite might contain evidence for microscopic fossils of Martian bacteria. However, this finding remains controversial. Chemical analysis of the Martian meteorites found on Earth suggests that the ambient near-surface temperature of Mars has most likely been below the freezing point of water (0 °C) for much of the last four billion years.
========================================
[SOURCE: https://en.wikipedia.org/wiki/XAI_(company)#cite_note-63] | [TOKENS: 1856]
xAI (company)

X.AI Corp., doing business as xAI, is an American company working in the areas of artificial intelligence (AI), social media, and technology; it is a wholly owned subsidiary of the American aerospace company SpaceX. Founded by Elon Musk in 2023, the company's flagship products are the generative AI chatbot Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025.

History

xAI was founded on March 9, 2023, by Musk. As chief engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment". By May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024, Musk was diverting a large number of Nvidia chips that had been ordered by Tesla, Inc. to X and xAI. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired sister company X Corp., the developer of social media platform X (formerly known as Twitter), which was previously acquired by Musk in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt. Meanwhile, xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that it had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans. Morgan Stanley took no stake in the debt. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, xAI announced "Grok for Government", and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot designed to advance artificial intelligence capabilities. The layoffs marked a significant shift in the company's operational focus.
In June 2024, the Greater Memphis Chamber announced that xAI was planning to build Colossus, the world's largest supercomputer, in Memphis, Tennessee. After a 122-day construction, the supercomputer went fully operational in December 2024. Local government in Memphis has voiced concerns regarding the increased usage of electricity, 150 megawatts of power at peak; while the agreement with the city was being worked out, the company deployed 14 VoltaGrid portable methane-gas-powered generators to temporarily enhance the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of polluting gases, and that xAI had been operating the turbines illegally without the necessary permits. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators produce about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. The Southern Environmental Law Center has stated that the current gas turbines produce about 2,000 tons of nitrogen oxide emissions annually. On November 26, 2025, Elon Musk announced plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, about 10% of the data center's estimated power use. xAI has continually expanded its infrastructure, purchasing a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power; the expansion is driven by xAI's commitment to compete with OpenAI's ChatGPT and Anthropic's Claude models. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that structured xAI as a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring organized xAI into four primary development teams, one for the Grok app and others for its other features such as Grok Imagine; Grokipedia, X, and API features fell under smaller teams.

Products

According to Musk in July 2023, a politically correct AI would be "incredibly dangerous" and misleading, citing as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking". Musk also said that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot integrated with X. xAI stated that once the bot was out of beta, it would be available only to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced. On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API); an illustrative client sketch follows at the end of this entry. On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature; xAI also introduced a web-search function called DeepSearch. In March 2025, xAI added an image-editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4, along with a high-performance version of the model called Grok Heavy, access to which cost $300 per month at the time. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release "a great AI-generated game" before the end of 2026. Valuation See also Notes References External links
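As an illustration of the API mentioned above, the following minimal sketch shows how a client might call it. It assumes an OpenAI-compatible chat-completions interface; the endpoint URL and model name below are assumptions for illustration only, not details confirmed by this article.

import os
import requests

# Assumed endpoint for the API xAI released on October 21, 2024.
API_URL = "https://api.x.ai/v1/chat/completions"

def ask_grok(prompt: str) -> str:
    """Send a single-turn chat request and return the reply text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['XAI_API_KEY']}"},
        json={
            "model": "grok-3",  # assumed model identifier
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    response.raise_for_status()
    # OpenAI-compatible APIs return replies under choices[0].message.content.
    return response.json()["choices"][0]["message"]["content"]

print(ask_grok("Summarize xAI's history in one sentence."))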
========================================
[SOURCE: https://en.wikipedia.org/wiki/Lightbulb_joke] | [TOKENS: 453]
Contents Lightbulb joke A lightbulb joke is a joke cycle that asks how many people of a certain group are needed to change, replace, or screw in a light bulb. Generally, the punch line answer highlights a stereotype of the target group. There are numerous versions of the lightbulb joke satirizing a wide range of cultures, beliefs, and occupations. Early versions of the joke, popular in the late 1960s and the 1970s, were used to insult the intelligence of people, especially Poles ("Polish jokes"). Such jokes generally take the form of: Q. How many [members of the target group] does it take to change a lightbulb? A. Three — one to hold the light bulb and two to turn the ladder around. Although lightbulb jokes tend to be derogatory in tone (e.g., "How many drunkards..." / "Four: one to hold the light bulb and three to drink until the room spins"), the people targeted by them may take pride in the stereotypes expressed and are often themselves the jokes' originators. An example where the joke itself becomes a statement of ethnic pride is: Q. How many Germans does it take to change a lightbulb? A. One, we're very efficient but not funny. Lightbulb jokes applied to subgroups can be used to ease tensions between them. Variations Some versions of the joke are puns on the words "change" or "screw". Q. How many psychiatrists does it take to change a light bulb? A. None—the light bulb will change when it's ready. Q. How many flies does it take to screw in a lightbulb? A. Two, but don't ask me how they got in there. Lightbulb jokes are often responses to contemporary events. For example, the lightbulb may not need to be changed at all due to ongoing power outages. The Village Voice held a $200 lightbulb joke contest around the time of the Iran hostage crisis, with the winning joke being: Q. How many Iranians does it take to change a light bulb? A. You send us the prize money and we'll tell you the answer. References Notes
========================================
[SOURCE: https://en.wikipedia.org/wiki/Plantation_economy] | [TOKENS: 1119]
Contents Plantation economy A plantation economy is an economy based on agricultural mass production, usually of a few commodity crops, grown on large farms worked by laborers or slaves. The properties are called plantations. Plantation economies rely on the export of cash crops as a source of income. Prominent crops included cotton, rubber, sugar cane, tobacco, figs, rice, kapok, sisal, red sandalwood, and species in the genus Indigofera, used to produce indigo dye. The longer a crop's harvest period, the more efficient plantations become. Economies of scale are also achieved when the distance to market is long. Plantation crops usually need processing immediately after harvesting. Sugarcane, tea, sisal, and palm oil are most suited to plantations, while coconuts, rubber, and cotton are suitable to a lesser extent. Conditions for formation Plantation economies are factory-like, industrialized, and centralized forms of agriculture, owned by large corporations or affluent owners. Under normal circumstances, plantation economies are not as efficient as small farm holdings, since proper supervision of labor over a large land area is immensely difficult. When there are large distances between the plantations and their markets, processing can reduce the bulk of the crop and lower shipping costs. Large plantations producing large quantities of a good are able to achieve economies of scale for expensive processing machinery, as the per-unit cost of processing is greatly diminished (a worked example appears at the end of this entry). This economy of scale is best achieved with tropical crops that are harvested continuously through the year, fully utilizing the processing machinery. Examples of crops suitable for such processing are sugar, sisal, palm oil, and tea. American plantations In the Thirteen Colonies, plantations were concentrated in the South. These colonies included Maryland, Virginia, North Carolina, South Carolina, and Georgia. They had good soil and long growing seasons, ideal for crops such as rice and tobacco. The existence of many waterways in the region made transportation easier. Each colony specialized in one or two crops, with Virginia standing out in tobacco production. The North, meanwhile, focused more on food crops like corn and had a structure of yeoman farmers in addition to manufacturing. By the time of the American Civil War, this structural difference provided the loyal states with a significant advantage in industrial output and a greater ease of feeding their armies. By contrast, the extractive cash-crop economy of the rebel states was hampered by the blockade and shortsighted King Cotton diplomacy. The Northern states processed the output of their farms and mines into intermediate or finished goods (corn into whiskey, iron ore and coal into steel, etc.), whereas the South relied to a much larger extent on exporting raw goods to either the North or Europe (e.g., most cotton was spun into yarn and cloth in Europe or the North, not in the South where it was produced). Consequently, the North favored high protective tariffs whereas the South advocated free trade. Slavery Planters embraced the use of slaves mainly because indentured labor became expensive. Some indentured servants were also leaving to start their own farms, as land was widely available. Colonists tried to use Native Americans for labor, but they were susceptible to European diseases and died in large numbers. The plantation owners then turned to enslaved Africans for labor.
In 1665, there were fewer than 500 Africans in Virginia, but by 1750, 85 percent of the 235,000 slaves lived in the Southern colonies, Virginia included. Africans made up 40 percent of the South's population. According to the 1840 United States census, one out of every four families in Virginia owned slaves, and more than 100 plantation owners each held over 100 slaves. The number of slaves in the 15 slave states was just shy of 4 million out of a total population of 12.4 million, about 32 percent. Fewer than one-third of white Southern families owned slaves at the peak of slavery prior to the Civil War; in Mississippi and South Carolina the figure approached one half. The total number of slave owners was 385,000 (including, in Louisiana, some free African Americans), approximately 3.8% of the population of the Southern and Border states. Industrial Revolution in Europe Western Europe was the final destination for plantation produce. At this time, Europe was beginning to industrialize and needed large quantities of materials to manufacture goods. As the power center of the world at the time, it exploited the New World and Africa to industrialize: Africa supplied slaves for the plantations, while the New World produced raw material for industries in Europe. Manufactured goods, of higher value, were then sold both to Africa and the New World. The system was largely run by European merchants. Indigo plantations Indigofera was a major crop cultivated during the 18th century in Venezuela, Guatemala, and Haiti (until the slave rebellion against France, after which Haiti was embargoed by Europe), and in India in the 19th and 20th centuries. The indigo crop was grown for making blue indigo dye in the pre-industrial age. Mahatma Gandhi's investigation of indigo workers' claims of exploitation led to the passage of the Champaran Agrarian Bill in 1917 by the British colonial government. Southeast Asia In Southeast Asia, British and Dutch colonies established plantations to produce agricultural commodities including tea, pepper and other spices, palm oil, coffee, and rubber. Large-scale agricultural production continues in many areas. See also References
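To make the economies-of-scale point above concrete (all figures invented for illustration): if processing machinery costs $500,000 a year to operate regardless of throughput, a plantation processing 1,000 tons of cane pays $500 per ton in machinery cost alone, while one processing 50,000 tons pays only $10 per ton. A crop harvested continuously through the year keeps that machinery utilized year-round, which is why sugarcane, tea, sisal, and palm oil suit plantations best.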
========================================
[SOURCE: https://en.wikipedia.org/wiki/Ray_Santilli] | [TOKENS: 480]
Contents Ray Santilli Ray Santilli (born 30 September 1958) is a British musician and record and film producer. He is best known for his exploitation in 1995 of the controversial "alien autopsy" footage and as the subject of the Warner Bros. film Alien Autopsy. Early life Born in London, Santilli was the son of Italian immigrants. He spent his childhood in Islington, London. Career Ray Santilli started his professional career in 1974 as a session musician, record producer, and music distributor. In 1981, Santilli produced The Tweets album which featured "The Birdy Song". In 1982, he founded AMP Entertainment, where he produced and promoted acts of the day. In 1985 he founded Music Broadcasting Services Ltd, an independent record label which handled the exclusive rights to the Walt Disney Audio Soundtrack Catalogue in the United Kingdom. In 1987 Santilli produced the charity record "The Wishing Well", featuring Boy George, Dollar, and Grace Kennedy, for Great Ormond Street Hospital. In 1991, Santilli founded the Merlin Group, which specialised in the re-recording of hits with original artists; Merlin also produced and marketed a number of television specials. In 1994 Santilli formed Orbital Media Ltd, where he produced a succession of TV documentaries and films for television. Santilli is best known for his claim to have discovered footage depicting the autopsy of an alien creature. The "alien autopsy" footage, supposedly of extraterrestrial corpses from the so-called Roswell UFO incident, was broadcast to a worldwide audience on 28 August 1995. Those who took part in making the film have all admitted it is a hoax, although Santilli still maintains it is real despite having changed his story numerous times. He also claims that Kodak analysed the film and confirmed its date, but he has always refused requests to resubmit the film together with the images. In 2006 the story of the autopsy film was the subject of a Warner Bros. feature film, Alien Autopsy, starring the British double act Ant & Dec: Dec plays Santilli, with Ant as Santilli's real-life business partner and friend Gary Shoefield. In the same year, Santilli claimed that sections of the autopsy footage had been "restored". Filmography References External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-23] | [TOKENS: 10515]
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship because his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002, the same year Musk became an American citizen. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth.
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. 
He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). 
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from the Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, the company was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, SpaceX successfully landed the first stage of a Falcon 9 on a land platform in 2015. Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several commercially successful electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second-largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials.
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials, which have caused the deaths of some monkeys, have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018; it used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021, and local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter, and by the end of April he had successfully concluded his bid for approximately $44 billion, including approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk loosened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification.
Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company and initially gave $50 million. In 2018, Musk left the OpenAI board. Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign.
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign and hosting DeSantis's campaign announcement in a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." Fortune remarked that the non-invitation was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023.
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, which Musk accepted. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clear. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE, and a federal judge later ruled that Musk had acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in response to DOGE, attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He prioritized secrecy within the organization and accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by that time, most of them of children; by November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. Days later, on June 5, Musk posted on X (formerly Twitter) that Trump was named in the files of sex offender Jeffrey Epstein: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public."
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, with the June 5 Epstein post as its most notable event. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, repeatedly urging that humanity colonize Mars in order to become an interplanetary species and lower the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024.
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... 
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he stated has a "'restoring effect' that helps his 'mental calibration'". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. 
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year; Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day 2012, Musk emailed Epstein asking "Do you have any parties planned?
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house; Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein replied, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021, $400 billion in December 2024, $500 billion in October 2025, $600 billion in mid-December 2025, $700 billion later that month, and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions while often making controversial statements, in contrast to other billionaires who prefer reclusiveness in order to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Edward Felsenthal, then Time's editor-in-chief, wrote: "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
========================================
[SOURCE: https://en.wikipedia.org/wiki/File:OpenAI_corporate_structure.svg] | [TOKENS: 158]
File:OpenAI corporate structure.svg: a diagram of OpenAI's corporate structure, created from Mermaid source code kept on the file page; editors updating the diagram are asked to update that code as well.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Middle_East#cite_note-15] | [TOKENS: 6152]
Contents Middle East The Middle East[b] is a geopolitical region encompassing the Arabian Peninsula, Egypt, Iran, Iraq, the Levant, and Turkey. The term came into widespread usage by Western European nations in the early 20th century as a replacement for the term Near East (both were in contrast to the Far East). The term "Middle East" has led to some confusion over its changing definitions. Since the late 20th century, it has been criticized as being too Eurocentric. The region includes the vast majority of the territories included in the closely associated definition of West Asia, but without the South Caucasus. It also includes all of Egypt (not just the Sinai region) and all of Turkey (including East Thrace). Most Middle Eastern countries (13 out of 18) are part of the Arab world. The three most populous countries in the region are Egypt, Iran, and Turkey, while Saudi Arabia is the largest Middle Eastern country by area. The history of the Middle East dates back to ancient times, and the region was long considered the "cradle of civilization". Its geopolitical importance has been recognized, and competed over, for millennia. The Abrahamic religions (Judaism, Christianity, and Islam) have their origins in the Middle East. Arabs constitute the main ethnic group in the region, followed by Turks, Persians, Kurds, Jews, and Assyrians. The Middle East generally has a hot, arid climate, especially in the Arabian and Egyptian regions. Several major rivers provide irrigation to support agriculture in limited areas, such as the Nile Delta in Egypt, the Tigris and Euphrates watersheds of Mesopotamia, and the basin of the Jordan River that spans most of the Levant. These regions are collectively known as the Fertile Crescent, and comprise the core of what historians had long referred to as the cradle of civilization; multiple regions of the world have since been classified as also having developed independent, original civilizations. Conversely, the Levantine coast and most of Turkey have relatively temperate climates typical of the Mediterranean, with dry summers and cool, wet winters. Most of the countries that border the Persian Gulf have vast reserves of petroleum. Monarchs of the Arabian Peninsula in particular have benefitted economically from petroleum exports. Because of the arid climate and dependence on the fossil fuel industry, the Middle East is both a major contributor to climate change and a region that is expected to be severely adversely affected by it. Other concepts of the region exist, including the broader Middle East and North Africa (MENA), which includes states of the Maghreb and the Sudan. The term "Greater Middle East" also includes Afghanistan, Mauritania, Pakistan, as well as parts of East Africa, and sometimes Central Asia and the South Caucasus. Terminology The term "Middle East" may have originated in the 1850s in the British India Office. However, it became more widely known when United States naval strategist Alfred Thayer Mahan used the term in 1902 to "designate the area between Arabia and India". During this time the British and Russian empires were vying for influence in Central Asia, a rivalry that would become known as the Great Game. Mahan realized not only the strategic importance of the region, but also that of its center, the Persian Gulf. He labeled the area surrounding the Persian Gulf as the Middle East.
He said that, beyond Egypt's Suez Canal, the Gulf was the most important passage for Britain to control in order to keep the Russians from advancing towards British India. Mahan first used the term in his article "The Persian Gulf and International Relations", published in September 1902 in the National Review, a British journal. The Middle East, if I may adopt a term which I have not seen, will some day need its Malta, as well as its Gibraltar; it does not follow that either will be in the Persian Gulf. Naval force has the quality of mobility which carries with it the privilege of temporary absences; but it needs to find on every scene of operation established bases of refit, of supply, and in case of disaster, of security. The British Navy should have the facility to concentrate in force if occasion arise, about Aden, India, and the Persian Gulf. Mahan's article was reprinted in The Times and followed in October by a 20-article series entitled "The Middle Eastern Question", written by Sir Ignatius Valentine Chirol. During this series, Sir Ignatius expanded the definition of the Middle East to include "those regions of Asia which extend to the borders of India or command the approaches to India." After the series ended in 1903, The Times removed quotation marks from subsequent uses of the term. Until World War II, it was customary to refer to areas centered on Turkey and the eastern shore of the Mediterranean as the "Near East", while the "Far East" centered on China, India and Japan. The Middle East was then defined as the area from Mesopotamia to Burma; namely, the area between the Near East and the Far East. This area broadly corresponds to South Asia. In the late 1930s, the British established the Middle East Command, based in Cairo, for its military forces in the region. After that time, the term "Middle East" gained broader usage in Europe and the United States. Following World War II, for example, the Middle East Institute was founded in Washington, D.C. in 1946. The corresponding adjective is Middle Eastern and the derived noun is Middle Easterner. Non-Eurocentric terms such as "Southwest Asia" or "Swasia" have seen only sparse use, and the inclusion of Egypt, an African country, in the region challenges the usefulness of such terms. The description "Middle" has also led to some confusion over changing definitions. Before the First World War, "Near East" was used in English to refer to the Balkans and the Ottoman Empire, while "Middle East" referred to the Caucasus, Persia, and Arabian lands, and sometimes Afghanistan, India and others. In contrast, "Far East" referred to the countries of East Asia (e.g. China, Japan, and Korea). With the collapse of the Ottoman Empire in 1918, "Near East" largely fell out of common use in English, while "Middle East" came to be applied to the emerging independent countries of the Islamic world. However, the usage "Near East" was retained by a variety of academic disciplines, including archaeology and ancient history, in which it describes an area identical to that covered by the term Middle East, which these disciplines do not use (see ancient Near East).[citation needed] The first official use of the term "Middle East" by the United States government was in the 1957 Eisenhower Doctrine, which pertained to the Suez Crisis.
Secretary of State John Foster Dulles defined the Middle East as "the area lying between and including Libya on the west and Pakistan on the east, Syria and Iraq on the North and the Arabian peninsula to the south, plus the Sudan and Ethiopia." In 1958, the State Department explained that the terms "Near East" and "Middle East" were interchangeable, and defined the region as including only Egypt, Syria, Israel, Lebanon, Jordan, Iraq, Saudi Arabia, Kuwait, Bahrain, and Qatar. Since the late 20th century, scholars and journalists from the region, such as journalist Louay Khraish and historian Hassan Hanafi, have criticized the use of "Middle East" as a Eurocentric and colonialist term. The Associated Press Stylebook of 2004 says that Near East formerly referred to the farther west countries while Middle East referred to the eastern ones, but that now they are synonymous. It instructs: Use Middle East unless Near East is used by a source in a story. Mideast is also acceptable, but Middle East is preferred. European languages have adopted terms similar to Near East and Middle East. Since these are based on a relative description, the meanings depend on the country and are generally different from the English terms. In German the term Naher Osten (Near East) is still in common use (nowadays the term Mittlerer Osten is increasingly common in press texts translated from English sources, albeit having a distinct meaning). In Slavic languages such as Russian (Ближний Восток, Blizhniy Vostok), Bulgarian (Близкия Изток), Polish (Bliski Wschód) and Croatian (Bliski istok), terms meaning Near East are the only appropriate ones for the region. However, some European languages do have "Middle East" equivalents, such as French Moyen-Orient, Swedish Mellanöstern, Spanish Oriente Medio or Medio Oriente, Greek Μέση Ανατολή (Mesi Anatoli), and Italian Medio Oriente.[c] Perhaps because of the political influence of the United States and Europe, and the prominence of the Western press, the Arabic equivalent of Middle East (Arabic: الشرق الأوسط ash-Sharq al-Awsaṭ) has become standard usage in the mainstream Arabic press. It carries the same meaning as the term "Middle East" in North American and Western European usage. The designation Mashriq, from the Arabic root for East, denotes a variously defined region around the Levant, the eastern part of the Arabic-speaking world (as opposed to the Maghreb, the western part). Even though the term originated in the West, countries of the Middle East that use languages other than Arabic also use it in translation. For instance, the Persian equivalent for Middle East is خاورمیانه (Khāvar-e miyāneh), the Hebrew is המזרח התיכון (hamizrach hatikhon), and the Turkish is Orta Doğu. Countries and territory Traditionally included within the Middle East are Arabia, Asia Minor, East Thrace, Egypt, Iran, the Levant, Mesopotamia, and the Socotra Archipelago. The region includes 17 UN-recognized countries and one British Overseas Territory. Various concepts are often paralleled to the Middle East, most notably the Near East, Fertile Crescent, and Levant. These are geographical concepts that refer to large sections of the modern-day Middle East, with the Near East being the closest to the Middle East in its geographical meaning. Because it is primarily Arabic-speaking, the Maghreb region of North Africa is sometimes included.
"Greater Middle East" is a political term coined by the second Bush administration in the first decade of the 21st century to denote various countries, pertaining to the Muslim world, specifically Afghanistan, Iran, Pakistan, and Turkey. Various Central Asian countries are sometimes also included. History The Middle East lies at the juncture of Africa and Eurasia and of the Indian Ocean and the Mediterranean Sea (see also: Indo-Mediterranean). It is the birthplace and spiritual center of religions such as Christianity, Islam, Judaism, Manichaeism, Yezidi, Druze, Yarsan, and Mandeanism, and in Iran, Mithraism, Zoroastrianism, Manicheanism, and the Baháʼí Faith. Throughout its history the Middle East has been a major center of world affairs; a strategically, economically, politically, culturally, and religiously sensitive area. The region is one of the regions where agriculture was independently discovered, and from the Middle East it was spread, during the Neolithic, to different regions of the world such as Europe, the Indus Valley and Eastern Africa. Prior to the formation of civilizations, advanced cultures formed all over the Middle East during the Stone Age. The search for agricultural lands by agriculturalists, and pastoral lands by herdsmen meant different migrations took place within the region and shaped its ethnic and demographic makeup. The Middle East is widely and most famously known as the cradle of civilization. The world's earliest civilizations, Mesopotamia (Sumer, Akkad, Assyria and Babylonia), ancient Egypt and Kish in the Levant, all originated in the Fertile Crescent and Nile Valley regions of the ancient Near East. These were followed by the Hittite, Greek, Hurrian and Urartian civilisations of Asia Minor; Elam, Persia and Median civilizations in Iran, as well as the civilizations of the Levant (such as Ebla, Mari, Nagar, Ugarit, Canaan, Aramea, Mitanni, Phoenicia and Israel) and the Arabian Peninsula (Magan, Sheba, Ubar). The Near East was first largely unified under the Neo Assyrian Empire, then the Achaemenid Empire followed later by the Macedonian Empire and after this to some degree by the Iranian empires (namely the Parthian and Sassanid Empires), the Roman Empire and Byzantine Empire. The region served as the intellectual and economic center of the Roman Empire and played an exceptionally important role due to its periphery on the Sassanid Empire. Thus, the Romans stationed up to five or six of their legions in the region for the sole purpose of defending it from Sassanid and Bedouin raids and invasions. From the 4th century CE onwards, the Middle East became the center of the two main powers at the time, the Byzantine Empire and the Sassanid Empire. However, it would be the later Islamic Caliphates of the Middle Ages, or Islamic Golden Age which began with the Islamic conquest of the region in the 7th century AD, that would first unify the entire Middle East as a distinct region and create the dominant Islamic Arab ethnic identity that largely (but not exclusively) persists today. The 4 caliphates that dominated the Middle East for more than 600 years were the Rashidun Caliphate, the Umayyad caliphate, the Abbasid caliphate and the Fatimid caliphate. Additionally, the Mongols would come to dominate the region, the Kingdom of Armenia would incorporate parts of the region to their domain, the Seljuks would rule the region and spread Turko-Persian culture, and the Franks would found the Crusader states that would stand for roughly two centuries. 
Josiah Russell estimates the population of what he calls "Islamic territory" as roughly 12.5 million in 1000 – Anatolia 8 million, Syria 2 million, and Egypt 1.5 million. From the 16th century onward, the Middle East came to be dominated, once again, by two main powers: the Ottoman Empire and the Safavid dynasty. The modern Middle East began after World War I, when the Ottoman Empire, which was allied with the Central Powers, was defeated by the Allies and partitioned into a number of separate nations, initially under British and French Mandates. Other defining events in this transformation included the establishment of Israel in 1948 and the eventual departure of the European powers, notably Britain and France, by the end of the 1960s. They were supplanted in some part by the rising influence of the United States from the 1970s onwards. In the 20th century, the region's significant stocks of crude oil gave it new strategic and economic importance. Mass production of oil began around 1945, with Saudi Arabia, Iran, Kuwait, Iraq, and the United Arab Emirates having large reserves. Estimated oil reserves, especially in Saudi Arabia and Iran, are some of the highest in the world, and the international oil cartel OPEC is dominated by Middle Eastern countries. During the Cold War, the Middle East was a theater of ideological struggle between the two superpowers and their allies: NATO and the United States on one side, and the Soviet Union and Warsaw Pact on the other, as they competed to influence regional allies. Beyond the political reasons, there was also the "ideological conflict" between the two systems. Moreover, as Louise Fawcett argues, among many important areas of contention, or perhaps more accurately of anxiety, were, first, the desires of the superpowers to gain strategic advantage in the region, second, the fact that the region contained some two-thirds of the world's oil reserves in a context where oil was becoming increasingly vital to the economy of the Western world [...] Within this contextual framework, the United States sought to divert the Arab world from Soviet influence. Throughout the 20th and 21st centuries, the region has experienced both periods of relative peace and tolerance and periods of conflict, particularly between Sunnis and Shiites. Geography In 2018, the MENA region emitted 3.2 billion tonnes of carbon dioxide and produced 8.7% of global greenhouse gas emissions (GHG) despite making up only 6% of the global population. These emissions are mostly from the energy sector, an integral component of many Middle Eastern and North African economies due to the extensive oil and natural gas reserves found within the region. The Middle East region is one of the most vulnerable to climate change. The impacts include increases in drought conditions, aridity, heatwaves and sea level rise. Sharp global temperature and sea level changes, shifting precipitation patterns and increased frequency of extreme weather events are some of the main impacts of climate change identified by the Intergovernmental Panel on Climate Change (IPCC). The MENA region is especially vulnerable to such impacts due to its arid and semi-arid environment, facing climatic challenges such as low rainfall, high temperatures and dry soil. The climatic conditions that foster such challenges for MENA are projected by the IPCC to worsen throughout the 21st century.
If greenhouse gas emissions are not significantly reduced, part of the MENA region risks becoming uninhabitable before the year 2100. Climate change is expected to put significant strain on already scarce water and agricultural resources within the MENA region, threatening the national security and political stability of all included countries. Over 60 percent of the region's population lives in high or very high water-stressed areas, compared to the global average of 35 percent. This has prompted some MENA countries to engage with the issue of climate change on an international level through environmental accords such as the Paris Agreement. Law and policy are also being established on a national level amongst MENA countries, with a focus on the development of renewable energies. Economy Middle Eastern economies range from very poor (such as Gaza and Yemen) to extremely wealthy (such as Qatar and the UAE). According to the International Monetary Fund, the three largest Middle Eastern economies by nominal GDP in 2023 were Saudi Arabia ($1.06 trillion), Turkey ($1.03 trillion), and Israel ($0.54 trillion). For nominal GDP per person, the highest-ranking countries are Qatar ($83,891), Israel ($55,535), the United Arab Emirates ($49,451) and Cyprus ($33,807). Turkey ($3.6 trillion), Saudi Arabia ($2.3 trillion), and Iran ($1.7 trillion) had the largest economies in terms of GDP at purchasing power parity (PPP). For GDP PPP per person, the highest-ranking countries are Qatar ($124,834), the United Arab Emirates ($88,221), Saudi Arabia ($64,836), Bahrain ($60,596) and Israel ($54,997). The lowest-ranking country in the Middle East, in terms of nominal GDP per capita, is Yemen ($573). The economic structures of Middle Eastern nations differ: while some are heavily dependent on the export of oil and oil-related products (Saudi Arabia, the UAE and Kuwait), others have a highly diverse economic base (such as Cyprus, Israel, Turkey and Egypt). Industries of the Middle Eastern region include oil and oil-related products, agriculture, cotton, cattle, dairy, textiles, leather products, surgical instruments, and defence equipment (guns, ammunition, tanks, submarines, fighter jets, UAVs, and missiles). Banking is an important sector, especially for the UAE and Bahrain. With the exception of Cyprus, Turkey, Egypt, Lebanon and Israel, tourism has been a relatively undeveloped area of the economy, in part because of the socially conservative nature of the region as well as political turmoil in certain areas. Since the end of the COVID-19 pandemic, however, countries such as the UAE, Bahrain, and Jordan have begun attracting greater numbers of tourists, owing to improving tourist facilities and the relaxation of tourism-related restrictive policies. Unemployment is high in the Middle East and North Africa region, particularly among people aged 15–29, a demographic representing 30% of the region's population. The total regional unemployment rate in 2025 is 10.8%, and among youth it is as high as 28%. Demographics Arabs constitute the largest ethnic group in the Middle East, followed by various Iranian peoples and then by Turkic peoples (Turkish, Azeris, Syrian Turkmen, and Iraqi Turkmen). Native ethnic groups of the region include, in addition to Arabs, Arameans, Assyrians, Baloch, Berbers, Copts, Druze, Greek Cypriots, Jews, Kurds, Lurs, Mandaeans, Persians, Samaritans, Shabaks, Tats, and Zazas.
European ethnic groups that form a diaspora in the region include Albanians, Bosniaks, Circassians (including Kabardians), Crimean Tatars, Greeks, Franco-Levantines, and Italo-Levantines. Among other migrant populations are Chinese, Filipinos, Indians, Indonesians, Pakistanis, Pashtuns, Romani, and Afro-Arabs. "Migration has always provided an important vent for labor market pressures in the Middle East. For the period between the 1970s and 1990s, the Arab states of the Persian Gulf in particular provided a rich source of employment for workers from Egypt, Yemen and the countries of the Levant, while Europe had attracted young workers from North African countries due both to proximity and the legacy of colonial ties between France and the majority of North African states." According to the International Organization for Migration, there are 13 million first-generation migrants from Arab nations in the world, of whom 5.8 million reside in other Arab countries. Expatriates from Arab countries contribute to the circulation of financial and human capital in the region and thus significantly promote regional development. In 2009, Arab countries received a total of US$35.1 billion in remittance inflows, and remittances sent to Jordan, Egypt and Lebanon from other Arab countries are 40 to 190 per cent higher than trade revenues between these and other Arab countries. In Somalia, the Somali Civil War has greatly increased the size of the Somali diaspora, as many of the best-educated Somalis left for Middle Eastern countries as well as Europe and North America. Non-Arab Middle Eastern countries such as Turkey, Israel and Iran are also subject to important migration dynamics. A fair proportion of those migrating from Arab nations are from ethnic and religious minorities facing persecution and are not necessarily ethnic Arabs, Iranians or Turks.[citation needed] Large numbers of Kurds, Jews, Assyrians, Greeks and Armenians, as well as many Mandaeans, have left nations such as Iraq, Iran, Syria and Turkey for these reasons during the last century. In Iran, many religious minorities such as Christians, Baháʼís, Jews and Zoroastrians have left since the Islamic Revolution of 1979. The Middle East is very diverse when it comes to religions, many of which originated there. Islam is the largest religion in the Middle East, but other faiths that originated there, such as Judaism and Christianity, are also well represented. Christian communities have played a vital role in the Middle East; they represent 78% of the population of Cyprus and 40.5% of that of Lebanon, where the Lebanese president, half of the cabinet, and half of the parliament follow one of the various Lebanese Christian rites. There are also important minority religions like the Baháʼí Faith, Yarsanism, Yazidism, Zoroastrianism, Mandaeism, Druze, and Shabakism, and in ancient times the region was home to Mesopotamian religions, Canaanite religions, Manichaeism, Mithraism and various monotheist gnostic sects. The six top languages, in terms of numbers of speakers, are Arabic, Persian, Turkish, Kurdish, Modern Hebrew and Greek. About 20 minority languages are also spoken in the Middle East. Arabic, with all its dialects, is the most widely spoken language in the Middle East, with Literary Arabic being official in all North African and most West Asian countries. Arabic dialects are also spoken in some adjacent areas in neighbouring Middle Eastern non-Arab countries. It is a member of the Semitic branch of the Afro-Asiatic languages.
Several Modern South Arabian languages such as Mehri and Soqotri are also spoken in Yemen and Oman. Another Semitic language is Aramaic, whose dialects are spoken mainly by Assyrians and Mandaeans, with Western Aramaic still spoken in two villages near Damascus, Syria. There is also an Oasis Berber-speaking community in Egypt, where the language is known as Siwa; it is a non-Semitic Afro-Asiatic sister language. Persian is the second-most spoken language. While it is primarily spoken in Iran and some border areas of neighbouring countries, Iran is one of the region's largest and most populous countries. Persian belongs to the Indo-Iranian branch of the family of Indo-European languages. Other Western Iranic languages spoken in the region include Achomi, Daylami, Kurdish dialects, Semnani and Lurish, amongst many others. The close third-most widely spoken language, Turkish, is largely confined to Turkey, which is also one of the region's largest and most populous countries, but it is present in areas in neighboring countries. It is a member of the Turkic languages, which have their origins in East Asia. Another Turkic language, Azerbaijani, is spoken by Azerbaijanis in Iran. The fourth-most widely spoken language, Kurdish, is spoken in Iran, Iraq, Syria and Turkey; Sorani Kurdish is, after Arabic, the second official language in Iraq (established by the 2005 constitution). Hebrew is the official language of Israel, with Arabic given a special status after the 2018 Basic Law lowered it from its prior status as an official language. Hebrew is spoken and used by over 80% of Israel's population, the other 20% using Arabic. Modern Hebrew only began being spoken in the 20th century, after being revived in the late 19th century by Eliezer Ben-Yehuda (Eliezer Perlman) and European Jewish settlers, with the first native Hebrew speaker being born in 1882. Greek is one of the two official languages of Cyprus, and the country's main language. Small communities of Greek speakers exist all around the Middle East; until the 20th century it was also widely spoken in Asia Minor (being the second-most spoken language there, after Turkish) and Egypt. In antiquity, Ancient Greek was the lingua franca for many areas of the western Middle East, and it remained widely spoken there until the Muslim expansion. Until the late 11th century, it was also the main spoken language in Asia Minor; after that it was gradually replaced by the Turkish language as the Anatolian Turks expanded and the local Greeks were assimilated, especially in the interior. English is one of the official languages of Akrotiri and Dhekelia. It is also commonly taught and used as a second language in countries such as Egypt, Jordan, Iran, Iraq, Qatar, Bahrain, the United Arab Emirates and Kuwait, and is a main language in some emirates of the United Arab Emirates. It is also spoken as a native language by Jewish immigrants from Anglophone countries (the UK, the US, Australia) in Israel and is widely understood as a second language there. French is taught and used in many government facilities and media in Lebanon, and is taught in some primary and secondary schools of Egypt and Syria; owing to widespread immigration of French Jews to Israel, it is also the native language of approximately 200,000 Jews in Israel. Maltese, a Semitic language mainly spoken in Europe, is used by the Franco-Maltese diaspora in Egypt. Armenian speakers are also to be found in the region, and Georgian is spoken by the Georgian diaspora.
Russian is spoken by a large portion of the Israeli population, because of immigration from the former Soviet Union in the 1990s. Russian today is a popular unofficial language in Israel; after Hebrew and Arabic, news, radio and sign boards in Russian can be found around the country. Circassian is also spoken by the diaspora in the region and by almost all Circassians in Israel, who speak Hebrew and English as well. The largest Romanian-speaking community in the Middle East is found in Israel, where as of 1995[update] Romanian was spoken by 5% of the population.[d] Bengali, Hindi and Urdu are widely spoken by migrant communities in many Middle Eastern countries, such as Saudi Arabia (where 20–25% of the population is South Asian), the United Arab Emirates (where 50–55% of the population is South Asian), and Qatar, which have large numbers of Pakistani, Bangladeshi and Indian immigrants. Culture The Middle East has recently become more prominent in hosting global sport events due to its wealth and desire to diversify its economy. The South Asian diaspora is a major backer of cricket in the region.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Ova] | [TOKENS: 1505]
Contents Egg cell The egg cell or ovum (pl.: ova) is the female reproductive cell, or gamete, in most anisogamous organisms (organisms that reproduce sexually with a larger, female gamete and a smaller, male one). The term is used when the female gamete is not capable of movement (non-motile). If the male gamete (sperm) is capable of movement, the type of sexual reproduction is also classified as oogamous. A nonmotile female gamete formed in the oogonium of some algae, fungi, oomycetes, or bryophytes is an oosphere. When fertilized, the oosphere becomes the oospore.[clarification needed] When egg and sperm fuse together during fertilisation, a diploid cell (the zygote) is formed, which rapidly grows into a new organism. History While the non-mammalian animal egg was obvious, the doctrine ex ovo omne vivum ("every living [animal comes from] an egg"), associated with William Harvey (1578–1657), was a rejection of spontaneous generation and preformationism, as well as a bold assumption that mammals also reproduced via eggs. Karl Ernst von Baer discovered the mammalian ovum in 1827. The fusion of spermatozoa with ova (of a starfish) was observed by Oskar Hertwig in 1876. Animals In animals, egg cells are also known as ova (singular ovum, from the Latin word ovum, meaning 'egg'). The term ovule in animals is used for the young ovum of an animal. In vertebrates, ova are produced by female gonads (sex glands) called ovaries. A number of ova are present at birth in mammals and mature via oogenesis. Studies performed on humans, dogs, and cats in the 1870s suggested that the production of oocytes (immature egg cells) stops at or shortly after birth. A review of reports from 1900 to 1950 by zoologist Solomon Zuckerman cemented the belief that females have a finite number of oocytes that are formed before they are born. This dogma has been challenged by a number of studies since 2004, several of which suggest that ovarian stem cells exist within the mammalian ovary. Whether or not mature mammals can actually create new egg cells remains uncertain and is an ongoing research question. In all mammals, the ovum is fertilized inside the female body. Human ova grow from primitive germ cells that are embedded in the substance of the ovaries. The ovum is one of the largest cells in the human body, typically visible to the naked eye without the aid of a microscope or other magnification device. The human ovum measures approximately 120 μm (0.0047 in) in diameter. In humans, recombination rates differ between maternal and paternal DNA. Ooplasm is the yolk of the ovum, a cell substance at its center, which contains its nucleus, named the germinal vesicle, and the nucleolus, called the germinal disc. The ooplasm consists of the cytoplasm of the ordinary animal cell with its spongioplasm and hyaloplasm, often called the formative yolk, and the nutritive yolk or deutoplasm, made of rounded granules of fatty and albuminoid substances embedded in the cytoplasm. Mammalian ova contain only a tiny amount of the nutritive yolk, for nourishing the embryo in the early stages of its development only. In contrast, bird eggs contain enough to supply the chick with nutriment throughout the whole period of incubation. In the oviparous animals (all birds, most fish, amphibians and reptiles), the ova develop protective layers and pass through the oviduct to the outside of the body. They are fertilized by male sperm either inside the female body (as in birds and reptiles), or outside (as in many fish and amphibians).
After fertilization, an embryo develops, nourished by nutrients contained in the egg. It then hatches from the egg, outside the mother's body. See egg for a discussion of eggs of oviparous animals. The egg cell's cytoplasm and mitochondria are the sole means by which the egg can reproduce by mitosis and eventually form a blastocyst after fertilization. There is an intermediate form, the ovoviviparous animals: the embryo develops within and is nourished by an egg as in the oviparous case, but then it hatches inside the mother's body shortly before birth, or just after the egg leaves the mother's body. Some fish, reptiles and many invertebrates use this technique. Plants Nearly all land plants have alternating diploid and haploid generations. Gametes are produced by the haploid generation, which is known as the gametophyte. The female gametophyte produces structures called archegonia, and the egg cells form within them via mitosis. The typical bryophyte archegonium consists of a long neck with a wider base containing the egg cell. Upon maturation, the neck opens to allow sperm cells to swim into the archegonium and fertilize the egg. The resulting zygote then gives rise to an embryo, which will grow into a new diploid individual, known as a sporophyte. In seed plants, a structure called the ovule contains the female gametophyte. The gametophyte produces an egg cell. After fertilization, the ovule develops into a seed containing the embryo. In flowering plants, the female gametophyte (sometimes referred to as the embryo sac) has been reduced to just eight cells inside the ovule. The gametophyte cell closest to the micropyle opening of the ovule develops into the egg cell. Upon pollination, a pollen tube delivers sperm into the gametophyte and one sperm nucleus fuses with the egg nucleus. The resulting zygote develops into an embryo inside the ovule. The ovule, in turn, develops into a seed and in many cases, the plant ovary develops into a fruit to facilitate the dispersal of the seeds. Upon germination, the embryo grows into a seedling. In the moss Physcomitrella patens, GUS staining shows that the Polycomb protein FIE is expressed in the unfertilised egg cell; soon after fertilisation, the FIE gene is inactivated in the young embryo. Other organisms In algae, the egg cell is often called an oosphere.[citation needed] Drosophila oocytes develop in individual egg chambers that are supported by nurse cells and surrounded by somatic follicle cells. The nurse cells are large polyploid cells that synthesize and transfer RNA, proteins, and organelles to the oocytes. This transfer is followed by the programmed cell death (apoptosis) of the nurse cells. During oogenesis, 15 nurse cells die for every oocyte that is produced. In addition to this developmentally regulated cell death, egg cells may also undergo apoptosis in response to starvation and other insults.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_note-10] | [TOKENS: 9291]
Contents Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. 
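The relationship between the two name spaces described above can be made concrete in a few lines of code. The sketch below assumes nothing beyond Python's standard library and uses "example.com" purely as an illustrative hostname: it first asks the DNS to translate a domain name into an IP address, then opens a TCP connection to that address, the two steps that underlie most Internet services.

    import socket

    hostname = "example.com"  # illustrative host, not taken from the article

    # DNS: translate the human-readable name into an IP address.
    ip_address = socket.gethostbyname(hostname)
    print(f"{hostname} resolves to {ip_address}")

    # TCP/IP: open a reliable byte stream to port 80 (the HTTP port) at that address.
    with socket.create_connection((ip_address, 80), timeout=5) as conn:
        print("connected to", conn.getpeername())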
The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. 
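The packet-switching idea that runs through the history above can be illustrated with a toy sketch. The code below is a conceptual model only, not any real protocol's wire format and no relation to the historical ARPANET implementations: a message is cut into numbered packets that may arrive in any order, and the sequence numbers alone suffice to reassemble it.

    import random

    def packetize(message: bytes, size: int = 8) -> list[tuple[int, bytes]]:
        # Tag each chunk with its byte offset so order can be restored on arrival.
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
        # Sort by offset and concatenate; the arrival order is irrelevant.
        return b"".join(chunk for _, chunk in sorted(packets))

    message = b"packets may take different routes through the network"
    packets = packetize(message)
    random.shuffle(packets)  # simulate out-of-order delivery
    assert reassemble(packets) == message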
The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and CompuServe established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use, one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members, in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began to exhibit scaling characteristics similar to those of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber-optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services.
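The HTTP 0.9 protocol mentioned above was strikingly simple: the client sent a single "GET <path>" line, and the server replied with the raw HTML document, no status line or headers, then closed the connection. The sketch below illustrates that exchange using Python's standard socket module; since modern servers generally refuse HTTP/0.9, it should be read as a historical illustration rather than a working client for today's web.

    import socket

    def http09_get(host: str, path: str = "/") -> bytes:
        # The entire request is one line; HTTP/0.9 has no headers and no version string.
        with socket.create_connection((host, 80), timeout=5) as conn:
            conn.sendall(f"GET {path}\r\n".encode("ascii"))
            chunks = []
            while data := conn.recv(4096):  # the server closes the connection when done
                chunks.append(data)
        return b"".join(chunks)

    # html = http09_get("example.com")  # most modern servers will reject or ignore this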
During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011[update], the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018[update], 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost. Social impact The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. 
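The growth figures quoted above can be cross-checked with elementary compound-growth arithmetic: traffic doubling every 18 months corresponds to roughly 59% growth per year, while user growth of 20–50% per year corresponds to a doubling time of between about 3.8 and 1.7 years. A small sketch of the conversion follows; the formulas are standard, and the specific rates are simply the article's own numbers.

    import math

    def annual_factor(doubling_months: float) -> float:
        # A quantity that doubles every d months grows by a factor of 2**(12/d) per year.
        return 2 ** (12 / doubling_months)

    def doubling_time_years(annual_rate: float) -> float:
        # An annual growth rate r implies a doubling time of log(2) / log(1 + r) years.
        return math.log(2) / math.log(1 + annual_rate)

    print(f"18-month doubling: {annual_factor(18) - 1:.0%} per year")         # ~59%
    print(f"100% per year: doubles in {doubling_time_years(1.0):.1f} years")  # 1.0
    print(f"20% per year: doubles in {doubling_time_years(0.2):.1f} years")   # ~3.8
    print(f"50% per year: doubles in {doubling_time_years(0.5):.1f} years")   # ~1.7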
In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general, or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The Internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications. Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic.
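Mojibake, mentioned above, is easy to reproduce: bytes written in one character encoding and read in another come out as garbage. A minimal Python demonstration follows; the sample string is arbitrary (here, the Greek term for "Middle East" that appears earlier in this document).

    text = "Μέση Ανατολή"        # any non-ASCII text will do
    raw = text.encode("utf-8")    # the bytes actually stored or sent over the wire

    print(raw.decode("utf-8"))    # correct: the original string comes back
    print(raw.decode("latin-1"))  # mojibake: each UTF-8 byte misread as a Latin-1 character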
The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPGs to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video, with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions of videos daily and upload hundreds of thousands. Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023[update], Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP, so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites such as DonorsChoose and GlobalGiving allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to form easily, communicate cheaply, and share ideas.
A prominent example of such collaboration is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice). Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work. The Internet also enables cloud computing, virtual private networks, remote desktops, and remote work. The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online, such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Out of naivety, children may also post personal information about themselves online, which could put them or their families at risk unless they are warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites. Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated with users' loneliness. Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread. Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving scan-reading skills while interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams by using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and business-to-consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written about the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet, such as maps and location-aware services, may serve to reinforce economic inequality and the digital divide.
Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people. At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing for carrying out their mission, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region. E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form involving highly dispersed small groups of practitioners who may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards.
In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq. Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer, and for sharing and exchanging business data and logistics; it is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP). VoIP systems now dominate many markets, being as easy and convenient to use as a traditional telephone while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet. The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.
While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting working groups, open to any individual, on the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region: AFRINIC for Africa, APNIC for the Asia–Pacific region, ARIN for North America, LACNIC for Latin America and the Caribbean, and RIPE NCC for Europe, the Middle East, and parts of Central Asia. The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including: the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues. Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems. However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se.
Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers. Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very-high-speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users, who only access the Internet when needed to perform a function or obtain information, represent the bottom of the routing hierarchy. An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET. Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology. Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide Internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, Internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber-optic submarine communication cables, connecting the Internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, after its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123: the link layer, the internet layer, the transport layer, and the application layer. The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking and, in essence, establishes the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6. Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements) and by technical specifications or protocols that describe the exchange of data over the network. For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations.
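As an illustration of the layering just described, the sketch below opens a transport-layer TCP connection (delivered by the internet layer underneath) and sends an application-layer HTTP request over it. This is a minimal sketch in Python rather than a description of any particular implementation, and the host name example.org is a placeholder.

```python
# Application layer (HTTP) carried over the transport layer (TCP),
# which is in turn delivered by the internet layer (IP).
import socket

host = "example.org"  # placeholder host
with socket.create_connection((host, 80)) as sock:  # TCP connection
    request = (
        "GET / HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n\r\n"
    )
    sock.sendall(request.encode("ascii"))  # application-layer message
    reply = sock.recv(4096)                # first bytes of the response
    print(reply.split(b"\r\n", 1)[0])      # e.g. b'HTTP/1.1 200 OK'
```

Note that socket.create_connection resolves the host name via DNS and works over either IPv4 or IPv6, whichever the resolver returns first.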
IP addresses consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol (DHCP) or configured manually. The Domain Name System (DNS) converts user-entered domain names (e.g. "en.wikipedia.org") into IP addresses. Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to approximately 4.3 billion (4.3×10⁹) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s; it provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol. Network infrastructure, however, has lagged in this development. A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier. The rest field is an identifier for a specific host or network interface. The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2⁹⁶ addresses, having a 32-bit routing prefix. For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24. Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols.
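Before the discussion turns to routing, the prefix and netmask arithmetic described above can be checked with Python's standard ipaddress module; the following is a minimal sketch using the same documentation network, 198.51.100.0/24.

```python
import ipaddress

net = ipaddress.ip_network("198.51.100.0/24")
print(net.netmask)        # 255.255.255.0 -- the subnet mask for a /24
print(net.num_addresses)  # 256 (198.51.100.0 through 198.51.100.255)

addr = ipaddress.ip_address("198.51.100.42")
print(addr in net)        # True: the address lies inside the prefix

# Bitwise AND of the address and the netmask yields the routing prefix:
prefix = int(addr) & int(net.netmask)
print(ipaddress.ip_address(prefix))  # 198.51.100.0
```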
End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet. The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information. Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users. Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare with similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by federal law enforcement agencies. Under the Act, all U.S. telecommunications providers are required to install packet sniffing technology to allow federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic. The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies such as the Information Awareness Office, NSA, GCHQ and the FBI spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by the Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by Germany's Siemens AG and Finland's Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters.
In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. Many free or commercially available software programs, called content-control software, are available to users to block offensive content on individual computers or networks, in order to limit children's access to pornographic material or depictions of violence. Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization.
[Chart: Global Internet Traffic Volume, in petabytes per month, 1990–2015.]
The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for. An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests. Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade that differed by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure.
The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed over 300 million tons of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files.
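As a quick sanity check of the factor-of-20,000 spread in the electricity-usage estimates quoted above, the two endpoints of the published range can be compared directly; a minimal sketch:

```python
# Spread between the lowest and highest published estimates (kWh per GB).
low_kwh_per_gb = 0.0064
high_kwh_per_gb = 136.0

print(high_kwh_per_gb / low_kwh_per_gb)  # 21250.0, i.e. roughly 20,000x
```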
========================================
[SOURCE: https://en.wikipedia.org/wiki/Orion_(constellation)#cite_note-19] | [TOKENS: 4993]
Orion (constellation) Orion is a prominent set of stars visible during winter in the northern celestial hemisphere. It is one of the 88 modern constellations; it was among the 48 constellations listed by the 2nd-century AD astronomer Ptolemy. It is named after a hunter in Greek mythology. Orion is most prominent during winter evenings in the Northern Hemisphere, as are five other constellations that have stars in the Winter Hexagon asterism. Orion's two brightest stars, Rigel (β) and Betelgeuse (α), are both among the brightest stars in the night sky; both are supergiants and slightly variable. There are a further six stars brighter than magnitude 3.0, including three making the short straight line of the Orion's Belt asterism. Orion also hosts the radiant of the annual Orionids, the strongest meteor shower associated with Halley's Comet, and the Orion Nebula, one of the brightest nebulae in the sky. Characteristics Orion is bordered by Taurus to the northwest, Eridanus to the southwest, Lepus to the south, Monoceros to the east, and Gemini to the northeast. Covering 594 square degrees, Orion ranks 26th of the 88 constellations in size. The constellation boundaries, as set by Belgian astronomer Eugène Delporte in 1930, are defined by a polygon of 26 sides. In the equatorial coordinate system, the right ascension coordinates of these borders lie between 04h 43.3m and 06h 25.5m, while the declination coordinates are between 22.87° and −10.97°. The constellation's three-letter abbreviation, as adopted by the International Astronomical Union in 1922, is "Ori". Orion is most visible in the evening sky from January to April, winter in the Northern Hemisphere, and summer in the Southern Hemisphere. In the tropics (less than about 8° from the equator), the constellation transits at the zenith. From May to July (summer in the Northern Hemisphere, winter in the Southern Hemisphere), Orion is in the daytime sky and thus invisible at most latitudes. However, for much of Antarctica in the Southern Hemisphere's winter months, the Sun is below the horizon even at midday. Stars (and thus Orion, though only its brightest stars) are then visible at twilight for a few hours around local noon, in the brightest section of the sky, low in the north where the Sun is just below the horizon. At the same time of day at the South Pole itself (Amundsen–Scott South Pole Station), Rigel is only 8° above the horizon, and the Belt sweeps just along it. In the Southern Hemisphere's summer months, when Orion is normally visible in the night sky, the constellation is actually not visible in Antarctica because the Sun does not set at that time of year south of the Antarctic Circle. In countries close to the equator (e.g. Kenya, Indonesia, Colombia, Ecuador), Orion appears overhead in December around midnight and in the February evening sky. Navigational aid Orion is very useful as an aid to locating other stars. By extending the line of the Belt southeastward, Sirius (α CMa) can be found; northwestward, Aldebaran (α Tau). A line eastward across the two shoulders indicates the direction of Procyon (α CMi). A line from Rigel through Betelgeuse points to Castor and Pollux (α Gem and β Gem). Additionally, Rigel is part of the Winter Circle asterism. Sirius and Procyon, which may be located from Orion by following imaginary lines (see map), are also points in both the Winter Triangle and the Circle. Features Orion's seven brightest stars form a distinctive hourglass-shaped asterism, or pattern, in the night sky.
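For readers who want the constellation's right-ascension borders, quoted in the Characteristics paragraph above, expressed in degrees, the conversion is that one hour of right ascension corresponds to 15°; a minimal sketch in Python:

```python
# Convert right ascension from hours and minutes to degrees
# (24 hours of RA span the full 360 degrees, so 1 hour = 15 degrees).
def ra_to_degrees(hours: int, minutes: float) -> float:
    return (hours + minutes / 60.0) * 15.0

print(ra_to_degrees(4, 43.3))  # ~70.8 degrees (western border, 04h 43.3m)
print(ra_to_degrees(6, 25.5))  # ~96.4 degrees (eastern border, 06h 25.5m)
```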
Four stars—Rigel, Betelgeuse, Bellatrix, and Saiph—form a large roughly rectangular shape, at the center of which lie the three stars of Orion's Belt—Alnitak, Alnilam, and Mintaka. The hunter's head is marked by an additional eighth star, Meissa, which is fairly bright to the observer. Descending from the Belt is a smaller line of three stars, Orion's Sword (the middle of which is in fact not a star but the Orion Nebula), also known as the hunter's sword. Many of the stars are luminous hot blue supergiants, with the stars of the Belt and Sword forming the Orion OB1 association. Standing out by its red hue, Betelgeuse may nevertheless be a runaway member of the same group. Orion's Belt, or the Belt of Orion, is an asterism within the constellation. It consists of three bright stars: Alnitak (Zeta Orionis), Alnilam (Epsilon Orionis), and Mintaka (Delta Orionis). Alnitak is around 800 light-years away from Earth, is 100,000 times more luminous than the Sun, and shines with a magnitude of 1.8; much of its radiation is in the ultraviolet range, which the human eye cannot see. Alnilam is approximately 2,000 light-years from Earth and shines with a magnitude of 1.70; counting ultraviolet light, it is 375,000 times more luminous than the Sun. Mintaka is 915 light-years away and shines with a magnitude of 2.21. It is 90,000 times more luminous than the Sun and is a double star: the two components orbit each other every 5.73 days. In the Northern Hemisphere, Orion's Belt is best visible in the night sky during the month of January at around 9:00 pm, when it lies near the local meridian. Just southwest of Alnitak lies Sigma Orionis, a multiple star system composed of five stars with a combined apparent magnitude of 3.7, lying at a distance of 1150 light-years. Southwest of Mintaka lies the quadruple star Eta Orionis. Orion's Sword contains the Orion Nebula, the Messier 43 nebula, Sh 2-279 (also known as the Running Man Nebula), and the stars Theta Orionis, Iota Orionis, and 42 Orionis. Three stars make up a small triangle that marks the head. The apex is marked by Meissa (Lambda Orionis), a hot blue giant of spectral type O8 III and apparent magnitude 3.54, which lies some 1100 light-years distant. Phi-1 and Phi-2 Orionis make up the base. Also nearby is the young star FU Orionis. Stretching north from Betelgeuse are the stars that make up Orion's club. Mu Orionis marks the elbow, Nu and Xi mark the handle of the club, and Chi1 and Chi2 mark the end of the club. Just east of Chi1 is the Mira-type variable red giant star U Orionis. West from Bellatrix lie six stars all designated Pi Orionis (π1 Ori, π2 Ori, π3 Ori, π4 Ori, π5 Ori, and π6 Ori), which make up Orion's shield. Around 20 October each year, the Orionid meteor shower (Orionids) reaches its peak. Coming from the border with the constellation Gemini, as many as 20 meteors per hour can be seen. The shower's parent body is Halley's Comet. Hanging from Orion's Belt is his sword, consisting of the multiple stars θ1 Orionis (the Trapezium) and θ2 Orionis, and the Orion Nebula (M42). This is a spectacular object that can be clearly identified with the naked eye as something other than a star. Using binoculars, its clouds of nascent stars, luminous gas, and dust can be observed. The Trapezium cluster has many newborn stars, including several brown dwarfs, all of which are at an approximate distance of 1,500 light-years.
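The apparent magnitudes quoted above are on a logarithmic scale on which a difference of 5 magnitudes corresponds to a brightness factor of 100 (so a difference of one magnitude is a factor of 100^(1/5) ≈ 2.512). The following sketch applies that standard conversion to two of the Belt magnitudes given in the text; the function name is illustrative.

```python
# Brightness ratio from a magnitude difference: ratio = 10 ** (0.4 * dm).
def brightness_ratio(m_fainter: float, m_brighter: float) -> float:
    return 10 ** (0.4 * (m_fainter - m_brighter))

# Alnilam (magnitude 1.70) versus Mintaka (magnitude 2.21):
print(brightness_ratio(2.21, 1.70))  # ~1.6: Alnilam appears ~60% brighter
```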
Named for the four bright stars that form a trapezoid, the Trapezium cluster is largely illuminated by its brightest stars, which are only a few hundred thousand years old. Observations by the Chandra X-ray Observatory show both the extreme temperatures of the main stars—up to 60,000 kelvins—and the star-forming regions still extant in the surrounding nebula. M78 (NGC 2068) is a nebula in Orion. With an overall magnitude of 8.0, it is significantly dimmer than the Great Orion Nebula that lies to its south; however, it is at approximately the same distance, 1600 light-years from Earth. It can easily be mistaken for a comet in the eyepiece of a telescope. M78 is associated with the variable star V351 Orionis, whose magnitude changes are visible over very short periods of time. Another fairly bright nebula in Orion is NGC 1999, also close to the Great Orion Nebula. It has an integrated magnitude of 10.5 and is 1500 light-years from Earth. The variable star V380 Orionis is embedded in NGC 1999. Another famous nebula is IC 434, the Horsehead Nebula, near Alnitak (Zeta Orionis). It contains a dark dust cloud whose shape gives the nebula its name. NGC 2174 is an emission nebula located 6400 light-years from Earth. Besides these nebulae, surveying Orion with a small telescope will reveal a wealth of interesting deep-sky objects, including M43, M78, and multiple stars including Iota Orionis and Sigma Orionis. A larger telescope may reveal objects such as the Flame Nebula (NGC 2024), as well as fainter and tighter multiple stars and nebulae. Barnard's Loop can be seen on very dark nights or using long-exposure photography. All of these nebulae are part of the larger Orion molecular cloud complex, which is located approximately 1,500 light-years away and is hundreds of light-years across. Due to its proximity, it is one of the most intense regions of stellar formation visible from Earth. The Orion molecular cloud complex forms the eastern part of an even larger structure, the Orion–Eridanus Superbubble, which is visible in X-rays and in hydrogen emissions. History and mythology The distinctive pattern of Orion is recognized in numerous cultures around the world, and many myths are associated with it. Orion is used as a symbol in the modern world. In Siberia, the Chukchi people see Orion as a hunter; an arrow he has shot is represented by Aldebaran (Alpha Tauri), a figure matching other Western depictions. In Greek mythology, Orion was a gigantic, supernaturally strong hunter, born to Euryale, a Gorgon, and Poseidon (Neptune), god of the sea. One myth recounts Gaia's rage at Orion, who dared to say that he would kill every animal on Earth. The angry goddess tried to dispatch Orion with a scorpion. This is given as the reason that the constellations of Scorpius and Orion are never in the sky at the same time. However, Ophiuchus, the Serpent Bearer, revived Orion with an antidote. This is said to be the reason that the constellation of Ophiuchus stands midway between the Scorpion and the Hunter in the sky. The constellation is mentioned in Horace's Odes (Ode 3.27.18), Homer's Odyssey (Book 5, line 283) and Iliad, and Virgil's Aeneid (Book 1, line 535). In old Hungarian tradition, Orion is known as "Archer" (Íjász) or "Reaper" (Kaszás). In recently rediscovered myths, he is called Nimrod (Hungarian: Nimród), the greatest hunter, father of the twins Hunor and Magor. Together, the π and o stars (on the upper right) form the reflex bow or the lifted scythe.
In other Hungarian traditions, Orion's Belt is known as "Judge's stick" (Bírópálca). In Ireland and Scotland, Orion was called An Bodach, a figure from Irish folklore whose name literally means "the one with a penis [bod]" and who was the husband of the Cailleach (hag). In Scandinavian tradition, Orion's Belt was known as "Frigg's Distaff" (friggerock) or "Freyja's distaff". The Finns call Orion's Belt and the stars below it "Väinämöinen's scythe" (Väinämöisen viikate). Another name for the asterism of Alnilam, Alnitak, and Mintaka is "Väinämöinen's Belt" (Väinämöisen vyö), with the stars "hanging" from the Belt known as "Kaleva's sword" (Kalevanmiekka). There are claims in popular media that the Adorant from the Geißenklösterle cave, an ivory carving estimated to be 35,000 to 40,000 years old, is the first known depiction of the constellation. Scholars dismiss such interpretations, saying that perceived details such as a belt and sword derive from preexisting features in the grain structure of the ivory. The Babylonian star catalogues of the Late Bronze Age name Orion MULSIPA.ZI.AN.NA,[note 1] "The Heavenly Shepherd" or "True Shepherd of Anu" – Anu being the chief god of the heavenly realms. The Babylonian constellation is sacred to Papshukal and Ninshubur, both minor gods fulfilling the role of "messenger to the gods". Papshukal is closely associated with the figure of a walking bird on Babylonian boundary stones, and on the star map the figure of the Rooster is located below and behind the figure of the True Shepherd—both constellations represent the herald of the gods, in his bird and human forms respectively. In ancient Egypt, the stars of Orion were regarded as a god, called Sah. Because Orion rises before Sirius, the star whose heliacal rising was the basis for the solar Egyptian calendar, Sah was closely linked with Sopdet, the goddess who personified Sirius. The god Sopdu is said to be the son of Sah and Sopdet. Sah is syncretized with Osiris, while Sopdet is syncretized with Osiris' mythological wife, Isis. In the Pyramid Texts, from the 24th and 23rd centuries BC, Sah is one of many gods whose form the dead pharaoh is said to take in the afterlife. The Armenians identified their legendary patriarch and founder Hayk with Orion. Hayk is also the name of the Orion constellation in the Armenian translation of the Bible. The Bible mentions Orion three times, naming it "Kesil" (כסיל, literally 'fool'). This name is perhaps etymologically connected with "Kislev", the name for the ninth month of the Hebrew calendar (i.e. November–December), which in turn may derive from the Hebrew root K-S-L, as in the words "kesel, kisla" (כֵּסֶל, כִּסְלָה, hope, positiveness), i.e. hope for winter rains. The three mentions are Job 9:9 ("He is the maker of the Bear and Orion"), Job 38:31 ("Can you loosen Orion's belt?"), and Amos 5:8 ("He who made the Pleiades and Orion"). In ancient Aram, the constellation was known as Nephîlā′; the Nephilim are said to be Orion's descendants. In medieval Muslim astronomy, Orion was known as al-jabbar, "the giant". Orion's sixth-brightest star, Saiph, is named from the Arabic saif al-jabbar, meaning "sword of the giant". In China, Orion was one of the 28 lunar mansions, Sieu (Xiù, 宿). It is known as Shen (參), literally meaning "three", for the stars of Orion's Belt.
The Chinese character 參 (pinyin shēn) originally meant the constellation Orion (Chinese: 參宿; pinyin: shēnxiù); its Shang dynasty version, over three millennia old, contains at the top a representation of the three stars of Orion's Belt atop a man's head (the bottom portion, representing the sound of the word, was added later). The Rigveda refers to the constellation as Mriga (the Deer). Nataraja, "the cosmic dancer", is often interpreted as a representation of Orion. Rudra, the Rigvedic form of Shiva, is the presiding deity of Ardra nakshatra (Betelgeuse) of Hindu astrology. The Jain symbol carved in the Udayagiri and Khandagiri Caves, India, in the 1st century BCE bears a striking resemblance to Orion. Bugis sailors identified the three stars in Orion's Belt as tanra tellué, meaning "sign of three". The Seri people of northwestern Mexico call the three stars in Orion's Belt Hapj (a name denoting a hunter), which consists of three stars: Hap (mule deer), Haamoja (pronghorn), and Mojet (bighorn sheep). Hap is in the middle and has been shot by the hunter; its blood has dripped onto Tiburón Island. The same three stars are known in Spain and most of Latin America as "Las tres Marías" (Spanish for "The Three Marys"). In Puerto Rico, the three stars are known as "Los Tres Reyes Magos" (Spanish for "The Three Wise Men"). The Ojibwa/Chippewa Native Americans call this constellation Mesabi, for "Big Man". To the Lakota Native Americans, Tayamnicankhu (Orion's Belt) is the spine of a bison. The great rectangle of Orion is the bison's ribs; the Pleiades star cluster in nearby Taurus is the bison's head; and Sirius in Canis Major, known as Tayamnisinte, is its tail. Another Lakota myth mentions that the bottom half of Orion, the Constellation of the Hand, represented the arm of a chief that was ripped off by the Thunder People as a punishment from the gods for his selfishness. His daughter offered to marry whoever could retrieve the arm from the sky, so the young warrior Fallen Star (whose father was a star and whose mother was human) returned it and married her, symbolizing harmony between the gods and humanity with the help of the younger generation. The index finger is represented by Rigel; the Orion Nebula is the thumb; the Belt of Orion is the wrist; and the star Beta Eridani is the pinky finger. The seven primary stars of Orion make up the Polynesian constellation Heiheionakeiki, which represents a child's string figure similar to a cat's cradle. Several precolonial Filipino peoples referred to the Belt region in particular as "balatik" (ballista), as it resembles a trap of the same name, which fires arrows by itself and is usually used for catching pigs in the bush. Spanish colonization later led to some ethnic groups referring to Orion's Belt as "Tres Marias" or "Tatlong Maria". In Māori tradition, the star Rigel (known as Puanga or Puaka) is closely connected with the celebration of Matariki. The rising of Matariki (the Pleiades) and Rigel before sunrise in midwinter marks the start of the Māori year. In Javanese culture, the constellation is often called Lintang Waluku or Bintang Bajak, referring to the shape of a paddy-field plow. The imagery of the Belt and Sword has found its way into popular Western culture, for example in the form of the shoulder insignia of the 27th Infantry Division of the United States Army during both World Wars, probably owing to a pun on the name of the division's first commander, Major General John F. O'Ryan.
The film distribution company Orion Pictures used the constellation as its logo. In artistic renderings, the surrounding constellations are sometimes related to Orion: he is depicted standing next to the river Eridanus with his two hunting dogs, Canis Major and Canis Minor, fighting Taurus. He is sometimes depicted hunting Lepus the hare, and sometimes holding a lion's hide in his hand. There are alternative ways to visualise Orion. From the Southern Hemisphere, Orion is oriented south-upward, and the Belt and Sword are sometimes called the saucepan or pot in Australia and New Zealand. Orion's Belt is called Drie Konings (Three Kings) or Drie Susters (Three Sisters) by Afrikaans speakers in South Africa, and is referred to as les Trois Rois (the Three Kings) in Daudet's Lettres de Mon Moulin (1866). The appellation Driekoningen (the Three Kings) is also often found in 17th- and 18th-century Dutch star charts and seaman's guides. The same three stars are known in Spain, Latin America, and the Philippines as "Las Tres Marías" (The Three Marys), and as "Los Tres Reyes Magos" (The Three Wise Men) in Puerto Rico. Even traditional depictions of Orion have varied greatly. Cicero drew Orion in a similar fashion to the modern depiction. The Hunter held an unidentified animal skin aloft in his right hand; his hand was represented by Omicron2 Orionis and the skin was represented by the five stars designated Pi Orionis. Saiph and Rigel represented his left and right knees, while Eta Orionis and Lambda Leporis were his left and right feet, respectively. As in the modern depiction, Mintaka, Alnilam, and Alnitak represented his Belt. His left shoulder was represented by Betelgeuse, and Mu Orionis made up his left arm. Meissa was his head, and Bellatrix his right shoulder. The depiction of Hyginus was similar to that of Cicero, though the two differed in a few important areas. Cicero's animal skin became Hyginus's shield (Omicron and Pi Orionis), and instead of an arm marked out by Mu Orionis, he holds a club (Chi Orionis). His right leg is represented by Theta Orionis and his left leg by Lambda, Mu, and Epsilon Leporis. Further Western European and Arabic depictions have followed these two models. Future Orion is located on the celestial equator, but it will not always be so located due to the effects of precession of the Earth's axis. Orion lies well south of the ecliptic, and it only happens to lie on the celestial equator because the point on the ecliptic that corresponds to the June solstice is close to the border of Gemini and Taurus, to the north of Orion. Precession will eventually carry Orion further south, and by AD 14000, Orion will be far enough south that it will no longer be visible from the latitude of Great Britain. Further in the future, Orion's stars will gradually move away from the constellation due to proper motion. However, Orion's brightest stars all lie at a large distance from Earth on an astronomical scale—much farther away than Sirius, for example. Orion will still be recognizable long after most of the other constellations—composed of relatively nearby stars—have distorted into new configurations, with the exception of a few of its stars eventually exploding as supernovae, for example Betelgeuse, which is predicted to explode sometime in the next million years.
========================================
[SOURCE: https://he.wikipedia.org/wiki/%D7%A7%D7%98%D7%92%D7%95%D7%A8%D7%99%D7%94:%D7%95%D7%99%D7%A7%D7%99%D7%A4%D7%93%D7%99%D7%94:_%D7%94%D7%A9%D7%9C%D7%9E%D7%94_-_%D7%9E%D7%97%D7%A9%D7%95%D7%91] | [TOKENS: 577]
Category:Wikipedia: Completion – Computing. Community portal > Maintenance > Articles requiring completion on the topic of computing. This category collects articles that require completion for one or more of the following reasons: Food • People • Art • Health • Geography • Religions • History • Judaism • Israel • Economics • Countries of Oceania • Countries of Europe • Countries of the Americas • Countries of Asia • Countries of Africa • Countries of the Middle East • Science and technology • Social sciences • Natural sciences • Humanities • Economics • Music • Computing • Sport • Literature • Politics • Military and security • Film and television • Transport • Popular culture • Portals. Pages in category "Wikipedia: Completion – Computing": this category page contains the following 200 pages, out of 288 in the category as a whole. (tree view)
========================================
[SOURCE: https://en.wikipedia.org/wiki/Exploration_of_Mars] | [TOKENS: 8386]
Exploration of Mars The planet Mars has been explored remotely by spacecraft. Probes sent from Earth, beginning in the late 20th century, have yielded a large increase in knowledge about the Martian system, focused primarily on understanding its geology and habitability potential. Engineering interplanetary journeys is complicated, and the exploration of Mars has experienced a high failure rate, especially among the early attempts. Roughly sixty percent of all spacecraft destined for Mars failed before completing their missions, with some failing before their observations could begin. Some missions have been met with unexpected success, such as the twin Mars Exploration Rovers, Spirit and Opportunity, which operated for years beyond their specification. Current status There are two functional rovers on the surface of Mars, the Curiosity and Perseverance rovers, both operated by the American space agency NASA. Perseverance was accompanied by the Ingenuity helicopter, which scouted sites for Perseverance to study before the helicopter's mission ended in 2024. The Zhurong rover, part of the Tianwen-1 mission by the China National Space Administration (CNSA), was active until 20 May 2022, when it went into hibernation due to approaching sandstorms and the Martian winter; the rover was expected to wake from hibernation in December 2022, but as of April 2023 it has not moved and is presumed to be permanently inactive. There are seven orbiters surveying the planet: Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, the Trace Gas Orbiter, the Hope Mars Mission, and the Tianwen-1 orbiter, which have contributed massive amounts of information about Mars. In total, then, nine vehicles are currently exploring Mars: two rovers and seven orbiters. Various Mars sample-return missions are being planned, such as the NASA–ESA Mars Sample Return campaign, which would pick up the samples currently being collected by the Perseverance rover. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Martian system Mars has long been the subject of human interest. Early telescopic observations revealed color changes on the surface that were attributed to seasonal vegetation, and apparent linear features were ascribed to intelligent design. Further telescopic observations found two moons, Phobos and Deimos, polar ice caps and the feature now known as Olympus Mons, the Solar System's tallest mountain. The discoveries piqued further interest in the study and exploration of the red planet. Mars is a rocky planet, like Earth, that formed around the same time, yet with only half the diameter of Earth and a thin atmosphere; it has a cold, desert-like surface. One way the surface of Mars has been categorized is by thirty "quadrangles", with each quadrangle named for a prominent physiographic feature within it. Launch windows The minimum-energy launch windows for a Martian expedition occur at intervals of approximately two years and two months (specifically 780 days, the planet's synodic period with respect to Earth). In addition, the lowest available transfer energy varies on a roughly 16-year cycle. For example, a minimum occurred in the 1969 and 1971 launch windows, rising to a peak in the late 1970s, and hitting another low in 1986 and 1988.
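The 780-day figure quoted above follows from the sidereal orbital periods of the two planets via the standard synodic-period relation 1/T_syn = 1/T_Earth − 1/T_Mars; a minimal check in Python, using commonly cited period values that are assumptions here rather than figures from the article:

```python
# Synodic period of Mars with respect to Earth.
T_EARTH_DAYS = 365.25   # Earth's sidereal orbital period (approx.)
T_MARS_DAYS = 686.98    # Mars's sidereal orbital period (approx.)

t_syn = 1.0 / (1.0 / T_EARTH_DAYS - 1.0 / T_MARS_DAYS)
print(round(t_syn, 1))  # ~779.9 days, about two years and two months
```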
Past missions Starting in 1960, the Soviet space program launched a series of probes to Mars, including the first intended (but unsuccessful) flybys and hard (impact) landing (Mars 1962B), and the first successful soft landing (Mars 3). The first successful flyby of Mars was on 14–15 July 1965, by NASA's Mariner 4. On November 14, 1971, Mariner 9 became the first space probe to orbit another planet when it entered into orbit around Mars. The amount of data returned by probes increased substantially as technology improved. The first to contact the surface were two Soviet probes: the Mars 2 lander on November 27 and the Mars 3 lander on December 2, 1971—Mars 2 failed during descent and Mars 3 failed about twenty seconds after the first Martian soft landing. Mars 6 failed during descent but did return some corrupted atmospheric data in 1974. The 1975 NASA launches of the Viking program consisted of two orbiters, each with a lander that successfully soft landed in 1976. Viking 1 remained operational for six years, Viking 2 for three years. The Viking landers relayed the first color panoramas of Mars. The Soviet probes Phobos 1 and 2 were sent to Mars in 1988 to study Mars and its two moons, with a focus on Phobos. Phobos 1 lost contact on the way to Mars. Phobos 2, while successfully photographing Mars and Phobos, failed before it was set to release two landers to the surface of Phobos. A number of missions launched after Phobos 1 and 2 (1988) also ended prematurely. Following the 1993 failure of the Mars Observer orbiter, the NASA Mars Global Surveyor achieved Mars orbit in 1997. This mission was a complete success, having finished its primary mapping mission in early 2001. Contact was lost with the probe in November 2006 during its third extended program, after exactly 10 operational years in space. The NASA Mars Pathfinder, carrying the robotic exploration vehicle Sojourner, landed in Ares Vallis on Mars in July 1997, returning many images. NASA's Mars Odyssey orbiter entered Mars orbit in 2001. Odyssey's Gamma Ray Spectrometer detected significant amounts of hydrogen in the upper metre or so of regolith on Mars. This hydrogen is thought to be contained in large deposits of water ice. The Mars Express mission of the European Space Agency (ESA) reached Mars in 2003. It carried the Beagle 2 lander, which was not heard from after being released and was declared lost in February 2004. Beagle 2 was located in January 2015 by the HiRISE camera on NASA's Mars Reconnaissance Orbiter (MRO), having landed safely but failed to fully deploy its solar panels and antenna. In early 2004, the Mars Express Planetary Fourier Spectrometer team announced that the orbiter had detected methane in the Martian atmosphere, a potential biosignature. ESA announced in June 2006 the discovery of aurorae on Mars by Mars Express. In January 2004, the NASA twin Mars Exploration Rovers, named Spirit (MER-A) and Opportunity (MER-B), landed on the surface of Mars. Both met and exceeded all their science objectives. Among the most significant scientific returns has been conclusive evidence that liquid water existed at some time in the past at both landing sites. Martian dust devils and windstorms have occasionally cleaned both rovers' solar panels, and thus increased their lifespans. Spirit (MER-A) was active until 2010, when it stopped sending data after becoming stuck in a sand dune and unable to reorient itself to recharge its batteries.
Rosetta came within 250 km of Mars during its 2007 flyby. Dawn flew by Mars in February 2009 for a gravity assist on its way to investigate Vesta and Ceres. Phoenix landed on the north polar region of Mars on May 25, 2008. Its robotic arm dug into the Martian soil, and the presence of water ice was confirmed on June 20, 2008. The mission concluded on November 10, 2008, after contact was lost. In 2008, the price of transporting material from the surface of Earth to the surface of Mars was approximately US$309,000 per kilogram. The Indian Space Research Organisation (ISRO) launched its Mars Orbiter Mission (MOM) on November 5, 2013; it was inserted into Mars orbit on September 24, 2014. ISRO is the fourth space agency to reach Mars, after the Soviet space program, NASA and ESA, and India became the first country to place a spacecraft into Mars orbit on its maiden attempt. The following is a brief overview of previous missions to Mars, oriented towards orbiters and flybys; see also Mars landing and Mars rover. Between 1960 and 1969, the Soviet Union launched nine probes intended to reach Mars. All of them failed: three at launch, three that failed to reach near-Earth orbit, one during the burn to put the spacecraft into trans-Mars trajectory, and two during interplanetary flight. The Mars 1M program (sometimes dubbed Mars-nik in Western media) was the first Soviet uncrewed interplanetary exploration program, consisting of two flyby probes launched towards Mars in October 1960, Mars 1960A and Mars 1960B (also known as Korabl 4 and Korabl 5, respectively). After launch, the third-stage pumps on both launchers were unable to develop enough pressure to commence ignition, so Earth parking orbit was not achieved. The spacecraft reached an altitude of 120 km before reentry. Mars 1962A was a Mars flyby mission, launched on October 24, 1962, and Mars 1962B an intended first Mars lander mission, launched in late December of the same year. Both failed, either breaking up as they were going into Earth orbit or having the upper stage explode in orbit during the burn to put the spacecraft into trans-Mars trajectory. Mars 1 (1962 Beta Nu 1), an automatic interplanetary spacecraft launched to Mars on November 1, 1962, was the first probe of the Soviet Mars probe program to achieve interplanetary orbit. Mars 1 was intended to fly by the planet at a distance of about 11,000 km and take images of the surface, as well as send back data on cosmic radiation, micrometeoroid impacts and Mars's magnetic field, radiation environment, atmospheric structure, and possible organic compounds. Sixty-one radio transmission sessions were held, initially at two-day intervals and later at five-day intervals, from which a large amount of interplanetary data was collected. On 21 March 1963, when the spacecraft was at a distance of 106,760,000 km from Earth on its way to Mars, communications ceased due to failure of its antenna orientation system. In 1964, both Soviet probe launches (Zond 1964A on June 4 and Zond 2 on November 30, part of the Zond program) resulted in failures. Zond 1964A failed at launch, while communication was lost with Zond 2 en route to Mars after a mid-course maneuver in early May 1965. In 1969, as part of the Mars probe program, the Soviet Union prepared two identical 5-ton orbiters, called M-69 and dubbed Mars 1969A and Mars 1969B by NASA. Both probes were lost in launch-related complications with the newly developed Proton rocket.
The USSR intended to place the first artificial satellite in orbit around Mars, beating the planned American Mariner 8 and Mariner 9 orbiters. In May 1971, one day after Mariner 8 malfunctioned at launch and failed to reach orbit, Cosmos 419 (Mars 1971C), a heavy probe of the Soviet Mars program M-71, also failed to launch. This spacecraft was designed as an orbiter only, while the next two probes of project M-71, Mars 2 and Mars 3, were multipurpose combinations of an orbiter and a lander carrying small ski-walking rovers (PrOP-M), which would have been the first rovers beyond the Moon. They were successfully launched in mid-May 1971 and reached Mars about seven months later. On November 27, 1971, the lander of Mars 2 crash-landed due to an on-board computer malfunction and became the first man-made object to reach the surface of Mars. On 2 December 1971, the Mars 3 lander became the first spacecraft to achieve a soft landing, but its transmission was interrupted after 14.5 seconds. The Mars 2 and 3 orbiters sent back a relatively large volume of data covering the period from December 1971 to March 1972, although transmissions continued through to August. By 22 August 1972, after sending back data and a total of 60 pictures, Mars 2 and 3 concluded their missions. The images and data enabled creation of surface relief maps, and gave information on the Martian gravity and magnetic fields. In 1973, the Soviet Union sent four more probes to Mars: the Mars 4 and Mars 5 orbiters and the Mars 6 and Mars 7 flyby/lander combinations. All missions except Mars 7 sent back data, with Mars 5 being most successful. Mars 5 transmitted just 60 images before a loss of pressurization in the transmitter housing ended the mission. The Mars 6 lander transmitted data during descent but failed upon impact. Mars 4 flew by the planet at a range of 2,200 km, returning one swath of pictures and radio occultation data, which constituted the first detection of the nightside ionosphere on Mars. The Mars 7 probe separated prematurely from the carrying vehicle due to a problem in the operation of one of the onboard systems (attitude control or retro-rockets) and missed the planet by 1,300 kilometres (8.7×10⁻⁶ au). In 1964, NASA's Jet Propulsion Laboratory made two attempts at reaching Mars. Mariner 3 and Mariner 4 were identical spacecraft designed to carry out the first flybys of Mars. Mariner 3 was launched on November 5, 1964, but the shroud encasing the spacecraft atop its rocket failed to open properly, dooming the mission. Three weeks later, on November 28, 1964, Mariner 4 was launched successfully on a 7½-month voyage to Mars. Mariner 4 flew past Mars on July 14, 1965, providing the first close-up photographs of another planet. The pictures, gradually played back to Earth from a small tape recorder on the probe, showed impact craters. It provided radically more accurate data about the planet; a surface atmospheric pressure of about 1% of Earth's and daytime temperatures of −100 °C (−148 °F) were estimated. The new data meant redesigns for then-planned Martian landers, and showed life would have a more difficult time surviving there than previously anticipated. NASA continued the Mariner program with another pair of Mars flyby probes, Mariner 6 and 7. They were sent at the next launch window, and reached the planet in 1969. During the following launch window the Mariner program again suffered the loss of one of a pair of probes.
Mariner 9 successfully entered orbit around Mars, the first spacecraft ever to do so, after the launch failure of its sister ship, Mariner 8. When Mariner 9 reached Mars in 1971, it and two Soviet orbiters (Mars 2 and Mars 3) found that a planet-wide dust storm was in progress. The mission controllers used the time spent waiting for the storm to clear to have the probe rendezvous with, and photograph, Phobos. When the storm cleared sufficiently for Mars's surface to be photographed by Mariner 9, the pictures returned represented a substantial advance over previous missions. These pictures were the first to offer more detailed evidence that liquid water might at one time have flowed on the planetary surface. They also finally discerned the true nature of many Martian albedo features. For example, Nix Olympica was one of only a few features that could be seen during the planetary dust storm, revealing it to be the highest mountain (in fact, a volcano) on any planet in the entire Solar System, and leading to its reclassification as Olympus Mons. The Viking program launched the Viking 1 and Viking 2 spacecraft to Mars in 1975; the program consisted of two orbiters and two landers, which became the second and third spacecraft to successfully land on Mars. In 1976, Viking 1 and Viking 2 touched down on the Martian surface. These landers were significantly larger than the Soviet Mars 3 lander (Viking 1 was 3,527 kilograms compared to the 358 kg Mars 3 lander). They were able to take the first photographs from the surface of Mars. Viking 1 operated on the surface of Mars for around six years (the lander stopped operating on November 11, 1982, after receiving a faulty command) and Viking 2 for over three years (its mission ended in early 1980). Both landers were equipped with a robotic sampler arm which successfully scooped up soil samples and tested them with instruments such as a gas chromatograph–mass spectrometer. The landers measured temperatures ranging from −86 °C before dawn to −33 °C in the afternoon. Both landers had issues obtaining accurate results from their seismometers. Photographs from the landers and orbiters surpassed expectations in quality and quantity; the total exceeded 4,500 from the landers and 52,000 from the orbiters. The Viking landers recorded atmospheric pressures ranging from below 7 millibars (0.007 bar) to over 10 millibars (0.010 bar) over the Martian year, leading to the conclusion that atmospheric pressure varies by 30 percent during the Martian year because carbon dioxide condenses and sublimes at the polar caps (a quick check of this figure follows below). Martian winds generally blow more slowly than expected: scientists had expected them to reach speeds of several hundred miles an hour from observing global dust storms, but neither lander recorded gusts over 120 kilometers (74 miles) an hour, and average velocities were considerably lower. Nevertheless, the orbiters observed more than a dozen small dust storms. The Viking landers detected nitrogen in the atmosphere for the first time and found it to be a significant component of the Martian atmosphere. The atmospheric analysis prompted speculation that the atmosphere of Mars was once much denser. The primary scientific objectives of the lander mission were to search for biosignatures and observe meteorologic, seismic and magnetic properties of Mars.
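As flagged above, the quoted 30 percent seasonal variation follows directly from the two pressure extremes. A back-of-envelope check (the rounded values below come from the figures in the text; nothing else is assumed):

```python
# Back-of-envelope check of the ~30% seasonal pressure variation
# reported by the Viking landers, using the extremes quoted above.
p_min = 7.0    # millibars, lowest surface pressure over the Martian year
p_max = 10.0   # millibars, highest surface pressure over the Martian year

variation = (p_max - p_min) / p_max
print(f"Relative variation: {variation:.0%}")  # -> "Relative variation: 30%"
```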
The results of the biological experiments on board the Viking landers remain inconclusive, with a reanalysis of the Viking data published in 2012 suggesting signs of microbial life on Mars. The Viking orbiters revealed that large floods of water carved deep valleys, eroded grooves into bedrock, and traveled thousands of kilometers. Areas of branched streams in the southern hemisphere suggest that rain once fell. Mars Pathfinder was a U.S. spacecraft that landed a base station with a roving probe on Mars on July 4, 1997. It consisted of a lander and a small 10.6-kilogram (23 lb) wheeled robotic rover named Sojourner, which was the first rover to operate on the surface of Mars. In addition to scientific objectives, the Mars Pathfinder mission was also a "proof-of-concept" for various technologies, such as an airbag landing system and automated obstacle avoidance, both later exploited by the Mars Exploration Rovers. After the 1993 failure of NASA's Mars Observer orbiter, NASA retooled and launched Mars Global Surveyor (MGS). Mars Global Surveyor launched on November 7, 1996, and entered orbit on September 12, 1997. After a year and a half trimming its orbit from a looping ellipse to a circular track around the planet, the spacecraft began its primary mapping mission in March 1999. It observed the planet from a low-altitude, nearly polar orbit over the course of one complete Martian year, the equivalent of nearly two Earth years. Mars Global Surveyor completed its primary mission on January 31, 2001, and completed several extended mission phases until communication was lost in November 2006. The mission studied the entire Martian surface, atmosphere, and interior, and returned more data about the red planet than all previous Mars missions combined. The data has been archived and remains publicly available. Among key scientific findings, Global Surveyor took pictures of gullies and debris flow features that suggest there may be current sources of liquid water, similar to an aquifer, at or near the surface of the planet. Similar channels on Earth are formed by flowing water, but on Mars the temperature is normally too cold and the atmosphere too thin to sustain liquid water. Nevertheless, many scientists hypothesize that liquid groundwater can sometimes surface on Mars, erode gullies and channels, and pool at the bottom before freezing and evaporating. Magnetometer readings showed that the planet's magnetic field is not globally generated in the planet's core, but is localized in particular areas of the crust. New temperature data and closeup images of the Martian moon Phobos showed that its surface is composed of powdery material at least 1 meter (3 feet) thick, caused by millions of years of meteoroid impacts. Data from the spacecraft's laser altimeter gave scientists their first 3-D views of Mars's north polar ice cap in January 1999. Faulty software uploaded to the vehicle in June 2006 caused the spacecraft to orient its solar panels incorrectly several months later, resulting in battery overheating and subsequent failure. On November 5, 2006, MGS lost contact with Earth, and NASA ended efforts to restore communication on January 28, 2007. In 2001, NASA's Mars Odyssey orbiter arrived at Mars. Its mission is to use spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.
In 2002, it was announced that the probe's gamma-ray spectrometer and neutron spectrometer had detected large amounts of hydrogen, indicating that there are vast deposits of water ice in the upper three meters of Mars's soil within 60° latitude of the south pole. On June 2, 2003, the European Space Agency's Mars Express set off from Baikonur Cosmodrome to Mars. The Mars Express craft consists of the Mars Express Orbiter and the stationary lander Beagle 2. The lander carried a digging device and the smallest mass spectrometer created to date, as well as a range of other devices, on a robotic arm in order to accurately analyze soil beneath the dusty surface to look for biosignatures and biomolecules. The orbiter entered Mars orbit on December 25, 2003, and Beagle 2 entered Mars's atmosphere the same day. However, attempts to contact the lander failed. Communications attempts continued throughout January, but Beagle 2 was declared lost in mid-February, and a joint inquiry was launched by the UK and ESA. The Mars Express Orbiter confirmed the presence of water ice and carbon dioxide ice at the planet's south pole, while NASA had previously confirmed their presence at the north pole of Mars. The lander's fate remained a mystery until it was located intact on the surface of Mars in a series of images from the Mars Reconnaissance Orbiter. The images suggest that two of the spacecraft's four solar panels failed to deploy, blocking the spacecraft's communications antenna. Beagle 2 was the first British and first European probe to achieve a soft landing on Mars. NASA's Mars Exploration Rover Mission (MER), started in 2003, was a robotic space mission involving two rovers, Spirit (MER-A) and Opportunity (MER-B), that explored the Martian surface geology. The mission's scientific objective was to search for and characterize a wide range of rocks and soils that hold clues to past water activity on Mars. The mission was part of NASA's Mars Exploration Program, which included three previous successful landers: the two Viking program landers in 1976 and the Mars Pathfinder probe in 1997. The ESA Rosetta space probe mission to the comet 67P/Churyumov-Gerasimenko flew within 250 km of Mars on February 25, 2007, in a gravitational slingshot designed to slow and redirect the spacecraft. The NASA Dawn spacecraft used the gravity of Mars in 2009 to change direction and velocity on its way to Vesta, and tested its cameras and other instruments on Mars. On November 8, 2011, Russia's Roscosmos launched an ambitious mission called Fobos-Grunt. It consisted of a lander intended to return a sample to Earth from Mars's moon Phobos, and it was also to place the Chinese Yinghuo-1 probe into Mars orbit. The Fobos-Grunt mission suffered a complete control and communications failure shortly after launch and was left stranded in low Earth orbit, later falling back to Earth. The Yinghuo-1 satellite and Fobos-Grunt underwent destructive re-entry on January 15, 2012, finally disintegrating over the Pacific Ocean. The Mars Orbiter Mission, also called Mangalyaan, was launched on 5 November 2013 by the Indian Space Research Organisation (ISRO). It was successfully inserted into Martian orbit on 24 September 2014. The mission was a technology demonstrator; as a secondary objective, it also studied the Martian atmosphere.
This was India's first mission to Mars, and with it, ISRO became the fourth space agency to successfully reach Mars after the Soviet Union, NASA (USA) and ESA (Europe). It was completed on a record-low budget of $71 million, making it the least-expensive Mars mission to date. The mission concluded on September 27, 2022, after contact was lost. In August 2012, NASA selected InSight, a $425 million lander mission with a heat flow probe and seismometer, to determine the deep interior structure of Mars. InSight landed successfully on Mars on 26 November 2018 and gathered valuable data on the atmosphere, the surface and the planet's interior. InSight's mission was declared ended on 21 December 2022. Two flyby CubeSats called MarCO were launched with InSight on 5 May 2018 to provide real-time telemetry during the entry and landing of InSight. The CubeSats separated from the Atlas V booster 1.5 hours after launch and traveled their own trajectories to Mars. Current missions On 10 March 2006, NASA's Mars Reconnaissance Orbiter (MRO) probe arrived in orbit to conduct a two-year science survey. The orbiter began mapping the Martian terrain and weather to find suitable landing sites for upcoming lander missions. The MRO captured the first image of a series of active avalanches near the planet's north pole in 2008. The Mars Science Laboratory mission was launched on November 26, 2011, and delivered the Curiosity rover to the surface of Mars on August 6, 2012 (UTC). It is larger and more advanced than the Mars Exploration Rovers, with a top speed of up to 90 meters per hour (295 feet per hour). Experiments include a laser chemical sampler that can deduce the composition of rocks at a distance of 7 meters. The MAVEN orbiter was launched on 18 November 2013, and on 22 September 2014, it was injected into an areocentric elliptic orbit 6,200 km (3,900 mi) by 150 km (93 mi) above the planet's surface to study its atmosphere. Mission goals include determining how the planet's atmosphere and water, presumed to have once been substantial, were lost over time. The ExoMars Trace Gas Orbiter arrived at Mars in 2016 and deployed the Schiaparelli EDM lander, a test lander. Schiaparelli crashed on the surface, but because it transmitted key data during its parachute descent, the test was declared a partial success. The Mars Reconnaissance Orbiter (MRO) is a multipurpose spacecraft designed to conduct reconnaissance and exploration of Mars from orbit. The US$720 million spacecraft was built by Lockheed Martin under the supervision of the Jet Propulsion Laboratory, launched August 12, 2005, and entered Mars orbit on March 10, 2006. The MRO contains a host of scientific instruments such as the HiRISE camera, CTX camera, CRISM, and SHARAD. The HiRISE camera is used to analyze Martian landforms, whereas CRISM and SHARAD can detect water, ice, and minerals on and below the surface. Additionally, MRO is paving the way for upcoming generations of spacecraft through daily monitoring of Martian weather and surface conditions, searching for future landing sites, and testing a new telecommunications system that enables it to send and receive information at an unprecedented bitrate compared to previous Mars spacecraft.
Data transfer to and from the spacecraft occurs faster than for all previous interplanetary missions combined, allowing it to serve as an important relay satellite for other missions. The NASA Mars Science Laboratory mission, with its rover Curiosity, was launched on November 26, 2011, and landed on Mars on August 6, 2012, on Aeolis Palus in Gale Crater. The rover carries instruments designed to look for conditions relevant to the past or present habitability of Mars. NASA's MAVEN is an orbiter mission to study the upper atmosphere of Mars. It also serves as a communications relay satellite for robotic landers and rovers on the surface of Mars. MAVEN was launched on 18 November 2013 and reached Mars on 22 September 2014. The ExoMars Trace Gas Orbiter is an atmospheric research orbiter built in collaboration between ESA and Roscosmos. It was injected into Mars orbit on 19 October 2016 to gain a better understanding of methane (CH4) and other trace gases present in the Martian atmosphere that could be evidence for possible biological or geological activity. The Schiaparelli EDM lander was destroyed while attempting to land on the surface of Mars. The United Arab Emirates launched the Hope Mars Mission in July 2020 on a Japanese H-IIA booster. It was successfully placed into orbit on 9 February 2021 and is studying the Martian atmosphere and weather. Tianwen-1, a Chinese mission launched on 23 July 2020, included an orbiter, a lander, and a 240-kilogram (530 lb) rover along with a package of deployable and remote cameras. Tianwen-1 entered orbit on 10 February 2021; the Zhurong rover successfully landed on 14 May 2021 and deployed on 22 May 2021. Zhurong operated for 347 Martian days and traveled 1,921 meters across Mars before entering a hibernation state in May 2022. The rover has not reawakened since, but the orbiter has continued to work. The Mars 2020 mission by NASA was launched on 30 July 2020 on a United Launch Alliance Atlas V rocket from Cape Canaveral. It is based on the Mars Science Laboratory design. The scientific payload is focused on astrobiology and includes the Perseverance rover and the now-retired Ingenuity helicopter. Unlike older rovers that relied on solar power, Perseverance is nuclear powered, allowing it to survive longer than its predecessors in the harsh, dusty environment. The car-size rover weighs about 1 ton, with a robotic arm that reaches about 7 feet (2.1 m), zoom cameras, a chemical analyzer and a rock drill. After traveling 293 million miles (471 million km) to reach Mars over the course of more than six months, Perseverance successfully landed on February 18, 2021. Its initial mission is set for at least one Martian year, or 687 Earth days. It will search for signs of ancient life and explore the red planet's surface. As of October 19, 2021, Perseverance had captured the first sounds from Mars. Recordings consisted of five hours of Martian wind gusts, rover wheels crunching over gravel, and motors whirring as the spacecraft moves its arm. The sounds give researchers clues about the atmosphere, such as how far sound travels on the planet. Several spacecraft have also used Mars for gravitational slingshots designed to redirect them toward their targets: NASA's Europa Clipper, bound for Jupiter and its moon Europa, flew past Mars on March 1, 2025, while ESA's Hera mission to Didymos and NASA's Psyche mission to the metal-rich asteroid 16 Psyche have flybys in March 2025 and May 2026 respectively.
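The gravitational slingshots mentioned above (Rosetta, Dawn, Europa Clipper, Hera, Psyche) can be made concrete with an idealized one-dimensional model: in the planet's rest frame a flyby only turns the spacecraft's velocity around, but transforming back to the Sun's frame changes its speed by up to twice the planet's orbital speed. A minimal sketch, with purely illustrative numbers:

```python
# Idealized head-on gravity assist in one dimension. In the planet's
# rest frame the encounter is elastic: the spacecraft's relative speed
# is preserved and only its direction flips. Transforming back to the
# Sun's frame changes the spacecraft's speed by up to twice the
# planet's orbital speed. All values are illustrative (km/s).
v_craft = 15.0     # spacecraft velocity in the Sun's frame
v_planet = -24.0   # planet velocity in the Sun's frame (opposite sense)

v_rel = v_craft - v_planet               # +39 km/s relative to the planet
v_rel_after = -v_rel                     # the flyby reverses the relative velocity
v_craft_after = v_rel_after + v_planet   # -63 km/s back in the Sun's frame

print(abs(v_craft_after) - abs(v_craft))  # speed gained: 48 = 2 * 24 km/s
```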
NASA's EscaPADE (Escape and Plasma Acceleration and Dynamics Explorers) is a twin-spacecraft orbiter mission to study the structure, composition, variability and dynamics of Mars's magnetosphere and atmospheric escape processes. The spacecraft were launched in November 2025. Future missions Other mission concepts include polar probes, Martian aircraft and a network of small meteorological stations. Long-term areas of study may include Martian lava tubes, resource utilization, and electronic charge carriers in rocks. Multiple Mars missions have been proposed but cancelled; see List of missions to Mars. Human mission proposals The human exploration of Mars has been an aspiration since the earliest days of modern rocketry; Robert H. Goddard credited the idea of reaching Mars as his own inspiration to study the physics and engineering of space flight. Proposals for human exploration of Mars have been made throughout the history of space exploration. Currently there are multiple active plans and programs to put humans on Mars within the next ten to thirty years, both governmental and private, some of which are listed below. Human exploration by the United States was identified as a long-term goal in the Vision for Space Exploration announced in 2004 by then US President George W. Bush. The planned Orion spacecraft would be used to send a human expedition to Earth's moon by 2020 as a stepping stone to a Mars expedition. On September 28, 2007, NASA administrator Michael D. Griffin stated that NASA aims to put a person on Mars by 2037. On December 2, 2014, NASA's Advanced Human Exploration Systems and Operations Mission Director Jason Crusan and Deputy Associate Administrator for Programs James Reuthner announced tentative support for the Boeing "Affordable Mars Mission Design", which included radiation shielding, centrifugal artificial gravity, in-transit consumable resupply, and a lander that can return. Reuthner suggested that if adequate funding was forthcoming, the proposed mission would be expected in the early 2030s. On October 8, 2015, NASA published its official plan for human exploration and colonization of Mars, called "Journey to Mars". The plan operates through three distinct phases leading to fully sustained colonization. On August 28, 2015, NASA funded a year-long simulation to study the effects of a year-long Mars mission on six scientists. The scientists lived in a biodome on Mauna Loa in Hawaii with limited connection to the outside world and were only allowed outside if they were wearing spacesuits. NASA's human Mars exploration plans have evolved through the NASA Mars Design Reference Missions, a series of design studies for human exploration of Mars. In 2017, the focus of NASA shifted to a return to the Moon by 2024 with the Artemis program; a flight to Mars could follow after this project. The long-term goal of the private corporation SpaceX is the establishment of routine flights to Mars to enable colonization. To this end, the company is developing Starship, a spacecraft capable of crew transportation to Mars and other celestial bodies, along with its booster Super Heavy. In 2016 SpaceX announced plans to send two uncrewed Starships to Mars by 2022, followed by two more uncrewed flights and two crewed flights in 2024. SpaceX is currently targeting the first uncrewed launches no earlier than 2026, with the first crewed flights no earlier than 2028.
Starship is planned to have a payload of at least 100 tonnes and is designed to use a combination of aerobraking and propulsive descent, using fuel produced at a Martian in-situ resource utilization facility. As of 2024, the Starship development program has seen multiple integrated test flights and is progressing towards full reusability. SpaceX's plans involve the mass manufacturing of Starships; a Mars colony would initially be sustained by resupply from Earth and by in-situ resource utilization on Mars, until it reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months (a quick check of this figure appears at the end of this section). Mars Direct, a low-cost human mission proposed by Robert Zubrin, founder of the Mars Society, would use heavy-lift Saturn V-class rockets, such as the Ares V, to skip orbital construction, LEO rendezvous, and lunar fuel depots. A modified proposal, called "Mars to Stay", involves not returning the first immigrant explorers immediately, if ever (see Colonization of Mars). Probe difficulties The challenge, complexity and length of Mars missions have led to many mission failures. The high failure rate of missions attempting to explore Mars is informally called the "Mars Curse" or "Martian Curse". The phrase "Galactic Ghoul" or "Great Galactic Ghoul" refers to a fictitious space monster that subsists on a diet of Mars probes, and is sometimes facetiously used to "explain" the recurring difficulties. Two Soviet probes were sent to Mars in 1988 as part of the Phobos program. Phobos 1 operated normally until an expected communications session on 2 September 1988 failed to occur. The problem was traced to a software error, which deactivated Phobos 1's attitude thrusters, causing the spacecraft's solar arrays to no longer point at the Sun and depleting Phobos 1's batteries. Phobos 2 operated normally throughout its cruise and Mars orbital insertion phases on January 29, 1989, gathering data on the Sun, interplanetary medium, Mars, and Phobos. Shortly before the final phase of the mission, during which the spacecraft was to approach within 50 m of Phobos's surface and release two landers (one a mobile 'hopper', the other a stationary platform), contact with Phobos 2 was lost. The mission ended when the spacecraft's signal could not be reacquired on March 27, 1989. The cause of the failure was determined to be a malfunction of the on-board computer. Just a few years later, Mars Observer, launched by NASA in 1992, failed as it approached Mars in 1993. Mars 96, an orbiter launched by Russia on November 16, 1996, failed when the planned second burn of the Block D-2 fourth stage did not occur. Following the success of Global Surveyor and Pathfinder, another spate of failures occurred in 1998 and 1999, with the Japanese Nozomi orbiter and NASA's Mars Climate Orbiter, Mars Polar Lander, and Deep Space 2 penetrators all suffering various fatal errors. The Mars Climate Orbiter was noted for mixing up U.S. customary units with metric units, causing the orbiter to burn up while entering Mars's atmosphere (a small illustration of this failure mode also appears at the end of this section). The European Space Agency has also attempted to land two probes on the Martian surface: Beagle 2, a British-built lander that failed to deploy its solar arrays properly after touchdown in December 2003, and Schiaparelli, which was flown with the ExoMars Trace Gas Orbiter. Contact with the Schiaparelli EDM lander was lost 50 seconds before touchdown.
It was later confirmed that the lander struck the surface at a high velocity, possibly exploding.
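As flagged above, the 26-month launch-window cadence is just the Earth–Mars synodic period, which follows from the two sidereal orbital periods. A back-of-envelope check (the orbital periods are standard values, not taken from this article):

```python
# Earth–Mars synodic period: how often the two planets return to the
# same relative geometry, setting the ~26-month launch-window cadence.
EARTH_PERIOD_DAYS = 365.25   # Earth's sidereal orbital period
MARS_PERIOD_DAYS = 686.98    # Mars's sidereal orbital period

synodic_days = 1 / (1 / EARTH_PERIOD_DAYS - 1 / MARS_PERIOD_DAYS)
print(f"{synodic_days:.0f} days ≈ {synodic_days / 30.44:.1f} months")
# -> 780 days ≈ 25.6 months, i.e. roughly every 26 months
```

The Mars Climate Orbiter loss described above also admits a compact illustration. The sketch below is hypothetical and simplified (the real failure involved thruster impulse data delivered in pound-force seconds where newton-seconds were expected); it only shows how an unchecked unit mismatch silently corrupts a calculation:

```python
# Hypothetical, simplified illustration of the Mars Climate Orbiter
# failure mode: one component reports an impulse in pound-force seconds,
# another consumes it as if it were newton-seconds, with no unit check.
LBF_S_TO_N_S = 4.44822  # one pound-force second in newton-seconds

impulse_lbf_s = 100.0                       # value produced in lbf*s
impulse_used = impulse_lbf_s                # consumed as N*s (the bug)
impulse_correct = impulse_lbf_s * LBF_S_TO_N_S

error = (impulse_correct - impulse_used) / impulse_correct
print(f"Impulse understated by {error:.0%}")  # ~78% too small
```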
========================================
[SOURCE: https://en.wikipedia.org/wiki/Graph_drawing] | [TOKENS: 723]
Graph drawing Graph drawing is an area of mathematics and computer science combining methods from geometric graph theory and information visualization to derive two-dimensional (or, sometimes, three-dimensional) depictions of graphs arising from applications such as social network analysis, cartography, linguistics, and bioinformatics. A drawing of a graph or network diagram is a pictorial representation of the vertices and edges of a graph. This drawing should not be confused with the graph itself: very different layouts can correspond to the same graph. In the abstract, all that matters is which pairs of vertices are connected by edges. In the concrete, however, the arrangement of these vertices and edges within a drawing affects its understandability, usability, fabrication cost, and aesthetics. The problem gets worse if the graph changes over time by adding and deleting edges (dynamic graph drawing) and the goal is to preserve the user's mental map. Graphical conventions Graphs are frequently drawn as node–link diagrams in which the vertices are represented as disks, boxes, or textual labels and the edges are represented as line segments, polylines, or curves in the Euclidean plane. Node–link diagrams can be traced back to the 14th–16th-century works of Pseudo-Lull, which were published under the name of Ramon Llull, a 13th-century polymath. Pseudo-Lull drew diagrams of this type for complete graphs in order to analyze all pairwise combinations among sets of metaphysical concepts. In the case of directed graphs, arrowheads form a commonly used graphical convention to show their orientation; however, user studies have shown that other conventions such as tapering provide this information more effectively. Upward planar drawing uses the convention that every edge is oriented from a lower vertex to a higher vertex, making arrowheads unnecessary. Alternative conventions to node–link diagrams include adjacency representations such as circle packings, in which vertices are represented by disjoint regions in the plane and edges are represented by adjacencies between regions; intersection representations in which vertices are represented by non-disjoint geometric objects and edges are represented by their intersections; visibility representations in which vertices are represented by regions in the plane and edges are represented by regions that have an unobstructed line of sight to each other; confluent drawings, in which edges are represented as smooth curves within mathematical train tracks; fabrics, in which nodes are represented as horizontal lines and edges as vertical lines; and visualizations of the adjacency matrix of the graph. Quality measures Many different quality measures have been defined for graph drawings, in an attempt to find objective means of evaluating their aesthetics and usability. In addition to guiding the choice between different layout methods for the same graph, some layout methods attempt to directly optimize these measures. Layout methods There are many different graph layout strategies. Application-specific graph drawings Graphs and graph drawings arise in many other areas of application. In addition, the placement and routing steps of electronic design automation (EDA) are similar in many ways to graph drawing, as is the problem of greedy embedding in distributed computing, and the graph drawing literature includes several results borrowed from the EDA literature.
However, these problems also differ in several important ways: for instance, in EDA, area minimization and signal length are more important than aesthetics, and the routing problem in EDA may have more than two terminals per net, while the analogous problem in graph drawing generally only involves pairs of vertices for each edge. Graph drawing algorithms There are many algorithms for graph drawing; one widely used family, force-directed layout, is sketched below. Software A wide range of software, systems, and providers of systems exist for drawing graphs.
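As a concrete illustration of the force-directed family mentioned above, here is a minimal Fruchterman–Reingold-style sketch in Python. The example graph, the ideal edge length, the step cap, and the iteration count are all illustrative assumptions rather than values from the article: vertices repel one another while edges pull their endpoints together, and the positions settle into a readable drawing.

```python
import math
import random

# Minimal force-directed (Fruchterman-Reingold-style) layout sketch.
# The example graph and tuning constants are illustrative assumptions.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
k = 0.5  # ideal edge length
pos = [(random.random(), random.random()) for _ in range(n)]

for _ in range(200):  # fixed number of relaxation iterations
    disp = [[0.0, 0.0] for _ in range(n)]
    # Repulsive force between every pair of vertices: k^2 / d
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
            d = math.hypot(dx, dy) or 1e-9
            f = k * k / d
            disp[i][0] += f * dx / d; disp[i][1] += f * dy / d
            disp[j][0] -= f * dx / d; disp[j][1] -= f * dy / d
    # Attractive force along each edge: d^2 / k
    for i, j in edges:
        dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
        d = math.hypot(dx, dy) or 1e-9
        f = d * d / k
        disp[i][0] -= f * dx / d; disp[i][1] -= f * dy / d
        disp[j][0] += f * dx / d; disp[j][1] += f * dy / d
    # Move each vertex a small, capped step along its net displacement
    for i in range(n):
        dx, dy = disp[i]
        d = math.hypot(dx, dy) or 1e-9
        step = min(d, 0.05)
        pos[i] = (pos[i][0] + dx / d * step, pos[i][1] + dy / d * step)

print(pos)  # final 2-D coordinates for each vertex
```

Practical implementations typically add a cooling schedule and faster approximations of the repulsive term so the method scales to large graphs.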
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_note-11] | [TOKENS: 9291]
Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases.
The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. 
The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and CompuServe established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use; one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than satellites could provide. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic started to exhibit growth characteristics similar to those of the scaling of MOS transistors, exemplified by Moore's law: doubling every 18 months (see the quick conversion below). This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services.
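For intuition, the 18-month doubling quoted above converts to an annual growth rate as follows (a back-of-envelope conversion, not a figure from the article):

```python
# Convert "doubling every 18 months" into an equivalent annual growth
# factor: growth compounds, so one year is 12/18 of a doubling period.
doubling_months = 18
annual_factor = 2 ** (12 / doubling_months)
print(f"x{annual_factor:.2f} per year (~{annual_factor - 1:.0%} annual growth)")
# -> x1.59 per year (~59% annual growth)
```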
During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost. Social impact The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%.
In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population having access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly. Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information for the majority of the global North population". Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications. Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic.
The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. Other video sharing websites include Vimeo, Instagram and TikTok. Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023, Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home. The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to easily form, cheaply communicate, and share ideas.
An example of collaborative software is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice). Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work. The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work. The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, they may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites. Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated to users' loneliness. Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread. Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving scan-reading skills while interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and -consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide.
Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick and mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people. At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing to carry out their missions, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region. E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards.
In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.[citation needed] Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for example to transfer, share, and exchange business data; HTTP is one of many protocols that can be used for communication on the Internet. Web browsers, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enable users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP).[citation needed] VoIP systems now dominate many markets, being as easy to use and as convenient as a traditional telephone while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to a file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed, usually fully encrypted, across the Internet. The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.
While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting working groups, open to any individual, on the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other, less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region: AFRINIC for Africa, ARIN for North America, APNIC for Asia and the Pacific, LACNIC for Latin America and the Caribbean, and RIPE NCC for Europe, the Middle East, and Central Asia.[citation needed] The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.[citation needed] Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems. However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se.
Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End users, who only access the Internet when needed to perform a function or obtain information, represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide Internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, Internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber-optic submarine communication cables, connecting the Internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, based on its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123: the link layer, the internet layer, the transport layer, and the application layer.[citation needed] The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct Internet packets to their destinations.
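To make the layering concrete, a minimal sketch follows, seen from an application's point of view: an ordinary web request is application-layer HTTP carried over transport-layer TCP, which the operating system in turn carries in IP packets. It uses only Python's standard library; the host example.com (a domain reserved for documentation) and the specific calls shown are illustrative assumptions, not a description of any particular system.

```python
# A minimal sketch of the TCP/IP layering as seen by an application.
# Assumes outbound network access; example.com is a reserved example domain.
import socket
import http.client

# Name resolution maps the host name to one or more IP addresses
# (IPv4 and/or IPv6), the identifiers the internet layer routes on.
for family, _, _, _, sockaddr in socket.getaddrinfo("example.com", 80,
                                                    type=socket.SOCK_STREAM):
    print(family.name, sockaddr[0])

# Application layer: HTTP. The connection object wraps a TCP socket
# (transport layer), whose segments travel inside IP packets (internet layer).
conn = http.client.HTTPConnection("example.com", 80, timeout=10)
conn.request("GET", "/")             # send an HTTP GET request
response = conn.getresponse()        # read the status line and headers
print(response.status, response.reason)
body = response.read()               # the resource body, as bytes
print(len(body), "bytes received")
conn.close()
```

The point of the sketch is that the application never touches IP packets directly: it asks for a connection to a name and a port, and each layer supplies delivery over the one beneath it.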
IP addresses consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol (DHCP), or are configured manually.[citation needed] The Domain Name System (DNS) converts user-entered domain names (e.g. "en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (10⁹) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields: the network number or routing prefix, and the rest field or host identifier. The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2⁹⁶ addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols.
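The CIDR arithmetic just described can be checked directly. Below is a minimal sketch using Python's standard ipaddress module, reusing the 198.51.100.0/24 and 2001:db8::/32 examples from the text; the host address 198.51.100.42 and the three-entry routing table are illustrative assumptions.

```python
# Minimal sketch of CIDR prefixes, netmasks, subnet membership, and
# longest-prefix-match routing, using the example networks from the text.
import ipaddress

net4 = ipaddress.ip_network("198.51.100.0/24")
print(net4.netmask)                # 255.255.255.0: the /24 prefix in dot-decimal
print(net4.num_addresses)          # 256: 8 host bits give 2**8 addresses
print(net4.network_address, net4.broadcast_address)
# 198.51.100.0 198.51.100.255, the range quoted in the text

# Membership: ANDing an address with the netmask yields the routing prefix.
addr = ipaddress.ip_address("198.51.100.42")     # illustrative host address
print(addr in net4)                # True

net6 = ipaddress.ip_network("2001:db8::/32")
print(net6.num_addresses == 2**96)               # True: 128 - 32 = 96 host bits

# A routing table chooses the most specific matching prefix (longest-prefix
# match); 0.0.0.0/0 matches every address and so acts as the default route.
table = [ipaddress.ip_network(p)
         for p in ("0.0.0.0/0", "198.51.100.0/24", "198.51.100.0/25")]
best = max((n for n in table if addr in n), key=lambda n: n.prefixlen)
print(best)                        # 198.51.100.0/25, the longest matching prefix
```

The last three statements anticipate the routing discussion that follows: an end node whose table holds only 0.0.0.0/0 simply forwards everything to its default gateway.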
End nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control in order to cause interruptions, commit fraud, engage in blackmail, or access private information. Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare with similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure, such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by federal law enforcement agencies. Under the Act, all U.S. telecommunications providers are required to install packet sniffing technology to allow federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, access to certain types of web sites, or communication via email or chat with certain parties. Agencies such as the Information Awareness Office, NSA, GCHQ and the FBI spend billions of dollars per year to develop, purchase, implement, and operate systems for the interception and analysis of data. Similar systems are operated by the Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by the German company Siemens AG and the Finnish company Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters.
In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by the authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, its content is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet, but do not mandate filter software. Many free or commercially available software programs, called content-control software, are available to users to block specific offensive content on individual computers or networks, in order to limit children's access to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Figure: Global Internet traffic volume in petabytes per month, 1990–2015.] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to their small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests. Estimates of the Internet's electricity usage have been controversial: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade that differed by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones, and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters, and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure.
The study also said that online video streaming alone accounted for 60% of this data transfer, and therefore contributed over 300 million tonnes of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Middle_East#cite_note-16] | [TOKENS: 6152]
Contents Middle East The Middle East[b] is a geopolitical region encompassing the Arabian Peninsula, Egypt, Iran, Iraq, the Levant, and Turkey. The term came into widespread usage by Western European nations in the early 20th century as a replacement for the term Near East (both were in contrast to the Far East). The term "Middle East" has led to some confusion over its changing definitions. Since the late 20th century, it has been criticized as being too Eurocentric. The region includes the vast majority of the territories included in the closely associated definition of West Asia, but without the South Caucasus. It also includes all of Egypt (not just the Sinai region) and all of Turkey (including East Thrace). Most Middle Eastern countries (13 out of 18) are part of the Arab world. The three most populous countries in the region are Egypt, Iran, and Turkey, while Saudi Arabia is the largest Middle Eastern country by area. The history of the Middle East dates back to ancient times, and the region was long considered the "cradle of civilization". Its geopolitical importance has been recognized and competed for over millennia. The Abrahamic religions (Judaism, Christianity, and Islam) have their origins in the Middle East. Arabs constitute the main ethnic group in the region, followed by Turks, Persians, Kurds, Jews, and Assyrians. The Middle East generally has a hot, arid climate, especially in the Arabian and Egyptian regions. Several major rivers provide irrigation to support agriculture in limited areas, such as the Nile Delta in Egypt, the Tigris and Euphrates watersheds of Mesopotamia, and the basin of the Jordan River, which spans most of the Levant. These regions are collectively known as the Fertile Crescent and comprise the core of what historians have long referred to as the cradle of civilization; multiple regions of the world have since been classified as also having developed independent, original civilizations. Conversely, the Levantine coast and most of Turkey have relatively temperate climates typical of the Mediterranean, with dry summers and cool, wet winters. Most of the countries that border the Persian Gulf have vast reserves of petroleum, and monarchs of the Arabian Peninsula in particular have benefitted economically from petroleum exports. Because of the arid climate and the region's dependence on the fossil fuel industry, the Middle East is both a major contributor to climate change and a region expected to be severely adversely affected by it. Other concepts of the region exist, including the broader Middle East and North Africa (MENA), which includes states of the Maghreb and the Sudan. The term "Greater Middle East" also includes Afghanistan, Mauritania, and Pakistan, as well as parts of East Africa, and sometimes Central Asia and the South Caucasus. Terminology The term "Middle East" may have originated in the 1850s in the British India Office. However, it became more widely known when United States naval strategist Alfred Thayer Mahan used the term in 1902 to "designate the area between Arabia and India". During this time the British and Russian empires were vying for influence in Central Asia, a rivalry that would become known as the Great Game. Mahan realized not only the strategic importance of the region, but also of its center, the Persian Gulf. He labeled the area surrounding the Persian Gulf as the Middle East.
He said that, beyond Egypt's Suez Canal, the Gulf was the most important passage for Britain to control in order to keep the Russians from advancing towards British India. Mahan first used the term in his article "The Persian Gulf and International Relations", published in September 1902 in the National Review, a British journal. The Middle East, if I may adopt a term which I have not seen, will some day need its Malta, as well as its Gibraltar; it does not follow that either will be in the Persian Gulf. Naval force has the quality of mobility which carries with it the privilege of temporary absences; but it needs to find on every scene of operation established bases of refit, of supply, and in case of disaster, of security. The British Navy should have the facility to concentrate in force if occasion arise, about Aden, India, and the Persian Gulf. Mahan's article was reprinted in The Times and followed in October by a 20-article series entitled "The Middle Eastern Question", written by Sir Ignatius Valentine Chirol. During this series, Sir Ignatius expanded the definition of the Middle East to include "those regions of Asia which extend to the borders of India or command the approaches to India." After the series ended in 1903, The Times removed quotation marks from subsequent uses of the term. Until World War II, it was customary to refer to areas centered on Turkey and the eastern shore of the Mediterranean as the "Near East", while the "Far East" centered on China, India and Japan. The Middle East was then defined as the area from Mesopotamia to Burma, namely the area between the Near East and the Far East; this area broadly corresponds to South Asia. In the late 1930s, the British established the Middle East Command, based in Cairo, for its military forces in the region. After that time, the term "Middle East" gained broader usage in Europe and the United States; following World War II, for example, the Middle East Institute was founded in Washington, D.C. in 1946. The corresponding adjective is Middle Eastern and the derived noun is Middle Easterner. While non-Eurocentric terms such as "Southwest Asia" or "Swasia" have been sparsely used, the inclusion of Egypt, an African country, in the region challenges the usefulness of such terms. The description "Middle" has also led to some confusion over changing definitions. Before the First World War, "Near East" was used in English to refer to the Balkans and the Ottoman Empire, while "Middle East" referred to the Caucasus, Persia, and Arabian lands, and sometimes Afghanistan, India and others. In contrast, "Far East" referred to the countries of East Asia (e.g. China, Japan, and Korea). With the collapse of the Ottoman Empire in 1918, "Near East" largely fell out of common use in English, while "Middle East" came to be applied to the emerging independent countries of the Islamic world. However, the usage "Near East" was retained by a variety of academic disciplines, including archaeology and ancient history, where it describes an area identical to the term Middle East, which is not used by these disciplines (see ancient Near East).[citation needed] The first official use of the term "Middle East" by the United States government was in the 1957 Eisenhower Doctrine, which pertained to the Suez Crisis.
Secretary of State John Foster Dulles defined the Middle East as "the area lying between and including Libya on the west and Pakistan on the east, Syria and Iraq on the North and the Arabian peninsula to the south, plus the Sudan and Ethiopia." In 1958, the State Department explained that the terms "Near East" and "Middle East" were interchangeable, and defined the region as including only Egypt, Syria, Israel, Lebanon, Jordan, Iraq, Saudi Arabia, Kuwait, Bahrain, and Qatar. Since the late 20th century, scholars and journalists from the region, such as the journalist Louay Khraish and the historian Hassan Hanafi, have criticized the use of "Middle East" as a Eurocentric and colonialist term. The Associated Press Stylebook of 2004 says that Near East formerly referred to the farther west countries while Middle East referred to the eastern ones, but that the two are now synonymous. It instructs: "Use Middle East unless Near East is used by a source in a story. Mideast is also acceptable, but Middle East is preferred." European languages have adopted terms similar to Near East and Middle East. Since these are based on a relative description, the meanings depend on the country and are generally different from the English terms. In German, the term Naher Osten (Near East) is still in common use (nowadays the term Mittlerer Osten is more and more common in press texts translated from English sources, albeit having a distinct meaning). In several Slavic languages, such as Russian (Ближний Восток, Blizhniy Vostok), Bulgarian (Близкия Изток), Polish (Bliski Wschód), and Croatian (Bliski istok), terms meaning Near East are the only appropriate ones for the region. However, some European languages do have "Middle East" equivalents, such as French Moyen-Orient, Swedish Mellanöstern, Spanish Oriente Medio or Medio Oriente, Greek Μέση Ανατολή (Mesi Anatoli), and Italian Medio Oriente.[c] Perhaps because of the political influence of the United States and Europe, and the prominence of the Western press, the Arabic equivalent of Middle East (Arabic: الشرق الأوسط ash-Sharq al-Awsaṭ) has become standard usage in the mainstream Arabic press, carrying the same meaning as the term "Middle East" in North American and Western European usage. The designation Mashriq, also from the Arabic root for East, denotes a variously defined region around the Levant, the eastern part of the Arabic-speaking world (as opposed to the Maghreb, the western part). Even though the term originated in the West, countries of the Middle East that use languages other than Arabic also use it in translation: for instance, the Persian equivalent for Middle East is خاورمیانه (Khāvar-e miyāneh), the Hebrew is המזרח התיכון (hamizrach hatikhon), and the Turkish is Orta Doğu. Countries and territory Traditionally included within the Middle East are Arabia, Asia Minor, East Thrace, Egypt, Iran, the Levant, Mesopotamia, and the Socotra Archipelago. The region includes 17 UN-recognized countries and one British Overseas Territory. Various concepts are often paralleled to the Middle East, most notably the Near East, Fertile Crescent, and Levant. These are geographical concepts referring to large sections of the modern-day Middle East, with the Near East being the closest to the Middle East in its geographical meaning. Because it is primarily Arabic-speaking, the Maghreb region of North Africa is sometimes included.
"Greater Middle East" is a political term coined by the second Bush administration in the first decade of the 21st century to denote various countries, pertaining to the Muslim world, specifically Afghanistan, Iran, Pakistan, and Turkey. Various Central Asian countries are sometimes also included. History The Middle East lies at the juncture of Africa and Eurasia and of the Indian Ocean and the Mediterranean Sea (see also: Indo-Mediterranean). It is the birthplace and spiritual center of religions such as Christianity, Islam, Judaism, Manichaeism, Yezidi, Druze, Yarsan, and Mandeanism, and in Iran, Mithraism, Zoroastrianism, Manicheanism, and the Baháʼí Faith. Throughout its history the Middle East has been a major center of world affairs; a strategically, economically, politically, culturally, and religiously sensitive area. The region is one of the regions where agriculture was independently discovered, and from the Middle East it was spread, during the Neolithic, to different regions of the world such as Europe, the Indus Valley and Eastern Africa. Prior to the formation of civilizations, advanced cultures formed all over the Middle East during the Stone Age. The search for agricultural lands by agriculturalists, and pastoral lands by herdsmen meant different migrations took place within the region and shaped its ethnic and demographic makeup. The Middle East is widely and most famously known as the cradle of civilization. The world's earliest civilizations, Mesopotamia (Sumer, Akkad, Assyria and Babylonia), ancient Egypt and Kish in the Levant, all originated in the Fertile Crescent and Nile Valley regions of the ancient Near East. These were followed by the Hittite, Greek, Hurrian and Urartian civilisations of Asia Minor; Elam, Persia and Median civilizations in Iran, as well as the civilizations of the Levant (such as Ebla, Mari, Nagar, Ugarit, Canaan, Aramea, Mitanni, Phoenicia and Israel) and the Arabian Peninsula (Magan, Sheba, Ubar). The Near East was first largely unified under the Neo Assyrian Empire, then the Achaemenid Empire followed later by the Macedonian Empire and after this to some degree by the Iranian empires (namely the Parthian and Sassanid Empires), the Roman Empire and Byzantine Empire. The region served as the intellectual and economic center of the Roman Empire and played an exceptionally important role due to its periphery on the Sassanid Empire. Thus, the Romans stationed up to five or six of their legions in the region for the sole purpose of defending it from Sassanid and Bedouin raids and invasions. From the 4th century CE onwards, the Middle East became the center of the two main powers at the time, the Byzantine Empire and the Sassanid Empire. However, it would be the later Islamic Caliphates of the Middle Ages, or Islamic Golden Age which began with the Islamic conquest of the region in the 7th century AD, that would first unify the entire Middle East as a distinct region and create the dominant Islamic Arab ethnic identity that largely (but not exclusively) persists today. The 4 caliphates that dominated the Middle East for more than 600 years were the Rashidun Caliphate, the Umayyad caliphate, the Abbasid caliphate and the Fatimid caliphate. Additionally, the Mongols would come to dominate the region, the Kingdom of Armenia would incorporate parts of the region to their domain, the Seljuks would rule the region and spread Turko-Persian culture, and the Franks would found the Crusader states that would stand for roughly two centuries. 
Josiah Russell estimates the population of what he calls "Islamic territory" at roughly 12.5 million in 1000 – Anatolia 8 million, Syria 2 million, and Egypt 1.5 million. From the 16th century onward, the Middle East came to be dominated, once again, by two main powers: the Ottoman Empire and the Safavid dynasty. The modern Middle East began after World War I, when the Ottoman Empire, which was allied with the Central Powers, was defeated by the Allies and partitioned into a number of separate nations, initially under British and French mandates. Other defining events in this transformation included the establishment of Israel in 1948 and the eventual departure of the European powers, notably Britain and France, by the end of the 1960s. They were supplanted in some part by the rising influence of the United States from the 1970s onwards. In the 20th century, the region's significant stocks of crude oil gave it new strategic and economic importance. Mass production of oil began around 1945, with Saudi Arabia, Iran, Kuwait, Iraq, and the United Arab Emirates having large quantities of oil. Estimated oil reserves, especially in Saudi Arabia and Iran, are some of the highest in the world, and the international oil cartel OPEC is dominated by Middle Eastern countries. During the Cold War, the Middle East was a theater of ideological struggle between the two superpowers and their allies: NATO and the United States on one side, and the Soviet Union and the Warsaw Pact on the other, as they competed to influence regional allies. Besides political motives, there was also an "ideological conflict" between the two systems. Moreover, as Louise Fawcett argues, among many important areas of contention, or perhaps more accurately of anxiety, were, first, the desires of the superpowers to gain strategic advantage in the region, second, the fact that the region contained some two-thirds of the world's oil reserves in a context where oil was becoming increasingly vital to the economy of the Western world [...] Within this contextual framework, the United States sought to divert the Arab world from Soviet influence. Throughout the 20th and 21st centuries, the region has experienced both periods of relative peace and tolerance and periods of conflict, particularly between Sunnis and Shiites. Geography In 2018, the MENA region emitted 3.2 billion tonnes of carbon dioxide and produced 8.7% of global greenhouse gas emissions (GHG), despite making up only 6% of the global population. These emissions are mostly from the energy sector, an integral component of many Middle Eastern and North African economies due to the extensive oil and natural gas reserves found within the region. The Middle East is one of the regions most vulnerable to climate change. The impacts include increases in drought conditions, aridity, heatwaves and sea level rise. Sharp global temperature and sea level changes, shifting precipitation patterns and an increased frequency of extreme weather events are some of the main impacts of climate change identified by the Intergovernmental Panel on Climate Change (IPCC). The MENA region is especially vulnerable to such impacts due to its arid and semi-arid environment, facing climatic challenges such as low rainfall, high temperatures and dry soil. The climatic conditions that foster such challenges for MENA are projected by the IPCC to worsen throughout the 21st century.
If greenhouse gas emissions are not significantly reduced, part of the MENA region risks becoming uninhabitable before the year 2100. Climate change is expected to put significant strain on already scarce water and agricultural resources within the MENA region, threatening the national security and political stability of all the countries included. Over 60 percent of the region's population lives in high or very high water-stressed areas, compared to the global average of 35 percent. This has prompted some MENA countries to engage with the issue of climate change on an international level through environmental accords such as the Paris Agreement. Law and policy are also being established at the national level among MENA countries, with a focus on the development of renewable energies. Economy Middle Eastern economies range from very poor (such as Gaza and Yemen) to extremely wealthy (such as Qatar and the UAE). According to the International Monetary Fund, the three largest Middle Eastern economies by nominal GDP in 2023 were Saudi Arabia ($1.06 trillion), Turkey ($1.03 trillion), and Israel ($0.54 trillion). In nominal GDP per capita, the highest-ranking countries are Qatar ($83,891), Israel ($55,535), the United Arab Emirates ($49,451) and Cyprus ($33,807). Turkey ($3.6 trillion), Saudi Arabia ($2.3 trillion), and Iran ($1.7 trillion) had the largest economies in terms of GDP at purchasing power parity (PPP). In GDP (PPP) per capita, the highest-ranking countries are Qatar ($124,834), the United Arab Emirates ($88,221), Saudi Arabia ($64,836), Bahrain ($60,596) and Israel ($54,997). The lowest-ranking country in the Middle East, in terms of nominal GDP per capita, is Yemen ($573). The economic structures of Middle Eastern nations differ in that while some are heavily dependent on the export of oil and oil-related products (Saudi Arabia, the UAE and Kuwait), others have a highly diverse economic base (such as Cyprus, Israel, Turkey and Egypt). Industries of the Middle Eastern region include oil and oil-related products, agriculture, cotton, cattle, dairy, textiles, leather products, surgical instruments, and defence equipment (guns, ammunition, tanks, submarines, fighter jets, UAVs, and missiles). Banking is an important sector, especially for the UAE and Bahrain. With the exception of Cyprus, Turkey, Egypt, Lebanon and Israel, tourism has been a relatively undeveloped area of the economy, in part because of the socially conservative nature of the region as well as political turmoil in certain areas. Since the end of the COVID-19 pandemic, however, countries such as the UAE, Bahrain, and Jordan have begun attracting greater numbers of tourists thanks to improving tourist facilities and the relaxation of tourism-related restrictive policies. Unemployment is high in the Middle East and North Africa region, particularly among people aged 15–29, a demographic representing 30% of the region's population. The total regional unemployment rate in 2025 is 10.8%, and among youth it is as high as 28%. Demographics Arabs constitute the largest ethnic group in the Middle East, followed by various Iranian peoples and then by Turkic peoples (Turkish, Azeris, Syrian Turkmen, and Iraqi Turkmen). Native ethnic groups of the region include, in addition to Arabs, Arameans, Assyrians, Baloch, Berbers, Copts, Druze, Greek Cypriots, Jews, Kurds, Lurs, Mandaeans, Persians, Samaritans, Shabaks, Tats, and Zazas.
European ethnic groups that form a diaspora in the region include Albanians, Bosniaks, Circassians (including Kabardians), Crimean Tatars, Greeks, Franco-Levantines, Italo-Levantines, and Iraqi Turkmens. Among other migrant populations are Chinese, Filipinos, Indians, Indonesians, Pakistanis, Pashtuns, Romani, and Afro-Arabs. "Migration has always provided an important vent for labor market pressures in the Middle East. For the period between the 1970s and 1990s, the Arab states of the Persian Gulf in particular provided a rich source of employment for workers from Egypt, Yemen and the countries of the Levant, while Europe had attracted young workers from North African countries due both to proximity and the legacy of colonial ties between France and the majority of North African states." According to the International Organization for Migration, there are 13 million first-generation migrants from Arab nations in the world, of whom 5.8 million reside in other Arab countries. Expatriates from Arab countries contribute to the circulation of financial and human capital in the region and thus significantly promote regional development. In 2009, Arab countries received a total of US$35.1 billion in remittance inflows, and remittances sent to Jordan, Egypt and Lebanon from other Arab countries were 40 to 190 per cent higher than trade revenues between these and other Arab countries. In Somalia, the Somali Civil War has greatly increased the size of the Somali diaspora, as many of the best-educated Somalis left for Middle Eastern countries as well as Europe and North America. Non-Arab Middle Eastern countries such as Turkey, Israel and Iran are also subject to important migration dynamics. A fair proportion of those migrating from Arab nations are from ethnic and religious minorities facing persecution and are not necessarily ethnic Arabs, Iranians or Turks.[citation needed] Large numbers of Kurds, Jews, Assyrians, Greeks and Armenians, as well as many Mandeans, have left nations such as Iraq, Iran, Syria and Turkey for these reasons during the last century. In Iran, many religious minorities such as Christians, Baháʼís, Jews and Zoroastrians have left since the Islamic Revolution of 1979. The Middle East is very diverse when it comes to religions, many of which originated there. Islam is the largest religion in the Middle East, but other faiths that originated there, such as Judaism and Christianity, are also well represented. Christian communities have played a vital role in the Middle East; they represent 78% of the population of Cyprus and 40.5% of that of Lebanon, where the Lebanese president, half of the cabinet, and half of the parliament follow one of the various Lebanese Christian rites. There are also important minority religions such as the Baháʼí Faith, Yarsanism, Yazidism, Zoroastrianism, Mandaeism, the Druze faith, and Shabakism, and in ancient times the region was home to Mesopotamian religions, Canaanite religions, Manichaeism, Mithraism and various monotheist gnostic sects. The six top languages, in terms of numbers of speakers, are Arabic, Persian, Turkish, Kurdish, Modern Hebrew and Greek. About 20 minority languages are also spoken in the Middle East. Arabic, with all its dialects, is the most widely spoken language in the Middle East, with Literary Arabic being official in all North African and most West Asian countries. Arabic dialects are also spoken in some adjacent areas in neighbouring Middle Eastern non-Arab countries. Arabic is a member of the Semitic branch of the Afro-Asiatic languages.
Several Modern South Arabian languages such as Mehri and Soqotri are also spoken in Yemen and Oman. Another Semitic language is Aramaic, whose dialects are spoken mainly by Assyrians and Mandaeans; Western Aramaic is still spoken in two villages near Damascus, Syria. There is also an Oasis Berber-speaking community in Egypt, where the language is known as Siwa; it is a non-Semitic Afro-Asiatic sister language. Persian is the second-most spoken language. While it is primarily spoken in Iran and some border areas of neighbouring countries, Iran is one of the region's largest and most populous countries. Persian belongs to the Indo-Iranian branch of the family of Indo-European languages. Other Western Iranic languages spoken in the region include Achomi, Daylami, Kurdish dialects, Semnani and Lurish, among many others. The close third-most widely spoken language, Turkish, is largely confined to Turkey, which is also one of the region's largest and most populous countries, but it is present in areas of neighbouring countries. It is a member of the Turkic languages, which have their origins in East Asia. Another Turkic language, Azerbaijani, is spoken by Azerbaijanis in Iran. The fourth-most widely spoken language, Kurdish, is spoken in Iran, Iraq, Syria and Turkey; Sorani Kurdish became the second official language of Iraq, after Arabic, under the 2005 constitution. Hebrew is the official language of Israel, with Arabic given a special status after a 2018 Basic Law lowered it from its prior status as an official language. Hebrew is spoken and used by over 80% of Israel's population, the other 20% using Arabic. Modern Hebrew only began to be spoken in the 20th century, after being revived in the late 19th century by Eliezer Ben-Yehuda (Eliezer Perlman) and European Jewish settlers, with the first native Hebrew speaker being born in 1882. Greek is one of the two official languages of Cyprus, and the country's main language. Small communities of Greek speakers exist all around the Middle East; until the 20th century Greek was also widely spoken in Asia Minor (being the second-most spoken language there, after Turkish) and in Egypt. During antiquity, Ancient Greek was the lingua franca for many areas of the western Middle East, and it remained widely spoken there until the Muslim expansion. Until the late 11th century, it was also the main spoken language in Asia Minor; after that it was gradually replaced by Turkish as the Anatolian Turks expanded and the local Greeks were assimilated, especially in the interior. English is one of the official languages of Akrotiri and Dhekelia. It is also commonly taught and used as a second language in countries such as Egypt, Jordan, Iran, Iraq, Qatar, Bahrain, the United Arab Emirates and Kuwait, and is a main language in some emirates of the United Arab Emirates. It is also spoken as a native language by Jewish immigrants from Anglophone countries (the UK, the US, Australia) in Israel, and is widely understood as a second language there. French is taught and used in many government facilities and media in Lebanon, and is taught in some primary and secondary schools of Egypt and Syria. Maltese, a Semitic language mainly spoken in Europe, is used by the Franco-Maltese diaspora in Egypt. Due to widespread immigration of French Jews to Israel, French is the native language of approximately 200,000 Jews in Israel. Armenian speakers are also to be found in the region, and Georgian is spoken by the Georgian diaspora.
Russian is spoken by a large portion of the Israeli population because of emigration in the late 1990s. Russian is today a popular unofficial language in Israel; news outlets, radio stations and signboards in Russian can be found around the country, after Hebrew and Arabic. Circassian is also spoken by the diaspora in the region and by almost all Circassians in Israel, who speak Hebrew and English as well. The largest Romanian-speaking community in the Middle East is found in Israel, where, as of 1995, Romanian was spoken by 5% of the population.[d] Bengali, Hindi and Urdu are widely spoken by migrant communities in many Middle Eastern countries, such as Saudi Arabia (where 20–25% of the population is South Asian), the United Arab Emirates (where 50–55% of the population is South Asian), and Qatar, which have large numbers of Pakistani, Bangladeshi and Indian immigrants. Culture The Middle East has recently become a more prominent host of global sporting events, owing to its wealth and its desire to diversify its economy. The South Asian diaspora is a major backer of cricket in the region.
========================================