[SOURCE: https://en.wikipedia.org/wiki/Neolithic#Western_Asia] | [TOKENS: 6637]
Contents Neolithic The Neolithic or New Stone Age (from Greek νέος néos 'new' and λίθος líthos 'stone') is an archaeological period, the final division of the Stone Age in Mesopotamia, Asia, Europe and Africa (c. 10,000 BC to c. 2,000 BC). It saw the Neolithic Revolution, a wide-ranging set of developments that appear to have arisen independently in several parts of the world. This "Neolithic package" included the introduction of farming, domestication of animals, and change from a hunter-gatherer lifestyle to one of settlement. The term 'Neolithic' was coined by John Lubbock in 1865 as a refinement of the three-age system. The Neolithic began about 12,000 years ago, when farming appeared in the Epipalaeolithic Near East and Mesopotamia, and later in other parts of the world. It lasted in the Near East until the transitional period of the Chalcolithic (Copper Age) from about 6,500 years ago (4500 BC), marked by the development of metallurgy, leading up to the Bronze Age and Iron Age. In other places, the Neolithic followed the Mesolithic (Middle Stone Age) and then lasted until later. In Ancient Egypt, the Neolithic lasted until the Protodynastic period, c. 3150 BC. In China, it lasted until circa 2000 BC with the rise of the pre-Shang Erlitou culture, as it did in Scandinavia. Origin Following the ASPRO chronology, the Neolithic started in around 10,200 BC in the Levant, arising from the Natufian culture, when pioneering use of wild cereals evolved into early farming. The Natufian period or "proto-Neolithic" lasted from 12,500 to 9,500 BC, and is taken to overlap with the Pre-Pottery Neolithic A (PPNA) of 10,200–8800 BC. As the Natufians had become dependent on wild cereals in their diet, and a sedentary way of life had begun among them, the climatic changes associated with the Younger Dryas (about 10,000 BC) are thought to have forced people to develop farming. The founder crops of the Fertile Crescent were wheat, lentil, pea, chickpea, bitter vetch, and flax. Among the other major crops to be domesticated were rice and millet. Crops were usually domesticated in a single location and ancestral wild species are still found. Early Neolithic age farming was limited to a narrow range of plants, both wild and domesticated, which included einkorn wheat, millet and spelt, and the keeping of dogs. By about 8000 BC, it included domesticated sheep and goats, cattle and pigs. Not all of these cultural elements characteristic of the Neolithic appeared everywhere in the same order: the earliest farming societies in the Near East did not use pottery. In other parts of the world, such as Africa, South Asia and Southeast Asia, independent domestication events led to their own regionally distinctive Neolithic cultures, which arose completely independently of those in Europe and Southwest Asia. Early Japanese societies and other East Asian cultures used pottery before developing agriculture. Periods by region In the Middle East, cultures identified as Neolithic began appearing in the 10th millennium BC. Early development occurred in the Levant (e.g. Pre-Pottery Neolithic A and Pre-Pottery Neolithic B) and from there spread eastwards and westwards. 
Neolithic cultures are also attested in southeastern Anatolia and northern Mesopotamia by around 8000 BC.[citation needed] Anatolian Neolithic farmers derived a significant portion of their ancestry from the Anatolian hunter-gatherers (AHG), suggesting that agriculture was adopted in site by these hunter-gatherers and not spread by demic diffusion into the region. The Neolithic 1 (PPNA) period began around 10,000 BC in the Levant. A temple area in southeastern Turkey at Göbekli Tepe, dated to around 9500 BC, may be regarded as the beginning of the period. This site was developed by nomadic hunter-gatherer tribes, as shown by the absence of permanent housing nearby, and may be the oldest known human-made place of worship. At least seven stone circles, covering 25 acres (10 ha), contain limestone pillars carved with animals, insects, and birds. Stone tools were used by perhaps as many as hundreds of people to create the pillars, which might have supported roofs. Other early PPNA sites dating to around 9500–9000 BC have been found in Palestine, notably in Tell es-Sultan (ancient Jericho) and Gilgal in the Jordan Valley; Israel (notably Ain Mallaha, Nahal Oren, and Kfar HaHoresh); and in Byblos, Lebanon. The start of Neolithic 1 overlaps the Tahunian and Heavy Neolithic periods to some degree.[citation needed] The major advance of Neolithic 1 was true farming. In the proto-Neolithic Natufian cultures, wild cereals were harvested, and perhaps early seed selection and re-seeding occurred. The grain was ground into flour. Emmer wheat was domesticated, and animals were herded and domesticated (animal husbandry and selective breeding).[citation needed] In 2006, remains of figs were discovered in a house in Jericho dated to 9400 BC. The figs are of a mutant variety that cannot be pollinated by insects, and therefore the trees can only reproduce from cuttings. This evidence suggests that figs were the first cultivated crop and mark the invention of the technology of farming. This occurred centuries before the first cultivation of grains. Settlements became more permanent, with circular houses, much like those of the Natufians, with single rooms. However, these houses were for the first time made of mudbrick. The settlement had a surrounding stone wall and perhaps a stone tower (as in Jericho). The wall served as protection from nearby groups, as protection from floods, or to keep animals penned. Some of the enclosures also suggest grain and meat storage. The Neolithic 2 (PPNB) began around 8800 BC according to the ASPRO chronology in the Levant (Jericho, West Bank). As with the PPNA dates, there are two versions from the same laboratories noted above. This system of terminology, however, is not convenient for southeast Anatolia and settlements of the middle Anatolia basin.[citation needed] A settlement of 3,000 inhabitants called 'Ain Ghazal was found in the outskirts of Amman, Jordan. Considered to be one of the largest prehistoric settlements in the Near East, it was continuously inhabited from approximately 7250 BC to approximately 5000 BC. Settlements have rectangular mud-brick houses where the family lived together in single or multiple rooms. Burial findings suggest an ancestor cult where people preserved skulls of the dead, which were plastered with mud to make facial features. 
The rest of the corpse could have been left outside the settlement to decay until only the bones were left, then the bones were buried inside the settlement underneath the floor or between houses.[citation needed] Work at the site of 'Ain Ghazal in Jordan has indicated a later Pre-Pottery Neolithic C period. Juris Zarins has proposed that a Circum Arabian Nomadic Pastoral Complex developed in the period from the climatic crisis of 6200 BC, partly as a result of an increasing emphasis in PPNB cultures upon domesticated animals, and a fusion with Harifian hunter gatherers in the Southern Levant, with affiliate connections with the cultures of Fayyum and the Eastern Desert of Egypt. Cultures practicing this lifestyle spread down the Red Sea shoreline and moved east from Syria into southern Iraq. The Late Neolithic began around 6,400 BC in the Fertile Crescent. By then distinctive cultures emerged, with pottery like the Halafian (Turkey, Syria, Northern Mesopotamia) and Ubaid (Southern Mesopotamia). This period has been further divided into PNA (Pottery Neolithic A) and PNB (Pottery Neolithic B) at some sites. The Chalcolithic (Stone-Bronze) period began about 4500 BC, then the Bronze Age began about 3500 BC, replacing the Neolithic cultures.[citation needed] Around 10,000 BC the first fully developed Neolithic cultures belonging to the phase Pre-Pottery Neolithic A (PPNA) appeared in the Fertile Crescent. Around 10,700–9400 BC a settlement was established in Tell Qaramel, 10 miles (16 km) north of Aleppo. The settlement included two temples dating to 9650 BC. Around 9000 BC during the PPNA, one of the world's first towns, Jericho, appeared in the Levant. It was surrounded by a stone wall, may have contained a population of up to 2,000–3,000 people, and contained a massive stone tower. Around 6400 BC the Halaf culture appeared in Syria and Northern Mesopotamia. In 1981, a team of researchers from the Maison de l'Orient et de la Méditerranée, including Jacques Cauvin and Oliver Aurenche, divided Near East Neolithic chronology into ten periods (0 to 9) based on social, economic and cultural characteristics. In 2002, Danielle Stordeur and Frédéric Abbès advanced this system with a division into five periods. They also advanced the idea of a transitional stage between the PPNA and PPNB between 8800 and 8600 BC at sites like Jerf el Ahmar and Tell Aswad. Alluvial plains (Sumer/Elam). Low rainfall makes irrigation systems necessary. Ubaid culture originated from 6200 BC. The earliest evidence of Neolithic culture in northeast Africa was found in the archaeological sites of Bir Kiseiba and Nabta Playa in what is now southwest Egypt. Domestication of sheep and goats reached Egypt from the Near East possibly as early as 6000 BC. Graeme Barker states "The first indisputable evidence for domestic plants and animals in the Nile valley is not until the early fifth millennium BC in northern Egypt and a thousand years later further south, in both cases as part of strategies that still relied heavily on fishing, hunting, and the gathering of wild plants" and suggests that these subsistence changes were not due to farmers migrating from the Near East but was an indigenous development, with cereals either indigenous or obtained through exchange. Other scholars argue that the primary stimulus for agriculture and domesticated animals (as well as mud-brick architecture and other Neolithic cultural features) in Egypt was from the Middle East. 
The neolithization of Northwestern Africa was initiated by Iberian, Levantine (and perhaps Sicilian) migrants around 5500–5300 BC. During the Early Neolithic period, farming was introduced by Europeans and was subsequently adopted by the locals. During the Middle Neolithic period, an influx of ancestry from the Levant appeared in Northwestern Africa, coinciding with the arrival of pastoralism in the region. The earliest evidence for pottery, domestic cereals and animal husbandry is found in Morocco, specifically at Kaf el-Ghar. The Pastoral Neolithic was a period in Africa's prehistory marking the beginning of food production on the continent following the Later Stone Age. In contrast to the Neolithic in other parts of the world, which saw the development of farming societies, the first form of African food production was mobile pastoralism, or ways of life centered on the herding and management of livestock. The term "Pastoral Neolithic" is used most often by archaeologists to describe early pastoralist periods in the Sahara, as well as in eastern Africa. The Savanna Pastoral Neolithic or SPN (formerly known as the Stone Bowl Culture) is a collection of ancient societies that appeared in the Rift Valley of East Africa and surrounding areas during a time period known as the Pastoral Neolithic. They were South Cushitic speaking pastoralists, who tended to bury their dead in cairns whilst their toolkit was characterized by stone bowls, pestles, grindstones and earthenware pots. Through archaeology, historical linguistics and archaeogenetics, they conventionally have been identified with the area's first Afroasiatic-speaking settlers. Archaeological dating of livestock bones and burial cairns has also established the cultural complex as the earliest center of pastoralism and stone construction in the region. In southeast Europe agrarian societies first appeared in the 7th millennium BC, attested by one of the earliest farming sites of Europe, discovered in Vashtëmi, southeastern Albania and dating back to 6500 BC. In most of Western Europe it followed over the next two thousand years, but in some parts of Northwest Europe it is much later, lasting just under 3,000 years from c. 4500 BC–1700 BC. Recent advances in archaeogenetics have confirmed that the spread of agriculture from the Middle East to Europe was strongly correlated with the migration of early farmers from Anatolia about 9,000 years ago, and was not just a cultural exchange. Anthropomorphic figurines have been found in the Balkans from 6000 BC, and in Central Europe by around 5800 BC (La Hoguette). Among the earliest cultural complexes of this area are the Sesklo culture in Thessaly, which later expanded in the Balkans giving rise to Starčevo-Körös (Cris), Linearbandkeramik, and Vinča. Through a combination of cultural diffusion and migration of peoples, the Neolithic traditions spread west and northwards to reach northwestern Europe by around 4500 BC. The Vinča culture may have created the earliest system of writing, the Vinča signs, though archaeologist Shan Winn believes they most likely represented pictograms and ideograms rather than a truly developed form of writing. The Cucuteni-Trypillian culture built enormous settlements in Romania, Moldova and Ukraine from 5300 to 2300 BC. The megalithic temple complexes of Ġgantija on the Mediterranean island of Gozo (in the Maltese archipelago) and of Mnajdra (Malta) are notable for their gigantic Neolithic structures, the oldest of which date back to around 3600 BC. 
The Hypogeum of Ħal-Saflieni, Paola, Malta, is a subterranean structure excavated around 2500 BC; originally a sanctuary, it became a necropolis, the only prehistoric underground temple in the world, and shows a degree of artistry in stone sculpture unique in prehistory to the Maltese islands. After 2500 BC, these islands were depopulated for several decades until the arrival of a new influx of Bronze Age immigrants, a culture that cremated its dead and introduced smaller megalithic structures called dolmens to Malta. In most cases there are small chambers here, with the cover made of a large slab placed on upright stones. They are claimed to belong to a population different from that which built the previous megalithic temples. It is presumed the population arrived from Sicily because of the similarity of Maltese dolmens to some small constructions found there. With some exceptions, population levels rose rapidly at the beginning of the Neolithic until they reached the carrying capacity. This was followed by a population crash of "enormous magnitude" after 5000 BC, with levels remaining low during the next 1,500 years. Populations began to rise after 3500 BC, with further dips and rises occurring between 3000 and 2500 BC but varying in date between regions. Around this time is the Neolithic decline, when populations collapsed across most of Europe, possibly caused by climatic conditions, plague, or mass migration. Settled life, encompassing the transition from foraging to farming and pastoralism, began in South Asia in the region of Balochistan, Pakistan, around 7,000 BC. At the site of Mehrgarh, Balochistan, presence can be documented of the domestication of wheat and barley, rapidly followed by that of goats, sheep, and cattle. In April 2006, it was announced in the scientific journal Nature that the oldest (and first Early Neolithic) evidence for the drilling of teeth in vivo (using bow drills and flint tips) was found in Mehrgarh. In South India, the Neolithic began by 3000 BC and lasted until around 1400 BC when the Megalithic transition period began. South Indian Neolithic is characterized by Ash mounds (created from ritual burning of wood, dung and animal matter) from 2500 BC in Karnataka region, expanded later to Tamil Nadu. In East Asia, the earliest sites include the Nanzhuangtou culture around 9500–9000 BC, Pengtoushan culture around 7500–6100 BC, and Peiligang culture around 7000–5000 BC. The prehistoric Beifudi site near Yixian in Hebei Province, China, contains relics of a culture contemporaneous with the Cishan and Xinglongwa cultures of about 6000–5000 BC, Neolithic cultures east of the Taihang Mountains, filling in an archaeological gap between the two Northern Chinese cultures. The total excavated area is more than 1,200 square yards (1,000 m2; 0.10 ha), and the collection of Neolithic findings at the site encompasses two phases. Between 3000 and 1900 BC, the Longshan culture existed in the middle and lower Yellow River valley areas of northern China. Towards the end of the 3rd millennium BC, the population decreased sharply in most of the region and many of the larger centres were abandoned, possibly due to environmental change linked to the end of the Holocene Climatic Optimum. The 'Neolithic' (defined in this paragraph as using polished stone implements) remains a living tradition in small and extremely remote and inaccessible pockets of West Papua. 
Polished stone adze and axes are used in the present day (as of 2008[update]) in areas where the availability of metal implements is limited. This is likely to cease altogether in the next few years as the older generation die off and steel blades and chainsaws prevail.[citation needed] In 2012, news was released about a new farming site discovered in Munam-ri, Goseong, Gangwon Province, South Korea, which may be the earliest farmland known to date in east Asia. "No remains of an agricultural field from the Neolithic period have been found in any East Asian country before, the institute said, adding that the discovery reveals that the history of agricultural cultivation at least began during the period on the Korean Peninsula". The farm was dated between 3600 and 3000 BC. Pottery, stone projectile points, and possible houses were also found. "In 2002, researchers discovered prehistoric earthenware, jade earrings, among other items in the area". The research team will perform accelerator mass spectrometry (AMS) dating to retrieve a more precise date for the site. In Mesoamerica, a similar set of events (i.e., crop domestication and sedentary lifestyles) occurred by around 4500 BC in South America, but possibly as early as 11,000–10,000 BC. These cultures are usually not referred to as belonging to the Neolithic; in North America, different terms are used such as Formative stage instead of mid-late Neolithic, Archaic Era instead of Early Neolithic, and Paleo-Indian for the preceding period. The Formative stage is equivalent to the Neolithic Revolution period in Europe, Asia, and Africa. In the southwestern United States it occurred from 500 to 1200 AD when there was a dramatic increase in population and development of large villages supported by agriculture based on dryland farming of corn (maize), and later, beans, squash, and domesticated turkeys. During this period the bow and arrow and ceramic pottery were also introduced. In later periods cities of considerable size developed, and some metallurgy by 700 BC. Australia, in contrast to New Guinea, has generally been held not to have had a Neolithic period, with a hunter-gatherer lifestyle continuing until the arrival of Europeans. This view can be challenged in terms of the definition of agriculture, but "Neolithic" remains a rarely used and not very useful concept in discussing Australian prehistory. Cultural characteristics During most of the Neolithic age of Eurasia, people lived in small tribes composed of multiple bands or lineages. There is little scientific evidence of developed social stratification in most Neolithic societies; social stratification is more associated with the later Bronze Age. Although some late Eurasian Neolithic societies formed complex stratified chiefdoms or even states, generally states evolved in Eurasia only with the rise of metallurgy, and most Neolithic societies on the whole were relatively simple and egalitarian. Beyond Eurasia, however, states were formed during the local Neolithic in three areas, namely in the Preceramic Andes with the Caral-Supe Civilization, Formative Mesoamerica and Ancient Hawaiʻi. However, most Neolithic societies were noticeably more hierarchical than the Upper Paleolithic cultures that preceded them and hunter-gatherer cultures in general. The domestication of large animals (c. 8000 BC) resulted in a dramatic increase in social inequality in most of the areas where it occurred; New Guinea being a notable exception. 
Possession of livestock allowed competition between households and resulted in inherited inequalities of wealth. Neolithic pastoralists who controlled large herds gradually acquired more livestock, and this made economic inequalities more pronounced. However, evidence of social inequality is still disputed, as settlements such as Çatalhöyük reveal a lack of difference in the size of homes and burial sites, suggesting a more egalitarian society with no evidence of the concept of capital, although some homes do appear slightly larger or more elaborately decorated than others. Families and households were still largely independent economically, and the household was probably the center of life. However, excavations in Central Europe have revealed that early Neolithic Linear Ceramic cultures ("Linearbandkeramik") were building large arrangements of circular ditches between 4800 and 4600 BC. These structures (and their later counterparts such as causewayed enclosures, burial mounds, and henges) required considerable time and labour to construct, which suggests that some influential individuals were able to organise and direct human labour – though non-hierarchical and voluntary work remain possibilities. There is a large body of evidence for fortified settlements at Linearbandkeramik sites along the Rhine, as at least some villages were fortified for some time with a palisade and an outer ditch. Settlements with palisades and weapon-traumatized bones, such as those found at the Talheim Death Pit, have been discovered and demonstrate that "...systematic violence between groups" and warfare was probably much more common during the Neolithic than in the preceding Paleolithic period. This supplanted an earlier view of the Linear Pottery Culture as living a "peaceful, unfortified lifestyle". Violence increased toward the end of this culture, which existed from 5500 to 4500 BC. In 2024, a study suggested a peaceful explanation for the reduction in the size of the male population observed worldwide 5000–3000 years ago. Control of labour and inter-group conflict is characteristic of tribal groups with social rank that are headed by a charismatic individual – either a 'big man' or a proto-chief – functioning as a lineage-group head. Whether a non-hierarchical system of organization existed is debatable, and there is no evidence that explicitly suggests that Neolithic societies functioned under any dominating class or individual, as was the case in the chiefdoms of the European Early Bronze Age. Possible exceptions to this include Iraq during the Ubaid period and England beginning in the Early Neolithic (4100–3000 BC). Theories to explain the apparent implied egalitarianism of Neolithic (and Paleolithic) societies have arisen, notably the Marxist concept of primitive communism.[citation needed] Phylogenies reconstructed from modern genetic data indicate that an extreme drop in Y-chromosomal diversity occurred during the Neolithic, with effective population size for the mitochondria up to 17 times higher than for the Y-chromosomes during this period. The causes of this bottleneck remain poorly understood. At a basic level, it can likely be attributed to a culture-induced change in the distribution of male reproductive success, with possible explanations ranging from an increased incidence of violence and male mortality during the Neolithic to the rise of patrilineal segmentary groups with varying reproductive success due to polygyny. 
The shelter of early people changed dramatically from the Upper Paleolithic to the Neolithic era. In the Paleolithic, people did not normally live in permanent constructions. In the Neolithic, mud brick houses started appearing that were coated with plaster. This increased use of clay for building, along with the development of pottery and other clay-based artifacts, has led some to refer to the Neolithic period as the Age of Clay. The growth of agriculture made permanent houses far more common. At Çatalhöyük 9,000 years ago, doorways were made on the roof, with ladders positioned both on the inside and outside of the houses. Stilt-house settlements were common in the Alpine and Pianura Padana (Terramare) region. Remains have been found in the Ljubljana Marsh in Slovenia and at the Mondsee and Attersee lakes in Upper Austria, for example. A significant and far-reaching shift in human subsistence and lifestyle was to be brought about in areas where crop farming and cultivation were first developed: the previous reliance on an essentially nomadic hunter-gatherer subsistence technique or pastoral transhumance was at first supplemented, and then increasingly replaced by, a reliance upon the foods produced from cultivated lands. These developments are also believed to have greatly encouraged the growth of settlements, since it may be supposed that the increased need to spend more time and labor in tending crop fields required more localized dwellings. This trend would continue into the Bronze Age, eventually giving rise to permanently settled farming towns, and later cities and states whose larger populations could be sustained by the increased productivity from cultivated lands. The profound differences in human interactions and subsistence methods associated with the onset of early agricultural practices in the Neolithic have been called the Neolithic Revolution, a term coined in the 1920s by the Australian archaeologist Vere Gordon Childe. One potential benefit of the development and increasing sophistication of farming technology was the possibility of producing surplus crop yields, in other words, food supplies in excess of the immediate needs of the community. Surpluses could be stored for later use, or possibly traded for other necessities or luxuries. Agricultural life afforded securities that nomadic life could not, and sedentary farming populations grew faster than nomadic. However, early farmers were also adversely affected in times of famine, such as may be caused by drought or pests. In instances where agriculture had become the predominant way of life, the sensitivity to these shortages could be particularly acute, affecting agrarian populations to an extent that otherwise may not have been routinely experienced by prior hunter-gatherer communities. Nevertheless, agrarian communities generally proved successful, and their growth and the expansion of territory under cultivation continued. Another significant change undergone by many of these newly agrarian communities was one of diet. Pre-agrarian diets varied by region, season, available local plant and animal resources and degree of pastoralism and hunting. Post-agrarian diet was restricted to a limited package of successfully cultivated cereal grains, plants and to a variable extent domesticated animals and animal products. Supplementation of diet by hunting and gathering was to variable degrees precluded by the increase in population above the carrying capacity of the land and a high sedentary local population concentration. 
In some cultures, there would have been a significant shift toward increased starch and plant protein. The relative nutritional benefits and drawbacks of these dietary changes and their overall impact on early societal development are still debated. In addition, increased population density, decreased population mobility, increased continuous proximity to domesticated animals, and continuous occupation of comparatively population-dense sites would have altered sanitation needs and patterns of disease. The identifying characteristic of Neolithic technology is the use of polished or ground stone tools, in contrast to the flaked stone tools used during the Paleolithic era. Neolithic people were skilled farmers, manufacturing a range of tools necessary for the tending, harvesting and processing of crops (such as sickle blades and grinding stones) and food production (e.g. pottery, bone implements). They were also skilled manufacturers of a range of other types of stone tools and ornaments, including projectile points, beads, and statuettes. But what allowed forest clearance on a large scale was the polished stone axe above all other tools. Together with the adze, fashioning wood for shelter, structures and canoes for example, this enabled them to exploit the newly developed farmland. Neolithic peoples in the Levant, Anatolia, Syria, northern Mesopotamia and Central Asia were also accomplished builders, utilizing mud-brick to construct houses and villages. At Çatalhöyük, houses were plastered and painted with elaborate scenes of humans and animals. In Europe, long houses built from wattle and daub were constructed. Elaborate tombs were built for the dead. These tombs are particularly numerous in Ireland, where there are many thousand still in existence. Neolithic people in the British Isles built long barrows and chamber tombs for their dead and causewayed camps, henges, flint mines and cursus monuments. It was also important to figure out ways of preserving food for future months, such as fashioning relatively airtight containers, and using substances like salt as preservatives. The peoples of the Americas and the Pacific mostly retained the Neolithic level of tool technology until the time of European contact. Exceptions include copper hatchets and spearheads in the Great Lakes region. Most clothing appears to have been made of animal skins, as indicated by finds of large numbers of bone and antler pins that are ideal for fastening leather. Wool cloth and linen might have become available during the later Neolithic, as suggested by finds of perforated stones that (depending on size) may have served as spindle whorls or loom weights. List of early settlements Paleolithic (3.3 Mya - 12 ka) Epipalaeolithic Mesolithic (20 ka - 5 ka) Neolithic (12 ka - 4 ka) Neolithic human settlements include: The world's oldest known engineered roadway, the Post Track in England, dates from 3838 BC and the world's oldest freestanding structure is the Neolithic temple of Ġgantija in Gozo, Malta. List of cultures and sites Note: Dates are very approximate, and are only given for a rough estimate; consult each culture for specific time periods. Early Neolithic Periodization: The Levant: 9500–8000 BC; Europe: 7000–4000 BC; Elsewhere: varies greatly, depending on region. Middle Neolithic Periodization: The Levant: 8000–6500 BC; Europe: 5500–3500 BC; Elsewhere: varies greatly, depending on region. Later Neolithic Periodization: 6500–4500 BC; Europe: 5000–3000 BC; Elsewhere: varies greatly, depending on region. 
Chalcolithic Periodization: Near East: 6000–3500 BC; Europe: 5000–2000 BC; Elsewhere: varies greatly, depending on region. In the Americas, the Chalcolithic ended as late as the 19th century AD for some peoples.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Rounding] | [TOKENS: 7584]
Contents Rounding Rounding or rounding off is the process of adjusting a number to an approximate, more convenient value, often with a shorter or simpler representation. For example, replacing $23.4476 with $23.45, the fraction 312/937 with 1/3, or the expression √2 with 1.414. Rounding is often done to obtain a value that is easier to report and communicate than the original. Rounding can also be important to avoid misleadingly precise reporting of a computed number, measurement, or estimate; for example, a quantity that was computed as 123456 but is known to be accurate only to within a few hundred units is usually better stated as "about 123500". On the other hand, rounding of exact numbers will introduce some round-off error in the reported result. Rounding is almost unavoidable when reporting many computations – especially when dividing two numbers in integer or fixed-point arithmetic; when computing mathematical functions such as square roots, logarithms, and sines; or when using a floating-point representation with a fixed number of significant digits. In a sequence of calculations, these rounding errors generally accumulate, and in certain ill-conditioned cases they may make the result meaningless. Accurate rounding of transcendental mathematical functions is difficult because the number of extra digits that need to be calculated to resolve whether to round up or down cannot be known in advance. This problem is known as "the table-maker's dilemma". Rounding has many similarities to the quantization that occurs when physical quantities must be encoded by numbers or digital signals. A wavy equals sign (≈) is sometimes used to indicate rounding of exact numbers, e.g. 9.98 ≈ 10. This sign was introduced by Alfred George Greenhill in 1892. Ideal characteristics of rounding methods include: Because it is not usually possible for a method to satisfy all ideal characteristics, many different rounding methods exist. As a general rule, rounding is idempotent; i.e., once a number has been rounded, rounding it again to the same precision will not change its value. Rounding functions are also monotonic; i.e., rounding two numbers to the same absolute precision will not exchange their order (but may give the same value). In the general case of a discrete range, they are piecewise constant functions. Types of rounding Typical rounding problems include: Rounding to integer The most basic form of rounding is to replace an arbitrary number by an integer. All the following rounding modes are concrete implementations of an abstract single-argument "round()" procedure. These are true functions (with the exception of those that use randomness). These four methods are called directed rounding to an integer, as the displacements from the original number x to the rounded value y are all directed toward or away from the same limiting value (0, +∞, or −∞). Directed rounding is used in interval arithmetic and is often required in financial calculations. If x is positive, round-down is the same as round-toward-zero, and round-up is the same as round-away-from-zero. If x is negative, round-down is the same as round-away-from-zero, and round-up is the same as round-toward-zero. In any case, if x is an integer, y is just x. Where many calculations are done in sequence, the choice of rounding method can have a very significant effect on the result. A famous instance involved a new index set up by the Vancouver Stock Exchange in 1982. 
It was initially set at 1000.000 (three decimal places of accuracy), and after 22 months had fallen to about 520, although the market appeared to be rising. The problem was caused by the index being recalculated thousands of times daily, and always being truncated (rounded down) to 3 decimal places, in such a way that the rounding errors accumulated. Recalculating the index for the same period using rounding to the nearest thousandth rather than truncation corrected the index value from 524.811 up to 1098.892. For the examples below, sgn(x) refers to the sign function applied to the original number, x. One may round down (or take the floor, or round toward negative infinity): y is the largest integer that does not exceed x. For example, 23.7 gets rounded to 23, and −23.2 gets rounded to −24. One may also round up (or take the ceiling, or round toward positive infinity): y is the smallest integer that is not less than x. For example, 23.2 gets rounded to 24, and −23.7 gets rounded to −23. One may also round toward zero (or truncate, or round away from infinity): y is the integer that is closest to x such that it is between 0 and x (included); i.e. y is the integer part of x, without its fraction digits. For example, 23.7 gets rounded to 23, and −23.7 gets rounded to −23. One may also round away from zero (or round toward infinity): y is the integer that is closest to 0 (or equivalently, to x) such that x is between 0 and y (included). For example, 23.2 gets rounded to 24, and −23.2 gets rounded to −24. These six methods are called rounding to the nearest integer. Rounding a number x to the nearest integer requires some tie-breaking rule for those cases when x is exactly half-way between two integers – that is, when the fraction part of x is exactly 0.5. If it were not for the 0.5 fractional parts, the round-off errors introduced by the round to nearest method would be symmetric: for every fraction that gets rounded down (such as 0.268), there is a complementary fraction (namely, 0.732) that gets rounded up by the same amount. When rounding a large set of fixed-point numbers with uniformly distributed fractional parts, the rounding errors by all values, with the omission of those having 0.5 fractional part, would statistically compensate each other. This means that the expected (average) value of the rounded numbers is equal to the expected value of the original numbers when numbers with fractional part 0.5 from the set are removed. In practice, floating-point numbers are typically used, which have even more computational nuances because they are not equally spaced. One may round half up (or round half toward positive infinity), a tie-breaking rule that is widely used in many disciplines.[citation needed] That is, half-way values of x are always rounded up. If the fractional part of x is exactly 0.5, then y = x + 0.5 For example, 23.5 gets rounded to 24, and −23.5 gets rounded to −23. Some programming languages (such as Java and Python) use "half up" to refer to round half away from zero rather than round half toward positive infinity. This method only requires checking one digit to determine rounding direction in two's complement and similar representations. One may also round half down (or round half toward negative infinity) as opposed to the more common round half up. If the fractional part of x is exactly 0.5, then y = x − 0.5 For example, 23.5 gets rounded to 23, and −23.5 gets rounded to −24. 
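To make the modes above concrete, the four directed roundings and the half-up/half-down tie-breaking rules can be expressed with floor and ceiling. The following Python sketch uses illustrative function names (not a standard library API) and assumes ordinary binary floats, so adding 0.5 is only exact when the fractional part is itself exactly representable:

```python
import math

def round_down(x):            # toward negative infinity: 23.7 -> 23, -23.2 -> -24
    return math.floor(x)

def round_up(x):              # toward positive infinity: 23.2 -> 24, -23.7 -> -23
    return math.ceil(x)

def round_toward_zero(x):     # truncation: 23.7 -> 23, -23.7 -> -23
    return math.trunc(x)

def round_away_from_zero(x):  # 23.2 -> 24, -23.2 -> -24
    return math.ceil(x) if x > 0 else math.floor(x)

def round_half_up(x):         # ties toward +infinity: 23.5 -> 24, -23.5 -> -23
    return math.floor(x + 0.5)

def round_half_down(x):       # ties toward -infinity: 23.5 -> 23, -23.5 -> -24
    return math.ceil(x - 0.5)
```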
Some programming languages (such as Java and Python) use "half down" to refer to round half toward zero rather than round half toward negative infinity. One may also round half toward zero (or round half away from infinity) as opposed to the conventional round half away from zero. If the fractional part of x is exactly 0.5, then y = x − 0.5 if x is positive, and y = x + 0.5 if x is negative. For example, 23.5 gets rounded to 23, and −23.5 gets rounded to −23. This method treats positive and negative values symmetrically, and therefore is free of overall positive/negative bias if the original numbers are positive or negative with equal probability. It does, however, still have bias toward zero. One may also round half away from zero (or round half toward infinity), a tie-breaking rule that is commonly taught and used, namely: If the fractional part of x is exactly 0.5, then y = x + 0.5 if x is positive, and y = x − 0.5 if x is negative. For example, 23.5 gets rounded to 24, and −23.5 gets rounded to −24. This can be more efficient on computers that use sign-magnitude representation for the values to be rounded, because only the first omitted digit needs to be considered to determine if it rounds up or down. This is one method used when rounding to significant figures due to its simplicity. This method, also known as commercial rounding,[citation needed] treats positive and negative values symmetrically, and therefore is free of overall positive/negative bias if the original numbers are positive or negative with equal probability. It does, however, still have bias away from zero. It is often used for currency conversions and price roundings (when the amount is first converted into the smallest significant subdivision of the currency, such as cents of a euro) as it is easy to explain by just considering the first fractional digit, independently of supplementary precision digits or sign of the amount (for strict equivalence between the paying and recipient of the amount). One may also round half to even, a tie-breaking rule without positive/negative bias and without bias toward/away from zero. By this convention, if the fractional part of x is 0.5, then y is the even integer nearest to x. Thus, for example, 23.5 becomes 24, as does 24.5; however, −23.5 becomes −24, as does −24.5. This function minimizes the expected error when summing over rounded figures, regardless of the inputs being mostly positive or mostly negative, provided they are neither mostly even nor mostly odd. This variant of the round-to-nearest method is also called convergent rounding, statistician's rounding, Dutch rounding, Gaussian rounding, odd–even rounding, or bankers' rounding. This is the default rounding mode used in IEEE 754 operations for results in binary floating-point formats. By eliminating bias, repeated addition or subtraction of independent numbers, as in a one-dimensional random walk, will give a rounded result with an error that tends to grow in proportion to the square root of the number of operations rather than linearly. However, this rule distorts the distribution by increasing the probability of evens relative to odds. That is why this rule is for situations where sums are more important than distribution.[clarification needed] One may also round half to odd, a similar tie-breaking rule to round half to even. In this approach, if the fractional part of x is 0.5, then y is the odd integer nearest to x. Thus, for example, 23.5 becomes 23, as does 22.5; while −23.5 becomes −23, as does −22.5. 
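The remaining tie-breaking rules differ only in what they do when the fractional part is exactly 0.5. A minimal sketch, again with illustrative names and assuming the .5 fraction is represented exactly (Python's built-in round() already behaves like the half-to-even variant for floats):

```python
import math

def round_half_away_from_zero(x):   # 23.5 -> 24, -23.5 -> -24 ("commercial rounding")
    return math.floor(x + 0.5) if x >= 0 else math.ceil(x - 0.5)

def round_half_toward_zero(x):      # 23.5 -> 23, -23.5 -> -23
    return math.ceil(x - 0.5) if x >= 0 else math.floor(x + 0.5)

def round_half_to_even(x):          # 23.5 -> 24, 24.5 -> 24; same result as built-in round(x)
    f = math.floor(x)
    if x - f == 0.5:
        return f if f % 2 == 0 else f + 1
    return math.floor(x + 0.5)

def round_half_to_odd(x):           # 23.5 -> 23, 22.5 -> 23
    f = math.floor(x)
    if x - f == 0.5:
        return f if f % 2 == 1 else f + 1
    return math.floor(x + 0.5)
```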
This method is also free from positive/negative bias and bias toward/away from zero, provided the numbers to be rounded are neither mostly even nor mostly odd. It also shares the round half to even property of distorting the original distribution, as it increases the probability of odds relative to evens. It was the method used for bank balances in the United Kingdom when it decimalized its currency[clarification needed]. This variant is almost never used in computations, except in situations where one wants to avoid increasing the scale of floating-point numbers, which have a limited exponent range. With round half to even, a non-infinite number would round to infinity, and a small denormal value would round to a normal non-zero value. Effectively, this mode prefers preserving the existing scale of tie numbers, avoiding out-of-range results when possible for numeral systems of even radix (such as binary and decimal).[clarification needed (see talk)]. One method, more obscure than most, is to alternate direction when rounding a number with 0.5 fractional part. All others are rounded to the closest integer. Whenever the fractional part is 0.5, alternate rounding up or down: for the first occurrence of a 0.5 fractional part, round up, for the second occurrence, round down, and so on. Alternatively, the first 0.5 fractional part rounding can be determined by a random seed. "Up" and "down" can be any two rounding methods that oppose each other - toward and away from positive infinity or toward and away from zero. If occurrences of 0.5 fractional parts occur significantly more than a restart of the occurrence "counting", then it is effectively bias free. With guaranteed zero bias, it is useful if the numbers are to be summed or averaged. If the fractional part of x is 0.5, choose y randomly between x + 0.5 and x − 0.5, with equal probability. All others are rounded to the closest integer. Like round-half-to-even and round-half-to-odd, this rule is essentially free of overall bias, but it is also fair among even and odd y values. An advantage over alternate tie-breaking is that the last direction of rounding on the 0.5 fractional part does not have to be "remembered". Rounding as follows to one of the closest integer toward negative infinity and the closest integer toward positive infinity, with a probability dependent on the proximity is called stochastic rounding and will give an unbiased result on average. For example, 1.6 would be rounded to 1 with probability 0.4 and to 2 with probability 0.6. Stochastic rounding can be accurate in a way that a rounding function can never be. For example, suppose one started with 0 and added 0.3 to that one hundred times while rounding the running total between every addition. The result would be 0 with regular rounding, but with stochastic rounding, the expected result would be 30, which is the same value obtained without rounding. This can be useful in machine learning where the training may use low precision arithmetic iteratively. Stochastic rounding is also a way to achieve 1-dimensional dithering. Rounding to other values The most common type of rounding is to round to an integer; or, more generally, to an integer multiple of some increment – such as rounding to whole tenths of seconds, hundredths of a dollar, to whole multiples of 1/2 or 1/8 inch, to whole dozens or thousands, etc. 
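Before moving on to other rounding targets, the random-tie and stochastic rules described above can be sketched as follows; stochastic_round and round_half_random are illustrative names, and the final loop mirrors the running-total thought experiment in the text:

```python
import math
import random

def round_half_random(x):
    """Break exact .5 ties with a fair coin flip; all other values go to the nearest integer."""
    f = math.floor(x)
    if x - f == 0.5:
        return f + random.choice((0, 1))
    return math.floor(x + 0.5)

def stochastic_round(x):
    """Round down with probability 1 - frac and up with probability frac,
    so 1.6 becomes 2 with probability 0.6 and 1 with probability 0.4."""
    f = math.floor(x)
    return f + (1 if random.random() < x - f else 0)

# Adding 0.3 one hundred times while rounding the running total each time:
# ordinary round-to-nearest would leave the total at 0, but stochastic
# rounding is unbiased, so the expected total is 30.
total = 0
for _ in range(100):
    total = stochastic_round(total + 0.3)
print(total)   # close to 30 on average over many runs
```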
In general, rounding a number x to a multiple of some specified positive value m entails the following steps: For example, rounding x = 2.1784 dollars to whole cents (i.e., to a multiple of 0.01) entails computing 2.1784 / 0.01 = 217.84, then rounding that to 218, and finally computing 218 × 0.01 = 2.18. When rounding to a predetermined number of significant digits, the increment m depends on the magnitude of the number to be rounded (or of the rounded result). The increment m is normally a finite fraction in whatever numeral system is used to represent the numbers. For display to humans, that usually means the decimal numeral system (that is, m is an integer times a power of 10, like 1/1000 or 25/100). For intermediate values stored in digital computers, it often means the binary numeral system (m is an integer times a power of 2). The abstract single-argument "round()" function that returns an integer from an arbitrary real value has at least a dozen distinct concrete definitions presented in the rounding to integer section. The abstract two-argument "roundToMultiple()" function is formally defined here, but in many cases it is used with the implicit value m = 1 for the increment and then reduces to the equivalent abstract single-argument function, with also the same dozen distinct concrete definitions. Rounding to a specified power is very different from rounding to a specified multiple; for example, it is common in computing to need to round a number to a whole power of 2. The steps, in general, to round a positive number x to a power of some positive number b other than 1, are: Many of the caveats applicable to rounding to a multiple are applicable to rounding to a power. In the chromatic "twelve-tone" scale of music, 3⁄2 is rounded to 2^(7/12) (a fifth), 4⁄3 is rounded to 2^(5/12) (a fourth), 5⁄4 is rounded to 2^(4/12) (a major third), 6⁄5 is rounded to 2^(3/12) (a minor third), and 9⁄8 is rounded to 2^(2/12) (a diminished third). This type of rounding, which is also named rounding to a logarithmic scale, is a variant of rounding to a specified power. Rounding on a logarithmic scale is accomplished by taking the log of the amount and doing normal rounding to the nearest value on the log scale. For example, resistors are supplied with preferred numbers on a logarithmic scale. In particular, for resistors with a 10% accuracy, they are supplied with nominal values 100, 120, 150, 180, 220, etc. rounded to multiples of 10 (E12 series). If a calculation indicates a resistor of 165 ohms is required then log(150) = 2.176, log(165) = 2.217 and log(180) = 2.255. The logarithm of 165 is closer to the logarithm of 180 therefore a 180 ohm resistor would be the first choice if there are no other considerations. Whether a value x ∈ (a, b) rounds to a or b depends upon whether the squared value x² is greater than or less than the product ab. The value 165 rounds to 180 in the resistors example because 165² = 27225 is greater than 150 × 180 = 27000. In floating-point arithmetic, rounding aims to turn a given value x into a value y with a specified number of significant digits. In other words, y should be a multiple of a number m that depends on the magnitude of x. The number m is a power of the base (usually 2 or 10) of the floating-point representation. Apart from this detail, all the variants of rounding discussed above apply to the rounding of floating-point numbers as well. 
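As a sketch of the procedures just described (rounding to a multiple of m, to a number of significant digits, and to a preferred value on a logarithmic scale), the following uses illustrative function names and lets ties follow Python's built-in round():

```python
import math

def round_to_multiple(x, m):
    """Scale by 1/m, round to an integer, scale back: 2.1784 -> 2.18 for m = 0.01."""
    return round(x / m) * m

def round_significant(x, digits):
    """Choose the increment m as a power of 10 based on the magnitude of x."""
    if x == 0:
        return 0
    m = 10 ** (math.floor(math.log10(abs(x))) - digits + 1)
    return round_to_multiple(x, m)

def nearest_on_log_scale(x, preferred):
    """Pick the preferred value whose logarithm is closest to log(x)."""
    return min(preferred, key=lambda p: abs(math.log(x) - math.log(p)))

print(round_to_multiple(2.1784, 0.01))                        # 2.18 (up to float noise)
print(round_significant(123456, 4))                           # 123500, as in the introduction
print(nearest_on_log_scale(165, [100, 120, 150, 180, 220]))   # 180, the E12 choice
```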
The algorithm for such rounding is presented in the Scaled rounding section above, but with a constant scaling factor s = 1, and an integer base b > 1. Where the rounded result would overflow the result for a directed rounding is either the appropriate signed infinity when "rounding away from zero", or the highest representable positive finite number (or the lowest representable negative finite number if x is negative), when "rounding toward zero". The result of an overflow for the usual case of round to nearest is always the appropriate infinity. In some contexts it is desirable to round a given number x to a "neat" fraction – that is, the nearest fraction y = m/n whose numerator m and denominator n do not exceed a given maximum. This problem is fairly distinct from that of rounding a value to a fixed number of decimal or binary digits, or to a multiple of a given unit m. This problem is related to Farey sequences, the Stern–Brocot tree, and continued fractions. Finished lumber, writing paper, electronic components, and many other products are usually sold in only a few standard values. Many design procedures describe how to calculate an approximate value, and then "round" to some standard size using phrases such as "round down to nearest standard value", "round up to nearest standard value", or "round to nearest standard value". When a set of preferred values is equally spaced on a logarithmic scale, choosing the closest preferred value to any given value can be seen as a form of scaled rounding. Such rounded values can be directly calculated. More general rounding rules can separate values at arbitrary break points, used for example in data binning. A related mathematically formalized tool is signpost sequences, which use notions of distance other than the simple difference – for example, a sequence may round to the integer with the smallest relative (percent) error. Rounding in other contexts When digitizing continuous signals, such as sound waves, the overall effect of a number of measurements is more important than the accuracy of each individual measurement. In these circumstances, dithering, and a related technique, error diffusion, are normally used. A related technique called pulse-width modulation is used to achieve analog type output from an inertial device by rapidly pulsing the power with a variable duty cycle. Delta-sigma modulation is commonly used for converting between real-world signals and digital signals, which allows control of the frequency statistics of quantization. Error diffusion tries to ensure the error, on average, is minimized. When dealing with a gentle slope from one to zero, the output would be zero for the first few terms until the sum of the error and the current value becomes greater than 0.5, in which case a 1 is output and the difference subtracted from the error so far. Floyd–Steinberg dithering is a popular error diffusion procedure when digitizing images. As a one-dimensional example, suppose the numbers 0.9677, 0.9204, 0.7451, and 0.3091 occur in order and each is to be rounded to a multiple of 0.01. In this case the cumulative sums, 0.9677, 1.8881 = 0.9677 + 0.9204, 2.6332 = 0.9677 + 0.9204 + 0.7451, and 2.9423 = 0.9677 + 0.9204 + 0.7451 + 0.3091, are each rounded to a multiple of 0.01: 0.97, 1.89, 2.63, and 2.94. The first of these and the differences of adjacent values give the desired rounded values: 0.97, 0.92 = 1.89 − 0.97, 0.74 = 2.63 − 1.89, and 0.31 = 2.94 − 2.63. 
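The one-dimensional error-diffusion example above can be reproduced in a few lines: each cumulative sum is rounded to a multiple of m and adjacent rounded sums are differenced, so the running error never exceeds half an increment. A sketch (cumulative_round is an illustrative name; the round(..., 12) only strips floating-point noise from the differences):

```python
def cumulative_round(values, m):
    out = []
    running = 0.0        # true running total
    prev = 0.0           # rounded running total emitted so far
    for v in values:
        running += v
        r = round(running / m) * m          # round the cumulative sum to a multiple of m
        out.append(round(r - prev, 12))     # emit the difference of adjacent rounded sums
        prev = r
    return out

print(cumulative_round([0.9677, 0.9204, 0.7451, 0.3091], 0.01))
# [0.97, 0.92, 0.74, 0.31], matching the worked example above
```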
Monte Carlo arithmetic is a technique in Monte Carlo methods where the rounding is randomly up or down. Stochastic rounding can be used for Monte Carlo arithmetic, but in general, just rounding up or down with equal probability is more often used. Repeated runs will give a random distribution of results which can indicate the stability of the computation. It is possible to use rounded arithmetic to evaluate the exact value of a function with integer domain and range. For example, if an integer n is known to be a perfect square, its square root can be computed by converting n to a floating-point value z, computing the approximate square root x of z with floating point, and then rounding x to the nearest integer y. If n is not too big, the floating-point round-off error in x will be less than 0.5, so the rounded value y will be the exact square root of n. This is essentially why slide rules could be used for exact arithmetic. Rounding a number twice in succession to different levels of precision, with the latter precision being coarser, is not guaranteed to give the same result as rounding once to the final precision except in the case of directed rounding.[nb 2] For instance, rounding 9.46 to the nearest tenth gives 9.5, and then 10 when rounding to the nearest integer using rounding half to even, but would give 9 when rounded directly using the same method. Borman and Chatfield discuss the implications of double rounding when comparing data rounded to one decimal place to specification limits expressed using integers. In Martinez v. Allstate and Sendejo v. Farmers, litigated between 1995 and 1997, the insurance companies argued that double rounding premiums was permissible and in fact required. The US courts ruled against the insurance companies and ordered them to adopt rules to ensure single rounding. Some computer languages and the IEEE 754-2008 standard dictate that in straightforward calculations the result should not be rounded twice. This has been a particular problem with Java, as it is designed to be run identically on different machines; special programming tricks have had to be used to achieve this with x87 floating point. The Java language was changed to allow different results where the difference does not matter and require a strictfp qualifier to be used when the results have to conform accurately; strict floating point was restored in Java 17. In some algorithms, an intermediate result is computed in a larger precision, then must be rounded to the final precision. Double rounding can be avoided by choosing an adequate rounding for the intermediate computation. This consists in avoiding to round to midpoints for the final rounding (except when the midpoint is exact). In binary arithmetic, the idea is to round the result toward zero, and set the least significant bit to 1 if the rounded result is inexact; this rounding is called sticky rounding. Equivalently, it consists in returning the intermediate result when it is exactly representable, and the nearest floating-point number with an odd significand otherwise; this is why it is also known as rounding to odd. A concrete implementation of this approach, for binary and decimal arithmetic, is implemented as Rounding to prepare for shorter precision. This rounding mode is used to avoid getting a potentially wrong result after multiple roundings. 
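The 9.46 example above is easy to reproduce with Python's decimal module, which also provides a midpoint-avoiding intermediate mode of the kind just described (named ROUND_05UP there, as noted later in this section). This is a sketch of the idea only, not of any particular hardware implementation:

```python
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_05UP

x = Decimal("9.46")
tenth, unit = Decimal("0.1"), Decimal("1")

# Single rounding straight to an integer: 9.46 -> 9.
direct = x.quantize(unit, rounding=ROUND_HALF_EVEN)

# Double rounding through tenths: 9.46 -> 9.5 -> 10, a different answer.
naive = x.quantize(tenth, rounding=ROUND_HALF_EVEN).quantize(unit, rounding=ROUND_HALF_EVEN)

# Rounding the intermediate result with ROUND_05UP never creates a new midpoint:
# 9.46 -> 9.4, and the final rounding then agrees with the single-step result.
prepared = x.quantize(tenth, rounding=ROUND_05UP).quantize(unit, rounding=ROUND_HALF_EVEN)

print(direct, naive, prepared)   # 9 10 9
```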
This can be achieved if all roundings except the final one are done using rounding to prepare for shorter precision ("RPSP"), and only the final rounding uses the externally requested mode. With decimal arithmetic, final digits of 0 and 5 are avoided when the input is not representable exactly; if there is a choice between numbers with the least significant digit 0 or 1, 4 or 5, 5 or 6, 9 or 0, then the digit different from 0 or 5 shall be selected; otherwise, the choice is arbitrary. IBM defines that, in the latter case, a digit with the smaller magnitude shall be selected. RPSP can be applied with the step between two consecutive roundings as small as a single digit (for example, rounding to 1/10 can be applied after rounding to 1/100). For example, when rounding to integer, as in the example from the "Double rounding" section, rounding 9.46 to one decimal with RPSP gives 9.4, which in turn rounds to 9. With binary arithmetic, this rounding is also called "round to odd" (not to be confused with "round half to odd"), and applies equally when rounding to, say, 1/4 (0.01 in binary). For correct results with binary arithmetic, each rounding step must remove at least 2 binary digits; otherwise, wrong results may appear. For example, if the erroneous middle step is removed, the final rounding to integer rounds 3.25 to the correct value of 3. RPSP is implemented in hardware in IBM zSeries and pSeries. In Python module "Decimal", Tcl module "math", Haskell package "decimal-arithmetic", and possibly others, this mode is called ROUND_05UP or round05up. William M. Kahan coined the term "The Table-Maker's Dilemma" for the unknown cost of rounding transcendental functions: Nobody knows how much it would cost to compute y^w correctly rounded for every two floating-point arguments at which it does not over/underflow. Instead, reputable math libraries compute elementary transcendental functions mostly within slightly more than half an ulp and almost always well within one ulp. Why can't y^w be rounded within half an ulp like SQRT? Because nobody knows how much computation it would cost... No general way exists to predict how many extra digits will have to be carried to compute a transcendental expression and round it correctly to some preassigned number of digits. Even the fact (if true) that a finite number of extra digits will ultimately suffice may be a deep theorem. The IEEE 754 floating-point standard guarantees that add, subtract, multiply, divide, fused multiply–add, square root, and floating-point remainder will give the correctly rounded result of the infinite-precision operation. No such guarantee was given in the 1985 standard for more complex functions and they are typically only accurate to within the last bit at best. However, the 2008 standard guarantees that conforming implementations will give correctly rounded results which respect the active rounding mode; implementation of the functions, however, is optional. Using the Gelfond–Schneider theorem and Lindemann–Weierstrass theorem, many of the standard elementary functions can be proved to return transcendental results, except on some well-known arguments; therefore, from a theoretical point of view, it is always possible to correctly round such functions. However, for an implementation of such a function, determining a limit for a given precision on how accurate results need to be computed, before a correctly rounded result can be guaranteed, may demand a lot of computation time or may be out of reach. 
In practice, when this limit is not known (or only a very large bound is known), some decision has to be made in the implementation (see below); but according to a probabilistic model, correct rounding can be satisfied with a very high probability when using an intermediate accuracy of up to twice the number of digits of the target format plus some small constant (after taking special cases into account). Some programming packages offer correct rounding. The GNU MPFR package gives correctly rounded arbitrary-precision results, and some other libraries implement elementary functions with correct rounding in IEEE 754 double precision (binary64). There exist computable numbers for which a rounded value can never be determined no matter how many digits are calculated. Specific instances cannot be given, but this follows from the undecidability of the halting problem. For instance, if Goldbach's conjecture is true but unprovable, then the result of rounding the following value, n, up to the next integer cannot be determined: either n = 1 + 10^−k, where k is the first even number greater than 4 which is not the sum of two primes, or n = 1 if there is no such number. The rounded result is 2 if such a number k exists and 1 otherwise. The value before rounding can however be approximated to any given precision even if the conjecture is unprovable.

Rounding can adversely affect a string search for a number (see the short sketch below). For example, π rounded to four digits is "3.1416", but a simple search for this string will not discover "3.14159" or any other value of π rounded to more than four digits. In contrast, truncation does not suffer from this problem; for example, a simple string search for "3.1415", which is π truncated to four digits, will discover values of π truncated to more than four digits.

History The concept of rounding is very old, perhaps older than the concept of division itself. Some ancient clay tablets found in Mesopotamia contain tables with rounded values of reciprocals and square roots in base 60. Rounded approximations to π, the length of the year, and the length of the month are also ancient; see the base 60 examples. The round-half-to-even method has served as American Standard Z25.1 and ASTM standard E-29 since 1940. The origins of the terms unbiased rounding and statistician's rounding are fairly self-explanatory. In the 1906 fourth edition of Probability and Theory of Errors, Robert Simpson Woodward called this "the computer's rule", indicating that it was then in common use by human computers who calculated mathematical tables. For example, it was recommended in Simon Newcomb's c. 1882 book Logarithmic and Other Mathematical Tables. Lucius Tuttle's 1916 Theory of Measurements called it a "universally adopted rule" for recording physical measurements. Churchill Eisenhart indicated the practice was already "well established" in data analysis by the 1940s. The origin of the term bankers' rounding remains more obscure. If this rounding method was ever a standard in banking, the evidence has proved extremely difficult to find. To the contrary, section 2 of the European Commission report The Introduction of the Euro and the Rounding of Currency Amounts suggests that there had previously been no standard approach to rounding in banking, and it specifies that "half-way" amounts should be rounded up. Until the 1980s, the rounding method used in floating-point computer arithmetic was usually fixed by the hardware, poorly documented, inconsistent, and different for each brand and model of computer.
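A quick illustration of the string-search effect noted above (an illustrative sketch, not part of the original text):

    import math

    pi_digits = f"{math.pi:.12f}"     # '3.141592653590'
    rounded   = f"{math.pi:.4f}"      # '3.1416'  (pi rounded to four digits)
    truncated = "3.1415"              # pi truncated to four digits

    print(rounded in pi_digits)       # False: the rounded string never occurs
    print(truncated in pi_digits)     # True:  the truncated string is a prefix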
This situation of hardware-dependent rounding changed after the IEEE 754 floating-point standard was adopted by most computer manufacturers. The standard allows the user to choose among several rounding modes, and in each case specifies precisely how the results should be rounded. These features made numerical computations more predictable and machine-independent, and made possible the efficient and consistent implementation of interval arithmetic. Currently, much research tends to round to multiples of 5 or 2. For example, Jörg Baten used age heaping in many studies to evaluate the numeracy level of ancient populations. He came up with the ABCC Index, which makes it possible to compare numeracy across regions even without historical sources in which population literacy was measured.

Rounding functions in programming languages Most programming languages provide functions or special syntax to round fractional numbers in various ways. The earliest numeric languages, such as Fortran and C, would provide only one method, usually truncation (toward zero). This default method could be implied in certain contexts, such as when assigning a fractional number to an integer variable, or using a fractional number as an index of an array. Other kinds of rounding had to be programmed explicitly; for example, rounding a positive number to the nearest integer could be implemented by adding 0.5 and truncating. In recent decades, however, the syntax and the standard libraries of most languages have commonly provided at least the four basic rounding functions (up, down, to nearest, and toward zero); a short sketch of these appears further below. The tie-breaking method can vary depending on the language and version or might be selectable by the programmer. Several languages follow the lead of the IEEE 754 floating-point standard, and define these functions as taking a double-precision float argument and returning the result of the same type, which may then be converted to an integer if necessary. This approach may avoid spurious overflows because floating-point types have a larger range than integer types. Some languages, such as PHP, provide functions that round a value to a specified number of decimal digits (e.g., from 4321.5678 to 4321.57 or 4300). In addition, many languages provide a printf or similar string formatting function, which allows one to convert a fractional number to a string, rounded to a user-specified number of decimal places (the precision). On the other hand, truncation (round to zero) is still the default rounding method used by many languages, especially for the division of two integer values. In contrast, CSS and SVG do not define any specific maximum precision for numbers and measurements; they treat and expose them in their DOM and IDL interfaces as strings, as if they had infinite precision, and do not discriminate between integers and floating-point values. However, the implementations of these languages will typically convert these numbers into IEEE 754 double-precision floating-point values before exposing the computed digits with a limited precision (notably within standard JavaScript or ECMAScript interface bindings).

Other rounding standards Some disciplines or institutions have issued standards or directives for rounding. In a guideline issued in mid-1966, the U.S. Office of the Federal Coordinator for Meteorology determined that weather data should be rounded to the nearest round number, with the "round half up" tie-breaking rule. For example, 1.5 rounded to integer should become 2, and −1.5 should become −1.
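A minimal sketch of the two tie-breaking rules mentioned here, in plain Python (the helper names are invented for illustration; a careful implementation would use decimal arithmetic, since adding 0.5 in binary floating point can itself misround borderline values):

    import math

    def round_half_up(x):
        # Ties go toward +infinity: 1.5 -> 2, -1.5 -> -1 (the 1966 guideline).
        return math.floor(x + 0.5)

    def round_half_away_from_zero(x):
        # Ties go away from zero: 1.5 -> 2, -1.5 -> -2 (the earlier rule).
        return int(math.copysign(math.floor(abs(x) + 0.5), x))

    print(round_half_up(1.5), round_half_up(-1.5))                          # 2 -1
    print(round_half_away_from_zero(1.5), round_half_away_from_zero(-1.5))  # 2 -2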
Prior to that 1966 guideline, the tie-breaking rule was "round half away from zero". Some meteorologists may write "−0" to indicate a temperature between 0.0 and −0.5 degrees (exclusive) that was rounded to an integer. This notation is used when the negative sign is considered important, no matter how small the magnitude is; for example, when rounding temperatures in the Celsius scale, where below zero indicates freezing.[citation needed]
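To close the topic, a short recap in Python of the basic language-level rounding functions discussed above (illustrative only; the values are arbitrary):

    import math

    x = -7.6
    print(math.floor(x))         # -8   round down (toward -infinity)
    print(math.ceil(x))          # -7   round up (toward +infinity)
    print(math.trunc(x))         # -7   round toward zero, the old default of int()
    print(round(x))              # -8   round to nearest; Python breaks ties to even
    print(round(2.5))            #  2   tie broken to the even neighbour

    print(round(4321.5678, 2))   # 4321.57  rounding to a number of decimal digits
    print(round(4321.5678, -2))  # 4300.0   negative counts round to tens, hundreds, ...
    print(f"{4321.5678:.2f}")    # 4321.57  printf-style formatting to a fixed precision

    # The old "add 0.5 and truncate" trick for positive values:
    print(int(3.7 + 0.5))        # 4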
========================================
[SOURCE: https://en.wikipedia.org/wiki/Advance_and_secure] | [TOKENS: 484]
Advance and secure Advance and secure, often just AAS, is a player-versus-player, team-based game mode used in a few hardcore, tactical and milsim first-person shooter games. Players on opposing teams try to capture certain predefined points of interest in the game environment in a preset order. Basic concept Players on both teams have to capture a number of bases or zones. The winning team is the one that has the most bases under control when the time or tickets run out. History Advance and secure was first introduced in the game Joint Operations: Typhoon Rising. While advance and secure is somewhat similar to the Battlefield series' Conquest game mode, it naturally promotes more teamwork and strategy than Battlefield's more open-ended Conquest mode. The original Joint Operations: Typhoon Rising AAS mode also shares some similarities with the Team Fortress Classic Control Point game mode. Structure in different games The ArmA community has modded a Joint Operations-style AAS game mode since ArmA I. The AAS game mode is the same in Joint Operations: Typhoon Rising and its expansion, Joint Operations: Escalation. It is based on capture zones whose sizes vary individually, from only a few square meters up to 0.95 square kilometers, or from meters to hundreds of meters in diameter, depending on the map designer's decisions. In the middle of each zone is a spawn point, where players can spawn if the zone is controlled by their own team, in other words fully captured (from Joint Operations: Typhoon Rising / Escalation). Capture zones can overlap and form sectors where one player can affect several zones simultaneously, making such sectors strategically valuable; this overlapping is seen in some stock maps. The community-made Project Reality modification for Battlefield 2 has its own modified version of the advance and secure game mode. It uses the same basic idea of a predefined capturing order, but instead of capturing bases, a player captures zones, since the zones are normally large, from 100 to 300 meters in radius. This forces a player to control an area before capturing the flag instead of just controlling a single building. One difference is that some of the zones can be randomly chosen at the start of the round, which adds replay value to maps. A major difference from the Joint Operations AAS mode is that a zone does not have a fixed spawn point in its middle; instead, the spawn points are player-deployable assets.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Word_game] | [TOKENS: 498]
Word game Word games are spoken, board, card or video games often designed to test ability with language or to explore its properties. Word games are generally used as a source of entertainment, but can additionally serve an educational purpose. Young children may enjoy playing games such as Mad Libs Junior while developing spelling and writing skills. Researchers have found that adults who regularly solved crossword puzzles, which require familiarity with a larger vocabulary, had better brain function later in life. Popular word-based game shows have been a part of television and radio throughout broadcast history, including Spelling Bee, the first televised game show, and Wheel of Fortune, the longest-running syndicated game show in the United States. Categories In a letter arrangement game, the goal is to form words out of given letters. These games generally test vocabulary skills as well as lateral thinking skills. Some examples of letter arrangement games include Scrabble, Upwords, Bananagrams, and Countdown. In a paper and pencil game, players write their own words, often under specific constraints. For example, a crossword requires players to use clues to fill out a grid, with words intersecting at specific letters. Other examples of paper and pencil games include hangman, categories, Boggle, and word searches. Semantic games focus on the semantics of words, utilising their meanings and the shared knowledge of players as a mechanic. Connections, Mad Libs, Blankety Blank, and Codenames are all semantic games. Other games involve creating words that meet specific conditions, such as Wordle and Word Ladder. Modern word games As part of the modern "Golden Age" of board games, designers have created a variety of newer, non-traditional word games, often with more complex rules. Games like Codenames, Decrypto, and Anomia were all designed after 2010, and have earned widespread acclaim. Mobile games like Letterpress, Words with Friends, and Word Connect have also brought word games to modern audiences. In media Many popular word games have been adapted to television and radio game shows. In addition to the examples given above, shows like Lingo, Says You!, Catchphrase, and Only Connect either revolve around or include elements of word games. On NPR, the Sunday Puzzle is hosted by Will Shortz, The New York Times crossword editor. Word games have also been launched on the Internet and featured in major publications, such as The New York Times Spelling Bee, Connections, and Wordle.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet_pornography] | [TOKENS: 4274]
Internet pornography Internet pornography or online pornography is any pornography that is accessible over the Internet, primarily via websites, FTP connections, peer-to-peer file sharing, or Usenet newsgroups. The greater accessibility of the World Wide Web from the late 1990s led to an incremental growth of Internet pornography, the use of which among adolescents and adults has since become increasingly popular. Danni's Hard Drive, started in 1995 by Danni Ashe, is considered one of the earliest online pornography websites. In 2020, estimates suggested there were nearly 30 million pornography websites, comprising about 12% of all websites on the Internet. In 2022, the total amount of pornographic content accessible online was estimated to be over 10,000 terabytes. The four most accessed pornography websites are Pornhub, xHamster, XVideos, and XNXX. As of 2025, a single company, Aylo, owns and operates most of the popular online streaming pornography websites, including Pornhub, RedTube, Tube8, and YouPorn, as well as pornographic film studios like Brazzers, Digital Playground, Reality Kings, and Sean Cody, among others, but it does not own websites like xHamster, XVideos, and XNXX. Some have alleged that the company is a monopoly. Introduction Starting in the late 1980s, the Internet has played a major part in increasing access to pornography. Usenet newsgroups provided the base for what has been called the "amateur revolution", in which amateur pornographers, with the help of digital cameras and the Internet, created and distributed their own pornographic content independent of the mainstream networks. The use of the World Wide Web became popular with the introduction of Netscape Navigator in 1994. This development paved the way for newer methods of distribution and consumption of pornography. The Internet as a medium to access pornography became so popular that in 1995 Time published a cover story titled "Cyberporn". Danni's Hard Drive, started in 1995 by Danni Ashe, is considered one of the earliest online pornographic websites; coded by Ashe, a former stripper and nude model, the website was reported by CNN in 2000 to have made revenues of $6.5 million. In 2012, the total number of pornographic websites was estimated to be around 25 million, comprising 12% of all websites. In 2022, the amount of pornographic content accessible online was estimated at over 10,000 terabytes.[a] In 2024, according to the DSA regulation, 59 out of 100 Spaniards visit one of the three biggest websites monthly. Before its shutdown in 2025, ThisAV was a popular pornographic website in Hong Kong. History and methods of distribution Pornography is regarded by some as one of the driving forces behind the expansion of the World Wide Web, like camcorders, VCRs and cable television before it. Prior to the development of the World Wide Web, pornographic images had been transmitted over the Internet as ASCII porn. Sending images over a network required computers with graphics capabilities and higher network bandwidth. In the late 1980s and early 1990s this was possible through the use of anonymous FTP servers and the Gopher protocol, an early content delivery protocol that was later displaced by HTTP. One of the early Gopher/FTP sites to compile pornography was the Digital Archive on the 17th Floor at TU Delft. This small image archive contained some low quality scanned pornographic images that were initially available to anyone anonymously.
The site soon became restricted to Netherlands-only access after traffic grew to over 10,000 users around the world, who were obtaining approximately 30,000 images a day. Usenet newsgroups provided an early way of sharing images over the narrow bandwidth available in the early 1990s. Because of the network restrictions of the time, images had to be encoded as ASCII text and then broken into sections before being posted to the alt.binaries newsgroups of Usenet. These files could then be downloaded, reassembled, and decoded back to an image. Automated software such as Aub (Assemble Usenet Binaries) allowed the automatic download and assembly of the images from a newsgroup. There was rapid growth in the number of posts in the early 1990s, but image quality was restricted by the size of files that could be posted. This method was also used to disseminate pornographic images, which were usually scanned from adult magazines. This type of distribution was generally free (apart from fees for Internet access), and provided a great deal of anonymity. The anonymity made it safe and easy to ignore copyright restrictions, as well as protecting the identity of uploaders and downloaders. Around this time frame, pornography was also distributed via pornographic Bulletin Board Systems such as Rusty n Edie's. These BBSes could charge users for access, leading to the first commercial online pornography. A 1995 article in The Georgetown Law Journal titled "Marketing Pornography on the Information Superhighway: A Survey of 917,410 Images, Description, Short Stories and Animations Downloaded 8.5 Million Times by Consumers in Over 2000 Cities in Forty Countries, Provinces and Territories", by Martin Rimm, a Carnegie Mellon University graduate student, claimed that (as of 1994) 83.5% of the images on Usenet newsgroups where images were stored were pornographic in nature. Before publication, Philip Elmer-DeWitt used the research in a Time magazine article, "On a Screen Near You: Cyberporn". The findings were attacked by journalists and civil liberties advocates who insisted the findings were seriously flawed. "Rimm's implication that he might be able to determine 'the percentage of all images available on the Usenet that are pornographic on any given day' was sheer fantasy", wrote Mike Godwin in HotWired. The research was cited during a session of the U.S. Congress. The student changed his name and disappeared from public view. Godwin recounts the episode in "Fighting a Cyberporn Panic" in his book Cyber Rights: Defending Free Speech in the Digital Age. The invention of the World Wide Web spurred both commercial and non-commercial distribution of pornography. The rise of pornography websites offering photos, video clips and streaming media, including live webcam access, allowed greater access to pornography. Both commercial and free pornographic sites are common on the Internet. The bandwidth usage of a pornographic website is relatively high, which can lead to large web hosting and Internet costs. Free websites, which often use advertising revenue to earn income, may not earn enough to cover the costs of web hosting. One entry into the free pornographic website market is thumbnail gallery post sites. These are free websites that post links to commercial sites, providing a sampling of the commercial site in the form of thumbnail images, or in the form of Free Hosted Galleries—samplings of full-sized content provided and hosted by the commercial sites to promote their site.
Some free websites primarily serve as portals by keeping up-to-date indexes of these smaller sampler sites. When a user purchases a subscription to a commercial site after clicking through from a free thumbnail gallery site, the commercial site makes a payment to the owner of the free site. There are several forms of sites delivering adult content. A common form of adult content is a categorized list (more often a table) of small pictures (called "thumbnails") linked to galleries. These sites are called thumbnail gallery posts (TGP). As a rule, these sites sort thumbnails by category and type of content available on a linked gallery. Sites containing thumbnails that lead to galleries with video content are called MGPs (movie gallery posts). The main benefit of a TGP/MGP is that the surfer can get a first impression of the content provided by a gallery without actually visiting it. However, TGP sites are open to abuse, the most abusive form being the so-called CJ (abbreviation for circlejerk), which contains links that mislead the surfer to sites he or she did not actually wish to see. This is also called a redirect. Linklists, unlike TGP/MGP sites, do not display a huge number of pictures. A linklist is a (frequently categorised) web list of links to so-called "freesites", but unlike TGPs, links are provided in the form of text, not thumbnails. It is still an open question which form is more useful to a surfer, but many webmasters cite a trend that thumbnails are much more productive and simplify searching. On the other hand, linklists have a larger amount of unique text, which helps them improve their positions in search engine listings. TopLists are linklists whose internal ranking of freesites is based on incoming traffic from those freesites, except that freesites designed for TopLists have many more galleries. Peer-to-peer file sharing networks provide another form of free access to pornography. While such networks have been associated largely with the illegal sharing of copyrighted music and movies, the sharing of pornography has also been a popular use for file sharing. Many commercial sites have recognized this trend and have begun distributing free samples of their content on peer-to-peer networks. Viewership As of 2011, the majority of viewers of online pornography were men; women tended to prefer romance novels and erotic fan fiction. Women comprised about one quarter to one third of visitors to popular pornography websites, but were only 2% of subscribers to pay sites. Subscribers with female names were flagged as signs of potential credit card fraud, because "so many of these charges result in an angry wife or mother demanding a refund for the misuse of her card." Nonetheless, women spent more time on average on pornography websites, particularly Pornhub, than men, and were more interested in pornography upon marriage. The anti-pornography research groups Barna Group and Covenant Eyes reported in 2020 that "33% of women aged 25 and under search for porn at least once per month". A 2015 study found "a big jump" in pornography viewing over the past few decades, with the largest increase driven by people born in the 1970s and 1980s. While the study's authors noted this increase is "smaller than conventional wisdom might predict", it is still quite significant.
Those born from the 1980s onward were the first to grow up in a world where they had access to the Internet from their teenage years; this early exposure and accessibility of Internet pornography may have been the primary driver of the increase. States that are highly religious and conservative were found to search for more Internet pornography. Internet pornography formats Pornographic images may be scanned into the computer from photographs or magazines, produced with a digital camera, or taken from a frame of a video before being uploaded onto a pornographic website. The JPEG format is one of the most common formats for these images. Another format is GIF, which may provide an animated image in which the people in the picture move. An animated GIF can last from only a second or two up to a few minutes and then repeats indefinitely; if the position of the objects in the last frame is about the same as in the first frame, there is the illusion of continuous action. Pornographic video clips may be distributed in a number of formats, including MPEG, WMV, and QuickTime. More recently, VCD and DVD image files allow the distribution of whole VCDs and DVDs. Many commercial porn sites exist that allow one to view pornographic streaming video. As of 2020, some Internet pornography sites have begun offering 5K resolution content, while 1080p and 4K resolution are still more common. Since mid-2006, advertising-supported free pornographic video sharing websites based on the YouTube format have appeared. Referred to as Porn 2.0, these sites generally use Flash technology to distribute videos that were uploaded by users; these include user-generated content as well as scenes from commercial porn movies and advertising clips from pornographic websites. Another format of adult content that emerged with the advent of the Internet is live webcams. Webcam content can generally be divided into two categories: group shows offered to members of an adult paysite, and one-on-one private sessions usually sold on a pay-per-view basis. Server-based webcam sex shows spur unique international economics: adult models in various countries perform live webcam shows and chat for clients in affluent countries. This kind of activity is sometimes mediated by companies that will set up websites and manage finances. They may maintain "office" space for the models to perform from, or they may provide the interface for models to work at home, with their own computer and webcam. As of 2020, most so-called cam hosts stream directly from their homes, due to the availability of fast Internet and cheap HD webcams. These models earn money through tips or by selling exclusive content to their viewers through live cam sites, which can reach more than 20,000 viewers at once. Live cam sites are very popular, with sites like Chaturbate or LiveJasmin appearing among the 100 most popular websites according to Alexa Internet. Other formats include text and audio files. While pornographic and erotic stories, distributed as text files, web pages, and via message boards and newsgroups, have been semi-popular, audio porn, via formats like MP3 and FLV, has increased in popularity.[citation needed] Audio pornography can include recordings of people having sex or simply reading erotic stories. (Pornographic magazines are available in Zinio format, which provides a reader program to enable access.) Combination formats, such as webteases that consist of images and text, are also common.
Legal status The Internet is an international network and there are currently no international laws regulating pornography; each country deals with Internet pornography differently. Generally, in the United States, if the act depicted in the pornographic content is legal in the jurisdiction from which it is being distributed, then the distributor of such content is not in violation of the law, regardless of whether it is accessible in countries where it is illegal. This does not apply to those who access the pornography, however, as they could still be prosecuted under local laws in their country. Due to enforcement problems in anti-pornography laws over the Internet, countries that prohibit or heavily restrict access to pornography have taken other approaches to limit access by their citizens, such as employing content filters. Many activists and politicians have expressed concern over the easy availability of Internet pornography, especially to minors. This has led to a variety of attempts to restrict children's access to Internet pornography, such as the 1996 Communications Decency Act in the United States. Some companies use an Adult Verification System (AVS) to deny access to pornography by minors. However, most Adult Verification Systems charge fees that are substantially higher than the actual costs of any verification they do (for example, in excess of $10/month) and are really part of a revenue collection scheme where sites encourage users to sign up for an AVS system and get a percentage of the proceeds in return. In response to concerns about children accessing age-inappropriate content, the adult industry, through the Association of Sites Advocating Child Protection (ASACP), began a self-labeling initiative called the Restricted to Adults label (RTA). This label is recognized by many web filtering products and is entirely free to use. Most employers have distinct policies against the accessing of any kind of online pornographic material from company computers,[citation needed] and some have also installed comprehensive filters and logging software in their local computer networks. One area of Internet pornography that has been the target of the strongest efforts at curtailment is child pornography. Because of this, most Internet pornography websites based in the U.S. have a notice on their front page that they comply with 18 USC Section 2257, which requires the keeping of records regarding the age of the people depicted in photographs, along with displaying the name of the company record keeper. Some site operators outside the U.S. have begun to include this compliance statement on their websites as well. On April 8, 2008, Evil Angel and its owner John Stagliano were charged in federal court with multiple counts of obscenity. One count was for "using an interactive computer service to display an obscene movie trailer in a manner available to a person under 18 years of age." More than a dozen U.S. states have enacted laws requiring age verification to access online pornography. The United Kingdom has adopted a similar requirement under the Online Safety Act 2023, and France, Spain, Italy, Denmark and Greece are trialling age verification by 2025 under the EU Digital Services Act. The UK requirement also extends beyond pornography sites to much of the Internet, including social media platforms such as Discord, Reddit and X, and gaming platforms such as Steam and Xbox.
Web filters and blocking software A variety of content-control, parental control and filtering software is available to block pornography and other classifications of material from particular computers or (usually company-owned) networks. Commercially available Web filters include Bess, Net Nanny, SurfWatch, SeeNoEvil, and others. Various work-arounds and bypasses are available for some of these products; Peacefire is one of the most notable clearinghouses for such countermeasures. Child pornography The Internet has radically changed how child pornography is reproduced and disseminated, and, according to the United States Department of Justice, resulted in a massive increase in the "availability, accessibility, and volume of child pornography." The production of child pornography has become very profitable, bringing in several billion dollars a year, and is no longer limited to pedophiles. Philip Jenkins notes that there is "overwhelming evidence that [child pornography] is all but impossible to obtain through nonelectronic means." In 2006, the International Centre for Missing & Exploited Children (ICMEC) published a report of findings on the presence of child pornography legislation in the then-184 INTERPOL member countries. It later updated this information, in subsequent editions, to include 196 UN member countries. The report, entitled “Child Pornography: Model Legislation & Global Review,” assesses whether national legislation: (1) exists with specific regard to child pornography; (2) provides a definition of child pornography; (3) expressly criminalizes computer-facilitated offenses; (4) criminalizes the knowing possession of child pornography, regardless of intent to distribute; and (5) requires ISPs to report suspected child pornography to law enforcement or to some other mandated agency. ICMEC stated that it found in its initial report that only 27 countries had legislation needed to deal with child pornography offenses, while 95 countries did not have any legislation that specifically addressed child pornography, making child pornography a global issue worsened by the inadequacies of domestic legislation. The 7th Edition Report found that still only 69 countries had legislation needed to deal with child pornography offenses, while 53 did not have any legislation specifically addressing the problem. Over seven years of research from 2006 to 2012, ICMEC and its Koons Family Institute on International Law and Policy report that they have worked with 100 countries that have revised or put in place new child pornography laws. The NCMEC estimated in 2003 that 20 percent of all pornography traded over the Internet was child pornography, and that since 1997, the number of child pornography images available on the Internet had increased by 1,500 percent. Regarding Internet proliferation, the US DOJ states that "At any one time there are estimated to be more than one million pornographic images of children on the Internet, with 200 new images posted daily." They also note that a single offender arrested in the United Kingdom possessed 450,000 child pornography images, and that a single child pornography site received a million hits in a month. Further, much of the trade in child pornography takes place at hidden levels of the Internet. It has been estimated that between 50,000 and 100,000 pedophiles are involved in organized pornography rings around the world, and that one third of them operate from the United States. 
Digital cameras and Internet distribution, facilitated by the use of credit cards and the ease of transferring images across national borders, have made it easier than ever before for users of child pornography to obtain the photographs and videos. In 2007, the British-based Internet Watch Foundation reported that child pornography on the Internet was becoming more brutal and graphic, and the number of images depicting violent abuse had risen fourfold since 2003. The CEO stated, "The worrying issue is the severity and the gravity of the images is increasing. We're talking about prepubescent children being raped." About 80 percent of the children in the abusive images were female, and 91 percent appeared to be children under the age of 12. Prosecution is difficult because multiple international servers are used, sometimes to transmit the images in fragments to evade the law.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Nematocyst] | [TOKENS: 2506]
Cnidocyte A cnidocyte (also known as a cnidoblast) is a type of cell containing a large secretory organelle called a cnidocyst that can deliver a sting to other organisms as a way to subdue prey and defend against predators. A cnidocyte explosively ejects the toxin-containing cnidocyst, which is responsible for the stings delivered by a cnidarian. The presence of this cell defines the phylum Cnidaria, which also includes the corals, sea anemones, hydrae, and jellyfish. Cnidocytes are single-use cells that need to be continuously replaced. Structure and function Each cnidocyte contains an organelle called a cnidocyst,[a] which consists of a bulb-shaped capsule and a hollow, coiled tubule that is contained within. Immature cnidocytes are referred to as cnidoblasts or nematoblasts. The externally oriented side of the cell has a hair-like trigger called a cnidocil, a mechano-chemical receptor. When the trigger is activated, the tubule shaft of the cnidocyst is ejected and, in the case of the penetrant nematocyst, the forcefully ejected tubule penetrates the target organism. This discharge takes a few microseconds and is able to reach accelerations of about 40,000 g. Research from 2006 suggests the process occurs in as little as 700 nanoseconds, thus reaching an acceleration of up to 5,410,000 g. After penetration, the toxic content of the nematocyst is injected into the target organism, allowing the sessile cnidarian to capture the immobilized prey. Recently, in two sea anemone species (Nematostella vectensis and Anthopleura elegantissima), the type I neurotoxin protein Nv1 was shown to be localized in ectodermal gland cells in the tentacles, next to but not in nematocysts. Upon encounter with a crustacean prey, nematocysts discharge and pierce the prey, and Nv1 is massively secreted into the extracellular medium by the nearby gland cells, thus suggesting another mode of entry for toxins. The cnidocyte capsule is made of novel Cnidaria-specific gene products which combine known protein domains. Minicollagen gene products (proteins) are one of the major structural components of the capsule. They are very short genes containing the characteristic collagen triple-helix sequence, as well as polyproline domains and cysteine-rich domains. Trimers of minicollagen proteins assemble through their terminal cysteine-rich domains, forming highly organized and rigid supra-structures. Minicollagen 1 (Ncol-1) polymers assemble on the inner shell, while the outer capsule is composed of polymerized NOWA (Nematocyst Outer Wall Antigen) proteins. Nematogalectin, minicollagen Ncol-15 and chondroitin are novel proteins used to build the tubule shaft. In piercing cnidocytes, the novel protein spinalin is used to make the spines present at the base of the shaft. The cnidocyst capsule stores a large concentration of calcium ions, which are released from the capsule into the cytoplasm of the cnidocyte when the trigger is activated. This causes a large concentration gradient of calcium across the cnidocyte plasma membrane. The resulting osmotic pressure causes a rapid influx of water into the cell. This increase in water volume in the cytoplasm forces the coiled cnida tubule to eject rapidly. Prior to discharge, the coiled cnida tubule exists inside the cell in an "inside out" condition.
The back pressure resulting from the influx of water into the cnidocyte, together with the opening of the capsule tip structure or operculum, triggers the forceful eversion of the cnida tubule, causing it to right itself as it comes rushing out of the cell with enough force to impale a prey organism. That force can be calculated as the mass of the mechanism's stylet multiplied by its acceleration, and the pressure generated by the impact on the prey as the stylet's force divided by its tip area. Researchers have calculated an ejected mass of 1 nanogram, an acceleration of 5,410,000 g and a stylet tip radius of 15 ± 8 nm, and from these figures estimated a pressure of more than 7 GPa at the stylet tip, which they write is in the range of technical bullets. Few papers have modeled the discharge; most work relies on direct observation. Observational studies typically used a tentacle solution assay with a chemical stimulant to provoke discharge and cameras to record it; one such study was published in 1984 and another in 2006, as imaging technology improved. One study used computational fluid dynamics, manipulating variables such as barb plate size, prey cylinder diameter and the Reynolds number of the fluid medium. Observational studies indicate that the velocity of the barb/stylet decreases throughout the discharge, so the remarkable maximum acceleration is achieved at the beginning. Dynamic traits such as maximum discharge velocities and trajectory patterns may not correspond to static traits such as tubule lengths and capsule volumes. Therefore, caution is appropriate when using medusan nematocyst assemblages as indicators of prey selection and trophic role; this possibly applies to other jelly species as well, and hence nematocyst static traits cannot generally be used to infer prey size. Cnidae are "single use" cells, and thus represent a large expenditure of energy to produce. In hydrozoans, in order to regulate discharge, cnidocytes are connected as "batteries", containing several types of cnidocytes connected to supporting cells and neurons. The supporting cells contain chemosensors, which, together with the mechanoreceptor on the cnidocyte (the cnidocil), allow only the right combination of stimuli to cause discharge, such as prey swimming and chemicals found in the prey's cuticle or cutaneous tissue. This prevents the cnidarian from stinging itself, although sloughed-off cnidae can be induced to fire independently. Types of cnidae Over 30 types of cnidae are found in different cnidarians, and they can be divided into several groups. Cnidocyte subtypes can be differentially localized in the animal. In the sea anemone Nematostella vectensis, the majority of its non-penetrant sticky cnidocytes, the spherocytes, are found in the tentacles, and are thought to help with prey capture by sticking to the prey. By contrast, the two penetrant types of cnidocytes present in this species display a much broader localization, on the outer epithelial layer of the tentacles and body column, as well as on the pharynx epithelium and within mesenteries. The diversity of cnidocyte types correlates with the expansion and diversification of structural cnidocyst genes like minicollagen genes. Minicollagen genes form compact gene clusters in cnidarian genomes, suggesting a diversification through gene duplication and subfunctionalization.
Anthozoans display less capsule diversity and a reduced number of minicollagen genes, while medusozoans have more capsule diversity (about 25 types) and a vastly expanded minicollagen gene repertoire. In the sea anemone Nematostella vectensis, some minicollagens display a differential expression pattern in different cnidocyte subtypes. Cnidocyte development Cnidocytes are single-use cells that need to be continuously replaced throughout the life of the animal, with different modes of renewal across species. In Hydra polyps, cnidocytes differentiate from a specific population of stem cells, the interstitial cells (I-cells), located within the body column. Developing nematocysts first undergo multiple rounds of mitosis without cytokinesis, giving rise to nematoblast nests of 8, 16, 32 or 64 cells. After this expansion phase, nematoblasts develop their capsules. Nests separate into single nematocysts when the formation of the capsule is complete. Most of them migrate to the tentacles, where they are incorporated into battery cells, which hold several nematocysts, and neurons. Battery cells coordinate the firing of nematocysts. In the hydrozoan jellyfish Clytia hemisphaerica, nematogenesis takes place at the base of the tentacles, as well as in the manubrium. At the base of the tentacles, nematoblasts proliferate and then differentiate along a proximal-distal gradient, giving rise to mature nematocytes in the tentacles through a conveyor-belt system. In the anthozoan sea anemone Nematostella vectensis, nematocytes are thought to develop throughout the animal from epithelial progenitors. Furthermore, a single regulatory gene that codes for the transcription factor ZNF845, also called CnZNF1, promotes the development of a cnidocyte and inhibits the development of an RFamide-producing neuron. This gene evolved in the stem cnidarian through domain shuffling. The nematocyst forms through a multi-step assembly process from a giant post-Golgi vacuole. Vesicles from the Golgi apparatus first fuse onto a primary vesicle, the capsule primordium. Subsequent vesicle fusion enables the formation of a tubule outside of the capsule, which then invaginates into the capsule. Then, an early maturation phase enables the formation of long arrays of barbed spines on the invaginated tubule through the condensation of spinalin proteins. Finally, a late maturation stage gives rise to undischarged capsules under high osmotic pressure through the synthesis of poly-γ-glutamate in the matrix of the capsule. This trapped osmotic pressure enables rapid thread discharge upon triggering through a massive osmotic shock. Nematocyst toxicity Nematocysts are very efficient weapons. A single nematocyst has been shown to suffice in paralyzing a small arthropod (a Drosophila larva). The most deadly cnidocytes (to humans, at least) are found on the body of a box jellyfish. One member of this family, the sea wasp, Chironex fleckeri, is "claimed to be the most venomous marine animal known," according to the Australian Institute of Marine Science. It can cause excruciating pain to humans, sometimes followed by death. Other cnidarians, such as the jellyfish Cyanea capillata (the "Lion's Mane" made famous by Sherlock Holmes) or the siphonophore Physalia physalis (Portuguese man o' war, "Bluebottle"), can cause extremely painful and sometimes fatal stings.
On the other hand, aggregating sea anemones may have the lowest sting intensity, perhaps due to the inability of the nematocysts to penetrate the skin, creating a feeling similar to touching sticky candies. Besides feeding and defense, sea anemone and coral colonies use cnidocytes to sting one another in order to defend or win space. Despite their effectiveness in prey-predator interactions, there is an evolutionary tradeoff, as cnidarian venom systems are known to reduce the cnidarian's reproductive fitness and overall growth. Venom from animals such as cnidarians, scorpions and spiders may be species-specific. A substance that is weakly toxic for humans or other mammals may be strongly toxic to the natural prey or predators of the venomous animal. Such specificity has been used to create new medicines, bioinsecticides, and biopesticides. Animals in the phylum Ctenophora ("sea gooseberries" or "comb jellies") are transparent and jelly-like but have no nematocysts, and are harmless to humans. Certain types of sea slugs, such as the nudibranch aeolids, are known to undergo kleptocnidy (in addition to kleptoplasty), whereby the organisms store nematocysts of digested prey at the tips of their cerata.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Israeli_cuisine] | [TOKENS: 13371]
Israeli cuisine Israeli cuisine primarily comprises dishes brought from the Jewish diaspora, and has more recently been defined by the development of a notable fusion cuisine characterized by the mixing of Jewish cuisine and Arab cuisine. It also blends together the culinary traditions of the various diaspora groups, namely those of Middle Eastern Jews with roots in Southwest Asia and North Africa, Sephardi Jews from Iberia, and Ashkenazi Jews from Central and Eastern Europe. The country's cuisine also incorporates food and drinks traditionally included in other Middle Eastern cuisines (e.g., Iranian cuisine from Persian Jews and Turkish cuisine from Turkish Jews) as well as in Mediterranean cuisines, such that spices like za'atar and foods such as falafel, hummus, msabbaha, shakshouka, and couscous are now widely popular in Israel. However, the identification of Arab dishes as Israeli has led to accusations of cultural appropriation against Israel by Palestinians and other Arabs. Other influences on the cuisine are the availability of foods common to the Mediterranean, especially certain kinds of fruits and vegetables, dairy products, and fish; the tradition of observing kashrut; and food customs and traditions (minhag) specific to Shabbat and other Jewish holidays. Examples of these foods include challah, jachnun, malawach, gefilte fish, hamin, me'orav yerushalmi, and sufganiyot. New dishes based on agricultural products such as oranges, avocados, dairy products, and fish, and others based on world trends have been introduced over the years, and chefs trained abroad have brought in elements of other international cuisines. History Israel's culinary traditions comprise foods and cooking methods that span 3,000 years of history. Over that time, these traditions have been shaped by influences from Asia, Africa and Europe, and religious and ethnic influences have resulted in a culinary melting pot. Biblical and archaeological records provide insight into the culinary life of the region as far back as 1000 BCE. Ancient Israelite cuisine was based on several products that still play important roles in modern Israeli cuisine. These were known as the seven species: olives, figs, dates, pomegranates, wheat, barley and grapes. The diet, based on locally grown produce, was enhanced by imported spices, readily available due to the country's position at the crossroads of east–west trade routes. During the Second Temple period (516 BCE – 70 CE), Hellenistic and Roman culture heavily influenced cuisine, particularly of the priests and aristocracy of Jerusalem. Elaborate meals were served that included piquant entrées and alcoholic drinks, fish, beef, meat, pickled and fresh vegetables, olives, and tart or sweet fruits. After the destruction of the Second Temple and the exile of the majority of Jews from the Land of Israel, Jewish cuisine continued to develop in the many countries where Jewish communities have existed since Late Antiquity, influenced by the economics, agriculture, and culinary traditions of those countries.[citation needed] The Old Yishuv was the Jewish community that lived in Ottoman Syria prior to the Zionist Aliyah from the diaspora that began in 1881. The cooking style of the community was Sephardi cuisine, which developed among the Jews of Spain before their expulsion in 1492, and in the areas to which they migrated thereafter, particularly the Balkans and Ottoman Empire. Sephardim and Ashkenazim also established communities in the Old Yishuv.
Particularly in Jerusalem, they continued to develop their culinary style, influenced by Ottoman cuisine, creating a style that became known as Jerusalem Sephardi cuisine. This cuisine included pies like sambousak, pastels and burekas, vegetable gratins and stuffed vegetables, and rice and bulgur pilafs, which are now considered to be Jerusalem classics. Groups of Hasidic Jews from Eastern Europe also began establishing communities in the late 18th century, and brought with them their traditional Ashkenazi cuisine, developing, however, distinct local variations, notably a peppery, caramelized noodle pudding known as kugel yerushalmi. With the First Aliyah in 1881, Jews began immigrating to the area from Yemen and Eastern Europe in larger numbers, particularly from Poland and Russia. These Zionist pioneers were motivated both ideologically and by the Mediterranean climate to reject the Ashkenazi cooking styles they grew up with, and to adapt by using local produce, especially vegetables such as zucchini, peppers, eggplant, artichoke and chickpeas. The first Hebrew cookbook, How to Cook in Palestine, written by Erna Meyer and published in the early 1930s by the Palestine Federation of the Women's International Zionist Organization, exhorted cooks to use Mediterranean herbs, Middle Eastern spices and local vegetables in their cooking. The bread, olives, cheese and raw vegetables they adopted became the basis for the kibbutz breakfast, which in more abundant forms is served in Israeli hotels, and in various forms in most Israeli homes today. The State of Israel faced enormous military and economic challenges in its early years, and the period from 1948 to 1958 was a time of food rationing and austerity, known as tzena. In this decade, over one million Jewish immigrants, mainly from Arab countries but also including European Holocaust survivors, inundated the new state. They arrived when only basic foods were available, and ethnic dishes had to be modified with a range of mock or simulated foods, such as chopped "liver" made from eggplant, turkey as a substitute for veal schnitzel for Ashkenazim, kubbeh made from frozen fish instead of ground meat for Iraqi Jews, and turkey in place of the lamb kebabs of the Mizrahi Jews. These adaptations remain a legacy of that time. Substitutes, such as the wheat-based rice substitute ptitim, were introduced, and versatile vegetables such as eggplant were used as alternatives to meat. Additional flavor and nutrition were provided by inexpensive canned tomato paste and puree, hummus, tahina, and mayonnaise in tubes. Meat was scarce, and it was not until the late 1950s that herds of beef cattle were introduced into the agricultural economy. Khubeza, a local variety of the mallow plant, became an important food source during the War of Independence. During the siege of Jerusalem, when convoys of food could not reach the city, Jerusalemites went out to the fields to pick khubeza leaves, which are high in iron and vitamins. Instructions for cooking it, broadcast by the Jerusalem-based radio station Kol Hamagen, were picked up in Jordan, which convinced the Arabs that the Jews were dying of starvation and victory was at hand. In the past decade, food writers in Israel have encouraged the population to prepare khubeza on Israel Independence Day. Local chefs have begun to serve khubeza and other wild plants gathered from the fields in upscale restaurants.
The dish from the independence war is called ktzitzot khubeza and is still eaten by Israelis today.[citation needed] Immigrants to Israel have introduced elements of the cuisines of the cultures and countries from which they came. In the nearly 50 years before 1948, there were successive waves of Jewish immigration, which brought a whole range of foods and cooking styles. Immigrants arriving from central Europe brought foods such as schnitzel and strudels, while Russian Jews brought borscht and herring dishes, such as schmaltz herring and vorschmack (gehakte herring). Ashkenazi dishes include chicken soup, schnitzel, lox, chopped liver, gefilte fish, knishes, kishka and kugel. The first Israeli patisseries were opened by Ashkenazi Jews, who popularized cakes and pastries from central and Eastern Europe, such as yeast cakes (babka), nut spirals (schnecken), chocolate rolls and layered pastries. After 1948, the greatest impact came from the large migration of Jews from Turkey, Iraq, Kurdistan and Yemen, and of Mizrahi Jews from North Africa, particularly Morocco. Typically, the staff of army kitchens, schools, hospitals, hotels and restaurant kitchens has consisted of Mizrahi, Kurdish and Yemenite Jews, and this has had an influence on the cooking fashions and ingredients of the country. Mizrahi cuisine, the cuisine of Jews from North Africa, features grilled meats, sweet and savory puff pastries, rice dishes, stuffed vegetables, pita breads and salads, and shares many similarities with Arab cuisine. Other North African dishes popular in Israel include couscous, shakshouka, matbucha, carrot salad and chraime (slices of fish cooked in a spicy tomato sauce). Sephardic dishes with Balkan and Turkish influences incorporated into Israeli cuisine include burekas, yogurt and taramosalata. Yemenite Jewish foods include jachnun, malawach, skhug and kubane. Iraqi dishes popular in Israel include amba, various types of kubba, stuffed vegetables (mhasha), kebab, sambusac, sabich and pickled vegetables (hamutzim). As Israeli agriculture developed and new kinds of fruits and vegetables appeared on the market, cooks and chefs began to experiment and devise new dishes with them. They also began using "biblical" ingredients such as honey, figs, and pomegranates, and indigenous foods such as prickly pears (tzabar) and chickpeas. Since the late 1970s, there has been an increased interest in international cuisine, cooking with wine and herbs, and vegetarianism. A more sophisticated food culture in Israel began to develop when cookbooks, such as From the Kitchen with Love by Ruth Sirkis, published in 1974, introduced international cooking trends, and together with the opening of restaurants serving cuisines such as Chinese, Italian and French, encouraged more dining out. The 1980s were a formative decade: the increased optimism after the signing of the peace treaty with Egypt in 1979, the economic recovery of the mid-1980s and increasing travel abroad by average citizens were factors contributing to a greater interest in food and wine. In addition, high-quality, locally produced ingredients became increasingly available. For example, privately owned dairies began to produce handmade cheeses from goat, sheep and cow's milk, which quickly became very popular both among chefs and the general public. In 1983, the Golan Heights Winery was the first of many new Israeli winemakers to help transform tastes with their production of world-class, semi-dry and dry wines.
New attention was paid to the making of handmade breads and the production of high-quality olive oil. The successful development of aquaculture ensured a steady supply of fresh fish, and the agricultural revolution in Israel led to an overwhelming choice and quality of fresh fruit, vegetables and herbs. Ethnic heritage cooking, both Sephardic and Ashkenazi, has made a comeback with the growing acceptance of the heterogeneous society. Apart from home cooking, many ethnic foods are now available in street markets, supermarkets and restaurants, or are served at weddings and bar mitzvahs, and people increasingly eat foods from ethnic backgrounds other than their own. Overlaps and combinations of foods from different ethnic groups are becoming standard as a multi-ethnic food culture develops. The 1990s saw an increasing interest in international cuisines. Sushi, in particular, has taken hold as a popular style for eating out and as an entrée for events. In restaurants, fusion cuisine, with the melding of classic cuisines such as French and Japanese with local ingredients, has become widespread.[citation needed] In the 2000s, the trend of "eating healthy", with an emphasis on organic and whole-grain foods, has become prominent, and medical research has led many Israelis to re-embrace the Mediterranean diet, with its touted health benefits. Characteristics Geography has a large influence on Israeli cuisine, and foods common in the Mediterranean region, such as olives, wheat, chickpeas, dairy products, fish, and vegetables such as tomatoes, eggplants, and zucchini, are prominent in Israeli cuisine. Fresh fruits and vegetables are plentiful in Israel and are cooked and served in many ways. The various climatic areas of Israel and the areas it has settled allow a variety of products to be grown. Citrus trees such as orange, lemon and grapefruit thrive on the coastal plain. Figs, pomegranates and olives also grow in the cooler hill areas. The subtropical climate near the Sea of Galilee and in the Jordan River Valley is suitable for mangoes, kiwis and bananas, while the temperate climate of the mountains of the Galilee and the Golan is suitable for grapes, apples and cherries. Israeli eating customs also conform to those of the wider Mediterranean region, with lunch, rather than dinner, being the focal meal of a regular workday. "Kibbutz foods" have been adopted by many Israelis for their light evening meals as well as breakfasts, and may consist of various types of cheeses, both soft and hard, yogurt, labne and sour cream, vegetables and salads, olives, hard-boiled eggs or omelets, pickled and smoked herring, a variety of breads, and fresh orange juice and coffee. In addition, Jewish holidays influence the cuisine, with the preparation of traditional foods at holiday times, such as various types of challah (braided bread) for Shabbat and festivals, jelly doughnuts (sufganiyot) for Hanukah, the hamantaschen pastry (oznei haman) for Purim, charoset, a type of fruit paste, for Passover, and dairy foods for Shavuot. The Shabbat dinner, eaten on Friday, and to a lesser extent the Shabbat lunch, is a significant meal in Israeli homes, together with holiday meals. Although many, if not most, Jews in Israel do not keep kosher, the tradition of kashrut strongly influences the availability of certain foods and their preparation in homes, public institutions and many restaurants, including the separation of milk and meat and the avoidance of non-kosher foods, especially pork and shellfish.
During Passover, bread and other leavened foods are prohibited to observant Jews and matza and leaven-free foods are substituted.
Foods
Israel does not have a universally recognized national dish; in previous years this was considered to be falafel, deep-fried balls of seasoned, ground chickpeas. Street vendors throughout Israel used to sell falafel. It was a favorite "street food" for decades and is still popular as a mezze dish or as a top-up for hummus-in-pita, though it is less common nowadays as a sole filling in pita because people are more aware of the health effects of deep-frying food in oil. The Israeli breakfast has always been largely healthy by today's standards, and one book called the Israeli breakfast "the Jewish state's contribution to world cuisine". Vegetable salads are eaten with most meals, including the traditional Israeli breakfast, which will usually include eggs, bread, and dairy products such as yogurt or cottage cheese. For lunch and dinner, salad may be served as a side dish. A light meal of salad (salat), hummus and French fries (chips) served in a pita is referred to as hummuschipsalat. Israeli salad is typically made with finely chopped tomatoes and cucumbers dressed in olive oil, lemon juice, salt and pepper. Variations include the addition of diced red or green bell peppers, grated carrot, finely shredded cabbage or lettuce, sliced radish, fennel, spring onions and chives, chopped parsley, or other herbs and spices such as mint, za'atar and sumac. Although popularized by the kibbutzim, versions of this mixed salad were brought to Israel from various places. For example, Jews from India prepare it with finely chopped ginger and green chili peppers, North African Jews may add preserved lemon peel and cayenne pepper, and Bukharan Jews chop the vegetables extremely finely and use vinegar, without oil, in the dressing. Tabbouleh is a Levantine vegan dish (sometimes considered a salad) traditionally made of tomatoes, finely chopped parsley, mint, bulgur and onion, and seasoned with olive oil, lemon juice, and salt. Some Israeli variations of the salad use pomegranate seeds instead of tomatoes. Sabich salad is a variation of the well-known Israeli dish sabich; its ingredients are eggplant, hard-boiled eggs, tahini, Israeli salad, potato, parsley and amba. Kubba is a dish made of rice, semolina or burghul (cracked wheat), minced onions and finely ground lean beef, lamb or chicken. The best-known variety is a torpedo-shaped fried croquette stuffed with minced beef, chicken or lamb. It was brought to Israel by Jews of Iraqi, Kurdish and Syrian origin. Sambusak is a semi-circular pocket of dough filled with mashed chickpeas, fried onions and spices. There is another variety filled with meat, fried onions, parsley, spices and pine nuts, which is sometimes mixed with mashed chickpeas, and a breakfast version filled with feta or tzfat cheese and za'atar. It can be fried or otherwise cooked. Roasted vegetables include bell peppers, chili peppers, tomatoes, onions and eggplants, and sometimes also potatoes and zucchini; they are usually served with grilled meat. Khamutzim are pickled vegetables, made by soaking them in water and salt (and sometimes olive oil) in a pot sealed off from the air. Ingredients can include cucumber, cabbage, eggplant, carrot, turnip, radish, onion, caper, lemon, olives, cauliflower, tomatoes, chili pepper, bell pepper, garlic and beans. A large variety of eggplant salads and dips are made with roasted eggplants. 
Baba ghanoush, called salat ḥatzilim in Israel, is made with tahina and other seasonings such as garlic, lemon juice, onions, herbs and spices. Food writer and historian Gil Marks has stated that "Israelis learned to make baba ghanouj from the Arabs". The eggplant is sometimes grilled over an open flame so that the pulp has a smoky taste. A particularly Israeli variation of the salad, made with mayonnaise, is called salat ḥatzilim b'mayonnaise. Eggplant salads are also made with yogurt, or with feta cheese, chopped onion and tomato, or in the style of Romanian Jews, with roasted red pepper. Tahina is often used as a dressing for falafel, serves as a cooking sauce for meat and fish, and forms the basis of sweets such as halva. Hummus is a cornerstone of Israeli cuisine, and consumption in Israel has been compared by food critic Elena Ferretti to "peanut butter in America, Nutella in Europe or Vegemite in Australia". Hummus in pita is a common lunch for schoolchildren, and is a popular addition to many meals. Supermarkets offer a variety of commercially prepared hummus, and some Israelis will go out of their way for fresh hummus prepared at a hummusia, an establishment devoted exclusively to selling hummus. Salat avocado, an Israeli-style avocado salad with lemon juice and chopped scallions (spring onions), was introduced by farmers who planted avocado trees on the coastal plain in the 1920s. Avocados have since become a winter delicacy and are cut into salads as well as being spread on bread. A meze of fresh and cooked vegetable salads, pickled cucumbers and other vegetables, hummus, ful, tahini and amba dips, labneh cheese with olive oil, and ikra is served at festive meals and in restaurants. Salads include Turkish salad (a piquant salad of finely chopped onions, tomatoes, herbs and spices), tabbouleh, carrot salad, marinated roasted red and green peppers, deep-fried cauliflower florets, matbucha, torshi (pickled vegetables) and various eggplant salads. Modern Israeli interpretations of the meze blend traditional and modern, pairing ordinary appetizers with unique combinations such as fennel and pistachio salad, beetroot and pomegranate salad, and celery and kashkaval cheese salad. Stuffed vegetables, called memula’im, were originally designed to extend cheap ingredients into a meal. They are prepared by cooks in Israel from all ethnic backgrounds and are made with many varying flavors, such as spicy or sweet-and-sour, with ingredients such as bell peppers, chili peppers, figs, onion, artichoke bottoms, Swiss chard, beet, dried fruits, tomato, vine leaves, potatoes, mallow, eggplants and zucchini squash, and stuffings such as meat and rice in Balkan style, bulgur in Middle-Eastern fashion, or with ptitim, a type of Israeli pasta. The Ottoman Turks introduced stuffed vine leaves in the 16th century and vine leaves are commonly stuffed with a combination of meat and rice, although other fillings, such as lentils, have evolved among the various communities. Artichoke bottoms stuffed with meat are famous as one of the grand dishes of the Sephardi Jerusalem cuisine of the Old Yishuv. Stuffed dates and dried fruits are served with rice and bulgur dishes. Stuffed half-zucchini has a Ladino name, medias. A variety of soups are enjoyed, particularly in the winter. Chicken soup has been a mainstay of Jewish cuisine since medieval times and is popular in Israel. 
Classic chicken soup is prepared as a simple broth with a few vegetables, such as onion, carrot and celery, and herbs such as dill and parsley. More elaborate versions are prepared by Sephardim with orzo or rice, or the addition of lemon juice or herbs such as mint or coriander, while Ashkenazim may add noodles. An Israeli adaptation of the traditional Ashkenazi soup pasta known as mandlen, called shkedei marak ("soup almonds") in Israel, is commonly served with chicken soup. Particularly on holidays, dumplings are served with the soup, such as the kneidlach (matzah balls) of the Ashkenazim or the gondi (chickpea dumplings) of Iranian Jews, or kubba, a family of dumplings brought to Israel by Middle Eastern Jews. Especially popular are kubba prepared from bulgur and stuffed with ground lamb and pine nuts, and the soft semolina or rice kubba cooked in soup, which Jews of Kurdish or Iraqi heritage habitually enjoy as a Friday lunchtime meal. Lentil soup is prepared in many ways, with additions such as cilantro or meat. Other soups include the harira of the Moroccan Jews, a spicy soup of lamb (or chicken), chickpeas, lentils and rice, and a Yemenite bone-marrow soup known as ftut, served on special occasions such as weddings, seasoned with the traditional hawaij spice mix. White bean soup in tomato sauce is common in Jerusalem because Sephardic Jews settled in the city after being expelled from Andalusia. Rice is prepared in numerous ways in Israel, from simple steamed white rice to festive casseroles. It is also cooked with spices and served with almonds and pine nuts. "Green" rice, prepared with a variety of fresh chopped herbs, is favored by Persian Jews. Another rice dish is prepared with thin noodles that are first fried and then boiled with the rice. Mujadara is a popular rice and lentil dish, adopted from Arab cuisine. Orez Shu'it is a dish invented in Jerusalem by Sephardic Jews, made of white beans cooked in a tomato stew and served on plain boiled rice; it is eaten widely in the Jerusalem region. Couscous was brought to Israel by Jews from North Africa. It is still prepared in some restaurants or by traditional cooks by passing semolina through a sieve several times and then cooking it over an aromatic broth in a special steamer pot called a couscoussière. Generally, "instant" couscous is used for home cooking. Couscous is used in salads, main courses and even some desserts. As a main course, chicken or lamb, or vegetables cooked in a soup flavored with saffron or turmeric, are served on steamed couscous. Ptitim is an Israeli pasta which now comes in many shapes, including pearls, loops, stars and hearts, but was originally shaped like grains of rice. It originated in the early days of the State of Israel as a wheat-based substitute for rice, when rice, a staple of the Mizrahi Jews, was scarce. Israel's first prime minister, David Ben-Gurion, is reputed to have asked the Osem company to devise this substitute, and so it was nicknamed "Ben-Gurion rice". Ptitim can be boiled like pasta, prepared pilaf-style by sautéing and then boiling in water or stock, or baked in a casserole. Like other pasta, it can be flavored in many ways with spices, herbs and sauces. Once considered primarily a food for children, ptitim is now prepared in restaurants both in Israel and internationally. Bulgur is a kind of dried cracked wheat, sometimes served instead of rice. 
Fresh fish is readily available, caught off Israel's coastal areas of the Mediterranean and the Red Sea, or in the Sea of Galilee, or raised in ponds in the wake of advances in fish farming in Israel. Fresh fish is served whole, in the Mediterranean style, grilled, or fried, dressed only with freshly squeezed lemon juice. Trout (forel), gilthead seabream (denisse), St. Peter's fish (musht) and other fresh fish are prepared this way. Fish are also eaten baked, with or without vegetables, or fried whole or in slices, or grilled over coals, and served with different sauces. Fish are also braised, as in a dish called hraime, in which fish such as grouper (better known in Israel by its Arabic name lokus) or halibut is prepared in a sauce with hot pepper and other spices for Rosh Hashanah, Passover and Shabbat by North-African Jews. Everyday versions are prepared with cheaper kinds of fish and are served in market eateries, public kitchens and at home for weekday meals. Fish, traditionally carp, but now other firm whitefish too, are minced and shaped into loaves or balls and cooked in fish broth, such as the gefilte fish of the Ashkenazi Jews, who also brought pickled herring from Eastern Europe. Herring is often served at the kiddush that follows synagogue services on Shabbat, especially in Ashkenazi communities. In the Russian immigrant community it may be served as a light meal with boiled potatoes, sour cream, dark breads and schnapps or vodka. Fish kufta is usually fried with spices, herbs and onions (sometimes also pine nuts) and served with tahini or yogurt sauce. Boiled fish kufta is cooked in a tomato, tahini or yogurt sauce. Tilapia baked with tahini sauce and topped with olive oil, coriander, mint, basil and pine nuts (and sometimes also with fried onions) is a specialty of Tiberias. Chicken is the most widely eaten meat in Israel, followed by turkey. Chicken is prepared in a multitude of ways, from simple oven-roasted chicken to elaborate casseroles with rich sauces such as date syrup, tomato sauce, etc. Examples include chicken casserole with couscous, inspired by Moroccan Jewish cooking, chicken with olives, a Mediterranean classic, and chicken albondigas (meat balls) in tomato sauce, from Jerusalem Sephardi cuisine. Albondigas are prepared from ground meat. Similar to them is the more popular kufta which is made of minced meat, herbs and spices and cooked with tomato sauce, date syrup, pomegranate syrup or tamarind syrup with vegetables or beans. Grilled and barbecued meat are common in Israeli cuisine. The country has many small eateries specializing in beef and lamb kebab, shish taouk, merguez and shashlik. Outdoor barbecuing, known as mangal or al ha-esh (on the fire) is a beloved Israeli pastime. In modern times, Israel Independence Day is frequently celebrated with a picnic or barbecue in parks and forests around the country. Skewered goose liver is a dish from southern Tel Aviv. It is grilled with salt and black pepper and sometimes with spices like cumin or Baharat spice mix. Chicken or lamb baked in the oven is very common with potatoes, and sometimes fried onions as well. Turkey schnitzel is an Israeli adaptation of veal schnitzel, and is an example of the transformations common in Israeli cooking. The schnitzel was brought to Israel by Jews from Central Europe, but before and during the early years of the State of Israel veal was unobtainable and chicken or turkey was an inexpensive and tasty substitute. 
Furthermore, a Wiener schnitzel is cooked in both butter and oil, but in Israel only oil is used, because of kashrut. Today, most cooks buy schnitzel already breaded and serve it with hummus, tahina, and other salads for a quick main meal. Other immigrant groups have added variations from their own backgrounds—Yemenite Jews, for example, flavor it with hawaij. In addition, vegetarian versions have become popular, and the Israeli food company Tiv′ol was the first to produce a vegetarian schnitzel from a soya meat-substitute. Various types of sausage are part of Sephardi and Mizrahi cuisine in Israel. Jews from Tunisia make a sausage, called osban, with a filling of ground meat or liver, rice, chopped spinach, and a blend of herbs and spices. Jews from Syria make smaller sausages, called gheh, with a different spice blend, while Jews from Iraq make sausages called mumbar, with chopped meat and liver, rice, and their traditional mix of spices. Moussaka is an oven-baked casserole of layered ground meat and eggplant that, unlike its Levantine rivals, is served hot. Meat stews (chicken, lamb and beef) are cooked with spices, pine nuts, herbs like parsley, mint and oregano, onion, tomato sauce or tahini, or juices such as pomegranate molasses, pomegranate juice, pomegranate wine, grape wine, arak, date molasses and tamarind. Peas, chickpeas, white beans, cowpeas or green beans are sometimes also added. Stuffed chicken in Israel is usually filled with rice, meat (lamb or beef), parsley, dried fruits like dates, apricots or raisins, and spices like cinnamon, nutmeg or allspice; sometimes fresh herbs like thyme and oregano are added on top of the chicken for flavor before it is baked in the oven. Many fresh, high quality dairy products are available, such as cottage cheese, white cheeses, yogurts including leben and eshel, yellow cheeses, and salt-brined cheeses typical of the Mediterranean region. Dairy farming has been a major sector of Israeli agriculture since the founding of the state, and the yield of local milk cows is amongst the highest in the world. Initially, the moshavim (farming cooperatives) and kibbutzim produced mainly soft white cheese as it was inexpensive and nutritious. It became an important staple in the years of austerity and gained a popularity that it still enjoys today. Soft white cheese, gvina levana, is often referred to by its fat content, such as 5% or 9%. It is eaten plain, or mixed with fruit or vegetables, spread on bread or crackers and used in a variety of pies and pastries. Labneh is a yogurt-based white cheese common throughout the Balkans and the Middle East. It is sold plain, with za'atar, or in olive oil. It is often eaten for breakfast with other cheeses and bread. In the north of the country, labneh balls preserved in olive oil are more common than in the central and southern parts. Adding spices like za'atar, dried oregano or sumac and herbs like thyme, mint or scallions is common when preserving the labneh balls. It is especially common to eat them during breakfast because meat is usually not eaten in the morning. Tzfat cheese, a white cheese in brine, similar to feta, was first produced by the Meiri dairy in Safed in 1837 and is still produced there by descendants of the original cheese makers. The Meiri dairy also became famous for its production of the Balkan-style brinza cheese, which became known as Bulgarian cheese due to its popularity in the early 1950s among Jewish immigrants from Bulgaria. 
Other dairies now also produce many varieties of these cheeses. Bulgarian yogurt, introduced to Israel by Bulgarian Jewish survivors of the Holocaust, is used to make a traditional yogurt and cucumber soup. In the early 1980s, small privately owned dairies began to produce handmade cheeses from goat and sheep's milk as well as cow's milk, resembling traditional cheeses like those made in rural France, Spain and Italy. Many are made with organic milk. These are now also produced by kibbutzim and the national Tnuva dairy. Shakshuka, a North-African dish of eggs poached in a spicy tomato sauce, is a national favorite, especially in the winter. It is traditionally served in a cast-iron pan with bread to mop up the sauce. Some variations of the dish are cooked with liberal use of ingredients such as eggplant, chili peppers, hot paprika, spinach, feta cheese or safed cheese. Omelettes are seasoned with onions, herbs such as dill (shamir), spinach, parsley, mint, coriander and mallow, with spices such as turmeric, cumin, sumac, cinnamon and cloves, and with cheeses such as safed and feta. Haminados are eggs that are baked after being boiled and served alongside stews or other meals; those cooked in hamin are eaten in the morning for breakfast, sometimes replacing the usual egg in sabich. They are also eaten as a breakfast alongside jachnun, grated tomatoes and skhug. Israel is one of the world's leading fresh citrus producers and exporters, and more than forty types of fruit are grown in Israel, including citrus fruits such as oranges, grapefruit, tangerines and the pomelit, a hybrid of a grapefruit and a pomelo, developed in Israel. Fruits grown in Israel include avocados, bananas, apples, cherries, plums, lychees, nectarines, grapes, dates, strawberries, prickly pear (tzabbar), persimmon, loquat (shesek) and pomegranates, and are eaten on a regular basis. Israelis consume an average of nearly 160 kg (350 lb) of fruit per person a year. Many unique varieties of mango are native to the country, most having been developed during the second half of the 20th century. New and improved mango varieties are still introduced to markets every few years. Arguably the most popular variety is the Maya type, which is small to medium in size, fragrant, colourful (featuring 3-4 colours) and usually fiberless. The Israeli mango season begins in May, and the last of the fruit ripen as October draws near. Different varieties are present on markets at different months, with the Maya type seen between July and September. Mangos are frequently used in fusion dishes and for making sorbet. Many Israelis keep fruit trees in their yards, citrus trees (especially orange and lemon) being the most common. Mango trees are also now popular in home gardens. Mulberry trees are frequently seen in public gardens, and their fruit is popularly served alongside various desserts and as a juice. Fruit is served as a snack or dessert, alongside other items or by itself. Fresh-squeezed fruit juices are prepared at street kiosks, and sold bottled in supermarkets. Various fruits are added to chicken or meat dishes, and fresh fruit salad and compote are often served at the end of the meal. There is a strong tradition of home baking in Israel arising from the years when there were very few bakeries to meet demand. Many professional bakers came to Israel from Central Europe and founded local pastry shops and bakeries, often called konditoria, thus shaping local tastes and preferences. 
There is now a local style with a wide selection of cakes and pastries that includes influences from other cuisines and combines traditional European ingredients with Mediterranean and Middle-Eastern ingredients, such as halva, phyllo dough, dates, and rose water. Examples include citrus-flavored semolina cakes, moistened with syrup and called basbousa, tishpishti or revani in Sephardic bakeries. The Ashkenazi babka has been adapted to include halva or chocolate spread, in addition to the old-fashioned cinnamon. There are also many varieties of apple cake. Cookies made with crushed dates (ma'amoul) are served with coffee or tea, as throughout the Middle East. Jerusalem kugel (kugel yerushalmi) is an Israeli version of the traditional noodle pudding, kugel, made with caramelized sugar and spiced with black pepper. It was originally a specialty of the Ashkenazi Jews of the Old Yishuv. It is typically baked in a very low oven overnight and eaten after synagogue services on Shabbat morning. Bourekas are savory pastries brought to Israel by Jews from Turkey, the Balkans and Salonika. They are made of a flaky dough in a variety of shapes, frequently topped with sesame seeds, and are filled with meat, chickpeas, cheese, spinach, potatoes or mushrooms. Bourekas are sold at kiosks, supermarkets and cafes, and are served at functions and celebrations, as well as being prepared by home cooks. They are often served as a light meal with hardboiled eggs and chopped vegetable salad. Ashkenazi Jews from Vienna and Budapest brought sophisticated pastry-making traditions to Israel. Sacher torte and Linzer torte are sold at professional bakeries, but cheesecake and strudel are also baked at home. Jelly donuts (sufganiyot), traditionally filled with red jelly (jam), but also custard or dulce de leche, are eaten as Hanukkah treats. Tahini cookies are cookies of Israeli origin, made of tahini, flour, butter and sugar, and usually topped with pine nuts. Rugelach is very popular in Israel, commonly found in most cafes and bakeries. It is also a popular treat among American Jews. In the Jewish communities of the Old Yishuv, bread was baked at home. Small commercial bakeries were set up in the mid-19th century. One of the earliest, Berman's Bakery, was established in 1875, and evolved from a cottage industry making home-baked bread and cakes for Christian pilgrims. Expert bakers who arrived among the immigrants from Eastern and Central Europe in the 1920s–30s introduced handmade sourdough breads. From the 1950s, mass-produced bread replaced these loaves, and standard, government-subsidized loaves known as leḥem aḥid were largely all that was available until the 1980s, when specialized bakeries again began producing rich sourdough breads in the European tradition, and breads in a Mediterranean style with accents such as olives, cheese, herbs or sun-dried tomatoes. A large variety of breads is now available from bakeries and cafes. Challah bread is widely purchased or prepared for Shabbat. Challah is typically an egg-enriched bread, often braided in the Ashkenazi tradition, or round for Rosh Hashana, the Jewish New Year. Shabbat and festival breads of the Yemenite Jews have become popular in Israel and can be bought frozen in supermarkets. Jachnun is very thinly rolled dough, brushed with oil or fat and baked overnight at a very low heat, traditionally served with a crushed or grated tomato dip, hard-boiled eggs and skhug. Malawach is a thin circle of dough toasted in a frying pan. 
Kubaneh is a yeast dough baked overnight and traditionally served on Shabbat morning. Lahoh is a spongy, pancake-like bread made of fermented flour and water, and fried in a pan. Jews from Ethiopia make a similar bread called injera from millet flour. Pita bread is a double-layered flat or pocket bread traditional in many Middle Eastern and Mediterranean cuisines. It is baked plain, or with a topping of sesame or nigella seeds or za'atar. Pita is used in multiple ways, such as stuffed with falafel, salads or various meats as a snack or fast food meal; packed with schnitzel, salad and French fries for lunch; filled with chocolate spread as a snack for schoolchildren; or broken into pieces for scooping up hummus, eggplant and other dips. A lafa is a larger, soft flatbread that is rolled up around a falafel or shawarma filling. Various ethnic groups continue to bake traditional flat breads. Jews from the former Soviet republic of Georgia make the flatbread lavash. Baklava is a nut-filled phyllo pastry sweetened with syrup, served at celebrations in Jewish communities that originated in the Middle East. It is also often served in restaurants as dessert, along with small cups of Turkish coffee. Kadaif is a pastry made from long thin noodle threads filled with walnuts or pistachios and sweetened with syrup; it is served alongside baklava. Halva is a sweet made from tehina and sugar, and is popular in Israel. It is used to make original desserts like halva parfait. Ma'amoul are small shortbread pastries filled with dates, pistachios or walnuts (or occasionally almonds, figs, or other fillings). Oznei haman is a sweet yeast dough filled with crushed nuts, raisins, dried apricots, dates, halva or strawberry jam, then oven-baked; it is a specialty of Purim. The triangular shape may have been influenced by old illustrations of Haman, in which he wore a three-cornered hat. Sunflower seeds, called garinim (literally, seeds), are eaten everywhere, on outings, at stadiums and at home; they are usually purchased unshelled and cracked open with the teeth. They can be bought freshly roasted from shops and market stalls that specialize in nuts and seeds, as well as packaged in supermarkets, along with the similarly well-liked pumpkin and watermelon seeds, pistachios, and sugar-coated peanuts. Bamba is a soft, peanut-flavored snack food that is a favorite of children, and Bissli is a crunchy snack made of deep-fried dry pasta, sold in various flavors, including BBQ, pizza, falafel and onion. Malabi is a creamy pudding originating from Turkey, prepared with milk or almond milk (for a pareve version) and cornstarch. It is sold as a street food from carts or stalls, in disposable cups with thick sweet syrup and various crunchy toppings such as chopped pistachios or coconut. Its popularity has resulted in supermarkets selling it in plastic packages and restaurants serving richer and more sophisticated versions using various toppings and garnishes such as berries and fruit. Sahlab is a similar dessert made from the powdered tubers of orchids and milk. Watermelon and feta cheese salad is a popular dessert; sometimes mint is added to the salad. Krembo is a chocolate-coated marshmallow treat sold only in the winter, and is a very popular alternative to ice cream. It comes wrapped in colorful aluminum foil, and consists of a round biscuit base covered with a dollop of marshmallow cream coated in chocolate. Milky is a popular dairy pudding that comes in chocolate, vanilla and mocha flavors with a layer of whipped cream on top. 
Chili-based hot sauces are prominent in Israeli food and are made from green or red chili peppers. They are served with appetizers, falafel, casseroles and grilled meats, and are blended with hummus and tahina. Although they originated primarily with North African and Yemenite immigrants, these hot sauces are now widely consumed. Skhug is a spicy chili pepper sauce brought to Israel by Yemenite Jews, and has become one of Israel's most popular condiments. It is added to falafel and hummus, spread over fish, and added to white cheese, egg, salami or avocado sandwiches for extra heat and spice. Other hot sauces made from chili peppers and garlic are the Tunisian harissa, and the filfel chuma of the Libyan Jewish community in Israel. Amba is a pickled mango sauce, introduced by Iraqi Jews, and commonly used as a condiment with shawarma, kebabs, meorav yerushalmi, falafel and vegetable salads. Concentrated juices made of grape, carob, pomegranate and date are common in different regions; they are used in stews and soups, or as a topping for desserts such as malabi and rice pudding. Almond syrup flavored with rose water or orange blossom water is a common flavoring for desserts and is sometimes added to drinks such as arak. Sumac, a dark red spice, is made by grinding the dried berries of the sumac bush, which is native to the Middle East, into a coarse powder. There is a strong coffee-drinking culture in Israel. Coffee is prepared as instant (nes), iced, latte (hafuḥ), Italian-style espresso, or Turkish coffee, which is sometimes flavored with cardamom (hel). Jewish writers, artists, and musicians from Germany and Austria who immigrated to Israel before the Second World War introduced the model of the Viennese coffee house with its traditional décor, relaxed atmosphere, coffee and pastries. Cafés are found everywhere in urban areas and function as meeting places for socializing and conducting business. Almost all serve baked goods and sandwiches and many also serve light meals. There are both chains and locally owned neighborhood cafés. Most have outdoor seating to take advantage of Israel's Mediterranean climate. Tel Aviv is particularly well known for its café culture. Tea is also a widely consumed beverage and is served at cafés and drunk at home. Tea is prepared in many ways, from plain brewed Russian- and Turkish-style black tea with sugar, to tea with lemon or milk, to the Middle Eastern style with mint (nana), a common option in most establishments. Tea with rose water is also common. Limonana, a type of lemonade made from freshly squeezed lemons and mint, was invented in Israel in the early 1990s and has become a summer staple throughout the Middle East. Rimonana is similar to limonana, made of pomegranate juice and mint. Sahlab is a drinkable pudding once made of the powdered bulb of the orchid plant but today usually made with cornstarch. It is usually sold in markets or by street vendors, especially in the winter. It is topped with cinnamon and chopped pistachios. Malt beer, known as black beer (בִירָה שְחוֹרָה, bira shḥora), is a non-alcoholic beverage produced in Israel since pre-state times. Goldstar and Maccabi are Israeli beers. In recent years, some small boutique breweries have begun brewing new brands of beer, such as Dancing Camel, Negev, and Can'an. Arak is a Levantine alcoholic spirit (~40–63% Alc. Vol./~80–126 proof) from the anise drinks family, common in Israel and throughout the Middle East. 
It is a clear, colorless, unsweetened anise-flavored distilled alcoholic drink (also labeled as an apéritif). It is often served neat or mixed with ice and water, which turns the liquor a milky-white colour. It is sometimes also mixed with grapefruit juice to create a cocktail known as arak eshkoliyyot. Other spirits, brandies and liqueurs can be found across the country in many villages and towns. The vast majority of Israelis drink wine in moderation, and almost always at meals or social occasions. Israelis drink about 6.5 liters of wine per person per year, which is low compared to other wine-drinking Mediterranean countries, but the per capita amount has been increasing since the 1980s as Israeli production of high-quality wine grows to meet demand, especially of semi-dry and dry wines. In addition to Israeli wines, an increasing number of wines are imported from France, Italy, Australia, the United States, Chile and Argentina. Most of the wine produced and consumed from the 1880s, when the Carmel Winery was established, until the 1980s was sweet kosher wine; drier and semi-dry wines began to be produced and consumed after the introduction of the Golan Heights Winery’s first vintage. The winery was the first to focus on planting and making wines from Cabernet Sauvignon, Merlot, Sauvignon blanc, Chardonnay, Pinot noir, white Riesling and Gewürztraminer. These wines are kosher and have won silver and gold medals in international competitions. Israeli wine is now produced by hundreds of wineries, ranging in size from small boutique wineries in the villages to large companies producing over 10 million bottles per year, whose wines are also exported worldwide. Wines made from fruits other than grapes, such as fig, cherry, pomegranate, carob and date, are also common in the country. Foods variously prohibited in Jewish dietary laws (kashrut) and in Muslim dietary laws (halal) may also be included in pluralistic Israel's diverse cuisine. Although partly restricted by law, pork and shellfish are available at many non-kosher restaurants (only around a third of Israeli restaurants have a kosher license) and at stores spread widely across the country, including the Maadaney Mizra, Tiv Ta'am and Maadanei Mania supermarket chains. A modern Hebrew euphemism for pork is "white meat". Despite Jewish and Muslim religious restrictions on the consumption of pork, pigmeat consumption per capita was 2.7 kg (6.0 lb) in 2009. A 2008 survey reported that about half of Israeli Jews do not always observe kashrut. Israel's relatively relaxed attitude toward its religious dietary restrictions may be reflected by the fact that some of the Hebrew cookbooks of Yisrael Aharoni are published in two versions: kosher and non-kosher editions.
Eating out
In Israel, as in many other Middle Eastern countries, "street food" is a kind of fast food that is sometimes literally eaten while standing in the street, although in some cases there are places to sit down. The following are some foods that are usually eaten in this way: Falafel are fried balls or patties of spiced, mashed chickpeas or fava beans and are a common Middle-Eastern street food that have become identified with Israeli cuisine. Falafel is most often served in a pita, with pickles, tahina, hummus, cut vegetable salad and often harif, a hot sauce, the type used depending on the origin of the falafel maker. 
Variations include green falafel, which contains parsley and coriander, red falafel made with filfel chuma, yellow falafel made with turmeric, and falafel coated with sesame seeds. Shawarma (from çevirme, meaning "rotating" in Turkish) is usually made in Israel with turkey, with lamb fat added. The shawarma meat is sliced and marinated and then roasted on a huge rotating skewer. The cooked meat is shaved off and stuffed into a pita, with hummus and tahina, or with additional trimmings such as fresh or fried onion rings, French fries, salads and pickles. More upscale restaurant versions are served on an open flat bread, a lafa, with steak strips, flame-roasted eggplant and salads. Shakshouka, originally a workman's breakfast popularized by North-African Jews in Israel, is made simply of fried eggs in spicy tomato sauce, with other vegetable ingredients or sausage optional. Shakshouka is typically served in the same frying pan in which it is cooked, with thick slices of white bread to mop up the sauce, and a side of salad. Modern variations include a milder version made with spinach and feta without tomato sauce, and hot-chili shakshouka, a version that includes both sweet and hot peppers and coriander. Shakshouka in pita is called shakshouka be-pita. Jerusalem mixed grill, or me'urav Yerushalmi, consists of a mixed grill of chicken giblets and lamb with onion, garlic and spices. It is one of Jerusalem's most popular and profitable street foods. Although the origin of the dish is in Jerusalem, it is today common in cities and towns throughout Israel. Jerusalem bagels, unlike the round, boiled and baked bagels popularized by Ashkenazi Jews, are long and oblong-shaped, made from bread dough, covered in za’atar or sesame seeds, and soft, chewy and sweet. They have become a favorite snack for football match crowds, and are also served in hotels as well as at home. Malabi is a creamy pudding originating from Turkey, prepared with milk or cream and cornstarch. It is sold as a street food from carts or stalls, in disposable cups with thick sweet syrup and various crunchy toppings such as chopped pistachios or coconut. Its popularity has resulted in supermarkets selling it in plastic packages and restaurants serving richer and more sophisticated versions using various toppings and garnishes such as berries and fruit. Sahlab is a similar dessert made from the powdered tubers of orchids and milk. Sabich is a traditional sandwich that Mizrahi Jews introduced to Israel; it is sold at kiosks throughout the country, but especially in Ramat Gan, where it was first introduced. It consists of a pita filled with fried eggplant, hard-boiled egg, salad, tehina and pickles. A Tunisian sandwich is usually made from a baguette with various fillings that may include tuna, egg, pickled lemon, salad, and fried hot green pepper. There are thousands of restaurants, casual eateries, cafés and bars in Israel, offering a wide array of choices in food and culinary styles. Places to eat out that are distinctly Israeli include the following: Falafel stands or kiosks are common in every neighborhood. Falafel vendors compete to stand out from their competitors, offering extras such as chips, deep-fried eggplant, salads and pickles for the price of a single portion of falafel. A hummusia is an establishment that offers mainly hummus with a limited selection of extras such as tahina, hard-boiled egg, falafel, onion, pickles, lemon and garlic sauce and pita or taboon bread. 
Misada Mizrahit (literally "Eastern restaurant") refers to Mizrahi Jewish, Middle-Eastern or Arab restaurants. These popular and relatively inexpensive establishments often offer a selection of meze salads followed by grilled meat with a side of french fries, and a simple dessert such as chocolate mousse. Steakiyot are meat grills selling sit-down and take-away chicken, turkey or lamb as steak, shishlik, kebab and even Jerusalem mixed grill, all in pita or taboon bread.
Holiday cuisine
Friday night (eve of Shabbat) dinners are usually family and socially oriented meals. Along with family favorites, and varying to some extent according to ethnic background, traditional dishes are served, such as challah bread, chicken soup, salads, chicken or meat dishes, and cakes or fruits for dessert. Shabbat lunch is also an important social meal. Since antiquity, Jewish communities all over the world have devised meat casseroles that begin cooking before the lighting of candles that marks the commencement of Shabbat on Friday night, so as to comply with religious regulations for observing Shabbat. In modern Israel, this filling meal, in many variations, is still eaten on the Sabbath day, not only in religiously observant households, and is also served in some restaurants during the week. The basic ingredients are meat and beans or rice simmered overnight on a hotplate or blech, or placed in a slow oven. Ashkenazi cholent usually contains meat, potatoes, barley and beans, and sometimes kishke, and seasonings such as pepper and paprika. Sephardi hamin contains chicken or meat, rice, beans, garlic, sweet or regular potatoes, seasonings such as turmeric and cinnamon, and whole eggs in the shell known as haminados. Moroccan Jews prepare variations known as dafina or skhina (or s′hina) with meat, onion, marrow bones, potatoes, chickpeas, wheat berries, eggs and spices such as turmeric, cumin, paprika and pepper. Iraqi Jews prepare tebit, using chicken and rice. For desserts or informal gatherings on Shabbat, home bakers still bake a wide variety of cakes on Fridays to be enjoyed on the Sabbath, or they are purchased from bakeries or stores; these include sponge cake, citrus semolina cake, cinnamon or chocolate babkas, and fruit and nut cakes. Rosh Hashana, the Jewish New Year, is widely celebrated with festive family meals and symbolic foods. Sweetness is the main theme, and Rosh Hashana dinners typically begin with apples dipped in honey and end with honey cake. The challah is usually round, often studded with raisins and drizzled with honey, and other symbolic fruits and vegetables are eaten as an entree, such as pomegranates, carrots, leeks and beets. Fish dishes, symbolizing abundance, are served; for example, gefilte fish is traditional for Ashkenazim, while Moroccan Jews prepare the spicy fish dish, chraime. Honey cake (lekach) is often served as dessert, accompanied by tea or coffee. Dishes cooked with pomegranate juice are common during this period. The holiday of Hanukkah is marked by the consumption of traditional Hanukkah foods fried in oil in commemoration of the miracle in which a small quantity of oil sufficient for one day lasted eight days. The two most popular Hanukkah foods are potato pancakes, levivot, also known by the Yiddish name latkes, and jelly doughnuts, known as sufganiyot in Hebrew, pontshkes (in Yiddish) or bimuelos (in Ladino), both of which are deep-fried in oil. 
Hanukkah pancakes are made from a variety of ingredients, from the traditional potato or cheese, to more modern innovations, among them corn, spinach, zucchini and sweet potato. Bakeries in Israel have popularized many new types of fillings for sufganiyot besides the standard strawberry jelly filling, and these include chocolate, vanilla or cappuccino cream, and others. In recent years, downsized "mini" sufganiyot have also appeared due to concerns about calories. Tu BiShvat is a minor Jewish holiday, usually sometime in late January or early February, that marks the "New Year of the Trees". Customs include planting trees and eating dried fruits and nuts, especially figs, dates, raisins, carob, and almonds. Many Israelis, both religious and secular, celebrate with a kabbalistic-inspired Tu BiShvat seder that includes a feast of fruits and four cups of wine according to the ceremony presented in special haggadot modeled on the Haggadah of Passover for this purpose. The festival of Purim celebrates the deliverance of the Jewish people from the plot of Haman to annihilate them in the ancient Persian Achaemenid Empire, as described in the Book of Esther. It is a day of rejoicing and merriment, on which children, and many adults, wear costumes. It is customary to eat a festive meal, seudat Purim, in the late afternoon, often with wine as the prominent beverage, in keeping with the atmosphere of merry-making. Many people prepare packages of food that they give to neighbors, friends, family, and colleagues on Purim. These are called mishloach manot ("sending of portions"), and often include wine and baked goods, fruit and nuts, and sweets. The food most associated with Purim is called oznei haman ("Haman's ears"). These are three-cornered pastries filled most often with poppy seeds, but also with other fillings. The triangular shape may have been influenced by old illustrations of Haman, in which he wore a three-cornered hat. The week-long holiday of Passover in the spring commemorates the Exodus from Egypt, and in Israel is usually a time for visiting friends and relatives, travelling, and, on the first night of Passover, the traditional ritual dinner known as the Seder. Foods containing ḥametz—leavening or yeast—may not be eaten during Passover. This means bread, pastries and certain fermented beverages, such as beer, cannot be consumed. Ashkenazim also do not eat legumes, known as kitniyot. Over the centuries, Jewish cooks have developed dishes using alternative ingredients, and this characterizes Passover food in Israel today. Chicken soup with matzah dumplings (kneidlach) is often a starter for the Seder meal among Israelis of all ethnic backgrounds. Spring vegetables, such as asparagus and artichokes, often accompany the meal. Restaurants in Israel have come up with creative alternatives to ḥametz ingredients to create pasta, hamburger buns, pizza, and other fast foods in kosher-for-Passover versions by using potato starch and other non-standard ingredients. After Passover, the celebration of Mimouna takes place, a tradition brought to Israel by the Jewish communities of North Africa. In the evening, a feast of fruit, confectionery and pastries is set out for neighbors and visitors to enjoy. Most notably, the first leavened food after Passover, a thin crepe called a mofletta, is served, eaten with honey, syrup or jam. The occasion is celebrated the following day by outdoor picnics at which salads and barbecued meat feature prominently. 
In the early summer, the Jewish harvest festival of Shavuot is celebrated. Shavuot marks the peak of the new grain harvest and the ripening of the first fruits, and is a time when milk was historically most abundant. To celebrate this holiday, many types of dairy foods (milchig) are eaten. These include cheeses and yogurts, cheese-based pies and quiches called pashtidot, cheese blintzes, and cheesecake prepared with soft white cheese (gvina levana) or cream cheese.
Allegations of cultural appropriation
The labelling of foodstuffs originating outside Israel as "Israeli" has led some critics to raise the charge of cultural appropriation. A notable example that has been lamented by Palestinians, Lebanese and other Arab populations is falafel, which Israelis view as a national dish, despite its likely Egyptian origin. Though never a specifically Jewish dish, it has long been consumed by Syrian and Egyptian Jews, and was adopted into the diet of early Jewish immigrants to the Jewish communities of Ottoman Syria. As it is plant-based, Jewish dietary laws classify it as pareve and thus allow it to be eaten with both meat and dairy meals. Palestinian-Jordanian academic Joseph Massad has characterized the presentation of falafel and other dishes of Arab origin as Israeli in American and European restaurants as part of a broader trend of "colonial conquest". The Lebanese Industrialists' Association has raised assertions of copyright infringement against Israel concerning falafel.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-397] | [TOKENS: 12858]
Contents Minecraft Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. Originally created by Markus "Notch" Persson using the Java programming language, Jens "Jeb" Bergensten was handed control over the game's development following its full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios hold the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of third-person perspectives. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity, instead maintaining their voxel position in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. They may also freely craft helpful blocks—such as furnaces which can cook food and smelt ores, and torches that produce light—or exchange items with villagers (NPC) through trading emeralds for different goods and vice versa. 
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve or Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities) including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentionally and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. 
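The seed-based procedural generation described above can be illustrated with a minimal, hypothetical Java sketch. This is not Mojang's actual implementation; the class name, the helper method, and the mixing constants are invented for illustration. The point it demonstrates is only the general principle the article describes: because each chunk derives its random generator deterministically from the world seed and its own coordinates, the same seed always reproduces the same terrain, while a different seed yields a different world.

```java
import java.util.Random;

// Illustrative sketch of seed-driven, chunk-by-chunk terrain generation
// (hypothetical code, not Minecraft's real generator).
public class SeededTerrainSketch {

    private final long worldSeed;

    public SeededTerrainSketch(long worldSeed) {
        this.worldSeed = worldSeed;
    }

    // Mix the world seed with the chunk coordinates so that each chunk gets
    // its own deterministic seed, independent of the order chunks are visited.
    private long chunkSeed(int chunkX, int chunkZ) {
        return worldSeed ^ (chunkX * 341873128712L + chunkZ * 132897987541L);
    }

    // Produce a toy 16x16 height map for one chunk; a real generator would
    // layer noise functions, biomes, caves and structures on top of this idea.
    public int[][] generateChunk(int chunkX, int chunkZ) {
        Random random = new Random(chunkSeed(chunkX, chunkZ));
        int[][] heights = new int[16][16];
        for (int x = 0; x < 16; x++) {
            for (int z = 0; z < 16; z++) {
                heights[x][z] = 60 + random.nextInt(8); // flat-ish toy terrain
            }
        }
        return heights;
    }

    public static void main(String[] args) {
        SeededTerrainSketch world = new SeededTerrainSketch(8675309L);
        // The same seed and chunk coordinates always yield identical terrain.
        System.out.println(world.generateChunk(0, 0)[0][0]
                == world.generateChunk(0, 0)[0][0]); // prints true
    }
}
```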
The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal, which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough, which takes about nine minutes to scroll past, is the game's only narrative text, and the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar, or continuously on peaceful difficulty. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then re-spawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn after 5 minutes. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, animal breeding, and cooking food. Experience can then be spent on enchanting tools, armor and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update, and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to let players experience the map as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance. 
Multiplayer in Minecraft enables multiple players to interact and communicate with each other on a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Bedrock Edition Realms server owners can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to come later in 2017, along with support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018. The modding community consists of fans, users and third-party programmers. Using a variety of application programming interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods, ranging from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add elements from other video games and media to the game. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements including textures and sounds.
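The username and IP restrictions that server operators configure, as described above, amount to a simple access-control check. The Python sketch below is a minimal illustration under assumed data (the listed names and address are hypothetical), not the actual server implementation; on the Java Edition server such rules are typically managed through the whitelist and ban-list files and the corresponding console commands.

```python
# Minimal sketch of operator-configured access control: an optional username
# whitelist plus an IP ban list. Names and addresses here are hypothetical.
ALLOWED_PLAYERS = {"alice", "bob"}     # hypothetical whitelist entries
BANNED_IPS = {"203.0.113.7"}           # hypothetical banned address
WHITELIST_ENABLED = True

def may_join(username: str, ip: str) -> bool:
    """Return True if a connecting player passes the operator's restrictions."""
    if ip in BANNED_IPS:
        return False
    if WHITELIST_ENABLED and username.lower() not in ALLOWED_PLAYERS:
        return False
    return True

assert may_join("Alice", "198.51.100.20")
assert not may_join("mallory", "203.0.113.7")
```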
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013, and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another based on Fallout was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue, and released a statement stating that "the code would not be run or read by the game itself", and would run only when the image containing the skin itself was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue. Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. 
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the visual style of gameplay, including bringing back the first-person mode, the "blocky" visual style and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson decided to release a full 1.0 version, the second part of the "Adventure Update", on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the company behind the Microsoft Windows operating system and the Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies including Activision Blizzard and Electronic Arts. The deal with Microsoft was completed on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions usually received annual major updates, free to players who have purchased the game, each primarily centered around a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically-based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs. It cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features, such as dynamic shadows, screen space reflections, volumetric fog, and bloom, without the need for RTX-capable hardware. Vibrant Visuals was released as a part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned for release on Java Edition at a later date. Development of the original edition of Minecraft, then known as Cave Game and now known as the Java Edition, began in May 2009;[k] this earliest phase ended on 13 May, when Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases, dubbed Survival Test, Indev, and Infdev, were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though the acquisition later became controversial, and its legitimacy was questioned due to CraftBukkit's open-source nature and licensing under the GNU General Public License and Lesser General Public License. In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011.
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay to other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One, and was renamed to the Bedrock Edition. The console versions of Minecraft debuted with the Xbox 360 edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, and a physical copy available on a later date. The game is compatible only with the New Nintendo 3DS or New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems. On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. The Bedrock Edition released a native version for PlayStation 5 on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well. An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows. 
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in Autumn 2018. It was released to the App Store on 6 September 2018. On 27 March 2019, it was announced that the Education Edition would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available for Chromebooks compatible with the Google Play Store. The full game was released to the Google Play Store for Chromebooks on 7 August 2020. On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023. The Windows version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added features such as world templates and add-on packs to this version of Minecraft. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other version. Both game versions would otherwise remain separate. Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after a character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive. Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the processes for the game, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced on creating the in-game sound for grass blocks, stating "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborates, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of the sound design decisions by Rosenfeld were done accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose the music of Minecraft, Rosenfeld used the package from Ableton Live, along with several additional plug-ins. Speaking on them, Rosenfeld said "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music that was added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015. 
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", introducing pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine remaining as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the game's mini games from the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record as of then had tallied up to be longer than the previous two albums combined, which in total clocks in at over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has since not seen release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether or not there was still a third volume of his music intended for release. Rosenfeld responded, saying, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know." Reception Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment in Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has been generally received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed about the troublesome steps needed to set up multiplayer servers, calling it a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version. 
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they acclaimed the port's addition of a tutorial and in-game tips and crafting recipes, saying that they make the game more user-friendly. The Xbox One Edition was one of the best received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best received port to date, being praised for having 36 times larger worlds than the PlayStation 3 edition and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds. Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content. Minecraft surpassed over a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and has never been commercially advertised except through word of mouth, and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game, and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014[update], the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies. The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke the Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day. 
As of 4 April 2014[update], the Xbox 360 version has sold 12 million copies. In addition, Minecraft: Pocket Edition has reached a figure of 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft were sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version has sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms with over 126 million monthly active players. By April 2021, the number of active monthly users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth best game of the year as well as the eighth best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At Game Developers Choice Awards 2011, Minecraft won awards in the categories for Best Debut Game, Best Downloadable Game and Innovation Award, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category, and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game Of The Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list. 
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award for PC and Console. Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, citing emails he received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Notch's decision to sell Mojang. In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and how account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated. In June 2022, Mojang added a player-reporting feature in Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue to what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start a hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four. The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts; initially, the winning mob was to be implemented in a future update, while the losing mobs were scrapped, though after the first mob vote this was changed, and losing mobs would now have a chance to come to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning the vote. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced the Mob Vote would be retired. Cultural impact In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model to draw in sales prior to its full release version to help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development. Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos began to gain influence on YouTube, often made by commentators. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game would go on to be a prominent fixture within YouTube's gaming scene during the entire 2010s; in 2014, it was the second-most searched term on the entire platform. 
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has also created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch by foot on an older version of the game. YouTube announced that on 14 December 2021 that the total amount of Minecraft-related views on the website had exceeded one trillion. Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character with a moveset including references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million in the box office in the first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age. The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering using Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap. In September 2012, Mojang began the Block by Block project in cooperation with UN Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood. 
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed Minecraft building community, FyreUK, to help render the environments into Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having a training in architecture. The ideas presented by the citizens were a template for political decisions. In April 2014, the Danish Geodata Agency generated all of Denmark in fullscale in Minecraft based on their own geodata. This is possible because Denmark is one of the flattest countries with the highest point at 171 meters (ranking as the country with the 30th smallest elevation span), where the limit in default Minecraft was around 192 meters above in-game sea level when the project was completed. Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people. Despite its unpredictable nature, Minecraft speedrunning, where players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way. Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone blocks enabled the construction of functional virtual machines such as a hard drive and an 8-bit computer. Mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources. 
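The redstone-based virtual machines mentioned above are possible because redstone supplies an inverting primitive, the redstone torch, from which arbitrary Boolean logic can be composed. The Python sketch below is an analogy rather than game code: it treats a torch as a NOT gate, derives NOR, OR, AND, and XOR from it, and combines them into a half adder, the kind of building block used in player-built adders and 8-bit computers.

```python
# Analogy, not game code: a redstone torch turns off when its input is powered,
# i.e. it behaves like an inverter. Composing inverters yields every other gate.
def torch(powered_input: bool) -> bool:        # NOT
    return not powered_input

def nor(a: bool, b: bool) -> bool:             # torch driven by two inputs on one wire
    return torch(a or b)

def or_gate(a: bool, b: bool) -> bool:
    return torch(nor(a, b))

def and_gate(a: bool, b: bool) -> bool:
    return nor(torch(a), torch(b))

def xor_gate(a: bool, b: bool) -> bool:
    return and_gate(or_gate(a, b), torch(and_gate(a, b)))

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Return (sum, carry) for one binary digit, the basic unit of an adder."""
    return xor_gate(a, b), and_gate(a, b)

assert half_adder(True, True) == (False, True)   # 1 + 1 = binary 10
```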
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft." Following the initial surge in popularity of Minecraft in 2010, other video games were criticised for having various similarities to Minecraft, and some were described as being "clones", often due to a direct inspiration from Minecraft, or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and their Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). Despite this, the fears of fans were unfounded, with official Minecraft releases on Nintendo consoles eventually resuming. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team. In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging the game was infringing on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright claiming service. The DMCA was later withdrawn. Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in-person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded to "Minecraft Live", included the mob/biome votes, and announcements of new game updates. 
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_note-Colinge2016-114] | [TOKENS: 10628]
Contents Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the "modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to approximately c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630, by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. 
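The slide rule's principle admits a brief worked example: because log(ab) = log a + log b, multiplying two numbers reduces to adding two lengths marked off on logarithmic scales. The short Python check below illustrates the arithmetic only; it is a numerical illustration, not a model of the physical instrument.

```python
# Worked illustration of the slide rule's principle: since
# log(a*b) = log(a) + log(b), multiplication reduces to adding two lengths
# measured on logarithmic scales.
import math

a, b = 2.0, 3.0
length = math.log10(a) + math.log10(b)   # slide the scales: add the two lengths
product = 10 ** length                   # read the result off the scale
assert abs(product - a * b) < 1e-9       # 2 x 3 = 6
```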
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could solve real and complex roots of polynomials, designs which were published in 1901 by the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine, he announced his invention in 1822 in a paper to the Royal Astronomical Society titled "Note on the application of machinery to the computation of astronomical and mathematical tables". The difference engine was also designed to aid in navigational calculations; in 1833 he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design for a machine capable of calculating formulas like a^x(y − z)^2 for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (slide rule) and aircraft (control systems).[citation needed] Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as the first company with the sole purpose of developing computers in Berlin. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (The Mk I was converted to a Mk II making ten machines in total). 
Colossus Mark I contained 1,500 thermionic valves (tubes), but the Mark II, with 2,400 valves, was both five times faster and simpler to operate than the Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and take square roots. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and containing over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called the "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs. Changing a machine's function required re-wiring and re-structuring it. With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947 the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, and much lower power consumption and higher density than bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers. 
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl Frosch and Lincoln Derick's work on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory.
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing devices on the market. They are powered by systems on a chip (SoCs), which are complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways. A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphic cards, sound cards, memory (RAM), motherboard, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits. Input devices are the means by which the operations of a computer are controlled and it is provided with data; examples include keyboards, mice, and joysticks. Output devices are the means by which a computer provides the results of its calculations in a human-accessible form.
Examples include monitors and printers. The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows (this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU): read the instruction from the memory cell indicated by the program counter; decode it into control signals; fetch any data the instruction requires; have the ALU or other hardware carry out the operation; write back the result; and advance the program counter to point to the next instruction. Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometry functions such as sine, cosine, etc., and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number.
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256); either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, therefore the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer.
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics.[citation needed] Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn. Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks.
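The pieces described in this section fit together in a simple repeating pattern: the control unit fetches the instruction that the program counter points at, decodes it, has the ALU or memory carry it out, and moves on, with jump instructions rewriting the program counter to produce loops and branches. The C program below is a toy model of such a stored-program machine, written only to illustrate the idea; its opcodes, two-cell instruction format and tiny counting program are invented for this sketch and do not correspond to any real instruction set.

#include <stdio.h>

/* Toy stored-program machine: memory is a row of numbered cells, the
 * program counter (pc) names the cell holding the next instruction, and
 * the main loop plays the role of the control unit's fetch-decode-execute
 * cycle. Opcodes and the little program are invented for illustration. */
enum { HALT, LOAD, ADDI, STORE, JNZ };

int main(void) {
    int mem[32] = {
        /* program: each instruction is an opcode cell plus an operand cell */
        LOAD,  21,   /*  0: acc = mem[21]  (the running sum)        */
        ADDI,   5,   /*  2: acc = acc + 5                           */
        STORE, 21,   /*  4: mem[21] = acc                           */
        LOAD,  20,   /*  6: acc = mem[20]  (the loop counter)       */
        ADDI,  -1,   /*  8: acc = acc - 1                           */
        STORE, 20,   /* 10: mem[20] = acc                           */
        JNZ,    0,   /* 12: if acc != 0, jump back to cell 0        */
        HALT,   0,   /* 14: stop                                    */
    };
    mem[20] = 4;     /* data: repeat the loop four times            */
    mem[21] = 0;     /* data: the sum starts at zero                */

    int pc = 0, acc = 0, running = 1;
    while (running) {
        int opcode  = mem[pc];          /* fetch the instruction          */
        int operand = mem[pc + 1];
        pc += 2;                        /* advance the program counter    */
        switch (opcode) {               /* decode and execute             */
        case LOAD:  acc = mem[operand];          break;
        case ADDI:  acc += operand;              break;
        case STORE: mem[operand] = acc;          break;
        case JNZ:   if (acc != 0) pc = operand;  break;
        case HALT:  running = 0;                 break;
        }
    }
    printf("result in cell 21: %d\n", mem[21]);  /* prints 20 */
    return 0;
}

Because the program itself sits in the same array of cells as the data, changing the machine's behaviour is only a matter of writing different numbers into memory, which is the essence of the stored-program idea discussed above.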
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. 
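For instance, the task of adding together all of the numbers from 1 to 1,000 can be expressed in only a handful of instructions. The sketch below is written in the MIPS assembly language; the particular registers, labels and instruction choices are illustrative rather than taken from any specific listing:

        li    $t0, 0             # running sum = 0
        li    $t1, 1             # current number n = 1
loop:   add   $t0, $t0, $t1      # sum = sum + n
        addi  $t1, $t1, 1        # n = n + 1
        slti  $t2, $t1, 1001     # $t2 = 1 while n is still <= 1000
        bne   $t2, $zero, loop   # if so, repeat the loop
        # $t0 now holds 1 + 2 + ... + 1000 = 500500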
Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake, and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages—some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU).
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of line of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited for having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S. 
military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity. In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data (a brief sketch of this idea appears below). The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans. Professions and organizations As the use of computers has spread throughout society, there are an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature.
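The phrase "parameters that are adjusted throughout training" can be made concrete with a deliberately tiny example. The C sketch below fits a single parameter w so that w times x approximates a few example data points by gradient descent; the data, learning rate and iteration count are invented for the illustration and have nothing to do with how production machine learning systems are built.

#include <stdio.h>

/* Minimal illustration of a parameter adjusted during training:
 * fit y ~ w * x to four made-up points by gradient descent. */
int main(void) {
    const double xs[] = {1.0, 2.0, 3.0, 4.0};
    const double ys[] = {2.1, 3.9, 6.2, 8.0};   /* roughly y = 2x */
    const int n = 4;
    double w = 0.0;                      /* the single trainable parameter */
    const double learning_rate = 0.01;

    for (int epoch = 0; epoch < 500; epoch++) {
        double grad = 0.0;               /* gradient of the mean squared error */
        for (int i = 0; i < n; i++) {
            double err = w * xs[i] - ys[i];
            grad += 2.0 * err * xs[i] / n;
        }
        w -= learning_rate * grad;       /* the training step: adjust the parameter */
    }
    printf("learned w = %f\n", w);       /* ends up close to 2 */
    return 0;
}

Real models have billions of parameters rather than one and compute their gradients in parallel on hardware such as GPUs, but the underlying loop of measuring error and nudging parameters is broadly the same.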
========================================
[SOURCE: https://en.wikipedia.org/wiki/Trail_of_Tears] | [TOKENS: 11117]
Contents Trail of Tears The Trail of Tears was the forced displacement and ethnic cleansing of about 60,000 Native Americans of the "Five Civilized Tribes", including their black slaves, between 1830 and 1850 by the United States government. As part of Indian removal, members of the Cherokee, Muscogee, Seminole, Chickasaw, and Choctaw nations were forcibly removed from their ancestral homelands in the Southeastern United States to newly designated Indian Territory west of the Mississippi River after the passage of the Indian Removal Act in 1830. The Cherokee removal in 1838 was the last forced removal east of the Mississippi and was brought on by the discovery of gold near Dahlonega, Georgia, in 1828, resulting in the Georgia Gold Rush. The relocated peoples suffered from exposure, disease, and starvation while en route to their newly designated Indian reserve. Thousands died from disease before reaching their destinations or shortly after. A variety of scholars have classified the Trail of Tears as an example of the genocide of Native Americans; others categorize it as ethnic cleansing.[b] Overview In 1830, a group of Indian nations collectively referred to as the "Five Civilized Tribes" (the Cherokee, Chickasaw, Choctaw, Muscogee, and Seminole nations), were living autonomously in what would later be termed the American Deep South. The process of cultural transformation from their traditional way of life towards a white American way of life as proposed by George Washington and Henry Knox was gaining momentum, especially among the Cherokee and Choctaw. American settlers had been pressuring the federal government to remove Indians from the Southeast; many settlers were encroaching on Indian lands, while others wanted more land made available to the settlers. Although the effort was vehemently opposed by some, including U.S. Congressman Davy Crockett of Tennessee, President Andrew Jackson was able to gain Congressional passage of the Indian Removal Act of 1830, which authorized the government to extinguish any Indian title to land claims in the Southeast. In 1831, the Choctaw became the first Nation to be removed, and their removal served as the model for all future relocations. After two wars, many Seminoles were removed in 1832. The Creek removal followed in 1834, the Chickasaw in 1837, and lastly the Cherokee in 1838. Some managed to evade the removals, however, and remained in their ancestral homelands; some Choctaw still reside in Mississippi, Creek in Alabama and Florida, Cherokee in North Carolina, and Seminole in Florida. A small group of Seminole, fewer than 500, evaded forced removal; the modern Seminole Nation of Florida is descended from these individuals. A number of non-Indians who lived with the nations, including over 4,000 slaves and others of African descent such as spouses or Freedmen, also accompanied the Indians on the trek westward. By 1837, 46,000 Indians from the southeastern states had been removed from their homelands, thereby opening 25 million acres (100,000 km2) for white settlement. When the "Five Tribes" arrived in Indian Territory, "they followed their physical appropriation of Plains Indians' land with an erasure of their predecessor's history", and "perpetuated the idea that they had found an undeveloped 'wilderness" when they arrived" in an attempt to appeal to white American values by participating in the settler colonial process themselves. 
Other Indian nations, such as the Quapaws and Osages had moved to Indian Territory before the "Five Tribes" and saw them as intruders. Before 1838, the fixed boundaries of these autonomous Indian nations, comprising large areas of the United States, were subject to continual cession and annexation, in part due to pressure from squatters and the threat of military force in the newly declared U.S. territories—federally administered regions whose boundaries supervened upon the Indian treaty claims. As these territories became U.S. states, state governments sought to dissolve the boundaries of the Indian nations within their borders, which were independent of state jurisdiction, and to expropriate the land therein. These pressures were exacerbated by U.S. population growth and the expansion of slavery in the South, with the rapid development of cotton cultivation in the uplands after the invention of the cotton gin by Eli Whitney. Many people of the southeastern Indian nations had become economically integrated into the economy of the region. This included the plantation economy and the possession of slaves, who were also forcibly relocated during the removal. Prior to Jackson's presidency, removal policy was already in place and justified by the myth of the "Vanishing Indian". Historian Jeffrey Ostler explains that "Scholars have exposed how the discourse of the vanishing Indian was an ideology that made declining Indigenous American populations seem to be an inevitable consequence of natural processes and so allowed Americans to evade moral responsibility for their destructive choices". Despite the common association of Andrew Jackson and the Trail of Tears, ideas for Removal began prior to Jackson's presidency. Ostler explains, "A singular focus on Jackson obscures the fact that he did not invent the idea of removal...Months after the passage of the Removal Act, Jackson described the legislation as the 'happy consummation' of a policy 'pursued for nearly 30 years'". James Fenimore Cooper was also a key component of the maintenance of the "vanishing Indian" myth. This vanishing narrative can be seen as existing prior to the Trail of Tears through Cooper's novel The Last of the Mohicans. Scholar and author Roxanne Dunbar-Ortiz shows that: Cooper has the last of the 'noble' and 'pure' Natives die off as nature would have it, with the 'last Mohican' handing the continent over to Hawkeye, the nativized settler, his adopted son ... Cooper had much to do with creating the US origin myth to which generations of historians have dedicated themselves, fortifying what historian Francis Jennings has described as "exclusion from the process of formation of American society and culture". Although Jackson was not the sole, or original, architect of Removal policy, his contributions were influential in its trajectory. Jackson's support for the removal of the Indians began at least a decade before his presidency. Indian removal was Jackson's top legislative priority upon taking office. After being elected president, he wrote in his first address to Congress: "The emigration should be voluntary, for it would be as cruel as unjust to compel the aborigines to abandon the graves of their fathers and seek a home in a distant land. But they should be distinctly informed that if they remain within the limits of the States they must be subject to their laws. 
In return for their obedience as individuals they will without doubt be protected in the enjoyment of those possessions which they have improved by their industry". The prioritization of American Indian removal and his violent past created a sense of restlessness among U.S. territories. During his presidency, "the United States made eighty-six treaties with twenty-six American Indian nations between New York and the Mississippi, all of them forcing land cessions, including removals". In a speech regarding Indian removal, Jackson said, It will separate the Indians from immediate contact with settlements of whites; free them from the power of the States; enable them to pursue happiness in their own way and under their own rude institutions; will retard the progress of decay, which is lessening their numbers, and perhaps cause them gradually, under the protection of the Government and through the influence of good counsels, to cast off their savage habits and become an interesting, civilized, and Christian community. The removals, conducted under both Presidents Jackson and Van Buren, followed the Indian Removal Act of 1830, which provided the president with powers to exchange land with Indian nations and provide infrastructure improvements on the existing lands. The law also gave the president power to pay for transportation costs to the West, should the nations willingly choose to relocate. The law did not, however, allow the president to force Indian nations to move west without a mutually agreed-upon treaty. Referring to the Indian Removal Act, Martin Van Buren, Jackson's vice president and successor, is quoted as saying "There was no measure, in the whole course of [Jackson's] administration, of which he was more exclusively the author than this." According to historian Roxanne Dunbar-Ortiz, Jackson's intentions were outwardly violent. Dunbar-Ortiz claims that Jackson believed in "bleeding enemies to give them their senses" on his quest to "serve the goal of U.S. expansion". According to her, American Indians presented an obstacle to the fulfillment of Manifest Destiny, in his mind.[page needed] Throughout his military career, according to historian Amy H. Sturgis, "Jackson earned and emphasized his reputation as an 'Indian fighter', a man who believed creating fear in the native population was more desirable than cultivating friendship". In a message to Congress on the eve of Indian Removal, December 6, 1830, Jackson wrote that removal "will relieve the whole State of Mississippi and the western part of Alabama of Indian occupancy, and enable those States to advance rapidly in population, wealth, and power. It will separate the Indians from immediate contact with settlements of whites." In this way, Sturgis has argued that Jackson demarcated the Indian population as an "obstacle" to national success. Sturgis writes that Jackson's removal policies were met with pushback from respectable social figures and that "many leaders of Jacksonian reform movements were particularly disturbed by U.S policy toward American Indians". Among these opponents were women's advocate and founder of the American Woman's Educational Association Catherine Beecher and politician Davy Crockett. Historian Francis Paul Prucha, on the other hand, writes that these assessments were put forward by Jackson's political opponents and that Jackson had benevolent intentions. According to him, Jackson's critics have been too harsh, if not wrong. 
He states that Jackson never developed a doctrinaire anti-Indian attitude and that his dominant goal was to preserve the security and well-being of the United States and its Indian and white inhabitants. Corroborating Prucha's interpretation, historian Robert V. Remini argues that Jackson never intended the "monstrous result" of his policy. Remini argues further that had Jackson not orchestrated the removal of the "Five Civilized Tribes" from their ancestral homelands, they would have been totally wiped out. Jackson chose to continue with Indian removal and negotiated the Treaty of New Echota on December 29, 1835, which granted the Cherokee two years to move to Indian Territory (modern-day Oklahoma). The Chickasaws and Choctaws had readily accepted and signed treaties with the U.S. government, while the Creeks did so under coercion. The negotiation of the Treaty of New Echota was largely encouraged by Jackson, and it was signed by a minority Cherokee political faction, the Treaty Party, led by Cherokee leader Elias Boudinot. However, the treaty was opposed by most of the Cherokee people, as it was not approved by the Cherokee National Council, and it was not signed by Principal Chief John Ross. The Cherokee National Council submitted a petition, signed by thousands of Cherokee citizens, urging Congress to void the agreement in February 1836. Despite this opposition, the Senate ratified the treaty in March 1836, and the Treaty of New Echota thus became the legal basis for the Trail of Tears. Only a fraction of the Cherokees left voluntarily. The U.S. government, with assistance from state militias, forced most of the remaining Cherokees west in 1838.[full citation needed] The Cherokees were temporarily held in camps in eastern Tennessee. In November, the Cherokee were broken into groups of around 1,000 each and began the journey west. They endured heavy rains, snow, and freezing temperatures. When the Cherokee negotiated the Treaty of New Echota, they exchanged all their land east of the Mississippi for land in modern Oklahoma and a $5 million payment from the federal government. Many Cherokee felt betrayed that their leadership accepted the deal, and over 16,000 Cherokee signed a petition to prevent the passage of the treaty. By 1840, tens of thousands of Cherokee and members of other Indian nations had been removed from their land east of the Mississippi River. The Creek, Choctaw, Seminole, and Chickasaw were also relocated under the Indian Removal Act of 1830. Legal background The establishment of the Indian Territory and the extinguishing of Indian land claims east of the Mississippi by the Indian Removal Act anticipated the U.S. Indian reservation system, which was imposed on remaining Indian lands later in the 19th century.[citation needed] The statutory argument for Indian sovereignty persisted until the U.S. Supreme Court ruled in Cherokee Nation v. Georgia (1831) that the Cherokee were not a sovereign and independent nation, and therefore not entitled to a hearing before the court. In the years after the Indian Removal Act, the Cherokee filed several lawsuits regarding conflicts with the state of Georgia. Some of these cases reached the Supreme Court, the most influential being Worcester v.
Georgia (1832).[citation needed] Samuel Worcester and a group of white Christian missionaries were convicted under Georgia law in 1831 for living in Cherokee territory in the state of Georgia without a license.[citation needed] Worcester was sentenced to prison for four years and appealed the ruling, arguing that this sentence violated treaties made between Indian nations and the United States federal government by imposing state laws on Cherokee lands. The Court ruled in Worcester's favor, declaring that the Cherokee Nation was subject only to federal law and that the Supremacy Clause barred legislative interference by the state of Georgia. Chief Justice Marshall argued, "The Cherokee nation, then, is a distinct community occupying its own territory in which the laws of Georgia can have no force. The whole intercourse between the United States and this Nation, is, by our constitution and laws, vested in the government of the United States." The Court did not ask federal marshals to carry out the decision. Worcester thus imposed no obligations on Jackson; there was nothing for him to enforce, although Jackson's political enemies conspired to find evidence, to be used in the forthcoming political election, to claim that he would refuse to enforce the Worcester decision. He feared that enforcement would lead to open warfare between federal troops and the Georgia militia, which would compound the ongoing crisis in South Carolina and lead to a broader civil war. Instead, he vigorously negotiated a land exchange treaty with the Cherokee. After this, Jackson's political opponents Henry Clay and John Quincy Adams, who supported the Worcester decision, became outraged by Jackson's alleged refusal to uphold Cherokee claims against the state of Georgia. Author and political activist Ralph Waldo Emerson wrote an account of Cherokee assimilation into American culture, declaring his support of the Worcester decision. At the time, members of individual Indian nations were not considered United States citizens. While citizenship tests existed for Indians living in newly annexed areas before and after forced relocation, individual U.S. states did not recognize the Indian nations' land claims, only individual title under State law, and distinguished between the rights of white and non-white citizens, who often had limited standing in court; and Indian removal was carried out under U.S. military jurisdiction, often by state militias. As a result, individual Indians who could prove U.S. citizenship were nevertheless displaced from newly annexed areas. Choctaw removal The Choctaw nation resided in large portions of what are now the U.S. states of Alabama, Mississippi, and Louisiana. After a series of treaties starting in 1801, the Choctaw nation was reduced to 11 million acres (45,000 km2). The Treaty of Dancing Rabbit Creek ceded the remaining country to the United States and was ratified in early 1831. The removals were only agreed to after a provision in the Treaty of Dancing Rabbit Creek allowed some Choctaw to remain. The Choctaws were the first to sign a removal treaty presented by the federal government. President Jackson wanted strong negotiations with the Choctaws in Mississippi, and the Choctaws seemed much more cooperative than Andrew Jackson had imagined. The treaty provided that the United States would bear the expense of moving their homes and that they had to be removed within two and a half years of the treaty's signing. The chief of the Choctaw nation, George W.
Harkins, wrote to the citizens of the United States before the removals were to commence: It is with considerable diffidence that I attempt to address the American people, knowing and feeling sensibly my incompetency; and believing that your highly and well-improved minds would not be well entertained by the address of a Choctaw. But having determined to emigrate west of the Mississippi river this fall, I have thought proper in bidding you farewell to make a few remarks expressive of my views, and the feelings that actuate me on the subject of our removal... We as Choctaws rather chose to suffer and be free, than live under the degrading influence of laws, which our voice could not be heard in their formation. — George W. Harkins, George W. Harkins to the American People United States Secretary of War Lewis Cass appointed George Gaines to manage the removals. Gaines decided to remove Choctaws in three phases starting in November 1831 and ending in 1833. The first groups met at Memphis and Vicksburg, where a harsh winter battered the emigrants with flash floods, sleet, and snow. Initially, the Choctaws were to be transported by wagon but floods halted them. With food running out, the residents of Vicksburg and Memphis were concerned. Five steamboats (the Walter Scott, the Brandywine, the Reindeer, the Talma, and the Cleopatra) would ferry Choctaws to their river-based destinations. The Memphis group traveled up the Arkansas for about 60 miles (100 km) to Arkansas Post. There the temperature stayed below freezing for almost a week with the rivers clogged with ice, so there could be no travel for weeks. Food rationing consisted of a handful of boiled corn, one turnip, and two cups of heated water per day. Forty government wagons were sent to Arkansas Post to transport them to Little Rock. When they reached Little Rock, a Choctaw chief referred to their trek as a "trail of tears and death". The Vicksburg group was led by an incompetent guide and was lost in the Lake Providence swamps.[citation needed] Alexis de Tocqueville, the French philosopher, witnessed the Choctaw removals while in Memphis, Tennessee, in 1831: In the whole scene there was an air of ruin and destruction, something which betrayed a final and irrevocable adieu; one couldn't watch without feeling one's heart wrung. The Indians were tranquil but somber and taciturn. There was one who could speak English and of whom I asked why the Chactas were leaving their country. "To be free," he answered, could never get any other reason out of him. We ... watch the expulsion ... of one of the most celebrated and ancient American peoples. — Alexis de Tocqueville, Democracy in America Nearly 17,000 Choctaws made the move to what would be called Indian Territory and then later Oklahoma. About 2,500–6,000 died along the trail of tears. Approximately 5,000–6,000 Choctaws remained in Mississippi in 1831 after the initial removal efforts. The Choctaws who chose to remain in newly formed Mississippi were subject to legal conflict, harassment, and intimidation. The Choctaws "have had our habitations torn down and burned, our fences destroyed, cattle turned into our fields and we ourselves have been scourged, manacled, fettered and otherwise personally abused, until by such treatment some of our best men have died". The Choctaws in Mississippi were later reformed as the Mississippi Band of Choctaw Indians and the removed Choctaws became the Choctaw Nation of Oklahoma.[citation needed] Seminole resistance The U.S. 
acquired Florida from Spain via the Adams–Onís Treaty and took possession in 1821. In 1832 the Seminoles were called to a meeting at Payne's Landing on the Ocklawaha River. The Treaty of Payne's Landing called for the Seminoles to move west, if the land were found to be suitable. They were to be settled on the Creek reservation and become part of the Creek nation, who considered them deserters[full citation needed]; some of the Seminoles were descended from Creek bands, but others came from other Indian nations. Those in the nation who had once been members of Creek bands did not wish to move west, certain that they would meet death for having left the main band of Creek Indians. The delegation of seven chiefs who were to inspect the new reservation did not leave Florida until October 1832. After touring the area for several months and conferring with the Creeks who had already settled there, the seven chiefs signed a statement on March 28, 1833, that the new land was acceptable. Upon their return to Florida, however, most of the chiefs renounced the statement, claiming that they had not signed it, or that they had been forced to sign it, and in any case, that they did not have the power to decide for all the Indian nations and bands that resided on the reservation. The villages in the area of the Apalachicola River were more easily persuaded, however, and went west in 1834.[full citation needed] On December 28, 1835, a group of Seminoles and blacks ambushed a U.S. Army company marching from Fort Brooke in Tampa to Fort King in Ocala, killing all but three of the 110 army troops. This came to be known as the Dade Massacre.[citation needed] As the realization that the Seminoles would resist relocation sank in, Florida began preparing for war. The St. Augustine Militia asked the War Department for the loan of 500 muskets. Five hundred volunteers were mobilized under Brig. Gen. Richard K. Call. Indian war parties raided farms and settlements, and families fled to forts, large towns, or out of the territory altogether. A war party led by Osceola captured a Florida militia supply train, killing eight of its guards and wounding six others. Most of the goods taken were recovered by the militia in another fight a few days later. Sugar plantations along the Atlantic coast south of St. Augustine were destroyed, with many of the slaves on the plantations joining the Seminoles.[full citation needed] Other war chiefs such as Halleck Tustenuggee, Jumper, and Black Seminoles Abraham and John Horse continued the Seminole resistance against the army. The war ended, after a full decade of fighting, in 1842. The U.S. government is estimated to have spent about $20,000,000 on the war ($667,241,379 today). Many Indians were forcibly exiled to Creek lands west of the Mississippi; others retreated into the Everglades. In the end, the government gave up trying to subjugate the Seminole in their Everglades redoubts and left fewer than 500 Seminoles in peace. Other scholars state that at least several hundred Seminoles remained in the Everglades after the Seminole Wars. As a result of the Seminole Wars, the surviving Seminole band of the Everglades claims to be the only federally recognized Indian nation which never relinquished sovereignty or signed a peace treaty with the United States. In general, the American people tended to view the Indian resistance as unwarranted.
An article published in the Virginia Enquirer on January 26, 1836, titled "Hostilities of the Seminoles", assigned all the blame for the violence that came from the Seminoles' resistance to the Seminoles themselves. The article accuses the Indians of not staying true to their word—the promises they had supposedly made in the treaties and negotiations under the Indian Removal Act. Creek dissolution After the War of 1812, some Muscogee leaders such as William McIntosh and Chief Shelocta signed treaties that ceded more land to Georgia. The 1814 signing of the Treaty of Fort Jackson signaled the end for the Creek Nation and for all Indians in the South. Friendly Creek leaders, like Shelocta and Big Warrior, addressed Sharp Knife (the Indian nickname for Andrew Jackson) and reminded him that they had kept the peace. Nevertheless, Jackson retorted that they did not "cut (Tecumseh's) throat" when they had the chance, so they must now cede Creek lands. Jackson also ignored Article 9 of the Treaty of Ghent that restored sovereignty to Indians and their nations.[citation needed] Jackson opened this first peace session by faintly acknowledging the help of the friendly Creeks. That done, he turned to the Red Sticks and admonished them for listening to evil counsel. For their crime, he said, the entire Creek Nation must pay. He demanded the equivalent of all expenses incurred by the United States in prosecuting the war, which by his calculation came to 23,000,000 acres (93,000 km2) of land. — Robert V. Remini, Andrew Jackson Eventually, the Creek Confederacy enacted a law that made further land cessions a capital offense. Nevertheless, on February 12, 1825, McIntosh and other chiefs signed the Treaty of Indian Springs, which gave up most of the remaining Creek lands in Georgia. After the U.S. Senate ratified the treaty, McIntosh was assassinated on April 30, 1825, by Creeks led by Menawa.[citation needed] The Creek National Council, led by Opothle Yohola, protested to the United States that the Treaty of Indian Springs was fraudulent. President John Quincy Adams was sympathetic, and eventually, the treaty was nullified in a new agreement, the Treaty of Washington (1826). The historian R. Douglas Hurt wrote: "The Creeks had accomplished what no Indian nation had ever done or would do again—achieve the annulment of a ratified treaty." However, Governor George Troup of Georgia ignored the new treaty and began to forcibly remove the Indians under the terms of the earlier treaty. At first, President Adams attempted to intervene with federal troops, but Troup called out the militia, and Adams, fearful of a civil war, conceded. As he explained to his intimates, "The Indians are not worth going to war over."[citation needed] Although the Creeks had been forced from Georgia, with many Lower Creeks moving to the Indian Territory, there were still about 20,000 Upper Creeks living in Alabama. However, the state moved to abolish tribal governments and extend state laws over the Creeks. Opothle Yohola appealed to the administration of President Andrew Jackson for protection from Alabama; when none was forthcoming, the Treaty of Cusseta was signed on March 24, 1832, which divided up Creek lands into individual allotments. Creeks could either sell their allotments and receive funds to remove to the west, or stay in Alabama and submit to state laws. The Creeks were never given a fair chance to comply with the terms of the treaty, however.
Rampant illegal settlement of their lands by Americans continued unabated, with federal and state authorities unable or unwilling to do much to halt it. Further, as recently detailed by historian Billy Winn in his thorough chronicle of the events leading to removal, a variety of fraudulent schemes designed to cheat the Creeks out of their allotments, many of them organized by speculators operating out of Columbus, Georgia and Montgomery, Alabama, were perpetrated after the signing of the Treaty of Cusseta.[page needed] A portion of the beleaguered Creeks, many desperately poor and feeling abused and oppressed by their American neighbors, struck back by carrying out occasional raids on area farms and committing other isolated acts of violence. Escalating tensions erupted into open war with the United States after the destruction of the village of Roanoke, Georgia, located along the Chattahoochee River on the boundary between Creek and American territory, in May 1836. During the so-called "Creek War of 1836", Secretary of War Lewis Cass dispatched General Winfield Scott to end the violence by forcibly removing the Creeks to the Indian Territory west of the Mississippi River. Removal under the Indian Removal Act of 1830 continued into 1835 and beyond; in 1836 over 15,000 Creeks were driven from their land for the last time. About 3,500 of those 15,000 Creeks did not survive the trip to Oklahoma, where they eventually settled. Chickasaw monetary removal The Chickasaw received financial compensation from the United States for their lands east of the Mississippi River. In 1836, the Chickasaws had reached an agreement to purchase land from the previously removed Choctaws after a bitter five-year debate. They paid the Choctaws $530,000 (equal to $15,500,000 today) for the westernmost part of the Choctaw land. The first group of Chickasaws moved in 1837 and was led by John M. Millard. The Chickasaws gathered at Memphis on July 4, 1837, with all of their assets—belongings, livestock, and slaves. Once across the Mississippi River, they followed routes previously established by the Choctaws and the Creeks. Once in Indian Territory, the Chickasaws merged with the Choctaw nation. Cherokee forced relocation By 1838, about 2,000 Cherokee had voluntarily relocated from Georgia to Indian Territory (present-day Oklahoma). Forcible removals began in May 1838 when General Winfield Scott received a final order from President Martin Van Buren to relocate the remaining Cherokees. On 17 May 1838, General Scott issued Order No. 25, in which he stated: "Every possible kindness, compatible with the necessity of removal, must, therefore, be shown by the troops, and, if, in the ranks, a despicable individual should be found, capable of inflicting a wanton injury or insult on any Cherokee man, woman or child, it is hereby made the special duty of the nearest good officer or man, instantly to interpose, and to seize and consign the guilty wretch to the severest penalty of the laws. The Major General is fully persuaded that this injunction will not be neglected by the brave men under his command, who cannot be otherwise than jealous of their own honor and that of their country." Approximately 4,000 Cherokees died in the ensuing trek to Oklahoma.
In the Cherokee language, the event is called nu na da ul tsun yi ('the place where they cried') or nu na hi du na tlo hi lu i ('the trail where they cried').[citation needed] The Cherokee Trail of Tears resulted from the enforcement of the Treaty of New Echota, an agreement signed under the provisions of the Indian Removal Act of 1830, which exchanged Indian land in the East for lands west of the Mississippi River, but which was never accepted by the elected tribal leadership or a majority of the Cherokee people. There were significant changes in gender relations within the Cherokee Nation during the implementation of the Indian Removal Act in the 1830s. The Cherokee historically operated on a matrilineal kinship system, where children belonged to the clan of their mother and their only relatives were those who could be traced through her. In addition to being matrilineal, Cherokees were also matrilocal. According to the naturalist William Bartram, "Marriage gives no right to the husband over the property of his wife; and when they part, she keeps the children and property belonging to them." In this way, the typical Cherokee family was structured so that the wife held possession of the property, house, and children. However, during the 1820s and 1830s, "Cherokees [began adopting] the Anglo-American concept of power—a political system dominated by wealthy, highly acculturated men and supported by an ideology that made women ... subordinate". The Treaty of New Echota was largely signed by men. While women were present at the rump council negotiating the treaty, they did not have a seat at the table to participate in the proceedings. Historian Theda Perdue explains that "Cherokee women met in their own councils to discuss their own opinions" despite not being able to participate. The exclusion of women from the negotiation and signing of the Treaty of New Echota shows how dramatically the role of women had changed within the Cherokee Nation following colonial encroachment. For instance, Cherokee women played a significant role in the negotiation of land transactions as late as 1785, when they spoke at a treaty conference held at Hopewell, South Carolina, to clarify land cessions to the U.S. forced on the Cherokee due to their alliance with Britain during the American Revolutionary War. The sparsely inhabited Cherokee lands were highly attractive to Georgian farmers experiencing population pressure, and illegal settlements resulted. Long-simmering tensions between Georgia and the Cherokee Nation were brought to a crisis by the discovery of gold near Dahlonega, Georgia, in 1829, resulting in the Georgia Gold Rush, the second gold rush in U.S. history. Hopeful gold speculators began trespassing on Cherokee lands, and pressure mounted to fulfill the Compact of 1802, in which the US Government promised to extinguish Indian land claims in the state of Georgia. When Georgia moved to extend state laws over Cherokee lands in 1830, the matter went to the U.S. Supreme Court. In Cherokee Nation v. Georgia (1831), the Marshall court ruled that the Cherokee Nation was not a sovereign and independent nation, and therefore refused to hear the case. However, in Worcester v. Georgia (1832), the Court ruled that Georgia could not impose laws in Cherokee territory, since only the national government—not state governments—had authority in Indian affairs. Worcester v. Georgia is associated with Andrew Jackson's famous, though apocryphal, quote "John Marshall has made his decision; now let him enforce it!"
In reality, this quote did not appear until 30 years after the incident and was first printed in a textbook authored by Jackson critic Horace Greeley. Fearing open warfare between federal troops and the Georgia militia, Jackson decided not to enforce Cherokee claims against the state of Georgia. He was already embroiled in a constitutional crisis with South Carolina (i.e. the nullification crisis) and favored Cherokee relocation over civil war. With the Indian Removal Act of 1830, the U.S. Congress had given Jackson authority to negotiate removal treaties, exchanging Indian land in the East for land west of the Mississippi River. Jackson used the dispute with Georgia to put pressure on the Cherokees to sign a removal treaty. The final treaty, passed in Congress by a single vote, and signed by President Andrew Jackson, was imposed by his successor President Martin Van Buren. Van Buren allowed Georgia, Tennessee, North Carolina, and Alabama an armed force of 7,000 militiamen, army regulars, and volunteers under General Winfield Scott to relocate about 13,000 Cherokees to Cleveland, Tennessee. After the initial roundup, the U.S. military oversaw the emigration to Oklahoma. Former Cherokee lands were immediately opened to settlement. Most of the deaths during the journey were caused by disease, malnutrition, and exposure during an unusually cold winter. The U.S. Army, under General Scott, initially rounded up and brought the Cherokee to several staging camps, after which Scott halted operations due to high numbers of deaths. The rest of the overland journey was subsequently conducted under the auspices of Chief John Ross, with government funding. In the winter of 1838 the Cherokee began the 1,000-mile (1,600 km) march with scant clothing and most on foot without shoes or moccasins. The march began in Red Clay, Tennessee, the location of the last Eastern capital of the Cherokee Nation. Because of the diseases, the Indians were not allowed to go into any towns or villages along the way; many times this meant traveling much farther to go around them. After crossing Tennessee and Kentucky, they arrived at the Ohio River across from Golconda in southern Illinois about the 3rd of December 1838. Here the starving Indians were charged a dollar a head (equal to $30.23 today) to cross the river on "Berry's Ferry" which typically charged twelve cents, equal to $3.63 today. They were not allowed passage until the ferry had serviced all others wishing to cross and were forced to take shelter under "Mantle Rock", a shelter bluff on the Kentucky side, until "Berry had nothing better to do". Many died huddled together at Mantle Rock waiting to cross. Several Cherokee were murdered by locals. The Cherokee filed a lawsuit against the U.S. Government through the courthouse in Vienna, suing the government for $35 a head (equal to $1,058.2 today) to bury the murdered Cherokee. As they crossed southern Illinois, on December 26, Martin Davis, commissary agent for Moses Daniel's detachment, wrote: There is the coldest weather in Illinois I ever experienced anywhere. The streams are all frozen over something like 8 or 12 inches [20 or 30 cm] thick. We are compelled to cut through the ice to get water for ourselves and animals. It snows here every two or three days at the fartherest. We are now camped in Mississippi [River] swamp 4 miles (6 km) from the river, and there is no possible chance of crossing the river for the numerous quantity of ice that comes floating down the river every day. 
We have only traveled 65 miles (105 km) on the last month, including the time spent at this place, which has been about three weeks. It is unknown when we shall cross the river.... A volunteer soldier from Georgia who participated in the removal recounted: I fought through the civil war and have seen men shot to pieces and slaughtered by thousands, but the Cherokee removal was the cruelest work I ever knew. It eventually took almost three months to cross the 60 miles (97 kilometres) on land between the Ohio and Mississippi Rivers. The trek through southern Illinois is where the Cherokee suffered most of their deaths. However a few years before forced removal, some Cherokee who opted to leave their homes voluntarily chose a water-based route through the Tennessee, Ohio and Mississippi rivers. It took only 21 days, but the Cherokee who were forcibly relocated were wary of water travel. Environmental researchers David Gaines and Jere Krakow outline the "context of the tragic Cherokee relocation" as one predicated on the difference between "Indian regard for the land, and its contrast with the Euro-Americans view of land as property". This divergence in perspective on land, according to sociologists Gregory Hooks and Chad L. Smith, led to the homes of American Indian people "being donated and sold off" by the United States government to "promote the settlement and development of the West," with railroad developers, white settlers, land developers, and mining companies assuming ownership. In American Indian society, according to Colville scholar Dina Gilio-Whitaker, it caused "the loss of ancient connections to homelands and sacred sites," "the deaths of upward of 25 percent of those on the trail" and "the loss of life-sustaining livestock and crops." Dina Gilio-Whitaker draws on research by Choctaw and Chippewa historian Clara Sue Kidwell to show the relationship between the Trail of Tears and a negative impact on the environment. In tracking the environmental changes of the southeastern tribes who relocated to new lands across the Trail of Tears, Kidwell finds that "prior to removal the tribes had already begun adapting to a cash-based, private property economic system with their adoption of many European customs (including the practice of slave owning), after their move west they had become more deeply entrenched into the American economic system with the discovery of coal deposits and the western expansion of the railroads on and through their lands. So while they adapted to their new environments, their relationship to land would change to fit the needs of an imposed capitalist system". Cherokee ethnobotanist Clint Carroll illustrates how this imposed capitalist system altered Cherokee efforts to protect traditional medicinal plants during relocation, saying that "these changes have resulted in contrasting land management paradigms, rooted in the language of 'resource-based' versus 'relationship-based' approaches, a binary imposed on tribal governments by the Bureau of Indian Affairs through their historically paternalistic relationship". This shift in land management as a function of removal had negative environmental ramifications such as solidifying "a deeply entrenched bureaucratic structure that still drives much of the federal-tribal relationship and determines how tribal governments use their lands, sometimes in ways that contribute to climate change and, in extreme cases, ways that lead to human rights abuses". 
In addition to physical relocation, American Indian removal and the Trail of Tears had social and cultural effects as American Indians were forced "to contemplate abandonment of their native land. To the Cherokees life was a part of the land. Every rock, every tree, every place had a spirit. And the spirit was central to the tribal lifeway. To many, the thought of loss of place was a thought of loss of self, loss of Cherokeeness, and a loss of life-way". This cultural shift is characterized by Gilio-Whitaker as "environmental deprivation," a concept that "relates to historical processes of land and resource dispossession calculated to bring about the destruction of Indigenous Americans' lives and cultures. Environmental deprivation in this sense refers to actions by settlers and settler governments that are designed to block Native peoples' access to life-giving and culture-affirming resources". The separation of American Indian people from their land led to the loss of cultural knowledge and practices, as described by scholar Rachel Robison-Greene, who finds the "legacy of fifteenth-century European colonial domination" led to "Indigenous knowledge and perspectives" being "ignored and denigrated by the vast majority of social, physical, biological and agricultural scientists, and governments using colonial powers to exploit Indigenous resources". Indigenous cultural and intellectual contributions to "environmental issues" in the form of a "rich history, cultural customs, and practical wisdom regarding sustainable environmental practices" can be lost because of Indigenous removal and the Trail of Tears, according to Robison-Greene. Removed Cherokees initially settled near Tahlequah, Oklahoma. When signing the Treaty of New Echota in 1835, Major Ridge said "I have signed my death warrant." The resulting political turmoil led to the killings of Major Ridge, John Ridge, and Elias Boudinot; of the leaders of the Treaty Party, only Stand Watie escaped death. The population of the Cherokee Nation eventually rebounded, and today the Cherokees are the largest American Indian group in the United States. There were some exceptions to removal. Approximately 100 Cherokees evaded the U.S. soldiers and lived off the land in Georgia and other states. Those Cherokees who lived on private, individually owned lands (rather than communally owned tribal land) were not subject to removal. In North Carolina, about 400 Cherokees, sometimes referred to as the Oconaluftee Cherokee due to their settlement near the river of the same name, lived on land in the Great Smoky Mountains owned by a white man named William Holland Thomas (who had been adopted by Cherokees as a boy), and were thus not subject to removal. Added to this were some 200 Cherokee from the Nantahala area allowed to stay in the Qualla Boundary after assisting the U.S. Army in hunting down and capturing the family of the old prophet, Tsali, who was executed by a firing squad, as were most of his family. These North Carolina Cherokees became the Eastern Band of the Cherokee Nation. A local newspaper, the Highland Messenger, said on July 24, 1840, "that between nine hundred and a thousand of these deluded beings ... are still hovering about the homes of their fathers, in the counties of Macon and Cherokee" and "that they are a great annoyance to the citizens" who wanted to buy land there believing the Cherokee were gone; the newspaper reported that President Martin Van Buren said "they ... are, in his opinion, free to go or stay."
Several Cherokee speakers throughout history offered first-hand accounts of the events of the Trail of Tears and provided insight into its lasting effects. Elias Boudinot, Major Ridge, Speckled Snake, John Ross, and Richard Taylor were all notable Cherokee orators during the 19th century who used speeches as a form of resistance against the U.S. government. John Ross, the Cherokee Chief from 1828 to 1866, and Major Ridge embarked on a speaking tour within the Cherokee Nation itself in hopes of strengthening a sense of unity amongst the tribal members. Tribal unity was a central tenet of Cherokee resistance, with Ross stating in his council address: "'Much...depends on our unity of sentiment and firmness of action, in maintaining those sacred rights which we have ever enjoyed'". Cherokee speeches like this were made even more important because the state of Georgia made it illegal for members of American Indian tribes both to speak to an all-white court and to convince other Indian tribal members not to move. Another influential Cherokee figure was Cherokee writer John Ridge, son of Major Ridge, who wrote four articles using the pseudonym "Socrates". His works were published in the Cherokee Phoenix, the nation's newspaper. The choice of pseudonym, according to literary scholar Kelly Wisecup, "...facilitated a rhetorical structure that created not only a public persona recognizable to the Phoenix's multiple readerships but also a public character who argued forcefully that white readers should respect Cherokee rights and claims". Ridge's articles focused on critiquing the perceived hypocrisy of the U.S. government, colonial history, and the events leading up to the Trail of Tears, drawing on excerpts from American and European history and literature. The United States Court of Claims ruled in favor of the Eastern Cherokee Nation's claim against the U.S. on May 18, 1905. This resulted in the appropriation of $1 million (equal to $27,438,023.04 today) to the Nation's eligible individuals and families. Interior Department employee Guion Miller created a list using several rolls and applications to verify tribal enrollment for the distribution of funds, known as the Guion Miller Roll. The applications received documented over 125,000 individuals; the court approved more than 30,000 individuals to share in the funds.[page needed] Statistics Legacy The events have sometimes been referred to as "death marches", in particular when referring to the Cherokee march across Tennessee, Kentucky and Missouri in 1838. Indians who had the means initially provided for their own removal. Contingents that were led by conductors from the U.S. Army included those led by Edward Deas, who was said to be sympathetic to the Cherokee plight.[citation needed] The largest death toll from the Cherokee forced relocation comes from the period after the May 23, 1838 deadline. This was at the point when the remaining Cherokee were rounded up into camps and placed into large groups, often over 700 in size. Communicable diseases spread quickly through these closely quartered groups, killing many. These groups were among the last to move, but followed the same routes the others had taken; the areas they were going through had been depleted of supplies due to the vast numbers that had gone before them. The marchers were subject to extortion and violence along the route. In addition, these final contingents were forced to set out during the hottest and coldest months of the year, killing many.
Exposure to the elements, disease, starvation, harassment by local frontiersmen, and insufficient rations similarly killed up to one-third of the Choctaw and other nations on the march. There exists some debate among historians and the affected nations as to whether the term "Trail of Tears" should be used to refer to the entire history of forced relocations from the Eastern United States into Indian Territory, to the relocations of specifically the Five Civilized Tribes, to the route of the march, or to specific marches in which the remaining holdouts from each area were rounded up.[citation needed] There is debate among historians about how the Trail of Tears should be classified. Some historians classify the events as a form of ethnic cleansing; others refer to them as genocide.[c] Historian and biographer Robert V. Remini wrote that Jackson's policy on Native Americans was based on good intentions. He writes: "Jackson fully expected the Indians to thrive in their new surroundings, educate their children, acquire the skills of white civilization so as to improve their living conditions, and become citizens of the United States. Removal, in his mind, would provide all these blessings....Jackson genuinely believed that what he had accomplished rescued these people from inevitable annihilation." Historian Sean Wilentz writes that some critics who label Indian removal as genocide view Jacksonian democracy as a "momentous transition from the ethical community upheld by antiremoval men", and says this view is a caricature of US history that "turns tragedy into melodrama, exaggerates parts at the expense of the whole, and sacrifices nuance for sharpness". Historian Donald B. Cole, too, argues that it is difficult to find evidence of a conscious desire for genocide in Jackson's policy on Native Americans, but dismisses the idea that Jackson was motivated by the welfare of Native Americans. Colonial historian Daniel Blake Smith disagrees with the usage of the term genocide, adding that "no one wanted, let alone planned for, Cherokees to die in the forced removal out West". Historian Justin D. Murphy argues that "Although the 'Trail of Tears' was tragic, it does not quite meet the standard of genocide, and the extent to which tribes were allowed to retain their identity, albeit by removal, does not quite meet the standard of cultural genocide". In contrast, some scholars have argued that the Trail of Tears was a genocidal act. Historian Jeffrey Ostler argues that the threat of genocide was used to ensure Natives' compliance with removal policies, and concludes that "In its outcome and in the means used to gain compliance, the policy had genocidal dimensions." Patrick Wolfe argues that settler colonialism and genocide are interrelated but should be distinguished from each other, writing that settler colonialism is "more than the summary liquidation of Indigenous people, though it includes that." Wolfe describes the assimilation of Indigenous people who escaped relocation (and particularly their abandonment of collectivity) as a form of cultural genocide, though he emphasises that cultural genocide is "the real thing" in that it resulted in large numbers of deaths.
The Trail of Tears was thus a settler-colonial replacement of Indigenous people and culture in addition to a genocidal mass killing, according to Wolfe. Roxanne Dunbar-Ortiz describes the policy as genocide, saying: "The fledgling United States government's method of dealing with native people—a process which then included systematic genocide, property theft, and total subjugation—reached its nadir in 1830 under the federal policy of President Andrew Jackson." Wilma Mankiller emphasises that Jackson's policies were the natural extension of much earlier genocidal policies toward Native Americans established through territorial expansion during the Jefferson administration. Dina Gilio-Whitaker, in As Long as Grass Grows, describes the Trail of Tears and the Diné Long Walk as structural genocide, because they destroyed Native relations to land, one another, and nonhuman beings, which imperiled their culture, life, and history. According to her, these are ongoing actions that constitute both cultural and physical genocide. In 1987, about 2,200 miles (3,500 km) of trails were authorized by federal law to mark the removal of 17 detachments of the Cherokee people. Called the Trail of Tears National Historic Trail, it traverses portions of nine states and includes land and water routes. A historical drama based on the Trail of Tears, Unto These Hills, written by Kermit Hunter, has sold over five million tickets for its performances since its opening on July 1, 1950, both touring and at the outdoor Mountainside Theater of the Cherokee Historical Association in Cherokee, North Carolina. A regular event, the "Remember the Removal Bike Ride," has six cyclists from the Cherokee Nation ride over 950 miles while retracing the same path that their ancestors took. The cyclists, who average about 60 miles a day, start their journey in the former capital of the Cherokee Nation, New Echota, Georgia, and finish in Tahlequah, Oklahoma. In June 2024, Shawna Baker, justice of the Cherokee Nation Supreme Court, was a mentor cyclist on the 40th commemorative ride. Cherokee artist Troy Anderson was commissioned to design the Cherokee Trail of Tears Sesquicentennial Commemorative Medallion. The falling-tear medallion shows a seven-pointed star, the symbol of the seven clans of the Cherokees. From the homes their fathers made // From the graves the tall trees shade // For the sake of greed and gold, // The Cherokees were forced to go // To a land they did not know; // And Father Time or wisdom old // Cannot erase, through endless years // The memory of the trail of tears.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-181] | [TOKENS: 10515]
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026,[update] Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated in 1989 to Canada; he has Canadian citizenship since his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and their leadership in the AI boom in the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes, and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published between 2025–26 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. 
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. 
He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). 
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025[update], over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several strong-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In late 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. 
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives including CEO Parag Agrawal; Musk became the CEO instead. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk loosened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of Hunter Biden's laptop controversy in the lead-up to the 2020 presidential election. Musk also promised to step down as CEO after a Twitter poll, and five months later, Musk stepped down as CEO and transitioned his role to executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks (which hinders visibility and is considered a form of shadow banning) or by suspending their accounts without justification. 
Other activities In August 2013, Musk announced plans for a version of a vactrain, and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 special election for Texas's 34th congressional district. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. 
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event in August 2021, which featured executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." Fortune remarked that the event was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and suggested that the exclusion affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor in the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring. The organization commented that they had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as ownership of X. An NBC News analysis found he had boosted far-right political movements seeking to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. 
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided along partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clear. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when the 130-day limit for special government employees expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. 
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, its most notable event being a post Musk made on X (formerly Twitter) on June 5, 2025, alleging that Trump "is in the Epstein files" and that this was "the real reason they have not been made public". Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, repeatedly pushing for humanity to become an interplanetary species in order to lower the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. 
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... 
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he stated has a "'restoring effect' that helps his 'mental calibration'". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. 
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the surrogate pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born via surrogacy in 2024, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from the Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place because Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking "Do you have any parties planned? 
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans for coming to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth was derived from Tesla stock, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package for Musk worth potentially $1 trillion was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their respective industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, in contrast to other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn criticism for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief, Edward Felsenthal, wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Capture_the_flag] | [TOKENS: 2068]
Contents Capture the flag Capture the Flag (CTF) is a traditional outdoor sport where two or more teams each have a flag (or other markers) and the objective is to capture the other team's flag, located at the team's "base" (or hidden or even buried somewhere in the territory), and bring it safely back to their own base. Enemy players can be "tagged" by players when out of their home territory and, depending on the rules, they may be out of the game, become members of the opposite team, be sent back to their own territory, be frozen in place, or be sent to "jail" until freed by a member of their own team. Overview Capture the Flag requires a playing field. In both indoor and outdoor versions, the field is divided into two clearly designated halves, known as territories. Players form two teams, one for each territory. Each side has a "flag", which is most often a piece of fabric, but can be any object small enough to be easily carried by a person (night time games might use flashlights, glowsticks or lanterns as the "flags"). Sometimes teams wear dark colors at night to make it more difficult for their opponents to see them.[citation needed] The objective of the game is for players to venture into the opposing team's territory, grab the flag, and return with it to their territory without being tagged. The flag is defended mainly by tagging opposing players who attempt to take it. Within their territory players are "safe", meaning that they cannot be tagged by opposing players. Once they cross into the opposing team's territory they are vulnerable to tagging. Rules for Capture the Flag appear in 1860 in the German gymnastic manual Lehr- und Handbuch der deutschen Turnkunst by Wilhelm Lübeck under the name Fahnenbarlauf. In the 19th century, Capture the Flag was not considered a distinct game, but a variation of the European game "Barlaufen" (Barlauf mit Fahnenraub), played in France and Germany. Descriptions of Capture the Flag in English appeared in the early 20th century, e. g. in "Scouting for Boys" written in 1908 by Robert Baden-Powell, the founder of the Scouting Movement, under the title "Flag Raiding". They also appeared in the 1920 Edition of "The Official Handbook for Boys" published by the Boy Scouts of America. The flag is usually placed in a visibly obvious location at the rear of a team's territory. In a more difficult version, the flag is hidden in a place where it can only be seen from certain angles. Capturing the flag also might require completing some challenge. For example, the flag could be hidden in the leaves up in a tall tree, and the players have to see the flag, then knock it out and bring it to their base.[citation needed] The rules for jail vary from game to game and deal with what happens if a player is tagged in the other team's territory. Either each team can decide a designated area for the other team's players to go to when tagged, or a single jail can be used for both teams. When tagged, a player should be trusted to go to jail on their own, or should need to be escorted, depending on the game. Players can leave the jail after a certain time, for example three minutes, or can leave early if tagged by a teammate. Depending on the game, players can be released on the way to jail, or may have to be in jail before being released. The rules for the handling of the flag also vary from game to game and deal mostly with the disposition of the flag after a failed attempt at capturing it. 
In one variant, after a player is tagged while carrying the flag, it is returned to its original place. In another variant, the flag is left in the location where the player was tagged. This latter variant makes offensive play easier, as the flag will tend, over the course of the game, to be moved closer to the dividing line between territories. In some games, it is possible for the players to throw the flag to teammates. As long as the flag stays in play without hitting the ground, the players are allowed to pass it. When the flag is captured by one player, they're not safe from being tagged, unless they trip.[citation needed] Sometimes, the flag holder may not be safe at all, even in their home territory, until they obtain both flags, thus ending the game. Their goal is to return to their own side or hand it off to a teammate who will then carry it to the other side. In most versions the flag may be handed off while running. The game is won when a player returns to their own territory with the enemy flag or both teams' flags. Also, rarely the flag carrier may not attempt to free any of their teammates from jail. Alterations may include "one flag" Capture the Flag in which there is a defensive team and an offensive team, or games with three or more flags. In the case of the latter, one can only win when all flags are captured. Another variation is when the players put bandannas in their pockets with about six inches sticking out. Instead of tagging opponents, players must pull their opponent's bandanna out of their pocket. No matter where a player is when their bandanna is pulled, they're captured and must either go to jail or return to their base before returning to play. In this version there is no team territory, only a small base where the team's flag is kept. To win, one team must have both of the flags in their base.[citation needed] In some urban settings, the game is played indoors in an enclosed area with walls. There is also an area in the opposing ends for the flag to be placed in. In this urban variation legal checking as in hockey, including against the sideboards, is allowed.[citation needed] A player who commits a foul or illegal check is placed in a penalty box for a specified amount of time, depending on the severity of the foul. A player who deliberately injures an opponent is expelled from the rest of the game.[citation needed] Throwing the flag is allowed in this variation, as long as the flag is caught before it hits the ground.[citation needed] If the flag is thrown to a teammate but hits the ground before it can be caught, the flag is placed from the spot of the throw. If a player throws the flag, but is blocked or intercepted by a player from the opposing team, the flag is placed back at the base. It is not uncommon for people to play airsoft, paintball, or Nerf variations of Capture the Flag.[citation needed] Typically there are no territories in these versions. Players who are "hit" must sit out a predetermined amount of time before returning to play (respawning). "Stealing sticks" is a similar game played in the British Isles, the United States, and Australia. However, instead of a flag, a number of sticks or other items such as coats or hats are placed in a "goal" on the far end of each side of the playing field or area. As in Capture the Flag, players are sent to a "prison" if tagged on the opponents' side, and may be freed by teammates. Each player may only take one of their opponents' sticks at a time. 
The first team to take all of the opponents' sticks to their own side wins. Software and games In 1984, Scholastic published Bannercatch for the Apple II and Commodore 64 computers. An educational video game with recognizable capture-the-flag mechanics, Bannercatch allows up to two humans (each alternating between two characters in the game world) to play capture the flag against an increasingly difficult team of four AI bots. Bannercatch's game world is divided into quadrants: home, enemy, and two "no-man's land" areas which hold the jails. A successful capture requires bringing the enemy flag into one team's "home" quadrant. Players can be captured when in enemy territory, or in "no-man's land" while holding a flag. Captured players must be "rescued" from their designated jail by one of the other members of the team. Fallen flags remain where they dropped until a time-out period elapses, after which the flag returns to one of several starting locations in home territory. The 2D map also features walls, trees and a moving river, enabling a wide variety of strategies. Special locations in the play area allow humans to query the game state (such as flag status) using binary messages. In 1992, Richard Carr released an MS-DOS-based game called Capture the Flag. It is a turn-based strategy game with real-time network/modem play (or play-by-mail), based around the traditional outdoor game. The game required players to merely move one of their characters onto the same square as their opponent's flag, as opposed to bringing it back to friendly territory, because of difficulties implementing the artificial intelligence that the computer player would have needed to bring the enemy flag home and intercept opposing characters carrying the flag.[citation needed] In computer security Capture the Flag (CTF) events, "flags" are secrets hidden in purposefully vulnerable programs or websites. Competitors steal flags either from other competitors (attack/defense-style CTFs) or from the organizers (jeopardy-style challenges). Several variations exist, including hiding flags in hardware devices; a minimal illustrative sketch of a jeopardy-style flag check appears below. Urban gaming Capture the Flag is among the games that made a comeback among adults in the early 21st century as part of the urban gaming trend (which includes games like Pac-Manhattan, Fugitive, Unreal Tournament, Ultimate frisbee, and Manhunt). The game is played on city streets and players use cellphones to communicate. News about the games spreads virally through the use of blogs and mailing lists. Urban Capture the Flag has been played in cities throughout North America. One long-running example occurs on the Northrop Mall at the University of Minnesota on Fridays, with typical attendance ranging from 50 to several hundred.
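The computer-security sense of capture the flag mentioned above can be made concrete with a small sketch. The following Python snippet is purely illustrative and not drawn from any real competition; the names KEY, OBFUSCATED, recover_flag, and submit_flag are all hypothetical. Under those assumptions, it shows how a jeopardy-style organizer might hide a flag string inside a deliberately weak program (here, trivial XOR obfuscation) and how a scoreboard might check submitted flags.

```python
# Toy sketch of a jeopardy-style CTF challenge; all names here are hypothetical.
import hmac

KEY = 0x2A
# The "hidden" flag, stored XOR-obfuscated so it does not appear as plain text.
OBFUSCATED = bytes(b ^ KEY for b in b"flag{xor_is_not_encryption}")

def recover_flag() -> str:
    """What a competitor does: reverse the weak obfuscation to extract the flag."""
    return bytes(b ^ KEY for b in OBFUSCATED).decode()

def submit_flag(submission: str) -> bool:
    """What the scoreboard does: check a submission against the real flag.
    hmac.compare_digest keeps the comparison from leaking timing information."""
    real = bytes(b ^ KEY for b in OBFUSCATED).decode()
    return hmac.compare_digest(submission, real)

if __name__ == "__main__":
    flag = recover_flag()
    print("recovered:", flag)                       # flag{xor_is_not_encryption}
    print("accepted:", submit_flag(flag))           # True
    print("accepted:", submit_flag("flag{wrong}"))  # False
```

In a real event the hiding technique would be far more involved (binary reverse engineering, web exploitation, cryptography), but the structure is the same: recover a secret, then submit it for points.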
========================================
[SOURCE: https://en.wikipedia.org/wiki/Ancient_Israelite_cuisine] | [TOKENS: 14818]
Contents Ancient Israelite cuisine Ancient Israelite cuisine was similar to other contemporary Mediterranean cuisines. Dietary staples were bread, wine, and olive oil; also included were legumes, fruits and vegetables, dairy products, and fish and other meat. Importance was placed on the Seven Species, which are listed in the Hebrew Bible as being special agricultural products of the Land of Israel. Like many cultures, the Israelites abided by a number of dietary regulations and restrictions that were variously unique or shared with other Near Eastern civilizations. These culinary practices were largely shaped by the Israelite religion, which later developed into Judaism and Samaritanism. People in ancient Israel generally adhered to a particular slaughter method and only consumed certain animals, notably excluding pigs and camels and all predators and scavengers, as well as forbidding blood consumption and the mixing of milk and meat. There was considerable continuity in the main components of the diet over time, despite the introduction of new foodstuffs at various stages. Sources Information about the food of the ancient Israelites is based on written sources, archaeological records and comparative evidence from the wider region of the ancient Levant. The primary written source for the period is the Hebrew Bible, the largest collection of written documents surviving from ancient Israel. Other texts, such as the Dead Sea Scrolls, apocryphal works, the New Testament, the Mishnah and the Talmud, also provide information. Epigraphic sources include ostraca from Samaria and Arad. The Bible provides names of plants and animals that were used for food, such as the lists of permitted and forbidden animals (for example, Leviticus 11 and Deuteronomy 14), and the lists of foods brought to the king’s table (for example, 1 Kings 5:2–3) or the foods that the Israelites are said to have longed for after leaving Egypt (Numbers 11:5). These lists indicate the potential foods that were available, but not necessarily how regularly the food was eaten or how significant it was in the cuisine, which needs to be derived from other sources. Archaeological remains include the items used for the production of food, such as wine or olive presses, stone and metal implements used in the preparation of food, and amphorae, jars, storerooms and grain pits used for storage. Animal bones provide evidence of meat consumption, the types of animals eaten, and whether they were kept for milk production or other uses, while paleobotanical remains, such as seeds or other carbonized or desiccated plant remains, provide information about plant foods. Using both written and archaeological data, some comparisons can be drawn between the food of ancient Israel and its neighbors. Although there is much information about the foods of ancient Egypt and Mesopotamia, the inferences that can be made are limited due to differences in topography and climate; Israelite agriculture also depended on rainfall rather than the river-based irrigation of these two civilizations, resulting in the preference for different crops. Ugarit and Phoenicia were closer neighbors of ancient Israel, and shared a similar topography and climate. Thus, conclusions about the food and drink in ancient Israel have been made with some confidence from this evidence. History Significant milestones in the availability and development of food production characteristic of Israelite cuisine occurred well before the Israelite period. 
On the other hand, vestiges of the cuisine and the practices associated with it continue to resonate in later Jewish cuisine and traditions that developed in Israel and Babylonia during the Talmudic period (200 CE-500 CE), and may still be discerned in the various culinary styles that have developed among Jewish communities since then. Wild species of barley and emmer wheat were domesticated and cultivated in the Jordan River Valley as early as the 9th millennium BCE. Archaeologists have found the carbonized seeds of two kinds of primitive wheat, einkorn and emmer, and two-row barley, in the earliest levels of digs at Jericho, one of the first cities in the world. During the Pottery Neolithic period (6000-4300 BCE), the development of pottery enabled people to produce portable containers for the transportation and storage of food, and an economy based on agriculture and herding developed. Archaeological evidence indicates that figs, lentils and broad beans were being cultivated from Neolithic times. During the Chalcolithic period (4300-3300 BCE), large pottery containers, indicative of settled peoples, appear in the archaeological record. Date palm cultivation began in the Jordan River Valley, and the earliest date pits have been discovered at Ein Gedi by the Dead Sea. In the Golan, olive trees were grown and olive oil was produced. Chickpea cultivation dates back to the Bronze Age (3300–1200 BCE), and grapes and olives became important crops in the hill country. Wine and oil were traded for wheat with the cities on the coastal plain, and for meat and skins with semi-nomadic herders. Wine and carobs were also exported to Egypt during this period. At Arad in the northern Negev, the remains of wheat, barley and legumes have been found, along with stone-lined storage pits for grain from this period. Pottery was imported from Cyprus and Mycenae in Greece for the first time, probably for use as good-quality tableware. After the Bronze Age collapse of urban culture, there was an increase in herding and the disappearance of smaller agricultural communities. The Israelite presence emerged during the Early Iron Age (1200-1000 BCE), at first in the central hill country, Transjordan and the northern Negev, and later in the Galilee, while the Philistines and other Sea Peoples arrived at roughly the same time and settled in the coastal regions. Pastoralism and animal husbandry remained important, and walled open spaces in villages that probably served as paddocks have been discovered. The construction of terraces in the hills, and of additional plastered cisterns for water storage, enabled more cultivation than before. Storage pits and silos were dug into the ground to hold grain. Under the united Israelite monarchy, central store cities were built, and greater areas of the northern Negev came under cultivation. The Gezer agricultural calendar, detailing the crops that were raised, dates from this period. After the division of the Israelite kingdom, Jerusalem and a number of other cities expanded, supported by the surrounding villages and farms. These were called "daughters of" the major towns in the Hebrew Bible (for example, Josh 17:11 and Josh 15:47). Large food storage facilities and granaries were built in cities such as Hazor. During the later Iron Age (Iron Age II), roughly the same period as the Israelite and Judean monarchies, olive oil and wine were produced on a large scale for commerce and export, as well as for local consumption. 
The ancient Israelites depended on bread, wine and oil as the basic dietary staples and this trio is often mentioned in the Bible (for example, Deut 7:13 and 2 Kings 18:32) and in other texts, such as the Samaria and Arad ostraca. Written and archaeological evidence indicate that the diet also included other products from plants, trees and animals. Seven basic agricultural products, called the Seven Species, are listed in the Bible: wheat, barley, figs, grapes, olives, pomegranates, and dates (Deut 8:8). The Bible also often describes the land of Israel as a land "flowing with milk and honey" (for example, Exod 3:8). The cuisine maintained many consistent traits based on the main products available from the early Israelite period until the Roman period, even though new foods became available during this extended time. For example, rice was introduced during the Persian era; during the Hellenistic period, as trade with the Nabateans increased, more spices became available, at least for those who could afford them, and more Mediterranean fish were imported into the cities; and during the Roman period, sugar cane was introduced. The symbolic food of the ancient Israelites continued to be important among Jews after the destruction of the Second Temple in 70 CE (AD) and the beginning of the Jewish Diaspora. Bread, wine, and olive oil were seen as direct links to the three main crops of ancient Israel—wheat, grapes, and olives. In the Bible, this trio is described as representing the divine response to human needs (Hosea 2:23–24) and particularly the need for the seasonal rains vital for the successful cultivation of these three crops. (Deuteronomy 11:13–14). The significance of wine, bread and oil is indicated by their incorporation into Jewish religious ritual, with the blessings over wine and bread for Shabbat and holiday meals and at religious ceremonies such as weddings, and the lighting of Shabbat and festival lights with olive oil.: 22–23 Characteristics The daily diet of the ordinary ancient Israelite was mainly one of bread, cooked grains, and legumes. Bread was eaten with every meal. Vegetables played a smaller, but significant role in the diet. Legumes and vegetables were typically eaten in stews. The Israelites drank goat and sheep’s milk when it was available in the spring and summer, and ate butter and cheese. Honey, both from bees and date honey, was also eaten. Figs and grapes were the fruits most commonly eaten, while dates, pomegranates, and other fruits and nuts were eaten more occasionally. Wine was the most popular beverage and sometimes other fermented beverages were produced. Meat, usually goat and mutton, was eaten rarely by most Israelites and was reserved for special occasions such as celebrations, festival meals, or sacrificial feasts. However, goat meat was consumed more than sheep meat since sheep were valued. The wealthy ate meat, including beef and venison, more frequently. Olives were used primarily for their oil, which was used raw and to cook meat and stews. Game, birds, eggs, and fish, especially fresh and saltwater fish, were also eaten, depending on availability. Non-kosher fish consumption was also very common until the first century CE.: 22–24 Most food was eaten fresh and in season. Fruits and vegetables had to be eaten as they ripened and before they spoiled. 
People had to contend with periodic episodes of hunger and famine; producing enough food required hard and well-timed labor, and the climatic conditions resulted in unpredictable harvests and the need to store as much food as possible. Thus, grapes were made into raisins and wine; olives were made into oil; figs, beans, and lentils were dried; and grains were stored for use throughout the year. An Israelite meal is illustrated by the biblical description of the rations that Abigail brought to David’s group: bread loaves, wine, butchered sheep, parched grain, raisins, and fig cakes (1 Samuel 25:18). Foods Grain products constituted the majority of the food consumed by the ancient Israelites. The staple food was bread, and it was such a vital part of each meal that the Hebrew word for bread, lehem, also referred to food in general. The supreme importance of bread to the ancient Israelites is also demonstrated by the fact that Biblical Hebrew has at least a dozen words for bread, and bread features in numerous Hebrew proverbs (for example, Proverbs 20:17, Proverbs 28:19). Bread was eaten at just about every meal and is estimated to have provided from 50 to 70 percent of an ordinary person’s daily calories. The bread eaten until the end of the Israelite monarchy was mainly made from barley flour; during the Second Temple period, bread from wheat flour became predominant. Porridge and gruel were made from ground grain, water, salt, and butter. This mixture also formed the basis for cakes, to which oil, called shemen, and fruits were sometimes added before baking. The Israelites cultivated both wheat and barley; these two grains are mentioned first in the biblical list of the Seven Species of the land of Israel, and their importance as food is also seen in the celebration of the barley harvest at the festival of Passover and of the wheat harvest at the festival of Shavuot. Rice was introduced during the early Second Temple period through contact with the Persians. By the Roman period, rice had become an important export, and the Jerusalem Talmud states about rice that “there is none like it outside Israel,” and that notable rabbis served rice at the Passover seder. Barley (Hordeum vulgare) was the most important grain during the biblical period, and this was recognized ritually on the second day of Passover in the Omer offering, consisting of barley flour from the newly ripened crop. Furthermore, its significance to Israelite society, not only as a source of food, is illustrated by the biblical method for measuring a field by the amount of barley (rather than of wheat) with which it could be sown. Barley was initially predominant because it matured earlier and tolerated harsher conditions than wheat, growing in areas with less rainfall and poorer soils, such as the northern Negev and the hill country. It had high yield potential and was resistant to insect infestation. It could be sown without plowing and could therefore be grown on small plots of land that oxen or even donkeys could not reach, and it did not need artificial irrigation. It ripened a month earlier than wheat and was thus available sooner to replenish supplies used up during the winter, and also provide some food security if the more vulnerable wheat crop was poor or failed. Two varieties of barley were cultivated: two-rowed and six-rowed. Two-rowed barley was the older, hulled form; six-rowed barley was unhulled and easier to thresh and, since the kernels remained intact, to store for longer periods. 
Hulled barley was thus the prevalent type during the Iron Age, but gruels made from it must have had a gritty taste due to the barley’s tough outer layers. Bread was primarily made from barley flour during the Iron Age (Judges 7:13, 2 Kings 4:42), as barley was more widely and easily grown, and was thus more available, cheaper, and could be made into bread without a leavening agent even though wheat flour was regarded as superior. It was presumably made from dough that was a simple mixture of barley flour and water, divided into small pieces, formed by hand into round shapes, then baked. However, barley declined as the staple from the biblical period to a poverty food by the end of the Second Temple period, and by the Talmudic era, it was regarded mostly as animal fodder. Emmer wheat (Triticum dicoccum) was initially the most widespread variety of wheat, as it grew well in the warm climate and was resistant to fungal rot. It was high yielding, with large grains and relatively high amounts of gluten, and bread made from emmer wheat flour was thus fairly light in texture. However, emmer required time-consuming pounding or roasting to remove its husk, and during the Iron Age, durum wheat (Triticum durum), a descendant of emmer, gradually replaced emmer and became the favored grain for making fine flour. Durum grew well in the rich soil of the larger valleys of the central and northern areas of the country, where rainfall exceeded 225 millimeters per year, was higher yielding than emmer, and its grains released more easily from the chaff. It could therefore be separated from the husk without roasting or pounding first, thus reducing the work required for threshing, and also leaving most of the grains whole, which was better for longer storage. However, durum is a hard grain and was difficult to grind with the early hand-held grindstones. The flour also had to be sifted repeatedly to obtain fine flour (such as the solet required in the Temple offerings). Thus, durum was primarily used for porridges, or parboiled and dried, or roasted and boiled, and barley flour continued to be used for making bread until another hybrid of emmer, common or "bread" wheat (Triticum aestivum) replaced barley as the primary grain after the Greek conquest of the land of Israel; this together with durum wheat, became widespread during the Greco-Roman period, constituting the bulk of the grain crop by the end of the Second Temple period. The introduction of common wheat, which contained more starch and had a higher level of gluten, spread the use of wheat for bread-making and led to the production of loaves that were more lightly textured than barley and durum wheat breads. A series of developments in technology for threshing, milling, and baking improved both the quantity and the quality of the grain and the means for preparation that were available from the beginning of the Iron Age until the end of the Second Temple period. In the early Iron Age, grain was threshed to remove it from the stalks by beating it with sticks or by oxen treading on it. This usually broke most of the grain kernels, which limited their storage time because broken kernels spoil more quickly than unbroken ones. The development of the threshing-board, which was pulled over the stalks by oxen, left most of the grain kernels intact and enhanced their storage time. Numerous threshing floors and threshing boards have been discovered at archaeological sites of ancient Israel. 
Once separated from the stalks, the grain was used in a number of ways: Most simply, unripe kernels of grain were eaten fresh, particularly in the spring, before ripe grain was available, and both unripe and ripe grain was roasted over fire for immediate use. Ripe grains of wheat were also parboiled and dried, like modern bulgur, and then prepared as porridge. Whole or cracked grain was also used in stews and to make gruel. Most frequently, grains were ground into flour to prepare bread. Bread was the main source of nourishment in biblical times, and making bread was a daily activity: Bread-making began with the milling of the grain. It was a difficult and time-consuming task performed by women. Each household stored its own grain, and it is estimated that it required at least three hours of daily effort to produce enough flour to make sufficient bread for a family of five. The earliest milling was performed with a pestle and mortar, or a stone quern consisting of a large lower stone that held the grain and a smooth upper stone that was moved back and forth over the grains (Numbers 11:8). This often left small pieces of grit in the flour. The use of the millstone became more widespread during the Iron Age, resulting in greater speed and increased production of flour. Smaller versions for household use, the rotary or beehive quern, appeared during the early Persian period. After the grain was milled into flour, it was mixed with water and kneaded in a large trough. For dough made with wheat flour, starter, called seor, was added. The starter was prepared by reserving a small portion of dough from a previous batch to absorb the yeasts in the air and thus help leaven the new dough. Seor thus gave the bread a sourdough flavor. Once prepared, the dough could be baked in various ways: Originally, the dough was placed directly on the heated stones of a cooking fire or in a griddle or pan made of clay or iron (Leviticus 7:9). In the time of the First Temple, two types of oven were used for baking bread: the jar-oven, and the pit-oven. The jar-oven was a large pottery container, narrowing into an opening toward the top; fuel was burned on the inside to heat it and the dough was pressed against the outside to bake. The pit-oven was a clay-lined excavation in the ground in which the fuel was burned and then pushed aside before the loaves were baked on the heated surface. People also began placing a convex dome, initially earthenware and later metal, over the pit-oven and cooking the flatbreads on the dome instead of on the ash-covered surface; this type of oven is probably what was meant by the biblical machabat, often translated as "griddle". The Persians introduced a clay oven called a tanur (similar to the Indian tandoor), which had an opening at the bottom for the fire, and through which the bread was placed to be baked on the inner wall of the upper chamber from the heat of the oven and ashes after the flames had died down. This continued to be the way in which Yemenite Jews baked bread until modern times. The remains of clay ovens and fragments of bread trays have been found in several archaeological excavations. All these methods produced only thin loaves, and the custom was thus to break bread rather than cut it. The bread was soft and pliable and used for dipping and sopping up gravies and juices. The Romans introduced an oven called a furn (purni in Talmudic Aramaic), a large, wood-burning, stone-lined oven with a bottom on which the dough or baking sheet was placed. 
This provided a major advance in bread and pastry baking, and made the baking of much thicker loaves possible. A variety of breads were produced. Probably most common were unleavened flat loaves called ugah or kikkar. Another type was a thin wafer, known as a rakik. A thicker loaf, known as hallah, was made with the best-quality flour, usually for ritual purposes. Bread was sometimes enriched by the addition of flour from legumes (Ezekiel 4:9). The Mishna (Hallah 2:2) mentions bread dough made with fruit juice instead of water. The sugar in the juice, interacting with the flour and water, provided some leavening and sweetened the bread. The Israelites also sometimes added fennel and cumin to bread dough for flavor and dipped their bread in vinegar (Ruth 2:14), olive oil, or sesame oil for extra flavor. After grain, legumes such as lentils, broad or fava beans, chickpeas, and peas were the main element in the diet and were the main source of protein, since meat was rarely eaten. Broad beans, chickpeas, and lentils are the only legumes mentioned in the Bible, but lentils, broad beans, chickpeas, fenugreek, field peas and bitter vetch have been found at Iron Age Israelite sites. By the Roman period, legumes are mentioned frequently in other texts. They are cited as one of the elements of the "wife's food basket" in the Mishna (Ketubot 5:8), from which it is estimated that legumes supplied 17% of daily calories at that time. Lentils were the most important of the legumes and were used to make pottages and soups, as well as fried lentil cakes called ashishim, such as those that King David is described as distributing to the people when the Ark of the Covenant was brought to Jerusalem. According to Tova Dickstein, a researcher at Neot Kedumim in Israel, ashishim were honey-dipped pancakes made from crushed red lentils and sesame seeds. Stews made of lentils or beans were common, and they were cooked with onion, garlic, and leeks for flavor. Fresh legumes were also roasted, or dried and stored for extended periods; they were then cooked in a soup or a stew. The Bible mentions roasted legumes (2 Samuel 17:28), and relates how Jacob prepared bread and a pottage of lentils for Esau (Genesis 25:29–34). Vegetables are not found often in the archaeological record, and it is difficult to determine the role that they played, because plant foods were often eaten raw or simply boiled, without requiring special equipment for preparation, and thus barely leaving any trace other than the type of food itself. Vegetables also are not mentioned often in the written record, and when the Bible does mention them, the attitude is mixed: sometimes they are regarded as a delicacy, but more often they are held in low esteem (for example, Proverbs 15:17 and Daniel 1:11–15). Vegetables were perhaps a more important food at the extremes of society: the wealthy, who could afford to dedicate land and resources to grow them, and the poor, who depended on gathering them in the wild to supplement their meager supplies. More people may have gathered wild plants during famine conditions. Vegetables that were commonly eaten included leeks, garlic, onions, black radishes, melons (sometimes misidentified as the cucumber) and watermelons. Other vegetables played a minor role in the diet of the ancient Israelites. Field greens and root plants were generally not cultivated and were gathered seasonally when they grew in the wild. Leafy plants included dandelion greens and the young leaves of the orach plant. 
Leeks, onions, and garlic were eaten cooked in stews, and uncooked with bread, and their popularity may be indicated by the observation in the Bible that they are among the foods that the Israelites yearned for after leaving Egypt. Gourds and melons were eaten raw or flavored with vinegar. Black radishes were also eaten raw when in season during the autumn and winter. The Talmud mentions the use of radish seeds to produce oil and considered eating radishes to have health benefits. Wild herbs were collected and eaten uncooked or cooked. These are known to have included garden rocket and mallow, and both leaf chicory and endive. Wild lettuce, known as chazeret, was a leafy herb with prickly, red tinged leaves that became bitter as they matured. It was cultivated from around 800 BC. Sweeter head-lettuce was only developed and introduced by the Romans. Bitter herbs eaten at the Passover sacrifice with the unleavened bread, matza, were known as merorim. Chazeret is listed in the Mishna (Pesahim 2:6) as the preferred bitter herb for this Passover ritual, along with other bitter herbs, including chicory or endive (ulshin), horehound (tamcha), reichardia or eryngo (charchavina), and wormwood (maror). Mushrooms, especially of the Boletus type, were gathered in many areas, particularly when plentiful after a major rainfall. The Talmud mentions mushrooms in connection with their exemption from tithes and as a dessert at the Passover seder. Sesame seeds were used in the preparation of oil, were eaten dry, or were added to dishes such as stews as a flavoring; the leftovers after pressing out the oil were eaten in a cake form. The Hebrew for sesame, shumshum, is related to the Akkadian samassammu, meaning "oil plant", as the seeds contain about 50% oil, which was pressed from the seeds. Sesame is not mentioned in the Bible, but the Mishna lists sesame oil as suitable for lighting the Sabbath lights, and the oil was also used for frying. Fruit was an important source of food for the Israelites, particularly grapes, olives, and figs. Grapes were grown mostly for wine, although some were eaten fresh at harvest time, or dried as raisins for storage, while olives were grown exclusively for their oil, until the Roman period. Other fruits that were eaten were the date, pomegranate, and sycamore fig. The ancient Israelites built terraces of leveled areas in the hill country for planting a variety of crops, including grains, vegetables, and fruit trees. All the trees, with the exception of the olive, produced fruit that could be eaten fresh or juiced while in season. Fruit was also processed for later use in a variety of ways: fruit with high sugar content was fermented to make alcoholic beverages; grapes were most commonly used for this. Fruit was also boiled down into thick, sweet syrup, referred to in the Bible as dvash (honey). Grapes, figs, dates, and apricots were also dried and preserved individually, put on a string, or pressed into cakes. Since dried fruit is an efficient source of energy, such were prepared as provisions for journeys and long marches. The sycamore fig, carob, mulberry, and possibly the apple were also eaten. Usually, these fruits were not cultivated but were picked in the wild when they were in season. The sycamore fig (Ficus sycamorus) was very common in the warmer parts of Israel and was grown primarily for its wood, but it provided a steady supply of small figs, eaten mainly by the poor. 
Other native trees producing fruits included the carob, which was probably popular due to its sweet taste, and the black mulberry. The tapuah, which means "apple" in modern Hebrew, is mentioned in the Bible, but it is not clear whether it referred to the apple or to another fruit, such as the apricot or the quince. Almonds, walnuts, and pistachios were eaten and are mentioned in the Bible. Almonds were widespread in the region from prehistoric times, and the Bible mentions almonds (shaked) and pistachios (botnim) as among the "choice fruits of the land" sent by Jacob as a gift to the ruler of Egypt (Genesis 43:11). Almonds and pistachios were probably eaten primarily by the wealthy. The walnut reached Israel from Mesopotamia by at least 2000 BCE and is mentioned once in the Bible (Song of Solomon 6:11). Walnuts became common during the Second Temple period and so widespread that the word for walnut, egoz, became the generic Hebrew word for nut at that time. The olive is one of the biblical Seven Species and one of the three elements of the "Mediterranean triad" in Israelite cuisine. Olive oil was used not only as food and for cooking, but also for lighting, sacrificial offerings, ointment, and anointment for priestly or royal office. The olive tree was well suited to the climate and soil of the Israelite highlands, and a significant part of the hill country was allocated to the cultivation of olive trees, which were one of ancient Israel's most important natural resources. Olive oil was more versatile and longer-lasting than the oil from other plants, such as sesame, and was also considered to be the best-tasting. Although olives were used to produce oil from the Bronze Age, it was only by the Roman period that techniques were introduced for curing olives in lye and then brine to remove their natural bitterness and make them edible. Olives were harvested in the late summer and were processed for oil by crushing them, pressing the mash, and separating the oil from the flesh. In the early Iron Age period, this was done by treading the olives in basins cut into rock, or with a mortar or stone on a flat slab. In the later Iron Age period, the introduction of the beam press made large-scale processing possible. The discovery of many ancient olive presses in various locations indicates that olive-oil production was highly developed in ancient Israel. The oil production center discovered at Ekron, a Philistine city, dating from the 7th century BC, has over one hundred large olive presses and is the most complete olive oil production center from ancient times yet discovered. It indicates that ancient Israel was a major producer of olive oil for its residents and other parts of the ancient Near East, such as Egypt, and especially Mesopotamia. In addition to the large-scale olive oil production for commerce and export, presses have been found in ordinary houses, indicating that this was also a cottage industry. Archaeological remains at Masada and other sites indicate that the most common olive cultivar was the indigenous Nabali, followed by the Souri. In Roman times, other olive cultivars were imported from Syria and Egypt. There is also some written information about olive oil. The Bible describes the use of olive oil in certain sacrifices (for example, Leviticus 6:13–14 and Leviticus 7:9–12). However, these sacrificial "recipes" can be assumed to represent some of the everyday uses of oil and methods for cooking and frying. 
Olive oil was mixed with flour to make bread in the story of Elijah and the widow of Zarephath (1 Kings 17:12–13) and is also noted as a valuable product for eating (Ezekiel 16:13,19). Olive oil is also mentioned on the Samaria and Arad ostraca. The consumption of olive oil varied with social class: it was less available to the poor, but it may have become more available later in the Israelite period as the means of production improved and became more widespread. By early Roman times, the Mishna indicates that it was one of the four essential foods that a husband had to provide his wife, and it has been calculated that at a minimum, this represented about 11 percent of the overall calories supplied by the "food basket" described at that time. Grapes are another of the biblical Seven Species and were used mainly for the production of wine, although they were also eaten fresh and dried. Grapes were dried in the sun to produce raisins, which could then be stored for a long time. Raisins were also pressed into clusters and dried as cakes, which kept the interior raisins softer. Grapes were also used to produce a thick, honey-like liquid, called grape honey (dvash anavim), that was used as a sweetener. Grape honey was made by treading the grapes in vats, but instead of fermenting the liquid produced, it was boiled to evaporate its water, leaving behind the thick, grape syrup. Figs were an important source of food. Figs were cultivated throughout the land of Israel, and fresh or dried figs were part of the daily diet. A common way of preparing dried figs was to chop them and press them into a cake. Figs are one of the biblical Seven Species and are frequently mentioned in the Bible (for example, 1 Samuel 25:18, 1 Samuel 30:12 and 1 Chronicles 12:41). The remains of dried figs have been discovered from as early as the Neolithic period in Gezer, Israel and Gilgal in the Jordan Valley. The fig tree (ficus carica) grew well in the hill country and produced two crops a season. Early-ripening figs were regarded as delicacy because of their sweetness and were eaten fresh. Figs ripening in the later harvest were often dried and strung into a chain, or pressed into hard round or square-shaped cakes called develah and stored as a major source of winter food. The blocks of dried fig were sliced and eaten like bread. The Mishna mentions figs as one of the components of the prescribed "wife’s food basket" and they are estimated to have constituted 16% of the overall calories of the basket. Dates were eaten fresh or dried, but were mostly boiled into thick, long-lasting syrup called "date honey" (dvash temarim) for use as a sweetener. This syrup was prepared by soaking the dates in water for some time until they disintegrated and then boiling the resulting liquid down into thick syrup. The honey in the Biblical reference of "a land flowing with milk and honey" is probably date honey. Fresh, ripe dates were available from mid to late summer. Some were sun-dried and pressed into blocks to dry completely and then used throughout the year, especially as food for travelers. Dates were also fermented into one of the "strong drinks" referred to in the Bible as "shechar". The date palm required a hot and dry climate and mostly grew and produced fruit in the Jordan Rift Valley from Jericho to the Sea of Galilee. In these arid areas, the date was sometimes the only plant-food available and was a primary component of the diet, but it was less important elsewhere. 
Pomegranates were usually eaten fresh, although occasionally they were used to make juice or wine, or sun-dried for use when the fresh fruit was out of season. They probably played a minor part in Israelite cuisine but were symbolically important as adornments on the hem of the robe of the high priest and the Temple pillars and embossed on coinage; they are also listed in the Bible as one of the Seven Species of the Land of Israel. The Israelites usually ate meat from domesticated goats and sheep. Goat’s meat was the most common. Fat-tailed sheep were the predominant variety of sheep in ancient Israel, but, as sheep were valued more than goats, they were eaten less often. The fat of the tail was considered a delicacy. Beef and venison were eaten primarily by the elites, and fattened calves provided veal for the wealthy (for example, as mentioned in the Bible, Amos 6:4). For most people, meat was eaten only a few times a year when animals were slaughtered for the major festivals, or at tribal meetings, celebrations such as weddings, and for the visits of important guests (1 Samuel 28:24). Only at the king's table was meat served daily, according to the Bible. Although most meat was obtained from domesticated animals, meat from hunted animals was also sometimes available, as the story of Isaac and Esau (Genesis 27:3–4), certain Biblical lists (for example, Deuteronomy 14:5), and archaeological evidence indicate. The remains of gazelle, red deer, and fallow deer are the most commonly found in the archaeological record. Archaeological evidence from an Iron-Age market excavated at Ashkelon shows that game was also sold to those who could not hunt or trap them themselves. However, meat from wild animals was more common at times of economic distress and in the northern areas, where forests and open land provided a habitat for more wild animals. Meat was prepared in several different ways. The most common was to cook it with water as a broth or a stew (for example, Ezekiel 24:4–5). Meat stewed with onions, garlic, and leeks and flavored with cumin and coriander is described on ancient Babylonian cuneiform tablets, and it is most likely that it was prepared similarly in ancient Israel. Stewed meat was considered to be a dish worthy of serving to honored guests (Judges 6:19–20). A less common way to prepare meat was to roast it over an open fire, but this was done particularly for the meat of the Passover lamb. For long-term storage, meat was smoked, dried, or salted, according to indications in texts and ethnographic studies. The Israelites ate domesticated birds such as pigeons, turtledoves, ducks, and geese, and wild birds such as quail, and partridge. Remains from archaeological excavations at the Ophel in Jerusalem and other Iron-Age sites show that domestic birds were available, but consumption was small. The inclusion of pigeons and turtledoves in the Biblical sacrifice lists implies that they were raised domestically, and the remains of dovecotes discovered from the Greek and Roman periods confirm this. Biblical references and archaeological evidence also demonstrate that wild birds were hunted and eaten. The turtledove was present from about April to October, while the rock pigeon was available throughout the year. The pigeon appears to have been domesticated in Sumeria and Canaan during the second millennium BC, and remained the predominant fowl in ancient Israel until the end of the Second Temple period. 
Nonetheless, to avoid the spread of disease, pigeons could only be raised in small numbers and were thus fairly costly and not a regular part of the diet. Geese, originally domesticated in ancient Egypt, were raised in ancient Israel. They are most likely the "fattened fowl" on King Solomon’s table (1 Kings 5:3). Goose breeding is also discussed in the Mishna. Like other animals, birds were fattened for consumption on special occasions, and for the wealthy. It is unclear when chicken became part of the diet. There are some archaeological remains from Iron-Age sites, but these were likely from roosters as a fighting bird, which are also pictured on seals from the period as a symbol of ferocity, such as on the 6th-century BC onyx seal of Jaazaniah. Chicken became common around the 2nd century BC, and during the Roman period, chickens emerged as an important feature of the cuisine, with the Talmud describing it as "the choicest of birds." By Roman times, pigeons and chickens were the principal poultry. Until the domestication of the chicken, eggs were available in limited quantities and were considered a delicacy, as in ancient Egypt. The most common birds—turtledoves and pigeons—were reared for their meat and not for their very small eggs. Biblical references to eggs are only in reference to gathering them from the wild (for example, Deuteronomy 22:6–7 and Isaiah 10:14). Eggs seem to have increased in use for food only with the introduction of chickens as food and were commonly used as food by Roman times. The Israelites ate a variety of fresh and saltwater fish, according to both archaeological and textual evidence. Remains of freshwater fish from the Yarkon and Jordan rivers and the Sea of Galilee have been found in excavations, and include St. Peter’s fish and mouthbreeders. Saltwater fish discovered in excavations include sea bream, grouper, meager, and gray mullet. Most of these come from the Mediterranean Sea, but in the later Iron Age period, some are from the Red Sea. Although the Torah prohibits the consumption of fish without fins or scales, archeological evidence indicates that many Israelites flouted or were unaware of these restrictions and ate non-kosher seafood, mostly catfish but also shark, eel, and ray, and that religious restrictions on seafood began to be observed more strictly starting in the first century CE. Fishermen supplied fish to inland communities, as remains of fish, including bones and scales, have been discovered at many inland sites. To preserve them for transport, the fish were first smoked or dried and salted. Merchants also imported fish, sometimes from as far as from Egypt, where pickled roe was an export article. Remains of Nile Perch from Egypt have been found, and these must have been smoked or dried before being imported through the trade network that connected ancient Near Eastern societies. Merchants shipped fish to Jerusalem, and there was evidently a significant trade in fish; one of the gates of Jerusalem was called the Fish Gate, named for a fish market nearby (Zephaniah 1:10, Nehemiah 3:3, Nehemiah 12:39, Nehemiah 13:16, 2 Chronicles 33:14). It is unclear to what extent fish played a role in the cuisine, but it is apparent that fish became steadily more available during the Israelite and Judean monarchies. Fish products were salted and dried and sent great distances. 
However, even in the later Persian, Greek, and Roman periods, the cost of preserving and transporting fish must have meant that only wealthier inhabitants of the highland towns and cities could afford it, or those who lived close to the sources, where it was less expensive. In the Galilee, small-scale fishing was a fundamental component of the agrarian economy. Goats and, to a lesser extent, sheep provided milk for part of the year, and milk and dairy products were a significant source of food. Dairy products are mentioned in the Bible (for example, Genesis 18:8, Judges 4:19, and 2 Samuel 17:29), and a repeated description of the Land of Israel in the Bible is "a land flowing with milk and honey" (for example, Exodus 3:8, Exodus 33:3, and Joel 4:18). Fresh milk could not be stored for long without spoiling. Typically, thick sour milk called laban was drunk, because the Israelites stored the milk in skin containers, in which it curdled quickly. Milk had to be processed to preserve it. This was done by first churning it, using a goatskin or clay container, to separate the butterfat from the whey. The butterfat was processed by boiling and then cooling it to make clarified butter, which could then be stored for a long time. Clarified butter was used principally for cooking and frying. Butter churns dating from the 4th century BC have been excavated at Beersheba and other ancient Israelite sites. Goat milk and sheep's milk cheeses were the most prevalent types of cheese. Soft cheese was made using cloth bags filled with soured milk; the thin liquid was drained through the cloth until a soft cheese remained in the bag. A hard cheese was made from fermented soured milk: the milk was poured into numerous small cheese molds with holes for draining the whey, in which it curdled, and the cheese was then hardened by drying in the sun or by heating. Cheese is not mentioned often in the Bible, but in one case, David is sent to take a gift of cheese to the commander of the army (1 Samuel 17:18). The Mishna and Talmud mention using the sap of fruit trees, such as figs, to harden cheese (a method still used by nomadic herders of the region until modern times). Using fig sap instead of animal enzymes to make cheese also conformed to the prohibition on mixing meat and milk. Fruit syrup called dvash served as the primary sweetener and was most often made from dates. It was not until Talmudic times that the word "dvash", now translated as "honey", generally meant bee honey. The Biblical term dvash usually did not mean bee honey, but thick syrup obtained from grapes, figs, or dates. This syrup was similar to the date syrup, or halek, that many Jews continue to use in modern times. The Biblical references to "honey from the crag" (Deuteronomy 32:13) or "honey from the rock" (Psalms 81:17) could refer either to fig honey, as fig trees commonly grew in rocky outcrops, or to honey collected from wild bees, which made their nests in these places, as they still do in the region today. The Bible refers to honey from bees in only a few instances, for example, when Samson eats honey which bees made in the carcass of a lion (Judges 14:8–9) and when Jonathan eats honey from a honeycomb (1 Samuel 14:25–27), and these references are to honey obtained from the wild. Nonetheless, the oldest archaeological find relating to beekeeping discovered to date is an apiary dating from about 900 BC at Rehov, a Bronze-Age and Iron-Age site in the Jordan Valley. 
The hives, made of straw and unbaked clay, could have housed more than a million bees, and indicate that honey was produced on a large scale. It is most likely that the inhabitants of Tel Rehov imported the bees from Anatolia, because they were less aggressive than the local bees and produced a higher yield of honey. It is also possible that the domestication of bees for honey production was introduced from Egypt during the Iron Age, and honey was being obtained from domesticated bees from late in the Iron Age period. The most common and important seasoning was salt (Job 6:6), demonstrated by how it is referenced throughout the Bible, and by how its use was mandated with most sacrifices (Leviticus 2:13). Salt was obtained from the Mediterranean or the Dead Sea. It was produced by evaporating seawater from both natural and artificially created drying pans along the Mediterranean coast. It was also obtained by mining salt deposits, such as at Sodom near the Dead Sea. Salt had to be transported to other locations, so most communities had to purchase it. Food was also flavored by plants, most native to the region and either cultivated, or gathered in the wild, although a few spices were imported. Garlic, onions, and possibly fenugreek were used to season cooked foods, as well as being eaten as vegetables. Herbs and spices included capers, coriander, cumin, black cumin, dill, dwarf chicory, hyssop, marjoram, mint, black mustard, reichardia, saffron, and thyme. Some seasonings were imported, such as myrrh, galbanum, saffron, and cinnamon, but their high cost limited their widespread use. Spices for special feasts were imported by the wealthy and royalty from Arabia and India and were highly valued. These included various types of pepper, and ginger. Another seasoning was vinegar, which was produced by extra fermentation of new wine. It was used for seasoning foods, pickling vegetables, and medicinal purposes. Drinks The Israelites usually drank water drawn from wells, cisterns, or rivers. They also drank milk (for example, as mentioned in the Bible in Judges 5:25), often in the form of sour milk, thin yogurt, or whey, when it was available in the spring and summer. They drank fresh juices from fruits in season as well. The most strongly preferred beverage was wine, although some beer may have also been produced, and wine was an important part of the diet and a source of calories, sugar, and iron. Making wine was also a practical way to preserve fruit juices for long-term storage. Usually, wine was made from grapes for everyday use, as well as for rituals, such as sacrificial libations. Less often, wine was made from pomegranates and dates. The Mediterranean climate and soil of the mountainous areas of the area are well suited to viticulture, and both archaeological evidence and written records indicate the significant cultivation of grapes in ancient Israel and the popularity of wine-drinking. The production capacity apparent from archaeological remains and the frequent biblical references to wine suggest that it was the principal alcoholic beverage of the ancient Israelites. Based on the remains of wine production facilities and storage rooms, it has been estimated that on average, people could have consumed one liter of wine per person per day. Many rock-hewn winepresses and vats, dating to the biblical period, have been found. One typical example at Gibeon has a wide surface for treading the grapes and a series of collecting vats. 
Archaeological finds at Ashkelon and Gibeon indicate large-scale wine production in the 8th and 7th centuries BC, which most likely developed to supply the Assyrian empire, and then the Babylonians, as well as the local population. Vineyards are mentioned many times in the Bible, including in detailed descriptions of the method for establishing a vineyard (Isaiah 5:1–2) and the types of vines (Ezekiel 17:6–8). The Bible refers to several types of wine, and one of the Arad ostraca also mentions wine among the supplies being sent to a garrison of soldiers. Another indication of the importance of wine in ancient Israel is that Hebrew contains numerous terms for various stages and types of vines, grape varieties, and words for wine. The word yayin was used both as a generic word for wine and as a term for wine in its first year, once it had undergone sufficient fermentation from the initial stage, when it was called tirosh. The type of wine was determined by the grapes, the time allowed for fermentation, and the age of the wine. The often coarse and unrefined taste of ancient wine was adjusted to make it more drinkable. Spices were added directly to the wine to improve the aroma, and other ingredients, such as honey, pepper, herbs, and even lime, resin, or seawater were added to improve the flavor or disguise a poor-tasting wine. Wine was also sweetened by the addition of grape juice syrup. Wine was also sometimes given an aroma by rubbing the winepress with wood resin. Wine could also be added to drinking water to improve the taste, especially towards the end of the summer when rainwater had been standing in a cistern for at least six months. This also had the beneficial effect of lowering the bacterial content of the water. After the grape harvest in mid-summer, most grapes were taken to wine presses to extract their juice for winemaking. Once fermented, wine was transferred to wineskins or large amphorae for storage. Israelite amphorae were typically tall with large handles and little decoration, and the handles were often inscribed with the name of the city in which the wine had been produced, the winemaker’s stamp, and sometimes the year and the vintage. Amphorae made long-term storage possible, especially in caves or cool cellars. Glass bottles were introduced only in the 1st century AD by the Romans. The insides of amphorae were often coated with a preservative resin, such as from the terebinth, and this imparted a pine flavor and aroma to the wine. Before the jars were sealed with pitch, they were filled completely and often topped with a thin layer of olive oil to prevent spoilage due to exposure to air. During the Greek period, the style of winemaking changed. Ripe grapes were first dried to concentrate the sugars, and these then produced a much sweeter and higher alcohol content wine that needed to be diluted with water to be drinkable. Before this, watered-down wine was disparaged, but by the time of the Talmud, wine that did not require dilution with water was considered unfit for consumption. Beer, produced by brewing barley, was another alcoholic beverage common in the ancient Near East. Beer was the primary beverage of ancient Egypt and Mesopotamia and it can be assumed that in Israel, which is located between the two, beer was also known. The biblical term sekhar may refer to beer or to alcoholic drinks in general. 
The production of bread and beer was closely linked, since barley was the key ingredient used for both, and most of the tools used in beer production, such as mortars, querns and winnowing baskets, were also the same as those used for bread making. Archaeological evidence specific to beer making is thus uncommon, and earlier indications were that the ancient Israelites did not often drink beer. More recently, Iron-Age sites in Israel have produced remains such as beer jugs, bottles, strainers and stoppers, all of which provide evidence that the Israelites drank beer. Nonetheless, the widespread cultivation of grapes, used primarily for winemaking, indicates that wine drinking was probably far more common than beer drinking. Cooking, storage, and preservation Storing water and food was critical for survival, particularly being able to store enough food to last from one harvest to the next. To protect grain from damp conditions and vermin, underground granaries were used for the bulk storage of grain. Families also stored grain, wine and oil in large pottery jars in their houses. When well protected, wheat, barley, legumes and nuts could be kept for long periods. Rainwater from roofs and courtyards was collected in cisterns to supplement natural sources like springs and wells. Fermentation, oil extraction and drying were all ways of converting food into products that could be stored. Feeding crops to animals was also a means of "storage on the hoof", with the animals converting the fodder into meat or milk. Food was cooked in clay pots placed on earthenware stands built in a horseshoe shape, so that the fire could be lit through the opening under the pot; alternatively, pots were suspended above the fire from tripods. Cooked food included soups and stews that were a mixture of meat and vegetables. Beans and lentils were likely to have been cooked several times a week. However, vegetables such as melons, garlic, leeks and onions were also eaten uncooked. Culinary customs Meals eaten by the Israelites fell into two categories: daily meals, and festive or ritual meals. Daily meals were prepared by women. Two daily meals were usually eaten by the family, either in the home or in the field. The first meal was eaten in the late morning, as a break in the workday, and could include roasted grain, olives, figs or some other fruit, bread dipped in olive oil or vinegar, or bread eaten with garlic, onions, or black radishes for flavor, and water or wine. A description in the Book of Ruth provides an example of this kind of meal: the harvest workers eat bread dipped in vinegar and parched or roasted grain (Ruth 2:14). Agricultural workers, who comprised the largest part of the population, also ate a light meal in the early morning before leaving for their work in the fields (Proverbs 31:15). The second meal was the main meal of the day and was eaten in the evening. In addition to bread, it typically included soup or a stew of vegetables or legumes, served in a common pot into which everyone dipped their bread. Also served from time to time were cheese and fruits such as fresh figs and melon when in season, as well as dried fruits. Water, wine, and milk could also accompany the meal. Small bowls were used for both eating and drinking. Small jugs contained condiments like olive oil, vinegar, and sweeteners. Wide-mouthed pitchers held water and milk, while spouted decanters with narrow, ridged necks and built-in strainers held wine. 
Festive meals were held to mark significant occasions, entertain important guests, or as sacrificial or ritual meals. The meal was prepared by both men and women. Meat was always served at these meals and many people participated so that there would be no leftovers that would go to waste. Ritual feasts and banquets in ancient Israel, and the ancient Near East in general, were important for building social relationships and demonstrating status, transacting business and concluding agreements, enlisting divine help, or showing thanks, devotion or propitiation to a deity, and for conveying social instruction. These meals were imbued with significance by the occasion and were a time for entertainment and enjoyment. Festive meals were held only from time to time, but they are the ones recorded by biblical and extra-biblical sources. Many biblical stories are set within the context of a meal, such as the accounts of the food Abraham prepares for his visitors (Genesis 18:1–8), the stew which Jacob prepares for his father, Isaac, and the Passover meal (Exodus 12). In the story of Abraham hosting the three visitors, Abraham offers cakes, a well prepared young calf, curds, and milk. This meal has similar elements to an earlier meal described in the story of Sinuhe, an Egyptian nobleman who lived for a time in Canaan around 1900 BCE, at which bread, wine, cooked meat, roast fowl, and dairy products were served. One of the distinguishing features of the meals of the wealthier social class, as illustrated in the stories of Abraham and Sinuhe, was the more frequent consumption of meat. A description of the provisions for Solomon's kitchen also illustrates this: "Solomon's daily provisions consisted of 30 kor of fine flour and 60 kor of flour, 10 fat oxen, 20 pasture-fed oxen, and 100 sheep and goats, in addition to deer and gazelles, roebucks and fattened geese" (1 Kings 5:2–3). This account describes the provisions that were possible to obtain for those with the resources to purchase them and indicates they were sufficient to provide sumptuous meals for thousands of people. Another example of a lavish meal celebrating an important occasion is the inauguration of the Temple by Solomon (1 Kings 8:65, 2 Chronicles 7:8). Similar meals are described regarding Hezekiah's temple consecration (2 Chronicles 29:31–35) and Passover celebration (2 Chronicles 30:23–24). In contrast to the simplicity of the daily fare of ordinary people, the cuisine of the royal courts of the ancient Near East was sophisticated, and it is assumed that the dishes served at the table of King Solomon and other Israelite kings were also elaborate. King David had officials who were in charge of wine cellars, olive stores, cattle, olive and fig trees (1 Chronicles 27:27–31) and the royal kitchen was a complex organization. The kings of Israel are recorded as having displayed an extraordinary measure of royal hospitality, like other kings of the ancient Near East who held elaborate banquets. Solomon’s royal table is described as providing such a variety of foods that the Queen of Sheba is said to have been amazed that the reports of Solomon’s wealth did not exceed what she had seen (1 Kings 10:4–7). Royal entertainment in Israel included music (Ecclesiastes 2:8), large numbers of guests (1 Kings 18:19), and presumably many servers and cupbearers, though these are not expressly mentioned in the Bible. 
Feasts and banquets were important social and political tools throughout Israel’s history, especially in the early years of the Israelite monarchy, when an invitation to the king’s table was important for creating and maintaining political support and was also an important marker of social status and influence. Regular meals too, developed as expressions of common identity, social unity and communal celebration. By the Roman period, Jewish communities came together at banquets for both food and company and the weekly Sabbath meal was an occasion for families to gather and enjoy both food and company. The practice of hospitality was a fundamental custom of Israelite society and serving food was integral to the hosting of guests. Additionally, in ancient Israel, the belief that God had delivered Israel from slavery resulted in the social imperative and religious commandment to look after guests and strangers as an act of recognition and gratitude. The importance of hospitality to the Israelites can be inferred from the texts of the Bible, in numerous instances, including the stories of Abraham hosting the messengers, Gideon’s call to leadership (Judges 6:19), the hospitality of the woman from Zarephath towards the Prophet Elijah (1 Kings 17:8–16) and the Shunammite woman towards Elisha (2 Kings 4:8–11), David’s hosting of Mephiboshet, son of Jonathan (2 Samuel 9:6–7) and Hezekiah’s invitation to the people of the northern kingdom of Israel to celebrate the Passover in Jerusalem (2 Chronicles 30). Meals at which important guests were present were viewed as special occasions, and as such, meat was served. The order in which the guests were served indicated the recognition of the social status of the guest. The choice of meat and dishes indicated the importance of the occasion. The Bible illustrates this in relating how Samuel hosted Saul, who, seated at the head of the hall is served first with a portion of meat that has been especially reserved for him (1 Samuel 9:22–24). Certain parts of the animal, such as the breast and the right thigh, were considered to be the best portions and were reserved for the most honored participants in the meal. Guests were always served before family members. The host would also sit with the guests to encourage them to eat and see to all their needs, as related in the story of Abraham, who waited on his visitors while they ate. Sacrificial meals were eaten when a portion of a sacrifice was reserved for the priest (kohen) or the ordinary Israelite who brought the offering was permitted to eat a portion with his family at a festive meal. The offerings considered "most holy" were eaten by the males of the priests in the court of the Temple sanctuary (Leviticus 7:9–10). The meal was considered to be a part of the priest’s duties. Other offerings could be eaten by the priests with their families in any ritually clean place (Leviticus 10:14). The ordinary Israelite had to eat his share within a fixed time, with his family, guests, and any Levites and strangers that he invited. Depending on the type of sacrifice, the animals that were brought as sacrifices could be a lamb, kid, goat, ram, calf, bull or cow; bird offerings were doves and turtledoves (pigeons). Of these, the guilt offering (asham) (Leviticus 5) and the communal peace offering (shalmei tzibur) (Leviticus 23:19–29) were eaten only by the male priests (kohanim). 
Other offerings, such as the Firstborn offering (Numbers 18:17–18), could be eaten by the priests and other members of their households, while for the personal peace offering (shalmei yachid) (Leviticus 3) and Thanksgiving offering (Leviticus 7:31–34), the breast and thigh meat were eaten by the priests and other members of their households and the remainder by ordinary Israelites. The Tithe offering (Leviticus 27:32) could be eaten by anyone and the Passover offering (Exodus 12) was eaten by all who had purchased a share in the sacrifice. Meal offerings called mincha all consisted primarily of flour and were either completely or partially burned on the altar. Those not entirely burned on the altar were eaten by the priests. Some mincha offerings were fried or baked before being offered. Types of mincha included fine flour (solet) mixed with oil and of which a portion was given to the kohen; flour mixed with oil and fried on a griddle or on a pan; bread called challot mixed with oil and baked in an oven; and wafers (rekikim) smeared with oil baked in an oven. There were also baked goods, all made of wheat flour and baked in an oven, which were not burned on the altar. These were the twelve unleavened and specially shaped showbreads, eaten by the priests after they had been displayed; two loaves of leavened bread prepared for the festival of Shavuot and eaten by the priests; thanksgiving breads, which included leavened bread, unleavened bread, unleavened wafers and scalded loaves, with one of each kind given to the priests and the remainder eaten by the owner and guests; and the unleavened loaves and wafers accompanying the Nazirite’s sacrificial ram, one of each kind given to the priests and the remainder eaten by the Nazirite and guests. Whole extended families or clans also participated in a sacrifice that was offered on occasions such as the New Moon, and it is referred to as both the "sacrifice of days" and a kinship sacrifice. In the early Israelite period, before the centralization of sacrificial offerings as an exclusive part of the Temple services, these sacrifices were offered at various locations. David is described as leaving Saul’s table to participate with his family in Bethlehem (1 Samuel 20:6) and Elkanah goes to Shiloh to participate with his household in the annual sacrifice (1 Samuel 1:21). Perhaps the oldest and most important feast celebrated by the Jews is the Passover. The original feast, with its origins in the story of the Exodus, consisted of a sacrificial lamb, bitter herbs and unleavened bread eaten by each family at home. Under the Israelite monarchy, and with the establishment of the Temple in Jerusalem, the sacrifice and celebration of Passover became centralized as one of the three pilgrimage festivals. Families who were able to travel to Jerusalem ate the Passover meal together in Jerusalem. Those who could not make the pilgrimage celebrated the holiday by holding a special meal and observing the Feast of Unleavened Bread. Prohibitions There are no biblical lists containing forbidden plants, so it can be assumed that any plant or fruit was permissible as food, with their use limited only by taste or toxicity (for example, 2 Kings 4:39–40) and the fulfillment of religious requirements such as the tithes. 
In addition to requiring that certain foods be eaten for sacred purposes, the Israelite diet was shaped by religious practices which prohibited the consumption of certain foods, both in terms of the animals permissible for eating and the manner of their preparation. The cuisine of the Israelites thus differed from that of their neighbors in significant ways. For example, ancient Mesopotamian recipes describe foods cooked with animal blood and milk added to meat stews; this would have been avoided by the ancient Israelites. Only animals specifically slaughtered for food or for use in the sacrificial service could be eaten. Detailed lists of which animals, birds, and fish could be eaten and which were prohibited appear in the Bible (Leviticus 11 and Deuteronomy 14:3–21), and animal bones found in the archaeological record tend to support this, with some exceptions. For the Israelites, food was one means of self-definition. While it is impossible to know to what extent dietary laws were observed, self-definition is most likely the basis for certain biblical lists of different kinds of animals permitted or forbidden for consumption. The taboo against eating certain animals, particularly the pig, may have developed from the early Iron Age. Archaeological evidence from various sites shows that the consumption of pork, while limited, was higher in the early Iron Age but had mostly disappeared during the later Iron Age. Sites in the highlands show low levels of pig utilization in the early Iron Age, but on the coastal plain, excavations such as those at Ekron show a higher consumption of pig; this is usually associated with the arrival of the Philistines. However, even at Philistine sites, pig remains were a small proportion of the bones discovered, and these decline after the initial period of settlement. This may have been due to environmental factors unsuitable for raising pigs. In archaeological excavations at Mount Ebal in Samaria, dating from the period immediately after the Israelite conquest, the animal bones discovered were only from animals considered permissible, such as cattle, sheep, goats, and deer. In addition, some taboos did not relate to the source of the food but to the way in which it was prepared, as in the prohibition against boiling a young goat in its mother's milk (mentioned in the Bible in three separate instances: Exodus 23:19, Exodus 34:26, and Deuteronomy 14:21). Milk and its by-products served as offerings in Near-Eastern pagan worship to gods and kings. Milk was associated with the phenomenon of reproduction, and a goat kid would be cooked in its mother's milk. Thus, the Israelite practice was to avoid an act similar to that carried out by the Canaanites as part of their cult worship (Ezra 9:1). The Israelites believed that since an animal's blood represented its life, its blood should not be consumed (Deuteronomy 12:23–24). The blood of a slaughtered animal was thus drained before the meat was used, and the blood itself was not used as a cooking liquid or drink.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Planner_(programming_language)] | [TOKENS: 1458]
Contents Planner (programming language) Planner (often seen in publications as "PLANNER" although it is not an acronym) is a programming language designed by Carl Hewitt at MIT, and first published in 1969. First, subsets such as Micro-Planner and Pico-Planner were implemented, and then essentially the whole language was implemented as Popler by Julian Davies at the University of Edinburgh in the POP-2 programming language. Derivations such as QA4, Conniver, QLISP and Ether (see scientific community metaphor) were important tools in artificial intelligence research in the 1970s, which influenced commercial developments such as Knowledge Engineering Environment (KEE) and Automated Reasoning Tool (ART). Procedural approach versus logical approach The two major paradigms for constructing semantic software systems were procedural and logical. The procedural paradigm was epitomized by Lisp, which featured recursive procedures that operated on list structures. The logical paradigm was epitomized by resolution-based derivation (proof) finders that used a uniform proof procedure. According to the logical paradigm it was "cheating" to incorporate procedural knowledge. Procedural embedding of knowledge Planner was invented for the purposes of the procedural embedding of knowledge and was a rejection of the resolution uniform proof procedure paradigm. Planner was a kind of hybrid between the procedural and logical paradigms because it combined programmability with logical reasoning. Planner featured a procedural interpretation of logical sentences, whereby an implication of the form (P implies Q) can be procedurally interpreted using pattern-directed invocation: antecedently (forward chaining), so that when P is asserted, Q is asserted as well, or consequently (backward chaining), so that when Q is to be established as a goal, P is set up as a subgoal (a small illustrative sketch of these two readings is given at the end of this article). In this respect, the development of Planner was influenced by natural deductive logical systems (especially the one by Frederic Fitch). Micro-planner implementation A subset called Micro-Planner was implemented by Gerry Sussman, Eugene Charniak and Terry Winograd and was used in Winograd's natural-language understanding program SHRDLU, Eugene Charniak's story understanding work, Thorne McCarty's work on legal reasoning, and some other projects. This generated a great deal of excitement in the field of AI. It also generated controversy because it proposed an alternative to the logic approach that had been one of the mainstay paradigms for AI. At SRI International, Jeff Rulifson, Jan Derksen, and Richard Waldinger developed QA4, which built on the constructs in Planner and introduced a context mechanism to provide modularity for expressions in the database. Earl Sacerdoti and Rene Reboh developed QLISP, an extension of QA4 embedded in INTERLISP, providing Planner-like reasoning embedded in a procedural language and developed in its rich programming environment. QLISP was used by Richard Waldinger and Karl Levitt for program verification, by Earl Sacerdoti for planning and execution monitoring, by Jean-Claude Latombe for computer-aided design, by Nachum Dershowitz for program synthesis, by Richard Fikes for deductive retrieval, and by Steven Coles for an early expert system that guided use of an econometric model. Computers were expensive. They had only a single slow processor and their memories were very small by comparison with those of today. 
Micro-Planner implementation A subset called Micro-Planner was implemented by Gerry Sussman, Eugene Charniak and Terry Winograd and was used in Winograd's natural-language understanding program SHRDLU, Eugene Charniak's story understanding work, Thorne McCarty's work on legal reasoning, and some other projects. This generated a great deal of excitement in the field of AI. It also generated controversy because it proposed an alternative to the logic approach that had been one of the mainstay paradigms for AI. At SRI International, Jeff Rulifson, Jan Derksen, and Richard Waldinger developed QA4, which built on the constructs in Planner and introduced a context mechanism to provide modularity for expressions in the database. Earl Sacerdoti and Rene Reboh developed QLISP, an extension of QA4 embedded in INTERLISP, providing Planner-like reasoning embedded in a procedural language and developed in its rich programming environment. QLISP was used by Richard Waldinger and Karl Levitt for program verification, by Earl Sacerdoti for planning and execution monitoring, by Jean-Claude Latombe for computer-aided design, by Nachum Dershowitz for program synthesis, by Richard Fikes for deductive retrieval, and by Steven Coles for an early expert system that guided use of an econometric model. Computers were expensive. They had only a single slow processor and their memories were very small by today's standards. So Planner adopted a number of efficiency expedients. The genesis of Prolog Gerry Sussman, Eugene Charniak, Seymour Papert and Terry Winograd visited the University of Edinburgh in 1971, spreading the news about Micro-Planner and SHRDLU and casting doubt on the resolution uniform proof procedure approach that had been the mainstay of the Edinburgh Logicists. At the University of Edinburgh, Bruce Anderson implemented a subset of Micro-Planner called PICO-PLANNER, and Julian Davies (1973) implemented essentially all of Planner. According to Donald MacKenzie, Pat Hayes recalled the impact of a visit from Papert to Edinburgh, which had become the "heart of artificial intelligence's Logicland," according to Papert's MIT colleague, Carl Hewitt. Papert eloquently voiced his critique of the resolution approach dominant at Edinburgh "…and at least one person upped sticks and left because of Papert." The above developments generated tension among the Logicists at Edinburgh. These tensions were exacerbated when the UK Science Research Council commissioned Sir James Lighthill to write a report on the AI research situation in the UK. The resulting report [Lighthill 1973; McCarthy 1973] was highly critical, although SHRDLU was favorably mentioned. Pat Hayes visited Stanford, where he learned about Planner. When he returned to Edinburgh, he tried to influence his friend Bob Kowalski to take Planner into account in their joint work on automated theorem proving. "Resolution theorem-proving was demoted from a hot topic to a relic of the misguided past. Bob Kowalski doggedly stuck to his faith in the potential of resolution theorem proving. He carefully studied Planner." Kowalski states, "I can recall trying to convince Hewitt that Planner was similar to SL-resolution." But Planner was invented for the purposes of the procedural embedding of knowledge and was a rejection of the resolution uniform proof procedure paradigm. Colmerauer and Roussel recalled their reaction to learning about Planner in the following way: "While attending an IJCAI convention in September ‘71 with Jean Trudel, we met Robert Kowalski again and heard a lecture by Terry Winograd on natural language processing. The fact that he did not use a unified formalism left us puzzled. It was at this time that we learned of the existence of Carl Hewitt’s programming language, Planner. The lack of formalization of this language, our ignorance of Lisp and, above all, the fact that we were absolutely devoted to logic meant that this work had little influence on our later research." In the fall of 1972, Philippe Roussel implemented a language called Prolog (an abbreviation for PROgrammation en LOGique – French for "programming in logic"). Prolog programs are generically of the form "to establish goal Q, establish subgoals P1, …, Pn", which is a special case of the backward chaining in Planner. Prolog duplicated several aspects of Micro-Planner, most notably its goal-directed, pattern-directed invocation of procedures (backward chaining). Prolog also duplicated capabilities of Micro-Planner that were pragmatically useful for the computers of the era because they saved space and time: the Unique Name Assumption and Negation as Failure. Use of the Unique Name Assumption and Negation as Failure became more questionable when attention turned to Open Systems. Other capabilities of Micro-Planner were omitted from Prolog; in particular, Prolog did not include logical negation, in part because it raises implementation issues. Consider, for example, a program consisting of the assertion not Q together with the clause "Q if P": such a program would be unable to prove not P, even though not P follows by the rules of mathematical logic.
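A minimal sketch of this limitation, in Python rather than Prolog, is given below. The knowledge base, the atom names, and the encoding of negation as failure are assumptions made for the illustration; they are not drawn from the Planner or Prolog literature.

```python
# A toy propositional backward chainer in the generic Prolog style "Q :- P1, ..., Pn",
# with negation as failure. Illustrative only; names and encoding are assumed.

RULES = {
    "q": [["p"]],     # q :- p.     (Q if P)
    "not_q": [[]],    # not_q.      ("not Q" encoded as an ordinary atom)
}

def prove(goal):
    """A goal succeeds if some clause with that head has a body whose goals all succeed.
    A goal of the form '\\+ g' succeeds iff g cannot be proven (negation as failure)."""
    if goal.startswith("\\+ "):
        return not prove(goal[3:])
    return any(all(prove(sub) for sub in body) for body in RULES.get(goal, []))

print(prove("q"))        # False: p is not provable, so q is not provable
print(prove("not_p"))    # False: backward chaining never applies modus tollens to q :- p
print(prove("\\+ p"))    # True, but only because p happens to be unprovable (failure),
                         # not because "not p" was derived from not_q and q :- p
```

Under a declarative reading, not Q together with "Q if P" entails not P, but the procedural reading only ever works backwards from a goal to the bodies of matching clauses.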
This is an illustration of the fact that Prolog (like Planner) is intended to be a programming language and so does not (by itself) prove many of the logical consequences that follow from a declarative reading of its programs. The work on Prolog was valuable in that it was much simpler than Planner. However, as the need arose for greater expressive power in the language, Prolog began to include many of the capabilities of Planner that were left out of the original version of Prolog.
========================================
[SOURCE: https://en.wikipedia.org/wiki/KHz] | [TOKENS: 1830]
Contents Hertz The hertz (symbol: Hz) is the unit of frequency in the International System of Units (SI), often described as being equivalent to one event (or cycle) per second.[a] The hertz is an SI derived unit whose formal expression in terms of SI base units is 1/s or s⁻¹, meaning that one hertz is one per second or the reciprocal of one second. It is used only in the case of periodic events. It is named after Heinrich Rudolf Hertz (1857–1894), the first person to provide conclusive proof of the existence of electromagnetic waves. For high frequencies, the unit is commonly expressed in multiples: kilohertz (kHz), megahertz (MHz), gigahertz (GHz), terahertz (THz). Some of the unit's most common uses are in the description of periodic waveforms and musical tones, particularly those used in radio- and audio-related applications. It is also used to describe the clock speeds at which computers and other electronics are driven. The units are sometimes also used as a representation of the energy of a photon, via the Planck relation E = hν, where E is the photon's energy, ν is its frequency, and h is the Planck constant. Definition The hertz is defined as one per second for periodic events. The International Committee for Weights and Measures defined the second as "the duration of 9192631770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom" and then added: "It follows that the hyperfine splitting in the ground state of the caesium 133 atom is exactly 9192631770 hertz, νhfs Cs = 9192631770 Hz." The dimension of the unit hertz is 1/time (T⁻¹). Expressed in base SI units, the unit is the reciprocal second (1/s). In English, "hertz" is also used as the plural form. As an SI unit, Hz can be prefixed; commonly used multiples are kHz (kilohertz, 10³ Hz), MHz (megahertz, 10⁶ Hz), GHz (gigahertz, 10⁹ Hz) and THz (terahertz, 10¹² Hz). One hertz (i.e. one per second) simply means "one periodic event occurs per second" (where the event being counted may be a complete cycle); 100 Hz means "one hundred periodic events occur per second", and so on. The unit may be applied to any periodic event—for example, a clock might be said to tick at 1 Hz, or a human heart might be said to beat at 1.2 Hz. The occurrence rate of aperiodic or stochastic events is expressed in reciprocal second or inverse second (1/s or s⁻¹) in general or, in the specific case of radioactivity, in becquerels.[b] Whereas 1 Hz (one per second) specifically refers to one cycle (or periodic event) per second, 1 Bq (also one per second) specifically refers to one radionuclide event per second on average. Even though frequency, angular velocity, angular frequency and radioactivity all have the dimension T⁻¹, of these only frequency is expressed using the unit hertz. Thus a disc rotating at 60 revolutions per minute (rpm) is said to have an angular velocity of 2π rad/s and a frequency of rotation of 1 Hz. The correspondence between a frequency f with the unit hertz and an angular velocity ω with the unit radians per second is ω = 2πf (equivalently, f = ω/2π). The hertz is named after Heinrich Hertz. As with every SI unit named after a person, its symbol starts with an upper case letter (Hz), but when written in full, it follows the rules for capitalisation of a common noun; i.e., hertz becomes capitalised at the beginning of a sentence and in titles but is otherwise in lower case.
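As a quick numerical check of the rotating-disc example and the Planck relation quoted above, here is a short Python sketch; the particular frequencies chosen are arbitrary examples.

```python
import math

# 60 revolutions per minute -> frequency in Hz and angular velocity in rad/s.
rpm = 60.0
f = rpm / 60.0                  # 1.0 Hz: one revolution per second
omega = 2 * math.pi * f         # omega = 2*pi*f, about 6.283 rad/s (i.e. 2*pi rad/s)
print(f, omega)

# Planck relation E = h*nu: photon energy equivalent of a few frequencies.
h = 6.62607015e-34              # Planck constant in J*s (exact in the 2019 SI)
for nu in (1e3, 1e6, 1e9):      # 1 kHz, 1 MHz, 1 GHz
    print(f"{nu:.0e} Hz -> E = {h * nu:.3e} J")
```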
History The hertz is named after the German physicist Heinrich Hertz (1857–1894), who made important scientific contributions to the study of electromagnetism. The name was established by the International Electrotechnical Commission (IEC) in 1935. It was adopted by the General Conference on Weights and Measures (CGPM) (Conférence générale des poids et mesures) in 1960, replacing the previous name for the unit, "cycles per second" (cps), along with its related multiples, primarily "kilocycles per second" (kc/s) and "megacycles per second" (Mc/s), and occasionally "kilomegacycles per second" (kMc/s). The term "cycles per second" was largely replaced by "hertz" by the 1970s.[failed verification] In some usage, the "per second" was omitted, so that "megacycles" (Mc) was used as an abbreviation of "megacycles per second" (that is, megahertz (MHz)). Applications Sound is a traveling longitudinal wave, which is an oscillation of pressure. Humans perceive the frequency of a sound as its pitch. Each musical note corresponds to a particular frequency. An infant's ear is able to perceive frequencies ranging from 20 Hz to 20000 Hz; the average adult human can hear sounds between 20 Hz and 16000 Hz. The range of ultrasound, infrasound and other physical vibrations such as molecular and atomic vibrations extends from a few femtohertz into the terahertz range[c] and beyond. Electromagnetic radiation is often described by its frequency—the number of oscillations of the perpendicular electric and magnetic fields per second—expressed in hertz. Radio frequency radiation is usually measured in kilohertz (kHz), megahertz (MHz), or gigahertz (GHz), with the latter known as microwaves. Light is electromagnetic radiation that is even higher in frequency, and has frequencies in the range of tens of terahertz (THz, infrared) to a few petahertz (PHz, ultraviolet), with the visible spectrum being 400–790 THz. Electromagnetic radiation with frequencies in the low terahertz range (intermediate between those of the highest normally usable radio frequencies and long-wave infrared light) is often called terahertz radiation. Even higher frequencies exist, such as that of X-rays and gamma rays, which can be measured in exahertz (EHz). For historical reasons, the frequencies of light and higher frequency electromagnetic radiation are more commonly specified in terms of their wavelengths or photon energies: for a more detailed treatment of this and the above frequency ranges, see Electromagnetic spectrum. Current[when?] observations of gravitational waves are conducted in the 30–7000 Hz range by laser interferometers like LIGO, and the nanohertz (1–1000 nHz) range by pulsar timing arrays. Future space-based detectors are planned to fill in the gap, with LISA operating from 0.1–10 mHz (with some sensitivity from 10 μHz to 100 mHz), and DECIGO in the 0.1–10 Hz range.
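Because light is usually described by wavelength rather than frequency, the visible band quoted above (400–790 THz) can be converted with λ = c/ν. The following small Python example, using the exact SI value of the speed of light, is purely illustrative.

```python
c = 299_792_458.0                   # speed of light in m/s (exact by definition)

for nu_thz in (400, 790):           # the visible-light band quoted above, in THz
    nu = nu_thz * 1e12              # THz -> Hz
    wavelength_nm = (c / nu) * 1e9  # lambda = c / nu, converted from metres to nanometres
    print(f"{nu_thz} THz -> {wavelength_nm:.0f} nm")

# 400 THz -> roughly 750 nm (red end of the spectrum); 790 THz -> roughly 380 nm (violet end)
```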
In computers, most central processing units (CPUs) are labeled in terms of their clock rate expressed in megahertz (MHz) or gigahertz (GHz). This specification refers to the frequency of the CPU's master clock signal. This signal is nominally a square wave, which is an electrical voltage that switches between low and high logic levels at regular intervals. As the hertz has become the primary unit of measurement accepted by the general populace to determine the performance of a CPU, many experts have criticized this approach, which they claim is an easily manipulable benchmark. Some processors use multiple clock cycles to perform a single operation, while others can perform multiple operations in a single cycle. For personal computers, CPU clock speeds have ranged from approximately 1 MHz in the late 1970s (Atari, Commodore, Apple computers) to up to 6 GHz in IBM Power microprocessors. Various computer buses, such as the front-side bus connecting the CPU and northbridge, also operate at various frequencies in the megahertz range. SI multiples Frequencies higher than those for which the International System of Units provides prefixes are believed to occur naturally in the quantum-mechanical vibrations of massive particles, although these are not directly observable and must be inferred from other phenomena. By convention, these are typically not expressed in hertz, but in terms of the equivalent energy, which is proportional to the frequency by a factor of the Planck constant. Unicode The CJK Compatibility block in Unicode contains characters for common SI units for frequency. These are intended for compatibility with East Asian character encodings, and not for use in new documents (which would be expected to use Latin letters, e.g. "MHz").
========================================
[SOURCE: https://en.wikipedia.org/wiki/Bogle] | [TOKENS: 786]
Contents Bogle A bogle, boggle, or bogill is a Northumbrian, Cumbrian and Scots term for a ghost or folkloric being, used for a variety of related folkloric creatures including Shellycoats, Barghests, Brags, the Hedley Kow and even giants such as those associated with Cobb's Causeway (also known as "ettins", "yetuns" or "yotuns" in Northumberland and "Etenes", "Yttins" or "Ytenes" in the South and South West). They are reputed to live for the simple purpose of perplexing mankind, rather than seriously harming or serving them. Etymology The name is derived from the Middle English bugge (from which the term bogey is also derived), which is in turn a cognate of the German word bögge (from which böggel-mann, "goblin", is derived) and possibly the Norwegian dialect word bugge meaning "important man". The Welsh Bwg could also be connected, and was thought in the past to be the origin of the English term; however, it has been suggested that it is itself a borrowing from Middle English. The Irish Gaelic word "bagairt", meaning "threat", could also be related.[citation needed] Terms such as ettin and yotun are derived from Middle English eten, etend, from Old English eoten (“giant, monster, enemy”), from Proto-Germanic *etunaz (“giant, glutton”), from Proto-Indo-European *h₁ed- (“to eat”), and are cognate with Old Norse jötunn. Usage One of the most famous usages of the term was by Gavin Douglas, who was in turn quoted by Robert Burns at the beginning of Tam O' Shanter: Of Brownyis and of Bogillis full is this Buke. There is a popular story of a bogle known as Tatty Bogle, who would hide himself in potato fields (hence his name) and either attack unwary humans or cause blight within the patch. This bogle was depicted as a scarecrow, "bogle" being an old name for "scarecrow" in various parts of England and Scotland. Another popular Scottish reference to bogles comes in The Bogle by the Boor Tree, a Scots poem written by W. D. Cocker. In this ghostly ode, the Bogle is heard in the wind and in the trees to "fricht wee weans" (frighten small children). In the Scottish Lowlands circa 1950, a bogle was a ghost, as was a bogeyman, and a Tattie-Bogle was a scarecrow, used to keep creatures out of the potato fields. All three words were in common use among the children. It is unclear what the connection is between "Bogle" and various other similarly named creatures in various folklores. The "Bocan" of the Highlands may, however, be a cognate of the Norse Puki, and thus also of the English "Puck". The Larne Weekly Reporter of 31 March 1866, in County Antrim, Northern Ireland, carried a front-page article entitled Bogles in Ballygowan, detailing strange goings-on in a rural area where a particular house became the target for missiles thrown through windows and, on one occasion, through the roof. Local people were terrified. The occurrences appeared to have ceased after several months and were blamed on the fact that the house in question had been refurbished using materials from an older house that was apparently the preserve of the "little people". This is one of the few references in Northern Ireland to "bogles", although the phrase "bogey man" is widely used.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Brag_(folklore)] | [TOKENS: 197]
Contents Brag (folklore) A brag is a mischievous shapeshifting goblin in the folklore of Northumbria (Northumberland and Durham) that often takes the form of a horse or donkey. It is fond of letting unsuspecting humans ride on its back before bucking them off into a pond or bush and running away laughing. One notable example is the Picktree Brag, which was said to take other unusual forms, such as a calf with a white handkerchief around its neck, a naked headless man, and even four men holding a white sheet. A brag at Humbleknowe was never seen but made hideous noises in the night. Popular culture Gary the Horse, from the webcomic Bad Machinery, identifies himself as a brag after convincing Shauna Wickle to take a ride on his back and then bucking her into the water.
========================================
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-183] | [TOKENS: 8773]
Contents OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees/other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity’s strategic direction with the Foundation’s charter. Microsoft previously invested over $13 billion into OpenAI, and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits for alleged copyright infringement against authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board. Throughout 2024, roughly half of then-employed AI safety researchers left OpenAI, citing the company's prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not for profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the actual capital collected significantly lagged pledges. According to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence. 
OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected the project to take decades and to eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers. Brockman was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google. It also did not offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, a machine capable of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with the profit being capped at 100 times any investment. According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing a $1 billion investment in the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC.
In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August. On April 9, 2024, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on 14 February 2025, with OpenAI stating that it was not for sale, but the offer complicated Altman's restructuring plan by suggesting a lower bar for how much the nonprofit should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit into a Delaware-based public benefit corporation (PBC), and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, getting equity in return, and would use it to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan has been criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, stating that the restructuring is illegal and would remove governance safeguards from the nonprofit and the attorneys general. The letter argues that OpenAI's complex structure was deliberately designed to remain accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever the amount of equity that it could get in exchange. PBCs can choose how they balance their mission with profit-making. Controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed to the OpenAI Foundation. 
The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman claimed was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, partially needed to use Microsoft's cloud-computing service Azure. From September to December, 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, and they added MS-Copilot to many installations of Windows and released Microsoft Copilot mobile apps. Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding their right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, which must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise with a $157 billion valuation including investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion. The partners planned to fund the project over the next four years. In July, the United States Department of Defense announced that OpenAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations. In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, which was the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion. 
This was an increase from $3.7 billion in 2024, driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025 (up from 15.5 million at the end of 2024), alongside a rapidly expanding enterprise customer base that grew to five million business users. The company’s cash burn remains high because of the intensive computational costs required to train and operate large language models. It projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This aggressive spending trajectory underscores both the enormous capital requirements of scaling cutting-edge AI technology and OpenAI's commitment to maintaining its position as a leader in the artificial intelligence industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, which valued the company at $500 billion. The deal made OpenAI the world's most valuable privately owned company, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO when OpenAI's board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman’s firing, some employees raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting member to observe the company's operations; Microsoft resigned from the board in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communication to determine if Altman's alleged lack of candor misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024. Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired personal finance app Roi in October 2025. In October 2025, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky’s capabilities into ChatGPT. In December 2025, it was announced OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced OpenAI had acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI’s ChatGPT Health product and was intended to strengthen the company’s medical data and healthcare artificial intelligence capabilities. 
OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, these pieces of text usually contained detailed descriptions of various types of violence, including sexual violence. The investigation uncovered that OpenAI began sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, and Sama was redistributing the equivalent of between $1.32 and $2.00 per hour post-tax to its annotators. Sama's spokesperson said that the $12.50 was also covering other implicit costs, among which were infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non‑Nvidia AI chips. In September 2025, it was revealed that OpenAI signed a contract with Oracle to purchase $300 billion in computing power over the next five years. In September 2025, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks. As of January 2026, this has not been realized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD. OpenAI committed to purchasing six gigawatts worth of AMD chips, starting with the MI450. OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI, and signed a three-year licensing deal that will let users generate videos using Sora—OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI’s models could be integrated into Amazon’s digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. 
In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned as a lieutenant colonel in the U.S. Army to join Detachment 201 as senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, OpenAI announced GPT-2, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. GPT-3 is aimed at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. It also announced that an associated API, named simply "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, in order to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google’s position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced that it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand. Access for newer subscribers re-opened a month later on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model that was internally codenamed strawberry. Additionally, ChatGPT Pro—a $200/month subscription service offering unlimited o1 access and enhanced voice features—was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users. The feature was only available to Pro users in the United States. OpenAI released its deep research agent nine days later; it scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated GPT-4.5 would be the last model without full chain-of-thought reasoning.
In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model was able to achieve gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, saying the model would be better at creating spreadsheets, building presentations, perceiving images, writing code and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, with features for managing citations, formatting complex equations, and real-time collaborative editing. In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this reversal. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming within four years to determine how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although team members later said they never received anything close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google due to an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in some discussions, including ones containing personal details such as names, locations, and intimate topics, appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks. CEO Sam Altman acknowledged the issue in a podcast, noting users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.
Management In 2018, Musk resigned from his Board of Directors seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners, and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI then signed deals with Reddit, News Corp, Axios, and Vox Media. Paul Nakasone then joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors. Governance and legal issues In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could happen within the next 10 years, allowing a "dramatically more prosperous future" and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to IAEA to oversee AI systems above a certain capability threshold, suggesting that relatively weak AI systems on the other side should not be overly regulated. They also called for more technical safety research for superintelligences, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices to develop ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. These are typically preliminary investigative matters and are nonpublic, but the FTC's document was leaked. In July 2023, the FTC launched an investigation into OpenAI over allegations that the company scraped public data and published false and defamatory information. They asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about ‘circular’ spending arrangements—for example, Microsoft extending Azure credits to OpenAI while both companies shared engineering talent—and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company is interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift comes in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. 
Following DeepSeek's market emergence, OpenAI enhanced security protocols to protect proprietary development techniques from industrial espionage. Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal laws. According to Scott Kohler, OpenAI has opposed California's AI legislation and suggested that the state bill encroaches on matters within federal competence. Public Citizen opposed a federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or even acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity cancellation provision, and that OpenAI never enforced it to cancel any employee's vested equity. However, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024 it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3, and which the Authors Guild believed to have contained over 100,000 copyrighted books. In 2021, OpenAI developed a speech recognition tool called Whisper. OpenAI used it to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman. The resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, Raw Story, and Alternate Media Inc. filed a lawsuit against OpenAI on copyright grounds. The lawsuit is said to have charted a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications included The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI scraped 300 billion words online without consent and without registering as a data broker.
It was filed in San Francisco, California, by sixteen anonymous plaintiffs. They also claimed that OpenAI and Microsoft, its partner and customer, continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform. Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, in an interview with The New York Times, Suchir Balaji accused OpenAI of violating copyright law in developing its commercial LLMs, which he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced. California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation. A text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process. A request to correct the mistake was denied. Additionally, OpenAI claimed that it could make available neither the recipients of ChatGPT's output nor the sources of the data used. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Up until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections (including updated crisis response behavior and parental controls). Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco.
In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT usage. In December 2025, OpenAI was also sued by the estate of Suzanne Adams, whose son, Stein-Erik Soelberg, then 56, had allegedly murdered her earlier that year after months of discussing his paranoid delusions with ChatGPT. The estate claimed that the company shared responsibility because of the risk of so-called chatbot psychosis, which is not a recognized medical diagnosis. OpenAI responded that it would work to make ChatGPT safer for users who appear disconnected from reality.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Timeline_of_WhatsApp] | [TOKENS: 11342]
Contents WhatsApp WhatsApp Messenger, commonly known simply as WhatsApp, is an American social media, instant messaging (IM), and Voice over IP (VoIP) service accessible via desktop and mobile app. Owned by Meta Platforms, the service allows users to send text messages, voice messages, and video messages, make voice and video calls, and share images, documents, user locations, and other content. The service requires a cellular mobile telephone number to register. WhatsApp was launched in May 2009. In January 2018, WhatsApp released a standalone business app called WhatsApp Business which can communicate with the standard WhatsApp client. As of May 2025, the service had 3 billion monthly active users, making it the most used messenger app. The name of the app is meant to sound like "what's up". The service was created by WhatsApp Inc. of Mountain View, California, which was acquired by Facebook in February 2014 for approximately US$19.3 billion. It became the world's most popular messaging application in 2015, with 900 million users, and had more than 2 billion active users worldwide in February 2020. WhatsApp Business had approximately 200 million monthly users in 2023. By 2016, it had become the primary means of Internet communication in regions including the Americas, the Indian subcontinent, and large parts of Europe and Africa. History WhatsApp was founded by Brian Acton and Jan Koum, former employees of Yahoo. Koum incorporated WhatsApp Inc. in California on February 24, 2009. A month earlier, Koum had purchased an iPhone, and he and Acton decided to create an app for the App Store. The idea started off as an app that would display statuses in a phone's Contacts menu, showing if a person was at work or on a call. Their discussions often took place at the home of Koum's Russian friend Alex Fishman in West San Jose. They realized that to take the idea further, they would need an iPhone developer. Fishman visited RentACoder.com, found Russian developer Igor Solomennikov, and introduced him to Koum. Koum named the app WhatsApp to sound like "what's up" and it was published on the Apple App Store and BlackBerry App World in May and June 2009 respectively. However, when early versions of WhatsApp kept crashing, Koum considered giving up and looking for a new job. Acton encouraged him to wait for a "few more months". In June 2009, when the app had been downloaded by only a handful of Fishman's Russian-speaking friends, Apple launched push technology, allowing users to be pinged even when not using the app. Koum updated WhatsApp so that everyone in the user's network would be notified when a user's status changed. This new facility, to Koum's surprise, was used by users to ping "each other with jokey custom statuses like, 'I woke up late' or 'I'm on my way.'" Fishman said, "At some point it sort of became instant messaging". WhatsApp 2.0, released for iPhone in August 2009, featured a purpose-designed messaging component; the number of active users suddenly increased to 250,000.[citation needed] Although Acton was working on another startup idea, he decided to join the company. In October 2009, Acton persuaded five former friends at Yahoo! to invest $250,000 in seed funding, and Acton became a co-founder and was given a stake. He officially joined WhatsApp on November 1. Koum then hired a friend in Los Angeles, Chris Peiffer, to develop a BlackBerry version, which arrived two months later. Subsequently, WhatsApp for Symbian OS was added in May 2010, and for Android OS in August 2010. 
In 2010 Google made multiple acquisition offers for WhatsApp, which were all declined. To cover the cost of sending verification texts to users, WhatsApp was changed from a free service to a paid one. In December 2009, the ability to send photos was added to the iOS version. By early 2011, WhatsApp was one of the top 20 apps in the U.S. Apple App Store. In April 2011, Sequoia Capital invested about $8 million for more than 15% of the company, after months of negotiation by Sequoia partner Jim Goetz. By February 2013, WhatsApp had about 200 million active users and 50 staff members. Sequoia invested another $50 million at a $1.5 billion valuation. Some time in 2013 WhatsApp acquired Santa Clara–based startup SkyMobius, the developers of Vtok, a video and voice calling app. As of December 2013, the service had 400 million monthly active users. That year, the company had $148 million in expenses and a net loss of $138 million. On February 19, 2014, one year after the venture capital financing round at a $1.5 billion valuation, Facebook, Inc. (now Meta Platforms) agreed to acquire the company for US$19 billion, its largest acquisition to date. At the time, it was the largest acquisition of a venture-capital-backed company in history. Sequoia Capital received an approximate 5,000% return on its initial investment. Facebook paid $4 billion in cash, $12 billion in Facebook shares, and an additional $3 billion in restricted stock units granted to WhatsApp's founders Koum and Acton. Employee stock was scheduled to vest over four years subsequent to closing. Days after the announcement, WhatsApp users experienced a loss of service, leading to anger across social media. The acquisition was influenced by the data provided by Onavo, Facebook's research app for monitoring competitors and trending usage of social activities on mobile phones, as well as startups that were performing "unusually well". The acquisition caused many users to try, or move to, other message services. Telegram claimed that it acquired 8 million new users, and Line, 2 million. At a keynote presentation at the Mobile World Congress in Barcelona in February 2014, Facebook CEO Mark Zuckerberg said that Facebook's acquisition of WhatsApp was closely related to the Internet.org vision. A TechCrunch article said about Zuckerberg's vision: The idea, he said, is to develop a group of basic internet services that would be free of charge to use – "a 911 for the internet". These could be a social networking service like Facebook, a messaging service, maybe search and other things like weather. Providing a bundle of these free of charge to users will work like a gateway drug of sorts – users who may be able to afford data services and phones these days just don't see the point of why they would pay for those data services. This would give them some context for why they are important, and that will lead them to pay for more services like this – or so the hope goes. Three days after announcing the Facebook purchase, Koum said they were working to introduce voice calls. He also said that new mobile phones would be sold in Germany with the WhatsApp brand, and that their ultimate goal was to be on all smartphones. In August 2014, WhatsApp was the most popular messaging app in the world, with more than 600 million users. By early January 2015, WhatsApp had 700 million monthly users and over 30 billion messages every day. 
In April 2015, Forbes predicted that between 2012 and 2018, the telecommunications industry would lose $386 billion because of "over-the-top" services like WhatsApp and Skype. That month, WhatsApp had over 800 million users. By September 2015, it had grown to 900 million; and by February 2016, one billion. On November 30, 2015, the Android WhatsApp client made links to Telegram unclickable and not copyable. Multiple sources confirmed that it was intentional, not a bug, and that it had been implemented when the Android source code that recognized Telegram URLs had been identified. (The word "telegram" appeared in WhatsApp's code.) Some considered it an anti-competitive measure; WhatsApp offered no explanation. On January 18, 2016, WhatsApp's co-founder Jan Koum announced that it would no longer charge users a $1 annual subscription fee, in an effort to remove a barrier faced by users without payment cards. He also said that the app would not display any third-party ads, and that it would have new features such as the ability to communicate with businesses. On May 18, 2017, the European Commission announced that it was fining Facebook €110 million for "providing misleading information about WhatsApp takeover" in 2014. The Commission said that in 2014 when Facebook acquired the messaging app, it "falsely claimed it was technically impossible to automatically combine user information from Facebook and WhatsApp." However, in the summer of 2016, WhatsApp had begun sharing user information with its parent company, allowing information such as phone numbers to be used for targeted Facebook advertisements. Facebook acknowledged the breach, but said the errors in their 2014 filings were "not intentional". In September 2017, WhatsApp's co-founder Brian Acton left the company to start a nonprofit group, later revealed as the Signal Foundation, which developed the WhatsApp competitor Signal. He explained his reasons for leaving in an interview with Forbes a year later. WhatsApp also announced a forthcoming business platform to enable companies to provide customer service at scale, and airlines KLM and Aeroméxico announced their participation in the testing. Both airlines had previously launched customer services on the Facebook Messenger platform. In January 2018, WhatsApp launched WhatsApp Business for small business use. In April 2018, WhatsApp co-founder and CEO Jan Koum announced he would be leaving the company. By leaving before November 2018, due to concerns about privacy, advertising, and monetization by Facebook, Acton and Koum were initially believed to have given up $1.3 billion in unvested stock options, however, it was later reported that Koum retained $450M worth of options via a "rest and vest" program. Facebook later announced that Koum's replacement would be Chris Daniels. On November 25, 2019, WhatsApp announced an investment of $250,000 through a partnership with Startup India to provide 500 startups with Facebook ad credits of $500 each. In December 2019, WhatsApp announced that a new update would lock out any Apple users who had not updated to iOS 9 or higher and Samsung, Huawei, Sony and Google users who had not updated to version 4.0 by February 1, 2020. The company also reported that Windows Phone operating systems would no longer be supported after December 31, 2019. WhatsApp was announced to be the 3rd most downloaded mobile phone app of the decade 2010–2019. 
In March 2020, WhatsApp partnered with the World Health Organization and UNICEF to provide messaging hotlines for people to get information on the COVID-19 pandemic. In the same month, WhatsApp began testing a feature to help users find out more information and context about information they receive to help combat misinformation. In January 2021, WhatsApp announced a controversial new privacy policy allowing WhatsApp to share data with its parent company, Facebook. This led many users to delete WhatsApp and instead use services such as Signal and Telegram. However, the WhatsApp privacy policy does not apply in the EU, since it violates the principles of GDPR. Facing criticism, WhatsApp postponed the update to May 15, 2021, and had no plans to limit functionality of users, nor nag users who did not approve the new terms. The 2021 Facebook outage affected other platforms owned by Facebook, such as Instagram and WhatsApp. In May 2022, WhatsApp launched its Cloud API services (now known as WhatsApp Business Platform) for larger businesses requiring features beyond the WhatsApp Business App. The Cloud API enables businesses to integrate WhatsApp with other software, have a central WhatsApp account for multiple users and implement advanced automation. In August 2022, WhatsApp launched an integration with JioMart, available only to users in India. Local users can text special numbers in the app to launch an in-app shopping process, where they can order groceries. In March 2024, Meta announced that WhatsApp would let third-party messaging services enable interoperability with WhatsApp, a requirement of the EU's Digital Markets Act (DMA). This allows users to send messages between other messaging apps and WhatsApp while maintaining end-to-end encryption. In January 2026, WhatsApp placed third in YouGov's Best Brands Rankings 2026 report. Features On February 24, 2017, WhatsApp launched a new Status feature similar to Snapchat and Facebook stories. WhatsApp has rolled out a feature called 'Voice Status Updates', which allows users to record voice notes and share them as their status on the app. WhatsApp has the facility to hide users' online status ("Last Seen"). In December 2021, WhatsApp changed the default setting from "everyone" to only people in the user's contacts or who have been conversed with ("nobody" is also an option). In 2022, WhatsApp added the ability for users to turn off their online status. In October 2018, the "Swipe to Reply" option was added to the Android beta version, 16 months after it was introduced for iOS. In early 2020, WhatsApp launched its "dark mode" for iPhone and Android devices – a new design consisting of a darker palette. In October 2020, WhatsApp rolled out a feature allowing users to mute both individuals and group chats forever. The mute options are "8 hours", "1 week", and "Always". The "Always" option replaced the "1 year" option that was originally part of the settings. In May 2023, WhatsApp allowed users to edit messages, aligning itself with competitors such as Telegram and Signal which already offered this feature. According to the company, messages could be edited within a 15-minute window after being sent. Edited messages were tagged as "edited" to inform recipients that the content had been modified. Text formatting options like code blocks, quote blocks, and bulleted lists also became available for the first time. 
In October 2024, WhatsApp expanded their chat filter feature, adding the ability for users to create custom lists that contain specific chats of their choice. In August 2013, WhatsApp added voice messages to their apps, giving users a way to send short audio recordings directly in their chats. Voice calls between two accounts were added to the app in March and April 2015. By June 2016, the company's blog reported more than 100 million voice calls per day were being placed on WhatsApp. In November 2016, video calls between two accounts were introduced. Later in September 2018, WhatsApp introduced group audio and video call features. In July 2023, video messages were added to WhatsApp. Similar to voice messages, this feature allows users to record and send short videos directly in a chat. This lets users share videos of themselves more quickly, and without adding anything to their device's gallery. Currently, video messages are limited to 60 seconds. In November 2023, WhatsApp added a "voice chat" feature for groups with more than 32 members. Unlike their 32-person group calls, starting a voice chat does not call all group members directly; they instead receive a notification to join the voice chat. In December 2023, WhatsApp's "View Once" feature expanded to include voice messages. Voice messages sent this way are deleted after the recipient listens to them the first time. In June 2024, improvements were made to voice and video calls, allowing up to 32 participants in video calls, adding audio to screen sharing, and introducing a new codec to increase call reliability. In November 2024, the ability to transcribe voice messages was added, allowing users to read out what was said in a voice message, rather than listening to the audio. In December 2024, WhatsApp introduced several new video calling features, including the ability to select specific participants from a group to make a call, rather than calling all group members. Visual effects also became available, adding visual filters to a user's video feed. In November 2010, a slate of improvements for the iOS version of WhatsApp were released, including the ability to search for messages in your chat history, trimming long videos to a sendable size, the ability to cancel media messages as they upload or download, and previewing photos before sending them. In March 2012, WhatsApp improved its location-sharing function, allowing users to share not only their location, but also the location of places, such as restaurants or hotels. In July 2017, WhatsApp added support for file uploads of all file types, with a limit of 100 MB. Previously between March 2016 and May 2017, only limited file types categorised as images (JPG, PNG, GIF), videos (MP4, AVI), and documents (CSV, DOC/DOCX, PDF, PPT/PPTX, RTF, TXT, XLS/XLSX), were allowed to be shared for file attachments. In July 2021, WhatsApp announced forthcoming support for sending uncompressed images and videos in 3 options: Auto, Best Quality and Data Saver. In May 2022, the file upload limit was raised from 100 MB to 2 GB, and the maximum group size increased to 512 members. On November 10, 2016, WhatsApp launched a beta version of two-factor authentication for Android users, which allowed them to use their email addresses for further protection. Also in November 2016, Facebook ceased collecting WhatsApp data for advertising in Europe. In October 2019, WhatsApp officially launched a new fingerprint app-locking feature for Android users. 
In July 2021, WhatsApp announced forthcoming support for end-to-end encryption for backups stored in Facebook's cloud. In August 2021, WhatsApp launched a feature that allows chat history to be transferred between mobile operating systems. This was implemented only on Samsung phones, with plans to expand to Android and iOS "soon". In October 2023 they also introduced passkey support, where a user can verify their login with on-device biometrics, rather than SMS. In November 2023, WhatsApp also began rolling out support for sending login codes to a linked email address, rather than via SMS. In a later update on November 30, WhatsApp added a Secret Code feature, which allows those who use locked chats to enter a unique password that hides those chats from view when unlocking the app. In January 2015, WhatsApp launched a web client that allowed users to scan a QR code with their mobile app, mirroring their chats to their browser. The web client was not standalone, and required the user's phone to stay on and connected to the internet. It was also not available for iOS users on launch, due to limitations from Apple. Since then, linked devices support has expanded and more information is written in the Platform Support part of this article. In July 2021 the company was also testing multi-device support, allowing computer users to run WhatsApp without an active phone session. In April 2023, the app rolled out a feature that would allow account access across multiple phones, in a shift that would make it more like competitors. Messages would still be end-to-end encrypted. WhatsApp officially rolled out the Companion mode for Android users, allowing linking up to five Android phones to a single account. Now, the feature is also made available to iOS users, allowing them to link up to four iPhones. In October 2023, support for logging in to multiple (meaning two) accounts was added, allowing users to switch between different WhatsApp accounts in the same app. On October 25, 2018, WhatsApp announced support for Stickers. Unlike other platforms, WhatsApp requires third-party apps to add Stickers to WhatsApp. In March 2021, WhatsApp started rolling out support for third-party animated stickers, initially in Iran, Brazil and Indonesia, then worldwide. In December 2022, WhatsApp launched 3D digital avatars. Users are able to use an avatar as their profile picture or use it for stickers during instant messaging, similar to those offered by Bitmoji or Memoji. In April 2022, WhatsApp announced undated plans to roll out a Communities feature allowing several group chats to exist in a shared space, getting unified notifications and opening up smaller discussion groups. The company also announced plans to implement reactions, the ability for administrators to delete messages in groups and voice calls up to 32 participants. In June 2023, a feature called WhatsApp Channels was launched which allows content creators, public figures and organizations to send newsletter-like broadcasts to large numbers of users. Unlike messages in groups or private chats, channels are not end-to-end encrypted. Channels were initially only available to users in Colombia and Singapore, then later Egypt, Chile, Malaysia, Morocco, Ukraine, Kenya and Peru before becoming widely available in September 2023. In April 2024, an AI-powered "Smart Assistant" became widely available in WhatsApp, allowing users to ask it questions or have it complete tasks such as generating images. 
The assistant is based on the LLaMa 3 model, and is also available on other Meta platforms like Facebook and Instagram. WhatsApp also introduced chat filters, allowing users to sort their chats by All, Unread or Groups. In September 2024, WhatsApp expanded support for Meta AI, allowing users to send text and photos to Meta AI to ask questions, identify objects, translate text or edit pictures. In December 2024, WhatsApp introduced a reverse image search feature, allowing users to verify image authenticity directly within the app using Google Search. In November 2025, WhatsApp announced that they would update their About feature, which allows users to add a short message to explain what they are doing. By default, it is set to disappear in 24 hours, but it can be set for a longer amount of time and can be restricted for viewing by other contacts in the Settings menu. According to WhatsApp, this update was made for the incoming Christmas period for users to inform their contacts on their activities during the holidays. Prior to the update, the feature was "largely hidden within the apps menus" and difficult to find, with the reason for updating being that WhatsApp wanted the feature to be used more often. Engadget called the revamped feature "WhatsApp’s version of an AIM away message" and likened it to Instagram's and Facebook's Notes. Platform support Currently, WhatsApp's principal platforms, which are fully supported, are devices supporting mobile telephony running Android, and iPhones. As of 2025, the software requires at least Android version 5.0 or iOS version 15.1 respectively. This table details platform support history. Linked devices are secondary devices running the WhatsApp messenger software. They link to and sync with WhatsApp actively running on a supported primary phone. Up to four linked devices can be added per user account. Linked devices automatically log out after 14 days of inactivity on the primary phone. Linked devices allow the service to be used on multiple other platforms like desktop computers and smartwatches (e.g. WhatsApp Web, Facebook Portal), but also on other smartphones (called companions). Originally it was required for the primary phone to keep an online connection to WhatsApp for linked devices to work, but now WhatsApp can run on linked devices without such requirement. This ability (named multi-device support) began testing in July 2021 and rolled out to all users in April 2023. WhatsApp was officially made available for PCs through a web client, under the name WhatsApp Web, released on January 21, 2015. WhatsApp Web is accessed through web.whatsapp.com and access is granted after the user scans their personal QR code through their mobile WhatsApp client. The desktop version was first only available to Android, BlackBerry, and Windows Phone users. Later on, it also added support for iOS, Nokia Series 40, and Nokia S60 (Symbian). Previously the WhatsApp user's handset had to be connected to the Internet for the browser application to function but as of an update in October 2021 (and integrated by default in WhatsApp as of April 2022) that is no longer the case. When this multi-device feature was first introduced to Android and iOS users, it could only show messages for the last three months on the Web version, because the Web version was syncing with the phone. Since the complete roll out of this feature, users cannot check old messages before this period on the Web version anymore. 
There are similar unofficial WhatsApp solutions for macOS, such as the open-source ChitChat, previously known as WhatsMac. On May 10, 2016, the messenger was introduced for both Microsoft Windows and macOS operating systems. Support for video and voice calls from desktop clients was later added. Similar to the WhatsApp Web format, the app, which synchronises with a user's mobile device, is available for download on the website. It supported operating systems Windows 8 and OS X 10.10 and higher. In 2023, WhatsApp replaced the Electron-based apps with native versions for their respective platforms. The Windows version is based on UWP while the Mac version is a port of the iOS version using Catalyst technology. In July 2025, WhatsApp stopped developing the Windows UWP-based app due to poor support and deprecation of the UWP framework by Microsoft. WhatsApp for Windows transitioned over to the Microsoft Edge WebView2 framework, marking a return to utilising a web-based framework (just like Electron previously) instead of a native framework. The WebView2-based app has been criticised for its sluggish performance, high RAM usage, and requirement to keep the app running in the background to receive push notifications, compared to the previous native version. WhatsApp has been officially supported for iPads and its iPadOS since May 27, 2025. Similarly to WhatsApp for web, Windows, Mac, and smartwatches, the iPad is a type of linked device that connects and syncs to WhatsApp running on a smartphone. WhatsApp added support for Android Wear (now called Wear OS) in 2014 and for the Apple Watch in 2025. Technical WhatsApp uses a customized version of the open standard Extensible messaging and presence protocol (XMPP). A 2019 document released by the DOJ confirms this by naming "FunXMPP" as the protocol used by WhatsApp. The document was part of a lawsuit by WhatsApp and Meta against the NSO Group for their Pegasus malware. Upon installation, it creates a user account using the user's phone number as the username (Jabber ID: [phone number]@s.whatsapp.net). WhatsApp automatically compares all the phone numbers from the device's address book with its central database of WhatsApp users to automatically add contacts to the user's WhatsApp contact list. Previously the Android and Nokia Series 40 versions used an MD5-hashed, reversed-version of the phone's IMEI as a password, while the iOS version used the phone's Wi-Fi MAC address instead of the IMEI. A 2012 update implemented generation of a random password on the server side. Alternatively a user can also contact any other WhatsApp user through the URL https://api.whatsapp.com/send/?phone=[phone number] where [phone number] is the number of the contact including the country code. Some devices using dual SIMs may not be compatible with WhatsApp, though there are unofficial workarounds to install the app. In February 2015, WhatsApp implemented voice calling, which helped WhatsApp to attract a different segment of the user population. WhatsApp's voice codec is Opus, which uses the modified discrete cosine transform (MDCT) and linear predictive coding (LPC) audio compression algorithms. WhatsApp uses Opus at 8–16 kHz sampling rates. On November 14, 2016, WhatsApp added video calling for users using Android, iPhone, and Windows Phone devices. In November 2017, WhatsApp implemented a feature giving users seven minutes to delete messages sent by mistake. 
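The account and contact conventions described above lend themselves to a short illustration. The following Python sketch builds the click-to-chat URL and the Jabber ID form from a raw phone number; only the two templates are taken from the text, while the helper names and the digit-only normalization rule are assumptions made for illustration.

```python
# Illustrative sketch only: the templates come from the description above,
# while the normalization rule and function names are assumptions.
API_SEND_TEMPLATE = "https://api.whatsapp.com/send/?phone={number}"
JABBER_ID_TEMPLATE = "{number}@s.whatsapp.net"

def normalize_number(raw: str) -> str:
    """Keep digits only; the number is assumed to already include the country code."""
    return "".join(ch for ch in raw if ch.isdigit())

def click_to_chat_url(raw_number: str) -> str:
    """Build the click-to-chat URL format quoted above."""
    return API_SEND_TEMPLATE.format(number=normalize_number(raw_number))

def jabber_id(raw_number: str) -> str:
    """Build the Jabber ID form ([phone number]@s.whatsapp.net) described above."""
    return JABBER_ID_TEMPLATE.format(number=normalize_number(raw_number))

print(click_to_chat_url("+1 650-555-0100"))  # https://api.whatsapp.com/send/?phone=16505550100
print(jabber_id("+1 650-555-0100"))          # 16505550100@s.whatsapp.net
```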
Multimedia messages are sent by uploading the image, audio or video to be sent to an HTTP server and then sending a link to the content along with its Base64 encoded thumbnail, if applicable. WhatsApp uses a "store and forward" mechanism for exchanging messages between two users. When a user sends a message, it is stored on a WhatsApp server, which tries to forward it to the addressee, and repeatedly requests acknowledgement of receipt. When the message is acknowledged, the server deletes it; if undelivered after 30 days, it is also deleted.[self-published source?] On November 18, 2014, Open Whisper Systems announced a partnership with WhatsApp to provide end-to-end encryption by incorporating the encryption protocol used in Signal into each WhatsApp client platform. Open Whisper Systems said that they had already incorporated the protocol into the latest WhatsApp client for Android, and that support for other clients, group/media messages, and key verification would be coming soon after. WhatsApp confirmed the partnership to reporters, but there was no announcement or documentation about the encryption feature on the official website, and further requests for comment were declined. In April 2015, German magazine Heise security used ARP spoofing to confirm that the protocol had been implemented for Android-to-Android messages, and that WhatsApp messages from or to iPhones running iOS were still not end-to-end encrypted. They expressed the concern that regular WhatsApp users still could not tell the difference between end-to-end encrypted messages and regular messages. On April 5, 2016, WhatsApp and Open Whisper Systems announced that they had finished adding end-to-end encryption to "every form of communication" on WhatsApp, and that users could now verify each other's keys. Users were also given the option to enable a trust on first use mechanism to be notified if a correspondent's key changes. According to a white paper that was released along with the announcement, WhatsApp messages are encrypted with the Signal Protocol. WhatsApp calls are encrypted with SRTP, and all client-server communications are "layered within a separate encrypted channel". On October 14, 2021, WhatsApp rolled out end-to-end encryption for backups on Android and iOS. The feature has to be turned on by the user and provides the option to encrypt the backup either with a password or a 64-digit encryption key. The application can store encrypted copies of the chat messages onto the SD card, but chat messages are also stored unencrypted in the SQLite database file "msgstore.db". WhatsApp uses the Sender Keys protocol. WhatsApp Payments (marketed as WhatsApp Pay) is a peer-to-peer money transfer feature. The service became generally available in India and Brazil, and in Singapore for WhatsApp Business transactions only. In July 2017, WhatsApp received permission from the National Payments Corporation of India (NPCI) to enter into partnership with multiple Indian banks, for transactions over Unified Payments Interface (UPI), which relies on mobile phone numbers to make account-to-account transfers. In November 2020, UPI payments via WhatsApp were initially restricted to 20 million users, and to 100 million users in April 2022, and became generally available to everyone in August 2022. On February 28, 2019, The New York Times reported that Facebook was "hoping to succeed where Bitcoin failed" by developing an in-house cryptocurrency that would be incorporated into WhatsApp. 
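The store-and-forward behaviour outlined above (keep an undelivered message on the server, delete it on acknowledgement, and drop it after 30 days) can be modelled with a minimal sketch. This is a toy model under those assumptions, not WhatsApp's actual server code; all class and method names are invented.

```python
import time
from dataclasses import dataclass, field

THIRTY_DAYS = 30 * 24 * 60 * 60  # retention window described above, in seconds

@dataclass
class PendingMessage:
    recipient: str
    payload: bytes
    stored_at: float = field(default_factory=time.time)

class StoreAndForwardQueue:
    """Toy model of the store-and-forward behaviour: store until acknowledged."""

    def __init__(self) -> None:
        self._pending: list[PendingMessage] = []

    def submit(self, recipient: str, payload: bytes) -> None:
        """Store a message so the server can keep retrying delivery."""
        self._pending.append(PendingMessage(recipient, payload))

    def acknowledge(self, recipient: str, payload: bytes) -> None:
        """Delete a message once the recipient has acknowledged receipt."""
        self._pending = [m for m in self._pending
                         if not (m.recipient == recipient and m.payload == payload)]

    def purge_expired(self, now: float | None = None) -> int:
        """Drop messages that have been undelivered for more than 30 days."""
        now = time.time() if now is None else now
        kept = [m for m in self._pending if now - m.stored_at <= THIRTY_DAYS]
        dropped = len(self._pending) - len(kept)
        self._pending = kept
        return dropped
```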
The project reportedly involved more than 50 engineers under the direction of former PayPal president David A. Marcus. This "Facebook coin" would reportedly be a stablecoin pegged to the value of a basket of different foreign currencies. In June 2019, Facebook said that the project would be named Libra, and that a digital wallet named "Calibra" was to be integrated into Facebook and WhatsApp. After financial regulators in many regions raised concerns, Facebook stated that the currency, renamed Diem since December 2020, would require a government-issued ID for verification, and the wallet app would have fraud protection. Calibra was rebranded to Novi in May 2020. Meta (formerly Facebook) ended its Novi project on September 1, 2022. Controversies and criticism WhatsApp has repeatedly imposed limits on message forwarding in response to the spread of misinformation in countries including India and Australia. The measure, first introduced in 2018 to combat spam, was expanded and remained active in 2021. WhatsApp stated that the forwarding limits had helped to curb the spread of misinformation regarding COVID-19. In India, WhatsApp encouraged people to report messages that were fraudulent or incited violence after lynch mobs murdered innocent people because of malicious WhatsApp messages falsely accusing the victims of intending to abduct children. There was a series of incidents between 2017 and 2020, after which WhatsApp announced changes for Indian users of the platform that label forwarded messages as such. In an investigation into the use of social media in politics, it was found that WhatsApp was being abused to spread fake news in the 2018 presidential elections in Brazil. It was reported that US$3 million was spent in illegal concealed contributions related to this practice. Researchers and journalists called on WhatsApp's parent company, Facebook, to adopt measures similar to those adopted in India and restrict the spread of hoaxes and fake news. WhatsApp was initially criticized for its lack of encryption, sending information as plaintext. Encryption was first added in May 2012. End-to-end encryption was only fully implemented in April 2016 after a two-year process. As of September 2021, it is known that WhatsApp makes extensive use of outside contractors and artificial intelligence systems to examine certain user messages, images and videos (those that have been flagged by users as possibly abusive), and turns over to law enforcement metadata including critical account and location information. In 2016, WhatsApp was widely praised for the addition of end-to-end encryption and earned 6 out of 7 points on the Electronic Frontier Foundation's "Secure Messaging Scorecard". WhatsApp was criticized by security researchers and the Electronic Frontier Foundation for using backups that are not covered by end-to-end encryption and allow messages to be accessed by third parties. In 2019, Edward Snowden warned: "The problem with applications like WhatsApp is, it was actually designed to have very strong encryption, just the same as the gold standard today which would be the signal messenger or the wire messenger, but then it was bought by Facebook because it was so good, and now Facebook is quite aggressively reducing the security of WhatsApp about once a quarter, and they're trying to do it as quietly as possible, so a messenger that the people are comfortable using now is actually a danger to you." 
In May 2019, a security vulnerability in WhatsApp was found and fixed that allowed a remote person to install spyware by making a call which did not need to be answered. In September 2019, WhatsApp was criticized for its implementation of a 'delete for everyone' feature. iOS users can elect to save media to their camera roll automatically. When a user deletes media for everyone, WhatsApp does not delete images saved in the iOS camera roll and so those users are able to keep the images. WhatsApp released a statement saying that "the feature is working properly", and that images stored in the camera roll cannot be deleted due to Apple's security layers. In November 2019, WhatsApp released a new privacy feature that let users decide who can add them to groups. In December 2019, WhatsApp confirmed a security flaw that would allow hackers to use a malicious GIF image file to gain access to the recipient's data. When the recipient opened the gallery within WhatsApp, even if not sending the malicious image, the hack is triggered and the device and its contents become vulnerable. The flaw was patched and users were encouraged to update WhatsApp. On December 17, 2019, WhatsApp fixed a security flaw that allowed cyber attackers to repeatedly crash the messaging application for all members of group chat, which could only be fixed by forcing the complete uninstall and reinstall of the app. The bug was discovered by Check Point in August 2019 and reported to WhatsApp. It was fixed in version 2.19.246 onwards. For security purposes, since February 1, 2020, WhatsApp has been made unavailable on smartphones using legacy operating systems like Android 2.3.7 or older and iPhone iOS 8 or older that are no longer updated by their providers. In April 2020, the NSO Group held its governmental clients accountable for the allegation of human rights abuses by WhatsApp. In its revelation via documents received from court, the group claimed that the lawsuit brought against the company by WhatsApp threatened to infringe on its clients' "national security and foreign policy concerns". However, the company did not reveal names of the end users, which according to a research by Citizen Lab include, Saudi Arabia, Bahrain, Kazakhstan, Morocco, Mexico and the United Arab Emirates. On December 16, 2020, a claim that WhatsApp gave Google access to private messages was included in the anti-trust case against the latter. As the complaint was heavily redacted due to being an ongoing case, it did not disclose whether this was alleged tampering with the app's end-to-end encryption, or Google accessing user backups.[clarification needed] In January 2021, WhatsApp announced an updated privacy policy which stated that WhatsApp would share user data with Facebook and its "family of companies" beginning February 2021. Previously, users could opt-out of such data sharing, but the new policy removed this option. The new privacy policy would not apply within the EU, as it is illegal under the GDPR. Facebook and WhatsApp were widely criticized for this move. The enforcement of the privacy policy was postponed from February 8 to May 15, 2021, WhatsApp announced they had no plans to limit the functionality of the app for those who did not approve the new terms. On October 15, 2021, WhatsApp announced that it would begin offering an end-to-end encryption service for chat backups, meaning no third party (including both WhatsApp and the cloud storage vendor) would have access to a user's information. 
This new encryption feature added an additional layer of protection to chat backups stored either on Apple iCloud or Google Drive. On November 29, 2021, an FBI document was uncovered by Rolling Stone, revealing that WhatsApp responds to warrants and subpoenas from law enforcement within minutes, providing user metadata to the authorities. The metadata includes the user's contact information and address book. In January 2022, an unsealed surveillance application revealed that WhatsApp started tracking seven users from China and Macau in November 2021, based on a request from US DEA investigators. The app collected data on who the users contacted and how often, and when and how they were using the app. This is reportedly not an isolated occurrence, as federal agencies can use the Electronic Communications Privacy Act to covertly track users without submitting any probable cause or linking a user's number to their identity. At the beginning of 2022, it was revealed that the San Diego–based startup Boldend had, at some point since its founding in 2017, developed tools to hack WhatsApp's encryption and gain access to user data. The vulnerability was reportedly patched in January 2021. Boldend is financed, in part, by Peter Thiel, a notable investor in Facebook. In September 2022, a critical security issue in WhatsApp's Android video call feature was reported. An integer overflow bug allowed a malicious user to take full control of the victim's application once a video call between two WhatsApp users was established. The issue was patched on the day it was officially reported. In 2025, WhatsApp alerted 90 journalists and other members of civil society that they had been targeted by spyware used by the Israeli technology company Paragon Solutions. In April 2025, a group of Austrian researchers was able to extract 3.5 billion users' phone numbers by making a hundred million contact discovery requests an hour, a flaw showing that earlier warnings from researchers in 2017 had not been addressed. The researchers notified Meta, which fixed the enumeration problem in October, and deleted their copy of the phone numbers. As of 2023, WhatsApp is widely used by government institutions in the UK, although such use is viewed as problematic since it hinders the public, including journalists, from obtaining accurate government records when making freedom of information requests. The information commissioner has said that the use of WhatsApp posed risks to transparency since members of Parliament, government ministers, and officials who wished to avoid scrutiny might use WhatsApp despite there being official channels. Transparency campaigners have challenged the practice in court. Notably, during the COVID-19 pandemic, the UK government routinely used WhatsApp to make decisions on managing the crisis, including on personal rather than government-issued devices. When the official inquiry into the pandemic began seeking evidence in May 2023, this presented issues for its ability to gather the material it sought. A personal device of the former Prime Minister, Boris Johnson, had been compromised by a security breach, and it was claimed that it could not be switched on to recover messages. Further, the Cabinet Office had claimed that since many messages were not relevant to the inquiry, it only needed to hand over material it had selected as being relevant. 
The High Court, in a judicial review sought by the Cabinet Office, declared that all documents sought by the inquiry were to be handed over unredacted. In 2018, it was reported that around 500,000 National Health Service (NHS) staff used WhatsApp and other instant messaging systems at work and around 29,000 had faced disciplinary action for doing so. Higher usage was reported by frontline clinical staff to keep up with care needs, even though NHS trust policies do not permit their use. In March 2019, WhatsApp released a guide for users who had installed unofficial modified versions of WhatsApp and warned that it may ban those using unofficial clients. In May 2019, WhatsApp was attacked by hackers who installed spyware on a number of victims' smartphones. The hack, allegedly developed by Israeli surveillance technology firm NSO Group, injected malware onto WhatsApp users' phones via a remote-exploit bug in the app's Voice over IP calling functions. A Wired report noted the attack was able to inject malware via calls to the targeted phone, even if the user did not answer the call. In October 2019, WhatsApp filed a lawsuit against NSO Group in a San Francisco court, claiming that the alleged cyberattack violated US laws including the Computer Fraud and Abuse Act (CFAA). According to WhatsApp, the exploit "targeted at least 100 human-rights defenders, journalists and other members of civil society" among a total of 1,400 users in 20 countries. In April 2020, the NSO Group held its governmental clients accountable for the allegation of human rights abuses by WhatsApp. In its revelation via documents received via court, the group claimed that the lawsuit brought against the company by WhatsApp threatened to infringe on its clients' "national security and foreign policy concerns". However, the company did not reveal the names of the end users, which according to research by Citizen Lab include, Saudi Arabia, Bahrain, Kazakhstan, Morocco, Mexico and the United Arab Emirates. In July 2020, a US federal judge ruled that the lawsuit against NSO group could proceed. NSO Group filed a motion to have the lawsuit dismissed, but the judge denied all of its arguments. In January 2020, a digital forensic analysis revealed that the Amazon founder Jeff Bezos received an encrypted message on WhatsApp from the official account of Saudi Arabia's Crown Prince Mohammed bin Salman. The message reportedly contained a malicious file, the receipt of which resulted in Bezos' phone being hacked. The United Nations' special rapporteur David Kaye and Agnes Callamard later confirmed that Jeff Bezos' phone was hacked through WhatsApp, as he was one of the targets of Saudi's hit list of individuals close to The Washington Post journalist Jamal Khashoggi. In 2021, an FBI document obtained through a Freedom of Information request by Property of the People, Inc., a 501(c)(3) nonprofit organization, revealed that WhatsApp and iMessage are vulnerable to law-enforcement real-time searches. In January 2022, an investigation by The Wire claimed that BJP, an Indian political party, allegedly used an app called Tek Fog which was capable of hacking inactive WhatsApp accounts en masse to mass message their contacts with propaganda. According to the report, a whistleblower with app access was able to hack a test WhatsApp account controlled by reporters "within minutes." 
It was later determined that staff of their Meta investigative team had been duped by false information; The Wire fired the staff member involved and issued a formal apology to its readers. In December 2015, it was reported that terrorist organization ISIS had been using WhatsApp to plot the November 2015 Paris attacks. According to The Independent, ISIS also uses WhatsApp to traffic sex slaves. In March 2017, British Home Secretary Amber Rudd said encryption capabilities of messaging tools like WhatsApp are unacceptable, as news reported that Khalid Masood used the application several minutes before perpetrating the 2017 Westminster attack. Rudd publicly called for police and intelligence agencies to be given access to WhatsApp and other encrypted messaging services to prevent future terror attacks. In April 2017, the perpetrator of the Stockholm truck attack reportedly used WhatsApp to exchange messages with an ISIS supporter shortly before and after the incident. The messages involved discussing how to make an explosive device and a confession to the attack. In April 2017, nearly 300 WhatsApp groups with about 250 members each were reportedly being used to mobilize stone-pelters in Jammu and Kashmir to disrupt security forces' operations at encounter sites. According to police, 90% of these groups were closed down after police contacted their admins. Further, after a six-month probe which involved the infiltration of 79 WhatsApp groups, the National Investigation Agency reported that out of about 6386 members and admins of these groups, about 1000 were residents of Pakistan and gulf nations. Further, for their help in negating anti-terror operations, the Indian stone pelters were getting funded through barter trade from Pakistan and other indirect means. In May 2022, the FBI stated that an ISIS sympathizer, who was plotting to assassinate George W. Bush, was arrested based on his WhatsApp data. According to the arrest warrant for the suspect, his WhatsApp account was placed under surveillance. There are numerous ongoing scams on WhatsApp that let hackers spread viruses or malware. In May 2016, some WhatsApp users were reported to have been tricked into downloading a third-party application called WhatsApp Gold, which was part of a scam that infected the users' phones with malware. A message that promises to allow access to their WhatsApp friends' conversations, or their contact lists, has become the most popular hit against anyone who uses the application in Brazil. Clicking on the message actually sends paid text messages. Since December 2016, more than 1.5 million people have clicked and lost money. Another application called GB WhatsApp is considered malicious by cybersecurity firm Symantec because it usually performs some unauthorized operations on end-user devices. WhatsApp is owned by Meta, whose main social media service Facebook has been blocked in China since 2009. In September 2017, security researchers reported to The New York Times that the WhatsApp service had been completely blocked in China. On April 19, 2024, Apple removed WhatsApp from the App Store in China, citing government orders that stemmed from national security concerns. On May 9, 2014, the government of Iran announced that it had proposed to block the access to WhatsApp service to Iranian residents. "The reason for this is the assumption of WhatsApp by the Facebook founder Mark Zuckerberg, who is an American Zionist", said Abdolsamad Khorramabadi, head of the country's Committee on Internet Crimes. 
Subsequently, Iranian president Hassan Rouhani issued an order to the Ministry of ICT to stop filtering WhatsApp. It was once again blocked in September 2022 but unblocked in December 2024. Turkey temporarily banned WhatsApp in 2016, following the assassination of the Russian ambassador to Turkey. On March 1, 2016, Diego Dzodan, Facebook's vice-president for Latin America, was arrested in Brazil for not cooperating with an investigation in which WhatsApp conversations were requested. At dawn the next day, March 2, 2016, Dzodan was released because the Court of Appeal held that the arrest was disproportionate and unreasonable. On May 2, 2016, mobile providers in Brazil were ordered to block WhatsApp for 72 hours for the service's second failure to cooperate with criminal court orders. Once again, the block was lifted following an appeal, after less than 24 hours. Brazil's Central Bank issued an order to payment card companies Visa and Mastercard on June 23, 2020, to stop working with WhatsApp on its new electronic payment system. A statement from the Bank asserted the decision to block the Facebook-owned company's latest offering was taken to "preserve an adequate competitive environment" in the mobile payments space and to ensure "functioning of a payment system that's interchangeable, fast, secure, transparent, open and cheap." The government of Uganda banned WhatsApp and Facebook, along with other social media platforms, to enforce a tax on the use of social media. Users are to be charged USh.200/= per day to access these services according to the new law set by parliament. The United Arab Emirates banned WhatsApp video chat and VoIP call applications as early as 2013, due to what is often reported as an effort to protect the commercial interests of its home-grown, nationally owned telecom providers (du and Etisalat). The Emirati app ToTok has received press coverage suggesting it is able to spy on users. In July 2021, the Cuban government blocked access to several social media platforms, including WhatsApp, to curb the spread of information during the anti-government protests. In December 2021, the Swiss army banned the use of WhatsApp and several other non-Swiss encrypted messaging services by army personnel. The ban was prompted by concerns about US authorities potentially accessing user data for such apps because of the CLOUD Act. The army recommended that all army personnel use Threema instead, as the service is based in Switzerland. In August 2021, the digital rights organization Access Now reported that WhatsApp, along with several other social media apps, was being blocked in Zambia for the duration of the general election. The organization reported a massive drop-off in traffic for the blocked services, though the country's government made no official statements about the block. The Saudi Central Bank (SAMA) has prohibited local banks from using instant messaging applications like WhatsApp for customer communication. This decision aims to enhance data security and protect customer information. In Russia, authorities increased pressure on WhatsApp in late 2025. On 28 November 2025, officials warned of a potential full ban on the service. On 11 February 2026, the Russian government fully blocked WhatsApp, which had at least 100 million users in the country until recently, citing its alleged failure to comply with domestic regulations concerning extremist content and state oversight requirements. In mid-2013, WhatsApp Inc. 
filed for the DMCA takedown of the discussion thread on the XDA Developers forums about the then popular third-party client "WhatsApp Plus". In 2015, some third-party WhatsApp clients that were reverse-engineering the WhatsApp mobile app received cease-and-desist letters demanding that they stop activities violating WhatsApp's legal terms. As a result, users of third-party WhatsApp clients were also banned. WhatsApp Business WhatsApp launched two business-oriented apps in January 2018, separated by the intended userbase: the WhatsApp Business app, aimed at small businesses, and the WhatsApp Business API, aimed at larger companies. The API was originally available as on-premise only, but in 2022, WhatsApp Cloud API became available. The on-premise API has been deprecated and will be fully sunset on October 23, 2025. As the WhatsApp API does not have a frontend interface, businesses need to subscribe through one of Meta's approved Business Solution Providers. Examples of these include respond.io, Gupshup, Trengo, Wati and Manychat.[citation needed] In October 2020, Facebook announced the introduction of pricing tiers for services offered via the WhatsApp Business API, charged on a per-conversation basis. On July 1, 2025, a new pricing tier system came into effect which charges per-message rather than per-conversation. User statistics WhatsApp handled ten billion messages per day in August 2012, growing from two billion in April 2012, and one billion the previous October. On June 13, 2013, WhatsApp announced that it had set a new daily record by processing 27 billion messages. According to the Financial Times, WhatsApp "has done to SMS on mobile phones what Skype did to international calling on landlines". By April 22, 2014, WhatsApp had over 500 million monthly active users, 700 million photos and 100 million videos were being shared daily, and the messaging system was handling more than 10 billion messages each day. On August 24, 2014, Koum announced on his Twitter account that WhatsApp had over 600 million active users worldwide. At that point WhatsApp was adding about 25 million new users every month, or 833,000 active users per day. In May 2017, it was reported that WhatsApp users spend over 340 million minutes on video calls each day on the app, the equivalent of roughly 646 years of video calls per day. By February 2017, WhatsApp had over 1.2 billion users globally, reaching 1.5 billion monthly active users by the end of 2017. In January 2020, WhatsApp reached over 5 billion installs on the Google Play Store, making it only the second non-Google app to achieve this milestone. In February 2020, WhatsApp had over 2 billion users globally. In May 2025, Meta reported WhatsApp had over 3 billion monthly active users globally. India is by far WhatsApp's largest market in terms of total number of users. In May 2014, WhatsApp crossed 50 million monthly active users in India, its largest country by number of monthly active users, and reached 70 million there in October 2014, making users in India 10% of WhatsApp's total user base. In February 2017, WhatsApp reached 200 million monthly active users in India. Israel is one of WhatsApp's strongest markets in terms of ubiquitous usage. According to Globes, by 2013 the application was already installed on 92% of all smartphones in the country, with 86% of users reporting daily use. In July 2024, WhatsApp reached 100 million users in the United States. 
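The video-call figure above is easy to verify: 340 million call minutes per day converts to a little under 650 years of continuous calling per day, matching the "roughly 646 years" quoted.

```python
# Convert 340 million call minutes per day into years of continuous calling per day.
minutes_per_year = 60 * 24 * 365               # 525,600 minutes in a (non-leap) year
years_per_day = 340_000_000 / minutes_per_year
print(f"{years_per_day:.1f}")                   # ≈ 646.9, i.e. "roughly 646 years"
```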
WhatsApp competes with messaging services including iMessage (estimated 1.3 billion active users), WeChat (1.26 billion active users), Telegram (1 billion users), Viber (260 million active users), LINE (217 million active users), KakaoTalk (57 million active users), and Signal (70 million active users). Both Telegram and Signal in particular were reported to get registration spikes during WhatsApp outages and controversies. WhatsApp has increasingly drawn its innovation from competing services, such as a Telegram-inspired web version and features for groups. In 2016, WhatsApp was accused of copying features from a then-unreleased version of iMessage.
========================================
[SOURCE: https://en.wikipedia.org/wiki/PLEX_(programming_language)] | [TOKENS: 343]
Contents PLEX (programming language) PLEX (Programming Language for EXchanges) is a special-purpose, concurrent, real-time programming language. The proprietary PLEX language is closely tied to the architecture of Ericsson's AXE telephone exchanges, which it was designed to control. PLEX was developed by Göran Hemdahl at Ericsson in the 1970s, and it has been continuously evolving since then. PLEX was described in 2008 as "a cross between Fortran and a macro assembler." The language has two variants: Plex-C used for the AXE Central Processor (CP) and Plex-M used for Extension Module Regional Processors (EMRP). Ericsson started a project in the mid-1980s to create a successor language, which resulted in Erlang. According to co-creator Joe Armstrong, "Erlang was heavily influenced by PLEX and the AXE design." Erlang did not replace PLEX, but was used alongside it. Execution model A system is divided into separately compiled and loaded units of code called "blocks." A block waits for one or more signals sent from elsewhere in the system, which triggers code execution. Pre-compilers Several precompilers and code generators exist to produce Plex-C source code from higher-level languages or graphical models. Source code in Plex-C is compiled into the assembly language ASA210C. The binary form of ASA210C is either interpreted by a combination of hardware and microcode, or is compiled by a just-in-time compiler into native machine code for a high-capacity microprocessor.
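The block-and-signal execution model lends itself to a small illustration. The following Python sketch is a conceptual analogy only, not PLEX syntax; the class names, the block name "CALL_HANDLER", and the signal name "SEIZURE" are invented for the example.

```python
# Conceptual sketch (not PLEX syntax): blocks are independently loaded units
# that sit idle until a named signal arrives, then run the handler registered
# for that signal. All names below are invented for illustration.

class Block:
    """A unit of code that reacts to signals, loosely mirroring a PLEX block."""
    def __init__(self, name):
        self.name = name
        self.handlers = {}              # signal name -> handler function

    def on(self, signal_name, handler):
        self.handlers[signal_name] = handler

    def receive(self, signal_name, *args):
        handler = self.handlers.get(signal_name)
        if handler:
            handler(*args)              # receiving a signal triggers execution


class System:
    """Routes signals between separately loaded blocks."""
    def __init__(self):
        self.blocks = {}

    def load(self, block):
        self.blocks[block.name] = block

    def send(self, block_name, signal_name, *args):
        self.blocks[block_name].receive(signal_name, *args)


# Hypothetical usage: a call-handling block reacts to a "SEIZURE" signal.
system = System()
call_block = Block("CALL_HANDLER")
call_block.on("SEIZURE", lambda line: print(f"allocating resources for line {line}"))
system.load(call_block)
system.send("CALL_HANDLER", "SEIZURE", 42)
```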
========================================
[SOURCE: https://en.wikipedia.org/wiki/Rounding#Tie-breaking] | [TOKENS: 7584]
Contents Rounding Rounding or rounding off is the process of adjusting a number to an approximate, more convenient value, often with a shorter or simpler representation. For example, replacing $23.4476 with $23.45, the fraction 312/937 with 1/3, or the expression √2 with 1.414. Rounding is often done to obtain a value that is easier to report and communicate than the original. Rounding can also be important to avoid misleadingly precise reporting of a computed number, measurement, or estimate; for example, a quantity that was computed as 123456 but is known to be accurate only to within a few hundred units is usually better stated as "about 123500". On the other hand, rounding of exact numbers will introduce some round-off error in the reported result. Rounding is almost unavoidable when reporting many computations – especially when dividing two numbers in integer or fixed-point arithmetic; when computing mathematical functions such as square roots, logarithms, and sines; or when using a floating-point representation with a fixed number of significant digits. In a sequence of calculations, these rounding errors generally accumulate, and in certain ill-conditioned cases they may make the result meaningless. Accurate rounding of transcendental mathematical functions is difficult because the number of extra digits that need to be calculated to resolve whether to round up or down cannot be known in advance. This problem is known as "the table-maker's dilemma". Rounding has many similarities to the quantization that occurs when physical quantities must be encoded by numbers or digital signals. A wavy equals sign (≈) is sometimes used to indicate rounding of exact numbers, e.g. 9.98 ≈ 10. This sign was introduced by Alfred George Greenhill in 1892. A rounding method ideally satisfies several desirable characteristics, but because it is not usually possible for a method to satisfy all of them, many different rounding methods exist. As a general rule, rounding is idempotent; i.e., once a number has been rounded, rounding it again to the same precision will not change its value. Rounding functions are also monotonic; i.e., rounding two numbers to the same absolute precision will not exchange their order (but may give the same value). In the general case of a discrete range, they are piecewise constant functions. Types of rounding Rounding problems come in several typical forms. Rounding to integer The most basic form of rounding is to replace an arbitrary number by an integer. All the following rounding modes are concrete implementations of an abstract single-argument "round()" procedure. These are true functions (with the exception of those that use randomness). The first four methods described below are called directed rounding to an integer, as the displacements from the original number x to the rounded value y are all directed toward or away from the same limiting value (0, +∞, or −∞). Directed rounding is used in interval arithmetic and is often required in financial calculations. If x is positive, round-down is the same as round-toward-zero, and round-up is the same as round-away-from-zero. If x is negative, round-down is the same as round-away-from-zero, and round-up is the same as round-toward-zero. In any case, if x is an integer, y is just x. Where many calculations are done in sequence, the choice of rounding method can have a very significant effect on the result. A famous instance involved a new index set up by the Vancouver Stock Exchange in 1982. 
It was initially set at 1000.000 (three decimal places of accuracy), and after 22 months had fallen to about 520, although the market appeared to be rising. The problem was caused by the index being recalculated thousands of times daily, and always being truncated (rounded down) to 3 decimal places, in such a way that the rounding errors accumulated. Recalculating the index for the same period using rounding to the nearest thousandth rather than truncation corrected the index value from 524.811 up to 1098.892. For the examples below, sgn(x) refers to the sign function applied to the original number, x. One may round down (or take the floor, or round toward negative infinity): y is the largest integer that does not exceed x. For example, 23.7 gets rounded to 23, and −23.2 gets rounded to −24. One may also round up (or take the ceiling, or round toward positive infinity): y is the smallest integer that is not less than x. For example, 23.2 gets rounded to 24, and −23.7 gets rounded to −23. One may also round toward zero (or truncate, or round away from infinity): y is the integer that is closest to x such that it is between 0 and x (included); i.e. y is the integer part of x, without its fraction digits. For example, 23.7 gets rounded to 23, and −23.7 gets rounded to −23. One may also round away from zero (or round toward infinity): y is the integer that is closest to 0 (or equivalently, to x) such that x is between 0 and y (included). For example, 23.2 gets rounded to 24, and −23.2 gets rounded to −24. The six methods that follow are forms of rounding to the nearest integer. Rounding a number x to the nearest integer requires some tie-breaking rule for those cases when x is exactly half-way between two integers – that is, when the fraction part of x is exactly 0.5. If it were not for the 0.5 fractional parts, the round-off errors introduced by the round to nearest method would be symmetric: for every fraction that gets rounded down (such as 0.268), there is a complementary fraction (namely, 0.732) that gets rounded up by the same amount. When rounding a large set of fixed-point numbers with uniformly distributed fractional parts, the rounding errors of all values, with the omission of those having 0.5 fractional part, would statistically compensate each other. This means that the expected (average) value of the rounded numbers is equal to the expected value of the original numbers when numbers with fractional part 0.5 from the set are removed. In practice, floating-point numbers are typically used, which have even more computational nuances because they are not equally spaced. One may round half up (or round half toward positive infinity), a tie-breaking rule that is widely used in many disciplines.[citation needed] That is, half-way values of x are always rounded up. If the fractional part of x is exactly 0.5, then y = x + 0.5. For example, 23.5 gets rounded to 24, and −23.5 gets rounded to −23. Some programming languages (such as Java and Python) use "half up" to refer to round half away from zero rather than round half toward positive infinity. This method only requires checking one digit to determine rounding direction in two's complement and similar representations. One may also round half down (or round half toward negative infinity) as opposed to the more common round half up. If the fractional part of x is exactly 0.5, then y = x − 0.5. For example, 23.5 gets rounded to 23, and −23.5 gets rounded to −24. 
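A short sketch can make these definitions concrete. The following Python functions are a minimal illustration, assuming inputs (such as exact halves) that binary floating point represents exactly; they implement the four directed modes and the two "round half toward an infinity" tie-breaking rules, and check them against the examples above.

```python
import math

# A sketch of the directed rounding modes and the "half toward an infinity"
# tie-breaking rules described above. Assumes inputs whose halves are exactly
# representable, so binary floating point does not disturb the ties.

def round_down(x):         # toward negative infinity (floor)
    return math.floor(x)

def round_up(x):           # toward positive infinity (ceiling)
    return math.ceil(x)

def round_toward_zero(x):  # truncation
    return math.trunc(x)

def round_away_from_zero(x):
    return math.ceil(x) if x > 0 else math.floor(x)

def round_half_up(x):      # ties go toward positive infinity
    return math.floor(x + 0.5)

def round_half_down(x):    # ties go toward negative infinity
    return math.ceil(x - 0.5)

assert (round_down(23.7), round_down(-23.2)) == (23, -24)
assert (round_up(23.2), round_up(-23.7)) == (24, -23)
assert (round_toward_zero(23.7), round_toward_zero(-23.7)) == (23, -23)
assert (round_away_from_zero(23.2), round_away_from_zero(-23.2)) == (24, -24)
assert (round_half_up(23.5), round_half_up(-23.5)) == (24, -23)
assert (round_half_down(23.5), round_half_down(-23.5)) == (23, -24)
```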
Some programming languages (such as Java and Python) use "half down" to refer to round half toward zero rather than round half toward negative infinity. One may also round half toward zero (or round half away from infinity) as opposed to the conventional round half away from zero. If the fractional part of x is exactly 0.5, then y = x − 0.5 if x is positive, and y = x + 0.5 if x is negative. For example, 23.5 gets rounded to 23, and −23.5 gets rounded to −23. This method treats positive and negative values symmetrically, and therefore is free of overall positive/negative bias if the original numbers are positive or negative with equal probability. It does, however, still have bias toward zero. One may also round half away from zero (or round half toward infinity), a tie-breaking rule that is commonly taught and used, namely: If the fractional part of x is exactly 0.5, then y = x + 0.5 if x is positive, and y = x − 0.5 if x is negative. For example, 23.5 gets rounded to 24, and −23.5 gets rounded to −24. This can be more efficient on computers that use sign-magnitude representation for the values to be rounded, because only the first omitted digit needs to be considered to determine if it rounds up or down. This is one method used when rounding to significant figures due to its simplicity. This method, also known as commercial rounding,[citation needed] treats positive and negative values symmetrically, and therefore is free of overall positive/negative bias if the original numbers are positive or negative with equal probability. It does, however, still have bias away from zero. It is often used for currency conversions and price roundings (when the amount is first converted into the smallest significant subdivision of the currency, such as cents of a euro) as it is easy to explain by just considering the first fractional digit, independently of supplementary precision digits or sign of the amount (for strict equivalence between the payer and recipient of the amount). One may also round half to even, a tie-breaking rule without positive/negative bias and without bias toward/away from zero. By this convention, if the fractional part of x is 0.5, then y is the even integer nearest to x. Thus, for example, 23.5 becomes 24, as does 24.5; however, −23.5 becomes −24, as does −24.5. This function minimizes the expected error when summing over rounded figures, regardless of the inputs being mostly positive or mostly negative, provided they are neither mostly even nor mostly odd. This variant of the round-to-nearest method is also called convergent rounding, statistician's rounding, Dutch rounding, Gaussian rounding, odd–even rounding, or bankers' rounding. This is the default rounding mode used in IEEE 754 operations for results in binary floating-point formats. By eliminating bias, repeated addition or subtraction of independent numbers, as in a one-dimensional random walk, will give a rounded result with an error that tends to grow in proportion to the square root of the number of operations rather than linearly. However, this rule distorts the distribution by increasing the probability of evens relative to odds. That is why this rule is suited to situations where sums are more important than the distribution of the rounded values.[clarification needed] One may also round half to odd, a similar tie-breaking rule to round half to even. In this approach, if the fractional part of x is 0.5, then y is the odd integer nearest to x. Thus, for example, 23.5 becomes 23, as does 22.5; while −23.5 becomes −23, as does −22.5. 
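The remaining tie-breaking rules can be sketched the same way. The Python fragment below is an illustrative implementation under the same exact-representation assumption, checked against the worked examples above.

```python
import math

# A sketch of the remaining tie-breaking rules described above. Non-tie values
# round to the nearest integer; only exact .5 fractions are treated specially.

def _is_tie(x):
    return abs(x - math.floor(x)) == 0.5

def round_half_toward_zero(x):
    if _is_tie(x):
        return math.trunc(x)
    return math.floor(x + 0.5) if x > 0 else math.ceil(x - 0.5)

def round_half_away_from_zero(x):
    if _is_tie(x):
        return math.ceil(x) if x > 0 else math.floor(x)
    return math.floor(x + 0.5) if x > 0 else math.ceil(x - 0.5)

def round_half_to_even(x):
    lo, hi = math.floor(x), math.ceil(x)
    if _is_tie(x):
        return lo if lo % 2 == 0 else hi
    return lo if x - lo < 0.5 else hi

def round_half_to_odd(x):
    lo, hi = math.floor(x), math.ceil(x)
    if _is_tie(x):
        return lo if lo % 2 != 0 else hi
    return lo if x - lo < 0.5 else hi

assert (round_half_toward_zero(23.5), round_half_toward_zero(-23.5)) == (23, -23)
assert (round_half_away_from_zero(23.5), round_half_away_from_zero(-23.5)) == (24, -24)
assert [round_half_to_even(v) for v in (23.5, 24.5, -23.5, -24.5)] == [24, 24, -24, -24]
assert [round_half_to_odd(v) for v in (23.5, 22.5, -23.5, -22.5)] == [23, 23, -23, -23]
```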
This method is also free from positive/negative bias and bias toward/away from zero, provided the numbers to be rounded are neither mostly even nor mostly odd. It also shares the round half to even property of distorting the original distribution, as it increases the probability of odds relative to evens. It was the method used for bank balances in the United Kingdom when it decimalized its currency.[clarification needed] This variant is almost never used in computations, except in situations where one wants to avoid increasing the scale of floating-point numbers, which have a limited exponent range. With round half to even, a finite number can round to infinity, and a small denormal value can round to a normal non-zero value. Effectively, this mode prefers preserving the existing scale of tie numbers, avoiding out-of-range results when possible for numeral systems of even radix (such as binary and decimal).[clarification needed (see talk)] One method, more obscure than most, is to alternate direction when rounding a number with 0.5 fractional part. All others are rounded to the closest integer. Whenever the fractional part is 0.5, alternate rounding up or down: for the first occurrence of a 0.5 fractional part, round up, for the second occurrence, round down, and so on. Alternatively, the first 0.5 fractional part rounding can be determined by a random seed. "Up" and "down" can be any two rounding methods that oppose each other: toward and away from positive infinity, or toward and away from zero. If 0.5 fractional parts occur significantly more often than the occurrence "counting" is restarted, then the method is effectively bias-free. With guaranteed zero bias, it is useful if the numbers are to be summed or averaged. Another approach is random tie-breaking: if the fractional part of x is 0.5, choose y randomly between x + 0.5 and x − 0.5, with equal probability. All others are rounded to the closest integer. Like round-half-to-even and round-half-to-odd, this rule is essentially free of overall bias, but it is also fair among even and odd y values. An advantage over alternate tie-breaking is that the last direction of rounding on the 0.5 fractional part does not have to be "remembered". Rounding to either the closest integer toward negative infinity or the closest integer toward positive infinity, with a probability dependent on proximity, is called stochastic rounding and will give an unbiased result on average. For example, 1.6 would be rounded to 1 with probability 0.4 and to 2 with probability 0.6. Stochastic rounding can be accurate in a way that a rounding function can never be. For example, suppose one started with 0 and added 0.3 to that one hundred times while rounding the running total between every addition. The result would be 0 with regular rounding, but with stochastic rounding, the expected result would be 30, which is the same value obtained without rounding. This can be useful in machine learning where the training may use low precision arithmetic iteratively. Stochastic rounding is also a way to achieve 1-dimensional dithering. Rounding to other values The most common type of rounding is to round to an integer; or, more generally, to an integer multiple of some increment – such as rounding to whole tenths of seconds, hundredths of a dollar, to whole multiples of 1/2 or 1/8 inch, to whole dozens or thousands, etc. 
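Stochastic rounding is easy to sketch. The following Python fragment is an illustrative implementation (the function name is not a standard API) and reproduces the running-total experiment just described: adding 0.3 one hundred times while rounding after every addition.

```python
import random

# An illustrative sketch of stochastic rounding: round down or up with
# probability proportional to proximity, so the expected value of the
# rounded number equals the original value.

def stochastic_round(x, rng=random):
    lower = int(x // 1)                    # floor of x
    return lower + (1 if rng.random() < x - lower else 0)

# The running-total experiment described above: add 0.3 one hundred times,
# rounding after every addition. Ordinary rounding keeps the total at 0;
# stochastic rounding gives about 30 on average (the exact unrounded sum).
total = 0
for _ in range(100):
    total = stochastic_round(total + 0.3)
print(total)   # varies from run to run; the expected value is 30
```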
In general, rounding a number x to a multiple of some specified positive value m entails three steps: dividing x by m, rounding the quotient to an integer, and multiplying the result by m. For example, rounding x = 2.1784 dollars to whole cents (i.e., to a multiple of 0.01) entails computing 2.1784 / 0.01 = 217.84, then rounding that to 218, and finally computing 218 × 0.01 = 2.18. When rounding to a predetermined number of significant digits, the increment m depends on the magnitude of the number to be rounded (or of the rounded result). The increment m is normally a finite fraction in whatever numeral system is used to represent the numbers. For display to humans, that usually means the decimal numeral system (that is, m is an integer times a power of 10, like 1/1000 or 25/100). For intermediate values stored in digital computers, it often means the binary numeral system (m is an integer times a power of 2). The abstract single-argument "round()" function that returns an integer from an arbitrary real value has at least a dozen distinct concrete definitions presented in the rounding to integer section. The abstract two-argument "roundToMultiple()" function is formally defined here, but in many cases it is used with the implicit value m = 1 for the increment and then reduces to the equivalent abstract single-argument function, with also the same dozen distinct concrete definitions. Rounding to a specified power is very different from rounding to a specified multiple; for example, it is common in computing to need to round a number to a whole power of 2. Rounding a positive number x to a power of some positive number b other than 1 proceeds along analogous lines, and many of the caveats applicable to rounding to a multiple are applicable to rounding to a power. In the chromatic "twelve-tone" scale of music, 3⁄2 is rounded to 2^(7/12) (a fifth), 4⁄3 is rounded to 2^(5/12) (a fourth), 5⁄4 is rounded to 2^(4/12) (a major third), 6⁄5 is rounded to 2^(3/12) (a minor third), and 9⁄8 is rounded to 2^(2/12) (a diminished third). This type of rounding, which is also named rounding to a logarithmic scale, is a variant of rounding to a specified power. Rounding on a logarithmic scale is accomplished by taking the log of the amount and doing normal rounding to the nearest value on the log scale. For example, resistors are supplied with preferred numbers on a logarithmic scale. In particular, resistors with 10% accuracy are supplied with nominal values 100, 120, 150, 180, 220, etc., rounded to multiples of 10 (the E12 series). If a calculation indicates a resistor of 165 ohms is required, then log(150) = 2.176, log(165) = 2.217 and log(180) = 2.255. The logarithm of 165 is closer to the logarithm of 180; therefore a 180 ohm resistor would be the first choice if there are no other considerations. Whether a value x ∈ (a, b) rounds to a or b depends upon whether the squared value x² is greater than or less than the product ab. The value 165 rounds to 180 in the resistors example because 165² = 27225 is greater than 150 × 180 = 27000. In floating-point arithmetic, rounding aims to turn a given value x into a value y with a specified number of significant digits. In other words, y should be a multiple of a number m that depends on the magnitude of x. The number m is a power of the base (usually 2 or 10) of the floating-point representation. Apart from this detail, all the variants of rounding discussed above apply to the rounding of floating-point numbers as well. 
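Two of the procedures described above, rounding to a multiple of an increment and rounding on a logarithmic scale, can be sketched in a few lines. The Python fragment below is a minimal illustration; the function names are invented for the example, and the E12 values reproduce the 165-ohm resistor case.

```python
import math

# A minimal sketch: round x to a multiple of the increment m, and pick the
# preferred value whose logarithm is nearest to that of x.

def round_to_multiple(x, m, rounding=round):
    return rounding(x / m) * m

assert round(2.1784 / 0.01) == 218                           # divide, round ...
assert abs(round_to_multiple(2.1784, 0.01) - 2.18) < 1e-12   # ... multiply back

def round_on_log_scale(x, preferred):
    return min(preferred, key=lambda p: abs(math.log10(x) - math.log10(p)))

e12_decade = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]
assert round_on_log_scale(165, e12_decade) == 180
# Equivalent check without logarithms: 165 squared exceeds 150 * 180.
assert 165**2 > 150 * 180
```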
The algorithm for such rounding is presented in the Scaled rounding section above, but with a constant scaling factor s = 1, and an integer base b > 1. Where the rounded result would overflow, the result for a directed rounding is either the appropriate signed infinity when "rounding away from zero", or the highest representable positive finite number (or the lowest representable negative finite number if x is negative), when "rounding toward zero". The result of an overflow for the usual case of round to nearest is always the appropriate infinity. In some contexts it is desirable to round a given number x to a "neat" fraction – that is, the nearest fraction y = m/n whose numerator m and denominator n do not exceed a given maximum. This problem is fairly distinct from that of rounding a value to a fixed number of decimal or binary digits, or to a multiple of a given unit m. This problem is related to Farey sequences, the Stern–Brocot tree, and continued fractions. Finished lumber, writing paper, electronic components, and many other products are usually sold in only a few standard values. Many design procedures describe how to calculate an approximate value, and then "round" to some standard size using phrases such as "round down to nearest standard value", "round up to nearest standard value", or "round to nearest standard value". When a set of preferred values is equally spaced on a logarithmic scale, choosing the closest preferred value to any given value can be seen as a form of scaled rounding. Such rounded values can be directly calculated. More general rounding rules can separate values at arbitrary break points, used for example in data binning. A related mathematically formalized tool is signpost sequences, which use notions of distance other than the simple difference – for example, a sequence may round to the integer with the smallest relative (percent) error. Rounding in other contexts When digitizing continuous signals, such as sound waves, the overall effect of a number of measurements is more important than the accuracy of each individual measurement. In these circumstances, dithering, and a related technique, error diffusion, are normally used. A related technique called pulse-width modulation is used to achieve analog type output from an inertial device by rapidly pulsing the power with a variable duty cycle. Delta-sigma modulation is commonly used for converting between real-world signals and digital signals, which allows control of the frequency statistics of quantization. Error diffusion tries to ensure the error, on average, is minimized. When dealing with a gentle slope from one to zero, the output would be zero for the first few terms until the sum of the error and the current value becomes greater than 0.5, in which case a 1 is output and the difference subtracted from the error so far. Floyd–Steinberg dithering is a popular error diffusion procedure when digitizing images. As a one-dimensional example, suppose the numbers 0.9677, 0.9204, 0.7451, and 0.3091 occur in order and each is to be rounded to a multiple of 0.01. In this case the cumulative sums, 0.9677, 1.8881 = 0.9677 + 0.9204, 2.6332 = 0.9677 + 0.9204 + 0.7451, and 2.9423 = 0.9677 + 0.9204 + 0.7451 + 0.3091, are each rounded to a multiple of 0.01: 0.97, 1.89, 2.63, and 2.94. The first of these and the differences of adjacent values give the desired rounded values: 0.97, 0.92 = 1.89 − 0.97, 0.74 = 2.63 − 1.89, and 0.31 = 2.94 − 2.63. 
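The one-dimensional error-diffusion example just given can be reproduced directly. The sketch below uses exact fractions so binary floating point cannot disturb the sums; it rounds each running total to a multiple of 0.01 and recovers the per-item values as differences of the rounded cumulative sums.

```python
from fractions import Fraction

# A sketch of the one-dimensional error-diffusion example above, using exact
# fractions: round the cumulative sums to multiples of 0.01, then take
# differences to get the per-item rounded values.

values = [Fraction("0.9677"), Fraction("0.9204"), Fraction("0.7451"), Fraction("0.3091")]
step = Fraction("0.01")

rounded_cumulative = []
total = Fraction(0)
for v in values:
    total += v
    rounded_cumulative.append(round(total / step) * step)

diffused = [rounded_cumulative[0]] + [
    b - a for a, b in zip(rounded_cumulative, rounded_cumulative[1:])
]
print([float(x) for x in diffused])   # [0.97, 0.92, 0.74, 0.31]
```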
Monte Carlo arithmetic is a technique in Monte Carlo methods where the rounding is randomly up or down. Stochastic rounding can be used for Monte Carlo arithmetic, but in general, just rounding up or down with equal probability is more often used. Repeated runs will give a random distribution of results which can indicate the stability of the computation. It is possible to use rounded arithmetic to evaluate the exact value of a function with integer domain and range. For example, if an integer n is known to be a perfect square, its square root can be computed by converting n to a floating-point value z, computing the approximate square root x of z with floating point, and then rounding x to the nearest integer y. If n is not too big, the floating-point round-off error in x will be less than 0.5, so the rounded value y will be the exact square root of n. This is essentially why slide rules could be used for exact arithmetic. Rounding a number twice in succession to different levels of precision, with the latter precision being coarser, is not guaranteed to give the same result as rounding once to the final precision except in the case of directed rounding.[nb 2] For instance, rounding 9.46 to the nearest tenth gives 9.5, and then 10 when rounding to the nearest integer using rounding half to even, but would give 9 when rounded directly using the same method. Borman and Chatfield discuss the implications of double rounding when comparing data rounded to one decimal place to specification limits expressed using integers. In Martinez v. Allstate and Sendejo v. Farmers, litigated between 1995 and 1997, the insurance companies argued that double rounding premiums was permissible and in fact required. The US courts ruled against the insurance companies and ordered them to adopt rules to ensure single rounding. Some computer languages and the IEEE 754-2008 standard dictate that in straightforward calculations the result should not be rounded twice. This has been a particular problem with Java, as it is designed to be run identically on different machines; special programming tricks have had to be used to achieve this with x87 floating point. The Java language was changed to allow different results where the difference does not matter and to require a strictfp qualifier to be used when the results have to conform accurately; strict floating point was restored in Java 17. In some algorithms, an intermediate result is computed in a larger precision, then must be rounded to the final precision. Double rounding can be avoided by choosing an adequate rounding for the intermediate computation. This consists of avoiding intermediate results that fall on midpoints of the final rounding (except when the midpoint is exact). In binary arithmetic, the idea is to round the result toward zero, and set the least significant bit to 1 if the rounded result is inexact; this rounding is called sticky rounding. Equivalently, it consists of returning the intermediate result when it is exactly representable, and the nearest floating-point number with an odd significand otherwise; this is why it is also known as rounding to odd. A concrete implementation of this approach, for binary and decimal arithmetic, is rounding to prepare for shorter precision. This rounding mode is used to avoid getting a potentially wrong result after multiple roundings. 
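The double-rounding pitfall described above can be reproduced with Python's decimal module; this is only an illustrative sketch of the effect, using round half to even at every step.

```python
from decimal import Decimal, ROUND_HALF_EVEN

# A sketch of the double-rounding pitfall: rounding 9.46 to a tenth and then
# to an integer overshoots, while a single rounding to an integer does not.

x = Decimal("9.46")

single = x.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)
print(single)                          # 9   (one rounding, the correct result)

intermediate = x.quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN)
double = intermediate.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)
print(intermediate, double)            # 9.5 10   (the second rounding overshoots)
```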
This can be achieved if all roundings except the final one are done using rounding to prepare for shorter precision ("RPSP"), and only the final rounding uses the externally requested mode. With decimal arithmetic, final digits of 0 and 5 are avoided when the input is not representable exactly; if there is a choice between numbers with the least significant digit 0 or 1, 4 or 5, 5 or 6, 9 or 0, then the digit different from 0 or 5 shall be selected; otherwise, the choice is arbitrary. IBM defines that, in the latter case, a digit with the smaller magnitude shall be selected. RPSP can be applied with the step between two consecutive roundings as small as a single digit (for example, rounding to 1/10 can be applied after rounding to 1/100). In the example from the "Double rounding" section, rounding 9.46 with RPSP to one decimal gives 9.4, which in turn gives 9 when rounded to integer. With binary arithmetic, this rounding is also called "round to odd" (not to be confused with "round half to odd"), and the same procedure applies when rounding to 1/4 (0.01 in binary). For correct results with binary arithmetic, each rounding step must remove at least 2 binary digits; otherwise, wrong results may appear. For example, rounding 3.25 first to a multiple of 1/2 and only then to an integer yields 4, whereas if the erroneous middle step is removed, the final rounding to integer rounds 3.25 to the correct value of 3. RPSP is implemented in hardware in IBM zSeries and pSeries. In Python module "Decimal", Tcl module "math", Haskell package "decimal-arithmetic", and possibly others, this mode is called ROUND_05UP or round05up. William M. Kahan coined the term "The Table-Maker's Dilemma" for the unknown cost of rounding transcendental functions: Nobody knows how much it would cost to compute y^w correctly rounded for every two floating-point arguments at which it does not over/underflow. Instead, reputable math libraries compute elementary transcendental functions mostly within slightly more than half an ulp and almost always well within one ulp. Why can't y^w be rounded within half an ulp like SQRT? Because nobody knows how much computation it would cost... No general way exists to predict how many extra digits will have to be carried to compute a transcendental expression and round it correctly to some preassigned number of digits. Even the fact (if true) that a finite number of extra digits will ultimately suffice may be a deep theorem. The IEEE 754 floating-point standard guarantees that add, subtract, multiply, divide, fused multiply–add, square root, and floating-point remainder will give the correctly rounded result of the infinite-precision operation. No such guarantee was given in the 1985 standard for more complex functions and they are typically only accurate to within the last bit at best. However, the 2008 standard guarantees that conforming implementations will give correctly rounded results which respect the active rounding mode; implementation of the functions, however, is optional. Using the Gelfond–Schneider theorem and Lindemann–Weierstrass theorem, many of the standard elementary functions can be proved to return transcendental results, except on some well-known arguments; therefore, from a theoretical point of view, it is always possible to correctly round such functions. However, for an implementation of such a function, determining a limit for a given precision on how accurate results need to be computed, before a correctly rounded result can be guaranteed, may demand a lot of computation time or may be out of reach. 
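Python's decimal module exposes this mode as ROUND_05UP, so the 9.46 example can be checked directly. The sketch below applies RPSP for the intermediate rounding and round half to even only at the end.

```python
from decimal import Decimal, ROUND_05UP, ROUND_HALF_EVEN

# A sketch of RPSP via decimal's ROUND_05UP: the intermediate result avoids a
# final digit of 0 or 5, so the later, coarser rounding is not misled by an
# artificial midpoint.

x = Decimal("9.46")

intermediate = x.quantize(Decimal("0.1"), rounding=ROUND_05UP)
final = intermediate.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)
print(intermediate, final)   # 9.4 9   (matches rounding 9.46 to an integer directly)
```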
In practice, when this limit is not known (or only a very large bound is known), some decision has to be made in the implementation (see below); but according to a probabilistic model, correct rounding can be satisfied with a very high probability when using an intermediate accuracy of up to twice the number of digits of the target format plus some small constant (after taking special cases into account). Some programming packages offer correct rounding. The GNU MPFR package gives correctly rounded arbitrary precision results. Some other libraries implement elementary functions with correct rounding in IEEE 754 double precision (binary64). There exist computable numbers for which a rounded value can never be determined no matter how many digits are calculated. Specific instances cannot be given, but this follows from the undecidability of the halting problem. For instance, if Goldbach's conjecture is true but unprovable, then the result of rounding the following value, n, up to the next integer cannot be determined: either n = 1 + 10^−k, where k is the first even number greater than 4 which is not the sum of two primes, or n = 1 if there is no such number. The rounded result is 2 if such a number k exists and 1 otherwise. The value before rounding can however be approximated to any given precision even if the conjecture is unprovable. Rounding can adversely affect a string search for a number. For example, π rounded to four digits is "3.1416" but a simple search for this string will not discover "3.14159" or any other value of π rounded to more than four digits. In contrast, truncation does not suffer from this problem; for example, a simple string search for "3.1415", which is π truncated to four digits, will discover values of π truncated to more than four digits. History The concept of rounding is very old, perhaps older than the concept of division itself. Some ancient clay tablets found in Mesopotamia contain tables with rounded values of reciprocals and square roots in base 60. Rounded approximations to π, the length of the year, and the length of the month are also ancient – see base 60 examples. The round-half-to-even method has served as American Standard Z25.1 and ASTM standard E-29 since 1940. The origins of the terms unbiased rounding and statistician's rounding are fairly self-explanatory. In the 1906 fourth edition of Probability and Theory of Errors, Robert Simpson Woodward called this "the computer's rule", indicating that it was then in common use by human computers who calculated mathematical tables. For example, it was recommended in Simon Newcomb's c. 1882 book Logarithmic and Other Mathematical Tables. Lucius Tuttle's 1916 Theory of Measurements called it a "universally adopted rule" for recording physical measurements. Churchill Eisenhart indicated the practice was already "well established" in data analysis by the 1940s. The origin of the term bankers' rounding remains more obscure. If this rounding method was ever a standard in banking, the evidence has proved extremely difficult to find. To the contrary, section 2 of the European Commission report The Introduction of the Euro and the Rounding of Currency Amounts suggests that there had previously been no standard approach to rounding in banking; and it specifies that "half-way" amounts should be rounded up. Until the 1980s, the rounding method used in floating-point computer arithmetic was usually fixed by the hardware, poorly documented, inconsistent, and different for each brand and model of computer. 
This situation changed after the IEEE 754 floating-point standard was adopted by most computer manufacturers. The standard allows the user to choose among several rounding modes, and in each case specifies precisely how the results should be rounded. These features made numerical computations more predictable and machine-independent, and made possible the efficient and consistent implementation of interval arithmetic. Much recent research concerns rounding to multiples of 5 or 2; for example, Jörg Baten used age heaping in many studies to evaluate the numeracy level of ancient populations. He came up with the ABCC Index, which makes it possible to compare numeracy across regions even without historical sources in which population literacy was measured. Rounding functions in programming languages Most programming languages provide functions or special syntax to round fractional numbers in various ways. The earliest numeric languages, such as Fortran and C, provided only one method, usually truncation (toward zero). This default method could be implied in certain contexts, such as when assigning a fractional number to an integer variable, or using a fractional number as an index of an array. Other kinds of rounding had to be programmed explicitly; for example, rounding a positive number to the nearest integer could be implemented by adding 0.5 and truncating. In recent decades, however, the syntax and the standard libraries of most languages have commonly provided at least the four basic rounding functions (up, down, to nearest, and toward zero). The tie-breaking method can vary depending on the language and version or might be selectable by the programmer. Several languages follow the lead of the IEEE 754 floating-point standard, and define these functions as taking a double-precision float argument and returning the result of the same type, which then may be converted to an integer if necessary. This approach may avoid spurious overflows because floating-point types have a larger range than integer types. Some languages, such as PHP, provide functions that round a value to a specified number of decimal digits (e.g., from 4321.5678 to 4321.57 or 4300). In addition, many languages provide a printf or similar string formatting function, which allows one to convert a fractional number to a string, rounded to a user-specified number of decimal places (the precision). On the other hand, truncation (round to zero) is still the default rounding method used by many languages, especially for the division of two integer values. In contrast, CSS and SVG do not define any specific maximum precision for numbers and measurements, which they treat and expose in their DOM and in their IDL interface as strings as if they had infinite precision, and do not discriminate between integers and floating-point values; however, the implementations of these languages will typically convert these numbers into IEEE 754 double-precision floating-point values before exposing the computed digits with a limited precision (notably within standard JavaScript or ECMAScript interface bindings). Other rounding standards Some disciplines or institutions have issued standards or directives for rounding. In a guideline issued in mid-1966, the U.S. Office of the Federal Coordinator for Meteorology determined that weather data should be rounded to the nearest round number, with the "round half up" tie-breaking rule. For example, 1.5 rounded to integer should become 2, and −1.5 should become −1. 
Prior to that date, the tie-breaking rule was "round half away from zero". Some meteorologists may write "−0" to indicate a temperature between 0.0 and −0.5 degrees (exclusive) that was rounded to an integer. This notation is used when the negative sign is considered important, no matter how small the magnitude is; for example, when rounding temperatures in the Celsius scale, where below zero indicates freezing.[citation needed]
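A language's built-in behaviour can differ from such directives. The short Python sketch below, purely illustrative, contrasts the default round(), which rounds halves to even, with the "round half up" convention described above.

```python
import math

# Python's round() uses round half to even, so ties go to the even neighbour;
# the weather guideline instead sends every tie toward positive infinity.

print(round(0.5), round(1.5), round(2.5))             # 0 2 2   (round half to even)
print(math.floor(1.5 + 0.5), math.floor(-1.5 + 0.5))  # 2 -1    (round half up)
print(round(-1.5))                                    # -2      (half to even: -2 is even)
```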
========================================
[SOURCE: https://en.wikipedia.org/wiki/List_of_mergers_and_acquisitions_by_Meta_Platforms] | [TOKENS: 736]
Contents List of mergers and acquisitions by Meta Platforms Meta Platforms (formerly Facebook, Inc.) is a technology company that has acquired 91 other companies, including WhatsApp. The WhatsApp acquisition closed at $16 billion, more than $40 per user of the platform. Meta also purchased the defunct company ConnectU in a court settlement and acquired intellectual property formerly held by rival Friendster. The majority of the companies acquired by Meta are based in the United States, and in turn, a large percentage of these companies are based in or around the San Francisco Bay Area. Meta has also made investments in LuckyCal and Wildfire Interactive. Meta's acquisitions have primarily been "talent acquisitions", and acquired products are often shut down. In 2009, Meta (as Facebook) CEO Mark Zuckerberg posted a question on Quora, titled "What startups would be good acquisitions for Facebook?", receiving 79 answers. He stated in 2010 that "We have not once bought a company for the company. We buy companies to get excellent people... In order to have a really entrepreneurial culture one of the key things is to make sure we're recruiting the best people. One of the ways to do this is to focus on acquiring great companies with great founders." The Instagram acquisition, announced on April 9, 2012, appears to have been the first exception to this pattern. While Meta has continued its pattern of primarily talent acquisitions, other notable product-focused acquisitions include the $19 billion WhatsApp acquisition and the $2 billion Oculus VR acquisition. Acquisitions August 23, 2005 July 19, 2007 June 23, 2008 August 10, 2009 February 19, 2010 March 2, 2010 May 13, 2010 May 26, 2010 July 6, 2010 July 8, 2010 August 15, 2010 August 20, 2010 October 29, 2010 November 15, 2010 January 25, 2011 March 2, 2011 March 20, 2011 March 24, 2011 April 27, 2011 June 9, 2011 June 9, 2011 August 2, 2011 October 10, 2011 November 8, 2011 December 2, 2011 February 20, 2012 April 9, 2012 April 13, 2012 May 5, 2012 May 15, 2012 May 21, 2012 June 18, 2012 July 14, 2012 July 20, 2012 August 24, 2012 February 28, 2013 March 2013 March 2013 March 14, 2013 April 23, 2013 April 25, 2013 July 18, 2013 August 12, 2013 October 13, 2013 December 17, 2013 January 8, 2014 January 13, 2014 February 19, 2014 March 25, 2014 March 27, 2014 April 24, 2014 August 7, 2014 August 14, 2014 August 26, 2014 January 6, 2015 January 8, 2015 March 14, 2015 May 26, 2015 October 3, 2015 July 16, 2015 March 9, 2016
========================================
[SOURCE: https://en.wikipedia.org/wiki/Music_sequencer] | [TOKENS: 3366]
Contents Music sequencer A music sequencer (or audio sequencer or simply sequencer) is a device or application software that can record, edit, or play back music, by handling note and performance information in several forms, typically CV/Gate, MIDI, or Open Sound Control, and possibly audio and automation data for digital audio workstations (DAWs) and plug-ins. Overview The advent of Musical Instrument Digital Interface (MIDI) in the 1980s gave programmers the opportunity to design software that could more easily record and play back sequences of notes played or programmed by a musician. As the technology matured, sequencers gained more features, such as the ability to record multitrack audio. Sequencers used for audio recording are called digital audio workstations (DAWs). Many modern sequencers can be used to control virtual instruments implemented as software plug-ins. This allows musicians to replace expensive and cumbersome standalone synthesizers with their software equivalents. Today the term sequencer is often used to describe software. However, hardware sequencers still exist. Workstation keyboards have their own proprietary built-in MIDI sequencers. Drum machines and some older synthesizers have their own step sequencer built in. The market demand for standalone hardware MIDI sequencers has diminished greatly due to the greater feature set of their software counterparts. Types of music sequencer Music sequencers can be categorized by the types of data they handle; a music sequencer can also be categorized by its construction and supported modes. Analog sequencers are typically implemented with analog electronics, and play the musical notes designated by a series of knobs or sliders for adjusting the note corresponding to each step in the sequence. They are designed for both composition and live performance; users can change the musical notes at any time without regard to recording mode. The time interval between each musical note (length of each step) may be independently adjustable. Typically, analog sequencers are used to generate repeated minimalistic phrases which may be reminiscent of Tangerine Dream, Giorgio Moroder or trance music. On step sequencers, musical notes are rounded into steps of equal time intervals, and users can enter each musical note without exact timing; instead, the timing and duration of each step can be designated in several different ways. In general, step mode, along with roughly quantized semi-realtime mode, is often supported on drum machines, bass machines and several groove machines. Realtime sequencers record the musical notes in real-time as on audio recorders, and play back musical notes with designated tempo, quantizations, and pitch. For editing, punch-in/out features originating in tape recording workflows are often provided. This mode is widely supported on software sequencers, DAWs, and built-in hardware sequencers. A software sequencer is application software providing the functionality of a music sequencer, and is often provided as one feature of a DAW or an integrated music authoring environment. The user may control the software sequencer either through a graphical user interface or with specialized input devices, such as a MIDI controller. Note manipulation on audio tracks Alternative subsets of audio sequencers exist. This type of software actually controls sequences of audio samples; thus, it can potentially be called an audio sequencer. This technique is sometimes referred to as audio sequencing. 
It may be one origin of audio sequencing. History The early music sequencers were sound-producing devices such as automatic musical instruments, music boxes, mechanical organs, player pianos, and Orchestrions. Player pianos, for example, had much in common with contemporary sequencers. Composers or arrangers transmitted music to piano rolls, which were subsequently edited by technicians who prepared the rolls for mass duplication. Eventually consumers were able to purchase these rolls and play them back on their own player pianos. The origin of automatic musical instruments seems remarkably old. As early as the 9th century, the Persian (Iranian) Banū Mūsā brothers invented a hydropowered organ using exchangeable cylinders with pins, and also an automatic flute-playing machine using steam power, as described in their Book of Ingenious Devices. The Banu Musa brothers' automatic flute player was the first programmable music sequencer device, and the first example of repetitive music technology, powered by hydraulics. In 1206, Al-Jazari, an Arab engineer, invented programmable musical automata, a "robot band" which performed "more than fifty facial and body actions during each musical selection." It was notably the first programmable drum machine. Among the four automaton musicians were two drummers. It was a drum machine where pegs (cams) bumped into little levers that operated the percussion. The drummers could be made to play different rhythms and different drum patterns if the pegs were moved around. In the 14th century, rotating cylinders with pins were used to play a carillon (a set of bells) in Flanders,[citation needed] and at least in the 15th century, barrel organs were seen in the Netherlands. In the late-18th or early-19th century, with the technological advances of the Industrial Revolution, various automatic musical instruments were invented. Some examples: music boxes, barrel organs and barrel pianos consisting of a barrel or cylinder with pins or a flat metal disc with punched holes; or mechanical organs, player pianos and orchestrions using book music / music rolls (piano rolls) with punched holes, etc. These instruments were disseminated widely as popular entertainment devices prior to the inventions of phonographs, radios, and sound films, which eventually eclipsed all such home music production devices. Of them all, punched-paper-tape media remained in use until the mid-20th century. The earliest programmable music synthesizers, including the RCA Mark II Sound Synthesizer in 1957 and the Siemens Synthesizer in 1959, were also controlled via punched tapes similar to piano rolls. Additional inventions grew out of sound film audio technology. The drawn sound technique, which appeared in the late 1920s, is notable as a precursor of today's intuitive graphical user interfaces. In this technique, notes and various sound parameters are triggered by hand-drawn black ink waveforms directly upon the film substrate, hence they resemble piano rolls (or the 'strip charts' of the modern sequencers/DAWs). Drawn soundtrack was often used in early experimental electronic music, including the Variophone developed by Yevgeny Sholpo in 1930 and the Oramics machine designed by Daphne Oram in 1957. During the 1940s–1960s, Raymond Scott, an American composer of electronic music, invented various kinds of music sequencers for his electronic compositions. 
The "Wall of Sound", once covered on the wall of his studio in New York during the 1940s–1950s, was an electro-mechanical sequencer to produce rhythmic patterns, consisting of stepping relays (used on dial pulse telephone exchange), solenoids, control switches, and tone circuits with 16 individual oscillators. Later, Robert Moog would explain it in such terms as "the whole room would go 'clack – clack – clack', and the sounds would come out all over the place". The Circle Machine, developed in 1959, had incandescent bulbs each with its own rheostat, arranged in a ring, and a rotating arm with photocell scanning over the ring, to generate an arbitrary waveform. Also, the rotating speed of the arm was controlled via the brightness of lights, and as a result, arbitrary rhythms were generated. The first electronic sequencer was invented by Raymond Scott, using thyratrons and relays. Clavivox, developed since 1952, was a kind of keyboard synthesizer with sequencer. On its prototype, a theremin manufactured by young Robert Moog was utilized to enable portamento over 3-octave range, and on later version, it was replaced by a pair of photographic film and photocell for controlling the pitch by voltage. In 1968, Ralph Lundsten and Leo Nilsson had a polyphonic synthesizer with sequencer called Andromatic built for them by Erkki Kurenniemi. The step sequencers played rigid patterns of notes using a grid of (usually) 16 buttons, or steps, each step being 1/16 of a measure. These patterns of notes were then chained together to form longer compositions. Sequencers of this kind are still in use, mostly built into drum machines and grooveboxes. They are monophonic by nature, although some are multi-timbral, meaning that they can control several different sounds but only play one note on each of those sounds.[clarification needed] On the other hand, software sequencers were continuously utilized since the 1950s in the context of computer music, including computer-played music (software sequencer), computer-composed music (music synthesis), and computer sound generation (sound synthesis). In June 1951, the first computer music Colonel Bogey was played on CSIRAC, Australia's first digital computer. In 1956, Lejaren Hiller at the University of Illinois at Urbana–Champaign wrote one of the earliest programs for computer music composition on ILLIAC, and collaborated on the first piece, Illiac Suite for String Quartet, with Leonard Issaction. In 1957 Max Mathews at Bell Labs wrote MUSIC, the first widely used program for sound generation, and a 17-second composition was performed by the IBM 704 computer. Subsequently, computer music was mainly researched on the expensive mainframe computers in computer centers, until the 1970s when minicomputers and then microcomputers became available in this field. In Japan, experiments in computer music date back to 1962, when Keio University professor Sekine and Toshiba engineer Hayashi experimented with the TOSBAC computer. This resulted in a piece entitled TOSBAC Suite. Also in 1970, Mathews and F. R. Moore developed the GROOVE (Generated Real-time Output Operations on Voltage-controlled Equipment) system, a first fully developed music synthesis system for interactive composition (that implies sequencer) and realtime performance, using 3C/Honeywell DDP-24 (or DDP-224) minicomputers. 
It used a CRT display to simplify the management of music synthesis in realtime, a 12-bit D/A converter for realtime sound playback, an interface for CV/gate analog devices, and even several controllers including a musical keyboard, knobs, and rotating joysticks to capture realtime performance. In 1971, Electronic Music Studios (EMS) released one of the first digital sequencer products as a module of the Synthi 100, and its derivative, the Synthi Sequencer series. Oberheim then released the DS-2 Digital Sequencer in 1974, and Sequential Circuits released the Model 800 in 1977. In 1977, Roland Corporation released the MC-8 MicroComposer, also called a computer music composer by Roland. It was an early stand-alone, microprocessor-based, digital CV/gate sequencer, and an early polyphonic sequencer. It was equipped with a keypad to enter notes as numeric codes, 16 KB of RAM for a maximum of 5200 notes (large for the time), and a polyphony function which allocated multiple pitch CVs to a single gate. It was capable of eight-channel polyphony, allowing the creation of polyrhythmic sequences. The MC-8 and its descendants (such as the Roland MC-4 Microcomposer) had a significant impact on popular electronic music, influencing electronic music production in the 1970s and 1980s more than any other family of sequencers. The MC-8's earliest known users were Yellow Magic Orchestra in 1978. In 1975, New England Digital (NED) released the ABLE computer (a microcomputer) as a dedicated data-processing unit for the Dartmouth Digital Synthesizer (1973); the later Synclavier series was developed from it. The Synclavier I, released in September 1977, was one of the earliest digital music workstation products with a multitrack sequencer. The Synclavier series evolved from the late 1970s to the mid-1980s, and it also established the integration of digital audio and music sequencing with its Direct-to-Disk option in 1984 and the later Tapeless Studio system. In 1982, Fairlight renewed its CMI as the Series II and added the new sequencer software Page R, which combined step sequencing with sample playback. While there were earlier microprocessor-based sequencers for digital polyphonic synthesizers,[c] manufacturers' early products tended to prefer the newer internal digital buses to the old-style analogue CV/gate interface once used on their prototype systems. Then, in the early 1980s, they again recognized the need for a CV/gate interface and supported it, along with MIDI, as options. Yamaha's GS-1, their first FM digital synthesizer, was released in 1980. In June 1981, Roland Corporation founder Ikutaro Kakehashi proposed the concept of standardization between different manufacturers' instruments as well as computers, to Oberheim Electronics founder Tom Oberheim and Sequential Circuits president Dave Smith. In October 1981, Kakehashi, Oberheim and Smith discussed the concept with representatives from Yamaha, Korg and Kawai. In 1983, the MIDI standard was unveiled by Kakehashi and Smith. The first MIDI sequencer was the Roland MSQ-700, released in 1983. It was not until the advent of MIDI that general-purpose computers started to play a role as sequencers. Following the widespread adoption of MIDI, computer-based MIDI sequencers were developed. MIDI-to-CV/gate converters were then used to enable analogue synthesizers to be controlled by a MIDI sequencer. Since its introduction, MIDI has remained the musical instrument industry standard interface through to the present day. 
In 1987, software sequencers called trackers were developed to realize the low-cost integration of sampled sound and an interactive digital sequencer, as seen in the Fairlight CMI II's Page R. They became popular in the 1980s and 1990s as simple sequencers for creating computer game music, and remain popular in the demoscene and chiptune music. Modern computer digital audio software after the 2000s, such as Ableton Live, incorporates aspects of sequencers among many other features.[clarification needed] In 1978, Japanese personal computers such as the Hitachi Basic Master were equipped with low-bit D/A converters to generate sound, which could be sequenced using Music Macro Language (MML). This was used to produce chiptune video game music. It was not until the advent of MIDI, introduced to the public in 1983, that general-purpose computers really started to play a role as software sequencers. NEC's personal computers, the PC-88 and PC-98, added support for MIDI sequencing with MML programming in 1982. In 1983, Yamaha modules for the MSX featured music production capabilities, real-time FM synthesis with sequencing, MIDI sequencing, and a graphical user interface for the software sequencer. Also in 1983, Roland Corporation's CMU-800 sound module introduced music synthesis and sequencing to the PC, Apple II, and Commodore 64. The spread of MIDI on personal computers was facilitated by Roland's MPU-401, released in 1984. It was the first MIDI-equipped PC sound card, capable of MIDI sound processing and sequencing. After Roland sold MPU sound chips to other sound card manufacturers, it established a universal standard MIDI-to-PC interface. Following the widespread adoption of MIDI, computer-based MIDI software sequencers were developed. A rough timeline of sequencer technologies: mechanical (pre-20th century); Rhythmicon (1930); drum machine (1959–); transistorized drum machine (1964–); step drum machine (1972–); digital drum machine (1980–); groove machine (1981–); "Page R" on Fairlight (1982); tracker (1987–); beat slicer (1990s–); loop sequencer (1998–); note manipulation on audio tracks (2009–).
========================================
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_note-115] | [TOKENS: 10628]
Contents Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the "modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to approximately c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630, by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. 
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could find real and complex roots of polynomials; these designs were published in 1901 by the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine he announced his invention in 1822, in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables". The difference engine was also intended to aid in navigational calculations. In 1833, he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design of a machine capable of calculating formulas like a^x(y − z)^2 for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (slide rule) and aircraft (control systems).[citation needed] Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as the first company with the sole purpose of developing computers in Berlin. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (The Mk I was converted to a Mk II making ten machines in total). 
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs. Changing its function required the re-wiring and re-structuring of the machine. With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948. 
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947 the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, and much lower power consumption and higher density than bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers. 
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl Frosch and Lincoln Derick work on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. System on a Chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory. 
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing devices on the market. These are powered by System on a Chip (SoCs), which are complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways. A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphics cards, sound cards, memory (RAM), motherboards, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits. Input devices are the means by which the operations of a computer are controlled and it is provided with data. Examples include keyboards, mice, joysticks, and touchscreens. Output devices are the means by which a computer provides the results of its calculations in a human-accessible form.
Examples include monitors and printers. The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows (this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU): read the instruction from the memory address held in the program counter; decode it into commands for the other units; fetch any data the instruction requires from memory; have the ALU or other hardware carry out the operation; write the result back to memory or a register; and advance the program counter so that the cycle can begin again. Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is yet another, smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometric functions such as sine, cosine, etc., and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number.
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256), either from 0 to 255 or from −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation; in a single byte, for example, −1 is stored as the bit pattern 1111 1111, the same pattern that would mean 255 if the byte were read as an unsigned number. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, so the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer.
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics.[citation needed] Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn. Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see use in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks.
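The interaction of memory cells, the program counter and jump instructions described above can be made concrete with a small simulation. The following Python sketch is purely illustrative: the LOAD/ADD/STORE/JUMP/HALT instruction names, the single accumulator register and the memory layout are invented for the example rather than taken from any real machine, but the control loop shows how an instruction is fetched from the address held in the program counter, how it is executed, and how a jump is nothing more than overwriting the program counter with a new number.

    # A toy stored-program machine. Memory is one flat list holding both the
    # program and its data; the program counter (pc) is itself just a number.
    def run(memory):
        pc = 0                       # address of the next instruction to fetch
        acc = 0                      # a single accumulator register
        while True:
            op, arg = memory[pc]     # fetch the instruction the pc points at
            pc += 1                  # advance to the following instruction
            if op == "LOAD":         # copy a memory cell into the accumulator
                acc = memory[arg]
            elif op == "ADD":        # add a memory cell to the accumulator
                acc = acc + memory[arg]
            elif op == "STORE":      # write the accumulator back into memory
                memory[arg] = acc
            elif op == "JUMP":       # control flow: overwrite the pc itself
                pc = arg
            elif op == "HALT":       # stop and hand back the final memory state
                return memory

    # Cells 0-3 hold instructions, cells 4-6 hold data; program and data share
    # the same memory, as in a stored-program (von Neumann) design.
    memory = [
        ("LOAD", 4),   # 0: acc = contents of cell 4
        ("ADD", 5),    # 1: acc = acc + contents of cell 5
        ("STORE", 6),  # 2: cell 6 = acc
        ("HALT", 0),   # 3: stop
        2, 3,          # 4, 5: input data
        0,             # 6: the result (5) ends up here
    ]

    print(run(memory)[6])   # prints 5

Because the program occupies ordinary memory cells, it could itself be read or altered like any other data, which is the stored-program idea developed in the next section.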
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. 
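The following example is written in the MIPS assembly language. It adds the integers from 1 to 1,000 and leaves the total in a result register; the particular registers, labels and comments used here are only one illustrative way of writing such a program.

            addi $t0, $zero, 0        # set the running sum to 0
            addi $t1, $zero, 1        # set the current number to 1
    loop:   slti $t2, $t1, 1001       # $t2 = 1 while the current number is 1000 or less
            beq  $t2, $zero, done     # once the number exceeds 1000, leave the loop
            add  $t0, $t0, $t1        # add the current number to the running sum
            addi $t1, $t1, 1          # move on to the next number
            j    loop                 # jump back and test the new number
    done:   add  $v0, $t0, $zero      # copy the finished sum (500,500) into the result register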
Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake, and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages—some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU).
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically, a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. The design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited with having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S.
military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. Logic gates are a common abstraction which can apply to most digital or analog computing paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity. In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data. The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans. Professions and organizations As the use of computers has spread throughout society, there are an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Sephardic_Jewish_cuisine] | [TOKENS: 3110]
Contents Sephardic Jewish cuisine Sephardic Jewish cuisine, belonging to the Sephardic Jews—descendants of the Jewish population of the Iberian Peninsula until their expulsion in 1492—encompasses traditional dishes developed as they resettled in the Ottoman Empire, North Africa, and the Mediterranean, including Jewish communities in Turkey, Greece, Bulgaria, North Macedonia, and Syria, as well as the Sephardic community in the Land of Israel. It may also refer to the culinary traditions of the Western Sephardim, who settled in Holland and England and, from these places, elsewhere. The cuisine of Jerusalem, in particular, is considered predominantly Sephardic. Sephardic Jewish cuisine preserves medieval traditions while also incorporating dishes developed in the regions where Sephardic Jews resettled after the expulsion. Notable dishes include bourekas (savory pastries), eggplant-based dishes, medias (halved vegetables filled with meat or cheese and cooked in tomato sauce), stuffed vegetables, agristada (a sour sauce), tishpishti (a semolina and nut cake), baklava, and cookies such as biscochos and qurbayel. Many of these dishes' names originate from Judaeo-Spanish, Turkish, and Greek, the main languages spoken by Sephardic Jews in the diaspora. As with other Jewish ethnic divisions composing the Jewish Diaspora, Sephardim cooked foods that were popular in their countries of residence, adapting them to Jewish religious dietary requirements, kashrut. Their choice of foods was also determined by economic factors, with many of the dishes based on inexpensive and readily available ingredients. Terminology Sephardi Jews are the Jews of the Iberian Peninsula, who were expelled or forced to convert to Christianity in 1492. Many of those expelled settled in North-African Berber and Arabic-speaking countries, such as Morocco, Tunisia, Algeria and Libya, becoming the North African Sephardim. Those who settled in Greece, Turkey, the Balkans, Syria, the Lebanon and the Holy Land became the Eastern Sephardim. The Western Sephardim, also known more ambiguously as the Spanish and Portuguese Jews, left Spain and Portugal as New Christians in a steady stream over the next few centuries, and converted back to Judaism once in Holland, England, etc.[citation needed] While the pre-existing Jews of the countries in which they settled (those in the Greater Middle East, for example, are called Mizrahim) are distinct, the term Sephardi as used in "Sephardi cuisine" refers only to the culinary traditions of those Jews with ancestral origins among the Jews of Spain and Portugal.[citation needed] Both the Jews of the Iberian Peninsula and the pre-existing Jews of Morocco, Tunisia, Algeria, Bulgaria, Turkey, Syria, Egypt, Italy, and Greece into whose communities they settled adapted local dishes to the constraints of the kosher dietary laws.
Since the establishment of a Jewish state and the convergence of Jews from all the globe in Israel, these local cuisines, with all their differences, have come to represent the collection of culinary traditions broadly known as Sephardi cuisine.[citation needed] History Prior to their expulsion in 1492, Sephardic Jews enjoyed a vibrant cultural life in medieval Spain, marked by their integration into both Muslim and Christian societies while maintaining a distinct Jewish identity and developing a rich Jewish culture of their own.[citation needed] This period saw the development of a well-established culinary tradition that not only reflected the broader food culture of medieval Spain but also featured ingredients like eggplant, chard, and chickpeas, which became closely associated with Jews in this area. The Kitāb al-Ṭabikh, a cookbook composed in Al-Andalus during the 12th or 13th centuries, includes six explicitly Jewish recipes. It also features an early version of mofletta, a sweet pancake dish still enjoyed by Sephardic Moroccan Jews during Mimouna, as well as a possibly early version of challah bread, which may have traveled with Jews from Spain to Central Europe and subsequently influenced Ashkenazi cuisine. Sephardic Jewish cuisine underwent significant changes following the expulsion of Jews from Spain in 1492, representing a pivotal moment for Sephardic Jews, who were faced with the choice of converting to Christianity or fleeing their homes. Many resettled across the Mediterranean, with a significant number finding refuge in the Ottoman Empire. This migration posed considerable challenges, including the disruption of established communal structures and institutions. According to Sara Gardner, during this period, Sephardic women played a crucial role in preserving cultural identity, especially as communal institutions collapsed. The domestic sphere, traditionally overseen by women, became the focal point for maintaining religious and cultural practices, including culinary traditions. Upon their settlement in the Ottoman Empire, Sephardic Jews began the process of recreating their Spanish culinary heritage despite the lack of familiar ingredients and cooking methods. Sephardic women were instrumental in this process, modifying their recipes to incorporate new local ingredients while maintaining traditional dishes. Foods such as adafina, a traditional Shabbat stew; almodrote, a casserole made with eggplant and cheese; and biscochos, cakes made with ground nuts and eggs—once used to identify crypto-Jews in Spain during the Spanish Inquisition—were reintroduced in their new Ottoman homes. The integration of Sephardic Jews into Ottoman society led to a fusion of Sephardic and Ottoman culinary styles. Sephardic women adapted local ingredients and techniques, resulting in the creation of new dishes such as tishpishti, a semolina cake soaked in syrup, and pishkado ahilado, a stew of fried fish with tomato sauce. Other ingredients known from Spain were also available in the new home. Eggplant, a key ingredient associated with Jews in Spain, remained a hallmark of Sephardic cuisine in the Ottoman Empire. The classic Judaeo-Spanish koplas song "Siete modos de gizar la berendgena"[a] lists various methods of preparing eggplant popular among Jews in Ottoman lands, alongside several eggplant dishes influenced by local cuisine. The incorporation of Ottoman culinary methods, such as the use of filo pastry, facilitated innovations such as bourekas. 
This adaptation illustrates the evolution of Sephardic cuisine within its new context, retaining elements of its Iberian origins while reflecting both continuity and change across the Mediterranean. Sephardic Jews also settled in various regions worldwide, developing distinct cuisines influenced by local ingredients and cooking methods. For instance, some Jews who fled the Inquisition settled in Recife, Brazil, where their cuisine incorporated local ingredients such as molasses, rum, sugar, vanilla, chocolate, bell peppers, corn, tomatoes, kidney beans, string beans, and turkey.[citation needed] In 1654, 23 Sephardic Jews arrived in New Amsterdam (present-day New York), bringing this cuisine with them to the early colonial United States. Early American Jewish cuisine was heavily influenced by this branch of Sephardic cuisine. Many of the recipes were bound up in observance of traditional holidays and remained true to their origins. These included dishes such as stewed fish and fish fried in olive oil, beef and bean stews, almond puddings, and egg custards. The first kosher cookbook in America was the Jewish Cookery Book by Esther Levy, published in Philadelphia in 1871, which includes many of these traditional recipes. Sephardic Jews arriving in Jerusalem from Ottoman lands during the 17th and 18th centuries introduced their cuisine to the city. Consequently, the cuisine of Jerusalem is predominantly Sephardic, featuring dishes such as slow-cooked meat stews, stuffed vegetables, and a variety of savory pastries, including pastelitos, borekitas, and biscochos. This cuisine has also integrated with other local culinary traditions, incorporating elements from Levantine Arab, Ashkenazi, and Kurdish Jewish practices.

Cuisine basics
Sephardi cuisine emphasizes salads, stuffed vegetables and vine leaves, olive oil, lentils, fresh and dried fruits, herbs and nuts, and chickpeas. Meat dishes often make use of lamb or ground beef. Fresh lemon juice is added to many soups and sauces. Many meat and rice dishes incorporate dried fruits such as apricots, prunes and raisins. Pine nuts are used as a garnish. In the early days, Sephardic cuisine was influenced by the local cuisines of Spain and Portugal under both Catholic and Islamic regimes. A particular affinity for exotic foods from outside Spain became apparent under Muslim rule, as is evident even today in the ingredients brought to the peninsula by the Muslims. Cumin, cilantro, and turmeric are very common in Sephardi cooking. Caraway and capers were brought to Spain by the Muslims and are featured in the cuisine. Cardamom (hel) is used to flavor coffee. Chopped fresh cilantro and parsley are popular garnishes. Chopped mint is added to salads and cooked dishes, and fresh mint leaves (nana) are served in tea. Cinnamon is sometimes used as a meat seasoning, especially in dishes made with ground meat. Saffron, which is grown in Spain, is used in many varieties of Sephardic cooking, as are spices found in the areas where Sephardim settled. Tiny cups of Turkish coffee, sometimes spiced with cardamom, are often served at the end of a festive meal, accompanied by small portions of baklava or other pastries dipped in syrup or honey. Hot sahlab, a liquid cornstarch pudding originally flavored with orchid powder (today invariably replaced by artificial flavorings), is served in cups as a winter drink, garnished with cinnamon, nuts, coconut and raisins. Arak is the preferred alcoholic beverage.
Rose water is a common ingredient in cakes and desserts. Malabi, a cold cornstarch pudding, is sprinkled with rose water and red syrup.

Shabbat and holiday dishes
As cooking on Shabbat is prohibited, Sephardi Jews, like their Ashkenazi counterparts, developed slow-cooked foods that would simmer on a low flame overnight and be ready for eating the next day. One such slow-cooked food was ropa vieja. The oldest name of the dish is chamin (from the Hebrew word cham, meaning "hot"), but it has several other names. When the Sephardic Jews were expelled from Spain in 1492, many fled to northwestern Africa across the Strait of Gibraltar. There the hamin was adapted to local ingredients and became known in Morocco as dafina ("covered"). Any favorite vegetables can be added, and the eggs can be removed and eaten at any time. Its Ashkenazi counterpart is called shalet or cholent. Shavfka is another Sephardi dish that has an Ashkenazi counterpart, namely kugel. Bourekas and bulemas are often served on Shabbat morning. Pestelas and pastelikos, sesame-topped pastries filled with pine nuts, meat and onion, are also traditionally eaten at that time. Sambusak, a semicircular pocket of dough filled with mashed chickpeas, fried onions and spices, is likewise associated with Sephardic Jewish cuisine. According to Gil Marks, an Israeli food historian, sambusak has been a traditional part of the Sephardic Sabbath meal since the 13th century. At the beginning of the evening meals of Rosh Hashana, it is traditional to eat foods symbolic of a good year and to recite a short prayer beginning with the Hebrew words yehi ratson ("May it be Your will") over each one, with the name of the food in Hebrew or Aramaic often presenting a play on words. The foods eaten at this time have thus become known as yehi ratsones, and they are often served together on a large platter called a yehi ratson platter. It is also common to symbolize a year filled with blessings by eating stuffed foods on Rosh Hashana, such as a stuffed, roasted bird or a variety of stuffed vegetables called legumbres yaprakes. Sephardic Hanukkah dishes include cassola (sweet cheese pancakes), bimuelos (puffed fritters with an orange glaze), keftes de espinaka (spinach patties), keftes de prasa (leek patties) and shamlias (fried pastry frills). Sephardi and Ashkenazi cooking differ substantially on Passover due to rabbinic rulings that permit Sephardim to consume kitniyot, a category of foods forbidden to Ashkenazi Jews. Sephardi Jews prepare charoset, one of the symbolic foods eaten at the Passover seder, from different ingredients: whereas charoset in Ashkenazi homes is a blend of chopped apples and nuts spiced with wine and cinnamon, Sephardi charoset is based on raisins or dates and is generally much thicker in consistency. Mina (known as scacchi in Italy) is a Passover meat or vegetable pie made with a matzo crust. A challah bread traditionally made for Shavuot by Sephardic women is siete cielos, meaning "seven heavens" in Ladino. The name refers to the belief that seven celestial spheres opened when the Ten Commandments were given. The siete cielos bread has a central orb representing Mount Sinai surrounded by seven dough rings symbolising the seven heavens. These rings are adorned with small dough sculptures, including representations of Miriam's well, the Ten Commandments, an open Torah scroll, a dove symbolising the Jewish people, and the copper serpent staff created by Moses as a sign of repentance and healing.
Food in Sephardic popular culture
Michal Held Delaroza documented several food-related sayings from the old Sephardic Jewish community of Jerusalem. One such saying was: "The holidays of the Christians are in the fields, the holidays of the Muslims are in the tombs, and the holidays of the Jews are in the pots" (Arabic: أعياد المسيحيين بالظهور، أعياد المسلمين بالقبور، أعياد اليهود بالقدور, romanized: A'yad al-Masihiyyin bil-zuhur, a'yad al-Muslimin bil-qubur, a'yad al-Yahud bil-qudur). Another saying was: El mundo es un bizcocho, ke ayen ke lo kome crudo, ke ayen ke lo kome kocho, meaning: "The world is a cake; those who want to eat it raw, let them eat it raw; those who want to eat it cooked, let them eat it cooked." Additionally, the phrase despues de Purim, platicos (after Purim, mishloach manot) serves as a metaphor for something done after it has already lost its value or significance.