[SOURCE: https://github.com/features/models] | [TOKENS: 667] |
Your AI toolbox, built into GitHub. Models, prompts, evals, and more. Everything you need to go from idea to shipped—without ever leaving GitHub. Pick the right model, fast: run side-by-side evaluations to compare outputs from industry-leading models in real time. No guesswork, just better results. Manage prompts like code: version, share, and reuse your prompts across projects. Treat AI inputs as first-class development assets, just like your source code. Secure by design: control which models your team can use, keep data and prompts private, and ensure everything runs within GitHub and Azure infrastructure. Building with AI, made easy: build, test, and ship AI—right from your GitHub workflow. Make direct API calls or integrate with the Azure AI SDK or any supported model SDK. Store, manage, and collaborate on AI prompts just like code, with built-in tools to track changes, preview diffs, and roll back anytime. Build faster with structured evaluations: score outputs on quality, relevance, or any metric you define, using custom evaluators or LLMs as judges. Turn prompt editing into a team sport: built on trusted pull request workflows, the natural-language prompt editor makes it easy for anyone to improve prompt quality. From idea to deployment, GitHub Spark and GitHub Models let you move fast with the right model for the job. Go from prototype to production in a snap. Get started with GitHub Models: learn how to set up, test, compare, and securely deploy with GitHub Models. If you want to use GitHub Models beyond the free usage included in your account, you can choose to opt in to paid usage. Browse and try out different models from top providers. GitHub Models brings AI directly into the developer workflow by providing access to multiple leading models through a single API key. It allows teams to manage prompts as code, run side-by-side model evaluations, and move from testing to production within the same environment they already use. GitHub Models is a separate product, outside of GitHub Copilot. GitHub Models is free for everyone to get started building AI with and can be used directly within GitHub. GitHub Models includes a playground where you can explore a curated selection of AI models from providers like OpenAI, Meta, and Microsoft. There you can experiment with prompts, tweak parameters (such as temperature or max tokens), and see how different models respond, all in real time. You can bring your own API keys (BYOK) from different providers, such as OpenAI or Azure AI. Model inference runs directly through your provider, and usage is billed and tracked through your provider account. View the GitHub BYOK documentation. Billing for GitHub Models is designed to be flexible and to allow you to use your preferred model providers, while also providing the ability to control your spending. Model usage is powered by the Azure OpenAI Service and billed through GitHub using the same global pay-as-you-go pricing as Azure OpenAI Service. Learn more about billing for GitHub Models. |
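The page above describes making direct API calls to GitHub Models through a single API key. As a hedged illustration only (the inference endpoint URL, the model identifier, and the use of the OpenAI-compatible Python client are assumptions drawn from GitHub's public documentation, not from this page), a minimal call might look like this:

```python
# Minimal sketch: calling a model hosted by GitHub Models through an
# OpenAI-compatible inference endpoint, authenticated with a GitHub token.
# The base_url and model id below are assumptions for illustration only.
import os
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://models.github.ai/inference",  # assumed GitHub Models endpoint
    api_key=os.environ["GITHUB_TOKEN"],             # a GitHub personal access token
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # illustrative model id; pick one from the catalog
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what GitHub Models offers."},
    ],
    temperature=0.7,
    max_tokens=200,
)
print(response.choices[0].message.content)
```

Because the endpoint is model-agnostic, swapping the model string is enough to run the kind of side-by-side comparison the page describes.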
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Sexual_abuse] | [TOKENS: 1765] |
Contents Sexual abuse Sexual abuse or sex abuse is abusive sexual behavior by one person upon another. It is often perpetrated using physical force, or by taking advantage of another. It often consists of a persistent pattern of sexual assaults. The offender is referred to as a sexual abuser. Live streaming sexual abuse involves trafficking and coerced sexual acts, or rape, in real time on webcam. Molestation refers to an instance of sexual assault, especially when perpetrated against a child. The perpetrator is called (often pejoratively)[failed verification] a molester. The term also covers behavior by an adult or older adolescent towards a child to sexually stimulate any of the involved. The use of a child for sexual stimulation is referred to as child sexual abuse and, for pubescent or post-pubescent individuals younger than the age of consent, statutory rape. Sexual abuse can be perpetrated against other vulnerable populations like the elderly, a form of elder abuse, or those with developmental disabilities. Victims Spousal sexual abuse is a form of domestic violence. When the abuse involves threats of unwanted sexual contact or forced sex by a woman's husband or ex-husband, it may constitute rape, depending on the jurisdiction, and may also constitute an assault. Child sexual abuse is a form of child abuse in which a child is abused for the sexual gratification of an adult or older adolescent. It includes direct sexual contact, the adult or otherwise older person engaging indecent exposure (of the genitals, female nipples, etc.) to a child with intent to gratify their own sexual desires or to intimidate or groom the child, asking or pressuring a child to engage in sexual activities, displaying pornography to a child, or using a child to produce child pornography. Effects of child sexual abuse include shame, self-blame, depression, anxiety, post-traumatic stress disorder, self-esteem issues, sexual dysfunction, chronic pelvic pain, addiction, self-injury, suicidal ideation, borderline personality disorder, and propensity to re-victimization in adulthood. Child sexual abuse is a risk factor for attempting suicide. Additionally, some studies have shown childhood sexual abuse to be a risk factor of the perpetration of intimate partner violence in men. Much of the harm caused to victims becomes apparent years after the abuse happens. With specific regard to addiction, a study by Reiger et al. supports previous findings that adverse life events increase sensitivity to drug rewards and bolster drug reward signaling by exposing an association between heightened limbic response to cocaine cues. Sexual abuse by a family member is a form of incest, which can result in severe long-term psychological trauma, especially in the case of parental incest. Globally, approximately 18–19% of women and 8% of men disclose being sexually abused during their childhood. The gender gap may be caused by higher victimization of girls, lower willingness of men to disclose abuse, or both. Most sexual abuse offenders are acquainted with their victims; approximately 30% are relatives of the child, most often fathers, uncles or cousins; around 60% are other acquaintances such as friends of the family, babysitters, or neighbors; strangers are the offenders in approximately 10% of child sexual abuse cases. Most child sexual abuse is committed by men; women commit approximately 14% of offenses reported against boys and 6% of offenses reported against girls. 
Child sexual abuse offenders are not pedophiles unless they have a primary or exclusive sexual interest in prepubescent children. People with developmental disabilities are often victims of sexual abuse. According to research, people with disabilities are at a greater risk for victimization of sexual assault or sexual abuse because of lack of understanding (Sobsey & Varnhagen, 1989). Elderly people, especially those with dementia, can be at risk of abuse. There were over 6,000 "safeguarding concerns and alerts" at UK care homes from 2013 to 2015. These included alleged inappropriate touching and worse allegations. Offenders were most often other residents but staff also offended. It is suspected some care homes may deliberately overlook these offenses. People in poverty, including those from developing countries, are vulnerable to forced prostitution, live streaming sexual abuse, and other forms of molestation. Victims who come from families in poverty often have less connections, power, protection, and education about sex crimes. Sexual abuse is a problem in some minority communities. In 2007, a number of Hispanic victims were included in the settlement of a massive sexual abuse case involving the Los Angeles archdiocese of the Catholic Church. A qualitative study by Kim et al. discusses the experiences of sexual abuse in the US population of Mexican immigrant women, citing immigration, acculturation, and several other social elements as risk factors for abuse. The experience of forced strip searches can be experienced as a traumatic event similarly to that of rape, especially when combined with habitual body cavity searches. The prevalence of CCTV in modern correctional facilities and the generally indiscreet nature of strip searches, often with a number of prison guards observing, usually adds to the experienced humiliation. Strip searches are often arbitrarily used under various pretences, when the actual ambition is to assert control and predominance as well as to intimidate the subjected prison inmates. Captive breeding activities are sometimes described as sexual abuse. People for the Ethical Treatment of Animals (PETA) has specifically objected, for example, to SeaWorld's breeding of orcas (Orcinus orca). Captive breeding of animals led to the idea of capturing and enslaving women for involuntary breeding according to Charles Patterson. Treatment In the emergency department, contraceptive medications are offered to women raped by men because about 5% of such rapes result in pregnancy. Preventative medication against sexually transmitted infections are given to victims of all types of sexual abuse (especially for the most common diseases like chlamydia, gonorrhea, trichomoniasis and bacterial vaginosis) and a blood serum is collected to test for STIs (such as HIV, hepatitis B and syphilis). Any survivor with abrasions are immunized for tetanus if 5 years have elapsed since the last immunization. Short-term treatment with a benzodiazepine may help with acute anxiety and antidepressants may be helpful for symptoms of PTSD, depression and panic attacks. Sexual abuse has been linked to the development of psychotic symptoms in abused children. Treatment for psychotic symptoms may also be involved in sexual abuse treatment. In regards to long term psychological treatment, prolonged exposure therapy has been tested as a method of long-term PTSD treatment for victims of sexual abuse. 
Prevention Child sexual abuse prevention programmes were developed in the United States of America during the 1970s and originally delivered to children. Programmes delivered to parents were developed in the 1980s and took the form of one-off meetings, two to three hours long. In the last 15 years, web-based programmes have been developed. Survivor The term survivor is sometimes used for a living victim, including victims of non-fatal harm, to honor and empower the strength of an individual to heal, in particular a living victim of sexual abuse or assault. For example, there are the Survivors Network of those Abused by Priests and The Survivors Trust. Positions of power Sexual misconduct can occur where one person uses a position of authority to compel another person to engage in an otherwise unwanted sexual activity. For example, sexual harassment in the workplace might involve an employee being coerced into a sexual situation out of fear of being dismissed. Sexual harassment in education might involve a student submitting to the sexual advances of a person in authority in fear of being punished, for example by being given a failing grade. Several sexual abuse scandals have involved religious abuse or religious settings and often cover-up among non-abusers, including cases in the Southern Baptist Convention, Catholic Church, Episcopalian religion, Islam, Jehovah's Witnesses, Lutheran church, Methodist Church, Anabaptist/Mennonite Church, The Church of Jesus Christ of Latter-day Saints, the Fundamentalist Church of Jesus Christ of Latter Day Saints, Orthodox Judaism, other branches of Judaism, various buddhist schools such as Zen and Tibetan, Yoga classes, and various cults. Social media Due to social media censorship algorithms, people wishing to discuss sex and particular sexual assault have adopted the 'algospeak' code word 'mascara' to refer to a boyfriend or romantic partner in a sexual context and then proceed to euphemistically describe bad experiences. The use of such code language can also lead to confusion and embarrassment for those who are unfamiliar with the intended meaning. Animals Sexual abuse has been identified among animals as well, for example, among the Adélie penguins. See also References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Theano_(software)] | [TOKENS: 475] |
Contents Theano (software) Theano is a Python library and optimizing compiler for manipulating and evaluating mathematical expressions, especially matrix-valued ones. In Theano, computations are expressed using a NumPy-esque syntax and compiled to run efficiently on either CPU or GPU architectures. History Theano is an open source project primarily developed by the Montreal Institute for Learning Algorithms (MILA) at the Université de Montréal. The name of the software references the ancient philosopher Theano, long associated with the development of the golden mean. On 28 September 2017, Pascal Lamblin posted a message from Yoshua Bengio, Head of MILA: major development would cease after the 1.0 release due to competing offerings by strong industrial players. Theano 1.0.0 was then released on 15 November 2017. On 17 May 2018, Chris Fonnesbeck wrote on behalf of the PyMC development team that the PyMC developers would officially assume control of Theano maintenance once the MILA development team stepped down. On 29 January 2021, they started using the name Aesara for their fork of Theano. On 29 November 2022, the PyMC development team announced that the PyMC developers would fork the Aesara project under the name PyTensor. Sample code The following code is Theano's original example. It defines a computational graph with 2 scalars a and b of type double and an operation between them (addition) and then creates a Python function f that does the actual computation. Examples The following code demonstrates how to perform matrix multiplication using Theano, which is essential for linear algebra operations in many machine learning tasks. The following code uses Theano to compute the gradient of a simple operation (like a neuron) with respect to its input. This is useful in training machine learning models (backpropagation). The following code shows how to start building a simple neural network. This is a very basic neural network with one hidden layer. The following code demonstrates how broadcasting works in Theano. Broadcasting allows operations between arrays of different shapes without needing to explicitly reshape them. See also References External links |
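The code samples referenced in the article were not captured in this extract. The following is a hedged reconstruction of the scalar-addition example described above (two double-precision scalars a and b, compiled into a Python function f), together with a simple gradient computation, based on Theano's documented API; the exact variable names are illustrative.

```python
# Reconstruction of Theano's introductory example: declare two symbolic
# double-precision scalars, define their sum, and compile the graph into
# a callable Python function with theano.function.
import theano
from theano import tensor

a = tensor.dscalar()             # symbolic float64 scalar
b = tensor.dscalar()
c = a + b                        # symbolic expression, not yet evaluated
f = theano.function([a, b], c)   # compile the computational graph
print(f(1.5, 2.5))               # -> array(4.0)

# Gradient of a logistic "neuron" output with respect to its input,
# the kind of derivative used in backpropagation.
x = tensor.dscalar("x")
y = 1 / (1 + tensor.exp(-x))     # logistic activation
dy_dx = tensor.grad(y, x)        # symbolic derivative dy/dx
grad_f = theano.function([x], dy_dx)
print(grad_f(0.0))               # -> 0.25 at x = 0
```

A matrix-multiplication or broadcasting example follows the same pattern: declare symbolic tensors (e.g. tensor.dmatrix()), combine them with tensor.dot or elementwise operators, and compile with theano.function.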
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/ELAN_(programming_language)] | [TOKENS: 259] |
Contents ELAN (programming language) ELAN is an interpreted educational programming language for learning and teaching systematic programming. (Note: In May 2023 design commenced on a new programming language named 'Elan', also designed for teaching and learning programming in schools, but it has no historical connection to the 'ELAN' language described here.) It was developed in 1974 by C.H.A. Koster and a group at Technische Universität Berlin as an alternative to BASIC in teaching, and approved for use in secondary schools in Germany by the "Arbeitskreis Schulsprache". It was in use until the late 1980s in a number of schools in Germany, Belgium, the Netherlands, and Hungary for informatics teaching in secondary education, and used at the Radboud University Nijmegen in the Netherlands for teaching systematic programming to students from various disciplines and in teacher courses. The language design focuses strongly on structured programming, and has a special construction for stepwise refinement, allowing students to focus on top-down design and bottom-up coding. The microkernel operating system Eumel began as a runtime system (environment) for ELAN. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Fresco] | [TOKENS: 4565] |
Contents Fresco Fresco (pl. frescos or frescoes) is a technique of mural painting executed upon freshly laid ("wet") lime plaster. Water is used as the vehicle for the dry-powder pigment to merge with the plaster, and with the setting of the plaster, the painting becomes an integral part of the wall. The word fresco (Italian: affresco) is derived from the Italian adjective fresco meaning "fresh", and may thus be contrasted with fresco-secco or secco mural painting techniques, which are applied to dried plaster, to supplement painting in fresco. The fresco technique has been employed since antiquity and is closely associated with Italian Renaissance painting. The word fresco is commonly and inaccurately used in English to refer to any wall painting regardless of the plaster technology or binding medium. This, in part, contributes to a misconception that the most geographically and temporally common wall painting technology was the painting into wet lime plaster. Even in apparently buon fresco technology, the use of supplementary organic materials was widespread, if underrecognized. Technology Buon fresco pigment is mixed with room temperature water and is used on a thin layer of wet, fresh plaster, called the intonaco (after the Italian word for plaster). Because of the chemical makeup of the plaster, a binder is not required, as the pigment mixed solely with the water will sink into the intonaco, which itself becomes the medium holding the pigment. The pigment is absorbed by the wet plaster; after a number of hours, the plaster dries in reaction to air: it is this chemical reaction which fixes the pigment particles in the plaster. The chemical processes are as follows: limestone is first burnt to quicklime (CaCO3 → CaO + CO2), the quicklime is slaked with water to form calcium hydroxide (CaO + H2O → Ca(OH)2), and as the plaster sets, the calcium hydroxide reacts with carbon dioxide from the air to re-form calcium carbonate (Ca(OH)2 + CO2 → CaCO3 + H2O), binding the pigment into the wall. In painting buon fresco, a rough underlayer called the arriccio is added to the whole area to be painted and allowed to dry for some days. Many artists sketched their compositions on this underlayer, which would never be seen, in a red pigment called sinopia, a name also used to refer to these under-paintings. Later,[when?] new techniques for transferring paper drawings to the wall were developed. The main lines of a drawing made on paper were pricked over with a point, the paper held against the wall, and a bag of soot (spolvero) banged on them to produce black dots along the lines. If the painting was to be done over an existing fresco, the surface would be roughened to provide better adhesion. On the day of painting, the intonaco, a thinner, smooth layer of fine plaster, was added to the amount of wall that was expected to be completed that day, sometimes matching the contours of the figures or the landscape, but more often just starting from the top of the composition. This area is called the giornata ("day's work"), and the different day stages can usually be seen in a large fresco, by a faint seam that separates one from the next. Buon frescoes are difficult to create because of the deadline associated with the drying plaster. Generally, a layer of plaster will require ten to twelve hours to dry; ideally, an artist would begin to paint after one hour and continue until two hours before the drying time—giving seven to nine hours' working time. Once a giornata is dried, no more buon fresco can be done, and the unpainted intonaco must be removed with a tool before starting again the next day. If mistakes have been made, it may also be necessary to remove the whole intonaco for that area—or to change them later, a secco.
An indispensable component of this process is the carbonatation of the lime, which fixes the colour in the plaster ensuring durability of the fresco for future generations. A technique used in the popular frescoes of Michelangelo and Raphael was to scrape indentations into certain areas of the plaster while still wet to increase the illusion of depth and to accent certain areas over others. The eyes of the people of the School of Athens are sunken-in using this technique which causes the eyes to seem deeper and more pensive. Michelangelo used this technique as part of his trademark 'outlining' of his central figures within his frescoes. In a wall-sized fresco, there may be ten to twenty or even more giornate, or separate areas of plaster. After five centuries, the giornate, which were originally nearly invisible, have sometimes become visible, and in many large-scale frescoes, these divisions may be seen from the ground. Additionally, the border between giornate was often covered by an a secco painting, which has since fallen off. One of the first painters in the post-classical period to use this technique was the Isaac Master (or Master of the Isaac fresco, and thus a name used to refer to the unknown master of a particular painting) in the Upper Basilica of Saint Francis in Assisi. A person who creates fresco is called a frescoist. Other types of wall painting A secco or fresco-secco painting is done on dry plaster (secco meaning "dry" in Italian). The pigments thus require a binding medium, such as egg (tempera), glue or oil to attach the pigment to the wall. It is important to distinguish between a secco work done on top of buon fresco, which according to most authorities was in fact standard from the Middle Ages onwards, and work done entirely a secco on a blank wall. Generally, buon fresco works are more durable than any a secco work added on top of them, because a secco work lasts better with a roughened plaster surface, whilst true fresco should have a smooth one. The additional a secco work would be done to make changes, and sometimes to add small details, but also because not all colours can be achieved in true fresco, because only some pigments work chemically in the very alkaline environment of fresh lime-based plaster. Blue was a particular problem, and skies and blue robes were often added a secco, because neither azurite blue nor lapis lazuli, the only two blue pigments then available, works well in wet fresco. It has also become increasingly clear, thanks to modern analytical techniques, that even in the early Italian Renaissance painters quite frequently employed a secco techniques so as to allow the use of a broader range of pigments. In most early examples this work has now entirely vanished, but a whole painting done a secco on a surface roughened to give a key for the paint may survive very well, although damp is more threatening to it than to buon fresco. A third type called a mezzo-fresco is painted on nearly dry intonaco—firm enough not to take a thumb-print, says the sixteenth-century author Ignazio Pozzo—so that the pigment only penetrates slightly into the plaster. By the end of the sixteenth century this had largely displaced buon fresco, and was used by painters such as Giovanni Battista Tiepolo or Michelangelo. This technique had, in reduced form, the advantages of a secco work. 
The three key advantages of work done entirely a secco were that it was quicker, mistakes could be corrected, and the colours varied less from when applied to when fully dry—in wet fresco there was a considerable change. For wholly a secco work, the intonaco is laid with a rougher finish, allowed to dry completely and then usually given a key by rubbing with sand. The painter then proceeds much as he or she would on a canvas or wood panel. History The first known Egyptian fresco was found in Tomb 100 at Hierakonpolis, and dated to c. 3500–3200 BC. Several of the themes and designs visible in the fresco are otherwise known from other Naqada II objects, such as the Gebel el-Arak Knife. It shows the scene of a "Master of Animals", a man fighting against two lions, individual fighting scenes, and Egyptian and foreign boats. Ancient Egyptians painted many tombs and houses, but those wall paintings are not frescoes. An old fresco from Mesopotamia is the Investiture of Zimri-Lim (modern Syria), dating from the early 18th century BC. The oldest frescoes done in the buon fresco method date from the first half of the second millennium BCE during the Bronze Age and are to be found among Aegean civilizations, more precisely Minoan art from the island of Crete and other islands of the Aegean Sea. The most famous of these[citation needed], the Bull-Leaping Fresco, depicts a sacred ceremony in which individuals jump over the backs of large bulls. The oldest surviving Minoan frescoes are found on the island of Santorini (classically known as Thera), dated to the Neo-Palatial period (c. 1640–1600 BC).[citation needed] While some similar frescoes have been found in other locations around the Mediterranean basin, particularly in Egypt and Morocco, their origins are subject to speculation. Some art historians believe that fresco artists from Crete may have been sent to various locations as part of a trade exchange, a possibility which raises to the fore the importance of this art form within the society of the times. The most common form of fresco was Egyptian wall paintings in tombs, usually using the a secco technique.[citation needed] Frescoes were also painted in ancient Greece, but few of these works have survived. In southern Italy, at Paestum, which was a Greek colony of the Magna Graecia, a tomb containing frescoes dating back to 470 BC, the so-called Tomb of the Diver, was discovered in June 1968. These frescoes depict scenes of the life and society of ancient Greece, and constitute valuable historical testimonials. One shows a group of men reclining at a symposium, while another shows a young man diving into the sea. Etruscan frescoes, dating from the 4th century BC, have been found in the Tomb of Orcus near Veii, Italy. The richly decorated Thracian frescoes of the Tomb of Kazanlak are dating back to 4th century BC, making it a UNESCO protected World Heritage Site. Roman wall paintings, such as those at the magnificent Villa dei Misteri (1st century BC) in the ruins of Pompeii, and others at Herculaneum, were completed in buon fresco. Roman (Christian) frescoes from the 1st to 2nd centuries AD were found in catacombs beneath Rome, and Byzantine icons were also found in Cyprus, Crete, Ephesus, Cappadocia, and Antioch. Roman frescoes were done by the artist painting the artwork on the still damp plaster of the wall, so that the painting is part of the wall, actually colored plaster. Also a historical collection of Ancient Christian frescoes can be found in the Churches of Göreme. 
Thanks to a large number of ancient rock-cut cave temples, valuable ancient and early medieval frescoes have been preserved in more than 20 locations of India. The frescoes on the ceilings and walls of the Ajanta Caves were painted between c. 200 BC and 600 AD and are the oldest known frescoes in India. They depict the Jataka tales that are stories of the Buddha's life in former existences as Bodhisattva. The narrative episodes are depicted one after another although not in a linear order. Their identification has been a core area of research on the subject since the time of the site's rediscovery in 1819. Other locations with valuable preserved ancient and early medieval frescoes include Bagh Caves, Ellora Caves, Sittanavasal, Armamalai Cave, Badami Cave Temples and other locations. Frescoes have been made in several techniques, including the tempera technique. The later Chola paintings were discovered in 1931 within the circumambulatory passage of the Brihadisvara Temple in India and are the first Chola specimens discovered. Researchers have discovered the technique used in these frescos. A smooth batter of limestone mixture was applied over the stones, which took two to three days to set. Within that short span, such large paintings were painted with natural organic pigments. During the Nayak period, the Chola paintings were painted over. The Chola frescos lying underneath have an ardent spirit of Saivism expressed in them. They probably synchronised with the completion of the temple by Rajaraja Cholan the Great. The frescoes in Dogra/Pahari style paintings exist in their unique form at Sheesh Mahal of Ramnagar (105 km from Jammu and 35 km west of Udhampur). Scenes from the epics of Mahabharat and Ramayan along with portraits of local lords form the subject matter of these wall paintings. Rang Mahal of Chamba (Himachal Pradesh) is another site of historic Dogri fresco, with wall paintings depicting scenes of Draupadi Cheer Haran and Radha-Krishna Leela. These can be seen preserved at the National Museum, New Delhi, in a chamber called Chamba Rang Mahal. During the Mughal era, frescos were used to decorate interior walls and the insides of dome ceilings. The Sigiriya Frescoes, found in Sigiriya in Sri Lanka, were painted during the reign of King Kashyapa I (ruled 477–495 AD). The generally accepted view is that they are portrayals of women of the royal court of the king depicted as celestial nymphs showering flowers upon the humans below. They bear some resemblance to the Gupta style of painting found in the Ajanta Caves in India. They are, however, far more enlivened and colorful and uniquely Sri Lankan in character. While some scholars contend that these frescos are the only surviving secular art from antiquity found in Sri Lanka today, others argue that they are Buddhist in nature (potentially representing goddesses from Tusita heaven). The painting technique used on the Sigiriya paintings is "fresco lustro". It varies slightly from the pure fresco technique in that it also contains a mild binding agent or glue. This gives the painting added durability, as clearly demonstrated by the fact that they have survived, exposed to the elements, for over 1,500 years. Located in a small sheltered depression a hundred meters above the ground, only 19 survive today. Ancient references, however, refer to the existence of as many as five hundred of these frescoes.
The late Medieval period and the Renaissance saw the most prominent use of fresco, particularly in Italy, where most churches and many government buildings still feature fresco decoration. This change coincided with the reevaluation of murals in the liturgy. Romanesque churches in Catalonia were richly painted in the 12th and 13th centuries, with both decorative and educational roles (the latter for the illiterate faithful), as can be seen in the MNAC in Barcelona, where a large collection of Catalan Romanesque art is kept. In Denmark too, church wall paintings or kalkmalerier were widely used in the Middle Ages (first Romanesque, then Gothic) and can be seen in some 600 Danish churches as well as in churches in the south of Sweden, which was Danish at the time. Fresco painting continued into the Baroque in southern Europe, for churches and especially palaces. Gianbattista Tiepolo was arguably the last major exponent of this tradition, with huge schemes for palaces in Madrid and Würzburg in Germany. Northern Romania (historical region of Moldavia) boasts about a dozen painted monasteries, completely covered with frescos inside and out, that date from the last quarter of the 15th century to the second quarter of the 16th century. The most remarkable are the monastic foundations at Voroneţ (1487), Arbore (1503), Humor (1530), and Moldoviţa (1532). Suceviţa, dating from 1600, represents a late return to the style developed some 70 years earlier. The tradition of painted churches continued into the 19th century in other parts of Romania, although never to the same extent. Henri Clément Serveau produced several frescos, including a three-by-six-meter painting for the Lycée de Meaux, where he was once a student. He directed the École de fresques at l'École nationale supérieure des beaux-arts, and decorated the Pavillon du Tourisme at the 1937 Exposition Internationale des Arts et Techniques dans la Vie Moderne (Paris), Pavillon de la Ville de Paris; now at the Musée d'Art Moderne de la Ville de Paris. In 1954 he realized a fresco for the Cité Ouvrière du Laboratoire Débat, Garches. He also executed mural decorations for the Plan des anciennes enceintes de Paris in the Musée Carnavalet. The Foujita chapel in Reims, completed in 1966, is an example of modern frescos, the interior being painted with religious scenes by the School of Paris painter Tsuguharu Foujita. In 1996, it was designated an historic monument by the French government. José Clemente Orozco, Fernando Leal, David Siqueiros, and Diego Rivera, the famous Mexican artists, renewed the art of fresco painting in the 20th century. Orozco, Siqueiros, Rivera and his wife Frida Kahlo contributed more to the history of Mexican fine arts and to the reputation of Mexican art in general than anybody else. Channeling pre-Columbian Mexican artworks including the true frescoes at Teotihuacan, Orozco, Siqueiros, Rivera, and Fernando Leal established the art movement known as Mexican Muralism. There have been comparatively few frescoes created since the 1960s, but there are some significant exceptions. The American artist Brice Marden's monochrome works, first shown in 1966 at Bykert Gallery, New York, were inspired by frescos and "watching masons plastering stucco walls." While Marden employed the imagistic effects of fresco, David Novros was developing a 50-year practice around the technique. David Novros is an American painter and a muralist of geometric abstraction.
In 1968, Donald Judd commissioned Novros to create a work at 101 Spring Street, New York, soon after he had purchased the building. Novros used medieval techniques to create the mural by "first preparing a full-scale cartoon, which he transferred to the wet plaster using the traditional pouncing technique," the act of passing powdered pigment onto the plaster through tiny perforations in a cartoon. The surface unity of the fresco was important to Novros in that the pigment he used bonded with the drying plaster, becoming part of the wall rather than a surface coating. This site-specific work was Novros's first true fresco, which was restored by the artist in 2013. The American painter James Hyde first presented frescoes in New York at the Esther Rand Gallery, Tompkins Square Park, in 1985. At that time Hyde was using true fresco technique on small panels made of cast concrete arranged on the wall. Throughout the next decade Hyde experimented with multiple rigid supports for the fresco plaster, including composite board and plate glass. In 1991, at John Good Gallery in New York City, Hyde debuted true fresco applied on an enormous block of Styrofoam. Holland Cotter of the New York Times described the work as "objectifying some of the individual elements that have made modern paintings paintings." While Hyde's work "ranges from paintings on photographic prints to large-scale installations, photography, and abstract furniture design", his frescoes on Styrofoam have been a significant form of his work since the 1980s. The frescoes have been shown throughout Europe and the United States. In Artforum, David Pagel wrote, "like ruins from some future archaeological dig, Hyde's nonrepresentational frescoes on large chunks of Styrofoam give suggestive shape to the fleeting landscape of the present." Over the technique's long history, practitioners of fresco have always taken a careful, methodical approach. Hyde's frescoes, by contrast, are done improvisationally. The contemporary disposability of the Styrofoam structure contrasts with the permanence of the classical fresco technique. In 1993, Hyde mounted four automobile-sized frescoes on Styrofoam suspended from a brick wall. Progressive Insurance commissioned this site-specific work for the monumental 80-foot atrium in its headquarters in Cleveland, Ohio. Selected examples of frescoes Conservation of frescoes The climate and environment of Venice have proved to be a problem for frescoes and other works of art in the city for centuries. The city is built on a lagoon in northern Italy. The humidity and the rise of water over the centuries have created a phenomenon known as rising damp. As the lagoon water rises and seeps into the foundation of a building, the water is absorbed and rises up through the walls, often causing damage to frescoes. Venetians have become quite adept at the conservation of frescoes. The mold Aspergillus versicolor can grow after flooding, consuming nutrients from frescoes. The following is the process that was used when rescuing frescoes in La Fenice, a Venetian opera house, but the same process can be used for similarly damaged frescoes. First, a protection and support bandage of cotton gauze and polyvinyl alcohol is applied. Difficult sections are removed with soft brushes and localized vacuuming. The other areas that are easier to remove (because they had been damaged by less water) are removed with a paper pulp compress saturated with bicarbonate of ammonia solutions and rinsed with deionized water.
These sections are strengthened and reattached, then cleansed with base exchange resin compresses, and the wall and pictorial layer are strengthened with barium hydrate. The cracks and detachments are stopped with lime putty and injected with an epoxy resin loaded with micronized silica. Gallery See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Polish_joke] | [TOKENS: 1059] |
Contents Polish joke A Polish joke is an English-language ethnic joke deriding Polish people, based on derogatory stereotypes. The Polish joke belongs in the category of conditional jokes, whose full understanding requires the audience to have prior knowledge of what a Polish joke is. As with all discriminatory jokes, Polish jokes depend on the listener's preconceived notions and antipathies. The relation between the internalized derogatory stereotypes about Polish people, and the persistence of ethnic jokes about them, is not easy to trace, though the jokes seem to be understood by many who hear them. Sometimes an offensive term for a Pole, such as Polack, is used in the joke. Example: History Some early 20th-century Polish jokes may have been told originally before World War II in disputed border regions such as Silesia, suggesting that Polish jokes did not originate in Nazi Germany but rather much earlier as an outgrowth of regional jokes rooted in historical discrimination of Poles in German-ruled areas, at least from the 18th-century Partitions of Poland, and actively pursued from the end of the 19th century by the government-backed German Eastern Marches Society, resulting in social class differences. Nonetheless, these jokes were later fuelled by ethnic slurs disseminated by German warlords and National Socialist propaganda that attempted to justify Nazi crimes against ethnic Poles by representing Poles as dirty and relegating them as inferior on the basis of their not being German. Polish Americans became the subject of derogatory jokes at the time when Polish immigrants moved to the United States in considerable numbers fleeing mass persecution at home perpetrated under Prussian and Russian rule. They took the only jobs available to them, usually requiring physical labor. The same job-related stereotypes persisted even as Polish Americans joined the middle class in the mid 20th century. During the Cold War era, despite the sympathy in the US for Poland being subjected to communism, negative stereotypes about Polish Americans endured, mainly because of Hollywood/TV media involvement. Some Polish jokes were brought to the United States by German displaced persons fleeing war-torn Europe in the late 1940s. During the political transformations of the Soviet controlled Eastern bloc in the 1980s, the much earlier German anti-Polish sentiment—dating at least to the policies of Otto von Bismarck and the persecution of Poles under the German Empire—was revived in East Germany against Solidarność (Solidarity). Polish jokes became common, reminding some of the spread of such jokes under the Nazis. According to Christie Davies, American versions of Polish jokes are an unrelated "purely American phenomenon" and do not express the "historical Old World hatreds". Researchers of the Polish American Journal argue instead that Nazi and Soviet propaganda shaped the perception of Poles. Negative stereotypes Debate continues whether the early Polish jokes brought to states like Wisconsin by German immigrants were directly related to the wave of American jokes of the early 1960s. Since the late 1960s, Polish American organizations made continuous efforts to challenge the negative stereotyping of Polish people once prevalent in the US media. In the 1960s and 70s, television shows such as All in the Family, The Tonight Show, and Laugh-In often used jokes perceived by American Poles as demeaning. The Polish jokes heard in the 1970s led the Polish Ministry of Foreign Affairs to approach the U.S. 
State Department to complain, a move that ultimately had no effect. The 2010 documentary film Polack by James Kenney explores the source of the Polish joke in America, tracing it through history and into contemporary politics. The depiction of Polish Americans in the play Polish Joke by David Ives has resulted in a number of complaints by the Polonia in the United States. The book Hollywood's War with Poland shows how Hollywood's World War II (and onwards) negative portrayal of Polish people as being "backward" helped condition the American people to see Polish people as having inferior intelligence. The book supports the Polish American Journal's assertion that Hollywood historically was fertile ground for anti-Polish prejudice, based on Hollywood's left-wing and Soviet sympathies. The Polish American Congress Anti-Bigotry Committee was created in the early 1980s to fight anti-Polish sentiment, expressed for example in Polish jokes. Notable public cases include protests against the use of Polish jokes by Drew Carey (early 2000s) and Jimmy Kimmel (2013), both on the ABC network. In the 1990s, popular culture in Germany experienced a surge of Polish jokes. In their television shows, entertainers such as Harald Schmidt and Thomas Koschwitz made jokes about the Polish economy and about increased automobile thefts in Germany that were attributed to Poles. The Bild tabloid employed stereotypical headlines about Poland. This triggered public outrage among German and Polish intellectuals, and in the latter half of the decade, fears of theft even led to a decrease in German tourists visiting Poland. The largest share of foreign tourists in Poland, exceeding 1.3 million annually, arrives from Germany. In recent decades, it has been observed that the public image of Poland in Germany itself was largely shaped by stereotypical jokes. See also Notes References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Twisted_(software)] | [TOKENS: 956] |
Contents Twisted (software) Twisted is an event-driven network programming framework written in Python and licensed under the MIT License. Twisted projects variously support TCP, UDP, SSL/TLS, IP multicast, Unix domain sockets, many protocols (including HTTP, XMPP, NNTP, IMAP, SSH, IRC, FTP, and others), and much more. Twisted is based on the event-driven programming paradigm, which means that users of Twisted write short callbacks which are called by the framework. Core ideas Twisted is designed for complete separation between logical protocols (usually relying on stream-based connection semantics, such as HTTP or POP3) and transport layers supporting such stream-based semantics (such as files, sockets or SSL libraries). Connection between a logical protocol and a transport layer happens at the last possible moment — just before information is passed into the logical protocol instance. The logical protocol is informed of the transport layer instance, and can use it to send messages back and to check for the peer's identity. Note that it is still possible, in protocol code, to deeply query the transport layer on transport issues (such as checking a client-side SSL certificate). Naturally, such protocol code will fail (raise an exception) if the transport layer does not support such semantics. Central to the Twisted application model is the concept of a deferred (elsewhere called a future). A deferred is an instance of a class designed to receive and process a result which has not been computed yet, for example because it is based on data from a remote peer. Deferreds can be passed around, just like regular objects, but cannot be asked for their value. Each deferred supports a callback chain. When the deferred gets the value, it is passed to the functions on the callback chain, with the result of each callback becoming the input for the next. Deferreds make it possible to operate on the result of a function call before its value has become available. For example, if a deferred returns a string from a remote peer containing an IP address in quad format, a callback can be attached to translate it into a 32-bit number. Any user of the deferred can now treat it as a deferred returning a 32-bit number. This, and the related ability to define "errbacks" (callbacks which are called as error handlers), allows code to specify in advance what to do when an asynchronous event occurs, without stopping to wait for the event. In non-event-driven systems, for example using threads, the operating system incurs premature and additional overhead organizing threads each time a blocking call is made. Twisted supports an abstraction over raw threads — using a thread as a deferred source. Thus, a deferred is returned immediately, which will receive a value when the thread finishes. Callbacks can be attached which will run in the main thread, thus alleviating the need for complex locking solutions. A prime example of such usage, which comes from Twisted's support libraries, is using this model to call into databases. The database call itself happens on a foreign thread, but the analysis of the result happens in the main thread. Twisted can integrate with foreign event loops, such as those of GTK+, Qt and Cocoa (through PyObjC). This allows using Twisted as the network layer in graphical user interface (GUI) programs, using all of its libraries without adding a thread-per-socket overhead, as using Python's native library would. 
A full-fledged web server can be integrated in-process with a GUI program using this model, for example. Applications using Twisted Nevow (pronounced like the French nouveau) is a Python web application framework originally developed by the company Divmod. Template substitution is achieved via a small Tag Attribute Language, which is usually embedded in on-disk XML templates, though there is also a pure-Python domain-specific language called Stan, for expressing this markup programmatically. Nevow integrates well with Twisted. Nevow was deployed on several high-profile web sites, most notably the official Python site. As of mid-2010, Divmod went out of business, causing development work on Nevow to all but cease, and in 2011 its homepage was no longer accessible. There is a project on Launchpad, hosting the source code of Divmod including the source code of the Nevow project. Athena is a Nevow component which facilitates bi-directional, asynchronous communication between the Python and JavaScript portions of a web application in the form of remote procedure calls. This technique is typically called Ajax or Comet, though Nevow's implementation predates both of these labels. Athena also includes an inheritance-based JavaScript object system, which forms the basis of a client-side widget abstraction, module system and in-browser unit testing kit. See also References External links |
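To make the deferred and callback-chain model described above concrete, here is a minimal sketch using Twisted's public API (deferToThread, addCallback, addErrback). The blocking lookup function and the dotted-quad-to-integer conversion are illustrative stand-ins echoing the article's IP-address example; they are not part of Twisted itself.

```python
# A blocking call is pushed to a thread pool with deferToThread; the result
# arrives later through a Deferred, and each callback's return value feeds
# the next callback, all running in the main reactor thread.
from twisted.internet import reactor
from twisted.internet.threads import deferToThread

def fetch_ip_string():
    # Stand-in for a blocking operation (e.g. a database query or remote lookup).
    return "192.0.2.10"

def to_int(ip):
    # Callback: translate a dotted-quad string into a 32-bit number.
    a, b, c, d = (int(part) for part in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

def report(value):
    print("IP as 32-bit number:", value)
    reactor.stop()

def failed(failure):
    # Errback: invoked if any earlier step in the chain raises an exception.
    print("lookup failed:", failure)
    reactor.stop()

d = deferToThread(fetch_ip_string)  # returns a Deferred immediately
d.addCallback(to_int)               # result of each callback feeds the next
d.addCallback(report)
d.addErrback(failed)

reactor.run()
```

Because the Deferred is returned before the thread finishes, the calling code specifies in advance what to do with the eventual result (or error) instead of blocking on it, which is exactly the property the article attributes to deferreds.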
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Confederate_States_of_America] | [TOKENS: 20119] |
Contents Confederate States of America The Confederate States of America (CSA), also known as the Confederate States (C.S.), the Confederacy, or the South, was an unrecognized breakaway republic in the Southern United States from 1861 to 1865. It comprised eleven U.S. states that declared secession: South Carolina, Mississippi, Florida, Alabama, Georgia, Louisiana, Texas, Virginia, Arkansas, Tennessee, and North Carolina. These states fought against the United States during the American Civil War. With Abraham Lincoln's election as President of the United States in 1860, eleven southern states believed their slavery-dependent plantation economies were threatened, and seven initially seceded from the United States. The Confederacy was formed on February 8, 1861, by South Carolina, Mississippi, Florida, Alabama, Georgia, Louisiana, and Texas. They adopted a new constitution establishing a confederation government of "sovereign and independent states". The federal government in Washington D.C. and states under its control were known as the Union. The Civil War began in April 1861, when South Carolina's militia attacked Fort Sumter. Four slave states of the Upper South—Virginia, Arkansas, Tennessee, and North Carolina—then seceded and joined the Confederacy. In February 1862, Confederate States Army leaders installed a centralized federal government in Richmond, Virginia, and enacted the first Confederate draft on April 16, 1862. By 1865, the Confederacy's federal government dissolved into chaos, and the Confederate States Congress adjourned, effectively ceasing to exist as a legislative body on March 18. After four years of heavy fighting, most Confederate land and naval forces either surrendered or otherwise ceased hostilities by May 1865. The most significant capitulation was Confederate general Robert E. Lee's surrender on April 9, after which any doubt about the war's outcome or the Confederacy's survival was extinguished. After the war, during the Reconstruction era, the Confederate states were readmitted to Congress after each ratified the Thirteenth Amendment to the U.S. Constitution, which outlawed slavery, "except as a punishment for crime". Lost Cause mythology, an idealized view of the Confederacy valiantly fighting for a just cause, emerged in the decades after the war among former Confederate generals and politicians, and in organizations such as the United Daughters of the Confederacy, Ladies' Memorial Associations, and the Sons of Confederate Veterans. Intense periods of Lost Cause activity developed around the turn of the 20th century and during the civil rights movement of the 1950s and 1960s in reaction to growing support for racial equality. Advocates sought to ensure future generations of Southern whites would continue to support white supremacist policies such as the Jim Crow laws through activities such as building Confederate monuments and influencing the authors of textbooks. The modern display of the Confederate battle flag primarily started during the 1948 presidential election, when it was used by the pro-segregationist and white supremacist Dixiecrat Party. Origins Historians widely agree that the preservation of the institution of slavery was the principal aim of the eleven Southern states that declared their secession from the United States (the Union) and formed the Confederate States of America. Seven of these states seceded before the outbreak of the American Civil War and four did so after hostilities began. 
While there is broad consensus among 21st-century historians that slavery was central to the conflict, there remains debate over which specific ideological, economic, political, or social factors were most influential and over the reasons why the North rejected the Southern states’ attempt to secede. Proponents of the Lost Cause interpretation, a viewpoint rejected by mainstream historians, deny that slavery was the primary cause of secession—a position contradicted by overwhelming historical evidence, including the secession documents of the states themselves. A central political dispute in the antebellum period concerned whether slavery would be permitted to spread into the Western territories that were destined to become states. Initially, the Congress admitted new states in pairs—one slave and one free—to preserve sectional balance in the Senate, though not in the House of Representatives since free states tended to have larger electorates. By the mid‑19th century, the status of new territories as free or slave had become a defining political issue. Anti‑slavery sentiment was growing in the North, while in the South fear of abolition was intensifying. Another contributing factor was the rise of distinctly white Southern nationalism in preceding decades. The primary reason the North rejected secession was a commitment to preserving the Union, grounded in a sense of American nationalism. Abraham Lincoln won the 1860 presidential election, and his victory prompted declarations of secession by seven slave states of the Deep South. These states—whose cotton‑based economies depended on enslaved labor—formed the Confederate States after Lincoln's election in November 1860 but before he took office in March 1861. Northern nationalists and Southern “Unionists” refused to recognize these declarations. No foreign government ever officially recognized the Confederacy. The U.S. government, under President James Buchanan, did not cede control of federal forts located in territory claimed by the Confederacy. The war began on April 12, 1861, when Confederate forces bombarded the Union garrison at Fort Sumter in the harbor of Charleston, South Carolina. Other background factors contributing to the breakdown of the Union included partisan politics under the Second Party System, the growth of abolitionism, disputes over nullification versus secession, regional nationalisms, expansionism, economic tensions such as the Panic of 1857, and differing paths of modernization in the antebellum period. Although slavery and its related conflicts were the primary cause of the break with the Union, it was the act of disunion itself that sparked the ensuing war. Historian David M. Potter observed: “The problem for Americans who, in the age of Lincoln, wanted slaves to be free was not simply that southerners wanted the opposite, but that they themselves cherished a conflicting value: they wanted the Constitution, which protected slavery, to be honored, and the Union, which was a fellowship with slaveholders, to be preserved. Thus they were committed to values that could not logically be reconciled.” Secession The first secession state conventions from the Deep South sent representatives to the Montgomery Convention in Alabama on February 4, 1861. A provisional government was established. The new provisional Confederate President Jefferson Davis issued a call for 100,000 men from the states' militias to defend the newly formed Confederacy. 
All federal property was seized, including gold bullion and coining dies at the U.S. mints. The Confederate capital was moved from Montgomery to Richmond, Virginia, in May 1861. On February 22, 1862, Davis was inaugurated as president with a term of six years. The Confederate administration pursued a policy of national territorial integrity, continuing earlier state efforts in 1860–1861 to remove U.S. government presence. This included taking possession of U.S. courts, custom houses, post offices, and most notably, arsenals and forts. After the Confederate attack and capture of Fort Sumter in April 1861, Lincoln called up 75,000 of the states' militia to muster under his command. The stated purpose was to re-occupy U.S. properties throughout the South, as the U.S. Congress had not authorized their abandonment. The resistance at Fort Sumter signaled his change of policy from that of the Buchanan Administration. Lincoln's response ignited a firestorm of emotion. The people of both North and South demanded war, with soldiers rushing to their colors in the hundreds of thousands. Secessionists argued that the United States Constitution was a contract among sovereign states that could be abandoned without consultation and each state had a right to secede. After intense debates and statewide votes, seven Deep South states passed secession ordinances by February 1861, while secession efforts failed in the other eight slave states. The Confederacy expanded in May–July 1861 (with Virginia, Arkansas, Tennessee, North Carolina), and disintegrated in April–May 1865. It was formed by delegations from seven slave states of the Lower South that had proclaimed their secession. After the fighting began in April, four additional slave states seceded and were admitted. Later, two slave states (Missouri and Kentucky) and two territories were given seats in the Confederate Congress. Its establishment flowed from and deepened Southern nationalism, which prepared men to fight for "The Southern Cause". This "Cause" included support for states' rights, tariff policy, and internal improvements, but above all, cultural and financial dependence on the South's slavery-based economy. The convergence of race and slavery, politics, and economics raised South-related policy questions to the status of moral questions over, way of life, merging love of things Southern and hatred of things Northern. As the war approached, political parties split, and national churches and interstate families divided along sectional lines. According to historian John M. Coski: The statesmen who led the secession movement were unashamed to explicitly cite the defense of slavery as their prime motive ... Acknowledging the centrality of slavery to the Confederacy is essential for understanding the Confederate. Following South Carolina's unanimous 1860 secession vote, no other Southern states considered the question until 1861; when they did, none had a unanimous vote. All had residents who cast significant numbers of Unionist votes. Voting to remain in the Union did not necessarily mean individuals were sympathizers with the North. Once fighting began, many who voted to remain in the Union accepted the majority decision, and supported the Confederacy. Many writers have evaluated the Civil War as an American tragedy—a "Brothers' War", pitting "brother against brother, father against son, kin against kin of every degree". Initially, some secessionists hoped for a peaceful departure. 
Moderates in the Confederate Constitutional Convention included a provision against importation of slaves from Africa to appeal to the Upper South. Non-slave states might join, but the radicals secured a two-thirds requirement in both houses of Congress to accept them. Seven states declared their secession from the United States before Lincoln took office on March 4, 1861. After the Confederate attack on Fort Sumter April 12, 1861, and Lincoln's subsequent call for troops, four more states declared their secession. Kentucky declared neutrality, but after Confederate troops moved in, the state legislature asked for Union troops to drive them out. Delegates from 68 Kentucky counties were sent to the Russellville Convention that signed an Ordinance of Secession. Kentucky was admitted into the Confederacy on December 10, 1861, with Bowling Green as its first capital. Early in the war, the Confederacy controlled more than half of Kentucky but largely lost control in 1862. The splinter Confederate government of Kentucky relocated to accompany western Confederate armies and never controlled the state population after 1862. By the end of the war, 90,000 Kentuckians had fought for the Union, compared to 35,000 for the Confederacy. In Missouri, a constitutional convention was approved and delegates elected. The convention rejected secession 89–1 on March 19, 1861. The governor maneuvered to take control of the St. Louis Arsenal and restrict federal military movements. This led to a confrontation, and in June federal forces drove him and the General Assembly from Jefferson City. The executive committee of the convention called the members together in July, and declared the state offices vacant and appointed a Unionist interim state government. The exiled governor called a rump session of the former General Assembly together in Neosho and, on October 31, 1861, it passed an ordinance of secession. The Confederate government of Missouri effectively controlled only southern Missouri early in the war. It had its capital at Neosho, then Cassville, before being driven out of the state. For the remainder of the war, it operated as a government in exile at Marshall, Texas. Not having seceded, neither Kentucky nor Missouri was declared in rebellion in Lincoln's Emancipation Proclamation. The Confederacy recognized the pro-Confederate claimants in Kentucky (December 10, 1861) and Missouri (November 28, 1861) and laid claim to those states, granting them congressional representation and adding two stars to the Confederate flag. Voting for the representatives was done mostly by Confederate soldiers from Kentucky and Missouri. Some Southern Unionists blamed Lincoln's call for troops as the precipitating event for the second wave of secessions. Historian James McPherson argues such claims have "a self-serving quality" and regards them as misleading: As the telegraph chattered reports of the attack on Sumter April 12 and its surrender next day, huge crowds poured into the streets of Richmond, Raleigh, Nashville, and other upper South cities to celebrate this victory over the Yankees. These crowds waved Confederate flags and cheered the glorious cause of southern independence. They demanded that their own states join the cause. Scores of demonstrations took place from April 12 to 14, before Lincoln issued his call for troops. Many conditional unionists were swept along by this powerful tide of southern nationalism; others were cowed into silence. Historian Daniel W. 
Crofts disagrees with McPherson: The bombardment of Fort Sumter, by itself, did not destroy Unionist majorities in the upper South. Because only three days elapsed before Lincoln issued the proclamation, the two events viewed retrospectively, appear almost simultaneous. Nevertheless, close examination of contemporary evidence ... shows that the proclamation had a far more decisive impact.... Many concluded ... that Lincoln had deliberately chosen "to drive off all the Slave states, in order to make war on them and annihilate slavery". Richard N. Current concluded: In short, it appears that Lincoln, when he decided to send the Sumter expedition, considered hostilities to be probable. It also appears, however, that he believed an unopposed and peaceable provisioning to be at least barely possible.... He thought hostilities would be the likely result, and he was determined that, if they should be, they must clearly be initiated by the Confederates. "To say that Lincoln meant that the first shot would be fired by the other side if a first shot was fired, ... is not to say that he maneuvered to have the first shot fired." The order of secession resolutions and dates are: In Virginia, the populous counties along the Ohio and Pennsylvania borders rejected the Confederacy. Unionists held a Convention in Wheeling in June 1861, establishing a "restored government" with a rump legislature, but sentiment in the region remained deeply divided. In the 50 counties that would make up the state of West Virginia, voters from 24 counties had voted for disunion in Virginia's May 23 referendum on the ordinance of secession. In the 1860 election "Constitutional Democrat" Breckenridge had outpolled "Constitutional Unionist" Bell in the 50 counties by 1,900 votes, 44% to 42%. The counties simultaneously supplied over 20,000 soldiers to each side of the conflict. Representatives for most counties were seated in both state legislatures at Wheeling and at Richmond for the duration of the war. Attempts to secede from the Confederacy by counties in East Tennessee were checked by martial law. Although slaveholding Delaware and Maryland did not secede, citizens exhibited divided loyalties. Regiments of Marylanders fought in Lee's Army of Northern Virginia. Overall, 24,000 men from Maryland joined Confederate forces, compared to 63,000 who joined Union forces. Delaware never produced a full regiment for the Confederacy, but neither did it emancipate slaves as did Missouri and West Virginia. District of Columbia citizens made no attempts to secede and through the war, referendums sponsored by Lincoln approved compensated emancipation and slave confiscation from "disloyal citizens". Citizens at Mesilla and Tucson in the southern part of New Mexico Territory formed a secession convention, which voted to join the Confederacy on March 16, 1861, and appointed Dr. Lewis S. Owings as the new territorial governor. They won the Battle of Mesilla and established a territorial government with Mesilla serving as its capital. The Confederacy proclaimed the Confederate Arizona Territory on February 14, 1862, north to the 34th parallel. Marcus H. MacWillie served in both Confederate Congresses as Arizona's delegate. In 1862, the Confederate New Mexico campaign to take the northern half of the U.S. territory failed and the Confederate territorial government in exile relocated to San Antonio, Texas. 
Confederate supporters in the trans-Mississippi west claimed portions of the Indian Territory after the US evacuated the federal forts and installations. Over half of the American Indian troops participating in the War from the Indian Territory supported the Confederacy. On July 12, 1861, the Confederate government signed a treaty with both the Choctaw and Chickasaw Indian nations. After several battles, Union armies took control of the territory. The Indian Territory never formally joined the Confederacy, but did receive representation in the Congress. Many Indians from the Territory were integrated into regular Confederate Army units. After 1863, the tribal governments sent representatives to the Confederate Congress: Elias Cornelius Boudinot representing the Cherokee and Samuel Benton Callahan representing the Seminole and Creek. The Cherokee Nation aligned with the Confederacy. They practiced and supported slavery, opposed abolition, and feared their lands would be seized by the Union. After the war, the Indian Territory was disestablished, their black slaves were freed, and the tribes lost some of their lands. Montgomery, Alabama, served as capital of the Confederate States from February 4 until May 29, 1861, in the Alabama State Capitol. Six states created the Confederacy there on February 8, 1861. The Texas delegation was seated at the time, so it is counted in the "original seven" states of the Confederacy; it had no roll call vote until after its referendum made secession "operative". The Permanent Constitution was adopted there on March 12, 1861. The permanent capital provided for in the Confederate Constitution called for a state cession of a 100 square mile district to the central government. Atlanta, which had not yet supplanted Milledgeville, Georgia, as its state capital, put in a bid noting its central location and rail connections, as did Opelika, Alabama, noting its strategically interior situation, rail connections and deposits of coal and iron. Richmond, Virginia, was chosen for the interim capital at the Virginia State Capitol. The move was used by Vice President Stephens and others to encourage other border states to follow Virginia into the Confederacy. In the political moment it was a show of "defiance and strength". The war for Southern independence was surely to be fought in Virginia, but it also had the largest Southern military-aged white population, with infrastructure, resources, and supplies. The Davis Administration's policy was that "It must be held at all hazards." The naming of Richmond as the new capital took place on May 30, 1861, and the last two sessions of the Provisional Congress were held there. As war dragged on, Richmond became crowded with training and transfers, logistics and hospitals. Prices rose dramatically despite government efforts at price regulation. A movement in Congress argued for moving the capital from Richmond. At the approach of federal armies in mid-1862, the government's archives were readied for removal. As the Wilderness Campaign progressed, Congress authorized Davis to remove the executive department and call Congress to session elsewhere in 1864 and again in 1865. Shortly before the end of the war, the Confederate government evacuated Richmond, planning to relocate further south. Little came of these plans before Lee's surrender. Davis and most of his cabinet fled to Danville, Virginia, which served as their headquarters for eight days. 
Diplomacy

During its four years, the Confederacy asserted its independence and appointed dozens of diplomatic agents abroad. None were recognized by a foreign government. The US government regarded the Southern states as being in rebellion or insurrection and so refused any formal recognition of their status. The US government never declared war on those "kindred and countrymen" in the Confederacy but conducted its military efforts beginning with a presidential proclamation issued April 15, 1861. It called for troops to recapture forts and suppress what Lincoln later called an "insurrection and rebellion". Mid-war parleys between the two sides occurred without formal political recognition, though the laws of war predominantly governed military relationships on both sides of the conflict. Once war with the United States began, the Confederacy pinned its hopes for survival on military intervention by the UK or France. The Confederate government sent James M. Mason to London and John Slidell to Paris. On their way in 1861, the U.S. Navy intercepted their ship, the Trent, and took them to Boston, an international episode known as the Trent Affair. The diplomats were eventually released and continued their voyage. However, their mission was unsuccessful; historians judge their diplomacy as poor. Neither secured diplomatic recognition for the Confederacy, much less military assistance. The Confederates who had believed that "cotton is king", that is, that Britain had to support the Confederacy to obtain cotton, proved mistaken. The British had stocks to last over a year and had been developing alternative sources. The United Kingdom took pride in leading the end of the transatlantic slave trade. By 1833, the Royal Navy patrolled Middle Passage waters to prevent slave ships from reaching the Western Hemisphere. It was in London that the first World Anti-Slavery Convention had been held in 1840. Black abolitionist speakers toured England, Scotland, and Ireland, exposing the reality of America's chattel slavery and rebutting the Confederate position that blacks were "unintellectual, timid, and dependent", and "not equal to the white man...the superior race." Frederick Douglass, Henry Highland Garnet, Sarah Parker Remond, her brother Charles Lenox Remond, James W. C. Pennington, Martin Delany, Samuel Ringgold Ward, and William G. Allen all spent years in Britain, where fugitive slaves were safe and, as Allen said, there was an "absence of prejudice against color. Here the colored man feels himself among friends, and not among enemies". Most British public opinion was against slavery, with Liverpool seen as the primary base of Southern support. Throughout the early years of the war, British foreign secretary Lord John Russell, Emperor Napoleon III of France, and, to a lesser extent, British Prime Minister Lord Palmerston, showed interest in recognition of the Confederacy or at least mediation of the war. Chancellor of the Exchequer William Gladstone attempted unsuccessfully to convince Palmerston to intervene. By September 1862, the Union victory at the Battle of Antietam, Lincoln's preliminary Emancipation Proclamation, and abolitionist opposition in Britain put an end to these possibilities. The cost to Britain of a war with the U.S. would have been high: the immediate loss of American grain shipments, the end of British exports to the U.S., and seizure of billions of pounds invested in American securities.
War would have meant higher taxes in Britain, another invasion of Canada, and attacks on the British merchant fleet. In mid-1862, fears of a race war (like the Haitian Revolution of 1791–1804) led to the British considering intervention for humanitarian reasons. John Slidell, the Confederate States emissary to the French Empire, succeeded in negotiating a loan of $15,000,000 from Erlanger and other French capitalists for ironclad warships and military supplies. The British government allowed the construction of blockade runners in Britain. Most of these were owned and operated by British financiers and shipowners, though a few were owned and operated by the Confederacy. The goal of the British investors was to acquire highly profitable cotton. Several European nations maintained diplomats in place who had been appointed to the U.S., but no country appointed any diplomat to the Confederacy. Those nations recognized the Union and Confederate sides as belligerents. In 1863, the Confederacy expelled European diplomatic missions for advising their resident subjects to refuse to serve in the Confederate army. Both Confederate and Union agents were allowed to work openly in British territories. The Confederacy appointed Ambrose Dudley Mann as special agent to the Holy See in September 1863, but the Holy See never released a statement supporting or recognizing the Confederacy. In November 1863, Mann met Pope Pius IX and received a letter supposedly addressed "to the Illustrious and Honorable Jefferson Davis, President of the Confederate States of America"; Mann had mistranslated the address. In his report to Richmond, Mann claimed a great diplomatic achievement for himself, but Confederate Secretary of State Judah P. Benjamin told Mann it was "a mere inferential recognition, unconnected with political action or the regular establishment of diplomatic relations" and thus did not assign it the weight of formal recognition. Nevertheless, the Confederacy was seen internationally as a serious attempt at nationhood, and European governments sent military observers to assess whether there had been a de facto establishment of independence. These observers included Arthur Lyon Fremantle of the British Coldstream Guards, who entered the Confederacy via Mexico, Fitzgerald Ross of the Austrian Hussars, and Justus Scheibert of the Prussian Army. European travelers visited and wrote accounts for publication. Importantly in 1862, the Frenchman Charles Girard's Seven months in the rebel states during the North American War testified "this government ... is no longer a trial government ... but really a normal government, the expression of popular will". Fremantle went on to write in his book Three Months in the Southern States that he had: not attempted to conceal any of the peculiarities or defects of the Southern people. Many persons will doubtless highly disapprove of some of their customs and habits in the wilder portion of the country; but I think no generous man, whatever may be his political opinions, can do otherwise than admire the courage, energy, and patriotism of the whole population, and the skill of its leaders, in this struggle against great odds. And I am also of opinion that many will agree with me in thinking that a people in which all ranks and both sexes display a unanimity and a heroism which can never have been surpassed in the history of the world, is destined, sooner or later, to become a great and independent nation. 
French Emperor Napoleon III assured Confederate diplomat John Slidell that he would make a "direct proposition" to Britain for joint recognition. The Emperor made the same assurance to British members of Parliament John A. Roebuck and John A. Lindsay. Roebuck in turn publicly prepared a bill to submit to Parliament supporting joint Anglo-French recognition of the Confederacy. "Southerners had a right to be optimistic, or at least hopeful, that their revolution would prevail, or at least endure." Following the disasters at Vicksburg and Gettysburg in July 1863, the Confederates "suffered a severe loss of confidence in themselves" and withdrew into an interior defensive position. By December 1864, Davis considered sacrificing slavery in order to enlist recognition and aid from Paris and London; he secretly sent Duncan F. Kenner to Europe with a message that the war was fought solely for "the vindication of our rights to self-government and independence" and that "no sacrifice is too great, save that of honor". The message stated that if the French or British governments made their recognition conditional on anything at all, the Confederacy would consent to such terms. European leaders all saw that the Confederacy was on the verge of defeat. The Confederacy's biggest foreign policy successes were with Brazil and Cuba, but these had little military import. Brazil represented the "peoples most identical to us in Institutions", in which slavery remained legal until the 1880s and the abolitionist movement was small. Confederate ships were welcome in Brazilian ports. After the war, Brazil was the primary destination of those Southerners who wanted to continue living in a slave society, where, as one immigrant remarked, Confederado slaves were cheap. The Captain-General of Cuba declared in writing that Confederate ships were welcome, and would be protected in Cuban ports. Historians speculate that if the Confederacy had achieved independence, it probably would have tried to acquire Cuba as a base of expansion.

At war

Most soldiers who joined Confederate national or state military units joined voluntarily. Michael Perman (2010) says historians are of two minds on why millions of soldiers seemed so eager to fight, suffer and die over four years: Some historians emphasize that Civil War soldiers were driven by political ideology, holding firm beliefs about the importance of liberty, Union, or state rights, or about the need to protect or to destroy slavery. Others point to less overtly political reasons to fight, such as the defense of one's home and family, or the honor and brotherhood to be preserved when fighting alongside other men. Most historians agree that, no matter what he thought about when he went into the war, the experience of combat affected him profoundly and sometimes affected his reasons for continuing to fight. Civil War historian E. Merton Coulter wrote that for those who would secure its independence, "The Confederacy was unfortunate in its failure to work out a general strategy for the whole war". Aggressive strategy called for offensive force concentration. Defensive strategy sought dispersal to meet demands of locally minded governors. The controlling philosophy evolved into a combination "dispersal with a defensive concentration around Richmond". The Davis administration considered the war purely defensive, a "simple demand that the people of the United States would cease to war upon us". Historian James M.
McPherson is a critic of Lee's offensive strategy: "Lee pursued a faulty military strategy that ensured Confederate defeat". As the Confederate government lost control of territory in campaign after campaign, it was said that "the vast size of the Confederacy would make its conquest impossible". The enemy would be struck down by the same elements which so often debilitated or destroyed visitors and transplants in the South: heat exhaustion, sunstroke, and endemic diseases such as malaria and typhoid. Early in the war, both sides believed that one great battle would decide the conflict; the Confederates won a surprise victory at the First Battle of Bull Run, also known as First Manassas (the name used by Confederate forces). It drove the Confederate people "insane with joy"; the public demanded a forward movement to capture Washington, relocate the Confederate capital there, and admit Maryland to the Confederacy. A council of war by the victorious Confederate generals decided not to advance against larger numbers of fresh federal troops in defensive positions. Davis did not countermand it. Following the Confederate incursion into Maryland halted at the Battle of Antietam in September 1862, generals proposed concentrating forces from state commands to re-invade the north. Nothing came of it. Again in mid-1863, during his incursion into Pennsylvania, Lee requested of Davis that Beauregard simultaneously attack Washington with troops taken from the Carolinas. But the troops there remained in place during the Gettysburg campaign. The eleven states of the Confederacy were outnumbered by the North about four-to-one in military manpower. It was overmatched far more in military equipment, industrial facilities, railroads, and wagons supplying the front. Confederates slowed the Yankee invaders, at heavy cost to the Southern infrastructure. The Confederates burned bridges, laid land mines in the roads, and made harbors, inlets, and inland waterways unusable with sunken mines (called "torpedoes" at the time). Coulter reports: Rangers in twenty to fifty-man units were awarded 50% valuation for property destroyed behind Union lines, regardless of location or loyalty. As Federals occupied the South, objections by loyal Confederates concerning Ranger horse-stealing and indiscriminate scorched earth tactics behind Union lines led to Congress abolishing the Ranger service two years later. The Confederacy relied on external sources for war materials. The first came from trade with the enemy. "Vast amounts of war supplies" came through Kentucky, and thereafter, western armies were "to a very considerable extent" provisioned with illicit trade via federal agents and northern private traders. But that trade was interrupted in the first year of war by Admiral Porter's river gunboats as they gained dominance along navigable rivers north–south and east–west. Overseas blockade running then came to be of "outstanding importance". On April 17, President Davis called on privateer raiders, the "militia of the sea", to wage war on U.S. seaborne commerce. Despite noteworthy effort, over the course of the war the Confederacy proved unable to match the Union in ships and seamanship, materials and marine construction. An inescapable obstacle to success in the warfare of mass armies was the Confederacy's lack of manpower and of sufficient numbers of disciplined, equipped troops in the field at the point of contact with the enemy.
During the winter of 1862–63, Lee observed that none of his famous victories had resulted in the destruction of the opposing army. He lacked reserve troops to exploit an advantage on the battlefield as Napoleon had done. Lee explained, "More than once have most promising opportunities been lost for want of men to take advantage of them, and victory itself had been made to put on the appearance of defeat, because our diminished and exhausted troops have been unable to renew a successful struggle against fresh numbers of the enemy." The military armed forces of the Confederacy comprised three branches: Army, Navy and Marine Corps. On February 28, 1861, the Provisional Confederate Congress established a provisional volunteer army and gave control over military operations and authority for mustering state forces and volunteers to the newly chosen Confederate president, Jefferson Davis. On March 1, 1861, on behalf of the Confederate government, Davis assumed control of the military situation at Charleston, South Carolina, where South Carolina state militia besieged Fort Sumter in Charleston harbor, held by a small U.S. Army garrison. By March 1861, the Provisional Confederate Congress expanded the provisional forces and established a more permanent Confederate States Army. The total strength of the Confederate Army is unknowable because of incomplete and destroyed Confederate records, but estimates range between 750,000 and 1,000,000 troops. This does not include an unknown number of slaves pressed into army tasks, such as the construction of fortifications and defenses or driving wagons. Confederate casualty figures also are incomplete and unreliable, estimated at 94,000 killed or mortally wounded, 164,000 deaths from disease, and between 26,000 and 31,000 deaths in Union prison camps. One incomplete estimate is 194,026. The Confederate military leadership included many veterans from the United States Army and United States Navy who had resigned their federal commissions and were appointed to senior positions. Many had served in the Mexican–American War (including Robert E. Lee and Jefferson Davis), but some, such as Leonidas Polk (who graduated from West Point but did not serve in the Army), had little or no experience. The Confederate officer corps consisted of men from both slave-owning and non-slave-owning families. The Confederacy appointed junior and field grade officers by election from the enlisted ranks. Although no Army service academy was established for the Confederacy, some colleges (such as The Citadel and Virginia Military Institute) maintained cadet corps that trained Confederate military leadership. A naval academy was established at Drewry's Bluff, Virginia, in 1863, but no midshipmen graduated before the Confederacy's end. Most soldiers were white males aged between 16 and 28; half were 23 or older by 1861. The Confederate Army was permitted to disband for two months in early 1862 after its short-term enlistments expired. The majority of those in uniform would not re-enlist after their one-year commitment; thus, on April 16, 1862, the Confederate Congress imposed the first mass conscription on North American territory. (A year later, on March 3, 1863, the United States Congress passed the Enrollment Act.) Rather than a universal draft, the first program was a selective one with physical, religious, professional, and industrial exemptions. These became narrower as the war progressed. Initially substitutes were permitted, but by December 1863 these were disallowed.
In September 1862 the age limit was increased from 35 to 45, and by February 1864, all men under 18 and over 45 were conscripted to form a reserve for state defense inside state borders. By March 1864, the Superintendent of Conscription reported that all across the Confederacy, every officer in constituted authority, man and woman, "engaged in opposing the enrolling officer in the execution of his duties". Although conscription was challenged in the state courts, the Confederate states' supreme courts routinely rejected the challenges. Many thousands of slaves served as personal servants to their owner, or were hired as laborers, cooks, and pioneers. Some freed blacks and men of color served in local state militia units of the Confederacy, primarily in Louisiana and South Carolina, but their officers deployed them for "local defense, not combat". Depleted by casualties and desertions, the military suffered chronic manpower shortages. In early 1865, the Confederate Congress, influenced by General Lee's public support, approved the recruitment of black infantry units. Contrary to Lee's and Davis's recommendations, the Congress refused "to guarantee the freedom of black volunteers". No more than two hundred black combat troops were ever raised. The immediate onset of war meant that it was fought by the "Provisional" or "Volunteer Army". State governors resisted concentrating a national effort. Several wanted a strong state army for self-defense. Others feared large "Provisional" armies answering only to Davis. When the Confederate government's call for 100,000 men was filled, another 200,000 were turned away, because only those enlisting "for the duration", or twelve-month volunteers who brought their own arms or horses, were accepted. It was important to raise troops; it was just as important to provide capable officers to command them. With few exceptions the Confederacy secured excellent general officers. Efficiency in the lower officers was "greater than could have been reasonably expected". As with the Union, political appointees could be indifferent. Otherwise, the officer corps was appointed by governors or elected by the enlisted men of each unit. Promotion to fill vacancies was made internally regardless of merit, even if better officers were immediately available. Anticipating the need for more "duration" men, in January 1862 Congress provided for company-level recruiters to return home for two months, but their efforts met little success on the heels of Confederate battlefield defeats in February. Congress allowed Davis to require numbers of recruits from each governor to supply the volunteer shortfall. States responded by passing their own draft laws. The veteran Confederate army of early 1862 was mostly twelve-month volunteers with terms about to expire. Enlisted reorganization elections disintegrated the army for two months. Officers pleaded with the ranks to re-enlist, but a majority did not. Those remaining elected majors and colonels whose performance led to officer review boards in October. The boards caused a "rapid and widespread" thinning out of 1,700 incompetent officers. Troops thereafter would elect only second lieutenants. In early 1862, the popular press suggested the Confederacy required a million men under arms. But veteran soldiers were not re-enlisting, and earlier secessionist volunteers did not reappear to serve in war. One Macon, Georgia, newspaper asked how two million brave fighting men of the South were about to be overcome by four million northerners who were said to be cowards.
The Confederacy passed the first American law of national conscription on April 16, 1862. White males in the Confederate States aged 18 to 35 were declared members of the Confederate army for three years, and all men then enlisted were extended to a three-year term. They would serve only in units and under officers of their state. Those under 18 and over 35 could substitute for conscripts; in September, those from 35 to 45 became conscripts. The cry of "rich man's war and a poor man's fight" led Congress to abolish the substitute system altogether in December 1863. All principals benefiting earlier were made eligible for service. By February 1864, the age bracket was widened to 17 to 50, with those under 18 and over 45 limited to in-state duty. Confederate conscription was not universal; it was a selective service. The First Conscription Act of April 1862 exempted occupations related to transportation, communication, industry, ministers, teaching and physical fitness. The Second Conscription Act of October 1862 expanded exemptions in industry, agriculture and conscientious objection. Exemption fraud proliferated in medical examinations, army furloughs, churches, schools, apothecaries and newspapers. Rich men's sons were appointed to the socially outcast "overseer" occupation, but the measure was received in the country with "universal odium". The legislative vehicle was the controversial Twenty Negro Law that specifically exempted one white overseer or owner for every plantation with at least 20 slaves. Backpedaling six months later, Congress provided that overseers under 45 could be exempted only if they had held the occupation before the first Conscription Act. The number of officials exempted by the states and appointed through governors' patronage expanded significantly. The Conscription Act of February 1864 "radically changed the whole system" of selection. It abolished industrial exemptions, placing detail authority in President Davis. As the shame of conscription was greater than that of a felony conviction, the system brought in "about as many volunteers as it did conscripts." Many men in otherwise "bombproof" positions were enlisted in one way or another, nearly 160,000 additional volunteers and conscripts in uniform. Still there was shirking. To administer the draft, a Bureau of Conscription was set up to use state officers, as far as state Governors would allow. It had a checkered career of "contention, opposition and futility". Armies appointed alternative military "recruiters" to bring in the out-of-uniform 17–50-year-old conscripts and deserters. Nearly 3,000 officers were tasked with the job. By late 1864, Lee was calling for more troops. "Our ranks are constantly diminishing by battle and disease, and few recruits are received; the consequences are inevitable." By March 1865 conscription was to be administered by generals of the state reserves calling out men over 45 and under 18 years old. All exemptions were abolished. These regiments were assigned to recruit conscripts ages 17–50, recover deserters, and repel enemy cavalry raids. The service retained in home guards men who had lost an arm or a leg. Ultimately, conscription was a failure, and its main value was in goading men to volunteer. The survival of the Confederacy depended on a strong base of civilians and soldiers devoted to victory. The soldiers performed well, though increasing numbers deserted in the last year of fighting, and the Confederacy never succeeded in replacing casualties as the Union could.
The civilians, although enthusiastic in 1861–62, seem to have lost faith in the future of the Confederacy by 1864, and instead looked to protect their homes and communities. As George C. Rable explains, "This contraction of civic vision was more than a crabbed libertarianism; it represented an increasingly widespread disillusionment with the Confederate experiment." The American Civil War broke out in April 1861 with a Confederate victory at the Battle of Fort Sumter in Charleston. In January, President James Buchanan had attempted to resupply the garrison with the steamship Star of the West, but Confederate artillery drove it away. In March, President Lincoln notified South Carolina Governor Pickens that without Confederate resistance to the resupply there would be no military reinforcement without further notice, but Lincoln prepared to force resupply if it were not allowed. Confederate President Davis, in cabinet, decided to seize Fort Sumter before the relief fleet arrived, and on April 12, 1861, General Beauregard forced its surrender. Following Sumter, Lincoln directed states to provide 75,000 militiamen for three months to recapture the Charleston Harbor forts and all other federal property. This emboldened secessionists in Virginia, Arkansas, Tennessee and North Carolina to secede rather than provide troops to march into neighboring Southern states. In May, Federal troops crossed into Confederate territory along the entire border from the Chesapeake Bay to New Mexico. The first battles were Confederate victories at Big Bethel (Bethel Church, Virginia), First Bull Run (First Manassas) in Virginia in July, and Wilson's Creek (Oak Hills) in Missouri in August. At all three, Confederate forces could not follow up their victory due to inadequate supply and shortages of fresh troops to exploit their successes. Following each battle, Federals maintained a military presence and occupied Washington, DC; Fort Monroe, Virginia; and Springfield, Missouri. Both North and South began training up armies for major fighting the next year. Union General George B. McClellan's forces gained possession of much of northwestern Virginia in mid-1861, concentrating on towns and roads; the interior was too large to control and became the center of guerrilla activity. General Robert E. Lee was defeated at Cheat Mountain in September, and no serious Confederate advance in western Virginia occurred until the next year. Meanwhile, the Union Navy seized control of much of the Confederate coastline from Virginia to South Carolina. It took over plantations and the abandoned slaves. Federals there began a war-long policy of burning grain supplies up rivers into the interior wherever they could not occupy. The Union Navy began a blockade of the major southern ports and prepared an invasion of Louisiana to capture New Orleans in early 1862. The victories of 1861 were followed by a series of defeats east and west in early 1862. To restore the Union by military force, the Federal strategy was to (1) secure the Mississippi River, (2) seize or close Confederate ports, and (3) march on Richmond. To secure independence, the Confederate intent was to (1) repel the invader on all fronts, costing him blood and treasure, and (2) carry the war into the North by two offensives in time to affect the mid-term elections. Much of northwestern Virginia was under Federal control. In February and March, most of Missouri and Kentucky were Union "occupied, consolidated, and used as staging areas for advances further South".
Following the repulse of a Confederate counterattack at the Battle of Shiloh, Tennessee, permanent Federal occupation expanded west, south and east. Confederate forces repositioned south along the Mississippi River to Memphis, Tennessee, where, at the naval Battle of Memphis, the Confederate River Defense Fleet was sunk. Confederates withdrew from northern Mississippi and northern Alabama. New Orleans was captured on April 29 by a combined Army-Navy force under U.S. Admiral David Farragut, and the Confederacy lost control of the mouth of the Mississippi River. It had to concede extensive agricultural resources that had supported the Union's sea-supplied logistics base. Although Confederates had suffered major reverses everywhere, as of the end of April the Confederacy still controlled territory holding 72% of its population. Federal forces disrupted Missouri and Arkansas; they had broken through in western Virginia, Kentucky, Tennessee and Louisiana. Along the Confederacy's shores, Union forces had closed ports and made garrisoned lodgments on every coastal Confederate state except Alabama and Texas. Although scholars sometimes assess the Union blockade as ineffectual under international law until the last few months of the war, from the first months it disrupted Confederate privateers, making it "almost impossible to bring their prizes into Confederate ports". British firms such as John Fraser and Company and S. Isaac, Campbell & Company developed small fleets of blockade runners, while the Ordnance Department secured its own blockade runners for dedicated munitions cargoes. During the Civil War fleets of armored warships were deployed for the first time in sustained blockades at sea. After some success against the Union blockade, in March the ironclad CSS Virginia was forced into port and burned by Confederates at their retreat. Despite several attempts mounted from their port cities, CSA naval forces were unable to break the Union blockade. Attempts were made by Commodore Josiah Tattnall III's ironclads from Savannah in 1862 with the CSS Atlanta. Secretary of the Navy Stephen Mallory placed his hopes in a European-built ironclad fleet, but they were never realized. On the other hand, four new English-built commerce raiders served the Confederacy, and several fast blockade runners were sold in Confederate ports. They were converted into commerce-raiding cruisers and manned by their British crews. In the east, Union forces could not close on Richmond. General McClellan landed his army on the Lower Peninsula of Virginia. Lee subsequently ended that threat from the east, then Union General John Pope attacked overland from the north only to be repulsed at Second Bull Run (Second Manassas). Lee's strike north was turned back at Antietam, Maryland; then Union Major General Ambrose Burnside's offensive was disastrously ended at Fredericksburg, Virginia, in December. Both armies then turned to winter quarters to recruit and train for the coming spring. In an attempt to seize the initiative, re-provision, protect farms in mid-growing season, and influence U.S. Congressional elections, two major Confederate incursions into Union territory had been launched in August and September 1862. Both Braxton Bragg's invasion of Kentucky and Lee's invasion of Maryland were decisively repulsed, leaving the Confederacy in control of but 63% of its population. Civil War scholar Allan Nevins argues that 1862 was the strategic high-water mark of the Confederacy.
The failures of the two invasions were attributed to the same irrecoverable shortcomings: lack of manpower at the front, lack of supplies including serviceable shoes, and exhaustion after long marches without adequate food. Also in September Confederate General William W. Loring pushed Federal forces from Charleston, Virginia, and the Kanawha Valley in western Virginia, but lacking reinforcements Loring abandoned his position and by November the region was back in Federal control. The failed Middle Tennessee campaign was ended January 2, 1863, at the inconclusive Battle of Stones River (Murfreesboro), both sides losing the largest percentage of casualties suffered during the war. It was followed by another strategic withdrawal by Confederate forces. The Confederacy won a significant victory April 1863, repulsing the Federal advance on Richmond at Chancellorsville, but the Union consolidated positions along the Virginia coast and the Chesapeake Bay. Without an effective answer to Federal gunboats, river transport and supply, the Confederacy lost the Mississippi River following the capture of Vicksburg, Mississippi, and Port Hudson in July, ending Southern access to the trans-Mississippi West. July brought short-lived counters, Morgan's Raid into Ohio and the New York City draft riots. Robert E. Lee's strike into Pennsylvania was repulsed at Gettysburg, Pennsylvania despite Pickett's famous charge and other acts of valor. Southern newspapers assessed the campaign as "The Confederates did not gain a victory, neither did the enemy." September and November left Confederates yielding Chattanooga, Tennessee, the gateway to the lower south. For the remainder of the war fighting was restricted inside the South, resulting in a slow but continuous loss of territory. In early 1864, the Confederacy still controlled 53% of its population, but it withdrew further to reestablish defensive positions. Union offensives continued with Sherman's March to the Sea to take Savannah and Grant's Wilderness Campaign to encircle Richmond and besiege Lee's army at Petersburg. In April 1863, the C.S. Congress authorized a uniformed Volunteer Navy, many of whom were British. The Confederacy had altogether eighteen commerce-destroying cruisers, which seriously disrupted Federal commerce at sea and increased shipping insurance rates 900%. Commodore Tattnall again unsuccessfully attempted to break the Union blockade on the Savannah River in Georgia with an ironclad in 1863. Beginning in April 1864 the ironclad CSS Albemarle engaged Union gunboats for six months on the Roanoke River in North Carolina. The Federals closed Mobile Bay by sea-based amphibious assault in August, ending Gulf coast trade east of the Mississippi River. In December, the Battle of Nashville ended Confederate operations in the western theater. Large numbers of families relocated to safer places, usually remote rural areas, bringing along household slaves if they had any. Mary Massey argues these elite exiles introduced an element of defeatism into the southern outlook. The first three months of 1865 saw the Federal Carolinas campaign, devastating a wide swath of the remaining Confederate heartland. The "breadbasket of the Confederacy" in the Great Valley of Virginia was occupied by Philip Sheridan. The Union Blockade captured Fort Fisher in North Carolina, and Sherman finally took Charleston, South Carolina, by land attack. The Confederacy controlled no ports, harbors or navigable rivers. Railroads were captured or had ceased operating. 
Its major food-producing regions had been war-ravaged or occupied. Its administration survived in only three pockets of territory holding only one-third of its population. Its armies were defeated or disbanding. At the February 1865 Hampton Roads Conference with Lincoln, senior Confederate officials rejected his invitation to restore the Union with compensation for emancipated slaves. The three pockets of unoccupied Confederacy were southern Virginia—North Carolina, central Alabama—Florida, and Texas, the latter two areas less from any notion of resistance than from the lack of interest of Federal forces in occupying them. The Davis policy was independence or nothing, while Lee's army was wracked by disease and desertion, barely holding the trenches defending Jefferson Davis' capital. The Confederacy's last remaining blockade-running port, Wilmington, North Carolina, was lost. When the Union broke through Lee's lines at Petersburg, Richmond fell immediately. Lee surrendered at Appomattox Court House, Virginia, on April 9, 1865. "The Surrender" marked the end of the Confederacy. The CSS Stonewall sailed from Europe to break the Union blockade in March; on reaching Havana, Cuba, it surrendered. Some high officials escaped to Europe, but President Davis was captured May 10; all remaining Confederate land forces surrendered by June 1865. The U.S. Army took control of the Confederate areas, but peace was subsequently marred by a great deal of local violence, feuding and revenge killings. The last Confederate military unit, the commerce raider CSS Shenandoah, surrendered on November 6, 1865, in Liverpool. Historian Gary Gallagher concluded that the Confederacy capitulated in early 1865 because northern armies crushed "organized southern military resistance". The Confederacy's population, soldier and civilian, had suffered material hardship and social disruption. Jefferson Davis' assessment in 1890 determined, "With the capture of the capital, the dispersion of the civil authorities, the surrender of the armies in the field, and the arrest of the President, the Confederate States of America disappeared ... their history henceforth became a part of the history of the United States."

Government and politics

In February 1861, Southern leaders met in Montgomery, Alabama, to adopt their first constitution, establishing a confederation of "sovereign and independent states" and guaranteeing states the right to a republican form of government. Prior to the adoption of the first Confederate constitution, the independent states were sovereign republics. A second Confederate constitution was written in March 1861, which sought to replace the confederation with a federal government; much of this constitution replicated the United States Constitution verbatim, but it contained several explicit protections of the institution of slavery, including provisions for the recognition and protection of slavery in any territory of the Confederacy. It maintained the ban on international slave-trading, though it made the ban's application explicit to "Negroes of the African race" in contrast to the U.S. Constitution's reference to "such Persons as any of the States now existing shall think proper to admit". It protected the existing internal trade of slaves among slaveholding states. In certain areas, the second Confederate Constitution gave greater powers to the states (or curtailed the powers of the central government more) than the U.S. Constitution of the time did, but in other areas, the states lost rights they had under the U.S.
Constitution. Although the Confederate Constitution, like the U.S. Constitution, contained a commerce clause, the Confederate version prohibited the central government from using revenues collected in one state for funding internal improvements in another state. The Confederate Constitution's equivalent to the U.S. Constitution's general welfare clause prohibited protective tariffs (but allowed tariffs for providing domestic revenue). State legislatures had the power to impeach officials of the Confederate government in some cases. On the other hand, the Confederate Constitution contained a Necessary and Proper Clause and a Supremacy Clause that essentially duplicated the respective clauses of the U.S. Constitution. The Confederate Constitution also incorporated each of the 12 amendments to the U.S. Constitution that had been ratified up to that point. The second Confederate Constitution was adopted on February 22, 1862, one year into the American Civil War, and did not specifically include a provision allowing states to secede; the Preamble spoke of each state "acting in its sovereign and independent character" but also of the formation of a "permanent federal government". During the debates on drafting the Confederate Constitution, one proposal would have allowed states to secede from the Confederacy. The proposal was tabled with only the South Carolina delegates voting in favor of considering the motion. The Confederate Constitution also explicitly denied States the power to bar slaveholders from other parts of the Confederacy from bringing their slaves into any state of the Confederacy or to interfere with the property rights of slave owners traveling between different parts of the Confederacy. In contrast with the secular language of the United States Constitution, the Confederate Constitution overtly asked God's blessing ("... invoking the favor and guidance of Almighty God ..."). Some historians have referred to the Confederacy as a form of Herrenvolk democracy. The Montgomery Convention to establish the Confederacy and its executive met on February 4, 1861. Each state as a sovereignty had one vote, with the same delegation size as it held in the U.S. Congress, and generally 41 to 50 members attended. Offices were "provisional", limited to a term not to exceed one year. One name was placed in nomination for president, one for vice president. Both were elected unanimously, 6–0. Jefferson Davis was elected provisional president. His U.S. Senate resignation speech greatly impressed with its clear rationale for secession and his pleading for a peaceful departure from the Union to independence. Although he had made it known that he wanted to be commander-in-chief of the Confederate armies, when elected, he assumed the office of Provisional President. Three candidates for provisional Vice President were under consideration the night before the February 9 election. All were from Georgia, and the various delegations meeting in different places determined two would not do, so Alexander H. Stephens was elected unanimously provisional Vice President, though with some privately held reservations. Stephens was inaugurated February 11, Davis February 18. Davis and Stephens were elected president and vice president, unopposed on November 6, 1861. They were inaugurated on February 22, 1862. Coulter stated, "No president of the U.S. ever had a more difficult task." Washington was inaugurated in peacetime. Lincoln inherited an established government of long standing. 
The creation of the Confederacy was accomplished by men who saw themselves as fundamentally conservative. Although they referred to their "Revolution", it was in their eyes more a counter-revolution against changes away from their understanding of U.S. founding documents. In Davis' inauguration speech, he explained the Confederacy was not a French-like revolution, but a transfer of rule. The Montgomery Convention had assumed all the laws of the United States until superseded by the Confederate Congress. The Permanent Constitution provided for a President of the Confederate States of America, elected to serve a six-year term but without the possibility of re-election. Unlike the United States Constitution, the Confederate Constitution gave the president the ability to subject a bill to a line item veto, a power also held by some state governors. The Confederate Congress could overturn either the general or the line item vetoes with the same two-thirds votes required in the U.S. Congress. In addition, appropriations not specifically requested by the executive branch required passage by a two-thirds vote in both houses of Congress. The only person to serve as president was Jefferson Davis, as the Confederacy was defeated before the completion of his term. The only two "formal, national, functioning, civilian administrative bodies" in the Civil War South were the Jefferson Davis administration and the Confederate Congresses. The Confederacy was begun by the Provisional Congress in Convention at Montgomery, Alabama on February 28, 1861. The Provisional Confederate Congress was a unicameral assembly; each state received one vote. The Permanent Confederate Congress was elected and began its first session February 18, 1862. The Permanent Congress for the Confederacy followed the United States forms with a bicameral legislature. The Senate had two per state, twenty-six Senators. The House numbered 106 representatives apportioned by free and slave populations within each state. Two Congresses sat in six sessions until March 18, 1865. The political influences of the civilian, soldier vote and appointed representatives reflected divisions of political geography of a diverse South. These in turn changed over time relative to Union occupation and disruption, the war impact on the local economy, and the course of the war. Without political parties, key candidate identification related to adopting secession before or after Lincoln's call for volunteers to retake Federal property. Previous party affiliation played a part in voter selection, predominantly secessionist Democrat or unionist Whig. The absence of political parties made individual roll call voting all the more important, as the Confederate "freedom of roll-call voting [was] unprecedented in American legislative history." Key issues throughout the life of the Confederacy related to (1) suspension of habeas corpus, (2) military concerns such as control of state militia, conscription and exemption, (3) economic and fiscal policy including impressment of slaves, goods and scorched earth, and (4) support of the Jefferson Davis administration in its foreign affairs and negotiating peace. For the first year, the unicameral Provisional Confederate Congress functioned as the Confederacy's legislative branch. 
The Confederate Constitution outlined a judicial branch of the government, but the ongoing war and resistance from states-rights advocates, particularly on the question of whether it would have appellate jurisdiction over the state courts, prevented the creation or seating of the "Supreme Court of the Confederate States". Thus, the state courts generally continued to operate as they had done, simply recognizing the Confederate States as the national government. Confederate district courts were authorized by Article III, Section 1, of the Confederate Constitution, and President Davis appointed judges within the individual states of the Confederate States of America. In many cases, the same US Federal District Judges were appointed as Confederate States District Judges. Confederate district courts began reopening in early 1861, handling many of the same types of cases as before. Prize cases, in which Union ships were captured by the Confederate Navy or raiders and sold through court proceedings, were heard until the blockade of southern ports made this impossible. After a Sequestration Act was passed by the Confederate Congress, the Confederate district courts heard many cases in which enemy aliens (typically Northern absentee landlords owning property in the South) had their property sequestered (seized) by Confederate Receivers. The Confederacy established the Confederate Post Office for mail delivery. One of the first undertakings in establishing the office was the appointment of John H. Reagan as Postmaster General by Jefferson Davis in 1861. Writing in 1906, historian Walter Flavius McCaleb praised Reagan's "energy and intelligence... in a degree scarcely matched by any of his associates". When the war began, the US Post Office briefly delivered mail from the secessionist states. Mail that was postmarked after the date of a state's admission into the Confederacy through May 31, 1861, and bearing US postage was still delivered. After this time, private express companies still managed to carry some of the mail across enemy lines. Later, mail that crossed lines had to be sent by 'Flag of Truce' and was allowed to pass at only two specific points. Mail sent from the Confederacy to the U.S. was received, opened and inspected at Fortress Monroe on the Virginia coast before being passed on into the U.S. mail stream. Mail sent from the North to the South passed at City Point, also in Virginia, where it was inspected before being sent on. With the chaos of the war, a working postal system was more important than ever for the Confederacy. The Civil War had divided family members and friends and consequently letter writing increased dramatically across the entire divided nation, especially to and from the men who were away serving in an army. Mail delivery was also important for the Confederacy for a myriad of business and military reasons. Because of the Union blockade, basic supplies were always in demand and so getting mailed correspondence out of the country to suppliers was imperative to the successful operation of the Confederacy. Volumes of material have been written about the blockade runners who evaded Union ships on blockade patrol, usually at night, and who moved cargo and mail in and out of the Confederate States throughout the course of the war.
Of particular interest to students and historians of the American Civil War are prisoner-of-war mail and blockade mail, as these items were often involved with a variety of military and other wartime activities. The postal history of the Confederacy along with surviving Confederate mail has helped historians document the various people, places and events that were involved in the American Civil War as it unfolded. The Confederacy actively used the army to arrest people suspected of loyalty to the United States. Historian Mark Neely found 4,108 names of men arrested and estimated a much larger total. The Confederacy arrested pro-Union civilians in the South at about the same rate as the Union arrested pro-Confederate civilians in the North. Neely argues: The Confederate citizen was not any freer than the Union citizen – and perhaps no less likely to be arrested by military authorities. In fact, the Confederate citizen may have been in some ways less free than his Northern counterpart. For example, freedom to travel within the Confederate states was severely limited by a domestic passport system. Economy Across the South, widespread rumors predicted the slaves were planning insurrection, causing panic. Patrols were stepped up. The slaves did become increasingly independent and resistant to punishment, but historians agree there were no insurrections. Many slaves became spies for the North, and large numbers ran away to federal lines. According to the 1860 United States census, about 31% of free households in the eleven states that would join the Confederacy owned slaves. The 11 states that seceded had the highest percentage of slaves as a proportion of their population, representing 39% of their total population. The proportions ranged from a majority in South Carolina (57.2%) and Mississippi (55.2%) to about a quarter in Tennessee (24.8%). Lincoln's Emancipation Proclamation on January 1, 1863, legally freed three million slaves in designated areas of the Confederacy. The long-term effect was that the Confederacy could not preserve the institution of slavery and lost the use of the core element of its plantation labor force. Over 200,000 freed slaves were hired by the federal army as teamsters, cooks, launderers and laborers, and eventually as soldiers. Plantation owners, realizing that emancipation would destroy their economic system, sometimes moved their slaves as far as possible out of reach of the Union army. Though the concept was promoted within certain circles of the Union hierarchy during and immediately following the war, no program of reparations for freed slaves was ever attempted. Unlike other Western countries, such as Britain and France, the U.S. government never paid compensation to Southern slave owners for their "lost property". The only place where compensated emancipation was carried out was the District of Columbia. The plantations of the South, with white ownership and an enslaved labor force, produced substantial wealth from cash crops. The South supplied two-thirds of the world's cotton, which was in high demand for textiles, along with tobacco, sugar, and naval stores (such as turpentine). These raw materials were exported to factories in Europe and the Northeast. Planters reinvested their profits in more slaves and fresh land, as cotton and tobacco depleted the soil. There was little manufacturing or mining; shipping was controlled by non-southerners. The plantations that enslaved over three million black people were the principal source of wealth.
Most were concentrated in "black belt" plantation areas (because few white families in the poor regions owned slaves). For decades, there had been widespread fear of slave revolts. During the war, extra men were assigned to "home guard" patrol duty and governors sought to keep militia units at home for protection. Historian William Barney reports, "no major slave revolts erupted during the Civil War." Nevertheless, slaves took the opportunity to enlarge their sphere of independence, and when Union forces were nearby, many ran off to join them. Slave labor was applied in industry in a limited way in the Upper South and in a few port cities. One reason for the regional lag in industrial development was top-heavy income distribution. Mass production requires mass markets, and slaves living in small cabins, using self-made tools and outfitted with one suit of inferior-fabric work clothes each year, did not generate consumer demand to sustain local manufactures of any description in the same way as did a mechanized family farm of free labor in the North. The Southern economy was "pre-capitalist" in that slaves were put to work in the largest revenue-producing enterprises, not free labor markets. That labor system as practiced in the American South encompassed paternalism, whether abusive or indulgent, and that meant labor management considerations apart from productivity. Approximately 85% of both the Northern and Southern white populations lived on family farms, both regions were predominantly agricultural, and mid-century industry in both was mostly domestic. But the Southern economy was pre-capitalist in its overwhelming reliance on the agriculture of cash crops to produce wealth, while the great majority of farmers fed themselves and supplied a small local market. Southern cities and industries grew faster than ever before, but the thrust of the rest of the country's exponential growth was toward urban industrial development along transportation systems of canals and railroads. The South was following the dominant currents of the American economic mainstream, but at a "great distance" as it lagged in the all-weather modes of transportation that brought cheaper, speedier freight shipment and forged new, expanding inter-regional markets. A third respect in which the Southern economy was pre-capitalist relates to the cultural setting. White Southerners did not adopt a work ethic, nor the habits of thrift, that marked the rest of the country. The South had access to the tools of capitalism, but it did not adopt capitalism's culture. The Southern Cause as a national economy in the Confederacy was grounded in "slavery and race, planters and patricians, plain folk and folk culture, cotton and plantations". The Confederacy started its existence as an agrarian economy that exported cotton, and to a lesser extent tobacco and sugarcane, to a world market. Local food production included grain, hogs, cattle, and vegetables. The cash came from exports, but Southerners spontaneously stopped exporting in early 1861 to hasten the impact of "King Cotton", a failed strategy to coerce international support for the Confederacy through its cotton exports. When the blockade was announced, commercial shipping practically ended (because the ships could not get insurance), and only a trickle of supplies came via blockade runners. The cutoff of exports was an economic disaster for the South, rendering useless its most valuable properties: its plantations and their enslaved workers.
Many planters kept growing cotton, which piled up everywhere, but most turned to food production. All across the region, the lack of repair and maintenance wasted away the physical assets. The eleven states had produced $155 million (~$4.4 billion in 2024) in manufactured goods in 1860, chiefly from local gristmills, along with lumber, processed tobacco, cotton goods and naval stores such as turpentine. The main industrial areas were border cities such as Baltimore, Wheeling, Louisville and St. Louis, which were never under Confederate control. The government did set up munitions factories in the Deep South. Combined with captured munitions and those brought in by blockade runners, these kept the armies minimally supplied with weapons. The soldiers suffered from reduced rations, lack of medicines, and the growing shortages of uniforms, shoes and boots. Shortages were much worse for civilians, and the prices of necessities steadily rose. The Confederacy adopted a 15 percent tariff on imports and applied it to goods from all other countries, including the United States. The tariff mattered little; the Union blockade minimized commercial traffic through the Confederacy's ports, and very few people paid taxes on goods smuggled from the North. The Confederate government in its entire history collected only $3.5 million in tariff revenue. The lack of adequate financial resources led the Confederacy to finance the war through printing money, which led to high inflation. The Confederacy underwent an economic revolution by centralization and standardization, but it was too little too late, as its economy was systematically strangled by blockade and raids. In peacetime, the South's extensive and connected systems of navigable rivers and coastal access allowed for cheap and easy transportation of agricultural products. The railroad system in the South had developed as a supplement to the navigable rivers to enhance the all-weather shipment of cash crops to market. Railroads tied plantation areas to the nearest river or seaport and so made supply more dependable, lowered costs and increased profits. In the event of invasion, the vast geography of the Confederacy made logistics difficult for the Union. Wherever Union armies invaded, they assigned many of their soldiers to garrison captured areas and to protect rail lines. At the onset of the Civil War the South had a disjointed rail network, plagued by changes in track gauge and a lack of interchanges. Locomotives and freight cars had fixed axles and could not use tracks of different gauges (widths). Railroads of different gauges leading to the same city required all freight to be off-loaded onto wagons for transport to the connecting railroad station, where it had to await freight cars and a locomotive before proceeding. Centers requiring off-loading included Vicksburg, New Orleans, Montgomery, Wilmington and Richmond. In addition, most rail lines led from coastal or river ports to inland cities, with few lateral railroads. Because of this design limitation, the relatively primitive railroads of the Confederacy were unable to offset the Union naval blockade of the South's crucial intra-coastal and river routes. The Confederacy had no plan to expand, protect or encourage its railroads. Southerners' refusal to export the cotton crop in 1861 left railroads bereft of their main source of income. Many lines had to lay off employees; many critical skilled technicians and engineers were permanently lost to military service.
In the early years of the war the Confederate government had a hands-off approach to the railroads. Only in mid-1863 did the Confederate government initiate a national policy, and it was confined solely to aiding the war effort. Railroads came under the de facto control of the military. In contrast, the U.S. Congress had authorized military administration of Union-controlled railroad and telegraph systems in January 1862, imposed a standard gauge, and built railroads into the South using that gauge. Confederate armies successfully reoccupying territory could not be resupplied directly by rail as they advanced. The C.S. Congress formally authorized military administration of railroads in February 1865. In the last year before the end of the war, the Confederate railroad system stood permanently on the verge of collapse. There was no new equipment, and raids by both sides systematically destroyed key bridges, as well as locomotives and freight cars. Spare parts were cannibalized; feeder lines were torn up to get replacement rails for trunk lines, and rolling stock wore out through heavy use. The Confederate army experienced a persistent shortage of horses and mules and requisitioned them with dubious promissory notes given to local farmers and breeders. Union forces paid in real money and found ready sellers in the South. Both armies needed horses for cavalry and for artillery. Mules pulled the wagons. The supply was undermined by an unprecedented epidemic of glanders, a fatal disease that baffled veterinarians. After 1863 the invading Union forces had a policy of shooting all the local horses and mules that they did not need, in order to keep them out of Confederate hands. The Confederate armies and farmers experienced a growing shortage of horses and mules, which hurt the Southern economy and the war effort. The South lost half of its 2.5 million horses and mules; many farmers ended the war with none left. Army horses were used up by hard work, malnourishment, disease and battle wounds; they had a life expectancy of about seven months. Both the individual Confederate states and later the Confederate government printed Confederate States of America dollars as paper currency in various denominations, with a total face value of $1.5 billion. Much of it was signed by Treasurer Edward C. Elmore. Inflation became rampant as the paper money depreciated and eventually became worthless. The state governments and some localities printed their own paper money, adding to the runaway inflation. The Confederate government initially wanted to finance its war mostly through tariffs on imports, export taxes, and voluntary donations of gold. After the spontaneous imposition of an embargo on cotton sales to Europe in 1861, these sources of revenue dried up and the Confederacy increasingly turned to issuing debt and printing money to pay for war expenses. Confederate politicians were worried about angering the general population with hard taxes. A tax increase might disillusion many Southerners, so the Confederacy resorted to printing more money. As a result, inflation increased and remained a problem for the southern states throughout the rest of the war. By April 1863, for example, the cost of flour in Richmond had risen to $100 (~$2,615 in 2025) a barrel and housewives were rioting. The Confederate government took over the three national mints in its territory: the Charlotte Mint in North Carolina, the Dahlonega Mint in Georgia, and the New Orleans Mint in Louisiana.
During 1861 all of these facilities produced small amounts of gold coinage, and the New Orleans Mint produced half dollars as well. A lack of silver and gold precluded further coinage. The Confederacy apparently also experimented with issuing one cent coins, although only 12 were produced by a jeweler in Philadelphia, who was afraid to send them to the South. Like the half dollars, copies were later made as souvenirs. U.S. coinage was hoarded and did not have any general circulation; it was admitted as legal tender up to $10, as were British sovereigns, French Napoleons and Spanish and Mexican doubloons at a fixed rate of exchange. Confederate money consisted of paper currency and postage stamps. By mid-1861, the Union naval blockade virtually shut down the export of cotton and the import of manufactured goods. Food that formerly came overland was cut off. As women were the ones who remained at home, they had to cope with the lack of food and supplies. They cut back on purchases, used old materials, and planted more flax and peas to provide clothing and food. They used ersatz substitutes when possible. The households were severely hurt by inflation in the cost of everyday items like flour, and the shortages of food, fodder for the animals, and medical supplies for the wounded. State governments requested that planters grow less cotton and more food, but most refused. When cotton prices soared in Europe, expectations were that Europe would soon intervene to break the blockade and make them rich, but Europe remained neutral. The Georgia legislature imposed cotton quotas, making it a crime to grow an excess. But food shortages only worsened, especially in the towns. The overall decline in food supplies, made worse by the inadequate transportation system, led to serious shortages and high prices in urban areas. When bacon reached a dollar a pound in 1863, the poor women of Richmond, Atlanta and many other cities began to riot; they broke into shops and warehouses to seize food. As wives and widows of soldiers, they were hurt by the inadequate welfare system. By the end of the war, deterioration of the Southern infrastructure was widespread. The number of civilian deaths is unknown. Every Confederate state was affected, but most of the war was fought in Virginia and Tennessee, while Texas and Florida saw the least military action. Some of the damage was caused by direct military action, but most was caused by lack of repairs and upkeep, and by deliberately using up resources. Historians have recently estimated how much of the devastation was caused by military action. Paul Paskoff calculates that Union military operations were conducted in 56% of 645 counties in nine Confederate states (excluding Texas and Florida). These counties contained 63% of the 1860 white population and 64% of the slaves. By the time the fighting took place, undoubtedly some people had fled to safer areas, so the exact population exposed to war is unknown. The eleven Confederate States in the 1860 United States census had 297 towns and cities with 835,000 people; of these 162 with 681,000 people were at one point occupied by Union forces. Eleven were destroyed or severely damaged by war action, including Atlanta (with an 1860 population of 9,600), Charleston, Columbia, and Richmond (with prewar populations of 40,500, 8,100, and 37,900, respectively); the eleven contained 115,900 people in the 1860 census, or 14 percent of the urban South. Historians have not estimated what their actual population was when Union forces arrived.
The number of people (as of 1860) who lived in the destroyed towns represented just over 1 percent of the Confederacy's 1860 population. In addition, 45 court houses were burned (out of 830). The South's agriculture was not highly mechanized. The value of farm implements and machinery in the 1860 Census was $81 million; by 1870, it had diminished by 40 percent and was worth just $48 million. Many old tools had broken through heavy use; new tools were rarely available, and even repairs were difficult. The economic losses affected everyone. Most banks and insurance companies had gone bankrupt. Confederate currency and bonds were worthless. The billions of dollars invested in slaves vanished. Most debts were also left behind. Most farms were intact but had lost their horses, mules, and cattle. Paskoff shows the loss of farm infrastructure was about the same whether or not fighting took place nearby. The loss of infrastructure and productive capacity meant that rural widows throughout the region faced not only the absence of able-bodied men, but a depleted stock of material resources. During four years of warfare, disruption, and blockades, the South used up about half its capital stock. The rebuilding took years and was hindered by the low price of cotton after the war. Outside investment was essential, especially in railroads. One historian has summarized the collapse of the transportation infrastructure needed for economic recovery: One of the greatest calamities which confronted Southerners was the havoc wrought on the transportation system. Roads were impassable or nonexistent, and bridges were destroyed or washed away. The important river traffic was at a standstill: levees were broken, channels were blocked, the few steamboats which had not been captured or destroyed were in a state of disrepair, wharves had decayed or were missing, and trained personnel were dead or dispersed. Horses, mules, oxen, carriages, wagons, and carts had nearly all fallen prey at one time or another to the contending armies. The railroads were paralyzed, with most of the companies bankrupt. These lines had been the special target of the enemy. On one stretch of 114 miles in Alabama, every bridge and trestle was destroyed, cross-ties rotten, buildings burned, water-tanks gone, ditches filled up, and tracks grown up in weeds and bushes ... Communication centers like Columbia and Atlanta were in ruins; shops and foundries were wrecked or in disrepair. Even those areas bypassed by battle had been pirated for equipment needed on the battlefront, and the wear and tear of wartime usage without adequate repairs or replacements reduced all to a state of disintegration. More than 250,000 Confederate soldiers died during the war. Some widows abandoned their family farms and merged into the households of relatives, or even became refugees living in camps with high rates of disease and death. In the Old South, being an "old maid" was an embarrassment to the woman and her family, but after the war, it became almost a norm. Some women welcomed the freedom of not having to marry. Divorce, while never fully accepted, became more common. The concept of the "New Woman" emerged – she was self-sufficient and independent, and stood in sharp contrast to the "Southern Belle" of antebellum lore. National flags The first official flag of the Confederate States of America—called the "Stars and Bars"—originally had seven stars, representing the first seven states that initially formed the Confederacy. 
As more states joined, more stars were added, until the total was 13 (two stars were added for the divided states of Kentucky and Missouri). During the First Battle of Bull Run (First Manassas), it sometimes proved difficult to distinguish the Stars and Bars from the Union flag. To rectify the situation, a separate "Battle Flag" was designed for use by troops in the field. Also known as the "Southern Cross", the design spawned many variations of the original square configuration. Although it was never officially adopted by the Confederate government, the popularity of the Southern Cross among both soldiers and the civilian population was a primary reason why it was made the main color feature when a new national flag was adopted in 1863. This new standard—known as the "Stainless Banner"—consisted of a lengthened white field area with a Battle Flag canton. This flag too had its problems when used in military operations as, on a windless day, it could easily be mistaken for a flag of truce or surrender. Thus, in 1865, a modified version of the Stainless Banner was adopted. This final national flag of the Confederacy kept the Battle Flag canton, but shortened the white field and added a vertical red bar to the fly end. The "Confederate Flag" has a color scheme similar to that of the most common Battle Flag design, but is rectangular, not square. The "Confederate Flag" is a highly recognizable symbol of the South in the United States today and continues to be a controversial icon. Southern Unionism Unionism—opposition to the Confederacy—was strong in certain areas within the Confederate States. Southern Unionists were widespread in the mountain regions of Appalachia and the Ozarks. Unionists, led by Parson Brownlow and Senator Andrew Johnson, took control of East Tennessee in 1863. Unionists also attempted to take control of western Virginia, but never effectively held more than half of the counties that formed the new state of West Virginia. Union forces captured parts of coastal North Carolina, and at first were largely welcomed by local unionists. The occupiers came to be perceived as oppressive, callous, radical and favorable to Freedmen. Occupiers pillaged, freed slaves, and evicted those who refused to swear loyalty oaths to the Union. In Texas, local officials harassed and murdered Unionists. Draft resistance was widespread, especially among Texans of German or Mexican descent; many of the latter left for Mexico. Confederate officials attempted to hunt down and kill potential draftees who had gone into hiding. Over 4,000 suspected Unionists were imprisoned in the Confederate States without trial. Up to 100,000 men living in states under Confederate control served in the Union Army or pro-Union guerrilla groups. Although Southern Unionists came from all classes, most differed socially, culturally, and economically from the region's dominant pre-war planter class. Geography The Confederate States of America claimed a total of 2,919 miles (4,698 km) of coastline; thus a large part of its territory lay on the seacoast, with level and often sandy or marshy ground. Most of the interior portion consisted of arable farmland, though much was also hilly and mountainous, and the far western territories were deserts. The southern reaches of the Mississippi River bisected the country, and the western half was often referred to as the Trans-Mississippi. The highest point (excluding Arizona and New Mexico) was Guadalupe Peak in Texas at 8,750 feet (2,670 m).
Much of the area had a humid subtropical climate with mild winters and long, hot, humid summers. The climate and terrain varied from vast swamps to semi-arid steppes and arid deserts. The subtropical climate made winters mild but allowed infectious diseases to flourish; on both sides more soldiers died from disease than were killed in combat. Demographics The 1860 United States census gives a picture of the population for the areas that had joined the Confederacy. The population numbers exclude non-assimilated Indian tribes. In 1860, the areas that later formed the eleven Confederate states (including the future West Virginia) had 132,760 (2%) free blacks. Males made up 49% of the total population and females 51%. The CSA was overwhelmingly rural. Few towns had populations of more than 1,000—the typical county seat had a population under 500. Of the twenty largest U.S. cities in the 1860 census, only New Orleans lay in Confederate territory. Only 13 Confederate-controlled cities ranked among the top 100 U.S. cities in 1860, most of them ports whose economic activities vanished or suffered severely in the Union blockade. The population of Richmond swelled after it became the Confederate capital, reaching an estimated 128,000 in 1864. The CSA was overwhelmingly Protestant. Both free and enslaved populations identified with evangelical Protestantism. Baptists and Methodists together formed majorities of both the white and the slave population; among slaves, these denominations became the foundation of the Black church. Freedom of religion and separation of church and state were fully ensured by Confederate laws. Church attendance was very high and chaplains played a major role in the Army. Most large denominations experienced a North–South split in the prewar era on the issue of slavery. The creation of a new country necessitated independent structures. For example, the Presbyterian Church in the United States split, with much of the new leadership provided by Joseph Ruggles Wilson. Baptists and Methodists both broke off from their Northern coreligionists over the slavery issue, forming the Southern Baptist Convention and the Methodist Episcopal Church, South. Elites in the southeast favored the Protestant Episcopal Church in the Confederate States of America, which had reluctantly split from the Episcopal Church in 1861. Other elites were Presbyterians belonging to the 1861-founded Presbyterian Church in the United States. Catholics included an Irish working-class element in coastal cities and an old French element in southern Louisiana. The southern churches met the shortage of Army chaplains by sending missionaries. One result was wave after wave of revivals in the Army. Legacy and assessment When the war ended, over 14,000 Confederates petitioned President Johnson for a pardon; he was generous in granting them. He issued a general amnesty to all Confederate participants in the "late Civil War" in 1868. Congress passed additional Amnesty Acts in May 1866, with restrictions on office holding, and the Amnesty Act of May 1872, which lifted those restrictions. There was a great deal of discussion in 1865 about bringing treason trials, especially against Jefferson Davis. There was no consensus in President Johnson's cabinet, and no one was charged with treason. An acquittal of Davis would have been humiliating for the government. Davis was indicted for treason but never tried; he was released from prison on bail in May 1867.
The amnesty of December 25, 1868, eliminated any possibility of Davis standing trial for treason. Henry Wirz, the commandant of a notorious prisoner-of-war camp near Andersonville, Georgia, was convicted by a military court of charges related to cruelty and conspiracy, and executed on November 10, 1865. The U.S. government began a decade-long process known as Reconstruction which attempted to resolve the political and constitutional issues of the Civil War. The priorities were to guarantee that Confederate nationalism and slavery were ended; to ratify and enforce the Thirteenth Amendment, which outlawed slavery, the Fourteenth, which guaranteed dual U.S. and state citizenship to all native-born residents, regardless of race, and the Fifteenth, which made it illegal to deny the right to vote because of race; and to repeal each state's ordinance of secession. The Compromise of 1877 ended Reconstruction in the former Confederate states. Federal troops were withdrawn. The war left the entire region economically devastated by military action, ruined infrastructure, and exhausted resources. Still dependent on an agricultural economy and resisting investment in infrastructure, it remained dominated by the planter elite into the next century. Democrat-dominated legislatures passed new constitutions and amendments to exclude most blacks and many poor whites. This exclusion and a weakened Republican Party remained the norm until the Voting Rights Act of 1965. The Solid South of the early 20th century did not achieve national levels of prosperity until long after World War II. In Texas v. White (1869), the Supreme Court ruled by a 5–3 majority that Texas had remained a state ever since it first joined the Union, despite claims that it joined the Confederate States of America. The Court held that the Constitution did not permit a state to unilaterally secede. In declaring that no state could leave the Union, "except through revolution or through consent of the States", it was "explicitly repudiating the position of the Confederate states that the United States was a voluntary compact between sovereign states". In Sprott v. United States (1874), the Supreme Court ruled 8–1 to reaffirm its conclusion in White and held that the Confederacy's "foundation was treason" and its "single purpose, so long as it lasted, was to make that treason successful." Historian Frank Lawrence Owsley argued that the Confederacy "died of states' rights". The central government was denied requisitioned soldiers and money by governors and state legislatures because they feared that Richmond would encroach on the rights of the states. Georgia's governor Joseph Brown warned of a secret conspiracy by Jefferson Davis to destroy states' rights and individual liberty. The first conscription act in North America, authorizing Davis to draft soldiers, was said to be the "essence of military despotism". Roger Lowenstein argued that the Confederacy's failure to raise adequate revenue led to hyperinflation and left it unable to win a war of attrition, despite the prowess of military leaders such as Robert E. Lee. Though political differences existed within the Confederacy, no national political parties were formed because they were seen as illegitimate. "Anti-partyism became an article of political faith." Without a system of political parties building alternate sets of national leaders, electoral protests tended to be narrowly state-based, "negative, carping and petty".
The 1863 mid-term elections became mere expressions of futile and frustrated dissatisfaction. According to historian David M. Potter, the lack of a functioning two-party system caused "real and direct damage" to the Confederate war effort since it prevented the formulation of any effective alternatives to the conduct of the war by the Davis administration. The enemies of President Davis proposed that the Confederacy "died of Davis". He was unfavorably compared to George Washington by critics such as Edward Alfred Pollard, editor of the most influential newspaper in the Confederacy, the Daily Richmond Examiner. Beyond the early honeymoon period, Davis was never popular. Ellis Merton Coulter, viewed by historians as a Confederate apologist, argues that Davis was unable to effectively mobilize Confederate nationalism in support of his government, and especially failed to appeal to the small farmers who made up the bulk of the population. Davis failed to build a network of supporters who would speak up when he came under criticism, and he repeatedly alienated governors and other state-based leaders by demanding centralized control of the war effort. |
======================================== |
[SOURCE: https://github.com/mcp] | [TOKENS: 622] |
Connect models to the real world Servers and tools from the community that connect models to files, APIs, databases, and more. All MCP servers Convert various file formats (PDF, Word, Excel, images, audio) to Markdown. By microsoft Real-time infrastructure monitoring with metrics, logs, alerts, and ML-based anomaly detection. By netdata Up-to-date code docs for any prompt By upstash Automate web browsers using accessibility trees for testing and data extraction. By microsoft Connect AI assistants to GitHub - manage repos, issues, PRs, and workflows through natural language. By github MCP server for Chrome DevTools By ChromeDevTools Semantic code retrieval & editing tools for coding agents. By oraios Control the Unity Editor from MCP clients via a Unity bridge + local Python server. By CoplayDev Extract web data with Firecrawl By firecrawl MCP server for terminal commands, file operations, and process management By wonderwhy-er Official MCP server for Notion API By makenotion All Azure MCP tools to create a seamless connection between AI agents and Azure services. By microsoft MCP server for interacting with the Supabase platform By supabase-community Minimal Database MCP Server for PostgreSQL, MySQL, SQL Server, SQLite, MariaDB By bytebase Enables clients like GitHub Copilot and other AI agents to bring trusted and up-to-date information directly from Microsoft's official documentation. By MicrosoftDocs Interact with Azure DevOps services like repositories, work items, builds, releases, test plans, and code search. By microsoft MCP server integrating with Stripe - tools for customers, products, payments, and more. By stripe Generate more accurate Terraform and automate workflows for HCP Terraform and Terraform Enterprise By hashicorp MCP server for advanced web search using Tavily By tavily-ai MongoDB Model Context Protocol Server By mongodb-js MCP server helping models understand your Vite/Nuxt app. By antfu Extract data from any website with thousands of scrapers, crawlers, and automations on Apify Store ⚡ By apify Next.js development tools MCP server with stdio transport By vercel MCP server for connecting to Elasticsearch data and indices. Supports search queries, mappings, ES|QL, and shard information through natural language interactions. By elastic MCP server for Sentry - error monitoring, issue tracking, and debugging for AI assistants By getsentry MCP server for interacting with Neon Management API and databases By neondatabase Provides data retrieval capabilities powered by Chroma, enabling AI models to create collections over generated data and user inputs, and retrieve that data using vector search, full text search, metadata filtering, and more. By chroma-core An MCP server that enables integration with SonarQube Server or Cloud for code quality and security. By SonarSource MCP server for monday.com integration. By mondaycom Atlassian Rovo MCP Server By atlassian |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-247] | [TOKENS: 12858] |
Minecraft Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. Originally created by Markus "Notch" Persson using the Java programming language, the game was handed over to Jens "Jeb" Bergensten, who took control of its development following its full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios hold the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time. Gameplay Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of third-person perspectives. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity, instead maintaining their voxel position in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. They may also freely craft helpful blocks—such as furnaces which can cook food and smelt ores, and torches that produce light—or exchange items with villagers (NPCs) by trading emeralds for different goods and vice versa.
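The block-and-voxel model described above can be illustrated with a short sketch. This is not Mojang's implementation (the real engine stores blocks in chunked arrays for performance); it is a minimal, hypothetical Java example showing a world as a sparse mapping from integer coordinates to block types, with the two core actions of mining a block and placing it elsewhere.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a voxel grid: a sparse map from block coordinates to block types.
public class VoxelGrid {
    // A real engine would use chunked arrays; a string key keeps this illustration simple.
    private final Map<String, String> blocks = new HashMap<>();

    private String key(int x, int y, int z) {
        return x + "," + y + "," + z;
    }

    public String getBlock(int x, int y, int z) {
        return blocks.getOrDefault(key(x, y, z), "air"); // empty space is treated as air
    }

    public void placeBlock(int x, int y, int z, String type) {
        blocks.put(key(x, y, z), type);
    }

    // Mining removes the block and returns it, so it could go into an inventory.
    public String mineBlock(int x, int y, int z) {
        String removed = getBlock(x, y, z);
        blocks.remove(key(x, y, z));
        return removed;
    }

    public static void main(String[] args) {
        VoxelGrid world = new VoxelGrid();
        world.placeBlock(0, 64, 0, "dirt");
        String drop = world.mineBlock(0, 64, 0);   // break the block ("dirt")
        world.placeBlock(5, 64, 5, drop);          // place it somewhere else
        System.out.println(world.getBlock(5, 64, 5)); // prints "dirt"
    }
}
```

The sketch also shows why unsupported blocks can hang in the air: each block simply keeps its grid position until a player removes it.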
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve or Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities) including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentionally and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand. 
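The role of the map seed in world generation, described above, comes down to determinism: the same seed and the same chunk coordinates always produce the same terrain, so players who share a seed get the same world. The sketch below is a hypothetical Java illustration of that idea only; Minecraft's actual generator uses layered noise functions and biome rules, and the mixing constants here are arbitrary.

```java
import java.util.Random;

// Hypothetical illustration of seed-driven chunk generation; not Mojang's algorithm.
public class ChunkHeights {
    // Derive a per-chunk RNG from the world seed and chunk coordinates, then fill
    // a 16x16 height map. Identical inputs always give identical output.
    static int[][] generate(long worldSeed, int chunkX, int chunkZ) {
        long chunkSeed = worldSeed ^ (chunkX * 0x5DEECE66DL) ^ (chunkZ * 0x2545F4914F6CDD1DL); // arbitrary mixing constants
        Random rng = new Random(chunkSeed);
        int[][] heights = new int[16][16];
        for (int x = 0; x < 16; x++) {
            for (int z = 0; z < 16; z++) {
                heights[x][z] = 60 + rng.nextInt(8); // gentle terrain near "sea level"
            }
        }
        return heights;
    }

    public static void main(String[] args) {
        int[][] first = generate(123456789L, 0, 0);
        int[][] second = generate(123456789L, 0, 0);
        System.out.println(first[3][7] == second[3][7]); // true: same seed, same terrain
    }
}
```

Because each chunk depends only on the seed and its coordinates, terrain can be generated lazily as players explore, which is what makes an effectively infinite world practical.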
The End can be reached through an end portal, consisting of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal, which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough that takes about nine minutes to scroll past; it is the game's only narrative text and the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar or continuously on peaceful. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then re-spawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn after 5 minutes. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, animal breeding, and cooking food. Experience can then be spent on enchanting tools, armor and weapons. Enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update, and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to let players experience the map as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage nor are affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
Multiplayer in Minecraft enables multiple players to interact and communicate with each other on a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Minecraft Realms server owners can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to come later in 2017, along with support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018. The modding community consists of fans, users and third-party programmers. Using a variety of application program interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add to the game elements from other video games and media. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements including textures and sounds.
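As a rough sketch of the operator-controlled restrictions mentioned above (which usernames or IP addresses may join a server), the hypothetical Java example below checks a join request against an allowlist of names and a denylist of addresses. It is an illustration of the concept under stated assumptions, not Mojang's server code or its actual configuration format, and the class and field names are invented for the example.

```java
import java.util.Set;

// Hypothetical join check: an operator-maintained allowlist of usernames and a denylist of IPs.
public class JoinGate {
    private final Set<String> allowedNames;     // analogous to a username whitelist
    private final Set<String> bannedAddresses;  // analogous to an IP ban list

    public JoinGate(Set<String> allowedNames, Set<String> bannedAddresses) {
        this.allowedNames = allowedNames;
        this.bannedAddresses = bannedAddresses;
    }

    public boolean mayJoin(String username, String ipAddress) {
        if (bannedAddresses.contains(ipAddress)) {
            return false;                        // rejected: banned address
        }
        // If an allowlist is in force, only listed usernames may enter.
        return allowedNames.isEmpty() || allowedNames.contains(username);
    }

    public static void main(String[] args) {
        JoinGate gate = new JoinGate(Set.of("alex", "steve"), Set.of("203.0.113.7"));
        System.out.println(gate.mayJoin("alex", "198.51.100.2"));    // true
        System.out.println(gate.mayJoin("alex", "203.0.113.7"));     // false: banned IP
        System.out.println(gate.mayJoin("intruder", "198.51.100.2")); // false: not allowlisted
    }
}
```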
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013, and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another based on Fallout was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue, and released a statement stating that "the code would not be run or read by the game itself", and would run only when the image containing the skin itself was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue. Development Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. 
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the style of gameplay, including bringing back the first-person mode, the "blocky" visual style and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson decided to release the full 1.0 version—the second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten. On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies including Activision Blizzard and Electronic Arts. The deal with Microsoft was finalized on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions received major updates, usually annual—free to players who had purchased the game—each primarily centered on a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates. On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically-based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs; it cannot be enabled on arbitrary worlds or texture packs. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues.
On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need for RTX-capable hardware. Vibrant Visuals was released as part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned for release on Java Edition at a later date.
Development of the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—began in May 2009; on 13 May, Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle.
Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum, but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh.
On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer.
On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though the acquisition later became controversial and its legitimacy was questioned, owing to CraftBukkit's open-source nature and its licensing under the GNU General Public License and Lesser General Public License.
In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011.
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements, and lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full crossplay with other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One and renamed the Bedrock Edition.
The console versions of Minecraft debuted with the Xbox 360 Edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions, but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players.
Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios.
Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, with a physical copy available at a later date. The game is compatible only with the New Nintendo 3DS and New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems.
On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. A native PlayStation 5 version of Bedrock Edition was released on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025.
On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final updates and later became known as the "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well.
An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, macOS, and Windows.
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in autumn 2018; it was released on the App Store on 6 September 2018. On 27 March 2019, it was announced that the edition would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store-compatible Chromebooks, and the full game was released on the Google Play Store for Chromebooks on 7 August 2020.
On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023.
The Windows version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added new features such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other. Both game versions would otherwise remain separate.
Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for the larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after a character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers to it for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive.
Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for the HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for the Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced that they would no longer support PlayStation VR, which received its final update in March 2025.
Music and sound design
Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the process, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced about creating the in-game sound for grass blocks, stating, "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape."
According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborated, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you."
Many of Rosenfeld's sound design decisions were made accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when it is about to explode; Rosenfeld later recalled, "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld found the sound engine "terrible" to work with, remembering, "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine."
The background music in Minecraft consists of instrumental ambient music. To compose it, Rosenfeld used Ableton Live along with several additional plug-ins. Speaking of the plug-ins, Rosenfeld said, "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI."
On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music added in the game's 2013 "Music Update". A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015.
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate-color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", which introduced pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine remaining the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, whose label publishes all of the other artists' releases. Gareth Coker also composed some of the music for the minigames of the Legacy Console editions.
Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record had by then grown longer than the previous two albums combined, which together run over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has yet to see release. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether there was still a third volume of his music intended for release. Rosenfeld responded, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know."
Reception
Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment of Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game strikes a good balance between exploring and building. The game's multiplayer feature has generally been received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands.
IGN was disappointed by the troublesome steps needed to set up multiplayer servers, calling the process a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste".
A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock, Paper, Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game.
The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version.
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they praised the port's addition of a tutorial, in-game tips, and crafting recipes, saying that these made the game more user-friendly. The Xbox One Edition was one of the best-received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 Edition was the best-received port to date, praised for worlds 36 times larger than those of the PlayStation 3 Edition and described as nearly identical to the Xbox One Edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively large worlds.
Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content.
Minecraft surpassed a million purchases less than a month after entering its beta phase in early 2011. At the time, the game had no publisher backing and had never been commercially advertised except through word of mouth and various unpaid references in popular media, such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014, the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies.
The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft had sold over 4.48 million copies since debuting on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day.
As of 4 April 2014, the Xbox 360 version had sold 12 million copies. In addition, Minecraft: Pocket Edition had reached 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both the PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft had been sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version has sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter.
The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms, with over 126 million monthly active players. By April 2021, the number of monthly active users had climbed to 140 million.
In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth-best game of the year as well as the eighth-best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards, for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival, and won the Grand Prize and the community-voted Audience Award. At the Game Developers Choice Awards 2011, Minecraft won the Best Debut Game, Best Downloadable Game, and Innovation awards, winning every award for which it was nominated. It also won GameCity's video game arts award.
On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit, which opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category, and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the TIGA Game of the Year award in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list.
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award - PC and Console.
Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, who cited emails he had received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang claimed, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Persson's decision to sell Mojang.
In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who did not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated.
In June 2022, Mojang added a player-reporting feature in Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language, substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four.
The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts. Initially, the winning mob was to be implemented in a future update while the losing mobs were scrapped, though after the first vote this was changed so that losing mobs retained a chance of being added to the game later. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014.
The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced that the Mob Vote would be retired.
Cultural impact
In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model, drawing in sales prior to its full release to help fund development. As Minecraft helped bolster indie game development in the early 2010s, it also helped popularize the early access model among indie developers.
Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos, often made by commentators, began to gain influence on YouTube. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game went on to be a prominent fixture of YouTube's gaming scene throughout the 2010s; in 2014, it was the second-most-searched term on the entire platform.
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot in an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the website had exceeded one trillion.
Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character whose moveset includes references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released; it made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age.
The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering in Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new designs. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software tools have been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as the MakerBot and RepRap.
In September 2012, Mojang began the Block by Block project in cooperation with UN-Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhoods.
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding, "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments in Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements, and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture. The ideas presented by the citizens served as a template for political decisions.
In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft, based on its own geodata. This was possible because Denmark is one of the flattest countries, with its highest point at 171 meters (the 30th-smallest elevation span of any country), while the build limit in default Minecraft was around 192 meters above in-game sea level when the project was completed.
Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, an in-game repository of journalism by censored and arrested authors, such as Jamal Khashoggi, from countries including Egypt, Mexico, Russia, Saudi Arabia, and Vietnam. The neoclassical virtual building was created over about 250 hours by an international team of 24 people.
Despite its unpredictable nature, Minecraft speedrunning, in which players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way.
Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition, with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone enabled the construction of functional virtual machines within the game, such as a hard drive and an 8-bit computer, and mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources.
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft."
Following the initial surge in Minecraft's popularity in 2010, other video games were criticised for having various similarities to Minecraft, and some were described as "clones", whether due to direct inspiration from Minecraft or superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game, given the technical limitations of the system.
In response to Microsoft's acquisition of Mojang and its Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). In the end, fans' fears proved unfounded, as official Minecraft releases on Nintendo consoles eventually resumed. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson cancelled the plans after speaking to his team.
In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time, and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before.
In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging that the game infringed on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright-claiming service. The DMCA was later withdrawn.
Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded as "Minecraft Live", included the mob and biome votes and announcements of new game updates.
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/British_Raj] | [TOKENS: 17944] |
British Raj
The British Raj (/rɑːdʒ/ RAHJ; from Hindustani rāj, 'reign', 'rule' or 'government') was the rule of the British Crown on the Indian subcontinent, lasting from 1858 to 1947. It is also called Crown rule in India, or direct rule in India. The region under British control was commonly called India in contemporaneous usage and included areas directly administered by the United Kingdom, which were collectively called British India, and areas ruled by indigenous rulers, but under British paramountcy, called the princely states. The region was sometimes called the Indian Empire, though not officially. As India, it was a founding member of the League of Nations and a founding member of the United Nations in San Francisco in 1945. India was a participating state in the Summer Olympics in 1900, 1920, 1928, 1932, and 1936.
This system of governance was instituted on 28 June 1858, when, after the Indian Rebellion of 1857, the rule of the East India Company was transferred to the Crown in the person of Queen Victoria (who, in 1876, was proclaimed Empress of India). It lasted until 1947, when the British Raj was partitioned into two sovereign dominion states: the Union of India (later the Republic of India) and the Dominion of Pakistan (later the Islamic Republic of Pakistan, the eastern part of which became the People's Republic of Bangladesh with the 1971 Proclamation of Bangladeshi Independence). At the inception of the Raj in 1858, Lower Burma was already a part of British India; Upper Burma was added in 1886, and the resulting union, Burma, was administered as an autonomous province until 1937, when it became a separate British colony, gaining its independence in 1948. It was renamed Myanmar in 1989. The Chief Commissioner's Province of Aden was also part of British India at the inception of the British Raj and likewise became a separate colony, known as Aden Colony, in 1937.
Geographical extent
The British Raj extended over almost all present-day India, Pakistan, Bangladesh and Myanmar, except for small holdings by other European nations such as Goa (Portugal) and Pondicherry (France). This area is very diverse, containing the Himalayan mountains, fertile floodplains, the Indo-Gangetic Plain, a long coastline, tropical dry forests, arid uplands, and the Thar Desert. In addition, at various times, it included Aden (from 1858 to 1937), Lower Burma (from 1858 to 1937), Upper Burma (from 1886 to 1937), British Somaliland (briefly, from 1884 to 1898), and the Straits Settlements (briefly, from 1858 to 1867). Burma was separated from India and directly administered by the British Crown from 1937 until its independence in 1948. To the west, from the early 20th century, the Trucial States of the Persian Gulf showed signs of a closer association with the government of the British Raj, analogous to that of the princely states.
Among other countries in the region, Ceylon (now Sri Lanka), which at that time referred to the coastal regions and northern part of the island, was ceded to Britain in 1802 under the Treaty of Amiens. These coastal regions were temporarily administered under the Madras Presidency between 1793 and 1798, but thereafter the British governors reported to London, and Ceylon was not part of the Raj. The kingdoms of Nepal and Bhutan, having fought wars with the British, subsequently signed treaties with them and were recognised by the British as independent states.
The Kingdom of Sikkim was established as a princely state after the Anglo-Sikkimese Treaty of 1861; however, the issue of sovereignty was left undefined. The Maldive Islands were a British protectorate from 1887 to 1965 but not part of British India.
History
Although the Indian Rebellion of 1857 had shaken the British enterprise in India, it had not derailed it. Until 1857, the British, especially under Lord Dalhousie, had been hurriedly building an India which they envisaged as being on par with Britain itself in the quality and strength of its economic and social institutions. After the rebellion, they became more circumspect. Much thought was devoted to the causes of the rebellion, and three main lessons were drawn.
First, at a practical level, it was felt that there needed to be more communication and camaraderie between the British and Indians—not just between British army officers and their Indian staff but in civilian life as well. The Indian army was completely reorganised: units composed of the Muslims and Brahmins of the United Provinces of Agra and Oudh, who had formed the core of the rebellion, were disbanded. New regiments, such as the Sikhs and Baluchis, were formed from Indians who, in British estimation, had demonstrated steadfastness. From then on, the Indian army was to remain unchanged in its organisation until 1947. The 1861 Census had revealed that the English population in India was 125,945. Of these, only about 41,862 were civilians, as compared with about 84,083 European officers and men of the Army. In 1880, the standing Indian Army consisted of 66,000 British soldiers, 130,000 Natives, and 350,000 soldiers in the princely armies.
Second, it was also felt that both the princes and the large land-holders, by not joining the rebellion, had proved to be, in Lord Canning's words, "breakwaters in a storm". They too were rewarded in the new British Raj by being integrated into the British-Indian political system and having their territories guaranteed. At the same time, it was felt that the peasants, for whose benefit the large land reforms of the United Provinces had been undertaken, had shown disloyalty by, in many cases, fighting for their former landlords against the British. Consequently, no more land reforms were implemented for the next 90 years: Bengal and Bihar were to remain the realms of large land holdings (unlike the Punjab and Uttar Pradesh).
Third, the British felt disenchanted with the Indian reaction to social change. Until the rebellion, they had enthusiastically pushed through social reform, like the ban on sati by Lord William Bentinck. It was now felt that traditions and customs in India were too strong and too rigid to be changed easily; consequently, no more British social interventions were made, especially in matters dealing with religion, even when the British felt very strongly about the issue (as in the instance of the remarriage of Hindu child widows). This sentiment was exemplified further in Queen Victoria's Proclamation, released immediately after the rebellion, which stated that "We disclaim alike our Right and Desire to impose Our Convictions on any of Our Subjects", demonstrating official British commitment to abstaining from social intervention in India.
In the second half of the 19th century, both the direct administration of India by the British crown and the technological change ushered in by the industrial revolution had the effect of closely intertwining the economies of India and Great Britain.
In fact, many of the major changes in transport and communications typically associated with Crown rule of India had already begun before the Mutiny. Since Dalhousie had embraced the technological change then rampant in Great Britain, India too saw the rapid development of all those technologies. Railways, roads, canals, and bridges were rapidly built in India, and telegraph links were equally rapidly established, so that raw materials, such as cotton, from India's hinterland could be transported more efficiently to ports, such as Bombay, for subsequent export to England. Likewise, finished goods from England were transported back for sale in the burgeoning Indian markets. Unlike Britain, where the market risks of infrastructure development were borne by private investors, in India it was the taxpayers—primarily farmers and farm-labourers—who endured the risks, which, in the end, amounted to £50 million. Despite these costs, very little skilled employment was created for Indians. By 1920, with the fourth-largest railway network in the world and a history of 60 years of its construction, only ten per cent of the "superior posts" in the Indian Railways were held by Indians.
The rush of technology was also changing the agricultural economy in India: by the last decade of the 19th century, a large fraction of some raw materials—not only cotton but also some food-grains—were being exported to faraway markets. Many small farmers, dependent on the whims of those markets, lost land, animals, and equipment to money-lenders. The latter half of the 19th century also saw an increase in the number of large-scale famines in India. Although famines were not new to the subcontinent, these were particularly severe, with tens of millions dying, and with many critics, both British and Indian, laying the blame at the doorsteps of the lumbering colonial administrations. There were also salutary effects: commercial cropping, especially in the newly canalled Punjab, led to increased food production for internal consumption. The railway network provided critical famine relief, notably reduced the cost of moving goods, and helped nascent Indian-owned industry. After the Great Famine of 1876–1878, the Indian Famine Commission report was issued in 1880, and the Indian Famine Codes, the earliest famine scales and programmes for famine prevention, were instituted. In one form or another, they would be implemented worldwide by the United Nations and the Food and Agriculture Organization well into the 1970s.
By 1880, a new middle class had arisen in India and spread thinly across the country. Moreover, there was a growing solidarity among its members, created by the "joint stimuli of encouragement and irritation". The encouragement felt by this class came from its success in education and its ability to avail itself of the benefits of that education, such as employment in the Indian Civil Service. It came too from Queen Victoria's proclamation of 1858, in which she had declared, "We hold ourselves bound to the natives of our Indian territories by the same obligation of duty which bind us to all our other subjects." Indians were especially encouraged when Canada was granted dominion status in 1867 and established an autonomous democratic constitution. Lastly, the encouragement came from the work of contemporaneous Oriental scholars like Monier Monier-Williams and Max Müller, who in their works had been presenting ancient India as a great civilisation.
Irritation, on the other hand, came not just from incidents of racial discrimination at the hands of the British in India, but also from governmental actions like the use of Indian troops in imperial campaigns (e.g., in the Second Anglo-Afghan War) and the attempts to control the vernacular press (e.g., in the Vernacular Press Act of 1878). It was, however, Viceroy Lord Ripon's partial reversal of the Ilbert Bill (1883), a legislative measure that had proposed putting Indian judges in the Bengal Presidency on equal footing with British ones, that transformed the discontent into political action. On 28 December 1885, professionals and intellectuals from this middle class—many educated at the new British-founded universities in Bombay, Calcutta, and Madras, and familiar with the ideas of British political philosophers, especially the utilitarians—assembled in Bombay and founded the Indian National Congress. The 70 men elected Womesh Chunder Bonerjee as the first president. The membership consisted of a westernised elite, and no effort was made at this time to broaden the base.
During its first 20 years, the Congress primarily debated British policy toward India. Its debates created a new Indian outlook that held Great Britain responsible for draining India of its wealth. Britain did this, the nationalists claimed, by unfair trade, by the restraint on indigenous Indian industry, and by the use of Indian taxes to pay the high salaries of the British civil servants in India.
Thomas Baring served as Viceroy of India from 1872 to 1876. Baring was an energetic reformer, dedicated to upgrading the quality of government in the British Raj. He began large-scale famine relief, reduced taxes, and overcame bureaucratic obstacles in an effort to reduce both starvation and widespread social unrest. Although appointed by a Liberal government, his policies were much the same as those of viceroys appointed by Conservative governments.
Social reform was in the air by the 1880s. For example, Pandita Ramabai, a poet, Sanskrit scholar, and champion of the emancipation of Indian women, took up the cause of widow remarriage, especially of Brahmin widows, and later converted to Christianity. By 1900, reform movements had taken root within the Indian National Congress. Congress member Gopal Krishna Gokhale founded the Servants of India Society, which lobbied for legislative reform (e.g., for a law to permit the remarriage of Hindu child widows) and whose members took vows of poverty and worked among the untouchable community.
By 1905, a deep gulf had opened between the moderates, led by Gokhale, who downplayed public agitation, and the new "extremists", who not only advocated agitation but also regarded the pursuit of social reform as a distraction from nationalism. Prominent among the extremists was Bal Gangadhar Tilak, who attempted to mobilise Indians by appealing to an explicitly Hindu political identity, displayed, for example, in the annual public Ganapati festivals that he inaugurated in western India.
The overwhelming, but predominantly Hindu, protest against the partition of Bengal, and the fear in its wake of reforms favouring the Hindu majority, led the Muslim elite in India to meet with the new viceroy, Lord Minto, in 1906 and to ask for separate electorates for Muslims. In conjunction, they demanded proportional legislative representation reflecting both their status as former rulers and their record of cooperating with the British.
This led, in December 1906, to the founding of the All-India Muslim League in Dacca. Although Curzon had by now resigned his position over a dispute with his military chief, Lord Kitchener, and returned to England, the League was in favour of his partition plan. The Muslim elite's position, which was reflected in the League's position, had crystallized gradually over the previous three decades, beginning with the revelations of the Census of British India in 1871, which had for the first time estimated the populations in regions of Muslim majority. (For his part, Curzon's desire to court the Muslims of East Bengal had arisen from British anxieties, held ever since the 1871 census and sharpened by the history of Muslims fighting them in the 1857 Mutiny and the Second Anglo-Afghan War, about Indian Muslims rebelling against the Crown.)
In the three decades since, Muslim leaders across northern India had intermittently experienced public animosity from some of the new Hindu political and social groups. The Arya Samaj, for example, had not only supported Cow Protection Societies in their agitation, but also—distraught at the 1871 Census's Muslim numbers—organized "reconversion" events for the purpose of welcoming Muslims back to the Hindu fold. In 1905, when Tilak and Lajpat Rai attempted to rise to leadership positions in the Congress, and the Congress itself rallied around the symbolism of Kali, Muslim fears increased. It was not lost on many Muslims, for example, that the rallying cry "Bande Mataram" had first appeared in the novel Anand Math, in which Hindus had battled their Muslim oppressors. Lastly, the Muslim elite, among them the Dacca Nawab, Khwaja Salimullah, who hosted the League's first meeting in his mansion in Shahbag, was aware that a new province with a Muslim majority would directly benefit Muslims aspiring to political power.
The first steps toward self-government in British India were taken in the late 19th century with the appointment of Indian counsellors to advise the British viceroy and the establishment of provincial councils with Indian members; the British subsequently widened participation in legislative councils with the Indian Councils Act 1892. Municipal Corporations and District Boards were created for local administration; they included elected Indian members. The Indian Councils Act 1909, known as the Morley-Minto Reforms (John Morley was the secretary of state for India, and Minto was viceroy), gave Indians limited roles in the central and provincial legislatures. Upper-class Indians, rich landowners and businessmen were favoured. The Muslim community was made a separate electorate and granted double representation. The goals were quite conservative, but they did advance the elective principle.
The partition of Bengal was rescinded in 1911, the change being announced at the Delhi Durbar, at which King George V came in person and was crowned Emperor of India. He announced that the capital would be moved from Calcutta to Delhi. This period saw an increase in the activities of revolutionary groups, which included Bengal's Anushilan Samiti and the Punjab's Ghadar Party. However, the British authorities were able to crush violent rebels swiftly, partly because the mainstream of educated Indian politicians opposed violent revolution.
The viceroy, Lord Curzon (1899–1905), was unusually energetic in his pursuit of efficiency and reform.
His agenda included the creation of the North-West Frontier Province; small changes in the civil services; speeding up the operations of the secretariat; setting up a gold standard to ensure a stable currency; creation of a Railway Board; irrigation reform; reduction of peasant debts; lowering the cost of telegrams; archaeological research and the preservation of antiquities; improvements in the universities; police reforms; upgrading the roles of the Native States; a new Commerce and Industry Department; promotion of industry; revised land revenue policies; lowering taxes; setting up agricultural banks; creating an Agricultural Department; sponsoring agricultural research; establishing an Imperial Library; creating an Imperial Cadet Corps; new famine codes; and, indeed, reducing the smoke nuisance in Calcutta.
Trouble emerged for Curzon when he divided the largest administrative subdivision in British India, the Bengal Province, into the Muslim-majority province of Eastern Bengal and Assam and the Hindu-majority province of West Bengal (present-day Indian states of West Bengal, Bihar, and Odisha). Curzon's act, the Partition of Bengal, had been contemplated by various colonial administrations since the time of Lord William Bentinck but was never acted upon. Though some considered it administratively felicitous, it was communally charged. It sowed the seeds of division among Indians in Bengal, transforming nationalist politics as nothing else before it. The Hindu elite of Bengal, among them many who owned land in East Bengal that was leased out to Muslim peasants, protested fervidly.
Following the Partition of Bengal, which was a strategy set out by Lord Curzon to weaken the nationalist movement, Tilak encouraged the Swadeshi movement and the Boycott movement. The Boycott movement consisted of the boycott of foreign goods and also the social boycott of any Indian who used foreign goods, while the Swadeshi movement consisted of the use of natively produced goods. Once foreign goods were boycotted, there was a gap which had to be filled by the production of those goods in India itself. Bal Gangadhar Tilak said that the Swadeshi and Boycott movements were two sides of the same coin.
The large Bengali Hindu middle class (the Bhadralok), upset at the prospect of Bengalis being outnumbered in the new Bengal province by Biharis and Oriyas, felt that Curzon's act was punishment for their political assertiveness. The pervasive protests against Curzon's decision took the form predominantly of the Swadeshi ("buy Indian") campaign led by two-time Congress president Surendranath Banerjee, and involved the boycott of British goods. The rallying cry for both types of protest was the slogan Bande Mataram ("Hail to the Mother"), which invoked a mother goddess who stood variously for Bengal, India, and the Hindu goddess Kali. Sri Aurobindo never went beyond the law when he edited the Bande Mataram magazine; it preached independence but within the bounds of peace as far as possible. Its goal was Passive Resistance. The unrest spread from Calcutta to the surrounding regions of Bengal when students returned home to their villages and towns. Some joined local political youth clubs emerging in Bengal at the time, some engaged in robberies to fund arms, and some even attempted to take the lives of Raj officials. However, the conspiracies generally failed in the face of intense police work. The Swadeshi boycott movement cut imports of British textiles by 25%.
The swadeshi cloth, although more expensive and somewhat less comfortable than its Lancashire competitor, was worn as a mark of national pride by people all over India.
The First World War would prove to be a watershed in the imperial relationship between Britain and India. Shortly before the outbreak of war, the Government of India had indicated that it could furnish two divisions plus a cavalry brigade, with a further division in case of emergency. Some 1.4 million Indian and British soldiers of the British Indian Army took part in the war, primarily in Iraq and the Middle East. Their participation had a wider cultural fallout as news spread of how bravely soldiers fought and died alongside British soldiers, as well as soldiers from dominions like Canada and Australia. India's international profile rose during the 1920s, as it became a founding member of the League of Nations in 1920 and participated, under the name "Les Indes Anglaises" (British India), in the 1920 Summer Olympics in Antwerp. Back in India, especially among the leaders of the Indian National Congress, the war led to calls for greater self-government for Indians.
At the onset of World War I, the reassignment of most of the British army in India to Europe and Mesopotamia had led the previous viceroy, Lord Hardinge, to worry about the "risks involved in denuding India of troops". Revolutionary violence had already been a concern in British India; consequently, in 1915, to strengthen its powers during what it saw as a time of increased vulnerability, the Government of India passed the Defence of India Act 1915, which allowed it to intern politically dangerous dissidents without due process, and added to the power it already had under the Indian Press Act of 1910 to imprison journalists without trial and to censor the press. It was under the Defence of India Act that the Ali brothers were imprisoned in 1916, and Annie Besant, a European woman and ordinarily more problematic to imprison, was arrested in 1917. Now, as constitutional reform began to be discussed in earnest, the British began to consider how new moderate Indians could be brought into the fold of constitutional politics and, simultaneously, how the hand of established constitutionalists could be strengthened. However, since the Government of India wanted to ensure against any sabotage of the reform process by extremists, and since its reform plan was devised during a time when extremist violence had ebbed as a result of increased governmental control, it also began to consider how some of its wartime powers could be extended into peacetime.
After the 1906 split between the moderates and the extremists in the Indian National Congress, organised political activity by the Congress had remained fragmented until 1914, when Bal Gangadhar Tilak was released from prison and began to sound out other Congress leaders about possible reunification. That, however, had to wait until the demise of Tilak's principal moderate opponents, Gopal Krishna Gokhale and Pherozeshah Mehta, in 1915, whereupon an agreement was reached for Tilak's ousted group to re-enter the Congress. In the 1916 Lucknow session of the Congress, Tilak's supporters were able to push through a more radical resolution which asked the British to declare that it was their "aim and intention ... to confer self-government on India at an early date".
Soon, other such rumblings began to appear in public pronouncements: in 1917, in the Imperial Legislative Council, Madan Mohan Malaviya spoke of the expectations the war had generated in India, "I venture to say that the war has put the clock ... fifty years forward ... (The) reforms after the war will have to be such, ... as will satisfy the aspirations of her (India's) people to take their legitimate part in the administration of their own country." The 1916 Lucknow Session of the Congress was also the venue of an unanticipated mutual effort by the Congress and the Muslim League, the occasion for which was provided by the wartime partnership between Germany and Turkey. Since the Turkish Sultan, or Khalifah, had also sporadically claimed guardianship of the Islamic holy sites of Mecca, Medina, and Jerusalem, and since the British and their allies were now in conflict with Turkey, doubts began to increase among some Indian Muslims about the "religious neutrality" of the British, doubts that had already surfaced as a result of the reunification of Bengal in 1911, a decision that was seen as ill-disposed to Muslims. In the Lucknow Pact, the League joined the Congress in the proposal for greater self-government that was campaigned for by Tilak and his supporters; in return, the Congress accepted separate electorates for Muslims in the provincial legislatures as well as the Imperial Legislative Council. In 1916, the Muslim League had anywhere between 500 and 800 members and did not yet have the wider following among Indian Muslims that it enjoyed in later years; in the League itself, the pact did not have unanimous backing, having largely been negotiated by a group of "Young Party" Muslims from the United Provinces (UP), most prominently, two brothers Mohammad and Shaukat Ali, who had embraced the Pan-Islamic cause; however, it did have the support of a young lawyer from Bombay, Muhammad Ali Jinnah, who was later to rise to leadership roles in both the League and the Indian independence movement. In later years, as the full ramifications of the pact unfolded, it was seen as benefiting the Muslim minority élites of provinces like UP and Bihar more than the Muslim majorities of Punjab and Bengal; nonetheless, at the time, the "Lucknow Pact" was an important milestone in nationalistic agitation and was seen as such by the British. During 1916, two Home Rule Leagues were founded within the Indian National Congress by Tilak and Annie Besant, respectively, to promote Home Rule among Indians, and also to elevate the stature of the founders within the Congress itself. Besant, for her part, was also keen to demonstrate the superiority of this new form of organised agitation, which had achieved some success in the Irish home rule movement, over the political violence that had intermittently plagued the subcontinent during the years 1907–1914. The two Leagues focused their attention on complementary geographical regions: Tilak's in western India, in the southern Bombay presidency, and Besant's in the rest of the country, but especially in the Madras Presidency and in regions like Sind and Gujarat that had hitherto been considered politically dormant by the Congress. Both leagues rapidly acquired new members—approximately thirty thousand each in a little over a year—and began to publish inexpensive newspapers. 
Their propaganda also turned to posters, pamphlets, and political-religious songs, and later to mass meetings, which not only attracted greater numbers than in earlier Congress sessions, but also entirely new social groups such as non-Brahmins, traders, farmers, students, and lower-level government workers. Although they did not achieve the magnitude or character of a nationwide mass movement, the Home Rule leagues both deepened and widened organised political agitation for self-rule in India. The British authorities reacted by imposing restrictions on the Leagues, including shutting out students from meetings and banning the two leaders from travelling to certain provinces. The year 1915 also saw the return of Mohandas Karamchand Gandhi to India. Already known in India as a result of his civil liberties protests on behalf of the Indians in South Africa, Gandhi followed the advice of his mentor Gopal Krishna Gokhale and chose not to make any public pronouncements during the first year of his return, but instead spent the year travelling, observing the country at first hand, and writing. Earlier, during his South Africa sojourn, Gandhi, a lawyer by profession, had represented an Indian community, which, although small, was sufficiently diverse to be a microcosm of India itself. In tackling the challenge of holding this community together and simultaneously confronting the colonial authority, he had created a technique of non-violent resistance, which he labelled Satyagraha (or Striving for Truth). For Gandhi, Satyagraha was different from "passive resistance", by then a familiar technique of social protest, which he regarded as a practical strategy adopted by the weak in the face of superior force; Satyagraha, on the other hand, was for him the "last resort of those strong enough in their commitment to truth to undergo suffering in its cause". Ahimsa or "non-violence", which formed the underpinning of Satyagraha, came to represent the twin pillar, with Truth, of Gandhi's unorthodox religious outlook on life. During the years 1907–1914, Gandhi tested the technique of Satyagraha in a number of protests on behalf of the Indian community in South Africa against the unjust racial laws. Also, during his time in South Africa, in his essay, Hind Swaraj, (1909), Gandhi formulated his vision of Swaraj, or "self-rule" for India based on three vital ingredients: solidarity between Indians of different faiths, but most of all between Hindus and Muslims; the removal of untouchability from Indian society; and the exercise of swadeshi—the boycott of manufactured foreign goods and the revival of Indian cottage industry. The first two, he felt, were essential for India to be an egalitarian and tolerant society, one befitting the principles of Truth and Ahimsa, while the last, by making Indians more self-reliant, would break the cycle of dependence that was perpetuating not only the direction and tenor of the British rule in India, but also the British commitment to it. At least until 1920, the British presence itself was not a stumbling block in Gandhi's conception of swaraj; rather, it was the inability of Indians to create a modern society. Gandhi made his political debut in India in 1917 in Champaran district in Bihar, near the Nepal border, where he was invited by a group of disgruntled tenant farmers who, for many years, had been forced into planting indigo (for dyes) on a portion of their land and then selling it at below-market prices to the British planters who had leased them the land. 
Upon his arrival in the district, Gandhi was joined by other agitators, including a young Congress leader, Rajendra Prasad, from Bihar, who would become a loyal supporter of Gandhi and go on to play a prominent role in the Indian independence movement. When Gandhi was ordered to leave by the local British authorities, he refused on moral grounds, setting up his refusal as a form of individual Satyagraha. Soon, under pressure from the Viceroy in Delhi, who was anxious to maintain domestic peace during wartime, the provincial government rescinded Gandhi's expulsion order and later agreed to an official enquiry into the case. Although the British planters eventually gave in, they were not won over to the farmers' cause, and the episode therefore fell short of the ideal outcome of a Satyagraha that Gandhi had hoped for; similarly, the farmers themselves, although pleased at the resolution, responded less than enthusiastically to the concurrent projects of rural empowerment and education that Gandhi had inaugurated in keeping with his ideal of swaraj.
The following year Gandhi launched two more Satyagrahas, both in his native Gujarat: one in the rural Kaira district, where land-owning farmers were protesting increased land revenue, and the other in the city of Ahmedabad, where workers in an Indian-owned textile mill were distressed about their low wages. The satyagraha in Ahmedabad took the form of Gandhi fasting and supporting the workers in a strike, which eventually led to a settlement. In Kaira, in contrast, although the farmers' cause received publicity from Gandhi's presence, the satyagraha itself, which consisted of the farmers' collective decision to withhold payment, was not immediately successful, as the British authorities refused to back down. The agitation in Kaira gained for Gandhi another lifelong lieutenant in Sardar Vallabhbhai Patel, who had organised the farmers and who would also go on to play a leadership role in the Indian independence movement.
In 1916, in the face of the new strength demonstrated by the nationalists with the signing of the Lucknow Pact and the founding of the Home Rule leagues, and the realisation, after the disaster in the Mesopotamian campaign, that the war would likely last longer, the new viceroy, Lord Chelmsford, cautioned that the Government of India needed to be more responsive to Indian opinion. Towards the end of the year, after discussions with the government in London, he suggested that the British demonstrate their good faith—in light of the Indian war role—through a number of public actions, including awards of titles and honours to princes, granting of commissions in the army to Indians, and removal of the much-reviled cotton excise duty, but, most importantly, an announcement of Britain's future plans for India and an indication of some concrete steps. After more discussion, in August 1917, the new Liberal secretary of state for India, Edwin Montagu, announced the British aim of "increasing association of Indians in every branch of the administration, and the gradual development of self-governing institutions, with a view to the progressive realisation of responsible government in India as an integral part of the British Empire". Although the plan envisioned limited self-government at first only in the provinces—with India emphatically within the British Empire—it represented the first British proposal for any form of representative government in a non-white colony.
Montagu and Chelmsford presented their report in July 1918 after a long fact-finding trip through India the previous winter. After more discussion by the government and parliament in Britain, and another tour by the Franchise and Functions Committee for the purpose of identifying who among the Indian population could vote in future elections, the Government of India Act 1919 (also known as the Montagu–Chelmsford Reforms) was passed in December 1919. The new Act enlarged both the provincial and Imperial legislative councils and repealed the Government of India's recourse to the "official majority" in unfavourable votes. Although departments like defence, foreign affairs, criminal law, communications, and income-tax were retained by the Viceroy and the central government in New Delhi, other departments like public health, education, land-revenue, and local self-government were transferred to the provinces. The provinces themselves were now to be administered under a new diarchical system, whereby some areas like education, agriculture, infrastructure development, and local self-government became the preserve of Indian ministers and legislatures, and ultimately the Indian electorates, while others like irrigation, land-revenue, police, prisons, and control of media remained within the purview of the British governor and his executive council. The new Act also made it easier for Indians to be admitted into the civil services and the army officer corps.
A greater number of Indians were now enfranchised, although, for voting at the national level, they constituted only 10% of the total adult male population, many of whom were still illiterate. In the provincial legislatures, the British continued to exercise some control by setting aside seats for special interests they considered cooperative or useful. In particular, rural candidates, generally sympathetic to British rule and less confrontational, were assigned more seats than their urban counterparts. Seats were also reserved for non-Brahmins, landowners, businessmen, and college graduates. The principle of "communal representation", an integral part of the Minto–Morley Reforms, and more recently of the Congress-Muslim League Lucknow Pact, was reaffirmed, with seats being reserved for Muslims, Sikhs, Indian Christians, Anglo-Indians, and domiciled Europeans, in both provincial and Imperial legislative councils. The Montagu–Chelmsford reforms offered Indians the most significant opportunity yet for exercising legislative power, especially at the provincial level; however, that opportunity was also restricted by the still limited number of eligible voters, by the small budgets available to provincial legislatures, and by the presence of rural and special interest seats that were seen as instruments of British control. Its scope was unsatisfactory to the Indian political leadership, famously expressed by Annie Besant as something "unworthy of England to offer and India to accept".
In 1917, as Montagu and Chelmsford were compiling their report, a committee chaired by a British judge, Sidney Rowlatt, was tasked with investigating "revolutionary conspiracies", with the unstated goal of extending the government's wartime powers. The Rowlatt Committee comprised four British and two Indian members, including Sir Basil Scott and Diwan Bahadur Sir C. V. Kumaraswami Sastri, the present and future Chief Justices of the High Court of Bombay and the High Court of Madras.
It presented its report in July 1918 and identified three regions of conspiratorial insurgency: Bengal, the Bombay presidency, and the Punjab. To combat subversive acts in these regions, the committee unanimously recommended that the government use emergency powers akin to its wartime authority, which included the ability to try cases of sedition by a panel of three judges and without juries, exaction of securities from suspects, governmental overseeing of residences of suspects, and the power for provincial governments to arrest and detain suspects in short-term detention facilities and without trial.
With the end of World War I, there was also a change in the economic climate. By the end of 1919, 1.5 million Indians had served in the armed services in either combatant or non-combatant roles, and India had provided £146 million in revenue for the war. The increased taxes coupled with disruptions in both domestic and international trade had the effect of approximately doubling the index of overall prices in India between 1914 and 1920. Returning war veterans, especially in the Punjab, created a growing unemployment crisis, and post-war inflation led to food riots in Bombay, Madras, and Bengal provinces, a situation that was made only worse by the failure of the 1918–19 monsoon and by profiteering and speculation. The global influenza epidemic and the Bolshevik Revolution of 1917 added to the general jitters; the former among the population already experiencing economic woes, and the latter among government officials, fearing a similar revolution in India.
To combat what it saw as a coming crisis, the government now drafted the Rowlatt committee's recommendations into two Rowlatt Bills. Although Edwin Montagu authorised the bills for legislative consideration, he did so unwillingly, with the accompanying declaration, "I loathe the suggestion at first sight of preserving the Defence of India Act in peacetime to such an extent as Rowlatt and his friends think necessary." In the ensuing discussion and vote in the Imperial Legislative Council, all Indian members voiced opposition to the bills. The Government of India was, nevertheless, able to use its "official majority" to ensure passage of the bills early in 1919. However, what it passed, in deference to the Indian opposition, was a lesser version of the first bill, which now allowed extrajudicial powers, but for a period of exactly three years and for the prosecution solely of "anarchical and revolutionary movements", dropping entirely the second bill involving modification of the Indian Penal Code. Even so, when it was passed, the new Rowlatt Act aroused widespread indignation throughout India, and brought Gandhi to the forefront of the nationalist movement.
The Jallianwala Bagh massacre, or "Amritsar massacre", took place in the Jallianwala Bagh public garden in the predominantly Sikh northern city of Amritsar. After days of unrest, Brigadier-General Reginald E. H. Dyer forbade public meetings, and on Sunday, 13 April 1919, fifty British Indian Army soldiers commanded by Dyer began shooting at an unarmed gathering of thousands of men, women, and children without warning. Casualty estimates vary widely, with the Government of India reporting 379 dead and 1,100 wounded. The Indian National Congress estimated three times the number of dead. Dyer was removed from duty, but he became a celebrated hero in Britain among people with connections to the Raj.
Historians consider the episode to have been a decisive step towards the end of British rule in India. In 1920, after the British government refused to back down, Gandhi began his campaign of non-cooperation, prompting many Indians to return British awards and honours, to resign from the civil services, and to again boycott British goods. In addition, Gandhi reorganised the Congress, transforming it into a mass movement and opening its membership to even the poorest Indians. Although Gandhi halted the non-cooperation movement in 1922 after the violent incident at Chauri Chaura, the movement revived again in the mid-1920s.
The visit, in 1928, of the British Simon Commission, charged with instituting constitutional reform in India, resulted in widespread protests throughout the country. Earlier, in 1925, non-violent protests of the Congress had resumed too, this time in Gujarat, and led by Patel, who organised farmers to refuse payment of increased land taxes; the success of this protest, the Bardoli Satyagraha, brought Gandhi back into the fold of active politics. At its annual session in Lahore in 1929, the Indian National Congress, under the presidency of Jawaharlal Nehru, issued a demand for Purna Swaraj (Hindustani: "complete independence"), or Purna Swarajya. The declaration was drafted by the Congress Working Committee, which included Gandhi, Nehru, Patel, and Chakravarthi Rajagopalachari. Gandhi subsequently led an expanded movement of civil disobedience, culminating in 1930 with the Salt Satyagraha, in which thousands of Indians defied the tax on salt by marching to the sea and making their own salt by evaporating seawater. Although many, including Gandhi, were arrested, the British government eventually gave in, and in 1931 Gandhi travelled to London to negotiate new reforms at the Round Table Conferences.
In local terms, British control rested on the Indian Civil Service (ICS), but it faced growing difficulties. Fewer and fewer young men in Britain were interested in joining, and the continuing distrust of Indians resulted in a declining base in terms of quality and quantity. By 1945 Indians were numerically dominant in the ICS, and at issue was divided loyalty between the Empire and independence. The finances of the Raj depended on land taxes, and these became problematic in the 1930s. Epstein argues that after 1919 it became harder and harder to collect the land revenue. The Raj's suppression of civil disobedience after 1934 temporarily increased the power of the revenue agents, but after 1937 they were forced by the new Congress-controlled provincial governments to hand back confiscated land. Again the outbreak of war strengthened them; in the face of the Quit India movement the revenue collectors had to rely on military force, and by 1946–47 direct British control was rapidly disappearing in much of the countryside.
In 1935, after the Round Table Conferences, Parliament passed the Government of India Act 1935, which authorised the establishment of independent legislative assemblies in all provinces of British India, the creation of a central government incorporating both the British provinces and the princely states, and the protection of Muslim minorities. The future Constitution of independent India was based on this act.
However, it divided the electorate into 19 religious and social categories, e.g., Muslims, Sikhs, Indian Christians, Depressed Classes, Landholders, Commerce and Industry, Europeans, Anglo-Indians, etc., each of which was given separate representation in the Provincial Legislative Assemblies. A voter could cast a vote only for candidates in his own category. The 1935 Act provided for more autonomy for Indian provinces, with the goal of cooling off nationalist sentiment. The act provided for a national parliament and an executive branch under the purview of the British government, but the rulers of the princely states managed to block its implementation. These states remained under the full control of their hereditary rulers, with no popular government.
To prepare for elections, Congress built up its grass-roots membership from 473,000 in 1935 to 4.5 million in 1939. In the 1937 elections Congress won victories in seven of the eleven provinces of British India. Congress governments, with wide powers, were formed in these provinces. The widespread voter support for the Indian National Congress surprised Raj officials, who previously had seen the Congress as a small elitist body. The British separated Burma Province from British India in 1937 and granted the colony a new constitution calling for a fully elected assembly, with many powers given to the Burmese, but this proved to be a divisive issue, as some Burmese saw it as a ploy to exclude them from any further Indian reforms.
With the outbreak of World War II in 1939, the viceroy, Lord Linlithgow, declared war on India's behalf without consulting Indian leaders, leading the Congress provincial ministries to resign in protest. The Muslim League, in contrast, supported Britain in the war effort and maintained its control of the government in three major provinces, Bengal, Sind and the Punjab. While the Muslim League had been a small elite group in 1927 with only 1,300 members, it grew rapidly once it became an organisation that reached out to the masses, reaching 500,000 members in Bengal in 1944, 200,000 in Punjab, and hundreds of thousands elsewhere. Jinnah was now well positioned to negotiate with the British from a position of power. Jinnah repeatedly warned that Muslims would be unfairly treated in an independent India dominated by the Congress. On 24 March 1940 in Lahore, the League passed the "Lahore Resolution", demanding that "the areas in which the Muslims are numerically in majority as in the North-Western and Eastern zones of India should be grouped to constitute independent states in which the constituent units shall be autonomous and sovereign." Although there were other important national Muslim politicians such as Congress leader Ab'ul Kalam Azad, and influential regional Muslim politicians such as A. K. Fazlul Huq of the leftist Krishak Praja Party in Bengal, Fazl-i-Hussain of the landlord-dominated Punjab Unionist Party, and Abd al-Ghaffar Khan of the pro-Congress Khudai Khidmatgar (popularly, "red shirts") in the North West Frontier Province, the British, over the next six years, were to increasingly see the League as the main representative of Muslim India. The Congress was secular and strongly opposed to having any religious state.
It insisted there was a natural unity to India, and repeatedly blamed the British for "divide and rule" tactics based on prompting Muslims to think of themselves as alien from Hindus. Jinnah rejected the notion of a united India, and emphasised that religious communities were more basic than an artificial nationalism. He proclaimed the Two-Nation Theory, stating at Lahore on 23 March 1940:
[Islam and Hinduism] are not religions in the strict sense of the word, but are, in fact, different and distinct social orders and it is a dream that the Hindus and Muslims can ever evolve a common nationality ... The Hindu and Muslim belong to two different religions, philosophies, social customs and literature [sic]. They neither intermarry nor interdine together and indeed they belong to two different civilizations which are based mainly on conflicting ideas and conceptions. Their aspects on life and of life are different ... To yoke together two such nations under a single state, one as a numerical minority and the other as a majority must lead to growing discontent and final destruction of any fabric that may be so built up for the government of such a state.
While the regular Indian army in 1939 included about 220,000 native troops, it expanded tenfold during the war, and small naval and air force units were created. Over two million Indians volunteered for military service in the British Army. They played a major role in numerous campaigns, especially in the Middle East and North Africa. Casualties were moderate (in terms of the world war), with 24,000 killed, 64,000 wounded, 12,000 missing (probably dead), and 60,000 captured at Singapore in 1942. London paid most of the cost of the Indian Army, which had the effect of erasing India's national debt; it ended the war with a surplus of £1,300 million. In addition, heavy British spending on munitions produced in India (such as uniforms, rifles, machine-guns, field artillery, and ammunition) led to a rapid expansion of industrial output, such as textiles (up 16%), steel (up 18%), and chemicals (up 30%). Small warships were built, and an aircraft factory opened in Bangalore. The railway system, with 700,000 employees, was taxed to the limit as demand for transportation soared.
The British government sent the Cripps mission in 1942 to secure Indian nationalists' co-operation in the war effort in exchange for a promise of independence as soon as the war ended. Top officials in Britain, most notably Prime Minister Winston Churchill, did not support the Cripps Mission, and negotiations with the Congress soon broke down. Congress launched the Quit India Movement in July 1942, demanding the immediate withdrawal of the British from India, failing which it would launch nationwide civil disobedience. On 8 August the Raj arrested all national, provincial and local Congress leaders, holding tens of thousands of them until 1945. The country erupted in violent demonstrations led by students and later by peasant political groups, especially in Eastern United Provinces, Bihar, and western Bengal. The large wartime British Army presence crushed the movement in a little more than six weeks; nonetheless, a portion of the movement formed for a time an underground provisional government on the border with Nepal. In other parts of India, the movement was less spontaneous and the protest less intensive; however, it lasted sporadically into the summer of 1943.
Earlier, Subhas Chandra Bose, who had been a leader of the younger, radical wing of the Indian National Congress in the late 1920s and 1930s, had risen to become Congress President from 1938 to 1939. However, he was ousted from the Congress in 1939 following differences with the high command, and subsequently placed under house arrest by the British before escaping from India in early 1941. He turned to Nazi Germany and Imperial Japan for help in gaining India's independence by force. With Japanese support, he organised the Indian National Army, composed largely of Indian soldiers of the British Indian Army who had been captured by the Japanese in the Battle of Singapore. As the war turned against them, the Japanese came to support a number of puppet and provisional governments in the captured regions, including those in Burma, the Philippines and Vietnam, and in addition, the Provisional Government of Azad Hind, presided over by Bose. Bose's effort, however, was short-lived. In mid-1944 the British Army first halted and then reversed the Japanese U-Go offensive, beginning the successful part of the Burma Campaign. Bose's Indian National Army largely disintegrated during the subsequent fighting in Burma, with its remaining elements surrendering with the recapture of Singapore in September 1945. Bose died in August 1945 of third-degree burns received when the overloaded Japanese plane in which he was attempting to escape crashed in Taiwan, though many Indians believed that he had not died. Although Bose was unsuccessful, he roused patriotic feelings in India.
In January 1946, a number of mutinies broke out in the armed services, starting with that of RAF servicemen frustrated with their slow repatriation to Britain. The mutinies came to a head with the mutiny of the Royal Indian Navy in Bombay in February 1946, followed by others in Calcutta, Madras, and Karachi. Although the mutinies were rapidly suppressed, they had the effect of spurring the new Labour government in Britain to action, leading to the Cabinet Mission to India led by the secretary of state for India, Lord Pethick-Lawrence, and including Sir Stafford Cripps, who had visited four years before.
Also in early 1946, new elections were called in India. Earlier, at the end of the war in 1945, the colonial government had announced the public trial of three senior officers of Bose's defeated Indian National Army who stood accused of treason. Now, as the trials began, the Congress leadership, although ambivalent towards the INA, chose to defend the accused officers. The subsequent convictions of the officers, the public outcry against the convictions, and the eventual remission of the sentences created positive propaganda for the Congress, which only helped in the party's subsequent electoral victories in eight of the eleven provinces. The negotiations between the Congress and the Muslim League, however, stumbled over the issue of partition. Jinnah proclaimed 16 August 1946 Direct Action Day, with the stated goal of highlighting, peacefully, the demand for a Muslim homeland in British India. The following day Hindu-Muslim riots broke out in Calcutta and quickly spread throughout British India. Although the Government of India and the Congress were both shaken by the course of events, in September a Congress-led interim government was installed, with Jawaharlal Nehru as united India's prime minister.
Later that year, with the British Exchequer exhausted by the recently concluded World War II and the Labour government conscious that it had neither the mandate at home, the international support, nor the reliability of native forces needed to continue controlling an increasingly restless British India, the government decided to end British rule of India, and in early 1947 Britain announced its intention of transferring power no later than June 1948.
As independence approached, the violence between Hindus and Muslims in the provinces of Punjab and Bengal continued unabated. With the British army unprepared for the potential for increased violence, the new viceroy, Louis Mountbatten, advanced the date for the transfer of power, allowing less than six months for a mutually agreed plan for independence. With the partition of India, the end of the British rule in India in August 1947 saw the creation of two separate states of India and Pakistan. On 15 August 1947, the new Dominion of Pakistan (later the Islamic Republic of Pakistan), with Muhammad Ali Jinnah as governor-general, and the Dominion of India (later the Republic of India), with Jawaharlal Nehru as prime minister and the viceroy, Louis Mountbatten, staying on as its first governor-general, came into being, with official ceremonies taking place in Karachi on 14 August and New Delhi on 15 August so that Mountbatten could attend both.
The great majority of Indians remained in place with independence, but in border areas millions of people (Muslim, Sikh, and Hindu) relocated across the newly drawn borders. In Punjab, where the new border lines divided the Sikh regions in half, there was much bloodshed; in Bengal and Bihar, where Gandhi's presence assuaged communal tempers, the violence was more limited. In all, somewhere between 250,000 and 500,000 people on both sides of the new borders, among both the refugee and resident populations of the three faiths, died in the violence.
British India and the princely states
India during the British Raj was made up of two types of territory: British India and the Native States (or Princely States). In its Interpretation Act 1889, the British Parliament adopted the following definitions in Section 18:
(4.) The expression "British India" shall mean all territories and places within Her Majesty's dominions which are for the time being governed by Her Majesty through the Governor-General of India or through any governor or other officer subordinate to the Governor-General of India.
(5.) The expression "India" shall mean British India together with any territories of any native prince or chief under the suzerainty of Her Majesty exercised through the Governor-General of India, or through any governor or other officer subordinate to the Governor-General of India.
In general, the term "British India" had been used (and is still used) to refer also to the regions under the rule of the British East India Company in India from 1600 to 1858. The term has also been used to refer to the "British in India". The terms "Indian Empire" and "Empire of India" (like the term "British Empire") were not used in legislation. The monarch was officially known as Empress or Emperor of India, and the term was often used in Queen Victoria's Queen's Speeches and Prorogation Speeches. In addition, an order of knighthood, the Most Eminent Order of the Indian Empire, was set up in 1878.
Suzerainty over 175 princely states, some of the largest and most important, was exercised (in the name of the British Crown) by the central government of British India under the viceroy; the remaining approximately 500 states were dependents of the provincial governments of British India under a governor, lieutenant-governor, or chief commissioner (as the case might have been). A clear distinction between "dominion" and "suzerainty" was supplied by the jurisdiction of the courts of law: the law of British India rested upon the laws passed by the British Parliament and the legislative powers those laws vested in the various governments of British India, both central and local; in contrast, the courts of the Princely States existed under the authority of the respective rulers of those states.
At the turn of the 20th century, British India consisted of eight provinces that were administered either by a governor or a lieutenant-governor. During the partition of Bengal (1905–1913), the new province of Eastern Bengal and Assam was created as a Lieutenant-Governorship. In 1911, East Bengal was reunited with Bengal, and the new provinces in the east became: Assam, Bengal, Bihar and Orissa. In addition, there were a few minor provinces that were administered by a chief commissioner.
A Princely State, also called a Native State or an Indian State, was a British vassal state in India with an indigenous nominal Indian ruler, subject to a subsidiary alliance. There were 565 princely states when India and Pakistan became independent from Britain in August 1947. The princely states did not form a part of British India (i.e. the presidencies and provinces), as they were not directly under British rule. The larger ones had treaties with Britain that specified which rights the princes had; in the smaller ones the princes had few rights. Within the princely states, external affairs, defence and most communications were under British control. The British also exercised a general influence over the states' internal politics, in part through the granting or withholding of recognition of individual rulers. Although there were nearly 600 princely states, the great majority were very small and contracted out the business of government to the British. Some two hundred of the states had an area of less than 25 square kilometres (10 square miles). The last vestige of the Mughal Empire in Delhi, which had been under Company authority prior to the advent of the British Raj, was finally abolished and seized by the Crown in the aftermath of the Sepoy Mutiny of 1857 for its support of the rebellion. The princely states were grouped into agencies and residencies.
Following the Indian Rebellion of 1857 (usually called the Indian Mutiny by the British), the Government of India Act 1858 made changes in the governance of India at three levels: In London, it provided for a cabinet-level Secretary of State for India and a fifteen-member Council of India, whose members were required, as one prerequisite of membership, to have spent at least ten years in India and to have done so no more than ten years before. Although the secretary of state formulated the policy instructions to be communicated to India, he was required in most instances to consult the Council, but especially so in matters relating to spending of Indian revenues. The Act envisaged a system of "double government" in which the Council ideally served both as a check on excesses in imperial policy-making and as a body of up-to-date expertise on India.
However, the secretary of state also had special emergency powers that allowed him to make unilateral decisions, and, in reality, the Council's expertise was sometimes outdated. From 1858 until 1947, twenty-seven individuals served as Secretary of State for India and directed the India Office; these included: Sir Charles Wood (1859–1866), the Marquess of Salisbury (1874–1878; later British prime minister), John Morley (1905–1910; initiator of the Minto–Morley Reforms), E. S. Montagu (1917–1922; an architect of the Montagu–Chelmsford Reforms), and Frederick Pethick-Lawrence (1945–1947; head of the 1946 Cabinet Mission to India). The size of the Advisory Council was reduced over the next half-century, but its powers remained unchanged. In 1907, for the first time, two Indians were appointed to the Council. They were K.G. Gupta and Syed Hussain Bilgrami. In Calcutta, the governor-general remained head of the Government of India and now was more commonly called the viceroy on account of his secondary role as the Crown's representative to the nominally sovereign princely states; he was, however, now responsible to the secretary of state in London and through him to Parliament. A system of "double government" had already been in place during the Company's rule in India from the time of Pitt's India Act of 1784. The governor-general in the capital, Calcutta, and the governor in a subordinate presidency (Madras or Bombay) was each required to consult his advisory council; executive orders in Calcutta, for example, were issued in the name of "Governor-General-in-Council" (i.e. the Governor-General with the advice of the Council). The Company's system of "double government" had its critics, since, from the time of the system's inception, there had been intermittent feuding between the governor-general and his Council; still, the Act of 1858 made no major changes in governance. However, in the years immediately thereafter, which were also the years of post-rebellion reconstruction, Viceroy Lord Canning found the collective decision making of the Council to be too time-consuming for the pressing tasks ahead, so he requested the "portfolio system" of an Executive Council in which the business of each government department (the "portfolio") was assigned to and became the responsibility of a single council member. Routine departmental decisions were made exclusively by the member, but important decisions required the consent of the governor-general and, in the absence of such consent, required discussion by the entire Executive Council. This innovation in Indian governance was promulgated in the Indian Councils Act 1861. If the Government of India needed to enact new laws, the Councils Act allowed for a Legislative Council—an expansion of the Executive Council by up to twelve additional members, each appointed to a two-year term—with half the members consisting of British officials of the government (termed official) and allowed to vote, and the other half, comprising Indians and domiciled Britons in India (termed non-official) and serving only in an advisory capacity. All laws enacted by Legislative Councils in India, whether by the Imperial Legislative Council in Calcutta or by the provincial ones in Madras and Bombay, required the final assent of the secretary of state in London; this prompted Sir Charles Wood, the second secretary of state, to describe the Government of India as "a despotism controlled from home". 
Moreover, although the appointment of Indians to the Legislative Council was a response to calls after the 1857 rebellion, most notably by Sayyid Ahmad Khan, for more consultation with Indians, the Indians so appointed were from the landed aristocracy, often chosen for their loyalty, and far from representative. Even so, the "... tiny advances in the practice of representative government were intended to provide safety valves for the expression of public opinion, which had been so badly misjudged before the rebellion". Indian affairs now also came to be more closely examined in the British Parliament and more widely discussed in the British press. With the promulgation of the Government of India Act 1935, the Council of India was abolished with effect from 1 April 1937 and a modified system of government enacted. The secretary of state for India represented the Government of India in the UK. He was assisted by a body of advisers numbering from 8–12 individuals, at least half of whom were required to have held office in India for a minimum of 10 years, and had not relinquished office earlier than two years prior to their appointment as advisers to the secretary of state. The viceroy and governor-general of India, a Crown appointee, typically held office for five years though there was no fixed tenure, and received an annual salary of Rs. 250,800 p.a. (£18,810 p.a.). He headed the Viceroy's Executive Council, each member of which had responsibility for a department of the central administration. From 1 April 1937, the position of Governor-General in Council, which the viceroy and governor-general concurrently held in the capacity of representing the Crown in relations with the Indian princely states, was replaced by the designation of "HM Representative for the Exercise of the Functions of the Crown in its Relations with the Indian States", or the "Crown Representative". The Executive Council was greatly expanded during the Second World War, and in 1947 comprised 14 members (secretaries), each of whom earned a salary of Rs. 66,000 p.a. (£4,950 p.a.). Until 1946, the viceroy held the portfolio for External Affairs and Commonwealth Relations, as well as heading the Political Department in his capacity as the Crown representative. Each department was headed by a secretary excepting the Railway Department, which was headed by a Chief Commissioner of Railways under a secretary. The viceroy and governor-general was also the head of the bicameral Indian Legislature, consisting of an upper house (the Council of State) and a lower house (the Legislative Assembly). The viceroy was the head of the Council of State, while the Legislative Assembly, which was first opened in 1921, was headed by an elected president (appointed by the Viceroy from 1921 to 1925). The Council of State consisted of 58 members (32 elected, 26 nominated), while the Legislative Assembly comprised 141 members (26 nominated officials, 13 others nominated and 102 elected). The Council of State existed in five-year periods and the Legislative Assembly for three-year periods, though either could be dissolved earlier or later by the Viceroy. The Indian Legislature was empowered to make laws for all persons resident in British India including all British subjects resident in India, and for all British Indian subjects residing outside India. 
With the assent of the King-Emperor and after copies of a proposed enactment had been submitted to both houses of the British Parliament, the Viceroy could overrule the legislature and directly enact any measures in the perceived interests of British India or its residents if the need arose. Effective from 1 April 1936, the Government of India Act created the new provinces of Sind (separated from the Bombay Presidency) and Orissa (separated from the Province of Bihar and Orissa). Burma and Aden became separate Crown Colonies under the Act from 1 April 1937, thereby ceasing to be part of the Indian Empire.
From 1937 onwards, British India was divided into 17 administrations: the three Presidencies of Madras, Bombay and Bengal, and the 14 provinces of the United Provinces, Punjab, Bihar, the Central Provinces and Berar, Assam, the North-West Frontier Province (NWFP), Orissa, Sind, British Baluchistan, Delhi, Ajmer-Merwara, Coorg, the Andaman and Nicobar Islands and Panth Piploda. The Presidencies and the first eight provinces were each under a governor, while the latter six provinces were each under a chief commissioner. The viceroy directly governed the chief commissioner provinces through each respective chief commissioner, while the Presidencies and the provinces under governors were allowed greater autonomy under the Government of India Act. Each Presidency or province headed by a governor had either a provincial bicameral legislature (in the Presidencies, the United Provinces, Bihar and Assam) or a unicameral legislature (in the Punjab, Central Provinces and Berar, NWFP, Orissa and Sind). The governor of each presidency or province represented the Crown in that capacity and was assisted by ministers appointed from the members of each provincial legislature. Each provincial legislature had a life of five years, barring any special circumstances such as wartime conditions. All bills passed by the provincial legislature were either signed or rejected by the governor, who could also issue proclamations or promulgate ordinances while the legislature was in recess, as the need arose. Each province or presidency comprised a number of divisions, each headed by a commissioner and subdivided into districts, which were the basic administrative units and each headed by a district magistrate, collector or deputy commissioner; in 1947, British India comprised 230 districts.
Economy
All three sectors of the economy—agriculture, manufacturing, and services—accelerated in postcolonial India. In agriculture a huge increase in production took place in the 1870s. The most important difference between colonial and postcolonial India was the use of land surplus with productivity-led growth by using high-yielding variety seeds, chemical fertilizers and more intensive application of water. All these three inputs were subsidised by the state. The result was, on average, no long-term change in per capita income levels, though the cost of living had grown higher. Agriculture was still dominant, with most peasants at the subsistence level. Extensive irrigation systems were built, providing an impetus for switching to cash crops for export and for raw materials for Indian industry, especially jute, cotton, sugarcane, coffee and tea. India's global share of GDP fell drastically from above 20% to less than 5% in the colonial period.
Historians have been bitterly divided on issues of economic history, with the Nationalist school (following Nehru) arguing that India was poorer at the end of British rule than at the beginning and that impoverishment occurred because of the British. Mike Davis writes that much of the economic activity in British India was for the benefit of the British economy and was carried out relentlessly through repressive British imperial policies and with negative repercussions for the Indian population. This is exemplified by India's large exports of wheat to Britain: despite a major famine that claimed between 6 and 10 million lives in the late 1870s, these exports remained unchecked. A colonial government committed to laissez-faire economics refused to interfere with these exports or provide any relief.
With the end of the state-granted monopoly of the East India Company in 1813, the importation into India of British manufactured goods, including finished textiles, increased dramatically, from approximately 1 million yards of cotton cloth in 1814 to 13 million in 1820, 995 million in 1870, and 2,050 million by 1890. The British imposed "free trade" on India, while continental Europe and the United States erected stiff tariff barriers ranging from 30% to 70% on the importation of cotton yarn or prohibited it entirely. As a result of the less expensive imports from more industrialized Britain, India's most significant industrial sector, textile production, shrank, such that by 1870–1880 Indian producers were manufacturing only 25%–45% of local consumption. Deindustrialization of India's iron industry was even more extensive during this period.
Jamsetji Tata (1839–1904) began his industrial career in 1877 with the Central India Spinning, Weaving, and Manufacturing Company in Bombay. While other Indian mills produced cheap coarse yarn (and later cloth) using local short-staple cotton and cheap machinery imported from Britain, Tata did much better by importing expensive longer-stapled cotton from Egypt and buying more complex ring-spindle machinery from the United States to spin finer yarn that could compete with imports from Britain. In the 1890s, he launched plans to move into heavy industry using Indian funding. The Raj did not provide capital, but, aware of Britain's declining position against the US and Germany in the steel industry, it wanted steel mills in India. It promised to purchase any surplus steel Tata could not otherwise sell. The Tata Iron and Steel Company (TISCO), now headed by his son Dorabji Tata (1859–1932), began constructing its plant at Jamshedpur in Bihar in 1908, using American technology, not British. According to The Oxford Dictionary of National Biography, TISCO became the leading iron and steel producer in India, and "a symbol of Indian technical skill, managerial competence, and entrepreneurial flair". The Tata family, like most of India's big businessmen, were Indian nationalists but did not trust the Congress because it seemed too aggressively hostile to the Raj, too socialist, and too supportive of trade unions.
British India built a modern railway system in the late 19th century, which was the fourth largest in the world. At first the railways were privately owned and operated, and were run by British administrators, engineers and craftsmen. Initially, only the unskilled workers were Indians.
The East India Company (and later the colonial government) encouraged new railway companies backed by private investors under a scheme that would provide land and guarantee an annual return of up to 5% during the initial years of operation. The companies were to build and operate the lines under a 99-year lease, with the government having the option to buy them earlier. Two new railway companies, the Great Indian Peninsular Railway (GIPR) and the East Indian Railway Company (EIR) began to construct and operate lines near Bombay and Calcutta in 1853–54. The first passenger railway line in North India, between Allahabad and Kanpur, opened in 1859. Eventually, five British companies came to own all railway business in India, and operated under a profit maximization scheme. Further, there was no government regulation of these companies. In 1854, Governor-General Lord Dalhousie formulated a plan to construct a network of trunk lines connecting the principal regions of India. Encouraged by the government guarantees, investment flowed in and a series of new rail companies was established, leading to rapid expansion of the rail system in India. Soon several large princely states built their own rail systems and the network spread to the regions that became the modern-day states of Assam, Rajasthan and Andhra Pradesh. The route mileage of this network increased from 1,349 to 25,495 kilometres (838 to 15,842 mi) between 1860 and 1890, mostly radiating inland from the three major port cities of Bombay, Madras, and Calcutta. After the Sepoy Rebellion in 1857, and subsequent Crown rule over India, the railways were seen as a strategic defense of the European population, allowing the military to move quickly to subdue native unrest and protect Britons. The railway thus served as a tool of the colonial government to control India as they were "an essential strategic, defensive, subjugators and administrative 'tool'" for the Imperial Project. Most of the railway construction was done by Indian companies supervised by British engineers. The system was heavily built, using a broad gauge, sturdy tracks and strong bridges. By 1900 India had a full range of rail services with diverse ownership and management, operating on broad, metre and narrow gauge networks. In 1900, the government took over the GIPR network, while the company continued to manage it. During the First World War, the railways were used to transport troops and grain to the ports of Bombay and Karachi en route to Britain, Mesopotamia, and East Africa.[citation needed] With shipments of equipment and parts from Britain curtailed, maintenance became much more difficult; critical workers entered the army; workshops were converted to making munitions; the locomotives, rolling stock, and track of some entire lines were shipped to the Middle East. The railways could barely keep up with the increased demand. By the end of the war, the railways had deteriorated for lack of maintenance and were not profitable. In 1923, both GIPR and EIR were nationalised. Headrick shows that until the 1930s, both the Raj lines and the private companies hired only European supervisors, civil engineers, and even operating personnel, such as locomotive engineers. The hard physical labor was left to the Indians. The colonial government was chiefly concerned with the welfare of European workers, and any Indian deaths were "either ignored or merely mentioned as a cold statistical figure." 
The government's Stores Policy required that bids on railway contracts be made to the India Office in London, shutting out most Indian firms. The railway companies purchased most of their hardware and parts in Britain. There were railway maintenance workshops in India, but they were rarely allowed to manufacture or repair locomotives. After independence in 1947, forty-two separate railway systems, including thirty-two lines owned by the former Indian princely states, were amalgamated to form a single nationalised unit named the Indian Railways. India provides an example of the British Empire pouring its money and expertise into a very well-built system designed for military purposes (after the Rebellion of 1857), in the hope that it would stimulate industry. The system was overbuilt and too expensive for the small amount of freight traffic it carried. Christensen (1996), who looked at colonial purpose, local needs, capital, service, and private-versus-public interests, concluded that making the railways a creature of the state hindered success because railway expenses had to go through the same time-consuming and political budgeting process as did all other state expenses. Railway costs could therefore not be tailored to the current needs of the railways or of their passengers. The British Raj invested heavily in infrastructure, including canals and irrigation systems. The Ganges Canal reached 560 kilometres (350 miles) from Haridwar to Cawnpore (now Kanpur), and supplied thousands of kilometres of distribution canals. By 1900 the Raj had the largest irrigation system in the world. One success story was Assam, a jungle in 1840 that by 1900 had 1,600,000 hectares (4,000,000 acres) under cultivation, especially in tea plantations. In all, the amount of irrigated land rose eightfold. Historian David Gilmour says: By the 1870s the peasantry in the districts irrigated by the Ganges Canal were visibly better fed, housed and dressed than before; by the end of the century the new network of canals in the Punjab had produced an even more prosperous peasantry there. Historians continue to debate whether the long-term intention of British rule was to accelerate the economic development of India, or to distort and delay it. In 1780, the conservative British politician Edmund Burke raised the issue of India's position: he vehemently attacked the East India Company, claiming that Warren Hastings and other top officials had ruined the Indian economy and society. Indian historian Rajat Kanta Ray (1998) continues this line of attack, saying the new economy brought by the British in the 18th century was a form of "plunder" and a catastrophe for the traditional economy of the Mughal Empire. Ray accuses the British of depleting the food and money stocks and of imposing high taxes that helped cause the terrible Bengal famine of 1770, which killed a third of the people of Bengal. P. J. Marshall shows that recent scholarship has reinterpreted the view that the prosperity of the formerly benign Mughal rule gave way to poverty and anarchy. He argues the British takeover did not make any sharp break with the past, which largely delegated control to regional Mughal rulers and sustained a generally prosperous economy for the rest of the 18th century. Marshall notes the British went into partnership with Indian bankers and raised revenue through local tax administrators and kept the old Mughal rates of taxation. 
The East India Company inherited an onerous taxation system that took one-third of the produce of Indian cultivators. Instead of the Indian nationalist account of the British as alien aggressors, seizing power by brute force and impoverishing all of India, Marshall presents the interpretation (supported by many scholars in India and the West) that the British were not in full control but instead were players in what was primarily an Indian play, in which their rise to power depended upon excellent co-operation with Indian elites. Marshall admits that much of his interpretation is still highly controversial among many historians.
Demography
The population of the territory that became the British Raj was 100 million by 1600 and remained nearly stationary until the 19th century. The population of the Raj reached 255 million according to the first census of India, taken in 1881. Studies of India's population since 1881 have focused on such topics as total population, birth and death rates, growth rates, geographic distribution, literacy, the rural and urban divide, cities of a million, and the three cities with populations over eight million: Delhi, Greater Bombay, and Calcutta. Mortality rates fell in the 1920–1945 era, primarily due to biological immunisation. Other factors included rising incomes and better living conditions, improved nutrition, a safer and cleaner environment, and better official health policies and medical care. Severe overcrowding in the cities caused major public health problems, as noted in an official report from 1938.
Famines, epidemics, and public health
During the British Raj, India experienced a large number of major famines, including the Great Famine of 1876–1878, in which 6.1 million to 10.39 million Indians perished, and the Indian famine of 1899–1900, in which 1.25 to 10 million Indians perished. The first cholera pandemic began in Bengal, then spread across India by 1820. Ten thousand British troops and countless Indians died during this pandemic.[citation needed] Estimated deaths in India between 1817 and 1860 exceeded 15 million. Another 23 million died between 1865 and 1917. The Third plague pandemic, which started in China in the middle of the 19th century, eventually spread to all inhabited continents and killed 10 million people in India alone. Waldemar Haffkine, who mainly worked in India, became the first microbiologist to develop and deploy vaccines against cholera and bubonic plague. In 1925 the Plague Laboratory in Bombay was renamed the Haffkine Institute. Fevers ranked as one of the leading causes of death in India in the 19th century. Britain's Sir Ronald Ross, working in the Presidency General Hospital in Calcutta, finally proved in 1898 that mosquitoes transmit malaria, while on assignment in the Deccan at Secunderabad, where the Centre for Tropical and Communicable Diseases is now named in his honour. In 1881 there were around 120,000 leprosy patients. The central government passed the Lepers Act of 1898, which provided legal provision for forcible confinement of people with leprosy in India. Under the direction of Mountstuart Elphinstone, a program was launched to propagate smallpox vaccination. Mass vaccination in India resulted in a major decline in smallpox mortality by the end of the 19th century. In 1849 nearly 13% of all Calcutta deaths were due to smallpox. Between 1868 and 1907, there were approximately 4.7 million deaths from smallpox.
Sir Robert Grant directed his attention to establishing a systematic institution in Bombay for imparting medical knowledge to the natives. In 1860, Grant Medical College became one of the four recognised colleges for teaching courses leading to degrees (alongside Elphinstone College, Deccan College and Government Law College, Mumbai).
Education
Universities in Calcutta, Bombay, and Madras were established in 1857, just before the Rebellion. By 1890 some 60,000 Indians had matriculated, chiefly in the liberal arts or law. About a third entered public administration, and another third became lawyers. The result was a very well educated professional state bureaucracy. By 1887, of 21,000 mid-level civil service appointments, 45% were held by Hindus, 7% by Muslims, 19% by Eurasians (European father and Indian mother), and 29% by Europeans. Of the 1,000 top-level civil service positions, almost all were held by Britons, typically with an Oxbridge degree. The government, often working with local philanthropists, opened 186 universities and colleges of higher education by 1911; they enrolled 36,000 students (over 90% men). By 1939 the number of institutions had doubled and enrolment reached 145,000. The curriculum followed classical British standards of the sort set by Oxford and Cambridge and stressed English literature and European history. Nevertheless, by the 1920s the student bodies had become hotbeds of Indian nationalism. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Redneck_joke] | [TOKENS: 81] |
Redneck joke
A redneck joke is a joke about rednecks—working-class, rural, southern white Americans. These jokes can be a form of classism, depending on the teller. Jeff Foxworthy is a comedian who specializes in telling redneck jokes. |
======================================== |
[SOURCE: https://github.com/features/actions] | [TOKENS: 501] |
Automate your workflow from idea to production GitHub Actions makes it easy to automate all your software workflows, now with world-class CI/CD. Build, test, and deploy your code right from GitHub. Make code reviews, branch management, and issue triaging work the way you want. Kick off workflows on any GitHub event to automate tasks Linux, macOS, Windows, ARM, GPU, and containers make it easy to build and test all your projects. Run directly on a VM or inside a container. Use your own VMs, in the cloud or on-prem, with self-hosted runners. Save time with matrix workflows that simultaneously test across multiple operating systems and versions of your runtime. GitHub Actions supports Node.js, Python, Java, Ruby, PHP, Go, Rust, .NET, and more. Build, test, and deploy applications in your language of choice. See your workflow run in realtime with color and emoji. It’s one click to copy a link that highlights a specific line number to share a CI/CD failure. Automate your software development practices with workflow files embracing the Git flow by codifying it in your repository. Test your web service and its DB in your workflow by simply adding some docker-compose to your workflow file. Whether you want to build a container, deploy a web service, or automate welcoming new users to your open source projects—there's an action for that. Pair GitHub Packages with Actions to simplify package management, including version updates, fast distribution with our global CDN, and dependency resolution, using your existing GITHUB_TOKEN. GitHub Actions connects all of your tools to automate every step of your development workflow. Securely store and manage your code and packages with GitHub credentials, integrated into your workflows via APIs and webhooks. Enjoy fast, reliable downloads through a global CDN for optimized performance. We take pride in our Open Source legacy, and are happy to provide free CI/CD for public repositories. Check out the doc to see which runners are included. Check out plan details to see how many minutes are included and the pricing table below to see which runners you can use your free minutes on. The future of workflow automation is now |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Baybars] | [TOKENS: 5575] |
Baybars
Al-Malik al-Zahir Rukn al-Din Baybars al-Bunduqdari (Arabic: الْمَلِك الظَّاهِر رُكْن الدِّين بَيْبَرْس الْبُنْدُقْدَارِيّ;[a] 1223/1228 – 30 June 1277), commonly known as Baibars or Baybars (بَيْبَرْس) and nicknamed Abu al-Futuh (أَبُو الْفُتُوح, lit. 'Father of Conquests'), was the fourth Mamluk sultan of Egypt and Syria, of Turkic Kipchak origin, in the Bahri dynasty, succeeding Qutuz. He was one of the commanders of the Muslim forces that inflicted a defeat on the Seventh Crusade of King Louis IX of France. He also led the vanguard of the Mamluk army at the Battle of Ain Jalut in 1260, which marked the first substantial defeat of the Mongol army and is considered a turning point in history. The reign of Baybars marked the start of an age of Mamluk dominance in the Eastern Mediterranean and solidified the durability of their military system. He managed to pave the way for the end of the Crusader presence in the Levant and reinforced the union of Egypt and Syria as the region's pre-eminent Muslim state, able to fend off threats from both Crusaders and Mongols, and even managed to subdue the kingdom of Makuria, which was famous for being unconquerable by previous Muslim empire invasion attempts. As sultan, Baybars also engaged in a combination of diplomacy and military action, allowing the Mamluks of Egypt to greatly expand their empire.
Name and appearance
In his native Turkic language, Baybars' name means "great panther" or "lord panther" (see also Wiktionary: bay "rich person, noble" + pars "leopard, panther"). Possibly based on the Turkic meaning of his name, Baybars used the panther as his heraldic blazon, and placed it on both coins and buildings. The lion/panther used on the bridge built by Baybars near al-Ludd (today's Lod) plays with a rat, which may be interpreted to represent Baybars' Crusader enemies. Baybars was described as a tall man with olive skin and blue eyes. He had broad shoulders, slim legs, and a powerful voice. It was observed that he had a cataract in one eye.
Biography
Baybars was a Kipchak thought to be born in the steppe region north of the Black Sea, known at the time as the Dasht-i Kipchak. There is a discrepancy in Ibn Taghrībirdī's dating of his birth, since he says it took place in 625 AH (12 December 1227 – 29 November 1228) and also that Baybars was about 24 years old in 1247, which would put his birth closer to 1223. He belonged to the Barli tribe. According to a fellow Cuman and eyewitness, Badr al-Din Baysari, the Barli fled the armies of the Mongols, intending to settle in the Second Bulgarian Empire (named in the sources Wallachia). They crossed the Black Sea from either Crimea or Alania, arriving in Bulgaria in about 1242. In the meantime, the Mongols invaded Bulgaria, including the regions where the Cuman refugees had recently settled. Both Baybars, who witnessed his parents being massacred, and Baysari were among the captives during the invasion and were sold into slavery in the Sultanate of Rum at the slave market in Sivas. Afterwards, he was sold in Hama to 'Alā' al-Dīn Īdīkīn al-Bunduqārī [de], an Egyptian of high rank, who brought him to Cairo. In 1247, al-Bunduqārī was arrested and the sultan of Egypt, As-Salih Ayyub, confiscated his slaves, including Baybars. Al-Sha'rani (d. 973/1565) counted him among Ibn 'Arabi's students. In 1250, he took part in the defeat of the Seventh Crusade of Louis IX of France in two major battles.
The first was the Battle of Al Mansurah, where he employed an ingenious strategy in ordering the opening of a gate to let the crusader knights enter the town; the crusaders rushed into the town that they thought was deserted to find themselves trapped inside. They were besieged from all directions by the Egyptian forces and the town population, and suffered heavy losses. Robert of Artois, who took refuge in a house, and William Longespée the Younger were both killed, along with most of the Knights Templar. Only five Templar Knights escaped alive. The second was the Battle of Fariskur which essentially ended the Seventh Crusade and led to the capture of Louis IX. Egyptian forces in that battle were led by Sultan Turanshah, the young son of recently deceased as-Salih Ayyub. Shortly after the victory over the Crusaders, Baybars and a group of Mamluk soldiers assassinated Turanshah, leading to as-Salih Ayyub's widow Shajar al-Durr being named sultana. In 1254, a power shift occurred in Egypt, as Aybak killed Faris ad-Din Aktai, the leader of the Bahri Mamluks. Some of his Mamluks, among them Baybars and Qalawun al-Alfi, fled to an-Nasir Yusuf in Syria, persuading him to break the accord[clarification needed] and invade Egypt. Aybak wrote to an-Nassir Yusuf warning him of the danger of these Mamluks who took refuge in Syria, and agreed to grant him their territorial domains on the coast, but an-Nasir Yusuf refused to expel them and instead returned to them the domains which Aybak had granted. In 1255, an-Nasir Yusuf sent new forces to the Egyptian border, this time with many of Aktai's Mamluks, among them Baybars, and Qalawun al-Alfi, but he was defeated again. In 1257, Baybars and other Bahri Mamluks left Damascus to Jerusalem, where they deposed its governor Kütük and plundered its markets, then they did the same in Gaza. Later on, they fought against the forces of an-Nasir Yusuf at Nablus, then fled to join the forces of al-Mughith Umar [de] in Kerak. The combined forces tried in vain to invade Egypt during the reign of Aybak. Baybars then sent 'Ala al-Din Taybars al-Waziri to discuss with Qutuz his return to Egypt, which was eagerly accepted. He was still a commander under sultan Qutuz at the Battle of Ain Jalut in 1260, when he decisively defeated the Mongols. After the battle, Sultan Qutuz (aka Koetoez) was assassinated while on a hunting expedition. It was said that Baybars was involved in the assassination because he expected to be rewarded with the governorship of Aleppo for his military success, but Qutuz, fearing his ambition, refused to give him the post. Baybars succeeded Qutuz as Sultan of Egypt. Soon after Baybars had ascended to the Sultanate, his authority was confirmed without any serious resistance, except from Alam al-Din Sinjar al-Halabi, another Mamluk amir who was popular and powerful enough to claim Damascus. Also, the threat from the Mongols was still serious enough to be considered as a threat to Baybars' authority. However, Baybars first chose to deal with Sinjar, and marched on Damascus. At the same time the princes of Hama and Homs proved able to defeat the Mongols in the First Battle of Homs, which lifted the Mongol threat for a while. On 17 January 1261, Baybars's forces were able to rout the troops of Sinjar outside Damascus, and pursued the attack to the city, where the citizens were loyal to Sinjar and resisted Baybars, although their resistance was soon crushed. There was also a brief rebellion in Cairo led by a leading figure of the Shiite named al-Kurani. 
Al-Kurani is said to have originated from Nishapur. Al-Kurani and his followers are recorded to have attacked the weapon stores and stables of Cairo during a night raid. Baybars, however, managed to suppress the rebellion quickly, surrounding and arresting them all. Al-Kurani and other rebel leaders were executed (crucified) at Bab Zuweila. After suppressing the revolt of Sinjar, Baybars then managed to deal with the Ayyubids, while quietly eliminating the prince of Kerak. Ayyubids such as Al-Ashraf Musa, Emir of Homs, and the Ayyubid emir of Hama, Al-Mansur Muhammad II, who had earlier staved off the Mongol threat, were permitted to continue their rule in exchange for their recognizing Baybars' authority as Sultan. After the Abbasid caliphate in Iraq was overthrown by the Mongols in 1258 when they conquered and sacked Baghdad, the Muslim world lacked a caliph, a theoretically supreme leader who had sometimes used his office to endow distant Muslim rulers with legitimacy by sending them writs of investiture. Thus, when the Abbasid refugee Abu al-Qasim Ahmad, the uncle of the last Abbasid caliph al-Musta'sim, arrived in Cairo in 1261, Baybars had him proclaimed caliph as al-Mustansir II and duly received investiture as sultan from him. Unfortunately, al-Mustansir II was killed by the Mongols during an ill-advised expedition to recapture Baghdad from the Mongols later in the same year. In 1262, another Abbasid, allegedly the great-great-great-grandson of the Caliph al-Mustarshid, Abu al-'Abbas Ahmad, who had survived from the defeated expedition, was proclaimed caliph as al-Hakim I, inaugurating the line of Abbasid caliphs of Cairo that continued as long as the Mamluk sultanate, until 1517. Like his unfortunate predecessor, al-Hakim I also received the formal oath of allegiance of Baybars and provided him with legitimation. While most of the Muslim world did not take these caliphs seriously, as they were mere instruments of the sultans, they still lent a certain legitimation as well as a decorative element to their rule. According to ʿIzz al-Dīn ibn Shaddād, the Mongols tried to conquer al-Bira in 1264; after a brutal fight with the Mamluks and Sultan Baybars, the Mongols retreated beyond the Euphrates. As sultan, Baybars engaged in a lifelong struggle against the Crusader kingdoms in Syria, in part because the Christians had aided the Mongols. He started with the Principality of Antioch, which had become a vassal state of the Mongols and had participated in attacks against Islamic targets in Damascus and Syria. In 1263, Baybars laid siege to Acre, the capital of the remnant of the Kingdom of Jerusalem, although the siege was abandoned when he sacked Nazareth instead. He used siege engines to defeat the Crusaders in battles such as the Fall of Arsuf from 21 March to 30 April. After breaking into the town he offered free passage to the defending Knights Hospitallers if they surrendered their formidable citadel. The Knights accepted Baybars' offer but were enslaved anyway. Baybars razed the castle to the ground. He next attacked Atlit and Haifa, capturing both towns after destroying the crusaders' resistance, and razed the citadels. In the same year, Baybars laid siege to the fortress of Safed, held by the Templar knights, which had been conquered by Saladin in 1188 but returned to the Kingdom of Jerusalem in 1240. Baybars promised the knights safe passage to the Christian town of Acre if they surrendered their fortress. Badly outnumbered, the knights agreed.
On capturing Safed, Baybars did not raze the fortress to the ground but fortified and repaired it instead, as it was strategically situated and well constructed. He installed a new governor in Safed, with the rank of Wali. Later, in 1266, Baybars invaded the Christian country of Cilician Armenia, which, under King Hethum I, had submitted to the Mongol Empire. After defeating the forces of Hethum I in the Battle of Mari, Baybars managed to ravage the three great cities of Mamistra, Adana and Tarsus, so that when Hetoum arrived with Mongol troops, the country was already devastated. Hetoum had to negotiate the return of his son Leo by giving control of Armenia's border fortresses to the Mamluks. In 1269, Hetoum abdicated in favour of his son and became a monk, but he died a year later. Leo was left in the awkward situation of keeping Cilicia as a subject of the Mongol Empire, while at the same time paying tribute to the Mamluks. This isolated Antioch and Tripoli, led by Hethum's son-in-law, Prince Bohemond VI. After successfully conquering Cilicia, Baybars in 1267 settled his unfinished business with Acre, and continued the extermination of remaining crusader garrisons in the following years. In 1268, he besieged Antioch, capturing the city on 18 May. Baybars had promised to spare the lives of the inhabitants, but he broke his promise and had the city razed, killing or enslaving much of the population after the surrender, prompting the fall of the Principality of Antioch. The massacre of men, women, and children at Antioch "was the single greatest massacre of the entire crusading era." Priests had their throats slit inside their churches, and women were sold into slavery. Then he continued to Jaffa, which belonged to Guy, the son of John of Ibelin. Jaffa fell to Baybars on 7 March after twelve hours of fighting; most of Jaffa's citizens were slain, but Baybars allowed the garrison to go unharmed. After this he conquered Ashkalon and Caesarea. In 1272 the Mongol Ilkhanate attempted to besiege Al-Bira but failed when Sultan Baybars attacked the Mongols with a relief force of about 5,000 men; after that the Mongols retreated beyond the Euphrates. Baybars actively pursued a close relationship with Berke, the Khan of the Golden Horde. In particular, he is recorded as warmly receiving the first two hundred soldiers from the Golden Horde to visit, persuading them to convert to Islam while also observing the growing enmity between the Golden Horde Khan and Hulagu. Baybars, who at that time had just defeated Hulagu, immediately sent an envoy to Berke to inform him of this. As soon as Berke converted to Islam, he sent an envoy to Egypt with news of the conversion, and later Baybars brought more people from the Golden Horde into Egypt, where they also converted to Islam. Some time around October to November 1267, or about Safar 666 of the Hijra year, Baybars wrote condolences and congratulations to the new Khan of the Golden Horde, Mengu-Timur, urging him to fight Abaqa. Baybars continued to conduct warm correspondence with the Golden Horde, particularly with Mengu Timur's general Noqai, who unlike Mengu Timur was very cooperative with Baybars. It is theorized that this intimacy was not only due to the religious connection (as Noqai was a Muslim, unlike his Khan), but also because Noqai was not really fond of Mengu-Timur.
However, Baybars was pragmatic in his approach and did not want to become involved in complicated intrigue inside the Golden Horde, so instead he stayed close to both Mengu Timur and Noqai. On 30 March 1271, after Baybars captured the smaller castles in the area, including Chastel Blanc, he besieged the Krak des Chevaliers, held by the Hospitallers. Peasants who lived in the area had fled to the castle for safety and were kept in the outer ward. As soon as Baybars arrived, he began erecting mangonels, powerful siege weapons which he would turn on the castle. According to Ibn Shaddad, two days later the first line of defences was captured by the besiegers; he was probably referring to a walled suburb outside the castle's entrance. After a lull of ten days, the besiegers conveyed a letter to the garrison, supposedly from the Grand Master of the Knights Hospitaller in Tripoli, Hugues de Revel, which granted permission for them to surrender. The garrison capitulated and the Sultan spared their lives. The new owners of the castle undertook repairs, focused mainly on the outer ward. The Hospitaller chapel was converted to a mosque and two mihrabs were added to the interior. Baibars also led a campaign against the Assassins. In 1271, his forces seized al-'Ullaiqah and ar-Rusafa, after taking Masyaf the year before. Later in the year, Shams ad-Din surrendered and was deported to Egypt. Qala'at al-Khawabi fell that year and within two years Gerdkuh and all of the Assassin fortresses were held by the sultan. With the Assassins under his control, Baibars was able to use them to counter the forces arriving in the Ninth Crusade.[citation needed] Baybars then turned his attention to Tripoli, but he interrupted his siege there to call a truce in May 1271. The fall of Antioch had led to the brief Ninth Crusade, led by Prince Edward of England, who arrived in Acre in May 1271 and attempted to ally himself with the Mongols against Baybars. So Baybars declared a truce with Tripoli, as well as with Edward, who was never able to capture any territory from Baybars anyway. According to some reports, Baybars tried to have Edward assassinated with poison, but Edward survived the attempt and returned home in 1272 following the failure of the crusade.[citation needed] In 1265 a Mamluk army allegedly raided Makuria as far south as Dongola while also expanding southwards along the African Red Sea coast, thus threatening the Nubians. In 1272 King David marched east and attacked the port town of Aidhab, located on an important pilgrimage route to Mecca. The Nubian army destroyed the town, causing “a blow to the very heart of Islam”. This initiated several decades of intervention by the Mamluks in Nubian affairs. A punitive Mamluk expedition was sent in response, but did not pass beyond the second cataract. Three years later the Makurians attacked and destroyed Aswan, but this time, Baybars responded with a well-equipped army setting off from Cairo in early 1276, accompanied by a cousin of King David named Mashkouda or Shekanda. The Mamluks defeated the Nubians in three battles at Gebel Adda, Meinarti and finally at the Battle of Dongola. David fled upstream along the Nile, eventually entering al-Abwab in the south, which, previously being Alodia's northernmost province, had by this period become a kingdom of its own. The king of al-Abwab, however, handed David over to Baybars, who had him executed. Baybars then completed his conquest of Nubia, including medieval Lower Nubia, which was ruled by the Banu Kanz.
Under the terms of the settlement, the Nubians were now required to pay jizya tribute, and in return they were allowed to keep their religion, being protected under Islamic law as 'People of the Book'; they were also allowed to continue being governed by a king from the native royal family, although this king was chosen personally by Baybars, namely a Makurian noble named Shakanda. In practice this reduced Makuria to a vassal kingdom, effectively ending its status as an independent kingdom. In 1277, Baybars invaded the Seljuq Sultanate of Rûm, then controlled by the Ilkhanate Mongols. He defeated an Ilkhanate army at the Battle of Elbistan and captured the city of Kayseri. Baybars himself went with a few troops to deal with the Mongol right flank that was pounding his left wing. Baybars ordered a force from the army of Hama to reinforce his left. The large Mamluk numbers were able to overwhelm the Mongol force, who instead of retreating dismounted from their horses. Some Mongols were able to escape and took up positions on the hills. Once they became surrounded they once again dismounted, and fought to the death. During the celebration of victory, Baybars said: "How can I be happy? Before I had thought that I and my servants would defeat the Mongols, but my left wing was beaten by them. Only Allah helped us". The possibility of a new Mongol army convinced Baybars to return to Syria, since he was far away from his bases and supply line. As the Mamluk army returned to Syria the commander of the Mamluk vanguard, Izz al-Din Aybeg al-Shaykhi, deserted to the Mongols. Pervâne sent a letter to Baybars asking him to delay his departure. Baybars chastised him for not aiding him during the Battle of Elbistan. Baybars told him he was leaving for Sivas to mislead Pervâne and the Mongols as to his true destination. Baybars also sent Taybars al-Waziri with a force to raid the Armenian town of al-Rummana, whose inhabitants had hidden the Mongols earlier. Baybars died in Damascus on 30 June 1277, at the age of 53. His demise has been the subject of some academic speculation. Many sources agree that he died from drinking poisoned kumis that was intended for someone else. Other accounts suggest that he may have died from a wound while campaigning, or from illness. He was buried in the Az-Zahiriyah Library in Damascus.
Family
Sultan Baybars married a noble lady from Tripoli (modern-day Lebanon) named Aisha al Bushnatiya, from a prominent Arab family. Aisha was a warrior who fought the Crusaders along with her brother, Lieutenant Hassan. She met Sultan Baybars after he camped in Tripoli during his siege.[citation needed] They had a brief relationship and then married. There are conflicting stories of whether Aisha returned with Baybars to Egypt or was martyred in Tripoli.[citation needed] One of Baibars' wives was the daughter of Amir Sayf ad-Din Nogay at-Tatari. Another wife was the daughter of Amir Sayf ad-Din Giray at-Tatari. Another wife was the daughter of Amir Sayf ad-Din Tammaji. Another wife was Iltutmish Khatun. She was the daughter of Barka Khan, a former Khwarazmian amir. She was the mother of his son Al-Said Barakah. She died in 1284–85. Another wife was the daughter of Karmun Agha, a Mongol amir. He had three sons: al-Said Barakah, Solamish and Khizir. He had seven daughters; one of them was named Tidhkarbay Khatun.
Legacy
As the first Sultan of the Bahri Mamluk dynasty, Baybars made a meritocratic ascent through the ranks of Mamluk society, commanding Mamluk forces in the decisive Battle of Ain Jalut in 1260 and repelling Mongol forces from Syria. Although in the Muslim world he has been considered a national hero for centuries, and in the Near East and Kazakhstan is still regarded as such, Baybars was reviled in the Christian world of the time for his successful campaigns against the Crusader States. Baybars also played an important role in bringing the Mongols to Islam. He developed strong ties with the Mongols of the Golden Horde and took steps for the Golden Horde Mongols to travel to Egypt. The arrival of the Golden Horde Mongols in Egypt resulted in a significant number of Mongols accepting Islam. Baybars was a popular ruler in the Muslim world who had defeated the crusaders in three campaigns, and the Mongols in the Battle of Ain Jalut, which many scholars deem of great macro-historical importance. In order to support his military campaigns, Baybars commissioned arsenals, warships and cargo vessels. He was also arguably the first to employ explosive hand cannons in war, at the Battle of Ain Jalut. However, this claim of hand cannon usage is disputed by other historians, who hold that hand cannons did not appear in the Middle East until the 14th century. His military campaigns also extended into Libya and Nubia. He was also an efficient administrator who took interest in building various infrastructure projects, such as a mounted message relay system capable of delivery from Cairo to Damascus in four days. He built bridges, irrigation and shipping canals, improved the harbours, and built mosques. He was a patron of Islamic science, as shown by his support for the medical research of his Arab physician, Ibn al-Nafis. As a testament to the special relationship between Islam and cats, Baybars left a cat garden in Cairo as a waqf, providing the cats of Cairo with food and shelter. His memoirs were recorded in Sirat al-Zahir Baibars ("Life of al-Zahir Baibars"), a popular Arabic romance recording his battles and achievements. He has a heroic status in Kazakhstan, as well as in Egypt, Palestine, Lebanon and Syria. Al-Madrassa al-Zahiriyya is the school built adjacent to his Mausoleum in Damascus.[citation needed] The Az-Zahiriyah Library has a wealth of manuscripts in various branches of knowledge to this day. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/XAI_(company)#cite_note-70] | [TOKENS: 1856] |
xAI (company)
X.AI Corp., doing business as xAI, is an American artificial intelligence (AI), social media and technology company that is a wholly owned subsidiary of the American aerospace company SpaceX. The company was founded by Elon Musk in 2023; its flagship products are the generative AI chatbot named Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025.
History
xAI was founded on March 9, 2023, by Musk. For Chief Engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment". By May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024, Musk was diverting a large number of Nvidia chips that had been ordered by Tesla, Inc. to X and xAI. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired its sister company X Corp., the developer of social media platform X (formerly known as Twitter), which was previously acquired by Musk in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt. Meanwhile, xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that they had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans. Morgan Stanley took no stake in the debt. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, xAI announced "Grok for Government" and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot designed to advance artificial intelligence capabilities. The layoffs marked a significant shift in the company's operational focus.
On November 26, 2025, Elon Musk announced his plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, which is 10% of the data center's estimated power use. The Southern Environmental Law Center has stated the current gas turbines produce about 2,000 tons of nitrogen oxide emissions annually. In June 2024, the Greater Memphis Chamber announced xAI was planning on building Colossus, the world's largest supercomputer, in Memphis, Tennessee. After 122 days of construction, the supercomputer became fully operational in December 2024. Local government in Memphis has voiced concerns regarding the increased usage of electricity (150 megawatts of power at peak). While the agreement with the city is being worked out, the company has deployed 14 VoltaGrid portable methane-gas-powered generators to temporarily enhance the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of gases causing air pollution, and that xAI has been operating the turbines illegally without the necessary permits. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators generate about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. xAI has continually expanded its infrastructure, with the purchase of a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power. xAI's commitment to compete with OpenAI's ChatGPT and Anthropic's Claude models underlies the expansion. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that structured xAI as a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring reorganised xAI into four primary development teams: one for the Grok app and others for features such as Grok Imagine. Grokipedia, X and API features would fall under smaller teams.
Products
According to Musk in July 2023, a politically correct AI would be "incredibly dangerous" and misleading; he cited as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking". Musk also said that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot that is integrated with X. xAI stated that when the bot was out of beta, it would only be available to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced.[non-primary source needed] On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API). On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature. xAI also introduced a web-search function called DeepSearch. In March 2025, xAI added an image editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4. A high-performance version of the model, called Grok Heavy, was also unveiled; access at the time cost $300 per month. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release a great AI-generated game before the end of 2026. |
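The article mentions the API release without showing usage, so here is a minimal sketch only, not taken from the article: it assumes the API exposes an OpenAI-compatible chat-completions endpoint at https://api.x.ai/v1, and the XAI_API_KEY environment variable name and the "grok-4" model identifier are likewise assumptions to verify against xAI's current documentation.

# Hypothetical sketch: query the xAI API through the OpenAI-compatible Python client.
# The base URL, environment variable name, and model name below are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],   # assumed variable name for the xAI key
    base_url="https://api.x.ai/v1",      # assumed OpenAI-compatible base URL
)

response = client.chat.completions.create(
    model="grok-4",                      # assumed model identifier
    messages=[{"role": "user", "content": "Summarize what xAI is in one sentence."}],
)
print(response.choices[0].message.content)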
======================================== |
[SOURCE: https://github.com/features/codespaces] | [TOKENS: 1016] |
Secure development made simple GitHub Codespaces gets you up and coding faster with fully configured, secure cloud development environments native to GitHub. Secure by design Created with security in mind, Codespaces provides a secure development environment through its built-in capabilities and native integration with GitHub. Collaborate where you code Codespaces provides a shared development environment and removes the need for complex, time-consuming setups. Your space, your way. Codespaces is a home away from home for your code that feels just like your usual machine. Browser preview and port forwarding Preview your changes and get feedback from teammates by sharing ports within the scope allowed by policy. Onboard faster Quickly spin up a codespace with only an IDE or browser and a GitHub account. With a few configuration files, you can give your developers an instant, fully configured, and secure development environment so they can start coding immediately. What you can do with Codespaces Code from any device. Want to code on an iPad? Go for it. Spin up Codespaces from any device with internet access. Don't worry about whether your device is powerful enough—Codespaces lives in the cloud. Onboard at the speed of thought. No more building your dev environment while you onboard. Codespaces launches instantly from any repository on GitHub with pre-configured, secure environments. Fix bugs right from a pull request. Got a pull request detailing a bug or security issue? Open Codespaces right from the pull request without waiting for your dev environment to load. Start coding in seconds with Codespaces A codespace is a development environment that's hosted in the cloud. Customize your project for GitHub Codespaces by adding dev container files to your repository (often known as configuration-as-code), which creates a repeatable codespace configuration for all users of your project. GitHub Codespaces runs on various VM-based compute options hosted by GitHub.com, which you can configure from 2-core machines up to 32-core machines. Connect to your codespaces from the browser or locally using an IDE like Visual Studio Code or IntelliJ. There are a number of entry points to spin up a Codespaces environment, including: a template; your repository, for new feature work; an open pull request, to explore work-in-progress; a commit in the repository's history, to investigate a bug at a specific point in time; or Visual Studio Code. In beta, you can also use your JetBrains IDE or JupyterLab. Learn more about how to use Codespaces in our documentation. Codespaces is available for developers in every organization, and under the control of the organization that pays for the user's codespace. All personal (individual) GitHub.com accounts include a quota of free usage each month, which organizations can enable (see the next question) for their private and internal repositories. GitHub will provide users on the free plan with 120 core hours, or 60 hours of run time on a 2-core codespace, plus 15 GB of storage each month. See how it's balanced on the billing page.
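As a rough worked example of how that free quota scales with machine size (a sketch only, assuming usage is metered as cores multiplied by hours, which is what the 2-core figure above implies):

\[
\text{hours of runtime} = \frac{120\ \text{core-hours}}{\text{cores per machine}}, \qquad
\frac{120}{2} = 60\ \text{hours on a 2-core machine}, \qquad
\frac{120}{4} = 30\ \text{hours on a 4-core machine.}
\]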
Codespaces is available for teams and companies, but needs to be enabled first in an organization’s settings. Teams and companies can select which repositories and users have access to Codespaces for added security and permissioning control. Learn how to enable Codespaces in an organization in our docs. Codespaces is free for individual use up to 60 hours a month and comes with simple, pay-as-you-go pricing after that. It’s also available for organizations with pay-as-you-go pricing and has pricing controls so any company or team can determine how much they want to spend a month. Learn more about Codespaces pricing for organizations here. Codespaces cannot be self-hosted. You can use Codespaces directly through LinkedIn Learning. LinkedIn Learning offers 50+ courses across six of the most popular coding languages, as well as data science and machine learning. These courses are integrated with Codespaces, so you can get hands-on practice anytime, from any machine via LinkedIn. These courses will be unlocked on LinkedIn Learning for free through Feb. 2023. Learn more about LinkedIn Learning and GitHub Codespaces here. Codespaces is on by default for developers with a GitHub free account. If you belong to an organization, there may be a policy that prevents cloning—but if you can clone a repository, you will be able to start using Codespaces. Organizations will also need to pay for, enable, and manage their Codespaces instances. Codespaces is available for free to students as part of the GitHub Student Developer Pack. Learn more about how to sign up and start using Codespaces and other GitHub products here. Codespaces provides both maintainers and contributors with generous free monthly usage. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Expulsion_of_Jews_from_Spain] | [TOKENS: 11609] |
Contents Expulsion of Jews from Spain On 31 March 1492, the Catholic Monarchs of Spain, King Ferdinand II of Aragon and Queen Isabella I of Castile, issued the Alhambra Decree, ordering all unconverted Jews to leave their kingdoms and territories by the end of July that year, unless they converted to Christianity. Motivated by a desire for religious unity following the completion of the Reconquista and amid fears that unconverted Jews were influencing conversos (Jewish converts to Christianity) to revert to Judaism, the decree brought to an end more than a millennium of Jewish presence in the Iberian Peninsula. It also ranks among the most consequential events in Spanish and Jewish history. In the decades before 1492, successive crises had already thinned Spain's Jewish population through violence, forced conversion, and legal discrimination. In the aftermath of the 1391 massacres, large numbers of Jews converted to Catholicism. Continued attacks produced about 50,000 additional conversions by 1415. Authorities suspected that some conversos continued to practice Judaism in secret; concerns over such "Judaizing" helped motivate the creation of the Spanish Inquisition in 1478, which investigated cases of heresy and, in some instances, used torture and imposed penalties up to execution for the unrepentant. Growing limpieza de sangre ("purity-of-blood") statutes in the 15th century further stigmatized "New Christians" of Jewish descent. After the fall of Granada in January 1492, the Catholic Monarchs, urged on by Grand Inquisitor Tomás de Torquemada, moved from consolidating territorial unity to enforcing religious uniformity, culminating in the Alhambra Decree of March 31. Many of those who remained decided to convert to avoid expulsion. Modern estimates generally place the number expelled between 40,000 and 200,000; figures remain debated. An unknown number returned to Spain in the following years. Following the expulsions many first crossed into Portugal (1492–96) before facing forced conversion in 1497; others moved to Navarre, which expelled its Jews in 1498. The Inquisition continued to prosecute suspected crypto-Jews for centuries in Spain and across its overseas tribunals, conducting autos-da-fé well into the 18th century. The mass expulsion created a large Sephardic Jewish diaspora across the Mediterranean, in North Africa, the Italian states, and especially the Ottoman Empire (notably Istanbul, Salonika, İzmir, Sarajevo, Jerusalem and Safed). Smaller streams reached southern France, and in the 16th–17th centuries Antwerp, Amsterdam, Hamburg, and London attracted merchant families. Many communities preserved Judaeo-Spanish (Ladino) for centuries. In 1924, the regime of Miguel Primo de Rivera granted Spanish citizenship to a part of the Sephardic Jewish diaspora. The edict was formally and symbolically revoked on December 16, 1968, following the Second Vatican Council, by the regime of Francisco Franco. This occurred a full century after Jews had openly begun to practice their religion in Spain and synagogues were once more legal places of worship under Spain's Laws of Religious Freedom. In 2015, the Cortes Generales of Spain passed a law whereby the descendants of Sephardic Jews could obtain Spanish nationality by naturalisation to "compensate for shameful events in the country's past." Jews who could prove that they are the descendants of those expelled from Spain due to the Alhambra Decree could "become Spaniards without leaving home or giving up their present nationality." 
The deadline to apply was October 1, 2019.
Background
Jews have lived in the Iberian Peninsula since at least the Roman period. Under the Visigothic kingdom (6th–7th centuries), church councils at Toledo promulgated increasingly harsh anti-Jewish laws, including episodes of forced baptism. Later, upon the Islamic conquest, the Jewish population came under the Islamic rule of Al-Andalus. María Rosa Menocal wrote that Jews under Muslim rule were dhimmis with reduced rights relative to Muslims, but were still generally in a better position than other European Jews living under Christian rule. Mark R. Cohen, whose book Under Cross and Crescent reviewed the treatment of medieval Jews, writes that while there were significant restrictions on Jews in Islamic states, they did not suffer serfdom, supersessionist oppression, and demonisation like the Jews of Christendom. Darío Fernández-Morera states that the supposed harmony between Jews and Muslims in Spain was an exaggeration that proliferated in the 19th century. However, this position has been strongly criticized as far-right Islamophobia, with critics arguing that Fernández-Morera uses the case of medieval Spain to further an explicitly extreme right-wing and conservative Christian political and cultural agenda as it bears upon debates about politics, the establishment of religion and the very place of the academy in civic life. On January 2, 1492, the Catholic Monarchs of Spain conquered the Emirate of Granada. The last Muslim king, Muhammad XII (Spanish: Boabdil), withdrew to the Alpujarras in an event that ended the Reconquista, although the idea of a continuous or tightly orchestrated Reconquista has been challenged by modern scholars. A letter sent by the Catholic Monarchs to the Bilbao City Council in 1490 stated that under canon law and the laws of the kingdoms, Jews were tolerated and allowed to live in the kingdoms as subjects and vassals. Joseph Pérez considers that "the myth of the 'Spain of the three cultures,' widely used as an element of propaganda, is so far detached from historical reality that it can only generate new elements of confusion." In Christian kingdoms, according to Henry Kamen, both Jews and Muslims were treated "with contempt" and the three communities "lived separate existences." In Muslim kingdoms, on the other hand, Christians and Jews were required to pay a tax to profess their religion. In the twelfth and thirteenth centuries, Christian anti-Judaism in the West had intensified, which was reflected in the harsh anti-Jewish measures agreed at the Fourth Council of the Lateran called in 1215 by Pope Innocent III. The peninsular Christian kingdoms were not at all oblivious to the growth of increasingly belligerent anti-Judaism – the Castilian statutory code of Siete Partidas stated that the Jews lived among Christians "so that their presence reminds them that they descend from those who crucified Our Lord Jesus Christ", but the kings continued to "protect" the Jews for the important role they played in their kingdoms. In the fourteenth century, the period of relative tolerance towards the Jews ended, passing into a phase of increasing conflict. What changed was not mentalities but circumstances. The good times of the Spain of the three religions had coincided with a phase of territorial, demographic, and economic expansion, in which Jews and Christians did not compete in the labor market: both the former and the latter had contributed to the general prosperity and shared in its benefits.
The militant anti-Judaism of the Church and the mendicant orders barely found an echo. The social, economic and political changes of the 14th century, however, including the wars and natural disasters that preceded and followed the Black Plague, created a new situation. [...] [The people] believed they were victims of a curse, punished for sins they must have committed. The clergy invited the faithful to repent, change their behavior, and return to God. It was then that the presence of the "deicidal people" among Christians came to be considered scandalous. The first wave of violence against the Jews on the Iberian Peninsula occurred in the Kingdom of Navarre as a consequence of the arrival of the Shepherds' Crusade across the Pyrenees in 1321. The Jewish communities of Pamplona and Estella-Lizarra were massacred. Two decades later, the impact of the Black Death of 1348 provoked assaults on the Jewish quarters (juderías) of several towns, especially Barcelona and other places in the Principality of Catalonia. In the Crown of Castile, anti-Jewish violence was closely related to the civil war during the reign of Peter of Castile. In this conflict, the side supporting Enrique de Trastámara (later King Henry II of Castile) used anti-Judaism as a propaganda weapon, and the pretender to the throne accused his stepbrother, Peter of Castile, of favoring the Jews. The first slaughter of Jews, in Toledo in 1355, was carried out by the supporters of Enrique de Trastámara when they entered the city. The same happened eleven years later when they occupied Briviesca. In Burgos, Jews who could not pay the large tribute imposed on them in 1366 were enslaved and sold. In 1367 in Valladolid, Jews were assaulted to shouts of "Long live King Henry!" There were no deaths, but the synagogues were burned down. The great catastrophe for the Jews took place in 1391, when the communities of Castile and the Crown of Aragon were massacred. The assaults, fires, looting, and slaughter began in June, in Seville, where Ferrand Martínez, archdeacon of Écija, took advantage of the power vacuum created by the death of the archbishop of Seville. Hardening the preaching against the Jews that he had begun in 1378, he ordered the demolition of synagogues and the seizure of prayer books. In January 1391, an assault on the Jewish quarter was averted by the municipal authorities, but in June, hundreds of Jews were murdered, their houses were ransacked, and their synagogues were converted into churches. Some Jews managed to escape; others, terrified, asked to be baptized. From Seville, anti-Jewish violence extended throughout Andalusia, and then towards other parts of Castile. In August, it reached the Crown of Aragon. Murders, looting, and fires occurred everywhere. The Jews who managed to survive either fled, many seeking refuge in the kingdoms of Navarre, Portugal and France, and in North Africa, or chose baptism to avoid death. It is difficult to be certain of the number of victims. In Barcelona some 400 Jews were murdered; in Valencia, 250; and in Lérida, 68. After the massacres of 1391, anti-Jewish measures were intensified. In Castile in 1412, Jewish men had to let their beards grow, and Jews were required to wear a distinctive red badge sewn to their clothes, so they could be recognized. In the Crown of Aragon, possession of the Talmud was declared unlawful, and the number of synagogues was limited to one per Jewish community (aljama).
In addition, the mendicant orders intensified their campaign of proselytism to make Jews convert to Christianity. The Dominican Vincent Ferrer of Valencia played a prominent role in this campaign, which had the support of the monarchs. In the Crown of Aragon, it was decreed that Jews were obligated to attend three sermons a year. As a result of the massacres of 1391 and the measures that followed, by 1415 more than half of the Jews of the crowns of Castile and Aragon had renounced Mosaic law and had been baptized, including many rabbis and important members of the community. After the massacres of 1391 and the preaching that followed them, by 1415 scarcely 100,000 Jews continued to practice their religion in the crowns of Castile and Aragon. The historian Joseph Pérez explains that "Spanish Judaism [would] never recover from this catastrophe." The Jewish community "came out of the crisis not only physically diminished but morally and intellectually shattered". In the Crown of Aragon, Judaism virtually disappeared in important places such as Barcelona, Valencia, and Palma – in 1424 the Barcelona Jewry was abolished because it was considered unnecessary – and only the one in Zaragoza remained. In Castile, once-flourishing aljamas such as those of Seville, Toledo, and Burgos lost many of their members; in 1492, the year of the expulsion, only a quarter of the former number of Jews remained in the Crown of Aragon. The famous Jewish community of Gerona, for example, was left with only 24 families. In the Crown of Castile, there were fewer than 80,000. In Seville, before the revolts of 1391, there had been about 500 Jewish families. According to Joseph Pérez, at the time of the expulsion there were fewer than 150,000 Jews, distributed in 35 aljamas of the Crown of Aragon and 216 in the Crown of Castile. In both Crowns, the Jews had largely left the great cities and lived in small towns and rural areas, less exposed "to the excesses of the Christians." After the critical period of 1391–1415, the pressure on the Jews decreased; they were able to recover their confiscated synagogues and books, and to avoid certain obligations such as wearing the red badge or attending the friars' sermons. They were also able to reconstruct the internal organization of the aljamas and their religious activities, thanks to the agreements reached by the procurators of the aljamas gathered in Valladolid in 1432 and sanctioned by the king, which meant that "the Crown of Castile accepts again officially that a minority of its subjects has another religion than the Christian one and recognizes the right of this minority to exist legally, with a legal status." "In this way, the Jewish community is rebuilt with the approval of the crown." Abraham Benveniste, who presided over the meeting of Valladolid, was appointed court rabbi, with authority over all the Jews of the kingdom, acting at the same time as the king's delegate over them. During the reign of the Catholic Monarchs, in the last quarter of the 15th century, many Jews lived in rural villages and engaged in agricultural activities. They did not monopolize crafts and trade – international trade had passed into the hands of conversos. While Jews continued to work as money-lenders, the number of Christian lenders had increased by a large percentage. Jews also continued to collect royal, ecclesiastical, and seigniorial rents, but their importance there had also diminished – in Castile they were only in charge of a quarter of the revenues.
However, in the court of Castile – but not in the Crown of Aragon – Jews held important administrative and financial positions. Abraham Senior was, from 1488, treasurer-major of the Holy Brotherhood, a key organ in the financing of the Granada War, and also chief rabbi of Castile. Yucé Abravanel was chief collector of the servicio y montazgo, the levy on migrating livestock, one of the soundest and highest-yielding revenues of the Crown of Castile. However, according to Joseph Pérez, the role of the Jews at court must not be exaggerated. "The truth was that the state could do without the Jews, both in the bureaucratic apparatus and in the management of the treasury." The Hebrew community at the end of the 15th century was therefore far from rich and influential. "In fact, the Spanish Jews at the time of their expulsion did not form a homogeneous social group. There were classes among them as in Christian society, a small minority of very rich and well-placed men, together with a mass of humble people: farmers, artisans, shopkeepers." What united them was that they practiced the same faith, different from the official one, which made them a separate community within the monarchy – a community that was "property" of the crown, which thereby protected them. In a letter dated July 7, 1477, addressed to the authorities of Trujillo, where incidents had occurred against the Jews, Queen Isabella I of Castile, after putting the aljama under her protection and prohibiting all types of oppression or humiliation against its members, states: All the Jews of my kingdoms are mine and are under my protection, and it is for me to defend and protect them and to keep them in justice. Thus, the Jews "formed not a State in the State, but rather a micro-society next to the majority Christian society, with an authority, the court rabbi, that the crown delegated over its members." The aljamas were organized internally with a wide margin of autonomy. They designated by lottery the council of elders that governed the life of the community; collected their own taxes for the maintenance of worship, synagogues, and rabbinical teaching; lived under the norms of Jewish law; and had their own courts, which heard all civil cases – since the Cortes of Madrigal of 1476, criminal cases had passed to the royal courts. But Jews did not enjoy full civil rights: they had a specific tax system far more burdensome than that of Christians and were excluded from positions that could confer authority over Christians. The situation in which the Jews lived, according to Joseph Pérez, posed two problems: "As subjects and vassals of the king, the Jews had no guarantee for the future – the monarch could at any time abolish the autonomy of the aljamas or demand new and heavier taxes"; and, above all, "in these late years of the Middle Ages, when a state of modern character was being developed, a problem of immense importance could not be avoided: was the existence of separate and autonomous communities compatible with the demands of a modern state? This was the real question." In the 15th century, the main problem ceased to be the Jews and became the conversos, who, according to Henry Kamen, probably numbered around three hundred thousand people. "Converso", or Christian convert, was the term applied to Jews who had been baptized, and to their descendants. As many of them had been forcibly converted, they were often looked upon with distrust by those who considered themselves Old Christians.
The positions abandoned by Jews were mostly filled by converts, who congregated where Jewish communities had flourished before 1391, doing work formerly performed by Jews – trade and crafts – with the added advantage that, as Christians, they could now access trades and professions previously forbidden to Jews. Some even entered the clergy, becoming canons, priors and even bishops. The socio-economic position of the converts was viewed with suspicion by the "old" Christians, a resentment accentuated by the fact that many converts were conscious of having a distinct identity, proud of being Christians while also having Jewish ancestry – the lineage of Christ. Popular revolts broke out against the converts between 1449 and 1474, a period in Castile of economic difficulties and political crisis (especially during the civil war of the reign of Henry IV). The first and largest of these revolts took place in 1449 in Toledo, during which a "Sentence-Statute" (Sentencia-Estatuto) was approved that barred "any converso of Jewish lineage" from municipal office – an antecedent of the blood-purity statutes of the following century. The origin of the revolts was economic – in Andalusia especially there was famine, aggravated by an epidemic of plague – and in principle they were "not directed especially against the converts. ... It was the parties and the demagogues that took advantage of the exasperation of the people and directed it against the converts." To justify the attacks on converts, their instigators affirmed that conversos were false Christians and that they still practiced the Jewish religion in secret. According to Joseph Pérez, it is a proven fact that, among those who converted to escape the blind furor of the masses in 1391, or under the pressure of the proselytizing campaigns of the early fifteenth century, some clandestinely returned to their old faith when it seemed that the danger had passed – it was of these that it was said that they "Judaized". The accusation of crypto-Judaism became more plausible when some cases arose of prominent converts who continued to observe Jewish rites after their conversion. But Judaizers, according to Joseph Pérez, were a minority, although a relatively important one. Henry Kamen says that "it can be affirmed that at the end of the 1470s, there was no Judaizing movement highlighted or proven among the converts." He also points out that when a convert was accused of Judaizing, in many cases the "proofs" that were brought were, in fact, cultural elements of his Jewish ancestry – such as treating Saturday, not Sunday, as the day of rest – or lack of knowledge of the new faith, such as not knowing the creed or eating meat during Lent. This is how the "converso problem" was born. According to the canonical doctrine of the Church, the baptized cannot renounce their faith, and crypto-Judaism is therefore a heresy that must be punished. Various voices thus began to demand action against those "false" Christians, who were beginning to be called Marranos – among them the voices of some converts who did not want the sincerity of their own baptism questioned because of them. The idea also gained ground that it was the presence of the Jews among the Christians that invited the converts to continue practicing the Law of Moses. When Isabel I of Castile ascended to the throne in 1474, she was already married to the heir to the Crown of Aragon, the future Ferdinand II of Aragon.
At this time, there was no punishment for practicing crypto-Judaism, not out of tolerance for Jews, but for legalistic reasons. The monarchs decided to confront the "converso problem," especially after having received alarming reports in 1475 from the Prior of the Dominicans of Seville, Friar Alonso de Ojeda, who reported that a large number of conversos in that city were practicing their religion in secret, and some even openly. After receiving these reports, the monarchs applied to Pope Sixtus IV for authorization to name a number of inquisitors in their kingdom, which the pontiff agreed to in his bull Exigit sincerae devotionis of 1 November 1478. "With the creation of the Tribunal of the Inquisition, the authorities would have sufficient instruments and methods of investigation at their disposal." According to Joseph Pérez, Ferdinand and Isabella "were convinced that the Inquisition would force the conversos to assimilate into society once and for all: the day when all of the new Christians would renounce Judaism, and nothing would distinguish them anymore from any other member of society." Expulsion From the beginning of their reign, Isabel and Ferdinand were concerned with protecting the Jews – since they were "property" of the crown. For example, on September 6, 1477, in a letter addressed to the Jewish community of Seville, Queen Isabel I gave assurances about their safety: I take under my protection the Jews of the aljamas in general and each one in particular, as well as their persons and their property; I protect them against any attack, whatever their nature ...; I forbid that they be attacked, killed or injured; I likewise forbid that a passive attitude be adopted if they are attacked, killed or injured. Hence, until 1492 the Catholic Monarchs were even reputed to be favorable to the Jews. This is what the German traveler Nicolas de Popielovo said, for example, after his visit in 1484–1485: Her subjects in Catalonia and Aragon say publicly, and I have heard the same thing from many in Spain, that the Queen is the protector of the Jews and the daughter of a Jewess. But the monarchs could not do away with all the vexations and discrimination suffered by the Jews, encouraged on many occasions by the preaching of the friars of the mendicant orders. They decided to segregate the Jews to end the conflict. Already in the Cortes of Madrigal of 1476, the monarchs had protested the breach of the provisions of the Order of 1412 on the Jews – the prohibition on wearing luxurious clothing, the obligation to wear a red badge on the right shoulder, the prohibition on holding positions with authority over Christians, on having Christian servants, on lending money at usurious interest, etc. But in the Cortes of Toledo of 1480, they decided to go much further in enforcing these norms: the Jews were to be forced to live in separate quarters, which they could leave only during the daytime to carry out their professional occupations. Until then, the Jewish quarters – where the Jews used to live and where they had their synagogues, butchers, etc. – had not formed a separate world in the cities. There were also Christians living in them and Jews living outside them. From 1480 onwards, the Jewish quarters were converted into ghettos surrounded by walls, and the Jews were confined in them to avoid confusion and damage to Christianity. A term of two years was established for the process, but it lasted for more than ten years and was not free of problems and abuses by Christians.
The text approved by the Cortes, which also applied to the Muslims of the region, read as follows: We send to the aljamas of the said Jews and Moors: that each of them be put in said separation [by] such procedure and such order that within the said term of the said two years they [shall] have the said houses of their separation, and live and die in them, and henceforth not have their dwellings among the Christians or elsewhere outside the designated areas and places that have been assigned to the said Jewish and Moorish quarters. The decision of the monarchs approved by the Cortes of Toledo had antecedents, since Jews had already been confined in some Castilian localities such as Cáceres and Soria. In this last locality it had been carried out with the monarchs' approval "to avoid the harms that followed from the Jews living, dwelling, and being present among the Christians." Fray Hernando de Talavera, the queen's confessor, who had opposed the use of force to solve the "converso problem," also justified the segregation "by avoiding many sins that follow from the mixture and a great deal of familiarity [between Christians and Jews] and from not keeping everything that, concerning their conversation with Christians, by holy canons and civil laws is ordered and commanded." With the decision to confine the Jews in ghettos, it was not only a question of separating them from Christians and of protecting them, but also of imposing a series of obstacles on their activities, so that they would have no choice but "to give up their status as Jews if they want to lead a normal existence. Their conversion is not demanded – not yet – nor is their autonomous statute touched, but they are pressed in such a way that they end up convincing themselves that the only solution is conversion." The first inquisitors appointed by the kings arrived in Seville in November 1480, "immediately sowing terror." During the first years, in this city alone, they pronounced 700 death sentences and more than 5,000 "reconciliations" – that is, prison sentences, exile or simple penances – accompanied by confiscation of property and disqualification from public office and ecclesiastical benefices. Over the course of their inquiries, the inquisitors discovered that for a long time many converts had been meeting with their Jewish relatives to celebrate Jewish holidays and even attend synagogues. This convinced them that they would not be able to put an end to crypto-Judaism if converts continued to maintain contact with the Jews, so they asked the monarchs for the Jews to be expelled from Andalusia. This request was approved, and in 1483 the monarchs gave the Jews of the dioceses of Seville, Cordoba, and Cadiz six months to move to Extremadura. There are doubts as to whether the order was strictly enforced, since at the time of the final expulsion in 1492 some chroniclers report that 8,000 families of Andalusia embarked at Cadiz and others at Cartagena and the ports of the Crown of Aragon. On the other hand, the expulsion of the Jews of Saragossa and Teruel was also proposed, but in the end it was not carried out. According to Julio Valdeón, the decision to expel the Jews from Andalusia also reflected "the desire to move them away from the border between the crown of Castile and the Nasrid Kingdom of Granada, the scene, during the 1480s and the first years of the 1490s, of the war that ended with the disappearance of the last stronghold of peninsular Islam."
On March 31, 1492, shortly after the end of the Granada War, the Catholic Monarchs signed the decree of expulsion of the Jews in Granada, which was sent to all the cities, towns and lordships of their kingdoms with strict orders that it not be read out or made public until May 1. It is possible that some prominent Jews tried to have it annulled or softened, but without success. Notable among them was Isaac Abravanel, who offered King Ferdinand a considerable sum of money. According to a well-known legend, when Inquisitor General Tomás de Torquemada discovered this, he presented himself before the king and threw a crucifix at his feet, saying: "Judas sold our Lord for thirty pieces of silver; His Majesty is about to sell him again for thirty thousand." According to the Israeli historian Benzion Netanyahu, quoted by Julio Valdeón, when Abravanel met with Queen Isabella, she said to him: "Do you think this comes from me? The Lord has put that thought into the heart of the King." A few months before, an auto-da-fé held in Ávila, in which three converts and two Jews condemned by the Inquisition were burnt alive for an alleged ritual crime against a Christian child (who would become known as the Holy Child of La Guardia), had contributed to creating an environment propitious to the expulsion. The Catholic Monarchs had in fact entrusted the drafting of the decree to the inquisitor general Tomás de Torquemada and his collaborators, setting for them, according to the historian Luis Suárez, three prior conditions which would be reflected in the document: that the expulsion be justified by charging the Jews with two sufficiently serious offenses – usury and "heretical practice"; that there should be sufficient time for the Jews to choose between baptism and exile; and that those who remained faithful to the Mosaic law could dispose of their movable and immovable property, although with the provisos established by the laws: they could take neither gold, silver, nor horses. Torquemada presented the draft decree to the monarchs on March 20, 1492, and the monarchs signed and published it in Granada on March 31. According to Joseph Pérez, the fact that the monarchs commissioned the drafting of the decree from Torquemada "demonstrates the leading role of the Inquisition in that matter." Of the decree promulgated in Granada on March 31, which was based on the draft decree of Torquemada – drawn up "with the will and consent of their highnesses" and dated March 20 in Santa Fe – there are two versions: one signed by the two monarchs and valid for the Crown of Castile, and another signed only by King Ferdinand and valid for the Crown of Aragon. Between Torquemada's draft and the two final versions there exist, according to Joseph Pérez, "significant variants," and the version addressed to the Crown of Aragon in turn differs in some respects from Torquemada's draft and from the Castilian decree. Regarding the essentials, however, the two versions have the same structure and set out the same ideas. The first part describes the reasons why the monarchs – or the king, in the case of the Aragonese version – decided to expel the Jews. The second part details how the expulsion would take place and the conditions under which it was to be carried out. Although the edict did not refer to a possible conversion, this alternative was implicit.
As the historian Luis Suárez pointed out, the Jews had "four months to take the most terrible decision of their lives: to abandon their faith to be integrated in it [in the kingdom, in the political and civil community], or leave the territory in order to preserve it." The drama that the Jews lived is documented by a contemporary source: Some Jews, when the term was running out, went about by night and day in despair. Many turned from the road ... and received the faith of Christ. Many others, in order not to deprive themselves of the country where they were born and not to sell their goods at that time at lower prices, were baptized. The most outstanding Jews, with few exceptions such as that of Isaac Abravanel, decided to convert to Christianity. The most relevant case was that of Abraham Senior, the chief rabbi of Castile and one of the closest collaborators of the monarchs. He and all his relatives were baptized on June 15, 1492, in the Guadalupe monastery, with the monarchs Isabel and Ferdinand as their godparents. He took the name of Fernán Núñez Coronel, while his son-in-law Mayr Melamed took the name Fernán Pérez Coronel – in both cases, the same Christian name as the king. This case, like that of Abraham de Córdoba, was given much publicity, to serve as an example for the rest of their community. In fact, during the four-month tacit term that was given for the conversion, many Jews were baptized, especially the rich and the most educated, and among them the vast majority of the rabbis. A chronicler of the time relates the intense propaganda campaign that unfolded: To all their aljamas and communities much preaching was done, in all the synagogues and in the squares and in the churches and in the fields, by the wise men of Spain; and the holy gospel and the doctrine of the Holy Mother Church was preached to them, and it was preached and proven by their own Scriptures, how the Messiah they awaited was Our Redeemer and Savior Jesus Christ, who came at the suitable time, who their ancestors ignored with malice, and all the others who came after them never wanted to hear the truth; before, deceived by the false book of the Talmud, having the truth before their eyes and reading it in their law every day, they ignored and disregarded it. The Jews who decided not to convert "had to prepare themselves for the departure in tremendous conditions." They had to sell their goods because they had very little time and had to accept the sometimes ridiculous amounts offered to them in the form of goods that could be carried away, since the export of gold and silver from the kingdom was prohibited. The possibility of taking bills of exchange was not much help because the bankers, Italians for the most part, demanded enormous interest. A chronicler of the time attests: They sold and bargained away everything they could of their estates ... and in everything there were sinister ventures, and the Christians got their estates, very many and very rich houses and inheritances, for few monies; and they went about begging with them, and found not one to buy them, and gave a house for an ass and a vine for a little cloth or linen because they could not bring forth gold or silver. They also had serious difficulties in recovering money lent to Christians because either the repayment term was after August 10, the deadline for their departure, or many of the debtors claimed "usury fraud," knowing that the Jews would not have time for the courts to rule in their favor. 
In a letter to the monarchs, the Jews of Ampudia complained that the mayors of the village had committed and were committing many wrongs and affronts against them that ought not to be consented to, that they were not being allowed to sell the personal property and real estate that they owned or to collect the debts owed to them, and that they were nonetheless being pressed to pay what they themselves owed even before the deadlines had arrived. In addition, they had to pay all the expenses of the trip – transport, maintenance, freight of the ships, tolls, etc. This was organized by Isaac Abravanel, who chartered the ships (having to pay very high prices); in some cases the shipowners did not fulfill the contract or killed the travelers to steal what little they had. Abravanel counted on the collaboration of the royal official and convert Luis de Santángel and of the Genoese banker Francisco Pinelo. The monarchs had to give orders to protect the Jews during the journey because they suffered vexations and abuse. This is how Andrés Bernáldez, priest of Los Palacios, describes the time when the Jews had to "abandon the lands of their birth": All the youths and maidens who were twelve years old and upwards were married to each other, for all the females of this age and above went in the shadow and company of husbands... They came out of the lands of their birth, children big and small, old and young, on foot, and the men on asses and other beasts, and on wagons, and continued their journeys each to the ports where they were to go; and they went by the roads and fields with many hardships and misfortunes; some falling, others rising, others dying, others being born, others falling sick, so that there was no Christian who did not feel their pain, and always invited them to baptism, and some, with grief, converted and remained, but very few, and the rabbis encouraged them and made the women and young men sing and play tambourines. In the Castilian version of the Alhambra Decree, reference is made exclusively to religious motives. The Aragonese version also alludes to usury. The Jews are accused of heretical depravity, that is, of serving as an example and inciting the converts to return to the practices of their former religion. At the beginning of the decree, it is said that: It is well known that in our dominions there are some bad Christians who have Judaized and committed apostasy against the holy Catholic faith, this being chiefly caused by relations between Jews and Christians. The measures taken up to that point by the monarchs to put an end to communication between the Jewish community and the converts, a fundamental cause of the new Christians' "Judaizing," according to the monarchs and the Inquisition, were as follows. The first was the agreement of the Cortes of Toledo of 1480, by which the Jews were forced to live in neighborhoods separate from the Christians, to prevent the Jews from being able to "subvert and draw away the Christian faithful from our holy Catholic faith." The second was the decision to expel the Jews from Andalusia, "believing that this would be enough for those of the other cities and towns and places of our kingdoms and lordships to stop doing and committing the aforementioned." But this measure failed "because every day it is found and it seems that the said Jews continue in their evil and harmful purpose wherever they live and converse."
Finally, the reason for deciding to expel the entire Jewish community, and not just those of its members who allegedly wanted to "pervert" the Christians, is explained: Because when some serious and detestable crime is committed by members of some college or university [i.e. some corporation or community], it is reasonable that such college or university be dissolved and annihilated, and that the lesser be punished for the greater and the one for the other, and that those who pervert the good and honest living of cities and towns, by a contagion that can harm others, be expelled. As Julio Valdeón has highlighted, "undoubtedly the expulsion of the Jews from Iberian soil is one of the most controversial issues of all that have happened throughout the history of Spain." It is not surprising, therefore, that historians have debated whether, in addition to the motives laid out by the Catholic Monarchs in the decree, there were others. Nowadays, some of the arguments made over time, such as the claim that the Jews were expelled in order to seize their wealth, seem to have been discarded, since the majority of the Jews who left were the most modest, while the richest converted and stayed. On the other hand, the crown did not benefit at all from the operation; rather, it was damaged, because it stopped receiving the taxes paid by the Jews. Nor does the argument seem to hold that the expulsion was an episode of class conflict – for instance, that the nobility wanted to get rid of an incipient bourgeoisie, represented by the Jews, that supposedly threatened their interests – because many Jews were defended by some of the most important noble families of Castile, and because, in addition, it was among the ranks of the "bourgeoisie" of "old Christians" that anti-Judaism grew the most. A personal motive on the part of the monarchs can also be ruled out, as there is no indication that they felt any repugnance towards Jews and converts. Among the monarchs' trusted men were several who belonged to this group, such as the queen's confessor Fray Hernando de Talavera, the steward Andrés Cabrera, the treasurer of the Santa Hermandad Abraham Senior, Mayr Melamed, and Isaac Abravanel, not counting the Jewish doctors who attended them. Current historians prefer to place the expulsion in its European context, and those such as Luis Suárez Fernández or Julio Valdeón highlight that the Catholic Monarchs were, in fact, the last of the sovereigns of the great western European states to decree an expulsion: the Kingdom of England did so in 1290 and the Kingdom of France in 1394; in 1421 the Jews were expelled from Vienna, in 1424 from Linz and Cologne, in 1439 from Augsburg, in 1442 from Bavaria, in 1485 from Perugia, in 1486 from Vicenza, in 1488 from Parma, in 1489 from Milan and Lucca, in 1493 from Sicily, in 1494 from Florence, and in 1498 from Provence. The objective of all of them was to achieve unity of faith in their states, a principle that would be defined in the 16th century with the maxim "cuius regio, eius religio," i.e., that the subjects should profess the same religion as their prince. As Joseph Pérez has pointed out, the expulsion "puts an end to an original situation in Christian Europe: that of a nation that consents to the presence of different religious communities," with which Spain "becomes a nation like the rest in European Christendom."
Pérez adds, "The University of Paris congratulated Spain for having carried out an act of good governance, an opinion shared by the best minds of the time (Machiavelli, Guicciardini, Pico della Mirandola) [...] it was the so-called medieval coexistence that was strange to Christian Europe." Julio Valdeón affirms that the decision of the Catholic Monarchs, who "showed themselves, in their first years of rule, clearly protective of the Hebrews," was due to "pressure from the rest of Christianity" and to "the constant pressure of the Church," which often preached against those it called "deicides," as well as to the "tremendous animosity that existed in the Christian people against the Jewish community." In this sense, he cites the thesis of the Israeli historian Benzion Netanyahu that the expulsion was the consequence of the climate of racism that prevailed in the Christian society of the time. Another of Netanyahu's theses – that the monarchs decided on the expulsion to ingratiate themselves with the masses, among whom anti-Jewish sentiment predominated – is considered by Joseph Pérez to be without foundation: "Why should the monarchs have had to worry about what the masses felt about Jews and converts when they did not [even] attend to the more concrete interests of those masses? Of the three surviving versions of the expulsion edict, only the third [the Aragonese], which was signed only by King Ferdinand, refers to the subject of usury, and certainly in very harsh terms. In the other two versions, we do not read a single mention or even the slightest allusion to this matter. Accusations that had been repeated for centuries against the Jews: a deicide people, desecration of hosts, ritual crimes ... do not appear in any of the three versions." For Joseph Pérez, the decision of the Catholic Monarchs, as evidenced by the content of the Granada Edict, is directly related to the "converso problem." The first step was the creation of the Inquisition, the second the expulsion of the Jews to eliminate those who allegedly incited the converts to Judaize. "What concerned them [the monarchs] was the total and definitive assimilation of the converts; since the previous measures had failed, they resorted to a drastic solution: the expulsion of the Jews to root out the evil." "The idea of expelling the Jews comes from the Inquisition; there is no doubt about this. [...] The expulsion of the Jews seemed to the Inquisition the best way to end the Judaizing of converts: by removing the cause – communication with Jews – the effect would fade away. [...] The Catholic Monarchs made the idea their own, but this does not mean that they acted under pressure from the inquisitors. Their concerns are also religious: heresy is not to their liking; they want to cleanse the kingdom of it, as the queen wrote. But these concerns are also political: they hope that the elimination of Judaism will facilitate the definitive assimilation and integration of the converts into Spanish society." On the other hand, Joseph Pérez, following Luis Suárez, places the expulsion within the context of the construction of the "modern State," which required greater social cohesion, based on unity of faith, in order to impose its authority on all groups and individuals in the kingdom. Unlike in medieval times, in this type of state there was no room for groups governed by their own particular rules, as the Jewish community was.
For this reason, it is not by chance, Pérez warns, that only three months after having eliminated the last Muslim stronghold on the peninsula with the conquest of the Nasrid kingdom of Granada, the monarchs decreed the expulsion of the Jews. "What was intended then was to fully assimilate Judaizers and Jews so that there were only Christians. The monarchs must have thought that the prospect of expulsion would encourage Jews to convert en masse and that thus a gradual assimilation would destroy the remnants of Judaism. They were wrong about this. The vast majority preferred to leave, with all that this entailed in tears, sacrifices and humiliations, and remain faithful to their faith. They flatly refused the assimilation that was offered them as an alternative." However, "assimilation" is in this quotation a euphemism: what was offered to the Sephardic Jews was, in fact, conversion to a faith that was not their own, hence their mass emigration in many directions. Consequences As Joseph Pérez has pointed out, "In 1492, the story of Spanish Judaism ends; thenceforth it leads only an underground existence, always threatened by the Spanish Inquisition and the suspicion of a public opinion that saw in Jews, Judaizers and even sincere converts natural enemies of Catholicism and Spanish idiosyncrasy, as understood and imposed by some ecclesiastical and intellectual leaders, in an attitude that bordered on racism." Historical accounts of the numbers of Jews who left Spain are based on speculation, and some figures were exaggerated by early accounts and historians: Juan de Mariana speaks of 800,000 people, and Don Isaac Abravanel of 300,000. While few reliable statistics exist for the expulsion, scholars from the University of Barcelona have estimated the number of Sephardic Jews in 15th-century Spain at 400,000, out of a total population of approximately 7.5 million in all of Spain, of whom about half (at least 200,000) or slightly more (300,000) remained in Iberia as conversos. Other estimates, based on tax returns and population estimates of individual communities, are much lower; Kamen states that, of a population of approximately 80,000 Jews and 200,000 conversos, about 40,000 emigrated. Another approximately 50,000 Jews received a Christian baptism so as to remain in Spain; many secretly kept some of their Jewish traditions and thus became targets of the Inquisition. The Jews of the kingdom of Castile emigrated mainly to Portugal (where the entire community was forcibly converted in 1497) and to North Africa. The Jews of the kingdom of Aragon fled to other Christian areas, including Italy, rather than to Muslim lands as is often assumed. Although the vast majority of conversos simply assimilated into the dominant Catholic culture, a minority continued to practice Judaism in secret, gradually migrating throughout Europe, North Africa, and the Ottoman Empire, mainly to areas where Sephardic communities were already present as a result of the Alhambra Decree. The situation of those who returned was regularized by an order of November 10, 1492, which established that civil and ecclesiastical authorities had to witness their baptism or, if they had been baptized before returning, that evidence and testimony confirming it had to be provided. They were also able to recover all their goods for the same price at which they had sold them. Returns are documented at least until 1499.
On the other hand, the Provision of the Royal Council of October 24, 1493, set harsh sanctions for those who slandered these New Christians with insulting terms such as tornadizos ("turncoats"). As for the economic impact of the expulsion, the idea that it was a severe blow that halted the birth of capitalism, and was thus one of the causes of the decline of Spain, now seems to be ruled out. As Joseph Pérez has pointed out, "in view of the published literature on taxation and economic activities, there is no doubt that the Jews were no longer a source of relevant wealth, neither as bankers nor as tax farmers nor as merchants who conducted business at an international level. [...] The expulsion of the Jews produced problems at the local level but not a national catastrophe. It is unreasonable to attribute to that event the decline of Spain and its supposed inability to adapt to the transformations of the modern world. What we know now shows that 16th century Spain was not exactly an economically backward nation. [...] In strictly demographic and economic terms, and apart from human aspects, the expulsion did not imply for Spain any substantial deterioration, but only a temporary crisis quickly overcome." An issue of the Amsterdam Gazette published in the Netherlands on September 12, 1672, and preserved at Beth Hatefutsoth, attests to the Jewish community's continued interest in what was happening in Madrid at that time; it presents the news in Spanish, 180 years after the expulsion. After the expulsion, the Spanish Inquisition remained active for centuries, well into the 18th century, targeting conversos suspected of "Judaizing," that is, of secretly observing Jewish beliefs and practices. As the Church could not legally carry out executions, it transferred convicted individuals to the secular authorities, who implemented sentences such as death by burning, a punishment developed within ecclesiastical circles and justified as saving the soul through earthly suffering. Public executions, called autos-da-fé, became elaborate ceremonies featuring processions, sermons, and mass spectacle, sometimes even attended by royalty. Those who confessed under pressure were paraded in humiliating garments called sanbenitos, with their offenses read aloud and their names displayed in churches for generations. Others were burned in effigy if they had died or fled, and even the remains of deceased suspects were exhumed and destroyed as part of the symbolic enforcement of orthodoxy. The expulsion initiated a prolonged period of exile and hardship, triggering a refugee crisis as Jews sought new places of refuge and resettlement. Most of the expelled Jews settled in North Africa, sometimes via Portugal, or in nearby states such as the Kingdom of Portugal, the Kingdom of Navarre, or the Italian states. As they were also expelled from the first two of these kingdoms in 1497 and 1498 respectively, they were forced to emigrate again. The majority of those from Navarre settled in Bayonne, and those from Portugal ended up in northern Europe (England or Flanders). In North Africa, those who went to the kingdom of Fez suffered all kinds of ill-treatment and were plundered, even by the Jews who had lived there for a long time. Those who fared best were those who settled in the territories of the Ottoman Empire, both in North Africa and in the Middle East, as well as in the Balkans and the Republic of Ragusa, after having passed through Italy.
Sultan Bayezid II gave orders to welcome them, and exclaimed on one occasion, referring to King Ferdinand: "You call him a king, he who impoverishes his states to enrich mine?" The same sultan commented to the ambassador sent by Charles V that he marveled that "the Jews had been thrown out of Castile, which was to throw away wealth." Over the course of a few generations, the Ottoman Empire's cities emerged as the heart of the Sephardic world. One result of the migration was new Jewish surnames appearing in Italy and Greece. The surnames Faraggi, Farag and Farachi, for example, originated from the Spanish city of Fraga. As some Jews identified Spain and the Iberian Peninsula with the biblical Sepharad, the Jews expelled by the Catholic Monarchs took or received the name of Sephardim. Contemporary Jewish accounts frequently compared their suffering to that of the ancient Israelites, expressing both their trust in God and their hope for messianic deliverance. In addition to their religion, they also "kept many of their ancestral customs, and in particular preserved the use of the Spanish language, a language which, of course, is not exactly what was spoken in fifteenth-century Spain: like any living language, it evolved and underwent notable alterations over the passage of time, although the structures and essential characteristics remained those of late medieval Castilian. [...] The Sephardis never forgot the land of their parents, harboring mixed feelings for it: on the one hand, resentment for the tragic events of 1492, and on the other hand, as time went by, nostalgia for the lost homeland." Regarding Judeo-Spanish (also known as Ladino) as a socio-cultural and identity phenomenon, Garcia-Pelayo and Gross wrote in 1977: It is said of the Jews expelled from Spain in the 15th century that they preserve the language and the Spanish traditions in the East. The expulsion of the Jews [...] sent a large number of families out of the Iberian Peninsula, mainly from Andalusia and Castile, to settle in the eastern Mediterranean countries dominated by the Turks, where they formed colonies which have survived to this day, especially in Egypt, Algeria, Morocco, Turkey, Greece, Bulgaria [...]. These families, generally composed of Sephardic elements of good social standing, have maintained their religion, traditions, language, and even their own literature for four-and-a-half centuries. The Spanish that they carried with them, that of Castile and Andalusia at the end of the 15th century, removed from all contact with that of the Peninsula, has not participated in the evolution undergone by the Spanish of Spain and of colonial Spanish America. Its phonetics present some archaic but not degenerate forms; its vocabulary offers countless loanwords from Hebrew, Greek, Italian, Arabic, or Turkish, depending on the countries of residence.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-46] | [TOKENS: 8773] |
OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but has since evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees and other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity's strategic direction with the Foundation's charter. Microsoft has previously invested over $13 billion in OpenAI and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits alleging copyright infringement, filed by authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board. Throughout 2024, roughly half of the AI safety researchers then employed at OpenAI left the company, citing its prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the capital actually collected significantly lagged the pledges; according to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence.
OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected a decades-long project that eventually surpasses human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers. Brockman was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google. It also did not pay stock options which AI researchers typically get. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models with the capability of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with the profit being capped at 100 times any investment. According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing an investment package of $1 billion into the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC. 
In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August. On April 9, 2024, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on 14 February 2025, with OpenAI stating that it was not for sale, but the offer complicated Altman's restructuring plan by suggesting a lower bar for how much the nonprofit should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit into a Delaware-based public benefit corporation (PBC), and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, getting equity in return, and would use it to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan has been criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, stating that the restructuring is illegal and would remove governance safeguards from the nonprofit and the attorneys general. The letter argues that OpenAI's complex structure was deliberately designed to remain accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever the amount of equity that it could get in exchange. PBCs can choose how they balance their mission with profit-making. Controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed to the OpenAI Foundation. 
The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman claimed was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, part of which OpenAI would need to spend on Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added it to many installations of Windows, and released Microsoft Copilot mobile apps. Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, a stake valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, which must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, including investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion. The partners planned to fund the project over the next four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations. In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, which was the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion. 
This was an increase from $3.7 billion in 2024. Growth was driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025, up from 15.5 million at the end of 2024, alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models, and it projects an $8 billion operating loss in 2025. OpenAI has reported revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This spending trajectory reflects both the enormous capital requirements of scaling cutting-edge AI systems and OpenAI's intent to maintain its position as a leader in the artificial intelligence industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors that valued the company at $500 billion, surpassing SpaceX as the world's most valuable privately held company. On November 17, 2023, Sam Altman was removed as CEO when OpenAI's board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he later stated that he had considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him. 
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman’s firing, some employees raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting member to observe the company's operations; Microsoft resigned from the board in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communication to determine if Altman's alleged lack of candor misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024. Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired personal finance app Roi in October 2025. In October 2025, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky’s capabilities into ChatGPT. In December 2025, it was announced OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced OpenAI had acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI’s ChatGPT Health product and was intended to strengthen the company’s medical data and healthcare artificial intelligence capabilities. 
OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, these pieces of text usually contained detailed descriptions of various types of violence, including sexual violence. The investigation uncovered that OpenAI began sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, and Sama was redistributing the equivalent of between $1.32 and $2.00 per hour post-tax to its annotators. Sama's spokesperson said that the $12.50 was also covering other implicit costs, among which were infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non‑Nvidia AI chips. In September 2025, it was revealed that OpenAI signed a contract with Oracle to purchase $300 billion in computing power over the next five years. In September 2025, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks. As of January 2026, this has not been realized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD. OpenAI committed to purchasing six gigawatts worth of AMD chips, starting with the MI450. OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI, and signed a three-year licensing deal that will let users generate videos using Sora—OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI’s models could be integrated into Amazon’s digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. 
In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned as a lieutenant colonel in the U.S. Army to join Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. GPT-3 is aimed at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, named simply "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, in order to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced that it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand. Access for newer subscribers re-opened a month later on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model that was internally codenamed strawberry. Additionally, ChatGPT Pro—a $200/month subscription service offering unlimited o1 access and enhanced voice features—was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users. The feature was only available to Pro users in the United States. OpenAI released its deep research agent nine days later; it scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning. 
In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model was able to achieve gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which the company described as better at creating spreadsheets, building presentations, perceiving images, writing code, and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, including features for managing citations, formatting complex equations, and real-time collaborative editing. In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this reversal. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming within four years to determine how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although the team denied receiving anything close to 20%. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google due to an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in discussions containing personal details such as names, locations, and intimate topics appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that the exposure was not a security breach but a design flaw that heightened privacy risks. CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data. 
Management In 2018, Musk resigned from his Board of Directors seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners, and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI then signed deals with Reddit, News Corp, Axios, and Vox Media. Paul Nakasone then joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors. Governance and legal issues In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could happen within the next 10 years, allowing a "dramatically more prosperous future" and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to IAEA to oversee AI systems above a certain capability threshold, suggesting that relatively weak AI systems on the other side should not be overly regulated. They also called for more technical safety research for superintelligences, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices to develop ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. These are typically preliminary investigative matters and are nonpublic, but the FTC's document was leaked. In July 2023, the FTC launched an investigation into OpenAI over allegations that the company scraped public data and published false and defamatory information. They asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about ‘circular’ spending arrangements—for example, Microsoft extending Azure credits to OpenAI while both companies shared engineering talent—and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company is interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift comes in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. 
Following DeepSeek's market emergence, OpenAI enhanced security protocols to protect proprietary development techniques from industrial espionage. Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, in March 2025, the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal laws. According to Scott Kohler, OpenAI has opposed California's AI legislation and argued that the state bill encroaches on matters better handled by the federal government. Public Citizen opposed a federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI and from acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity cancellation provision, and that OpenAI never enforced it to cancel any employee's vested equity. However, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024, it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3, and which the Authors Guild believed to have contained over 100,000 copyrighted books. In 2021, OpenAI developed a speech recognition tool called Whisper. OpenAI used it to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman. The resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, Raw Story, and Alternate Media Inc. filed a lawsuit against OpenAI on copyright grounds. The lawsuit is said to have charted a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications included The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI scraped 300 billion words online without consent and without registering as a data broker. 
It was filed in San Francisco, California, by sixteen anonymous plaintiffs. They also claimed that OpenAI and Microsoft, its partner and customer, continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform. Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, in a New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing the commercial LLMs he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced. California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation. A text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process. A request to correct the mistake was denied. OpenAI further claimed that neither the recipients of ChatGPT's output nor the sources it had used could be made available. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Up until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and using it to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections (including updated crisis response behavior and parental controls). Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco. 
In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT usage. In December 2025, the estate of Suzanne Adams sued OpenAI after her son, Stein-Erik Soelberg, 56, allegedly murdered her; in the months prior, the paranoid, delusional man had often discussed his ideas with ChatGPT. The estate claimed that the company shared responsibility because of the risk of chatbot psychosis, although chatbot psychosis is not a recognized medical diagnosis. OpenAI responded that it would make ChatGPT safer for users disconnected from reality. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Hugging_Face#Transformers_Library] | [TOKENS: 535] |
Contents Hugging Face Hugging Face, Inc., is an American company based in New York City that develops computation tools for building applications using machine learning. Its transformers library, built for natural language processing applications, and its platform allow users to share machine learning models and datasets and to showcase their work. History The company was founded in 2016 by French entrepreneurs Clément Delangue, Julien Chaumond, and Thomas Wolf in New York City, originally as a company that developed a chatbot app targeted at teenagers. The company was named after the U+1F917 🤗 HUGGING FACE emoji. After open sourcing the model behind the chatbot, the company pivoted to focus on being a platform for machine learning. On April 28, 2021, the company launched the BigScience Research Workshop in collaboration with several other research groups to release an open large language model. In 2022, the workshop concluded with the announcement of BLOOM, a multilingual large language model with 176 billion parameters. In February 2023, the company announced a partnership with Amazon Web Services (AWS) that would make Hugging Face's products available to AWS customers as building blocks for their custom applications. The company also said the next generation of BLOOM would run on Trainium, a proprietary machine learning chip created by AWS. In June 2024, the company, along with Meta and Scaleway, announced the launch of a new AI accelerator program for European startups. The initiative aimed to help startups integrate open foundation models into their products, accelerating the EU AI ecosystem. The program, based at STATION F in Paris, ran from September 2024 to February 2025. Selected startups received mentoring and access to AI models, tools, and Scaleway's computing power. On September 23, 2024, to further the International Decade of Indigenous Languages, Hugging Face teamed up with Meta and UNESCO to launch a new online language translator. It was built on Meta's No Language Left Behind open-source AI model, enabling free text translation across 200 languages, including many low-resource languages. In April 2025, Hugging Face announced that it had acquired Pollen Robotics, a humanoid robotics startup based in France and founded by Matthieu Lapeyre and Pierre Rouanet in 2016. In a post on X, Delangue shared his vision to "make Artificial Intelligence robotics Open Source". In early 2026, hackers hijacked the Hugging Face platform to launch Android-targeted attacks involving "powerful malware" that could completely take over a compromised target. |
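To make the role of the transformers library described above concrete, here is a minimal usage sketch. It assumes the package's widely used pipeline API with a default pre-trained model downloaded from the Hugging Face Hub; the task name and example sentence are illustrative additions, not details taken from the article.

```python
# Minimal sketch of using the Hugging Face transformers library mentioned above.
# Assumes `pip install transformers` plus a backend such as PyTorch; the task name
# and the example sentence are hypothetical, chosen only for illustration.
from transformers import pipeline

# Download a default pre-trained model from the Hugging Face Hub for this task.
classifier = pipeline("sentiment-analysis")

# Run inference on an example sentence.
result = classifier("Hugging Face makes sharing machine learning models easy.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```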
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_note-16] | [TOKENS: 9291] |
Contents Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. 
The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. 
The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use, one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991, the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic started to exhibit growth characteristics similar to those of the scaling of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. 
During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011[update], the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018[update], 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost. Social impact The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. 
In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s, the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications. Queens, New York, has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. 
The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de-facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023[update], Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allow groups to easily form, cheaply communicate, and share ideas. 
An example of collaborative software is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice).[citation needed] Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work.[citation needed] The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work.[citation needed] The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, including insults and hate speech, to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, they may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites.[citation needed] Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated to users' loneliness. Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread.[citation needed] Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading and interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationship. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and -consumer transactions are combined, equate to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide. 
Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people. At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing to carry out their missions, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region. E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards.
In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.[citation needed] Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer, and for sharing and exchanging business data and logistics; HTTP is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP).[citation needed] VoIP systems now dominate many markets, being as easy to use and as convenient as a traditional telephone, while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed, usually fully encrypted, across the Internet. The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise.
While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting working groups, open to any individual, on the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region.[citation needed] The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including: the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.[citation needed] Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems. However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se.
Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high-speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End users who only access the Internet when needed to perform a function or obtain information represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks. Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber optic submarine communication cables, connecting the internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, after its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123:[citation needed] The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations.
They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via Dynamic Host Configuration Protocol, or are configured manually.[citation needed] The Domain Name System (DNS) converts user-entered domain names (e.g., "en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (2^32) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is growing around the world, as Internet address registries have urged all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier. The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix, and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2^96 addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix (see the short sketch below). Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols.
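The CIDR prefix, host-range, and netmask arithmetic described above can be checked directly with a few lines of code. The following is a minimal sketch using Python's standard-library ipaddress module, reusing the documentation addresses already quoted in this section; it is only an illustration of the notation, not part of any particular implementation.

```python
# Minimal sketch of CIDR / netmask arithmetic with the standard library.
import ipaddress

net = ipaddress.ip_network("198.51.100.0/24")
print(net.netmask)            # 255.255.255.0
print(net.num_addresses)      # 256 addresses: 198.51.100.0 - 198.51.100.255

# The routing prefix is the bitwise AND of any address in the network
# with the subnet mask.
addr = ipaddress.ip_address("198.51.100.73")
prefix = ipaddress.ip_address(int(addr) & int(net.netmask))
print(prefix)                 # 198.51.100.0

# The same notation scales to IPv6: 2001:db8::/32 spans 2**96 addresses.
v6 = ipaddress.ip_network("2001:db8::/32")
print(v6.num_addresses == 2**96)   # True
```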
End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information. Malware is malicious software used and distributed via the Internet. It includes computer viruses which are copied with the help of humans, computer worms which copy themselves automatically, software for denial of service attacks, ransomware, botnets, and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare with similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants increased to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S. telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, access to certain types of websites, or communication via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by Germany's Siemens AG and Finland's Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters.
In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. Many free or commercially available software programs, called content-control software, are available to users to block offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization.
[Figure: global Internet traffic volume in petabytes per month, 1990–2015.]
The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests. Estimates of the Internet's electricity usage have been the subject of controversy, according to a 2014 peer-reviewed research paper that found claims differing by a factor of 20,000 published in the literature during the preceding decade, ranging from 0.0064 kilowatt hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure.
The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed to over 300 million tons of CO2 emission per year, and argued for new "digital sobriety" regulations restricting the use and size of video files. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Biological_network] | [TOKENS: 3890] |
Biological network A biological network is a method of representing systems as complex sets of binary interactions or relations between various biological entities. In general, networks or graphs are used to capture relationships between entities or objects. A typical graphing representation consists of a set of nodes connected by edges. History of networks As early as 1736, Leonhard Euler analyzed a real-world issue known as the Seven Bridges of Königsberg, which established the foundation of graph theory. From the 1930s to the 1950s, the study of random graphs was developed. During the mid-1990s, it was discovered that many different types of "real" networks have structural properties quite different from random networks. In the late 2000s, scale-free and small-world networks began shaping the emergence of systems biology, network biology, and network medicine. In 2014, graph theoretical methods were used by Frank Emmert-Streib to analyze biological networks. In the 1980s, researchers started viewing DNA or genomes as the dynamic storage of a language system with precise computable finite states represented as a finite-state machine. Recent complex systems research has also suggested some far-reaching commonality in the organization of information in problems from biology, computer science, and physics. Networks in biology Protein-protein interaction networks (PINs) represent the physical relationship among proteins present in a cell, where proteins are nodes, and their interactions are undirected edges. Due to their undirected nature, it is difficult to identify all the proteins involved in an interaction. Protein–protein interactions (PPIs) are essential to cellular processes and are also the most intensely analyzed networks in biology. PPIs can be discovered by various experimental techniques, among which the yeast two-hybrid system is a commonly used technique for the study of binary interactions. Recently, high-throughput studies using mass spectrometry have identified large sets of protein interactions. Many international efforts have resulted in databases that catalog experimentally determined protein-protein interactions. Some of them are the Human Protein Reference Database, Database of Interacting Proteins, the Molecular Interaction Database (MINT), IntAct, and BioGRID. At the same time, multiple computational approaches have been proposed to predict interactions. FunCoup and STRING are examples of such databases, where protein-protein interactions inferred from multiple lines of evidence are gathered and made available for public usage.[citation needed] Recent studies have indicated the conservation of molecular networks through deep evolutionary time. Moreover, it has been discovered that proteins with high degrees of connectedness are more likely to be essential for survival than proteins with lesser degrees. This observation suggests that the overall composition of the network (not simply interactions between protein pairs) is vital for an organism's overall functioning. The genome encodes thousands of genes whose products (mRNAs, proteins) are crucial to the various processes of life, such as cell differentiation, cell survival, and metabolism. Genes produce such products through a process called transcription, which is regulated by a class of proteins called transcription factors. For instance, the human genome encodes almost 1,500 DNA-binding transcription factors that regulate the expression of more than 20,000 human genes.
The complete set of gene products and the interactions among them constitutes gene regulatory networks (GRNs). GRNs regulate the levels of gene products within the cell and, in turn, the cellular processes. GRNs are represented with genes and transcription factors as nodes and the relationships between them as edges. These edges are directional, representing the regulatory relationship between the two ends of the edge. For example, the directed edge from gene A to gene B indicates that A regulates the expression of B. Thus, these directional edges can represent not only the promotion of gene regulation but also its inhibition. GRNs are usually constructed by utilizing the gene regulation knowledge available from databases such as Reactome and KEGG. High-throughput measurement technologies, such as microarray, RNA-Seq, ChIP-chip, and ChIP-seq, enabled the accumulation of large-scale transcriptomics data, which could help in understanding the complex gene regulation patterns. Gene co-expression networks can be perceived as association networks between variables that measure transcript abundances. These networks have been used to provide a systems-biology analysis of DNA microarray data, RNA-seq data, miRNA data, etc. Weighted gene co-expression network analysis is extensively used to identify co-expression modules and intramodular hub genes. Co-expression modules may correspond to cell types or pathways, while highly connected intramodular hubs can be interpreted as representatives of their respective modules. Cells break down food and nutrients into small molecules necessary for cellular processing through a series of biochemical reactions. These biochemical reactions are catalyzed by enzymes. The complete set of all these biochemical reactions in all the pathways represents the metabolic network. Within the metabolic network, the small molecules take the roles of nodes, and they could be either carbohydrates, lipids, or amino acids. The reactions which convert these small molecules from one form to another are represented as edges. It is possible to use network analyses to infer how selection acts on metabolic pathways. Signals are transduced within cells or in between cells and thus form complex signaling networks, which play a key role in tissue structure. For instance, the MAPK/ERK pathway is transduced from the cell surface to the cell nucleus by a series of protein-protein interactions, phosphorylation reactions, and other events. Signaling networks typically integrate protein–protein interaction networks, gene regulatory networks, and metabolic networks. Single-cell sequencing technologies allow the extraction of intercellular signaling; one example is NicheNet, which models intercellular communication by linking ligands to target genes. The complex interactions in the brain make it a perfect candidate to apply network theory. Neurons in the brain are deeply connected with one another, and this results in complex networks being present in the structural and functional aspects of the brain. For instance, small-world network properties have been demonstrated in connections between cortical regions of the primate brain or during swallowing in humans. This suggests that cortical areas of the brain are not directly interacting with each other, but most areas can be reached from all others through only a few interactions. All organisms are connected through feeding interactions.
If a species eats or is eaten by another species, they are connected in an intricate food web of predator and prey interactions. The stability of these interactions has been a long-standing question in ecology. That is to say if certain individuals are removed, what happens to the network (i.e., does it collapse or adapt)? Network analysis can be used to explore food web stability and determine if certain network properties result in more stable networks. Moreover, network analysis can be used to determine how selective removals of species will influence the food web as a whole. This is especially important considering the potential species loss due to global climate change. In biology, pairwise interactions have historically been the focus of intense study. With the recent advances in network science, it has become possible to scale up pairwise interactions to include individuals of many species involved in many sets of interactions to understand the structure and function of larger ecological networks. The use of network analysis can allow for both the discovery and understanding of how these complex interactions link together within the system's network, a property that has previously been overlooked. This powerful tool allows for the study of various types of interactions (from competitive to cooperative) using the same general framework. For example, plant-pollinator interactions are mutually beneficial and often involve many different species of pollinators as well as many different species of plants. These interactions are critical to plant reproduction and thus the accumulation of resources at the base of the food chain for primary consumers, yet these interaction networks are threatened by anthropogenic change. The use of network analysis can illuminate how pollination networks work and may, in turn, inform conservation efforts. Within pollination networks, nestedness (i.e., specialists interact with a subset of species that generalists interact with), redundancy (i.e., most plants are pollinated by many pollinators), and modularity play a large role in network stability. These network properties may actually work to slow the spread of disturbance effects through the system and potentially buffer the pollination network from anthropogenic changes somewhat. More generally, the structure of species interactions within an ecological network can tell us something about the diversity, richness, and robustness of the network. Researchers can even compare current constructions of species interactions networks with historical reconstructions of ancient networks to determine how networks have changed over time. Much research into these complex species interactions networks is highly concerned with understanding what factors (e.g., species richness, connectance, nature of the physical environment) lead to network stability. Network analysis provides the ability to quantify associations between individuals, which makes it possible to infer details about the network as a whole at the species and/or population level. One of the most attractive features of the network paradigm would be that it provides a single conceptual framework in which the social organization of animals at all levels (individual, dyad, group, population) and for all types of interaction (aggressive, cooperative, sexual, etc.) can be studied. Researchers interested in ethology across many taxa, from insects to primates, are starting to incorporate network analysis into their research. 
Researchers interested in social insects (e.g., ants and bees) have used network analyses to better understand the division of labor, task allocation, and foraging optimization within colonies. Other researchers are interested in how specific network properties at the group and/or population level can explain individual-level behaviors. Studies have demonstrated how animal social network structure can be influenced by factors ranging from characteristics of the environment to characteristics of the individual, such as developmental experience and personality. At the level of the individual, the patterning of social connections can be an important determinant of fitness, predicting both survival and reproductive success. At the population level, network structure can influence the patterning of ecological and evolutionary processes, such as frequency-dependent selection and disease and information transmission. For instance, a study on wire-tailed manakins (a small passerine bird) found that a male's degree in the network largely predicted the ability of the male to rise in the social hierarchy (i.e., eventually obtain a territory and matings). In bottlenose dolphin groups, an individual's degree and betweenness centrality values may predict whether or not that individual will exhibit certain behaviors, like the use of side flopping and upside-down lobtailing to lead group traveling efforts; individuals with high betweenness values are more connected and can obtain more information, and thus are better suited to lead group travel and therefore tend to exhibit these signaling behaviors more than other group members. Social network analysis can also be used to describe the social organization within a species more generally, which frequently reveals important proximate mechanisms promoting the use of certain behavioral strategies. These descriptions are frequently linked to ecological properties (e.g., resource distribution). For example, network analyses revealed subtle differences in the group dynamics of two related equid fission-fusion species, Grevy's zebra and onagers, living in variable environments; Grevy's zebras show distinct preferences in their association choices when they fission into smaller groups, whereas onagers do not. Similarly, researchers interested in primates have also utilized network analyses to compare social organizations across the diverse primate order, suggesting that using network measures (such as centrality, assortativity, modularity, and betweenness) may be useful for explaining the types of social behaviors seen within certain groups and not others. Finally, social network analysis can also reveal important fluctuations in animal behaviors across changing environments. For example, network analyses in female chacma baboons (Papio hamadryas ursinus) revealed important dynamic changes across seasons that were previously unknown; instead of creating stable, long-lasting social bonds with friends, baboons were found to exhibit more variable relationships which were dependent on short-term contingencies related to group-level dynamics as well as environmental variability. Changes in an individual's social network environment can also influence characteristics such as 'personality': for example, social spiders that huddle with bolder neighbors also tend to increase in boldness. This is a very small set of broad examples of how researchers can use network analysis to study animal behavior.
Research in this area is currently expanding very rapidly, especially since the broader development of animal-borne tags and computer vision makes it possible to automate the collection of social associations. Social network analysis is a valuable tool for studying animal behavior across all animal species and has the potential to uncover new information about animal behavior and social ecology that was previously poorly understood. Within a nucleus, DNA is constantly in motion. Perpetual actions such as genome folding and cohesin extrusion morph the shape of a genome in real time. The spatial location of strands of chromatin relative to each other plays an important role in the activation or suppression of certain genes. DNA–DNA chromatin networks help biologists to understand these interactions by analyzing commonalities amongst different loci. The size of a network can vary significantly, from a few genes to several thousand, and thus network analysis can provide vital support in understanding relationships among different areas of the genome. As an example, analysis of spatially similar loci within the organization of a nucleus with Genome Architecture Mapping (GAM) can be used to construct a network of loci, with edges representing highly linked genomic regions. The first graphic showcases the Hist1 region of the mm9 mouse genome, with each node representing a genomic locus. Two nodes are connected by an edge if their linkage disequilibrium is greater than the average across all 81 genomic windows. The locations of the nodes within the graphic are randomly selected, and the methodology of choosing edges yields a simple but rudimentary graphical representation of the relationships in the dataset. The second visual exemplifies the same information as the previous one; however, the network starts with every locus placed sequentially in a ring configuration. It then pulls nodes together using linear interpolation according to their linkage, expressed as a percentage. The figure illustrates strong connections between the center genomic windows as well as the edge loci at the beginning and end of the Hist1 region. Modelling biological networks To draw useful information from a biological network, an understanding of the statistical and mathematical techniques of identifying relationships within a network is vital. Procedures to identify association, communities, and centrality within nodes in a biological network can provide insight into the relationships of whatever the nodes represent, whether they are genes, species, etc. Formulation of these methods transcends disciplines and relies heavily on graph theory, computer science, and bioinformatics. There are many different ways to measure the relationships of nodes when analyzing a network. In many cases, the measure used to find nodes that share similarity within a network is specific to the application in which it is used. One of the types of measures that biologists utilize is correlation, which specifically centers on the linear relationship between two variables. As an example, weighted gene co-expression network analysis uses Pearson correlation to analyze linked gene expression and understand genetics at a systems level. Another measure of correlation is linkage disequilibrium. Linkage disequilibrium describes the non-random association of genetic sequences among loci in a given chromosome. An example of its use is in detecting relationships in GAM data across genomic intervals based upon detection frequencies of certain loci.
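As a minimal illustration of the correlation-based network construction described above (the general idea behind tools such as weighted gene co-expression network analysis, not any particular tool), the sketch below links genes whose pairwise Pearson correlation exceeds a chosen threshold. The expression matrix and the 0.8 threshold are made-up examples; real analyses use soft thresholds and module detection on top of this idea.

```python
# Minimal sketch: build a gene co-expression network by linking genes whose
# pairwise Pearson correlation exceeds a threshold. Data are synthetic.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
genes = [f"gene{i}" for i in range(6)]
expression = rng.normal(size=(6, 20))                            # 6 genes x 20 samples
expression[1] = expression[0] + rng.normal(scale=0.1, size=20)   # force two genes to co-express

corr = np.corrcoef(expression)        # 6 x 6 Pearson correlation matrix
threshold = 0.8

G = nx.Graph()
G.add_nodes_from(genes)
for i in range(len(genes)):
    for j in range(i + 1, len(genes)):
        if abs(corr[i, j]) >= threshold:
            G.add_edge(genes[i], genes[j], weight=float(corr[i, j]))

print(G.edges(data=True))             # e.g. an edge between gene0 and gene1
```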
The concept of centrality can be extremely useful when analyzing biological network structures. There are many different methods to measure centrality, such as betweenness, degree, eigenvector, and Katz centrality. Each type of centrality technique can provide different insights into the nodes of a particular network; however, they all measure the prominence of a node within the network. In 2005, researchers at Harvard Medical School applied centrality measures to the yeast protein interaction network. They found that proteins exhibiting high betweenness centrality were more essential, and that betweenness corresponded closely to a given protein's evolutionary age. Studying the community structure of a network by subdividing groups of nodes into like-regions can be an integral tool for bioinformatics when exploring data as a network. A food web of the Secaucus High School Marsh exemplifies the benefits of grouping, as the relationships between nodes are far easier to analyze with well-made communities. While the first graphic is hard to visualize, the second provides a better view of the pockets of highly connected feeding relationships that would be expected in a food web. Community detection remains an active area of research. Scientists and graph theorists continuously discover new ways of subdividing networks, and thus a plethora of different algorithms exists for creating these relationships. Like many other tools that biologists use to understand data with network models, every algorithm can provide its own unique insight and may vary widely on aspects such as accuracy or the time complexity of the calculation. In 2002, a food web of marine mammals in the Chesapeake Bay was divided into communities by biologists using a community detection algorithm based on the neighbors of nodes with high degree centrality. The resulting communities displayed a sizable split between pelagic and benthic organisms. Two very common community detection algorithms for biological networks are the Louvain method and the Leiden algorithm. The Louvain method is a greedy algorithm that attempts to maximize modularity, a measure that favors heavy edges within communities and sparse edges between them, within a set of nodes. The algorithm starts with each node in its own community and iteratively moves each node into whichever neighboring community yields the largest increase in modularity. Once no modularity increase can occur by joining nodes to a community, a new weighted network is constructed with communities as nodes, edges representing between-community connections, and loops representing edges within a community. The process continues until no increase in modularity occurs. While the Louvain method provides good community detection, it is limited in a few ways. By focusing mainly on maximizing a given measure of modularity, it may craft badly connected communities for the sake of a higher modularity score; however, the Louvain method performs fairly well and is easy to understand compared to many other community detection algorithms. The Leiden algorithm expands on the Louvain method by providing a number of improvements. When joining nodes to a community, only neighborhoods that have recently changed are considered, which greatly improves the speed of merging nodes. Another optimization is the refinement phase, in which the algorithm randomly chooses, for a given node, a community to merge with from a set of candidate communities.
This allows for greater depth in choosing communities, whereas the Louvain method focuses solely on maximizing the chosen modularity. The Leiden algorithm, while more complex than the Louvain method, runs faster with better community detection and can be a valuable tool for identifying groups. Network motifs, or statistically significant recurring interaction patterns within a network, are a commonly used tool for understanding biological networks. A major use case of network motifs is in neurophysiology, where motif analysis is commonly used to understand interconnected neuronal functions at varying scales. As an example, in 2017, researchers at Beijing Normal University analyzed highly represented two- and three-node network motifs in directed functional brain networks constructed from resting-state fMRI data to study the basic mechanisms of brain information flow. A short code sketch of the centrality, community-detection, and motif calculations discussed in this section follows below. |
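The centrality measures, modularity-based community detection, and small directed motifs discussed above can all be computed with off-the-shelf graph libraries. The sketch below uses the networkx library (a recent version, 2.8 or later, is assumed for louvain_communities) on toy graphs that are arbitrary examples, not the networks from the studies cited in this section.

```python
# Toy illustration of centrality, Louvain community detection, and
# 3-node motif (triad) counting with networkx. The graphs are arbitrary.
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()                       # classic small social network

# Centrality: prominence of nodes under different definitions.
deg = nx.degree_centrality(G)
btw = nx.betweenness_centrality(G)
eig = nx.eigenvector_centrality(G)
print(max(btw, key=btw.get))                     # node with the highest betweenness

# Community detection: Louvain-style modularity maximization.
communities = community.louvain_communities(G, seed=42)
print([sorted(c) for c in communities])

# Directed 3-node motifs: the triad census counts all 16 isomorphism classes.
D = nx.gn_graph(50, seed=1)                      # small random directed graph
print(nx.triadic_census(D))
```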
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Leonardo_Torres_Quevedo] | [TOKENS: 12311] |
Leonardo Torres Quevedo Leonardo Torres Quevedo (Spanish: [leoˈnaɾðo ˈtores keˈβeðo]; 28 December 1852 – 18 December 1936) was a Spanish civil engineer, mathematician and inventor, known for his numerous engineering innovations, including aerial trams, airships, catamarans, and remote control. He was also a pioneer in the field of computing and robotics. Torres was a member of several scientific and cultural institutions and held such important positions as seat "N" of the Real Academia Española (1920–1936) and the presidency of the Spanish Royal Academy of Sciences (1928–1934). In 1927 he became a foreign associate of the French Academy of Sciences. His first groundbreaking invention was a cable car system patented in 1887 for the safe transportation of people, an activity that culminated in 1916 when the Whirlpool Aero Car was opened in Niagara Falls. In the 1890s, Torres focused his efforts on analog computation. He published Sur les machines algébriques (1895) and Machines à calculer (1901), technical studies that gave him recognition in France for his construction of machines to solve real and complex roots of polynomials. He made significant aeronautical contributions at the beginning of the 20th century, becoming the inventor of the non-rigid Astra-Torres airships, a trilobed structure that helped the British and French armed forces counter Germany's submarine warfare during World War I. These tasks in dirigible engineering led him to be a key figure in the development of radio control systems in 1901–05 with the Telekine, with which he laid down modern principles of wireless remote-control operation. From his Laboratory of Automation, created in 1907, Torres invented one of his greatest technological achievements, El Ajedrecista (The Chess Player) of 1912, an electromagnetic device capable of playing a limited form of chess that demonstrated the capability of machines to be programmed to follow specified rules (heuristics) and marked the beginnings of research into the development of artificial intelligence. He advanced beyond the work of Charles Babbage in his 1914 paper Essays on Automatics, where he speculated about thinking machines and included the design of a special-purpose electromechanical calculator, introducing concepts that remain relevant, such as floating-point arithmetic. British historian Brian Randell called it "a fascinating work which well repays reading even today". Subsequently, Torres demonstrated the feasibility of an electromechanical analytical engine by successfully producing a typewriter-controlled calculating machine in 1920. He conceived other original designs before his retirement in 1930; some of the most notable were naval architecture projects, such as the Buque campamento (Camp-Vessel, 1913), a balloon carrier for transporting airships attached to a mooring mast of his creation, and the Binave (Twin Ship, 1916), a multihull steel vessel driven by two propellers powered by marine engines. In addition to his interests in engineering, Torres also stood out in the field of letters and was a prominent speaker and supporter of Esperanto. Early life and education Torres was born on 28 December 1852, on the Feast of the Holy Innocents, in Santa Cruz de Iguña, Cantabria, Spain. His father, Luis Torres Vildósola y Urquijo (1818–1891), was a civil engineer in Bilbao, where he worked as a railway engineer. His mother was Valentina Quevedo de la Maza (1825–1891). He had two siblings, Joaquina (b. 1851) and Luis (b. 1855).
The family resided for the most part in Bilbao, although they also spent long periods in his mother's family home in Cantabria's mountain region. As a child, he spent considerable time apart from his parents due to their work travels. He was therefore cared for by the Barrenechea sisters, relatives on his father's side, who named him their heir, enabling his future independence. He attended high school in Bilbao and later went to Paris, to the college of the Christian Brothers, where he completed his studies over two years (1868 and 1869). There he became acquainted with French culture, customs, and language, which in later years would help him in his scientific and technical relationships with French figures and institutions. In 1870, his father was transferred, bringing his family to Madrid. The following year, Torres began his higher studies in the Official School of the Road Engineers' Corps [es]. He temporarily suspended his studies in 1873 to volunteer along with his brother Luis for the defense of Bilbao, which had been surrounded by Carlist troops during the Third Carlist War. Once the siege of Bilbao was lifted in 1874, he returned to Madrid and completed his studies in 1876, graduating fourth in his class. Career Torres began to work as a civil engineer for a few months on railway projects as his father did, but his curiosity and desire to learn led him to give up joining the Corps to dedicate himself to "thinking about my things". As a young entrepreneur who had inherited a considerable family fortune, he immediately set out on a long trip through Europe in 1877, visiting Italy, France and Switzerland, to learn about the scientific and technical advances of the day, especially in the incipient area of electricity. Returning to Spain, he settled in Santander, where he continued his self-supported research activities. Torres' experimentation in the field of cableways and cable cars began very early during his residence in the town of his birth, Molledo. There, in 1885, he constructed his first cableway, spanning a depression of some 40 metres (130 ft). The cableway was about 200 metres (660 ft) across and transported a single person sitting in a chair hung from a cable, with a separate traction cable. The engine used to move the human load was a pair of cows. Later, in 1887, he built a much bigger, motorized cableway over the Río León in Valle de Iguña [es], though it was intended only for transporting materials. These experiments were the basis for his first patent application on 17 September 1887, in Spain, "Un sistema de camino funicular aéreo de alambres múltiples" ("A multi-wire suspended aerial system"), for a cable car with which he obtained a level of safety suitable for the transport of people, not only cargo. The patent was extended to other countries: the United States, Austria, Germany, France, the United Kingdom, and Italy. His cable car used a novel multi-cable support system, in which one end of a cable is anchored to fixed counterweights and the other (through a system of pulleys) to mobile counterweights. With this system, the axial force of the via (track) cables is constant and equal to the weight of the counterweight, regardless of the load in the shuttle. What varies with this load is the deflection of the via cables, which increases as the counterweight rises. Thus, the safety coefficient of these cables is perfectly known, and is independent of the shuttle load. The resulting design is very strong and remains safe in case of a support cable failure.
In April 1889, Torres presented his cableway in Switzerland, a country very interested in this means of transport due to its geography, to run between Pilatus-Kulm and Pilatus-Klimsenhorn (Mount Pilatus). It was an aerial funicular with a length of 2 km and a gradient of 300 m. In 1890 he traveled to that country to convince various authorities to build it. He failed to convince the Swiss, who placed no confidence in the work of a Spanish engineer, and even the newspapers Nebelspalter and Eulm Spiegel published articles and satirical drawings about the project. This disappointment, known as the "Swiss failure", led him to focus on other fields for several years. On 30 September 1907, Torres put into operation a pioneering cableway suitable for public transportation, the Mount Ulia aerial ropeway [es] in San Sebastián. The journey was 280 metres long with a drop of 28 metres, lasted just over three minutes, and the gondola could carry up to 18 people on each trip. The execution of the project was the responsibility of the Society of Engineering Studies and Works of Bilbao, which was established in 1906 by Valentín Gorbeña Ayarragaray, one of his closest friends, with the sole purpose of developing and marketing Torres' patents. The Ulia cable car transported passengers until its closure in 1917. The successful result of this type of cable car gave him the opportunity to design the Spanish Aerocar at Niagara Falls in Canada, based on an idea by J. Enoch Thompson. The 550-metre cableway is an aerial cable car that spans the whirlpool in the Niagara Gorge on the Canadian side. It travels at about 7.2 kilometres per hour (4.5 mph). The load per via cable is 9 tonnes (9.9 short tons), with a safety coefficient of 4.6 for the cables, and it carries 35 standing passengers over a one-kilometre trip. It was constructed between 1914 and 1916. For its construction and assembly, the Niagara Spanish Aerocar Company Limited was set up from the Society of Engineering Studies and Works, with a capital of $110,000 (roughly $3.5 million in 2025), and a planned concession of 20 years. The construction was directed by Torres' son, Gonzalo Torres Polanco. It completed its first tests on 15 February 1916 and was officially inaugurated on 8 August, opening to the public the following day. The cableway, with small modifications, runs to this day with no accidents worthy of mention, constituting a popular tourist and cinematic attraction. The Aero Car is believed to be the sole remaining example of Torres' design for an aerial ferry. Although constructed and operated in Canada, it was a Spanish project from beginning to end: designed by a Spaniard and constructed by a Spanish company with Spanish capital. In 1991, the Niagara Parks Commission received the Leonardo Torres Quevedo Award [es] on the 75th anniversary of the Aero Car, in recognition of its commitment to preserving Torres' design. A plaque, mounted on a boulder in front of the Aero Car Gift Shop, recalls this fact: International Historic Civil Engineering Site. The Niagara Spanish Aerocar. A tribute to the distinguished Spanish Engineer who designed the Niagara Spanish Aerocar. This was only one of his many outstanding contributions to the engineering profession. Engineer Leonardo Torres Quevedo (1852–1936). Constructed 1914–1916. CSCE. The Canadian Society for Civil Engineering. 2010. Asociación de Ingenieros de Caminos, Canales y Puertos de España. Spanish aerial ferry of the Niagara.
Since the middle of the 19th century, a variety of mechanical calculating devices had been known, including integrators, multipliers, and the like. Torres' work in this area falls within that tradition; it began in 1893 with the presentation of the "Memoria sobre las máquinas algébricas" ("Memoir on algebraic machines") at the Spanish Royal Academy of Sciences in Madrid. The paper was reviewed in a report by Eduardo Saavedra in 1894 and published in the Revista de Obras Públicas [es]. Saavedra, who considered Torres' calculating machine "an extraordinary event in the course of Spanish scientific production", recommended that the final project of the device be financed. In 1895 Torres presented "Sur les machines algébriques", accompanied by a demonstration model, at the Bordeaux Congress of the Association pour l'Avancement des Sciences, and in Paris in the Comptes rendus de l'Académie des Sciences. Later, in 1900, he presented a more detailed work, "Machines à calculer" ("Calculating machines"), at the Paris Academy of Sciences. The commission formed by Marcel Deprez, Henri Poincaré and Paul Appell reported favorably and asked the academy to publish it: "In Mécanique analytique, Joseph-Louis Lagrange considers material systems whose connections are expressed by relationships between the coordinates or parameters used to define the position of the system. We can, and this is what Mr. Torres does, take the opposite point of view." They concluded: "In short, Mr. Torres has given a theoretical, general and complete solution to the problem of the construction of algebraic and transcendental relations by means of machines; moreover, he has effectively constructed machines that are easy to use for the solution of certain types of algebraic equations that are frequently encountered in applications." These studies explored the mathematical and physical parallels behind analog computation using continuous quantities, and how such relationships, expressed through mathematical formulas, can be mechanically implemented. The study included complex variables and used the logarithmic scale. From a practical standpoint, it showed that mechanisms such as turning disks could be used endlessly with precision, so that changes in variables were unlimited in both directions. Torres developed a whole series of analogue mechanical calculating machines that used elements known as arithmophores, each consisting of a moving part and an index that made it possible to read the quantity according to the position shown on it. The moving part was a graduated disk or a drum turning on an axis. The angular movements were proportional to the logarithms of the magnitudes to be represented. Between 1910 and 1920, using a number of such elements, Torres built a machine that was able to compute the roots of arbitrary polynomials of order eight, including the complex ones, with a precision down to thousandths. This machine could evaluate the relation

\alpha = \frac{A_1 X^a + A_2 X^b + A_3 X^c + A_4 X^d + A_5 X^e}{A_6 X^f + A_7 X^g + A_8 X^h}

where X is the variable and A_1, ..., A_8 are the coefficients of the terms.
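A modern numerical sketch may help make this relation concrete (it is software only, not a description of the mechanism, and the coefficients and exponents chosen are arbitrary illustrative values). Each term A·X^e is handled through its base-10 logarithm, log A + e·log X, sums of terms are formed with the log(10^x + 1) relation that the next paragraph identifies with Torres' "endless spindle", and a simple bisection stands in for the machine's root-finding in the α = 1 case, to the thousandths mentioned above:

```python
import math

def log_add(log_u, log_v):
    # log10(u + v) = log10(v) + log10(10**(log10(u) - log10(v)) + 1)
    if log_u > log_v:                      # keep u/v <= 1 so the power stays small
        log_u, log_v = log_v, log_u
    return log_v + math.log10(10 ** (log_u - log_v) + 1)

# (A_i, exponent) pairs; purely illustrative values, with X > 0 and A_i > 0 assumed
NUMERATOR   = [(2.0, 3), (1.5, 2), (0.7, 1), (0.3, 0.5), (0.1, 0)]
DENOMINATOR = [(1.0, 4), (0.5, 1), (0.2, 0)]

def log_sum_of_terms(terms, x):
    logs = [math.log10(a) + e * math.log10(x) for a, e in terms]   # each term in log scale
    total = logs[0]
    for term_log in logs[1:]:
        total = log_add(total, term_log)
    return total

def alpha(x):
    return 10 ** (log_sum_of_terms(NUMERATOR, x) - log_sum_of_terms(DENOMINATOR, x))

# Locate the root of the alpha = 1 case by bisection
lo, hi = 0.5, 10.0                          # alpha > 1 at lo, alpha < 1 at hi for these values
while hi - lo > 1e-3:
    mid = (lo + hi) / 2.0
    if alpha(mid) > 1.0:
        lo = mid
    else:
        hi = mid
print(f"alpha(X) = 1 near X = {(lo + hi) / 2:.3f}")
```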
Considering the case of α = 1, the relation becomes the following equation, whose root can then be obtained:

A_1 X^a + A_2 X^b + A_3 X^c + A_4 X^d + A_5 X^e - A_6 X^f - A_7 X^g - A_8 X^h = 0

By handling each term on a logarithmic scale, the terms can be computed using only sums and products, in the form log(A_1) + a·log(X); this accommodates a very wide range of values, and the relative error of the calculation is constant regardless of the size of the value. However, to form the sum of the terms it is necessary to obtain log(u + v) accurately from the already-computed values log(u) and log(v). For this calculation, Torres invented a unique mechanism called the "endless spindle" ("fusée sans fin"), a complex differential gear using a helical gear shaped like a wine bottle, which allowed the mechanical expression of the relation y = log(10^x + 1). Putting log(u) − log(v) = log(u/v) = V, so that u/v = 10^V, the following formula is used to calculate log(u + v):

log(u + v) = log(v(u/v + 1)) = log(v) + log(u/v + 1) = log(v) + log(10^V + 1),

the same technique that underlies the modern electronic logarithmic number system. Around 1900 Torres devised another machine, with a small computing mechanism of gears and linkages, to obtain the complex-number solutions of the quadratic equation X^2 − pX + q = 0. Nowadays, all these machines are kept in the Torres Quevedo Museum at the School of Civil Engineering of the Technical University of Madrid. In 1902, Torres started the project of a new type of dirigible that would solve the serious problem of suspending the gondola. He applied for a patent in France, wrote "Note sur le calcul d'un ballon dirigeable à quille et suspentes intérieures" ("Note on the calculation of a dirigible balloon with keel and interior suspension"), and presented both to the Academies of Science of Madrid and Paris. By the end of that year the report to the Paris Academy of Science was included in the French journal L'Aérophile, and an English-language summary was published in the British The Aeronautical Journal. In 1904, Torres was appointed director of the Centre for Aeronautical Research in Madrid, a civil institution created by the government of Spain "for the technical and experimental study of the air navigation problem and the management of remote engine maneuvers." From March 1905, with Army Engineer Captain Alfredo Kindelán as Technical Assistant, he supervised the construction of the first Spanish dirigible at the Army Military Aerostatics Service, located in Guadalajara, which was completed in June 1908. The new airship, named Torres Quevedo in his honour, made successful test flights with passengers in the gondola. Even so, in 1907 and 1909 he applied for improved patents for his airship in France. He moved all the material to a rented hangar in Sartrouville (Paris), beginning a collaboration with the Société Astra, a new aeronautical company belonging to the conglomerate of the French petroleum businessman Henri Deutsch de la Meurthe and directed by Édouard Surcouf, who had been familiar with Torres' work since 1901. The Astra company managed to buy the patent with a cession of rights extended to all countries except Spain, leaving the use of the system free in that country.
In 1911, construction of the dirigibles known as the Astra-Torres airships began, and Torres would receive royalties of 3 francs for every m³ of each airship sold. In 1910, Torres had also drawn up designs for a 'docking station' to solve the slew of problems airship engineers faced in docking dirigibles. He proposed the idea of attaching an airship's nose to a mooring mast and allowing the airship to weathervane with changes of wind direction. A metal column erected on the ground, to the top of which the bow or stem would be attached directly or by a cable, would allow a dirigible to be moored at any time, in the open, regardless of wind speed. Torres' design also called for the improvement and accessibility of temporary landing sites, where airships were to be moored for the purpose of disembarking passengers. The patent was presented in February 1911 in Belgium, and later, in 1912, in France and the United Kingdom, under the name "Improvements in Mooring Arrangements for Airships". Mooring masts following his design became widely used, as they allowed unprecedented accessibility to dirigibles and eliminated the manhandling required when placing an airship in its hangar. In Issy-les-Moulineaux (south-west of Paris) in February 1911, the trials of the 'Astra-Torres no.1' were successful, with a volume of 1,590 m³ and a speed of up to 53 km/h. Other Astra-Torres dirigibles followed, including the Astra-Torres XIV (HMA No. 3 of the Royal Naval Air Service), which broke the then world speed record for airships in September 1913 by reaching 83.2 km/h, and the Pilâtre de Rozier (Astra-Torres XV), named after the balloonist Jean-François Pilâtre de Rozier, which at 24,300 m³ was the same size as the German Zeppelins and could reach speeds of around 85 km/h. The distinctive trilobed design was also employed in the United Kingdom in the Coastal, C Star, and North Sea airships. The Entente powers used these dirigibles during the First World War (1914–1918) for diverse tasks, principally the escort of convoys, the continuous surveillance of coasts, and the search, from bases in Marseille, Tunisia and Algeria, for German submarines in the Bay of Biscay, the English Channel and the Mediterranean Sea. In 1919, Torres designed, based on a proposal from the engineer Emilio Herrera Linares, a transatlantic dirigible named Hispania, aiming to claim the honour of the first transatlantic flight for Spain. Owing to financial problems, the project was ultimately not carried out. The success of the trilobed blimps during the war even drew the attention of the Imperial Japanese Navy, which in 1922 acquired the Nieuport AT-2, almost 263 ft long, with a maximum diameter of 54 ft and a hydrogen capacity of 363,950 ft³. This type of non-rigid airship continued to be manufactured in various countries during the post-war era, especially those built by the French Zodiac Company, which influenced the design of most later dirigibles. Torres was a pioneer in remote control technology. He began to develop a radio control system around 1901 or 1902, as a way of testing his airships without risking human lives. Between 1902 and 1903, he applied for patents in France, Spain, and Great Britain under the name "Système dit Télékine pour commander à distance un mouvement mécanique" ("Means or method for directing mechanical movements at or from a distance").
On 3 August 1903, he presented the Telekino at the French Academy of Sciences, together with a detailed memoir, and gave a practical demonstration to its members. For the construction of this first model, Torres received help from Gabriel Koenigs, director of the Mechanics Laboratory of the Sorbonne, and from Octave Rochefort, who collaborated by providing wireless telegraphy devices. In 1904 Torres conducted the initial Telekino tests in the Beti Jai fronton of Madrid, which became the temporary headquarters of the Centre for Aeronautical Research, first on an electric three-wheeled land vehicle with an effective range of just 20 to 30 meters; this has been considered the first known example of a radio-controlled unmanned ground vehicle (UGV). In 1905, Torres tested a second model of the Telekino by remotely controlling the maneuvers of an electric boat in the pond of the Casa de Campo in Madrid, achieving distances of up to about 250 m, and later testing a dinghy on the Abra of Bilbao from the terrace of the Club Marítimo in the presence of the president of the Provincial Council and other authorities. A witness to the success of these tests, José Echegaray highlighted that "no one moves" the Telekino: "it moves automatically." It was an automaton of "a certain intelligence, not conscious, but disciplined"; "a material device, without intelligence, interpreting, as if it were intelligent, the instructions communicated to it in a succession of Hertzian waves." These feats were also echoed in the international press. On 25 September 1906, in the presence of King Alfonso XIII and before a great crowd, Torres successfully demonstrated the invention in the port of Bilbao, guiding the boat Vizcaya, with people on board, from the shore, demonstrating a standoff range of 2 km. By applying the Telekino to electrically powered vessels, he was able to select different positions for the steering engine and different velocities for the propelling engine independently. He was also able to act at the same time on other mechanisms, such as a light, switching it on or off, and a flag, raising or lowering it. In all, Torres was able to perform up to 19 different actions with his prototypes. The positive results of these experiences encouraged Torres to apply to the Spanish government for the financial aid required to use his Telekino to steer submarine torpedoes, a technological field that was just starting out. His application was denied, which caused him to abandon further improvement of the Telekino. On 15 March 2007, the Institute of Electrical and Electronics Engineers (IEEE) dedicated a Milestone in Electrical Engineering and Computing to the Telekino, based on the research work carried out at the Technical University of Madrid by Prof. Antonio Pérez Yuste, who was the driving force behind the Milestone nomination. In 1907, Torres introduced in Vienna a formal language for the description of mechanical drawings, and thus of mechanical devices. He had previously published "Sobre un sistema de notaciones y símbolos destinados a facilitar la descripción de las máquinas" ("On a system of notations and symbols intended to facilitate the description of machines") in the Revista de Obras Públicas. According to the Austrian computer scientist Heinz Zemanek, this was equivalent to a programming language for the numerical control of machine tools. He defined a table of symbols, a collection of rules and, as usual in his works, applied them to an example.
This symbolic language reveals Torres' main capacities: both his ability to detect a problem, in this case one social in origin with technical consequences, and his capacity for creation – invention – to give a rational, properly technical response. In the words of Torres: "Charles Babbage and Franz Reuleaux – and I suppose others as well, although I have no news of them – have tried, without any success, to remedy this inconvenience; but the fact that these eminent authors have failed should not be a sufficient reason to abandon such an important effort." Babbage, Reuleaux and Torres failed; the world of machines continues without any symbolic language other than descriptive geometry. As a member of the steering committee of the Junta para Ampliación de Estudios [es] (JAE), established in 1907 in Madrid to promote research and scientific education in Spain, Torres played a leading role in the creation of three key state agencies that set the pattern for the JAE's support of research, regardless of discipline: the Laboratory of Automation (1907), of which he was named director and which was devoted to the construction of instruments; the Laboratories Association (1910), a union of state laboratories and workshops; and the Institute of Science Materials (1911), which handled budget allocation. The Laboratory of Automation produced the most varied instruments; it not only built its own inventions, but also provided services and support to universities and to researchers of the JAE. Torres, the physicist Blas Cabrera, and Juan Costa, the head of the workshop, jointly designed several scientific instruments (a Weiss-type electromagnet, an X-ray spectrometer, a mechanism to handle a Bunge scale by remote control, a reservoir of variable height with micrometric movements for magneto-chemical measurements, and so on). Ángel del Campo [es], head of the Spectroscopy Section of the Laboratory of Physical Research and Miguel A. Catalán's teacher, commissioned spectrographic equipment from Torres's workshop; Manuel Martínez Risco [es] requested a Michelson-type interferometer with variable distance, Juan Negrín requested a stalagmometer, and Santiago Ramón y Cajal commissioned a microtome, a panmicrotome, and a projector for film screenings. The development of the Laboratory of Automation reached its peak with the reform of the Palace of the Arts and Industry [es] to house the School of Industrial Engineers, the JAE, and the National Museum of Natural Sciences, which also enlarged the Laboratory itself. In 1939 the Laboratory of Automation gave rise to the Torres Quevedo Institute of the Spanish National Research Council (Consejo Superior de Investigaciones Científicas, CSIC). By the beginning of 1910 Torres had commenced work on a chess-playing automaton, which he dubbed El Ajedrecista (The Chess Player). As opposed to The Turk and Ajeeb, El Ajedrecista was an electromechanical machine with true integrated automation that could automatically play a king-and-rook endgame against a lone king from any position, without any human intervention. The pieces had a metallic mesh at their base, which closed an electric circuit that encoded their position on the board. When the black king was moved by hand, an algorithm calculated and performed the next best move for the white player. If an illegal move was made by the opposing player, the automaton would signal it by turning on a light. If the opposing player made three illegal moves, the automaton would stop playing.
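The flavour of that rule checking can be suggested with a short sketch (illustrative only: the coordinates, example position and logic below are hypothetical and do not reproduce Torres' actual circuitry or his six subrules). It tests whether a proposed black king reply is legal in the king-and-rook endgame and mimics the lamp signal and the three-strike stop described above:

```python
def adjacent(a, b):
    """True if two squares are a king's step apart (squares are (file, rank), 0..7)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1])) == 1

def rook_attacks(rook, white_king, target):
    """True if the rook attacks `target`; only the white king can block, since the
    black king is the piece that is moving away from its own square."""
    if target == rook or (rook[0] != target[0] and rook[1] != target[1]):
        return False
    step = ((target[0] > rook[0]) - (target[0] < rook[0]),
            (target[1] > rook[1]) - (target[1] < rook[1]))
    square = (rook[0] + step[0], rook[1] + step[1])
    while square != target:
        if square == white_king:
            return False
        square = (square[0] + step[0], square[1] + step[1])
    return True

def black_king_move_is_legal(wk, wr, bk, dest):
    on_board = 0 <= dest[0] <= 7 and 0 <= dest[1] <= 7
    one_step = adjacent(bk, dest)
    if dest == wr:                       # capturing the rook is legal only if it is undefended
        return on_board and one_step and not adjacent(wr, wk)
    clear_of_white_king = dest != wk and not adjacent(dest, wk)
    return on_board and one_step and clear_of_white_king and not rook_attacks(wr, wk, dest)

# White king c3 = (2, 2), white rook e1 = (4, 0), black king e5 = (4, 4)
illegal_count = 0
for dest in [(4, 5), (6, 4), (4, 3)]:    # three attempted black replies
    if black_king_move_is_legal(wk=(2, 2), wr=(4, 0), bk=(4, 4), dest=dest):
        print("move accepted:", dest)
    else:
        illegal_count += 1
        print("signal lamp: illegal move", dest)
        if illegal_count == 3:
            print("automaton stops playing")
```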
The automaton does not deliver checkmate in the minimum number of moves, nor always within the 50 moves allotted by the fifty-move rule, because of the simplicity of the algorithm that calculates the moves. It did, however, checkmate the opponent every time. Claude Shannon noted in his work Programming a Computer for Playing Chess (1950) that Torres' machine was quite advanced for its period. The device has been considered the first computer game in history. It created great excitement when it made its public debut at the University of Paris in 1914. Its internal construction was published by Henri Vigneron in the French magazine La Nature. On 6 November 1915, Scientific American published in its Supplement No. 2079 (pp. 296–298) an illustrated article entitled "Torres and His Remarkable Automatic Devices. He Would Substitute Machinery for the Human Mind". It was summarized as follows: "The inventor claims that the limits within which thought is really necessary need to be better defined, and that the automaton can do many things that are popularly classed with thought". In November 1922, about to turn 70, Torres finished the construction designs of the second chess player, into which, under his direction, his son Gonzalo introduced various improvements. The mechanical arms that moved the pieces were replaced by electromagnets located under the board, which slid the pieces from one square to another. This version included a gramophone with a voice recording announcing checkmate when the machine won the game. Torres first presented it in Paris in 1923. His son later exhibited the improved machine at several international meetings, introducing it to a wider audience at the 1951 Paris conference on computers and human thinking, where Norbert Wiener played against it on 12 or 13 January. At the conference El Ajedrecista defeated Savielly Tartakower, who thus became the first grandmaster to lose to a machine. It was also demonstrated at the 1958 Brussels World's Fair. Heinz Zemanek, who played against the device, described it as "a historical witness of automaton artistry that was far ahead of its time. Torres created a perfect algorithm with 6 subrules which he realized with the technological means of that time, essentially with levers, gearwheels, and relays." It has been commonly assumed (see Metropolis and Worlton 1980) that Charles Babbage's work on a mechanical digital program-controlled computer, which he started in 1835 and pursued off and on until his death in 1871, had been completely forgotten and was only belatedly recognized as a forerunner to the modern digital computer. Ludgate, Torres y Quevedo, and Bush give the lie to this belief, and all made fascinating contributions that deserve to be better known. — Brian Randell, presentation at MIT (1980), printed in Annals of the History of Computing, IEEE (October 1982) On 19 November 1914, Torres published "Ensayos sobre Automática. Su definición. Extensión teórica de sus aplicaciones" ("Essays on Automatics. Its Definition – Theoretical Extent of Its Applications") in the Revista de Obras Públicas. It was translated into French, under the title "Essais sur l'Automatique", in the Revue Générale des Sciences Pures et Appliquées, 1915, vol. 2, pp. 601–611.
This paper is Torres' major written work on the subject he called Automatics, concerning "another type of automaton of great interest: those that imitate, not the simple gestures, but the thoughtful actions of a man, and which can sometimes replace him". He drew a distinction between the simpler sort of automaton, which has invariable mechanical relationships, and the more complicated, interesting kind, whose relationships between operating parts alter "suddenly when necessary circumstances arise". Such an automaton must have sense organs, that is, "thermometers, magnetic compasses, dynamometers, manometers", and limbs, as Torres called them: mechanisms capable of executing the instructions that would come from the sense organs. The automaton postulated by Torres would be able to make decisions so long as "the rules the automaton must follow are known precisely". The paper provides the main link between Torres and Babbage. It gives a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine, describes the Analytical Engine as exemplifying Torres' theories about the potential power of machines, and takes the problem of designing such an engine as a challenge to his skills as an inventor of electromechanical devices. It contains a complete design (albeit one that Torres regarded as theoretical rather than practical) for a machine capable of calculating completely automatically the value of the formula a^x(y − z)^2 for a sequence of sets of values of the variables involved. It demonstrates cunning electromechanical gadgets for storing decimal digits, for performing arithmetic operations using built-in function tables, and for comparing the values of two quantities. The whole machine was to be controlled from a read-only program (complete with provisions for conditional branching), represented by a pattern of conducting areas mounted around the surface of a rotating cylinder. The paper also introduced the idea of floating-point arithmetic, which the historian Randell says was described "almost casually", apparently without recognition of the significance of the discovery. Torres proposed a format that showed he understood the need for a fixed-size significand, as is presently used for floating-point data. He did it in the following way: "Very large numbers are as embarrassing in mechanical calculations as in usual calculations (Babbage planned 50 wheels to represent each variable, and even then they would not be sufficient if one does not have recourse to the means that I will indicate later, or to some analogous device). In the latter, they are usually avoided by representing each quantity by a small number of significant figures (six to eight at the most, except in exceptional cases) and by indicating by a comma or zeros, if necessary, the order of magnitude of the units represented by each digit. Sometimes also, so as not to have to write a lot of zeros, we write the quantities in the form n × 10^m. We could greatly simplify this writing by arbitrarily establishing these three simple rules: 1. n will always have the same number of digits (six, for example). 2. The first digit of n will be of the order of tenths, the second of hundredths, etc. 3. One will write each quantity in the form: n; m. Thus, instead of 2435.27 and 0.00000341682, one will write, respectively, 243527; 4 and 341682; −5.
I have not indicated a limit for the value of the exponent, but it is obvious that, in all the usual calculations, it will be less than one hundred, so that, in this system, one will write all the quantities which intervene in calculations with eight or ten digits only." The paper ends with a comparison of the advantages of electromechanical techniques over the purely mechanical means that were all that were available to Babbage. It establishes that Torres would have been quite capable of building a general-purpose electromechanical computer more than 20 years ahead of its time, had the practical need, motivation, and financing been present. "The achievements of George Stibitz, Howard Aiken and IBM, and Konrad Zuse crown the transitory but capital period of relays and theoreticians. This stage of the march towards automatic calculation was built on a summary and proven technology, that of electromagnetic relays. The very modesty of this technological level contributes to giving a brilliant relief to the quality of the intellectual contributions of Torres y Quevedo, Alan Turing, and Claude Shannon." — Robert Ligonnière, Préhistoire et Histoire des ordinateurs (1987) Torres went on to prove his theories with a series of working prototypes. He demonstrated twice, in 1914 and in 1920, that all of the cogwheel mechanisms of a calculating machine like that of Babbage could be implemented using electromechanical parts. His 1914 analytical machine used a small memory built with electromagnets and was capable of evaluating p × q − b. In 1920, during a conference in Paris commemorating the centenary of the invention of the mechanical arithmometer, Torres surprised attendees with the demonstration of the "Arithmomètre Électromécanique" (Electromechanical Arithmometer). It consisted of an arithmetic unit connected to a (possibly remote) typewriter, on which commands could be typed and the results printed automatically (for example, typing "532 × 257" and then "=" caused the result to be printed). This calculator was not programmable, but it was able to print the numerical value of the answer. From the user-interface point of view, this machine can be regarded as the predecessor of current computers that use a keyboard as an input device. It was also envisaged that calculations could be performed remotely by extending the electric wires, which makes the device a rudimentary forerunner of today's online systems that use communication lines. Torres had no thought of making such a machine commercially, viewing it instead as a means of demonstrating his ideas and techniques. Furthermore, in his paper about this device, he pointed out the need for various automatic machines to represent continuous numerical values as finite, discrete values for processing and evaluation, which corresponds to modern digital data. From 26 April to 23 September 1990, an exhibition called De la Machine à Calculer de Pascal à l'Ordinateur. 350 ans d'Informatique was held at the Musée des Arts et Métiers in Paris, where Torres' invention was recognized as one of the first digital calculation systems: "In 1920, the Spaniard Leonardo Torres Quevedo built a fully automatic electromagnetic arithmometer. To do this, he used relay technology, developed for the needs of the telephone."
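The fixed-significand notation quoted from the 1914 essay translates directly into a few lines of code (a modern sketch: the six-digit length follows Torres' own example, while the rounding rule is an assumption the paper does not spell out):

```python
import math

DIGITS = 6   # Torres' example uses six significant digits

def encode(value):
    """Write `value` as "n; m", where n is read as 0.n, so the value is 0.n * 10**m."""
    if value == 0:
        return "000000; 0"
    m = math.floor(math.log10(abs(value))) + 1        # exponent with 0.1 <= |value|/10**m < 1
    n = round(abs(value) / 10 ** m * 10 ** DIGITS)    # six-digit significand (rounded)
    if n == 10 ** DIGITS:                             # rounding spilled over, e.g. 0.9999996
        n //= 10
        m += 1
    sign = "-" if value < 0 else ""
    return f"{sign}{n:0{DIGITS}d}; {m}"

def decode(text):
    n_part, m_part = text.split(";")
    return int(n_part) / 10 ** DIGITS * 10 ** int(m_part)

for x in (2435.27, 0.00000341682):                    # the two quantities from the essay
    print(x, "->", encode(x), "->", decode(encode(x)))
```

Run on the two quantities from the essay, the sketch reproduces 243527; 4 and 341682; −5.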
In those days, when the outbreak of the Great War was anticipated, Torres designed a transport ship intended to accompany fleets. On 30 July 1913, he patented the "Buque campamento" ("Camp-Vessel"), an airship carrier with a mooring mast and a hold large enough to house up to two inflated units, along with hydrogen cylinders. He had thought of the possibility of combining aeronautics with the navy in this way, and offered his patent to Vickers Limited, although the conglomerate showed no interest in the project. Negotiations continued, and Torres reached Admiral Reginald Bacon, who, on 17 March 1914, wrote from the Coventry Ordnance Works that "the experience of the Navy has invariably been that any auxiliary craft carried on board ship are of very little real service". A few years later, in 1922, the Spanish Navy would construct a real airship carrier, the Dédalo, to be used in the war against Morocco. In 1916 Torres patented in Spain a new kind of ship, a multihull steel vessel which received the name "Binave" ("Twin Ship"). He applied for a patent for the Binave in the United Kingdom under the name "Improvements in Ships" in 1917, and the vessel was built by the Euskalduna company in Bilbao in 1918, with several test voyages such as the successful round trip to Santoña on 28 September. The tests were resumed in 1919, and the certificate of implementation of the patent was obtained on 12 November of that year. The design introduced new features, including two 30 HP Hispano-Suiza marine engines and the ability to modify its configuration while sailing, with two rudders positioned at the stern of each float and the propellers also placed aft. As a result of the experience acquired in the tests, it was considered appropriate in 1920 to add a lower keel to each of the floats proposed in the patent in order to improve stability, making the craft similar to modern catamarans, whose development would become widespread from the 1990s onwards. Apart from the aforementioned inventions, Torres patented the "Indicadores coordinados" ("Coordinate Indicators", 1901), a guidance system for vehicles and pedestrians using markers installed on streetlights throughout an entire city, which he proposed for Madrid and Paris under the name "Guide Torres"; the "Dianemologo" (1907), an apparatus for copying a speech as it is delivered, without the need for shorthand; "Globos fusiformes deformables" ("Deformable Fusiform Balloons", 1914), a fusiform envelope with a variable cross-section depending on the volume of the hydrogen contained; and "Enclavamientos T.Q." ("Interlocks T.Q.", 1918), a railway interlocking system of his own design to protect the movement of trains within a certain area. In the last years of his life, Torres turned his attention to the field of education, investigating devices or machines that could help educators in their task. His last patents related to subjects such as typewriters and their improvement (1922–23), the marginal pagination of books (1926) and, especially, the "Puntero Proyectable" ("Projectable Pointer", 1930) and the "Proyector Didáctico" ("Didactic Projector", 1930). The Projectable Pointer was based on the shadow produced on a plate or screen by an opaque body in motion. The presenter could move the pointer to any place on the plate (today a slide) by operating an articulated system. The Didactic Projector improved the way slides were placed on glass plates for projection. In the early 1900s, Torres learned the international language Esperanto, and he remained an advocate of the language throughout his life.
From 1922 to 1926 he participated in the work of the International Committee on Intellectual Cooperation of the League of Nations, in which such figures as Albert Einstein, Marie Curie, Gilbert Murray and Henri Bergson, its first president, took part. Torres proposed to the Committee that it study the role of an artificial auxiliary language in facilitating scientific relations between peoples. Although almost half of the Committee members were in favor of Esperanto, his motion was strongly opposed by President Bergson, amid clear notice from French diplomats to put the influence of French culture first; among them, the French ambassador in Bern considered Torres a "farouchement espérantiste" ("fierce Esperantist"). In 1925 he participated as the official representative of the Spanish government in the "Conference on the Use of Esperanto in Pure and Applied Sciences" held in Paris, together with Vicente Inglada Ors [es] and Emilio Herrera Linares. That same year, he joined the Honorary Committee of the Spanish Association of Esperanto [es] (HEA) founded by Julio Mangada, and he continued defending the language in other forums until his death in 1936. In 1910 Torres traveled to Argentina with the Infanta Isabel to attend the International Scientific Congress held in Buenos Aires, one of the events organized to mark the centenary of the independence of Argentina. At the congress he proposed, along with the Argentine engineer Santiago Barabino, the constitution of a Spanish-American board of scientific technology, which would eventually become the "Unión Internacional Hispano–Americana de Bibliografía y Terminología Científicas". The board's first task was the publication of a technological dictionary of the Spanish language to tackle the problems caused by the increasing use of scientific and technological neologisms and the adaptation of words from other languages in the face of an avalanche of foreign terms. As a result of the work of this board, the Diccionario Tecnológico Hispanoamericano (Hispanic American Technological Dictionary) began to be published in fascicles between 1926 and 1930, although a complete edition did not appear until 1983, with a second, expanded edition in 1990. Distinctions Over the years, Torres received an increasing number of decorations, prizes, and society memberships, both Spanish and foreign. In 1901, he entered the Spanish Royal Academy of Sciences in Madrid for his work of those years on algebraic machines, an institution of which he was president between 1928 and 1934. In 1916 King Alfonso XIII of Spain bestowed the Echegaray Medal upon him, and in 1918 he declined the offer of the position of Minister of Development. In 1920, he was admitted to the Real Academia Española, to fill seat N, vacated by the death of Benito Pérez Galdós. In his acceptance speech he said, in a humble and humorous way: "You were wrong in choosing me, as I do not have that minimum culture required of an academic. I will always be a stranger in your wise and learned society. I come from very remote lands. I have not cultivated literature, nor art, nor philosophy, nor even science, at least in its higher degrees… My work is much more modest. I spend my busy life solving practical mechanics problems.
My laboratory is a locksmith's shop, more complete, better equipped than those usually known by that name, but destined, like all of them, to design and build mechanisms…" That same year Torres was elected President of the Spanish Royal Physics Society and of the Royal Spanish Mathematical Society, holding the latter position until 1924, and he became a member of the Mechanics Section of the Paris Academy. In 1921 he was appointed President of the International Spanish-American Union of Scientific Bibliography and Technology. From 1921 to 1928 he held the presidency of the Spanish section of the International Committee for Weights and Measures, where, thanks to his experience in the development of instruments, he contributed to the improvement of measurements made in the laboratories of the International Bureau of Weights and Measures (BIPM). In 1923 he became an Honorary Academician of the Geneva Society of Physics and Natural History [fr]. In 1925 he was promoted to Corresponding Member of the Hispanic Society of America. In 1926 he became Honorary Inspector General of the Corps of Civil Engineers. On 27 June 1927 he was named one of the twelve foreign associate academicians of the French Academy of Sciences, with 34 votes in favor of his entry, surpassing Ernest Rutherford (4 votes) and Santiago Ramón y Cajal (2 votes). He received numerous other accolades. Personal life, religious beliefs and death On 16 April 1885 Torres married Luz Polanco y Navarro (1856–1954) in Portolín (Molledo). The marriage lasted 51 years and produced eight children, three sons and five daughters: Leonardo (born 1887, who died in 1889 at the age of two), Gonzalo (born 1893, died 1965, who also became an engineer and worked as an assistant to his father), Luz, Valentina, Luisa, Julia (who also died young), Joaquina, and Fernando. After the death of his first son in 1889, Torres moved with his family to Madrid with the firm intention of putting into practice the projects he had devised in previous years. During this time he attended the Athenæum in the Spanish capital and the literary gatherings at the Café Suizo [es], though generally without participating in debates and discussions of a political nature. He lived for many years at Calle de Válgame Dios [es] nº 3. Torres was a devout Catholic who regularly read the catechism and took communion every First Friday of the month, as if intimately preparing himself for the peaceful end that awaited him. His daughter Valentina told him on one occasion: "Dad, maybe you don't fully understand the mysteries that faith offers us, just as I don't understand your inventions either", and he responded affectionately: "Oh daughter, it's just that from God to me there is an infinite distance!". Once the Spanish Civil War began, his daughter Luz was arrested by the militia, and the family had to invoke the fact that Torres was a Commander of the Legion of Honour to save her life, which involved the intervention of the French Embassy. In his last moments, his family managed to have the sacraments administered to him despite the difficulties caused by religious persecution. At the moment of receiving extreme unction, he pronounced his last words: "Memento homo, quia pulvis es et in pulverem reverteris" ("Remember, man, that you are dust and to dust you will return"). On 18 December 1936, after a progressive illness, Torres died at his son Gonzalo's home in Madrid, in the middle of the Civil War, ten days before his eighty-fourth birthday.
He was initially buried in the Cementerio de la Almudena; his remains were moved in 1957 to the monumental Saint Isidore Cemetery. Legacy "The learned Spanish engineer Torres Quevedo – today a foreign associate of our Academy of Sciences – who is perhaps the most prodigious inventor of our time, at least in terms of mechanisms, has not been afraid to address Babbage's problem in turn..." "What perspectives do such marvels not open about the possibilities of the future regarding the reduction to a purely mechanical process of any operation that obeys mathematical rules! In this area, the way was opened, almost three centuries ago, by the genius of Pascal; in recent times, the genius of Torres Quevedo has managed to make it penetrate into regions where we would never have dared to think a priori that it could have access." — Philbert Maurice d'Ocagne, Hommes et choses de science, 1930 The distressing circumstances that Spain was going through during its Civil War meant that Torres' death in 1936 went somewhat unnoticed. However, newspapers such as The New York Times, as well as the French mathematician Maurice d'Ocagne, reported on his death in obituaries and articles in 1937–38, and d'Ocagne gave lectures on his research work in Paris and Brussels. In the years following his death, Torres was not forgotten. When the Spanish National Research Council (CSIC) was created in 1939, the architect Ricardo Fernández Vallespín [es] was commissioned to design and construct a large building in Madrid to house the new «Leonardo Torres Quevedo» Institute of Applied Physics, which was completed in 1943. It was dedicated to "designing and manufacturing instruments and investigating mechanical, electrical and electronic problems", and was the germ of the current Institute of Physical and Information Technologies "Leonardo Torres Quevedo" (ITEFI). In 1940 his name was among those selected by the American philanthropist Archer Milton Huntington to be inscribed on the building of the Hispanic Society of America. In 1953, commemorative events for the centenary of his birth took place at the Spanish Royal Academy of Sciences with the participation of leading academic, scientific and university figures from Spain and abroad, among them Louis Couffignal, Charles Lambert Manneback, and Aldo Ghizzetti [it]. Two postage stamps were issued in Spain in his honour, in 1955 and 1983, the latter alongside an image of the Niagara cable car, regarded as a work of genius. In 1965, the City Council of Madrid dedicated a commemorative plaque to him on his residence building at Válgame Dios, 3, informing the people of Madrid that "the scientist who brought so much glory to Spain lived in that place." In 1978 his work was honoured in Madrid with an exhibition at the Palacio de Cristal del Retiro, organized by the College of Civil Engineers under José Antonio Fernández Ordóñez [es]. The Leonardo Torres Quevedo National Research Award [es] was established in Spain in 1982 by the Ministry of Science in recognition of the merits of Spanish scientists or researchers in the field of engineering. The same year the Leonardo Torres Quevedo Foundation [es] (FLTQ) was created under his name as a non-profit organization to promote scientific research within the framework of the University of Cantabria and to train professionals in this area. The Foundation had its headquarters at the University of Cantabria School of Civil Engineering.
A bronze statue on a stone pedestal was erected in 1986 on the occasion of the fiftieth anniversary of his death. The work was commissioned from the sculptor Ramón Muriedas [es] and is located in Santa Cruz de Iguña, Torres' birthplace. Between the end of the 1980s and the mid-1990s, three symposia on his life and work, titled Leonardo Torres Quevedo, su vida, su tiempo, su obra, were held in Spain, in Molledo (1987), Camargo (1991) and Pozuelo de Alarcón (1995). On 19 July 2008, Spain's National Lottery [es] commemorated the centenary of the Torres Quevedo airship built in Guadalajara, which marked the beginnings of the Spanish Air Force. In November of that year, the Leonardo Torres Quevedo Centre was established in Santa Cruz, Molledo, dedicated to his life and work. On 28 December 2012, Google celebrated his 160th birthday with a Google Doodle. The company had also commemorated the 100th anniversary of El Ajedrecista, highlighting that it was a marvel of its time and could be considered the "grandfather" of current video games; a conference was organized on 7 November in cooperation with the School of Telecommunication Engineering of the Technical University of Madrid to exhibit Torres' devices. Since 2015, an image of his Mount Ulia aerial ropeway [es], a pioneering cable car built in San Sebastián in 1907 to transport people, can be seen on the 'visas' page of Spanish passports. On 8 August 2016, the 100th anniversary of the Whirlpool Aero Car was celebrated, marking a century of uninterrupted operation without any accidents. The ceremony included members of the Torres Quevedo family, who made a special trip from Spain to attend the anniversary celebrations, and Carlos Gómez-Múgica [es], the Spanish Ambassador to Canada. According to Niagara Parks Commission Chair Janice Thomson, "this morning's celebrations have allowed us to properly mark an important milestone in the history of the Niagara Parks Commission, all while recognizing the accomplishments and paying tribute to Leonardo Torres Quevedo, who through his work made a lasting impression on both the engineering profession and the tourism industry here in Niagara." In February 2022 the new turbosail of La Fura dels Baus' vessel La Naumon was presented in Santander: a large white structure at whose base stands the figure of Leonardo Torres Quevedo, after whom the device was named. A museum called El Valle de los Inventos was opened in La Serna de Iguña, offering a permanent exhibition about him and his inventions, with guided tours, scientific workshops and an escape room. On 4 July, the flag carrier Iberia received the fifth of the six Airbus A320neo aircraft planned for that year; this A320neo, with registration EC-NTQ, bears the name "Leonardo Torres Quevedo" in his honour. On 5 May 2023, the Instituto Cervantes opened the Caja de las Letras to house the "in memoriam" legacy of Leonardo Torres Quevedo. The deposited objects include letters and manuscripts; a dozen publications, including books, monographs and catalogues; postcards and a timetable of the Niagara Falls cable car designed by him; and the Milestone awarded by the Institute of Electrical and Electronics Engineers recognizing the engineer's pioneering development of remote control in 1901 with the Telekino.
Torres' granddaughter Mercedes Torres Quevedo expressed her gratitude to the institution, on behalf of all his descendants, for welcoming her grandfather's legacy, and the "pride" all of them feel for the scientific and humanistic work he carried out throughout his life. The legacy was deposited in box number 1275, with the keys held by his descendants and by the institution itself. In fiction Leonardo Torres Quevedo is a main character of the novel Los horrores del escalpelo (The Horrors of the Scalpel, 2011), written by Daniel Mares. The plot tells how the Spanish engineer travels to London in 1888 to find Maelzel's Chess Player, a mechanical automaton believed to have been lost for decades. Together with Raimundo Aguirre, a thief and murderer who claims to hold the clue to the lost automaton, he begins the search through the London underworld and Victorian high society. The search is interrupted when corpses of prostitutes begin to appear at dawn in the streets of the Whitechapel neighborhood, drawing Torres and his partner Aguirre into the hunt for Jack the Ripper.
======================================== |
[SOURCE: https://www.reddit.com/help/healthycommunities/] | [TOKENS: 893] |
Moderator Code of Conduct Effective June 5, 2025. Reddit’s mission is to bring community, belonging, and empowerment to everyone in the world. Moderators are key to making this happen: you are at the frontlines using your creativity, decision-making, and passion to create fun and engaging spaces for redditors. The Moderator Code of Conduct serves to clarify our expectations of mods, help you develop subreddit rules and norms to create and nurture your communities, and empower you to make decisions more easily. Your role as a moderator is an important one in shaping a positive community experience. Whether you’re new to moderating, or have been moderating for years, our goal is to make sure you feel safe and supported. We also expect that moderators uphold the Reddit Rules and abide by Reddit’s User Agreement (especially Section 8), as well as make a concerted effort to remove and report violating content in their communities. Remember, your subreddit and moderator team can be held accountable for individual moderator actions. Given this, it’s important to continuously collaborate with your fellow mods to understand and adhere to the Moderator Code of Conduct. If you have any questions, please don’t hesitate to let us know. TL;DR: If you follow the tenets below, we’ll stay out of your hair. If you don't, we'll reach out to remedy any issues. Moderators are expected to uphold the Reddit Rules by setting community rules, norms, and expectations that abide by our site policies. Your role as a moderator means that you not only abide by our terms and the Reddit Rules, but that you actively strive to promote a community that abides by them, as well. This means that you should never create, approve, enable or encourage rule-breaking content or behavior. The content in your subreddit that is subject to the Reddit Rules includes, but is not limited to: Users who enter your community should know exactly what they’re getting into, and should not be surprised by what they encounter. It is critical to be transparent about what your community is and what your rules are in order to create stable and dynamic engagement among redditors. Moderators can ensure people have predictable experiences on Reddit by doing the following: While we allow meta discussions about Reddit, including other subreddits, your community should not be used to direct, coordinate, or encourage interference in other communities and/or to target redditors for harassment. As a moderator, you cannot interfere with or disrupt Reddit communities, nor can you facilitate, encourage, coordinate, or enable members of your community to do this. Interference includes: Whether your community is big or small, it is important for communities to be actively and consistently moderated. This will ensure that issues are being addressed, and that redditors feel safe as a result. Being active and engaged means that: Users expect that content in communities is authentic, and trust that moderators make choices about content based on community and sitewide rules. In order to maintain that trust, moderators are prohibited from taking moderation actions (including actions taken using mod tools, bots, and other services) in exchange for any form of compensation, consideration, gift, or favor from or on behalf of third parties. 
Some examples of moderator actions include, but are not limited to: Some examples of compensation include, but are not limited to: Events and engagements with third parties, activity in your subreddit from a brand or company, or employees of a company starting and/or maintaining a subreddit are allowed, so long as no compensation is received. We will strive to work with you to resolve issues without having to resort to restrictive measures. In most cases, we can achieve resolution and understanding through discussion, not remediation. If an Admin reaches out to let you know that you’ve violated the Moderator Code of Conduct, your cooperation and swift responsiveness can help to resolve the issue. Many issues can be resolved easily by, for example, updating the community rules, engaging in active moderation, and adding more moderators. You are welcome to ask questions and seek clarity. If we find the issues to be unresolvable via educational outreach or moderators refuse to collaborate or abide by the CoC, we may consider the following enforcement actions: We understand that moderating a community can be a challenge. The resources below can greatly assist you in curating a strong, stable community: To file a Moderator Code of Conduct report, please use this form. Copyright 2026 Reddit, Inc. All rights reserved. |
======================================== |
[SOURCE: https://github.com/features/issues] | [TOKENS: 771] |
Navigation Menu Search code, repositories, users, issues, pull requests... Provide feedback We read every piece of feedback, and take your input very seriously. Saved searches Use saved searches to filter your results more quickly To see all available qualifiers, see our documentation. Project planning for developers Create issues, break them into sub-issues, track progress, add custom fields, and have conversations. Visualize large projects as tables, boards, or roadmaps, and automate everything with code. Logos for Shopify, Vercel, Stripe, Ford, and NASA Break issues into sub-issues Tackle complex issues with sub-issues and track their status with progress indicators. Navigate the full scope of work all in one view. Streamline conversations Express ideas with GitHub Flavored Markdown, mention contributors, react with emoji, clarify with attachments, and see references from commits, pull requests, releases, and deploys. Coordinate by assigning contributors and teams, or by adding them to milestones and projects. All in a single timeline. Features Bored of boards? Switch to tables and roadmaps. Create views for how you work. Track metadata like iterations, priority, story points, dates, notes, and links. Add custom fields to projects and edit from the issue sidebar. Track the health of your current iteration cycle, milestone, or any other custom field you create with new project insights. Identify bottlenecks and issues blocking the team from making progress with the new burn up chart. Create templates to share and reuse when getting started with a new project. Share inspiration across teams and get started with a single click. Manage work automatically Accelerate your project planning with workflows. Automatically triage issues, set values for custom fields, or archive issues. Issues, where you need them Issues can be viewed, created, and managed in your browser, your favorite terminal, or on your phone or tablet. View, update, and create issues without ever leaving your terminal. Create and manage issues on the go with our native iOS and Android mobile apps. What developers are saying Flexible project planning for developers We all need a way to plan our work, track issues, and discuss the things we build. Our answer to this universal question is GitHub Issues, and it’s built-in to every repository. GitHub’s issue tracking is unique because of our focus on simplicity, references, and elegant formatting. With GitHub Issues, you can express ideas with GitHub Flavored Markdown, assign and mention contributors, react with emojis, clarify with attachments and videos, plus reference code like commits, pull requests, and deploys. With task lists, you can break big issues into tasks, further organize your work with milestones and labels, and track relationships and dependencies. We built GitHub Issues for developers. It is simple, adaptable, and powerful. As teams and projects grow, how we work evolves. Tools that hard-code a methodology are too specific and rigid to adapt to any moment. Often, we find ourselves creating a spreadsheet or pulling out a notepad to have the space to think. Then our planning is disconnected from where the work happens. The new Projects connect your planning directly to the work your teams are doing and flexibly adapt to whatever your team needs at any point. Built like a spreadsheet, project tables give you a live canvas to filter, sort, and group issues and pull requests. 
You can use it, or the accompanying project board, along with custom fields, to track a sprint, plan a feature, or manage a large-scale release. All users have access to the free tier of GitHub Issues and Projects. For more information about paid tiers, see our pricing page. Yes! GitHub Enterprise Server (GHES) support follows our regular cadence of one to two quarters before enabling the on-premises functionality. Site-wide Links Get tips, technical guides, and best practices. Twice a month. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Haskalah] | [TOKENS: 7063] |
Contents Haskalah The Haskalah (Hebrew: הַשְׂכָּלָה, literally, "wisdom", "erudition" or "education"), often termed the Jewish Enlightenment, was an intellectual movement among the Jews of Central and Eastern Europe, with a certain influence on those in Western Europe and the Muslim world. It arose as a defined ideological worldview during the 1770s, and its last stage ended around 1881, with the rise of Jewish nationalism. The movement advocated against Jewish reclusiveness and encouraged the adoption of prevalent attire over traditional dress, while also working to diminish the authority of traditional community institutions such as rabbinic courts and boards of elders. It pursued a set of projects of cultural and moral renewal, including a revival of Hebrew for use in secular life, which resulted in an increase in Hebrew found in print. Concurrently, it strove for an optimal integration into surrounding societies. Practitioners promoted the study of exogenous culture, style and language, as well as the adoption of modern values. At the same time, it pursued greater economic productivity and the taking up of new occupations. The Haskalah promoted rationalism, romanticism, liberalism, relativism, and enquiry, and is largely perceived as the Jewish variant of the general Age of Enlightenment. The movement encompassed a wide spectrum ranging from moderates, who hoped for maximal compromise, to radicals, who sought sweeping changes. In its various forms, the Haskalah played an important, though limited, part in the modernization of Central and Eastern European Jews. Its activists, the Maskilim, exhorted and implemented communal, educational and cultural reforms in both the public and the private spheres. Owing to its dual policies, it collided both with the traditionalist rabbinic elite, which attempted to preserve old Jewish values and norms in their entirety, and with the radical assimilationists who wished to eliminate or minimize the existence of the Jews as a defined collective. Definitions The Haskalah was multifaceted, with many loci which rose and dwindled at different times and across vast territories. The name Haskalah became a standard self-appellation in 1860, when it was taken as the motto of the Odessa-based newspaper Ha-Melitz, but derivatives and the title Maskil for activists were already common in the first edition of Ha-Meassef from 1 October 1783: its publishers described themselves as Maskilim. While Maskilic centres sometimes had loose institutions around which their members operated, the movement as a whole lacked any such structure. In spite of that diversity, the Maskilim shared a sense of common identity and self-consciousness. They were anchored in the existence of a shared literary canon, which began to be formulated in the first Maskilic locus at Berlin. Its members, like Moses Mendelssohn, Naphtali Hirz Wessely, Isaac Satanow and Isaac Euchel, authored tracts in various genres that were further disseminated and re-read among other Maskilim. Each generation, in turn, elaborated and added its own works to the growing body. The emergence of the Maskilic canon reflected the movement's central and defining enterprise, the revival of Hebrew as a literary language for secular purposes (its restoration as a spoken tongue occurred only much later). The Maskilim researched and standardized grammar, minted countless neologisms and composed poetry, magazines, theatrical works and literature of all sorts in Hebrew.
Historians have described the movement largely as a Republic of Letters, an intellectual community based on printing houses and reading societies. The Maskilim's attitude toward Hebrew, as noted by Moses Pelli, was derived from Enlightenment perceptions of language as reflecting both individual and collective character. To them, a corrupt tongue mirrored the inadequate condition of the Jews which they sought to ameliorate. They turned to Hebrew as their primary creative medium. The Maskilim inherited the distaste of the medieval grammarians – such as Jonah ibn Janah and Judah ben David Hayyuj – for Mishnaic Hebrew and their preference for Biblical Hebrew as pristine and correct. They turned to the Bible as a source and standard, emphatically advocating what they termed the "Pure Hebrew Tongue" (S'fat E'ver tzacha) and lambasting the Rabbinic style of letters, which mixed it with Aramaic as a single "Holy Tongue" and often employed loanwords from other languages. Some activists, however, were not averse to using Mishnaic and Rabbinic forms. They also preferred the Sephardi pronunciation, considered more prestigious, to the Ashkenazi one, which was linked with the Jews of Poland, who were deemed backward. The movement's literary canon is defined by a grandiloquent, archaic register copying the Biblical one and often combining lengthy allusions or direct quotes from verses in the prose. During a century of activity, the Maskilim produced a massive contribution, forming the first phase of modern Hebrew literature.

In 1755, Moses Mendelssohn began publishing Qohelet Musar ("The Moralist"), regarded as the beginning of modern writing in Hebrew and the first journal in the language. Between 1789 and his death, Naphtali Hirz Wessely compiled Shirei Tif'eret ("Poems of Glory"), an eighteen-part epic cycle concerning Moses that exerted influence on all neo-Hebraic poets in the following generations. Joseph ha-Efrati Troplowitz was the Haskalah's pioneering playwright, best known for his 1794 epic drama Melukhat Sha'ul ("Reign of Saul"), which was printed in twelve editions by 1888. Judah Leib Ben-Ze'ev was the first modern Hebrew grammarian, and beginning with his 1796 manual of the language, he authored books which explored it and were vital reading material for young Maskilim until the end of the 19th century. Solomon Löwisohn was the first to translate Shakespeare into Hebrew, and an abridged form of the "Are at this hour asleep!" monologue in Henry IV, Part 2 was included in his 1816 lyrical compilation Melitzat Yeshurun ("Eloquence of Jeshurun"). Joseph Perl pioneered satirical writing in his biting, mocking critique of Hasidic Judaism, Megaleh Tmirin ("Revealer of Secrets"), from 1819. Avraham Dov Ber Lebensohn was primarily a leading metricist, with his 1842 Shirei S'fat haQodesh ("Verses in the Holy Tongue") considered a milestone in Hebrew poetry, and also authored biblical exegesis and educational handbooks. Abraham Mapu authored the first full-length Hebrew novel, Ahavat Zion ("Love of Zion"), which was published in 1853 after twenty-three years of work. Judah Leib Gordon was the most eminent poet of his generation and arguably of the Haskalah in its entirety. His most famous work was the 1876 epic Qotzo shel Yodh ("Tittle of a Jot"). Mendele Mocher Sforim was in his youth a Maskilic writer, but from his 1886 Beseter ra'am (Hebrew: בסתר רעם; "Hidden in Thunder") he abandoned its strict conventions in favour of a mixed, facile and common style.
His career marked the end of the Maskilic period in Hebrew literature and the beginning of the Era of Renaissance. The writers of the latter period lambasted their Maskilic predecessors for their didactic and florid style, more or less paralleling the Romantics' criticism of Enlightenment literature.

The central platforms of the Maskilic "Republic of Letters" were its great periodicals, each serving as a locus for contributors and readers during the time it was published. The first was the Königsberg (and later Berlin)-based Ha-Meassef, launched by Isaac Abraham Euchel in 1783 and printed at growing intervals until 1797. The magazine had several dozen writers and 272 subscribers at its zenith, from Shklow in the east to London in the west, making it the sounding board of the Berlin Haskalah. The movement lacked an equivalent until the appearance of Bikurei ha-I'tim in Vienna between 1820 and 1831, serving the Moravian and Galician Haskalah. That function was later fulfilled by the Prague-based Kerem Hemed from 1834 to 1857, and to a lesser degree by Kokhvei Yizhak, published in the same city from 1845 to 1870. The Russian Haskalah was robust enough to lack any single platform. Its members published several large magazines, including the Vilnius-based Ha-Karmel (1860–1880), Ha-Tsefirah in Warsaw and more, though probably the most influential of them all was Ha-Melitz, launched in 1860 at Odessa by Aleksander Zederbaum.

While the partisans of the Haskalah were much immersed in the study of sciences and Hebrew grammar, this was not a profoundly new phenomenon, and their creativity was a continuation of a long, centuries-old trend among educated Jews. What truly marked the movement was the challenge it posed to the monopoly of the rabbinic elite over the intellectual sphere of Jewish life, contesting its role as spiritual leadership. In his 1782 circular Divrei Shalom v'Emeth ("Words of Peace and Truth"), Hartwig Wessely, one of the most traditional and moderate maskilim, quoted the passage from Leviticus Rabbah stating that a Torah scholar who lacked wisdom was inferior to an animal's carcass. He called upon the Jews to introduce general subjects, like science and vernacular language, into their children's curriculum; this "Teaching of Man" was necessarily linked with the "Teaching (Torah) of God", and the latter, though superior, could not be pursued and was useless without the former. Historian Shmuel Feiner discerned that Wessely insinuated (consciously or not) a direct challenge to the supremacy of sacred teachings, comparing them with general subjects and implying that the latter had an intrinsic rather than merely instrumental value. He therefore also contested the authority of the rabbinical establishment, which stemmed from its function as interpreter of the holy teachings and their status as the only truly worthy field of study. Though secular subjects could be and were easily tolerated, their elevation to the same level as sacred ones was a severe threat, and indeed mobilized the rabbis against the nascent Haskalah. The potential of "Words of Peace and Truth" was fully realized later, by the second generation of the movement in Berlin and other radical maskilim, who openly and vehemently denounced the traditional authorities. The appropriate intellectual and moral leadership needed by the Jewish public in modern times was, according to the maskilim, that of their own.
Feiner noted that in their usurpation of the title of spiritual elite, unprecedented in Jewish history since the dawn of Rabbinic Judaism (various contestants before the Enlightened were branded as schismatics and cast out), they very much emulated the manner in which secular intellectuals had dethroned the Church from the same status among Christians and replaced it. Thus the maskilim generated an upheaval which – though by no means alone – broke the sway held by the rabbis and the traditional values over Jewish society. Combined with many other factors, they laid the path to all modern Jewish movements and philosophies, whether critical of, hostile to, or supportive of the Haskalah itself.

The maskilim sought to replace the framework of values held by the Ashkenazim of Central and Eastern Europe with their own philosophy, which embraced the liberal, rationalistic notions of the 18th and 19th centuries and cast them in their own particular mold. This intellectual upheaval was accompanied by the desire to change Jewish society in practice. Even the moderate maskilim viewed the contemporary state of the Jews as deplorable and in dire need of rejuvenation, whether in matters of morals, cultural creativity or economic productivity. They argued that such conditions were rightfully scorned by others and untenable from both practical and idealistic perspectives. It was to be remedied by shedding the base and corrupt elements of Jewish existence and retaining only the true, positive ones; indeed, the question of what exactly those were loomed as the greatest challenge of Jewish modernity. The more extreme and ideologically bent came close to the universalist aspirations of the radical Enlightenment, of a world freed of superstition and backwardness in which all humans would come together under the liberating influence of reason and progress. The reconstituted Jews, these radical maskilim believed, would be able to take their place as equals in an enlightened world. But all, including the moderate and the disillusioned, stated that adjustment to the changing world was both unavoidable and positive in itself.

Haskalah ideals were converted into practical steps via numerous reform programs initiated locally and independently by its activists, acting in small groups or even alone in every period and region. Members of the movement sought to acquaint their people with European culture, have them adopt the vernacular language of their lands, and integrate them into larger society. They opposed Jewish reclusiveness and self-segregation, called upon Jews to discard traditional dress in favour of the prevalent one, and preached patriotism and loyalty to the new centralized governments. They acted to weaken and limit the jurisdiction of traditional community institutions – the rabbinic courts, empowered to rule on numerous civic matters, and the boards of elders, which served as lay leadership. The maskilim perceived those as remnants of medieval discrimination. They criticized various traits of Jewish society, such as child marriage – traumatic memories of unions entered at the age of thirteen or fourteen are a common theme in Haskalah literature – the use of anathema to enforce community will, and the concentration on virtually only religious studies. Maskilic reforms included educational efforts.
In 1778, partisans of the movement were among the founders of the Berlin Jewish Free School, or Hevrat Hinuch Ne'arim (Society for the Education of Boys), the first institution in Ashkenazi Jewry that taught general studies in addition to the reformulated and reduced traditional curriculum. This model, with different stresses, was applied elsewhere. Joseph Perl opened the first modern Jewish school in Galicia at Tarnopol in 1813, and Eastern European maskilim opened similar institutes in the Pale of Settlement and Congress Poland. They all abandoned the received methods of Ashkenazi education: study of the Pentateuch with the archaic I'vri-Taitsch (medieval Yiddish) translation and an exclusive focus on the Talmud as a subject of higher learning, all presided over by old-school tutors, melamdim, who were particularly reviled in maskilic circles. Those were replaced by teachers trained in modern methods, among others in the spirit of German philanthropinism, who sought to acquaint their pupils with refined Hebrew so that they might understand the Pentateuch and prayers and thus better identify with their heritage; ignorance of Hebrew was often lamented by maskilim as breeding apathy towards Judaism. Far less Talmud, considered cumbersome and ill-suited for children, was taught; elements considered superstitious, like midrashim, were also removed. Matters of faith were taught in a rationalistic spirit, and in radical circles also in a sanitized manner. On the other hand, the curriculum was augmented by general studies like mathematics, vernacular language, and so forth.

In the linguistic field, the maskilim wished to replace the dualism which characterized the traditional Ashkenazi community – which spoke Judaeo-German while its formal literary language was Hebrew – with another: a refined Hebrew for internal usage and the local vernacular for external ones. They almost universally abhorred Judaeo-German, regarding it as a corrupt dialect and another symptom of Jewish destitution – the movement pioneered the negative attitude to Yiddish which persisted many years later among the educated – though its activists often had to resort to it for lack of a better medium to address the masses. Aaron Halle-Wolfssohn, for example, authored the first modern Judaeo-German play, Leichtsinn und Frömmelei (Rashness and Sanctimony), in 1796. On the economic front, the maskilim preached productivization and the abandonment of traditional Jewish occupations in favour of agriculture, trades and the liberal professions.

In matters of faith (which were being cordoned off into a distinct sphere of "religion" by modernization pressures), the movement's partisans, from moderates to radicals, lacked any uniform, coherent agenda. The main standard by which they judged Judaism was that of rationalism. Their most important contribution was the revival of Jewish philosophy, rather dormant since the Italian Renaissance, as an alternative to mystical Kabbalah, which served as almost the sole system of thought among Ashkenazim and as an explanatory system for observance. Rather than complex allegorical exegesis, the Haskalah sought a literal understanding of scripture and sacred literature. The rejection of Kabbalah, often accompanied by attempts to refute the antiquity of the Zohar, was extremely controversial in traditional society; apart from that, the maskilim had little in common.
On the right wing were conservative members of the rabbinic elite who merely wanted a rationalist approach, and on the extreme left some ventured far beyond the pale of orthodoxy towards Deism.

Another aspect was the movement's attitude to gender relations. Many of the maskilim were raised in the rabbinic elite, in which (unlike among the poor Jewish masses or the rich communal wardens) the males were immersed in traditional studies and their wives supported them financially, mostly by running businesses. Many of the Jewish enlightened were traumatized by their own experiences, whether of assertive mothers or of early marriage, often conducted at the age of thirteen. Bitter memories of those are a common theme in maskilic autobiographies. Having imbibed the image of European bourgeois family values, many of them sought to challenge the semi-matriarchal order of rabbinic families – which combined a lack of Jewish education for women with granting them the status of providers – as well as early marriage and rigid modesty. Instead, they insisted that men become economically productive while confining their wives to the home environment but also granting them a proper religious education, reversing Jewish custom and copying contemporary Christian attitudes.

The Haskalah was also mainly a movement of transformation, straddling both the declining traditional Jewish society of autonomous community and cultural seclusion and the beginnings of a modern Jewish public. As noted by Feiner, everything connected with the Haskalah was dualistic in nature. The Jewish Enlighteners pursued two parallel agendas: they exhorted the Jews to acculturate and harmonize with the modern state, and they demanded that the Jews remain a distinct group with its own culture and identity. Theirs was a middle position between the Jewish community and surrounding society, between received mores and modernity. Sliding away from this precarious equilibrium, in any direction, also signified one's break with the Jewish Enlightenment.

Virtually all maskilim received an old-style, secluded education, and were young Torah scholars before they were first exposed to outside knowledge (from a gender perspective, the movement was almost totally male-dominated; women did not receive sufficient tutoring to master Hebrew). For generations, Mendelssohn's Bible translation into German was employed by such young initiates to bridge the linguistic gap and learn a foreign language, having been raised on Hebrew and Yiddish only. The experience of abandoning one's sheltered community and struggling with tradition was a ubiquitous trait of maskilic biographies. The children of these activists almost never followed their parents; rather, they went further down the path of acculturation and assimilation. While their fathers had learned the vernaculars late and still consumed much Hebrew literature, the little material available in the language did not attract their offspring, who often lacked a grasp of Hebrew, not having shared their parents' traditional education. Haskalah was, by and large, a unigenerational experience.

In the linguistic field, this transitory nature was well attested. The traditional Jewish community in Europe inhabited two separate spheres of communication: one internal, where Hebrew served as the written high language and Yiddish as the vernacular for the masses, and one external, where Latin and the like were used for apologetic and intercessory purposes toward the Christian world. A tiny minority of writers was concerned with the latter.
The Haskalah sought to introduce a different bilingualism: a renovated, refined Hebrew for internal matters, while Yiddish was to be eliminated; and the national vernaculars, to be taught to all Jews, for external ones. However, it insisted on the maintenance of both spheres. When acculturation far exceeded the movement's plans, Central European Jews turned almost solely to the vernacular. David Sorkin demonstrated this with the two great journals of German Jewry: the maskilic Ha-Me'assef was written in Hebrew and supported the study of German; the post-maskilic Sulamith (published from 1806) was written almost entirely in German, befitting its editors' agenda of linguistic assimilation. Likewise, upon the demise of the Jewish Enlightenment in Eastern Europe, authors abandoned the maskilic paradigm not toward assimilation but in favour of the exclusive use of Hebrew and Yiddish.

The political vision of the Haskalah was predicated on a similar approach. It opposed the reclusive community of the past but sought to maintain a strong Jewish framework (with the maskilim themselves as leaders and intercessors with the state authorities); the Enlightened were not even unanimously in favour of civic emancipation, and many of them viewed it with reserve, sometimes anxiety. In their writings, they drew a sharp line between themselves and those whom they termed "pseudo-maskilim" – those who embraced Enlightenment values and secular knowledge but did not seek to balance these with their Jewishness, striving instead for full assimilation. Such elements, whether the radical universalists who broke off from the late Berlin Haskalah or the Russified intelligentsia in Eastern Europe a century later, were castigated and derided no less than the old rabbinic authorities which the movement confronted. It was not uncommon for its partisans to become a conservative element, combating further dilution of tradition: in Vilnius, Samuel Joseph Fuenn turned from a progressive into an adversary of more radical elements within a generation. In the Maghreb, the few local maskilim were more concerned with the rapid assimilation of local Jews into colonial French culture than with the ills of traditional society. Likewise, those who abandoned the optimistic, liberal vision of the Jews (albeit as a cohesive community) integrating into wider society, in favour of full-blown Jewish nationalism or of radical, revolutionary ideologies like Socialism which strove to uproot the established order, also broke with the Haskalah. The Jewish national movements of Eastern Europe, founded by disillusioned maskilim, derisively regarded it – in a manner similar to other romantic-nationalist movements' understanding of the general Enlightenment – as a naive, liberal and assimilationist ideology which introduced foreign cultural influences, gnawed at the Jewish national consciousness and promised false hopes of equality in exchange for spiritual enslavement. This hostile view was promulgated by nationalist thinkers and historians, from Peretz Smolenskin, Ahad Ha'am and Simon Dubnow onwards. It was once common in Israeli historiography.

A major factor which always characterized the movement was its weakness and its dependence on much more powerful elements. Its partisans were mostly impoverished intellectuals who eked out a living as private tutors and the like; few had a stable financial base, and they required patrons, whether affluent Jews or the state's institutions.
This trio – the authorities, the Jewish communal elite, and the maskilim – was united only in the ambition of thoroughly reforming Jewish society. The government had no interest in the visions of renaissance which the Enlightened so fervently cherished. It demanded that the Jews turn into productive, loyal subjects with a rudimentary secular education, and no more. The rich Jews were sometimes open to the movement's agenda, but were mostly practical, hoping for a betterment of their people that would result in emancipation and equal rights. Indeed, the great cultural transformation which occurred among the Parnassim (affluent communal wardens) class – they were always more open to outside society, and had to tutor their children in secular subjects, thus inviting general Enlightenment influences – was a precondition of the Haskalah. The state and the elite required the maskilim as interlocutors and specialists in their efforts at reform, especially as educators, and the latter used this as leverage to advance their ideology. However, the activists were much more dependent on the former than vice versa; frustration at one's inability to further the maskilic agenda while surrounded by apathetic Jews, whether conservative "fanatics" or parvenu "assimilationists", is a common theme in the movement's literature.

The term Haskalah became synonymous, among friends and foes alike and in much of early Jewish historiography, with the sweeping changes that engulfed Jewish society (mostly in Europe) from the late 18th century to the late 19th century. It was depicted by its partisans, adversaries and historians like Heinrich Graetz as a major factor in those changes; Feiner noted that "every modern Jew was identified as a maskil and every change in traditional religious patterns was dubbed Haskalah". Later research greatly narrowed the scope of the phenomenon and limited its importance: while the Haskalah undoubtedly played a part, the contemporary historical consensus portrays it as much humbler. Other agents of transformation, from state-imposed schools to new economic opportunities, were demonstrated to have rivaled or completely overshadowed the movement in propelling such processes as acculturation, secularization, religious reform from moderate to extreme, the adoption of native patriotism and so forth. In many regions the Haskalah had no effect at all.

Origins

As long as the Jews lived in segregated communities, and as long as all social interaction with their gentile neighbors was limited, the rabbi was the most influential member of the Jewish community. In addition to being a religious scholar and "clergy", a rabbi also acted as a civil judge in all cases in which both parties were Jews. Rabbis sometimes had other important administrative powers, together with the community elders. The rabbinate was the highest aim of many Jewish boys, and the study of the Talmud was the means of obtaining that coveted position, or one of many other important communal distinctions. Haskalah followers advocated "coming out of the ghetto", not just physically but also mentally and spiritually, in order to assimilate among gentile nations. The example of Moses Mendelssohn (1729–86), a Prussian Jew, served to lead this movement, which was also shaped by Aaron Halle-Wolfssohn (1754–1835) and Joseph Perl (1773–1839). Mendelssohn's extraordinary success as a popular philosopher and man of letters revealed hitherto unsuspected possibilities of integration and acceptance of Jews among non-Jews.
Mendelssohn also provided methods for Jews to enter the general society of Germany. A good knowledge of the German language was necessary to secure entrance into cultured German circles, and an excellent means of acquiring it was provided by Mendelssohn in his German translation of the Torah. This work became a bridge over which ambitious young Jews could pass to the great world of secular knowledge. The Biur, or grammatical commentary, prepared under Mendelssohn's supervision, was designed to counteract the influence of traditional rabbinical methods of exegesis. Together with the translation, it became, as it were, the primer of the Haskalah. Language played a key role in the Haskalah movement, as Mendelssohn and others called for a revival of Hebrew and a reduction in the use of Yiddish. The result was an outpouring of new, secular literature, as well as critical studies of religious texts. Julius Fürst, along with other German-Jewish scholars, compiled Hebrew and Aramaic dictionaries and grammars. Jews also began to study and communicate in the languages of the countries in which they settled, providing another gateway for integration.

Berlin was the city of origin for the movement. The capital city of Prussia and, later, the German Empire, Berlin became known as a secular, multi-cultural and multi-ethnic center, a fertile environment for conversations and radical movements. This move by the Maskilim away from religious study into more critical and worldly studies was made possible by this German city of modern and progressive thought. It was a city in which rising middle-class Jews and intellectual elites lived among, and were exposed to, the ideas of Age of Enlightenment thinkers such as Voltaire, Diderot, and Rousseau. The movement is therefore often referred to as the Berlin Haskalah, and Berlin provides the essential context for this episode of Jewish history. Subsequently, as it left Germany and spread across Eastern Europe, the Berlin Haskalah influenced multiple Jewish communities that were interested in non-religious scholarly texts and in insight into worlds beyond their Jewish enclaves.

Spread

The Haskalah did not stay restricted to Germany, however, and the movement quickly spread throughout Europe. Poland–Lithuania was the heartland of Rabbinic Judaism, with its two streams of Misnagdic Talmudism centred primarily in Lithuania and Belarus, and Hasidic mysticism popular in Ukraine, Poland, Hungary and Russia. In the 19th century, the Haskalah sought the dissemination and transformation of traditional education and inward pious life in Eastern Europe. It adapted its message to these different environments, working with the Russian government of the Pale of Settlement to influence secular educational methods, while its writers satirised Hasidic mysticism in favour of a solely rationalist interpretation of Judaism. Isaac Baer Levinsohn (1788–1860) became known as the "Russian Mendelssohn". Joseph Perl's (1773–1839) satire of the Hasidic movement, "Revealer of Secrets" (Megalleh Temirim), is said to be the first modern novel in Hebrew. It was published in Vienna in 1819 under the pseudonym "Obadiah ben Pethahiah". The Haskalah's message of integration into non-Jewish society was subsequently counteracted by alternative secular Jewish political movements advocating Folkish, Socialist or Nationalist secular Jewish identities in Eastern Europe.
While the Haskalah advocated Hebrew and sought to remove Yiddish, these subsequent developments advocated a Yiddish renaissance among the Maskilim. Writers of Yiddish literature variously satirised or sentimentalised Hasidic mysticism.

Effects

The Haskalah also resulted in the creation of a secular Jewish culture, with an emphasis on Jewish history and Jewish identity, rather than on religion. This, in turn, resulted in the political engagement of Jews, in a variety of competing ways, within the countries where they lived. One commentator describes these effects as "The emancipation of the Jews brought forth two opposed movements: the cultural assimilation, begun by Moses Mendelssohn, and Zionism, founded by Theodor Herzl in 1896." One facet of the Haskalah was a widespread cultural adaptation, as those Jews who participated in the enlightenment began, in varying degrees, to participate in the cultural practices of the surrounding gentile population. Connected with this was the birth of the Reform movement, whose founders (such as Israel Jacobson and Leopold Zunz) rejected the continuing observance of those aspects of Jewish law which they classified as ritual – as opposed to moral or ethical. Even within orthodoxy, the Haskalah was felt through the appearance, in response, of the Mussar Movement in Lithuania and Torah im Derech Eretz in Germany. "Enlightened" Jews sided with gentile governments in plans to increase secular education and assimilation among the Jewish masses, which brought them into acute conflict with the orthodox, who believed this threatened the traditional Jewish lifestyle – which had up until that point been maintained through segregation from their gentile neighbors – and Jewish identity itself.

The spread of the Haskalah affected Judaism as a religion because of the degree to which different sects desired to be integrated and, in turn, to integrate their religious traditions. The effects of the Enlightenment were already present in Jewish religious music and in opinions on the tension between traditionalist and modernist tendencies. Groups of Reform Jews, including the Society of the Friends of Reform and the Association for the Reform of Judaism, were formed because they wanted, and actively advocated for, changes in Jewish tradition, in particular regarding rituals like circumcision. Another non-Orthodox group was the Conservative Jews, who emphasized the importance of traditions but viewed them from a historical perspective. The Orthodox Jews were actively against these reformers because they viewed changing Jewish tradition as an insult to God and believed that fulfillment in life could be found in serving God and keeping his commandments. The effect of the Haskalah was that it gave a voice to a plurality of views, while the orthodox preserved tradition, even to the point of insisting on division between sects.

Another important facet of the Haskalah was its interest in non-Jewish religions and, for some, the desire to synchronize or appreciate Christian and Muslim traditions and history. Moses Mendelssohn criticized some aspects of Christianity, but depicted Jesus as a Torah-observant rabbi who was loyal to traditional Judaism. Mendelssohn explicitly linked positive Jewish views of Jesus with the issues of Emancipation and Jewish-Christian reconciliation. Similar revisionist views were expressed by Rabbi Isaac Ber Levinsohn and other traditional representatives of the Haskalah movement. |
======================================== |