[SOURCE: https://en.wikipedia.org/wiki/Anaconda_(Python_distribution)] | [TOKENS: 1308] |
Anaconda (Python distribution) Anaconda is an open-source data science and artificial intelligence distribution platform for the Python and R programming languages. Developed by Anaconda, Inc., an American company founded in 2012, the platform is used to develop and manage data science and AI projects. As of 2024, Anaconda Inc. had about 300 employees and 45 million users. History Co-founded in Austin, Texas in 2012 as Continuum Analytics by Peter Wang and Travis Oliphant, Anaconda Inc. operates from the United States and Europe. Anaconda Inc. developed Conda, a cross-platform, language-agnostic binary package manager. It also launched the PyData community workshops and the Jupyter Cloud Notebook service (Wakari.io). In 2013, it received funding from DARPA. In 2015, the company had two million users, including 200 of the Fortune 500 companies, and raised $24 million in a Series A funding round led by General Catalyst and BuildGroup. Anaconda secured an additional $30 million in funding in 2021. Continuum Analytics rebranded as Anaconda in 2017. That year, it announced the release of Anaconda Enterprise 5 and an integration with Microsoft Azure, and had over 13 million users by year's end. In 2022, it released Anaconda Business, new integrations with Snowflake and others, and the open-source PyScript. It also acquired PythonAnywhere, and Anaconda's user base exceeded 30 million that year. In 2023, Anaconda released Python in Excel, a new integration with Microsoft Excel, and launched PyScript.com. The company made a series of investments in AI during 2024. That February, Anaconda partnered with IBM to import its repository of Python packages into Watsonx, IBM's generative AI platform. The same year, Anaconda joined IBM's AI Alliance and released integrations with Teradata and Lenovo. In 2024, Anaconda's user base reached 45 million users and Barry Libert was named company CEO, after serving on Anaconda's board of directors. He was succeeded as CEO in October 2025 by David DeSanto, who also became a company director. In May 2025, the company introduced the Anaconda AI Platform, described as the first unified AI platform for open source: a central control point for AI workflows that enables customization in Python-based enterprise AI development. That July, after raising over $150 million in a Series C funding round, Anaconda was valued at about $1.5 billion. Overview Anaconda distribution comes with over 300 packages installed automatically, along with the Conda package and virtual environment manager; over 7,500 additional open-source packages can be installed from the Anaconda repository. It also includes a GUI, Anaconda Navigator, as a graphical alternative to the command-line interface (CLI). Conda was developed to address dependency conflicts in the pip package manager, which (until version 20.3, when consistent dependency resolution was implemented) would automatically install any dependent Python packages without checking for conflicts with previously installed packages. The Conda package manager's historical differentiator was that it analyzed and resolved these installation conflicts. Anaconda is a distribution of the Python and R programming languages for scientific computing (data science, machine learning applications, large-scale data processing, predictive analytics, etc.) that aims to simplify package management and deployment. Anaconda distribution includes data-science packages suitable for Windows, Linux, and macOS. 
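To make the dependency handling just described concrete, here is a minimal command-line sketch of how an environment is typically created and modified with Conda; it is an illustrative example rather than anything from the article, and the environment name and package choices (ds-sandbox, numpy, pandas, scikit-learn, xgboost) are assumptions. The contrast with older pip behaviour is that Conda re-solves the whole environment before each install, checking the requested package against everything already present.

    # Create an isolated environment with a chosen Python version and starter packages
    conda create --name ds-sandbox python=3.11 numpy pandas
    # Switch into the new environment
    conda activate ds-sandbox
    # Add a package; Conda checks its dependencies against the existing environment first
    conda install scikit-learn
    # Pull a package from an alternative channel (conda-forge is a community channel)
    conda install --channel conda-forge xgboost
    # List the environments Conda knows about
    conda env list

Anaconda Navigator exposes the same operations (creating environments, installing and updating packages, selecting channels) through its GUI, so the commands above are one possible CLI equivalent of what the article describes doing graphically.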
Other company products include Anaconda Free and the subscription-based Starter, Business and Enterprise tiers. Anaconda's business tier offers Package Security Manager. Package versions in Anaconda are managed by the package management system Conda, which was spun out as a separate open-source package because it is useful both on its own and for applications other than Python. There is also a small, bootstrap version of Anaconda called Miniconda, which includes only Conda, Python, the packages they depend on, and a small number of other packages. Open-source packages can be individually installed from the Anaconda repository, Anaconda Cloud (anaconda.org), or the user's own private repository or mirror, using the conda install command. Anaconda, Inc. compiles and builds the packages available in the Anaconda repository itself, and provides binaries for Windows 32/64-bit, Linux 64-bit and macOS 64-bit (Intel, Apple Silicon). Anything available on PyPI may be installed into a Conda environment using pip, and Conda will keep track of what it has installed and what pip has installed. Custom packages can be made using the conda build command, and can be shared with others by uploading them to Anaconda Cloud, PyPI or other repositories. The default installation of Anaconda2 includes Python 2.7 and Anaconda3 includes Python 3.7; however, it is possible to create new environments that include any version of Python packaged with Conda. Anaconda Navigator is a desktop graphical user interface (GUI) included in Anaconda distribution that allows users to launch applications and manage Conda packages, environments and channels without using command-line commands. Navigator can search for packages on Anaconda Cloud or in a local Anaconda Repository, install them in an environment, run the packages and update them. It is available for Windows, macOS and Linux. A number of applications are available by default in Navigator. Conda is an open-source, cross-platform, language-agnostic package manager and environment management system that installs, runs, and updates packages and their dependencies. It was created for Python programs, but it can package and distribute software for any language (e.g., R), including multi-language projects. The Conda package and environment manager is included in all versions of Anaconda, Miniconda, and Anaconda Repository. Anaconda.org Anaconda Cloud is a package management service by Anaconda where users can find, access, store and share public and private notebooks, environments, and Conda and PyPI packages. Cloud hosts useful Python packages, notebooks and environments for a wide variety of applications. Users do not need to log in or have a Cloud account to search for public packages, or to download and install them. Users can build new Conda packages using conda-build and then use the Anaconda Client CLI to upload the packages to Anaconda.org. Notebook users can get help writing and debugging code from Anaconda's AI Assistant. |
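The paragraph above names several concrete mechanisms: the conda install command, pip working inside a Conda environment, conda build, and uploading to Anaconda.org with the Anaconda Client. A brief, hedged sketch of that workflow follows; the channel name, recipe directory, and package file path are hypothetical placeholders, and it assumes the conda-build and anaconda-client packages are installed in the active environment.

    # Install from the default Anaconda repository into a named environment
    conda install --name ds-sandbox requests
    # Install from a user's own channel or mirror (channel name is a placeholder)
    conda install --channel my-private-channel mytool
    # pip installs from PyPI into the active Conda environment;
    # conda list afterwards shows both Conda-installed and pip-installed packages
    pip install some-pypi-only-package
    conda list
    # Build a custom package from a recipe directory, then share it on Anaconda.org
    conda build my-recipe/
    anaconda upload path/to/mytool-0.1-py_0.tar.bz2   # use the output path printed by conda build

In practice, conda build prints the location of the finished archive, and that path is what gets passed to anaconda upload; the path shown here is only a stand-in.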
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-22] | [TOKENS: 10728] |
PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. 3D polygon graphics were placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferred console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signalled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". 
The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software, which it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research, but decided to turn what it had developed with Nintendo and Sega into a console of its own based on the SNES. 
Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Although the proposal gained Ohga's enthusiasm, there remained opposition from a majority of those present at the meeting. Older Sony executives, who saw Nintendo and Sega as "toy" manufacturers, also opposed it. The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay. 
Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European and North American divisions, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995. The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. The console was not marketed under the Sony name, in contrast to Nintendo's consoles. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble their efforts in gaining the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation since it rivalled Sega in the arcade market. Attaining these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of their own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. 
In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon their plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and a debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought out by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour their own products over non-Sony ones; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs were beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should developers decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC, and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction. 
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and being easy to program for, and felt he and his team were successful in this regard. Its technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with "stunning" success, with long queues in shops. Ohga later recalled that he realised how important PlayStation had become for Sony when friends and relatives begged for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At their keynote presentation, Sega of America CEO Tom Kalinske revealed that their Saturn console would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage; Race simply said "$299" and walked off to a round of applause. The attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. One American games retailer later recalled of the US launch: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." The well-received Ridge Racer — with some critics considering it superior to Sega's arcade counterpart Daytona USA (1994) — contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games. 
The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million. Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of games to consoles sold was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched in a test market during 1999–2000 across Sony showrooms, selling 100 units. Sony finally launched the console (PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, for example, the console could not be released officially because the trademark had been registered by another company, so the officially distributed Sega Saturn initially took over the market; as the Saturn withdrew, however, imports of the PlayStation and widespread piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation's user base grew to around 300,000 by January 2000, even though Sony China had no plans to release the console there. The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people entering adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans such as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.), in which the controller's button symbols stood in for the missing letters, and "U R NOT E", in which a red E completed the phrase "you are not ready". The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. 
Let me show you how ready I am.'" As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate PlayStation's emerging identity. Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where select games could be demonstrated. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, prompted by its declining market share and significant financial losses, launched the Dreamcast as a last-ditch attempt to stay in the industry. Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. 
The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PSOne, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006—over eleven years after its release, and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the necessary speed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, and offers a sampling rate of up to 44.1 kHz and music sequencing. The console features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, MDEC, which is integrated into the CPU and allows for the presentation of full motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can generate up to 4,000 sprites and 180,000 textured polygons per second, or 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model that had an S-Video port, as it was removed from the next model. Subsequent models saw a further reduction in the number of ports, with the final version retaining only one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries. The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available to buy only through an ordering service, and came with the documentation and software necessary to program PlayStation games and applications using C compilers. 
On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles—including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo Pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first controller, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square (△, ○, ✕, □). Rather than labelling its buttons with the traditionally used letters or numbers, the PlayStation controller established a trademark set of symbols which would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view and the square is equated to a sheet of paper to be used to access menus. The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used for instances when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking the analogue sticks), the Dual Analog controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release. 
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, it features analogue sticks with textured rubber grips, longer handles, slightly different shoulder buttons, and rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite the device having received promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without either inserting a game or closing the CD tray, thereby accessing a graphical user interface (GUI) for the PlayStation BIOS. The GUIs for the PS One and PlayStation differ depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator which was released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSes on a Sega console. Bleem! 
were subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could have left games vulnerable to piracy, due to the growing popularity of CD-R and optical disc drives with burning capability. To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberate irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, such a drive could not detect the wobble frequency, so duplicated discs omitted it, since the laser pick-up system of any optical disc drive would interpret the wobble as an oscillation of the disc surface and compensate for it in the reading process. Early PlayStations, particularly early 1000 models, experience skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently in a well-ventilated area, or to raise the unit up slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws in a small amount of power (and therefore heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction. The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled will become so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. 
Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! heralded as an ancestor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling those of Sega and Nintendo. 
In May 1995, Famicom Tsūshin scored the console 19 out of 40, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third-party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-highest number of games ever produced for a console. Its success resulted in a significant financial boon for Sony, as profits from the video game division came to account for 23% of the company's total profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as one of the key factors in its mass success, and lauding it as a "game-changer in every sense possible". 
In 2009, IGN ranked the PlayStation the seventh best console on its list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse-engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising to bring a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk quantities at a much faster rate than ROM cartridges, a week compared to two to three months. Further, the cost of production per unit was far lower, allowing Sony to offer games to the user at about 40% lower cost than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published small print runs of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand. 
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII, respectively, had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second-party developers such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games; the games run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/H._G._Wells] | [TOKENS: 10026] |
Contents H. G. Wells Herbert George Wells (21 September 1866 – 13 August 1946) was an English writer, prolific in many genres. He wrote more than forty novels and dozens of short stories. His non-fiction output included works of social commentary, politics, history, popular science, satire, biography, and autobiography. Wells is most known today for his groundbreaking science fiction novels; he has sometimes been called the "father of science fiction", a title that has also been given to Jules Verne and Hugo Gernsback. In addition to his fame as a writer, he was prominent in his lifetime as a forward-looking, even prophetic social critic who devoted his literary talents to the development of a progressive vision on a global scale. As a futurist, he wrote a number of utopian works and foresaw the advent of aircraft, tanks, space travel, nuclear weapons, satellite television and something resembling the World Wide Web. His science fiction imagined time travel, alien invasion, invisibility, and biological engineering before these subjects were common in the genre. Brian Aldiss referred to Wells as the "Shakespeare of science fiction", while Charles Fort called him a "wild talent".: 7 Wells rendered his works convincing by instilling commonplace detail alongside a single extraordinary assumption per work – dubbed "Wells's law" – leading Joseph Conrad to hail him in 1898 with "O Realist of the Fantastic!". His most notable science fiction works include The Time Machine (1895), which was his first novella, The Island of Doctor Moreau (1896), The Invisible Man (1897), The War of the Worlds (1898), the military science fiction The War in the Air (1907), and the dystopian When the Sleeper Wakes (1910). Novels of social realism such as Kipps (1905) and The History of Mr Polly (1910), which describe lower-middle-class English life, led to the suggestion that he was a worthy successor to Charles Dickens,: 99 but Wells described a range of social strata and even attempted, in Tono-Bungay (1909), a diagnosis of English society as a whole. Wells was nominated for the Nobel Prize in Literature four times. Wells's earliest specialised training was in biology, and his thinking on ethical matters took place in a Darwinian context. He was also an outspoken socialist from a young age, often (but not always, as at the beginning of the First World War) sympathising with pacifist views. In his later years, he wrote less fiction and more works expounding his political and social views, sometimes giving his profession as that of journalist. Wells was a diabetic and co-founded the charity The Diabetic Association (Diabetes UK) in 1934. Life Herbert George Wells was born at Atlas House, 162 High Street in Bromley, Kent, on 21 September 1866. Called "Bertie" by his family, he was the fourth and last child of Joseph Wells, a former domestic gardener, and at the time a shopkeeper and professional cricketer and Sarah Neal, a former domestic servant. An inheritance had allowed the family to acquire a shop in which they sold china and sporting goods, although it failed to prosper in part because the stock was old and worn out, and the location was poor. Joseph Wells managed to earn a meagre income, but little of it came from the shop and he received an unsteady amount of money from playing professional cricket for the Kent county team. A defining incident of young Wells's life was an accident in 1874 that left him bedridden with a broken leg. 
To pass the time he began to read books from the local library, brought to him by his father. He soon became devoted to the other worlds and lives to which books gave him access; they also stimulated his desire to write. Later that year he entered Thomas Morley's Commercial Academy, a private school founded in 1849, following the bankruptcy of Morley's earlier school. The teaching was erratic, and the curriculum mostly focused, Wells later said, on producing copperplate handwriting and doing the sort of sums useful to tradesmen. Wells continued at Morley's Academy until 1880. In 1877, his father, Joseph Wells, fractured his femur. The accident effectively put an end to Joseph's career as a cricketer, and his subsequent earnings as a shopkeeper were not enough to compensate for the loss of the primary source of family income. No longer able to support themselves financially, the family instead sought to place their sons as apprentices in various occupations. From 1880 to 1883, Wells had an unhappy apprenticeship as a draper at Hyde's Drapery Emporium in Southsea. His experiences at Hyde's, where he worked a thirteen-hour day and slept in a dormitory with other apprentices, later inspired his novels The Wheels of Chance, The History of Mr Polly, and Kipps, which portray the life of a draper's apprentice as well as providing a critique of society's distribution of wealth.: 2 Wells's parents had a turbulent marriage, owing primarily to his mother being a Protestant and his father being a freethinker. When his mother returned to work as a lady's maid (at Uppark, a country house in West Sussex), one of the conditions of work was that she would not be permitted to have living space for her husband and children. Thereafter, she and Joseph lived separate lives, though they remained faithful to each other and never divorced. As a consequence, Herbert's personal troubles increased as he subsequently failed as a draper and also, later, as a chemist's assistant. However, Uppark had a magnificent library in which he immersed himself, reading many classic works, including Plato's Republic, Thomas More's Utopia, and the works of Daniel Defoe. When he became the first doyen of science fiction as a distinct genre of fiction, Wells referenced Mary Shelley's Frankenstein in relation to his works, writing, "they belong to a class of writing which includes the story of Frankenstein." In October 1879, Wells's mother arranged through a distant relative, Arthur Williams, for him to join the National School at Wookey in Somerset as a pupil–teacher, a senior pupil who acted as a teacher of younger children. In December that year, however, Williams was dismissed for irregularities in his qualifications and Wells was returned to Uppark. After a short apprenticeship at a chemist in nearby Midhurst and an even shorter stay as a boarder at Midhurst Grammar School, he signed his apprenticeship papers at Hyde's. In 1883, Wells persuaded his parents to release him from the apprenticeship, taking an opportunity offered by Midhurst Grammar School again to become a pupil–teacher; his proficiency in Latin and science during his earlier short stay had been remembered. The years he spent in Southsea had been the most miserable of his life to that point, but his good fortune in securing a position at Midhurst Grammar School meant that Wells could continue his self-education in earnest. 
The following year, Wells won a scholarship to the Normal School of Science (later the Royal College of Science in South Kensington, which became part of Imperial College London) in London, studying biology under Thomas Henry Huxley.: 164 As an alumnus, he later helped to set up the Royal College of Science Association, of which he became the first president in 1909. Wells studied in his new school until 1887, with a weekly allowance of 21 shillings (a guinea) thanks to his scholarship. This ought to have been a comfortable sum of money (at the time many working class families had "round about a pound a week" as their entire household income), yet in his Experiment in Autobiography Wells speaks of constantly being hungry, and indeed photographs of him at the time show a youth who is very thin and malnourished. He soon entered the debating society of the school. These years mark the beginning of his interest in a possible reformation of society. At first approaching the subject through Plato's Republic, he soon turned to contemporary ideas of socialism as expressed by the recently formed Fabian Society and free lectures delivered at Kelmscott House, the home of William Morris. He was also among the founders of The Science School Journal, a school magazine that allowed him to express his views on literature and society, as well as trying his hand at fiction; a precursor to his novel The Time Machine was published in the journal under the title "The Chronic Argonauts". The school year 1886–87 was the last year of his studies.: 164 During 1888, Wells stayed in Stoke-on-Trent, living in Basford. The unique environment of The Potteries was certainly an inspiration. He wrote in a letter to a friend from the area that "the district made an immense impression on me". The inspiration for some of his descriptions in The War of the Worlds is thought to have come from his short time spent here, seeing the iron foundry furnaces burn over the city, shooting huge red light into the skies. His stay in The Potteries also resulted in the macabre short story "The Cone" (1895, contemporaneous with his famous The Time Machine), set in the north of the city.: 90 After teaching for some time—he was briefly on the staff of Holt Academy in Wales—Wells found it necessary to supplement his knowledge relating to educational principles and methodology and entered the College of Preceptors (College of Teachers). He later received his Licentiate and Fellowship FCP diplomas from the college. It was not until 1890 that Wells earned a Bachelor of Science degree in zoology from the University of London External Programme. In 1889–90, he managed to find a post as a teacher at Henley House School in London, where he taught A. A. Milne (whose father ran the school). His first published work was a Text-Book of Biology in two volumes (1893). Upon leaving the Normal School of Science, Wells was left without a source of income. His aunt Mary—his father's sister-in-law—invited him to stay with her for a while, which solved his immediate problem of accommodation. During his stay at his aunt's, he grew increasingly interested in her daughter, Isabel, whom he later courted and married. To earn money, he began writing short humorous articles for journals such as The Pall Mall Gazette, later collecting these in Select Conversations with an Uncle (1895) and Certain Personal Matters (1897). So prolific did Wells become at this mode of journalism that many of his early pieces remain unidentified. According to David C. 
Smith, "Most of Wells's occasional pieces have not been collected, and many have not even been identified as his. Wells did not automatically receive the byline his reputation demanded until after 1896 or so. ... As a result, many of his early pieces are unknown. It is obvious that many early Wells items have been lost." His success with these shorter pieces encouraged him to write book-length work, and he published his first novel, The Time Machine, in 1895. In 1891, Wells married his cousin Isabel Mary Wells (1865–1931; from 1902 Isabel Mary Smith). The couple agreed to separate in 1894, when he had fallen in love with one of his students, Amy Catherine Robbins (1872–1927; later known as Jane), with whom he moved to Woking, Surrey, in May 1895. They lived in a rented house, 'Lynton' (now No. 141), Maybury Road, in the town centre for just under 18 months and married at St Pancras register office in October 1895.: 165 His short period in Woking was perhaps the most creative and productive of his whole writing career; while there, he planned and wrote The War of the Worlds and The Time Machine, completed The Island of Doctor Moreau, wrote and published The Wonderful Visit and The Wheels of Chance, and began writing two other early books, When the Sleeper Wakes and Love and Mr Lewisham. In late summer 1896, Wells and Jane moved to a larger house in Worcester Park, near Kingston upon Thames, for two years; this lasted until his poor health took them to Sandgate, near Folkestone, where he constructed a large family home, Spade House, in 1901. He had two sons with Jane: George Philip (known as "Gip"; 1901–1985) and Frank Richard (1903–1982): 295 (grandfather of film director Simon Wells). Jane died on 6 October 1927, in Dunmow, at the age of 55, which left Wells devastated. She was cremated at Golders Green, with friends of the couple present, including George Bernard Shaw.: 64 Wells had multiple love affairs. Dorothy Richardson was a friend with whom he had a brief affair which led to a pregnancy and miscarriage in 1907. Wells's wife had been a schoolmate of Richardson. In December 1909, he had a daughter, Anna-Jane, with the writer Amber Reeves, whose parents, William and Maud Pember Reeves, he had met through the Fabian Society. Amber had married the barrister G. R. Blanco White in July of that year, as co-arranged by Wells. After Beatrice Webb voiced disapproval of Wells's "sordid intrigue" with Amber, he responded by lampooning Beatrice Webb and her husband Sidney Webb in his 1911 novel The New Machiavelli as 'Altiora and Oscar Bailey', a pair of short-sighted, bourgeois manipulators. Between 1910 and 1913, the novelist Elizabeth von Arnim was one of his mistresses. In 1914, he had a son, Anthony West (1914–1987), by the novelist and feminist Rebecca West, 26 years his junior. In 1920–21, and intermittently until his death, he had a love affair with the American birth control activist Margaret Sanger. Between 1924 and 1933, he partnered with the Dutch adventurer and writer Odette Keun, 22 years his junior, with whom he lived in Lou Pidou, a house they built together in Grasse, France. Wells dedicated his longest book to her (The World of William Clissold, 1926). When visiting Maxim Gorky in Russia in 1920, he had slept with Gorky's mistress Moura Budberg, then still Countess Benckendorf and 27 years his junior. In 1933, when she left Gorky and emigrated to London, their relationship was renewed and she cared for him through his final illness.
Wells repeatedly asked her to marry him, but Budberg strongly rejected his proposals. In Experiment in Autobiography (1934), Wells wrote: "I was never a great amorist, though I have loved several people very deeply". David Lodge's novel A Man of Parts (2011) – a 'narrative based on factual sources' (author's note) – gives a convincing and generally sympathetic account of Wells's relations with the women mentioned above, and others. One of the ways that Wells expressed himself was through his drawings and sketches. One common location for these was the endpapers and title pages of his own diaries, and they covered a wide variety of topics, from political commentary to his feelings toward his literary contemporaries and his current romantic interests. During his marriage to Amy Catherine, whom he nicknamed Jane, he drew a considerable number of pictures, many of them being overt comments on their marriage. During this period, he called these pictures "picshuas". These picshuas have been the topic of study by Wells scholars for many years, and in 2006, a book was published on the subject. Some of his early novels, called "scientific romances", invented several themes now classic in science fiction in such works as The Time Machine, The Island of Doctor Moreau, The Invisible Man, The War of the Worlds, When the Sleeper Wakes, and The First Men in the Moon. He also wrote realistic novels that received critical acclaim, including Kipps and a critique of English culture during the Edwardian period, Tono-Bungay. Wells also wrote dozens of short stories and novellas, including, "The Flowering of the Strange Orchid", which helped bring the full impact of Darwin's revolutionary botanical ideas to a wider public, and was followed by many later successes such as "The Country of the Blind" (1904). Writer James E. Gunn contended that one of Wells's major contributions to the science fiction genre was his approach, referring to it as his "new system of ideas". Gunn opined that an author should always strive to make the story as credible as possible, even if both the writer and the reader knew certain elements are impossible, allowing the reader to accept the ideas as something that could really happen, today referred to as "the plausible impossible" and "suspension of disbelief". While neither invisibility nor time travel was new in speculative fiction, Wells added a sense of realism to the concepts which the readers were not familiar with. He conceived the idea of using a vehicle that allows an operator to travel purposely and selectively forwards or backwards in time. The term "time machine", coined by Wells, is almost universally used to refer to such a vehicle. He explained that while writing The Time Machine, he realized that "the more impossible the story I had to tell, the more ordinary must be the setting, and the circumstances in which I now set the Time Traveller were all that I could imagine of solid upper-class comforts." In "Wells's Law", a science fiction story should contain only a single extraordinary assumption. Therefore, as justifications for the impossible, he employed scientific ideas and theories. Wells's best-known statement of the "law" appears in his introduction to a collection of his works published in 1934: As soon as the magic trick has been done the whole business of the fantasy writer is to keep everything else human and real. Touches of prosaic detail are imperative and a rigorous adherence to the hypothesis. 
Any extra fantasy outside the cardinal assumption immediately gives a touch of irresponsible silliness to the invention. Dr. Griffin / The Invisible Man is a brilliant research scientist who discovers a method of invisibility, but finds himself unable to reverse the process. An enthusiast of random and irresponsible violence, Griffin has become an iconic character in horror fiction. The Island of Doctor Moreau sees a shipwrecked man left on the island home of Doctor Moreau, a mad scientist who creates human-like hybrid beings from animals via vivisection. The earliest depiction of uplift, the novel deals with a number of philosophical themes, including pain and cruelty, moral responsibility, human identity, and human interference with nature. In The First Men in the Moon Wells used the idea of radio communication between astronomical objects, a plot point inspired by Nikola Tesla's claim that he had received radio signals from Mars. In addition to science fiction, Wells produced work dealing with mythological beings like an angel in The Wonderful Visit (1895) and a mermaid in The Sea Lady (1902). Though Tono-Bungay is not a science-fiction novel, radioactive decay plays a small but consequential role in it. Radioactive decay plays a much larger role in The World Set Free (1914), a book dedicated to Frederick Soddy who would receive a Nobel for proving the existence of radioactive isotopes. This book contains what is surely Wells's biggest prophetic "hit", with the first description of a nuclear weapon (which he termed "atomic bombs"). Scientists of the day were well aware that the natural decay of radium releases energy at a slow rate over thousands of years. The rate of release is too slow to have practical utility, but the total amount released is huge. Wells's novel revolves around an (unspecified) invention that accelerates the process of radioactive decay, producing bombs that explode with no more than the force of ordinary high explosives—but which "continue to explode" for days on end. "Nothing could have been more obvious to the people of the earlier twentieth century, than the rapidity with which war was becoming impossible ... [but] they did not see it until the atomic bombs burst in their fumbling hands". In 1932, the physicist and conceiver of nuclear chain reaction Leó Szilárd read The World Set Free (the same year Sir James Chadwick discovered the neutron), a book which he wrote in his memoirs had made "a very great impression on me". In 1934, Szilárd took his ideas for a chain reaction to the British War Office and later the Admiralty, assigning his patent to the Admiralty to keep the news from reaching the notice of the wider scientific community. He wrote, "Knowing what this [a chain reaction] would mean—and I knew it because I had read H. G. Wells—I did not want this patent to become public." Wells also wrote non-fiction. His first non-fiction bestseller was Anticipations of the Reaction of Mechanical and Scientific Progress upon Human Life and Thought (1901). When originally serialised in a magazine it was subtitled "An Experiment in Prophecy", and is considered his most explicitly futuristic work. It offered the immediate political message of the privileged sections of society continuing to bar capable men from other classes from advancement until war would force a need to employ those most able, rather than the traditional upper classes, as leaders. 
Anticipating what the world would be like in the year 2000, the book is interesting both for its hits (trains and cars resulting in the dispersion of populations from cities to suburbs; moral restrictions declining as men and women seek greater sexual freedom; the defeat of German militarism, and the existence of a European Union) and its misses (he did not expect successful aircraft before 1950, and averred that "my imagination refuses to see any sort of submarine doing anything but suffocate its crew and founder at sea"). His bestselling two-volume work, The Outline of History (1920), began a new era of popularised world history. It received a mixed critical response from professional historians. However, it was very popular amongst the general population and made Wells a rich man. Many other authors followed with "Outlines" of their own in other subjects. He reprised his Outline in 1922 with a much shorter popular work, A Short History of the World, a history book praised by Albert Einstein, and two long efforts, The Science of Life (1930)—written with his son G. P. Wells and evolutionary biologist Julian Huxley, and The Work, Wealth and Happiness of Mankind (1931). The "Outlines" became sufficiently common for James Thurber to parody the trend in his humorous essay, "An Outline of Scientists"—indeed, Wells's Outline of History remains in print with a new 2005 edition, while A Short History of the World has been re-edited (2006). From quite early in Wells's career, he sought a better way to organise society and wrote a number of Utopian novels. The first of these was A Modern Utopia (1905), which shows a worldwide utopia with "no imports but meteorites, and no exports at all"; two travellers from our world fall into its alternate history. The others usually begin with the world rushing to catastrophe, until people realise a better way of living: whether by mysterious gases from a comet causing people to behave rationally and abandoning a European war (In the Days of the Comet (1906)), or a world council of scientists taking over, as in The Shape of Things to Come (1933, which he later adapted for the 1936 Alexander Korda film, Things to Come). This depicted, all too accurately, the impending World War, with cities being destroyed by aerial bombs. He also portrayed the rise of fascist dictators in The Autocracy of Mr Parham (1930) and The Holy Terror (1939). Men Like Gods (1923) is also a utopian novel. Wells in this period was regarded as an enormously influential figure; the literary critic Malcolm Cowley stated: "by the time he was forty, his influence was wider than any other living English writer". Wells contemplates the ideas of nature and nurture and questions humanity in books such as The First Men in the Moon, where nature is completely suppressed by nurture, and The Island of Doctor Moreau, where the strong presence of nature represents a threat to a civilized society. Not all his scientific romances ended in a Utopia, and Wells also wrote a dystopian novel, When the Sleeper Wakes (1899, rewritten as The Sleeper Awakes, 1910), which pictures a future society where the classes have become more and more separated, leading to a revolt of the masses against the rulers. The Island of Doctor Moreau is even darker. 
The narrator, having been trapped on an island of animals vivisected (unsuccessfully) into human beings, eventually returns to England; like Gulliver on his return from the Houyhnhnms, he finds himself unable to shake off the perception of his fellow humans as barely civilised beasts, slowly reverting to their animal natures. Wells also wrote the preface for the first edition of W. N. P. Barbellion's diaries, The Journal of a Disappointed Man, published in 1919. Since "Barbellion" was the real author's pen name, many reviewers believed Wells to have been the true author of the Journal; Wells always denied this, despite being full of praise for the diaries. In 1927, the Canadian teacher and writer Florence Deeks unsuccessfully sued Wells for infringement of copyright and breach of trust, claiming that much of The Outline of History had been plagiarised from her unpublished manuscript, The Web of the World's Romance, which had spent nearly nine months in the hands of Wells's Canadian publisher, Macmillan Canada. However, it was sworn on oath at the trial that the manuscript remained in Toronto in the safekeeping of Macmillan, and that Wells did not even know it existed, let alone had seen it. The court found no proof of copying, and decided the similarities were due to the fact that the books had a similar nature and both writers had access to the same sources. The case went on appeal from the Canadian courts to the Judicial Committee of the Privy Council, at that time the highest court of appeal for the British Empire, which dismissed the appeal in Deeks v Wells. In 2000, A. B. McKillop, a professor of history at Carleton University, produced a book on the case, The Spinster & The Prophet: Florence Deeks, H. G. Wells, and the Mystery of the Purloined Past. According to McKillop, the lawsuit was unsuccessful due to prejudice against a woman suing a well-known and famous male author, and he paints a detailed story based on the circumstantial evidence of the case. In 2004, Denis N. Magnusson, professor emeritus of the Faculty of Law, Queen's University, Ontario, published an article on Deeks v. Wells which re-examines the case in relation to McKillop's book. While having some sympathy for Deeks, he argues that she had a weak case that was not well presented, and though she may have met with sexism from her lawyers, she received a fair trial, adding that the law applied is essentially the same law that would be applied to a similar case today (i.e., 2004). In 1933, Wells predicted in The Shape of Things to Come that the world war he feared would begin in January 1940, a prediction which ultimately came true four months early, in September 1939, with the outbreak of World War II.: 209 In 1936, before the Royal Institution, Wells called for the compilation of a constantly growing and changing World Encyclopaedia, to be reviewed by outstanding authorities and made accessible to every human being. He also presented his conception of a world encyclopaedia at the World Congress of Universal Documentation in Paris in 1937. In 1938, he published a collection of essays on the future organisation of knowledge and education, World Brain, including the essay "The Idea of a Permanent World Encyclopaedia". Prior to 1933, Wells's books were widely read in Germany and Austria, and most of his science fiction works had been translated shortly after publication.
By 1933, he had attracted the attention of German officials because of his criticism of the political situation in Germany, and on 10 May 1933, Wells's books were burned by the Nazi youth in Berlin's Opernplatz, and his works were banned from libraries and book stores. Wells, as president of PEN International (Poets, Essayists, Novelists), angered the Nazis by overseeing the expulsion of the German PEN club from the international body in 1934 following the German PEN's refusal to admit non-Aryan writers to its membership. At a PEN conference in Ragusa, Wells refused to yield to Nazi sympathisers who demanded that the exiled author Ernst Toller be prevented from speaking. Near the end of World War II, Allied forces discovered that the SS had compiled lists of people slated for immediate arrest during the invasion of Britain in the abandoned Operation Sea Lion, with Wells included in the alphabetical list of "The Black Book". Seeking a more structured way to play war games, Wells wrote Floor Games (1911) followed by Little Wars (1913), which set out rules for fighting battles with toy soldiers (miniatures). A pacifist prior to the First World War, Wells stated "how much better is this amiable miniature [war] than the real thing". According to Wells, the idea of the game developed from a visit by his friend Jerome K. Jerome. After dinner, Jerome began shooting down toy soldiers with a toy cannon, and Wells joined in to compete. During August 1914, immediately after the outbreak of the First World War, Wells published a number of articles in London newspapers that subsequently appeared as a book entitled The War That Will End War.: 147 He coined the expression with the idealistic belief that the result of the war would make a future conflict impossible. Wells blamed the Central Powers for the coming of the war and argued that only the defeat of German militarism could bring about an end to war. Wells used the shorter form of the phrase, "the war to end war", in In the Fourth Year (1918), in which he noted that the phrase "got into circulation" in the second half of 1914. In fact, it had become one of the most common catchphrases of the war. In 1918, Wells worked for the British War Propaganda Bureau, also called Wellington House. Wells was also one of fifty-three leading British authors — a number that included Rudyard Kipling, Thomas Hardy and Sir Arthur Conan Doyle — who signed their names to the "Authors' Declaration". This manifesto declared that the German invasion of Belgium had been a brutal crime, and that Britain "could not without dishonour have refused to take part in the present war". Wells visited Russia three times: 1914, 1920 and 1934. After his visits to Petrograd and Moscow in January 1914, he came back to England, "a staunch Russophile". His views were recorded in a newspaper article, "Russia and England: A Study on Contrasts", published in The Daily News in February 1914, and in his novel Joan and Peter (1918). During his second visit, he saw his old friend Maxim Gorky and, with Gorky's help, met Vladimir Lenin. In his book Russia in the Shadows, Wells portrayed Russia as recovering from a total social collapse, "the completest that has ever happened to any modern social organisation". On 23 July 1934, after visiting U.S. President Franklin D. Roosevelt, Wells went to the Soviet Union and interviewed Joseph Stalin for three hours for the New Statesman magazine, which was extremely rare at that time.
He told Stalin how he had seen 'the happy faces of healthy people' in contrast with his previous visit to Moscow in 1920. However, he also criticised the lawlessness, class discrimination, state violence, and absence of free expression. Stalin enjoyed the conversation and replied accordingly. As the chairman of the London-based PEN International, which protected the rights of authors to write without being intimidated, Wells hoped by his trip to USSR, he could win Stalin over by force of argument. Before he left, he realised that no reform was to happen in the near future. Wells's greatest literary output occurred before the First World War, which was lamented by younger authors whom he had influenced. In this connection, George Orwell described Wells as "too sane to understand the modern world", and "since 1920 he has squandered his talents in slaying paper dragons." G. K. Chesterton quipped: "Mr Wells is a born storyteller who has sold his birthright for a pot of message". Wells had diabetes, and was a co-founder in 1934 of The Diabetic Association (now Diabetes UK, the leading charity for people with diabetes in the UK). On 28 October 1940, on the radio station KTSA in San Antonio, Texas, Wells took part in a radio interview with Orson Welles, who two years previously had performed a famous radio adaptation of The War of the Worlds. During the interview, by Charles C Shaw, a KTSA radio host, Wells admitted his surprise at the sensation that resulted from the broadcast but acknowledged his debt to Welles for increasing sales of one of his "more obscure" titles. Wells died on 13 August 1946, aged 79, at his home at 13 Hanover Terrace, overlooking Regent's Park, London. In his preface to the 1941 edition of The War in the Air, Wells had stated that his epitaph should be: "I told you so. You damned fools." Wells's body was cremated at Golders Green Crematorium on 16 August 1946; his ashes were subsequently scattered into the English Channel at Old Harry Rocks, the most eastern point of the Jurassic Coast and about 3.5 miles (5.6 km) from Swanage in Dorset. A commemorative blue plaque in his honour was installed by the Greater London Council at his home in Regent's Park in 1966. Futurist A futurist and "visionary", Wells foresaw the advent of aircraft, tanks, space travel, nuclear weapons, satellite television, and something resembling the World Wide Web. Asserting that "Wells's visions of the future remain unsurpassed", John Higgs, author of Stranger Than We Can Imagine: Making Sense of the Twentieth Century, states that in the late 19th century Wells "saw the coming century clearer than anyone else. He anticipated wars in the air, the sexual revolution, motorised transport causing the growth of suburbs and a proto-Wikipedia he called the "world brain". In his novel The World Set Free, he imagined an "atomic bomb" of terrifying power that would be dropped from aeroplanes. This was an extraordinary insight for an author writing in 1913, and it made a deep impression on Winston Churchill." Many readers have hailed H. G. Wells and George Orwell as special kinds of writers, ones endowed with remarkable prescriptive and prophetic powers. Wells was the twentieth-century prototype of this literary vatic figure: he invented the role, explored its possibilities, especially through new forms of prose and new ways to publish, and defined its boundaries. 
His impact on his culture was profound; as George Orwell wrote, "The minds of all of us, and therefore the physical world, would be perceptibly different if Wells had never existed." — The Author as Cultural Hero: H. G. Wells and George Orwell. In 2011, Wells was among a group of science fiction writers featured in the Prophets of Science Fiction series, a show produced and hosted by film director Sir Ridley Scott, which depicts how predictions influenced the development of scientific advancements by inspiring many readers to assist in transforming those futuristic visions into everyday reality. In a 2013 review of The Time Machine for the New Yorker magazine, Brad Leithauser writes, "At the base of Wells's great visionary exploit is this rational, ultimately scientific attempt to tease out the potential future consequences of present conditions—not as they might arise in a few years, or even decades, but millennia hence, epochs hence. He is world literature's Great Extrapolator. Like no other fiction writer before him, he embraced 'deep time'." Political views Wells was a socialist and a member of the Fabian Society. He stood as a Labour Party candidate for London University in the 1922 and 1923 general elections. Wells was a member of The Other Club, a London dining club co-founded by Winston Churchill, who was an avid reader of his books; after they first met in 1902, they kept in touch until Wells died in 1946. As a junior minister, Churchill borrowed lines from Wells for one of his most famous early landmark speeches in 1906; as Prime Minister, he used the phrase "the gathering storm" to describe the rise of Nazi Germany, a phrase Wells had written in The War of the Worlds, which depicts an attack on Britain by Martians. Wells's extensive writings on equality and human rights, most notably his influential The Rights of Man (1940), laid the groundwork for the 1948 Universal Declaration of Human Rights, which was adopted by the United Nations shortly after his death. His efforts regarding the League of Nations, a project on which he collaborated with Leonard Woolf through the booklets The Idea of a League of Nations, Prolegomena to the Study of World Organization, and The Way of the League of Nations, ended in disappointment, as the organisation turned out to be too weak to prevent the Second World War, which itself occurred towards the very end of his life and only increased the pessimistic side of his nature. In his last book, Mind at the End of Its Tether (1945), he considered the possibility that humanity's replacement by another species might not be a bad thing. He referred to the era between the two World Wars as "The Age of Frustration". Wells was initially an opponent of Zionism. In In the Days of the Comet, Jews are described as parasites on European society; however, Wells later became a strong supporter of the establishment of a Jewish state in response to the Holocaust, and he initiated a correspondence with Chaim Weizmann, the first president of Israel. Religious views Wells's views on God and religion changed over his lifetime. Early in his life, he distanced himself from Christianity, and later from theism; finally, late in life, he was essentially atheistic. Martin Gardner summarises this progression: [The younger Wells] ... did not object to using the word "God" provided it did not imply anything resembling human personality.
In his middle years Wells went through a phase of defending the concept of a "finite God," similar to the god of such process theologians as Samuel Alexander, Edgar Brightman, and Charles Hartshorne. (He even wrote a book about it called God the Invisible King.) Later Wells decided he was really an atheist. In God the Invisible King (1917), Wells wrote that his idea of God did not draw upon the traditional religions of the world: This book sets out as forcibly and exactly as possible the religious belief of the writer. [Which] is a profound belief in a personal and intimate God. ... Putting the leading idea of this book very roughly, these two antagonistic typical conceptions of God may be best contrasted by speaking of one of them as God-as-Nature or the Creator, and of the other as God-as-Christ or the Redeemer. One is the great Outward God; the other is the Inmost God. The first idea was perhaps developed most highly and completely in the God of Spinoza. It is a conception of God tending to pantheism, to an idea of a comprehensive God as ruling with justice rather than affection, to a conception of aloofness and awestriking worshipfulness. The second idea, which is contradictory to this idea of an absolute God, is the God of the human heart. The writer suggested that the great outline of the theological struggles of that phase of civilisation and world unity which produced Christianity, was a persistent but unsuccessful attempt to get these two different ideas of God into one focus. Later in the work, he aligns himself with a "renascent or modern religion ... neither atheist nor Buddhist nor Mohammedan nor Christian ... [that] he has found growing up in himself". Of Christianity, he said: "it is not now true for me. ... Every believing Christian is, I am sure, my spiritual brother ... but if systemically I called myself a Christian I feel that to most men I should imply too much and so tell a lie". Of other world religions, he writes: "All these religions are true for me as Canterbury Cathedral is a true thing and as a Swiss chalet is a true thing. There they are, and they have served a purpose, they have worked. Only they are not true for me to live in them. ... They do not work for me". In The Fate of Homo Sapiens (1939), Wells criticised almost all world religions and philosophies, stating "there is no creed, no way of living left in the world at all, that really meets the needs of the time .... When we come to look at them coolly and dispassionately, all the main religions, patriotic, moral and customary systems in which human beings are sheltering today, appear to be in a state of jostling and mutually destructive movement, like the houses and palaces and other buildings of some vast, sprawling city overtaken by a landslide." Wells's opposition to organised religion reached a fever pitch in 1943 with publication of his book Crux Ansata, subtitled "An Indictment of the Roman Catholic Church" in which he attacked Catholicism, Pope Pius XII and called for the bombing of the city of Rome. Literary influence and legacy The science fiction historian John Clute describes Wells as "the most important writer the genre has yet seen", and notes his work has been central to both British and American science fiction. Science fiction author and critic Algis Budrys said Wells "remains the outstanding expositor of both the hope, and the despair, which are embodied in the technology and which are the major facts of life in our world". 
He was nominated for the Nobel Prize in Literature in 1921, 1932, 1935, and 1946. Wells so influenced real exploration of space that impact craters on Mars and the Moon were named after him: Wells's genius was his ability to create a stream of brand new, wholly original stories out of thin air. Originality was Wells's calling card. In a six-year stretch from 1895 to 1901, he produced a stream of what he called "scientific romance" novels, which included The Time Machine, The Island of Doctor Moreau, The Invisible Man, The War of the Worlds and The First Men in the Moon. This was a dazzling display of new thought, endlessly copied since. A book like The War of the Worlds inspired every one of the thousands of alien invasion stories that followed. It burned its way into the psyche of mankind and changed us all forever. — Cultural historian John Higgs, The Guardian. In the United Kingdom, Wells's work was a key model for the British "scientific romance", and other writers in that mode, such as Olaf Stapledon, J. D. Beresford, S. Fowler Wright, and Naomi Mitchison, all drew on Wells's example. Wells was also an important influence on British science fiction of the period after the Second World War, with Arthur C. Clarke and Brian Aldiss expressing strong admiration for Wells's work. A self-declared fan of Wells, John Wyndham, author of The Day of the Triffids and The Midwich Cuckoos, echoes Wells's obsession with catastrophe and its aftermath. His early work (pre 1920) made Wells the literary hero of dystopian novelist George Orwell. Among contemporary British science fiction writers, Stephen Baxter, Christopher Priest and Adam Roberts have all acknowledged Wells's influence on their writing; all three are vice-presidents of the H. G. Wells Society. He also had a strong influence on British scientist J. B. S. Haldane, who wrote Daedalus; or, Science and the Future (1924), "The Last Judgement" and "On Being the Right Size" from the essay collection Possible Worlds (1927), and Biological Possibilities for the Human Species in the Next Ten Thousand Years (1963), which are speculations about the future of human evolution and life on other planets. Haldane gave several lectures about these topics which in turn influenced other science fiction writers. In the United States, Hugo Gernsback reprinted most of Wells's work in the pulp magazine Amazing Stories, regarding Wells's work as "texts of central importance to the self-conscious new genre". Later American writers such as Ray Bradbury, Isaac Asimov, Frank Herbert, Carl Sagan, and Ursula K. Le Guin all recalled being influenced by Wells. Sinclair Lewis's early novels were strongly influenced by Wells's realistic social novels, such as The History of Mr Polly; Lewis also named his first son Wells after the author. Lewis nominated H. G. Wells for the Nobel Prize in Literature in 1932. In an interview with The Paris Review, Vladimir Nabokov described Wells as his favourite writer when he was a boy and "a great artist". He went on to cite The Passionate Friends, Ann Veronica, The Time Machine, and The Country of the Blind as superior to anything else written by Wells's British contemporaries. Nabokov said: "His sociological cogitations can be safely ignored, of course, but his romances and fantasies are superb." Jorge Luis Borges wrote many short pieces on Wells in which he demonstrates a deep familiarity with much of Wells's work. 
While Borges wrote several critical reviews, including a mostly negative review of Wells's film Things to Come, he regularly treated Wells as a canonical figure of fantastic literature. Late in his life, Borges included The Invisible Man and The Time Machine in his Prologue to a Personal Library, a curated list of 100 great works of literature that he undertook at the behest of the Argentine publishing house Emecé. Wells also inspired writers of continental European speculative fiction such as Karel Čapek, Mikhail Bulgakov and Yevgeny Zamyatin. In 2021, Wells was one of six British writers commemorated on a series of UK postage stamps issued by Royal Mail to celebrate British science fiction. Six classic science fiction novels were depicted, one from each author, with The Time Machine chosen to represent Wells. Representations Film adaptations The novels and short stories of H. G. Wells have been adapted for cinema. These include Island of Lost Souls (1932), The Invisible Man (1933), Things to Come (1936), The Man Who Could Work Miracles (1937), The War of the Worlds (1953), The Time Machine (1960), First Men in the Moon (1964), The Island of Dr. Moreau (1977), Time After Time (1979), The Island of Dr. Moreau (1996), The Time Machine (2002), War of the Worlds (2005) and War of the Worlds (2025). Literary papers In 1954, the University of Illinois Urbana-Champaign purchased the H. G. Wells literary papers and correspondence collection. The university's Rare Book & Manuscript Library holds the largest collection of Wells manuscripts, correspondence, first editions and publications in the United States. Among these are unpublished material and the manuscripts of such works as The War of the Worlds and The Time Machine. The collection includes first editions, revisions and translations. The letters contain general family correspondence, communications from publishers, material regarding the Fabian Society, and letters from politicians and public figures, most notably George Bernard Shaw and Joseph Conrad.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Alexis_de_Tocqueville] | [TOKENS: 9121] |
Contents Alexis de Tocqueville Alexis Charles Henri Clérel, comte de Tocqueville (29 July 1805 – 16 April 1859) was a French diplomat, political philosopher and historian. He is best known for his works Democracy in America (appearing in two volumes, 1835 and 1840) and The Old Regime and the Revolution (1856). In both, he analyzed the living standards and social conditions of individuals as well as their relationship to the market and state in Western societies. Democracy in America was published after Tocqueville's travels in the United States and is today considered an early work of sociology and political science. Tocqueville was active in French politics, first under the July Monarchy (1830–1848) and then during the Second Republic (1849–1851) which succeeded the February 1848 Revolution. He retired from political life after Louis Napoléon Bonaparte's 2 December 1851 coup and thereafter began work on The Old Regime and the Revolution. Tocqueville argued that the importance of the French Revolution was to continue the process of modernizing and centralizing the French state which had begun under King Louis XIV. He believed the failure of the Revolution came from the inexperience of the deputies, who were too wedded to abstract Enlightenment ideals. Tocqueville was a classical liberal who advocated parliamentary government and was sceptical of the extremes of majoritarianism. During his time in parliament, he was first a member of the centre-left before moving to the centre-right, and the complex and restless nature of his liberalism has led to contrasting interpretations and admirers across the political spectrum. For example, Democracy in America was interpreted differently across national contexts. In France and the United States, Tocqueville's work was seen as liberal, whereas both progressives and conservatives in the British Isles interpreted his work as supporting their own positions. Early life Tocqueville came from an old aristocratic Norman family, the great-grandson of the statesman Malesherbes, who was guillotined in 1793. He was the third son of Hervé Louis François Jean Bonaventure Clérel, Count of Tocqueville, an officer of the Constitutional Guard of King Louis XVI, and Louise Madeleine Le Peletier de Rosanbo, who themselves might have faced the guillotine but for the fall in 1794 of Maximilien Robespierre. Under the Bourbon Restoration, Tocqueville's father became a noble peer and prefect. Tocqueville attended the Lycée Fabert in Metz. Political career Tocqueville, who despised the July Monarchy (1830–1848), began his political career in 1839. From 1839 to 1851, he served as a member of the lower house of parliament for the Manche department (Valognes). He sat on the centre-left, defended abolitionist views and upheld free trade while supporting the colonisation of Algeria carried on by Louis-Philippe I's regime. In 1842, he was elected as a member of the American Philosophical Society. In 1847, Tocqueville sought to found a Young Left (Jeune Gauche) party which would advocate wage increases, a progressive tax, and other labor concerns in order to undermine the appeal of the socialists. After the fall of the July Monarchy in the Revolution of 1848, Tocqueville was elected a member of the Constituent Assembly of 1848, where he became a member of the commission charged with the drafting of the new Constitution of the Second Republic (1848–1851). He defended bicameralism and the election of the President of the Republic by universal suffrage.
As the countryside was thought to be more conservative than the labouring population of Paris, he conceived of universal suffrage as a means to counteract the revolutionary spirit of Paris. During the Second Republic, Tocqueville sided with the Party of Order against the socialists. A few days after the February 1848 insurrection, he anticipated that a violent clash would be inescapable between the Parisian workers' population, led by socialists agitating in favour of a "Democratic and Social Republic", and the conservatives, who included the aristocracy and the rural population. Indeed, these social tensions eventually exploded in the June Days Uprising of 1848. The suppression of the uprising, led by General Cavaignac, was supported by Tocqueville, who advocated the "regularization" of the state of siege declared by Cavaignac and other measures promoting suspension of the constitutional order. Between May and September, Tocqueville participated in the Constitutional Commission which wrote the new Constitution. His proposals, such as his amendment about the President and his reelection, reflected lessons he drew from his North American experience. A supporter of Cavaignac and of the Party of Order, Tocqueville accepted an invitation to enter Odilon Barrot's government as Minister of Foreign Affairs from 3 June to 31 October 1849. During the troubled days of June 1849, he pleaded with Interior Minister Jules Armand Dufaure for the reestablishment of the state of siege in the capital and approved the arrest of demonstrators. Tocqueville, who since February 1848 had supported laws restricting political freedoms, approved the two laws voted immediately after the days of June 1849, which restricted the liberty of clubs and freedom of the press. This active support in favour of laws restricting political freedoms stands in contrast to his defence of freedoms in Democracy in America. According to Tocqueville, he favoured order as "the sine qua non for the conduct of serious politics. He [hoped] to bring the kind of stability to French political life that would permit the steady growth of liberty unimpeded by the regular rumblings of the earthquakes of revolutionary change". Tocqueville had supported Cavaignac against Louis Napoléon Bonaparte for the presidential election of 1848. Opposed to Louis Napoléon Bonaparte's 2 December 1851 coup which followed his election, Tocqueville was among the deputies who gathered at the 10th arrondissement of Paris in an attempt to resist the coup and have Napoleon III judged for "high treason", as he had violated the constitutional limit on terms of office. Detained at Vincennes and then released, Tocqueville, who supported the Restoration of the Bourbons against Napoleon III's Second Empire (1851–1870), quit political life and retreated to his castle (Château de Tocqueville). Against this image of Tocqueville, biographer Joseph Epstein concluded: "Tocqueville could never bring himself to serve a man he considered a usurper and despot. He fought as best he could for the political liberty in which he so ardently believed—had given it, in all, thirteen years of his life ... . He would spend the days remaining to him fighting the same fight, but conducting it now from libraries, archives, and his own desk." There, he began the draft of L'Ancien Régime et la Révolution, publishing the first volume in 1856 but leaving the second unfinished.
Travels In 1831, Tocqueville obtained from the July Monarchy a mission to examine prisons and penitentiaries in the United States and proceeded there with his lifelong friend Gustave de Beaumont. While they did visit some prisons, Tocqueville and Beaumont traveled widely in the United States: from the east-coast cities to what was then the north-west frontier, Michigan; by steamboat down the Ohio and Mississippi to New Orleans; and by stagecoach across the South back toward the east coast and north to New York. Tocqueville also made a side trip to Montreal and Quebec City. Throughout his trip, he took extensive notes on his observations and reflections. He returned within nine months and published a report, The Penitentiary System in the United States, although the better-known result of his tour was his major work Democracy in America, which appeared in 1835. Beaumont also wrote an account of their travels in Jacksonian America: Marie or Slavery in the United States (1835). Tocqueville returned to France in February 1832. Before putting the finishing touches to his reflections on American democracy, he departed for England in 1833. Tocqueville had a private reason for crossing the Channel: to meet the family of Mary Mottley, a young woman he had met at Versailles.: xiii–xiv The couple were married in 1836. He stayed five weeks in England, eager to observe what many imagined as the dawning of the age of democracy, the passage of the Parliamentary Reform Act.: xiii–xiv Tocqueville concluded that there was "a good chance for the English to succeed in modifying the social and political set-up ... without violent convulsions". The British nobility was open to new recruits. He suggested that the difference with the French was "clear from the use of one word", as "gentleman in English applies to any well-educated man, regardless of birth, whereas in France gentilhomme can only be used of a noble by birth".: xiv In May 1835, Tocqueville returned to England and then, in the summer, travelled on with Beaumont to Ireland,: xviii then part of the United Kingdom of Great Britain and Ireland. He described Ireland as having "all the evils of an aristocracy and none of its advantages". There was "no moral tie between rich and poor; the difference of political opinion of religious belief and the actual distance they live apart make them strangers one to the other, one could almost say enemies".: 114–115 In this circumstance he remarked on the "unbelievable unity between the Irish clergy and the Catholic population". The people looked to the clergy, and the clergy, "rebuffed" by the "upper classes" ("Protestants and enemies"), had "turned all its attention to the lower classes; it has the same instincts, the same interests and the same passions as the people; [a] state of affairs altogether peculiar to Ireland".: 127–128 Back in England, Tocqueville found confirmation of a close connection between centralisation and democratisation. He observed that in England centralisation took a form less absolute than in France. It was of "legislation and not administration", and it co-existed with a "spirit of [civic] association" that, in responding to specific and local issues, narrowed the range of government intervention.: xvi Tocqueville had intended that the joint impressions of their trip to Britain and Ireland would form the basis of a work by Beaumont, just as their common reflections on the United States had served him as material for Democracy in America.
Beaumont did produce L'Irlande sociale, politique et religieuse (1839). Much praised by Daniel O'Connell, the first sentence of its historical introduction read: "The dominion of the English in Ireland, from their invasion of the country in 1169, to the close of the last century, has been nothing but a tyranny." In 1841 and 1846, Tocqueville traveled to Algeria, which France had invaded and colonised from 1830. Having himself entertained the possibility of settling in Algeria as a colonist, from his election to the Chamber of Deputies in 1839 Tocqueville had come to be seen as the parliament's foremost expert on the colony. In 1837, he had written of his hope for eventual intermarriage between the French and indigenous Arabs and their amalgamation into a distinct whole. Following the first of his two visits to Algeria (again accompanied by Beaumont), his position was reversed. When it came to the French colonists, he "displayed his usual liberalism", as he criticised the "coarseness and violence" of the military rule to which they too were subject.: 376–377 Yet from what he observed of Algerian society, including what he understood as "the absence of all political life",: 377 he was persuaded that not only could its violent subjugation be justified but also that its result could not, and should never, be assimilation of the indigenous people into the civil and political life of France. Death A longtime sufferer from bouts of tuberculosis, Tocqueville eventually succumbed to the disease on 16 April 1859 and was buried in the Tocqueville cemetery in Normandy.[citation needed] He was survived by his English wife of 23 years, Mary Mottley. Although she was "too liberal ... too Protestant, too middle-class, and too English" for some in his family, Tocqueville described Mottley as perhaps his only true friend. While they had hoped for a family, they had no children. In advance of their marriage, Mottley converted to Roman Catholicism, Tocqueville's professed religion. While she appeared to be comparatively devout, Tocqueville's own attitude toward religion has been described as "utilitarian", regarding it as a "social cement, a safety valve for passions that might otherwise feed a revolutionary torrent dangerous to individual liberty". Provided it was separated from state power, Tocqueville did not believe that his church was bound to be anti-democratic. Democracy in America In Democracy in America, published in 1835, Tocqueville wrote of the New World and its burgeoning democratic order. Observing from the perspective of a detached social scientist, Tocqueville wrote of his travels through the United States in the early 19th century when the Market Revolution, Western expansion and Jacksonian democracy were radically transforming the fabric of American life. According to political scientist Joshua Kaplan, one purpose of writing Democracy in America was to help the people of France get a better understanding of their position between a fading aristocratic order and an emerging democratic order and to help them sort out the confusion. Tocqueville saw democracy as an enterprise that balanced liberty and equality, concern for the individual as well as for the community. On a negative note, Tocqueville remarked that "in democracies manners are never so refined as amongst aristocratic nations." Tocqueville was an ardent supporter of liberty. He wrote: "I have a passionate love for liberty, law, and respect for rights. I am neither of the revolutionary party nor of the conservative. ...
Liberty is my foremost passion." Writing of "Political Consequences of the Social State of the Anglo-Americans", he said: "But one also finds in the human heart a depraved taste for equality, which impels the weak to want to bring the strong down to their level, and which reduces men to preferring equality in servitude to inequality in freedom." The above is often misquoted as a slavery quote because of previous translations of the French text. The most recent translation by Arthur Goldhammer in 2004 translates the meaning to be as stated above. Examples of the misquotation are numerous on the internet, such as "Americans are so enamored of equality that they would rather be equal in slavery than unequal in freedom", but the text does not contain the words "Americans were so enamored by equality" anywhere. His view on government reflects his belief in liberty and the need for individuals to be able to act freely while respecting others' rights. Of centralized government, he wrote that it "excels in preventing, not doing". Tocqueville continues to comment on equality by saying: "Furthermore, when citizens are all almost equal, it becomes difficult for them to defend their independence against the aggressions of power. As none of them is strong enough to fight alone with advantage, the only guarantee of liberty is for everyone to combine forces. But such a combination is not always in evidence". Tocqueville explicitly cites inequality as an incentive for the poor to become rich. He observes that it is not often that two generations within a family maintain success, and considers inheritance laws which divide a person's estate among multiple heirs to cause a constant cycle of churn between the poor and the rich, thereby over generations making the poor rich and the rich poor. He cites laws in France at the time that protected an estate from being split apart among heirs, thereby preserving wealth and preventing a churn of wealth such as he perceived in 1835 within the United States. Tocqueville's main purpose was to analyze the functioning of political society and various forms of political associations, although he brought some reflections on civil society too (and relations between political and civil society). For Tocqueville, as for Georg Wilhelm Friedrich Hegel and Karl Marx, civil society was a sphere of private entrepreneurship and civilian affairs regulated by civil code. As a critic of individualism, Tocqueville thought that through associating for mutual purpose, both in public and private, Americans are able to overcome selfish desires, thus making both a self-conscious and active political society and a vibrant civil society functioning according to political and civil laws of the state. According to political scientist Joshua Kaplan, Tocqueville did not originate the concept of individualism; instead, he changed its meaning and saw it as a "calm and considered feeling which disposes each citizen to isolate himself from the mass of his fellows and to withdraw into the circle of family and friends ... . [W]ith this little society formed to his taste, he gladly leaves the greater society to look for itself." While Tocqueville saw egotism and selfishness as vices, he saw individualism not as a failure of feeling but as a way of thinking about things which could have either positive consequences, such as a willingness to work together, or negative consequences, such as isolation, and he held that individualism could be remedied by improved understanding.
When individualism was a positive force, prompting people to work together for common purposes and seen as "self-interest properly understood", it helped to counterbalance the danger of the tyranny of the majority, since people could "take control over their own lives" without government aid. According to Kaplan, Americans have a difficult time accepting Tocqueville's criticism of the stifling intellectual effect of the "omnipotence of the majority" and that Americans tend to deny that there is a problem in this regard. Others such as the Catholic writer Daniel Schwindt disagree with Kaplan's interpretation, arguing instead that Tocqueville saw individualism as just another form of egotism and not an improvement over it. To make his case, Schwindt provides citations such as the following: Egoism springs from a blind instinct; individualism from wrong-headed thinking rather than from depraved feelings. It originates as much from defects of intelligence as from the mistakes of the heart. Egoism blights the seeds of every virtue; individualism at first dries up only the source of public virtue. In the longer term it attacks and destroys all the others and will finally merge with egoism. Tocqueville warned that modern democracy may be adept at inventing new forms of tyranny because radical equality could lead to the materialism of an expanding bourgeoisie and to the selfishness of individualism. "In such conditions, we might become so enamored with 'a relaxed love of present enjoyments' that we lose interest in the future of our descendants...and meekly allow ourselves to be led in ignorance by a despotic force all the more powerful because it does not resemble one", wrote The New Yorker's James Wood. Tocqueville worried that if despotism were to take root in a modern democracy, it would be a much more dangerous version than the oppression under the Roman emperors or tyrants of the past, who could only exert a pernicious influence on a small group of people at a time. In contrast, a despotism under a democracy could see "a multitude of men", uniformly alike, equal, "constantly circling for petty pleasures", unaware of fellow citizens and subject to the will of a powerful state which exerted an "immense protective power". Tocqueville compared a potentially despotic democratic government to a protective parent who wants to keep its citizens (children) as "perpetual children" and which does not break men's wills but rather guides them and presides over people in the same way as a shepherd looking after a "flock of timid animals". Tocqueville's penetrating analysis sought to understand the peculiar nature of American political life. In describing the American, he agreed with thinkers such as Aristotle and Montesquieu that the balance of property determined the balance of political power; however, his conclusions differed radically from those of his predecessors. Tocqueville tried to understand why the United States was so different from Europe in the last throes of aristocracy. In contrast to the aristocratic ethic, the United States was a society where hard work and money-making was the dominant ethic, where the common man enjoyed a level of dignity which was unprecedented, where commoners never deferred to elites and where what he described as crass individualism and market capitalism had taken root to an extraordinary degree.[citation needed] Tocqueville writes: "Among a democratic people, where there is no hereditary wealth, every man works to earn a living. ...
Labor is held in honor; the prejudice is not against but in its favor." Tocqueville asserted that the values that had triumphed in the North and were present in the South had begun to suffocate old-world ethics and social arrangements. Legislatures abolished primogeniture and entails, resulting in more widely distributed land holdings. This was a contrast to the general aristocratic pattern in which only the eldest child, usually a man, inherited the estate, which had the effect of keeping large estates intact from generation to generation. In contrast, landed elites in the United States were less likely to pass on fortunes to a single child by the action of primogeniture, which meant that as time went by large estates became broken up within a few generations, which in turn made the children more equal overall. According to Joshua Kaplan's reading of Tocqueville, this was not always a negative development, since bonds of affection and shared experience between children often replaced the more formal relation between the eldest child and the siblings, characteristic of the previous aristocratic pattern. Overall, hereditary fortunes in the new democracies became exceedingly difficult to secure and more people were forced to struggle for their own living.[citation needed] As Tocqueville understood it, this rapidly democratizing society had a population devoted to "middling" values which wanted to amass vast fortunes through hard work. In Tocqueville's mind, this explained why the United States was so different from Europe. In Europe, he claimed, nobody cared about making money. The lower classes had no hope of gaining more than minimal wealth, while the upper classes found it crass, vulgar and unbecoming of their sort to care about something as unseemly as money; many were virtually guaranteed wealth and took it for granted. At the same time in the United States, workers would see people fashioned in exquisite attire and merely proclaim that through hard work they too would soon possess the fortune necessary to enjoy such luxuries.[citation needed] Beyond the eradication of old-world aristocracy, ordinary Americans also refused to defer to those possessing, as Tocqueville put it, superior talent and intelligence, and these natural elites could not enjoy much share in political power as a result. Ordinary Americans enjoyed too much power and claimed too great a voice in the public sphere to defer to intellectual superiors. Tocqueville argued that this culture promoted a relatively pronounced equality, but the same mores and opinions that ensured such equality also promoted mediocrity. Those who possessed true virtue and talent were left with limited choices. Tocqueville said that those with the most education and intelligence were left with two choices. They could join limited intellectual circles to explore the weighty and complex problems facing society, or they could use their superior talents to amass vast fortunes in the private sector. He wrote that he did not know of any country where there was "less independence of mind, and true freedom of discussion, than in America". Tocqueville blamed the omnipotence of majority rule as a chief factor in stifling thinking: "The majority has enclosed thought within a formidable fence. A writer is free inside that area, but woe to the man who goes beyond it, not that he stands in fear of an inquisition, but he must face all kinds of unpleasantness in everyday persecution.
A career in politics is closed to him for he has offended the only power that holds the keys." According to Kaplan's interpretation of Tocqueville, he argued in contrast to previous political thinkers that a serious problem in political life was not that people were too strong but that people were "too weak" and felt "swept up in something that they could not control". Uniquely positioned at a crossroads in American history, Tocqueville's Democracy in America attempted to capture the essence of American culture and values. Although a supporter of colonialism, Tocqueville could clearly perceive the evils that black people and natives had been subjected to in the United States. Tocqueville devoted the last chapter of the first volume of Democracy in America to the question, while his travel companion Gustave de Beaumont wholly focused on slavery and its fallouts for the American nation in Marie or Slavery in America. Tocqueville observes among the American races: The first who attracts the eye, the first in enlightenment, in power and in happiness, is the white man, the European, man par excellence; below him appear the Negro and the Indian. These two unfortunate races have neither birth, nor face, nor language, nor mores in common; only their misfortunes look alike. Both occupy an equally inferior position in the country that they inhabit; both experience the effects of tyranny; and if their miseries are different, they can accuse the same author for them. Tocqueville contrasted the settlers of Virginia with the middle class, religious Puritans who founded New England and analyzed the debasing influence of slavery: The men sent to Virginia were seekers of gold, adventurers without resources and without character, whose turbulent and restless spirit endangered the infant colony. ... Artisans and agriculturalists arrived afterwards[,] ... hardly in any respect above the level of the inferior classes in England. No lofty views, no spiritual conception presided over the foundation of these new settlements. The colony was scarcely established when slavery was introduced; this was the capital fact which was to exercise an immense influence on the character, the laws and the whole future of the South. Slavery ... dishonors labor; it introduces idleness into society, and with idleness, ignorance and pride, luxury and distress. It enervates the powers of the mind and benumbs the activity of man. On this same English foundation there developed in the North very different characteristics. Tocqueville maintained that the friction between races in America was deeper than merely the issue of slavery, even going so far as to say that discrimination against African Americans was worse in states where slavery was outlawed: Whosoever has inhabited the United States must have perceived that in those parts of the Union in which the negroes are no longer slaves, they have in no wise drawn nearer to the whites. On the contrary, the prejudice of the race appears to be stronger in the States which have abolished slavery, than in those where it still exists; and nowhere is it so intolerant as in those States where servitude has never been known. 
Tocqueville concluded that return of the Black population to Africa could not resolve the problem, as he writes at the end of Democracy in America: If the colony of Liberia were able to receive thousands of new inhabitants every year, and if the Negroes were in a state to be sent thither with advantage; if the Union were to supply the society with annual subsidies, and to transport the Negroes to Africa in government vessels, it would still be unable to counterpoise the natural increase of population among the blacks; and as it could not remove as many men in a year as are born upon its territory within that time, it could not prevent the growth of the evil which is daily increasing in the states. The Negro race will never leave those shores of the American continent to which it was brought by the passions and the vices of Europeans; and it will not disappear from the New World as long as it continues to exist. The inhabitants of the United States may retard the calamities which they apprehend, but they cannot now destroy their efficient cause. In 1855, Tocqueville wrote the following text published by Maria Weston Chapman in the Liberty Bell: Testimony against Slavery: I do not think it is for me, a foreigner, to indicate to the United States the time, the measures, or the men by whom Slavery shall be abolished. Still, as the persevering enemy of despotism everywhere, and under all its forms, I am pained and astonished by the fact that the freest people in the world is, at the present time, almost the only one among civilized and Christian nations which yet maintains personal servitude; and this while serfdom itself is about disappearing, where it has not already disappeared, from the most degraded nations of Europe. An old and sincere friend of America, I am uneasy at seeing Slavery retard her progress, tarnish her glory, furnish arms to her detractors, compromise the future career of the Union which is the guaranty of her safety and greatness, and point out beforehand to her, to all her enemies, the spot where they are to strike. As a man, too, I am moved at the spectacle of man's degradation by man, and I hope to see the day when the law will grant equal civil liberty to all the inhabitants of the same empire, as God accords the freedom of the will, without distinction, to the dwellers upon earth. French historian of colonialism Olivier Le Cour Grandmaison argues that Tocqueville (along with Jules Michelet) was ahead of his time in his use of the term "extermination" to describe what was happening during the colonization of the Western United States and the Indian removal period. According to Tocqueville, assimilation of black people would be almost impossible, as was already being demonstrated in the Northern states; however, assimilation was the best solution for Native Americans, and since they were too proud to assimilate, they would inevitably become extinct. Displacement was another part of America's Indian policy. Both populations were "undemocratic", or without the qualities, intellectual and otherwise, needed to live in a democracy. Tocqueville shared many views on assimilation and segregation of his and the coming epochs, but he opposed Arthur de Gobineau's theories as found in An Essay on the Inequality of the Human Races (1853–1855). In his Democracy in America, Tocqueville also forecast the preeminence of the United States and Russia as the two main global powers.
In his book, he stated: "There are now two great nations in the world, which starting from different points, seem to be advancing toward the same goal: the Russians and the Anglo-Americans. ... Each seems called by some secret design of Providence one day to hold in its hands the destinies of half the world." Tocqueville believed that the American jury system was particularly important in educating citizens in self-government and rule of law. He often expressed how the civil jury system was one of the most effective showcases of democracy because it connected citizens with the true spirit of the justice system. In his 1835 treatise Democracy in America, he explained: "The jury, and more especially the civil jury, serves to communicate the spirit of the judges to the minds of all the citizens; and this spirit, with the habits which attend it, is the soundest preparation for free institutions. ... It invests each citizen with a kind of magistracy; it makes them all feel the duties which they are bound to discharge toward society; and the part which they take in the Government." Tocqueville believed that jury service not only benefited the society as a whole but also enhanced jurors' qualities as citizens. Because of the jury system, "they were better informed about the rule of law, and they were more closely connected to the state. Thus, quite independently of what the jury contributed to dispute resolution, participation on the jury had salutary effects on the jurors themselves." Views on Algeria Alexis de Tocqueville was an important figure in the colonization of Algeria. A member of the French parliament during the French conquest of Algeria and the subsequent July Monarchy, Tocqueville took it upon himself to become an expert on the Algeria question, and to this end penned a number of discourses and letters. He also made a point of studying Islam, the Quran, and the Arabic language, in order to better understand the country. In a series of letters, Tocqueville described the situation of France as well as the geography and society of Algeria at the time. "Suppose that the Emperor of China, landing in France at the head of an armed power, should make himself master of our largest cities and of our capital. That after having burned all the public registers before suffering to read them, and having destroyed or dispersed all of the civil service without inquiring into their various attributions, he should finally seize every functionary – from the head of the government to the campesino guards, the peers, the deputies, and in general the whole ruling class – and deport them all at once to some distant country. Do you not think that this great prince, in spite of his powerful army, his fortresses and his treasures, will soon find himself extremely unprepared in administering the conquered country; that his new subjects, deprived of all those who conducted or could conduct affairs of state, will be unable to govern themselves, while he, coming from the antipodes, knows neither the religion, nor the language, nor the laws, nor the habits, nor the administrative customs of the country, and has taken care to remove all those who could have instructed him in them, will be in no state to rule them? You will therefore have no difficulty in foreseeing that if the parts of France which are materially occupied by the victor obey him, the rest of the country will soon be given over to an immense anarchy."
Despite being initially critical of the French invasion of Algeria, Tocqueville also believed that the geopolitical necessities of the time would not allow for a withdrawal of military forces, for two reasons: first, his understanding of the international situation and France's position in the world; and second, changes in French society. Tocqueville believed that war and colonization would "restore national pride", threatened, he believed, by "the gradual softening of social mores" in the middle classes. Their taste for "material pleasures" was spreading to the whole of society, giving it "an example of weakness and egotism". Tocqueville expressed himself in an 1841 essay concerning the conquest of Algeria in which he called for a dual program of "domination" and "colonization". For my part, I have brought back from Africa the distressing notion that at the moment we are waging war in a much more barbaric manner than the Arabs themselves. At present, theirs is the side of civilization. This way of waging war seems to me as stupid as it is cruel. It can only enter into the crude and brutal mind of a soldier. It was not worth displacing the Turks to reproduce that which in them deserved the detestation of the world. That, even from the point of view of interest, is much more harmful than useful; because, as another officer said to me, if we only aim to equal the Turks we will by that fact be in a position much lower than them: barbarians among barbarians, the Turks will always have on us the advantage of being Muslim barbarians. It is thus to a principle superior to theirs that we must appeal. I have often heard in France men whom I respect, but whom I do not agree with, say that it is wrong to burn the harvests, to empty the silos and finally to imprison unarmed men, women and children. These are, in my opinion, unfortunate necessities, but ones to which any people who want to make war on the Arabs will be obliged to submit. And, if I must say what I think, these acts do not revolt me more or even as much as several others which the law of war obviously authorizes and which take place in all the wars of Europe. Why is it more odious to burn harvests and take women and children prisoner than to bombard the harmless population of a besieged city or to seize merchant ships belonging to the subjects of an enemy power at sea? The one is, in my opinion, much crueler and less justifiable than the other. Applauding the methods of General Bugeaud, Tocqueville went so far as to claim that "war in Africa is a science. Everyone is familiar with its rules and everyone can apply those rules with almost complete certainty of success. One of the greatest services that Field Marshal Bugeaud has rendered his country is to have spread, perfected and made everyone aware of this new science." Tocqueville advocated racial segregation as a form of consociationalism in Algeria, with two distinct legislations, one for European colonists and one for the Arab population. Without doubt, it would be as dangerous as it would be useless to try to suggest to them our morals, our ideas, our customs. It is not in the direction of our European civilization that we must now push them, but in the direction of their own civilization; we must ask of them what they desire and not what they despise. Individual property, industry, sedentary living are not contrary to the religion of Mohammed. Arabs have known or know these things elsewhere; they are appreciated and enjoyed by some of them in Algeria itself.
Why should we despair of making them familiar to the greatest number? It has already been attempted on some points with success. Islam is not absolutely impenetrable to the Enlightenment; it has often admitted in its bosom certain sciences or certain arts. Why should we not try to make these flourish under our empire? Let us not force the natives to come to our schools, but let us help them to raise theirs, to multiply those who teach there, to train the men of law and the men of religion, whom Muslim civilization cannot do without any more than ours can. Such a two-tier arrangement would be fully realised with the 1870 Crémieux Decree, which extended French citizenship to European settlers and Algerian Jews, whereas Muslim Algerians would be governed under the Code de l'indigénat; however, Tocqueville hoped for an eventual mixing of the French and Arab populations into a single body: Every day the French are developing clearer and more accurate notions about the inhabitants of Algeria. They learn their languages, become familiar with their customs, and one even sees some who show a kind of unthinking enthusiasm for them. On the other hand, the whole of the young Arab generation in Algiers speaks our language and has already taken on some of our customs. ... There is therefore no reason to believe that time cannot succeed in amalgamating the two races. God does not prevent it; only the faults of men could impede it. In opposition to Olivier Le Cour Grandmaison, Jean-Louis Benoît said that given the extent of racial prejudices during the colonization of Algeria, Tocqueville was one of its "most moderate supporters". Benoît said that it was wrong to assume Tocqueville was a supporter of Bugeaud despite his 1841 apologetic discourse. It seems that Tocqueville modified his views after his second visit to Algeria in 1846, as he criticized Bugeaud's desire to invade Kabylia in an 1847 speech to the Assembly.[citation needed] Although Tocqueville had favoured retention of distinct traditional law, administrators, schools and so on for Arabs who had come under French control, he compared the Berber tribes of Kabylia (in the second of his Two Letters on Algeria, 1837) to Rousseau's concept of the "noble savage", stating: If Rousseau had known the Kabyles ... he would not have spouted so much nonsense about the Caribbean and other American Indians: He would have looked to the Atlas for his models; there he would have found men who are subject to a kind of social police and yet almost as free as the isolated individual who enjoys his wild independence in the depths of the woods; men who are neither rich nor poor, neither servants nor masters; who appoint their own chiefs, and scarcely notice that they have chiefs, who are content with their state and remain in it. Tocqueville's views on the matter were complex. Although in his 1841 report on Algeria he applauded Bugeaud for making war in a way that defeated Abd-el-Kader's resistance, he had advocated in the Two Letters that the French military advance leave Kabylia undisturbed, and in subsequent speeches and writings he continued to oppose intrusion into Kabylia. In the debate about the 1846 extraordinary funds, Tocqueville denounced Bugeaud's conduct of military operations and succeeded in convincing the Assembly not to vote funds in support of Bugeaud's military columns.
Tocqueville considered Bugeaud's plan to invade Kabylia despite the opposition of the Assembly as a seditious act in the face of which the government was opting for cowardice. In his 1847 "Report on Algeria", Tocqueville declared that Europe should avoid making the same mistake they made with the European colonization of the Americas in order to avoid the bloody consequences. More particularly he reminds his countrymen of a solemn caution whereby he warns them that if the methods used towards the Algerian people remain unchanged, colonization will end in a blood bath. Tocqueville includes in his report on Algeria that the fate of their soldiers and finances depended on how the French government treats the various native populations of Algeria, including the various Arab tribes, independent Kabyles living in the Atlas Mountains and the powerful political leader Abd-el-Kader. The latter stresses the obtainment and protection of land and passageways that promise commercial wealth. In the case of Algeria, the Port of Algiers and the control over the Strait of Gibraltar were considered by Tocqueville to be particularly valuable whereas direct control of the political operations of the entirety of Algeria was not. Thus, the author stresses domination over only certain points of political influence as a means to colonization of commercially valuable areas. Tocqueville argued that although unpleasant, domination via violent means is necessary for colonization and justified by the laws of war. Such laws are not discussed in detail; however, given that the goal of the French mission in Algeria was to obtain commercial and military interest as opposed to self-defense, it can be deduced that Tocqueville would not concur with just war theory's jus ad bellum criteria of just cause. Furthermore, given that Tocqueville approved of the use of force to eliminate civilian housing in enemy territory, his approach does not accord with just war theory's jus in bello criteria of proportionality and discrimination. The Old Regime and the Revolution In 1856, Tocqueville published The Old Regime and the Revolution. The book analyzes French society before the French Revolution—the ancien régime—and investigates the forces that caused the Revolution. References in popular literature Tocqueville was quoted in several chapters of Toby Young's memoirs How to Lose Friends and Alienate People to explain his observation of widespread homogeneity of thought even amongst intellectual elites at Harvard University during his time spent there. He is frequently quoted and studied in American history classes. Tocqueville is the inspiration for Australian novelist Peter Carey in his 2009 novel Parrot and Olivier in America. Tocqueville and his memoir Recollections are mentioned in Ada Palmer's novel Too Like the Lightning to describe someone with divided loyalty. Works See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-americanheritage_animal-6] | [TOKENS: 6011] |
Contents Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan, and the vast majority of bilaterians belong to two large clades: the protostomes, which include organisms such as arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the latter of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion. Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation 485.4 Mya. Common to all living animals, 6,331 groups of genes have been identified that may have arisen from a single common ancestor that lived about 650 Mya during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey, while other terrestrial and aquatic animals are hunted for sport, trophies or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave arts and totems since the earliest times, and are frequently featured in mythology, religion, arts, literature, heraldry, politics, and sports.
Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia. In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals have several characteristics that they share with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot produce their own food, a feature they share with fungi. Animals ingest organic material and digest it internally. Animals have structural characteristics that set them apart from all other living things: Typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible, and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated instances of mating with a close relative during sexual reproduction generally leads to inbreeding depression within a population due to the increased prevalence of harmful recessive traits. Animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites. 
Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction where the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. Selective pressures imposed on one another lead to an evolutionary arms race between predator and prey, resulting in various antagonistic/competitive coevolutions. Almost all multicellular predators are animals. Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles which mainly eat sponges. Most animals rely on biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels indirectly acquire the nutrients by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow and to sustain basal metabolism and fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via oxidising inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move on to land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land environments are Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera, and Nematoda. Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus which may have reached 39 metres.
Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. The following table lists estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine), and free-living or parasitic ways of life. Species estimates shown here are based on numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million. Using patterns within the taxonomic hierarchy, the total number of animal species—including those not yet described—was calculated to be about 7.77 million in 2011.[a] Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is only produced by sponges and pelagophyte algae. Its likely origin is from sponges based on molecular clock estimates for the origin of 24-ipc production in both groups. Analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia establishes their nature. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may however be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously. That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion) from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear for example in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms.
However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures. Phylogeny Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing the external phylogeny shown in the cladogram, in which uncertainty of relationships is indicated with dashed lines. The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. (Cladogram taxa: Holomycota (including fungi), Ichthyosporea, Pluriformea, Filasterea.) The relationships at the base of the animal tree have been debated. Other than Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. In addition to sponges, Placozoa has no symmetry and was often considered a "missing link" between protists and multicellular animals. The presence of Hox genes in Placozoa shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, with a cladogram for the sponge-sister view that they supported (their ctenophore-sister tree simply interchanging the places of ctenophores and sponges). (Cladogram taxa, in the order listed: Porifera, Ctenophora, Placozoa, Cnidaria, Bilateria.) Conversely, a 2023 study by Darrin Schultz and colleagues uses ancient gene linkages to construct a ctenophore-sister phylogeny. (Cladogram taxa, in the order listed: Ctenophora, Porifera, Placozoa, Cnidaria, Bilateria.) Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals. They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined, and under active research.
The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, which have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. (A modern consensus phylogenetic tree for the Bilateria comprises the following major groups: Xenacoelomorpha, Ambulacraria, Chordata, Ecdysozoa, Spiralia.) Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles, that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells. However, over evolutionary time, descendant species have evolved which have lost one or more of each of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes. The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs.
History of classification In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul) down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians. In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes (radiata) (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia. In human culture The human population exploits a large number of other animal species for food, both of domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food. A smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates including cephalopods, crustaceans, insects—principally bees and silkworms—and bivalve or gastropod molluscs are hunted or farmed for food, fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world. 
Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects. Working animals including cattle and horses have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster serve a major role in science as experimental models. Animals have been used to create vaccines since vaccination was discovered in the 18th century. Some medicines such as the cancer drug trabectedin are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, from invertebrates such as tarantulas, octopuses, and praying mantises, to reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots. However, the most commonly kept pet species are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans, and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representations the butterfly is also a symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism. Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship. |
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/2013] | [TOKENS: 512] |
Contents 2013 The year 2013 is the 13th year of the 21st century. It is a common year, 365 days long. 1 January 2013 in the Gregorian calendar precedes 1 January in the Julian calendar by 13 days. All dates below follow the Gregorian calendar. Events, births, and deaths are listed by month, followed by a combined Gregorian–Hebrew calendar with international observances; no Jewish holidays or festivals are listed. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-barlow08-23] | [TOKENS: 11899] |
Contents Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, being picked up and spread at the low Martian gravity even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active with marsquakes trembling underneath the ground, but also hosts many enormous volcanoes that are extinct (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago. During the martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, is the currently dominating and remaining influence on geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, being the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. 
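As a rough cross-check of the size comparison above, a sketch using round figures not stated in the text (an Earth mean diameter of about 12,742 km and an Earth land fraction of roughly 29%):
\[ \frac{A_{\text{Mars}}}{A_{\text{Earth}}} = \left(\frac{6{,}779}{12{,}742}\right)^{2} \approx 0.28, \]
which is close to the fraction of Earth's surface that is dry land, consistent with the statement that Mars's surface area is about the size of all of Earth's dry land.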
Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System. Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 billion to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; and Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the following are the three primary periods: the Noachian, the Hesperian, and the Amazonian, as outlined above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness.
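As a consistency sketch for the bulk figures quoted above (about 15% of Earth's volume, 11% of its mass, and 38% of its surface gravity), using only those stated ratios:
\[ \frac{R_{\text{Mars}}}{R_{\text{Earth}}} \approx \left(\frac{V_{\text{Mars}}}{V_{\text{Earth}}}\right)^{1/3} \approx 0.15^{1/3} \approx 0.53, \qquad \frac{g_{\text{Mars}}}{g_{\text{Earth}}} = \frac{M_{\text{Mars}}/M_{\text{Earth}}}{(R_{\text{Mars}}/R_{\text{Earth}})^{2}} \approx \frac{0.11}{0.53^{2}} \approx 0.39, \]
in line with the roughly 38% of Earth's surface gravity stated above.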
The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than surrounding depth intervals. The mantle appears to be rigid down to the depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogenous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, concentrations that are toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. 
The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface are on average 0.64 millisieverts of radiation per day, and significantly less than the radiation of 1.84 millisieverts per day or 22 millirads per day during the flight to and from Mars. For comparison the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts of radiation per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with potentially levels as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. 
After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
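A quick arithmetic check of the zero-elevation datum described above, assuming a standard Earth sea-level pressure of 101,325 Pa (a value not given in the text):
\[ \frac{610.5\ \text{Pa}}{101{,}325\ \text{Pa}} \approx 0.0060 \approx 0.6\% \approx 0.006\ \text{atm}. \]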
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, making Mars a planet with a possible two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. "Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface.
The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization, involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Compared to Earth, its higher concentration of atmospheric CO2 and lower surface pressure may be why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported radiation levels on the surface of the planet Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons, alternating between its northern and southern hemispheres, as on Earth. Additionally the orbit of Mars has, compared to Earth's, a large eccentricity and approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area, to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. Seasonal deposits of carbon dioxide frost (dry ice) also form on the polar ice caps. Hydrology While Mars contains significant amounts of water, most of it is dust-covered water ice at the Martian polar ice caps.
The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet with a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods. Although little water is present in the atmosphere, there is enough to produce clouds of water ice and different cases of snow and frost, often mixed with snow of carbon dioxide dry ice. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. 
The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometers). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system.
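As a rough check of the deuterium enrichment quoted above, using only the isotope ratios stated there:
\[ \frac{(9.3 \pm 1.7)\times 10^{-4}}{1.56\times 10^{-4}} \approx 6.0 \pm 1.1, \]
i.e. roughly five to seven times the terrestrial value, matching the stated range.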
Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference and thus the delta-v needed to transfer between Mars and Earth is the second lowest for Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars has its closest approach to Earth (opposition) in a synodic period of 779.94 days. It should not be confused with Mars conjunction, where the Earth and Mars are at opposite sides of the Solar System and form a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. 
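Returning briefly to the orbital figures above, a sketch of how the quoted synodic period follows from the two orbital periods (taking a Martian year of about 687 days and an Earth year of 365.25 days):
\[ \frac{1}{S} = \frac{1}{P_{\text{Earth}}} - \frac{1}{P_{\text{Mars}}} = \frac{1}{365.25\ \text{d}} - \frac{1}{687\ \text{d}} \;\Rightarrow\; S \approx 780\ \text{d}, \]
consistent with the stated 779.94-day interval between successive oppositions; dividing 687 by 365.25 likewise gives the quoted 1.88 Earth years per Martian year.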
The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More-recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. By analyzing rocks which point to tidal processes on the planet, it is possible that these tides may have been regulated by a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. 
During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις, though more commonly the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy was presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In the East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a ten-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei made the first telescopic astronomical observations, including of Mars. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun-Earth distance; this was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only occultation of Mars by Venus observed was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave names of famous rivers on Earth.
His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth sent to Mars was Mars 1 of the Soviet Union, intended to fly by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous conceptions of Mars were overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between Viking 1's shutdown in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without making contact (Phobos 1, 1988; Mars Observer, 1993), and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997 Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), started an uninterrupted active robotic presence at Mars that has lasted until today; the latter produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA (Europe), the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the different elements of the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars.
NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Planned missions to Mars include: As of February 2024[update], debris from these types of missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W. Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. 
A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinite. Impact glass, formed by the impact of meteors, which on Earth can preserve signs of life, has also been found on the surface of the impact craters on Mars. Likewise, the glass in impact craters on Mars could have preserved signs of life, if life existed at the site. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. Although highly intriguing, no definitive final determination on a biological or abiotic origin of this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, in 2021, China was planning to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal to settle on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisions the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth, and in situ resource utilization on Mars, until the Mars colony reaches full self sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. 
Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. See also Notes References Further reading External links Solar System → Local Interstellar Cloud → Local Bubble → Gould Belt → Orion Arm → Milky Way → Milky Way subgroup → Local Group → Local Sheet → Local Volume → Virgo Supercluster → Laniakea Supercluster → Pisces–Cetus Supercluster Complex → Local Hole → Observable universe → UniverseEach arrow (→) may be read as "within" or "part of". |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Karl_Marx] | [TOKENS: 16261] |
Contents Karl Marx Karl Marx[a] (German: [ˈkaʁl ˈmaʁks]; 5 May 1818 – 14 March 1883) was a German philosopher, social and political theorist, economist, journalist, and revolutionary socialist. He is best-known for the 1848 pamphlet The Communist Manifesto (written with Friedrich Engels), and his three-volume Das Kapital (1867–1894), a critique of classical political economy which employs his theory of historical materialism in an analysis of capitalism, in the culmination of his life's work. Marx's ideas and their subsequent development, collectively known as Marxism, have had enormous influence. Born in Trier in the Kingdom of Prussia, Marx studied at the University of Bonn and the University of Berlin, and received a doctoral degree in philosophy from the University of Jena in 1841. A Young Hegelian, he was influenced by the philosophy of Georg Wilhelm Friedrich Hegel, and both critiqued and developed Hegel's ideas in works such as The German Ideology (written 1846) and the Grundrisse (written 1857–1858). While in Paris, Marx wrote his Economic and Philosophic Manuscripts of 1844 and met Engels, who became his closest friend and collaborator. After moving to Brussels in 1845, they were active in the Communist League, and in 1848 they wrote The Communist Manifesto, which expresses Marx's ideas and lays out a programme for revolution. Marx was expelled from Belgium and Germany, and in 1849 moved to London, where he wrote The Eighteenth Brumaire of Louis Bonaparte (1852) and Das Kapital. From 1864, Marx was involved in the International Workingmen's Association (First International), in which he fought the influence of anarchists led by Mikhail Bakunin. In his Critique of the Gotha Programme (1875), Marx wrote on revolution, the state and the transition to communism. He died stateless in 1883 and was buried in Highgate Cemetery. Marx's critiques of history, society and political economy hold that human societies develop through class conflict. In the capitalist mode of production, this manifests itself in the conflict between the ruling classes (the bourgeoisie) that control the means of production and the working classes (the proletariat) that enable these means by selling their labour power for wages. Employing his historical materialist approach, Marx predicted that capitalism produced internal tensions like previous socioeconomic systems and that these tensions would lead to its self-destruction and replacement by a new system known as the socialist mode of production. For Marx, class antagonisms under capitalism—owing in part to its instability and crisis-prone nature—would eventuate the working class's development of class consciousness, leading to their conquest of political power and eventually the establishment of a classless, communist society constituted by a free association of producers. Marx actively pressed for its implementation, arguing that the working class should carry out organised proletarian revolutionary action to topple capitalism and bring about socio-economic emancipation. Marx has been described as one of the most influential figures of the modern era, and his work has been both lauded and criticised. Marxism has exerted major influence on socialist thought and political movements, with Marxist schools of thought such as Marxism–Leninism and its offshoots becoming the guiding ideologies of revolutions that took power in many countries during the 20th century, forming communist states. 
Marx's work in economics has had a strong influence on modern heterodox theories of labour and capital, and he is often cited as one of the principal architects of modern sociology. Biography Karl Marx was born on 5 May 1818 to Heinrich Marx and Henriette Pressburg, at Brückengasse 664 in Trier, then part of the Kingdom of Prussia. Marx's family was originally non-religious Jewish but had converted formally to Christianity before his birth. His maternal grandfather was a Dutch rabbi, while his paternal line had supplied Trier's rabbis since 1723, a role taken by his grandfather Meier Halevi Marx. His father was the first in the line to receive a secular education. He became a lawyer with a comfortably upper-middle-class income; in addition to his earnings as an attorney, the family owned a number of Moselle vineyards. After Prussia's annexation of the Rhineland in 1815 and the subsequent abrogation of Jewish emancipation, Heinrich converted from Judaism to the state Evangelical Church of Prussia in order to retain his career as a lawyer. Largely non-religious, Heinrich was a man of the Enlightenment, interested in the ideas of the philosophers Immanuel Kant and Voltaire. A classical liberal, he took part in agitation for a constitution and reforms in Prussia, which was then an absolute monarchy. In 1815, Heinrich Marx began working as an attorney and in 1819 moved his family to a ten-room property near the Porta Nigra. His wife, Henriette Pressburg, was a Dutch Jew from a prosperous business family that later founded the company Philips Electronics. Her sister Sophie Pressburg married Lion Philips and was the grandmother of both Gerard and Anton Philips and great-grandmother to Frits Philips. Lion Philips was a wealthy Dutch tobacco manufacturer and industrialist, upon whom Karl and Jenny Marx would later often come to rely for loans while they were exiled in London. Little is known of Marx's childhood. The third of nine children, he became the eldest son when his brother Moritz died in 1819. Marx and his surviving siblings were baptised into the Lutheran Church on 28 August 1824, and their mother in November 1825. Marx was privately educated by his father until 1830, when he entered Trier High School, whose headmaster, Hugo Wyttenbach, was a friend of his father. By employing many liberal humanists as teachers, Wyttenbach incurred the anger of the local conservative government. In 1832, police raided the school and discovered that literature promoting political liberalism was being distributed among the students. Viewing the distribution of such material as a seditious act, the authorities implemented reforms and replaced several members of the staff during Marx's time at the school. In October 1835 at the age of 16, Marx travelled to the University of Bonn wishing to study philosophy and literature, but his father insisted on law as a more practical field. Due to a condition referred to as a "weak chest", Marx was excused from military duty when he turned 18. While at the University of Bonn, Marx joined the Poets' Club, a group containing political radicals that were monitored by the police. Marx also joined the Trier Tavern Club drinking society and at one point served as the club's co-president. In August 1836 he took part in a duel with a member of the university's Borussian Korps. Although his grades in the first term were good, they soon deteriorated, leading his father to force a transfer to the more serious and academic University of Berlin.
Spending summer and autumn 1836 in Trier, Marx became more serious about his studies and his life. He became engaged to Jenny von Westphalen, an educated member of the petty nobility who had known Marx since childhood. As she had broken off her engagement with a young aristocrat to be with Marx, their relationship was socially controversial owing to the differences between their religious and class origins, but Marx befriended her father Ludwig von Westphalen (a liberal aristocrat) and later dedicated his doctoral thesis to him. Seven years after their engagement, on 19 June 1843, they married in a Protestant church in Kreuznach. In October 1836, Marx arrived in Berlin, matriculating in the university's faculty of law and renting a room in the Mittelstrasse. During the first term, Marx attended lectures of Eduard Gans (who represented the progressive Hegelian standpoint, elaborated on rational development in history by emphasising particularly its libertarian aspects, and the importance of the social question) and of Karl von Savigny (who represented the Historical School of Law). Although studying law, he was fascinated by philosophy and looked for a way to combine the two, believing that "without philosophy nothing could be accomplished". Marx became interested in the recently deceased German philosopher Georg Wilhelm Friedrich Hegel, whose ideas were then widely debated among European philosophical circles. During a convalescence in Stralau, he joined the Doctors Club, a student group which discussed Hegelian ideas, and through them became involved with a group of radical thinkers known as the Young Hegelians in 1837. They gathered around Ludwig Feuerbach and Bruno Bauer, with Marx developing a particularly close friendship with Adolf Rutenberg. Like Marx, the Young Hegelians were critical of Hegel's metaphysical assumptions but adopted his dialectical method to criticise established society, politics and religion from a left-wing perspective. Marx's father died in May 1838, resulting in a diminished income for the family. Marx had been emotionally close to his father and treasured his memory after his death. By 1837, Marx had completed a short novel, Scorpion and Felix; a drama, Oulanem; and a number of love poems dedicated to his wife. None of this early work was published during his lifetime. The love poems were published posthumously in the Collected Works of Karl Marx and Frederick Engels: Volume 1. Marx soon abandoned fiction for other pursuits, including the study of English and Italian, art history and the translation of Latin classics. He began co-operating with Bruno Bauer on editing Hegel's Philosophy of Religion in 1840. Marx was also engaged in writing his doctoral thesis, The Difference Between the Democritean and Epicurean Philosophy of Nature, which he completed in 1841. It was described as "a daring and original piece of work in which Marx set out to show that theology must yield to the superior wisdom of philosophy". The essay was controversial, particularly among the conservative professors at the University of Berlin. Marx decided instead to submit his thesis to the more liberal University of Jena, whose faculty awarded him his Ph.D. in April 1841. As Marx and Bauer were both atheists, in March 1841 they began plans for a journal entitled Archiv des Atheismus (Atheistic Archives), but it never came to fruition. In July, Marx and Bauer took a trip to Bonn from Berlin. 
There they scandalised their class by getting drunk, laughing in church and galloping through the streets on donkeys. Marx was considering an academic career, but this path was barred by the government's growing opposition to classical liberalism and the Young Hegelians. Marx moved to Cologne in 1842, where he became a journalist, writing for the radical newspaper Rheinische Zeitung (Rhineland News), expressing his early views on socialism and his developing interest in economics. Marx criticised right-wing European governments as well as figures in the liberal and socialist movements, whom he thought ineffective or counter-productive. The newspaper attracted the attention of the Prussian government censors, who checked every issue for seditious material before printing, which Marx lamented: "Our newspaper has to be presented to the police to be sniffed at, and if the police nose smells anything un-Christian or un-Prussian, the newspaper is not allowed to appear". After the Rheinische Zeitung published an article strongly criticising the Russian monarchy, Tsar Nicholas I requested it be banned, and Prussia's government complied in 1843. In 1843, Marx became co-editor of a new, radical left-wing Parisian newspaper, the Deutsch-Französische Jahrbücher (German-French Annals), then being set up by the German activist Arnold Ruge to bring together German and French radicals. Therefore Marx and his wife moved to Paris in October 1843. Initially living with Ruge and his wife communally at 23 Rue Vaneau, they found the living conditions difficult, so moved out following the birth of their daughter Jenny in 1844. Although intended to attract writers from both France and the German states, the Jahrbücher was dominated by the latter and the only non-German writer was the exiled Russian anarchist collectivist Mikhail Bakunin. Marx contributed two essays to the paper, "Introduction to a Contribution to the Critique of Hegel's Philosophy of Right" and "On the Jewish Question", the latter introducing his belief that the proletariat were a revolutionary force and marking his embrace of communism; "On the Jewish Question" has also been described as evidence of Marx's antisemitic views by writers such as Paul Johnson, Bernard Lewis, Hyam Maccoby, and Robert S. Wistrich, but this view is disputed by Wendy Brown, Robert Fine, David McLellan, and Francis Wheen, among others. Only one issue was published, but it was relatively successful, largely owing to the inclusion of Heinrich Heine's satirical odes on King Ludwig of Bavaria, leading the German states to ban it and seize imported copies (Ruge nevertheless refused to fund the publication of further issues and his friendship with Marx broke down). After Jahrbücher's collapse, Marx began writing for Vorwärts! (Forwards!), the only remaining uncensored German-language radical newspaper. Based in Paris, the paper was connected to the League of the Just, a utopian socialist secret society of workers and artisans. Marx attended some of their meetings but did not join. In Vorwärts!, Marx refined his views on socialism based upon Hegelian and Feuerbachian ideas of dialectical materialism, at the same time criticising liberals and other socialists operating in Europe. On 28 August 1844, Marx met the German socialist Friedrich Engels at the Café de la Régence, beginning a lifelong friendship. 
Engels showed Marx his recently published The Condition of the Working Class in England in 1844, convincing Marx that the working class would be the agent and instrument of the final revolution in history. Soon, Marx and Engels were collaborating on a criticism of the philosophical ideas of Marx's former friend, Bruno Bauer. This work was published in 1845 as The Holy Family. Although critical of Bauer, Marx was increasingly influenced by the ideas of the Young Hegelians Max Stirner and Ludwig Feuerbach, but eventually Marx and Engels abandoned Feuerbachian materialism as well. During the time that he lived at 38 Rue Vaneau in Paris (from October 1843 until January 1845), Marx engaged in an intensive study of political economy (Adam Smith, David Ricardo, James Mill, etc.), the French socialists (especially Claude Henri St. Simon and Charles Fourier) and the history of France. The study of, and critique, of political economy is a project that Marx would pursue for the rest of his life and would result in his major economic work—the three-volume series called Das Kapital. Marxism is based in large part on three influences: Hegel's dialectics, French utopian socialism and British political economy. Together with his earlier study of Hegel's dialectics, the studying that Marx did during this time in Paris meant that all major components of "Marxism" were in place by the autumn of 1844. Marx was constantly being pulled away from his critique of political economy—not only by the usual daily demands of the time, but additionally by editing a radical newspaper and later by organising and directing the efforts of a political party during years of potentially revolutionary popular uprisings of the citizenry. Still, Marx was always drawn back to his studies where he sought "to understand the inner workings of capitalism". An outline of "Marxism" had definitely formed in the mind of Karl Marx by late 1844. Indeed, many features of the Marxist view of the world had been worked out in great detail, but Marx needed to write down all of the details of his world view to further clarify the new critique of political economy in his own mind. Accordingly, Marx wrote The Economic and Philosophical Manuscripts. These manuscripts covered numerous topics, detailing Marx's concept of alienated labour. By the spring of 1845, his continued study of political economy, capital and capitalism had led Marx to the belief that the new critique of political economy he was espousing—that of scientific socialism—needed to be built on the base of a thoroughly developed materialistic view of the world. The Economic and Philosophical Manuscripts of 1844 had been written between April and August 1844, but soon Marx recognised that the Manuscripts had been influenced by some inconsistent ideas of Ludwig Feuerbach. Accordingly, Marx recognised the need to break with Feuerbach's philosophy in favour of historical materialism, thus a year later (in April 1845) after moving from Paris to Brussels, Marx wrote his eleven "Theses on Feuerbach". The "Theses on Feuerbach" are best known for Thesis 11, which states that "philosophers have only interpreted the world in various ways, the point is to change it". This work contains Marx's criticism of materialism (for being contemplative), idealism (for reducing practice to theory), and, overall, philosophy (for putting abstract reality above the physical world). 
It thus introduced the first glimpse at Marx's historical materialism, an argument that the world is changed not by ideas but by actual, physical, material activity and practice. In 1845, after receiving a request from the Prussian king, the French government shut down Vorwärts!, with the interior minister, François Guizot, expelling Marx from France. Unable either to stay in France or to move to Germany, Marx decided to emigrate to Brussels in Belgium in February 1845. However, to stay in Belgium he had to pledge not to publish anything on the subject of contemporary politics. In Brussels, Marx associated with other exiled socialists from across Europe, including Moses Hess, Karl Heinzen and Joseph Weydemeyer. In April 1845, Engels moved from Barmen in Germany to Brussels to join Marx and the growing cadre of members of the League of the Just now seeking home in Brussels. Later, Mary Burns, Engels' long-time companion, left Manchester, England, to join Engels in Brussels. In mid-July 1845, Marx and Engels left Brussels for England to visit the leaders of the Chartists, a working-class movement in Britain. This was Marx's first trip to England and Engels was an ideal guide for the trip. Engels had already spent two years living in Manchester from November 1842 to August 1844. Not only did Engels already know the English language, but he had also developed a close relationship with many Chartist leaders. Indeed, Engels was serving as a reporter for many Chartist and socialist English newspapers. Marx used the trip as an opportunity to examine the economic resources available for study in various libraries in London and Manchester. In collaboration with Engels, Marx also set about writing a book which is often seen as his best treatment of the concept of historical materialism, The German Ideology. In this work, Marx broke with Ludwig Feuerbach, Bruno Bauer, Max Stirner and the rest of the Young Hegelians, while he also broke with Karl Grün and other "true socialists" whose philosophies were still based in part on "idealism". In German Ideology, Marx and Engels finally completed their philosophy, which was based solely on materialism as the sole motor force in history. German Ideology is written in a humorously satirical form, but even this satirical form did not save the work from censorship. Like so many other early writings of his, German Ideology would not be published in Marx's lifetime and was published only in 1932. After completing German Ideology, Marx turned to a work that was intended to clarify his own position regarding "the theory and tactics" of a truly "revolutionary proletarian movement" operating from the standpoint of a truly "scientific materialist" philosophy. This work was intended to draw a distinction between the utopian socialists and Marx's own scientific socialist philosophy. Whereas the utopians believed that people must be persuaded one person at a time to join the socialist movement, the way a person must be persuaded to adopt any different belief, Marx knew that people would tend, on most occasions, to act in accordance with their own economic interests, thus appealing to an entire class (the working class in this case) with a broad appeal to the class's best material interest would be the best way to mobilise the broad mass of that class to make a revolution and change society. 
This was the intent of the new book that Marx was planning, but to get the manuscript past the government censors he called the book The Poverty of Philosophy (1847) and offered it as a response to the "petty-bourgeois philosophy" of the French anarchist socialist Pierre-Joseph Proudhon as expressed in his book The Philosophy of Poverty (1846). These books laid the foundation for Marx and Engels's most famous work, a political pamphlet that has since come to be commonly known as The Communist Manifesto. While residing in Brussels in 1846, Marx continued his association with the secret radical organisation League of the Just. As noted above, Marx thought the League to be just the sort of radical organisation that was needed to spur the working class of Europe toward the mass movement that would bring about a working-class revolution. However, to organise the working class into a mass movement the League had to cease its "secret" or "underground" orientation and operate in the open as a political party. Members of the League eventually became persuaded in this regard. Accordingly, in June 1847 the League was reorganised by its membership into a new open "above ground" political society that appealed directly to the working classes. This new open political society was called the Communist League. Both Marx and Engels participated in drawing up the programme and organisational principles of the new Communist League. In late 1847, Marx and Engels began writing what was to become their most famous work – a programme of action for the Communist League. Written jointly by Marx and Engels from December 1847 to January 1848, The Communist Manifesto was first published on 21 February 1848. The Communist Manifesto laid out the beliefs of the new Communist League. No longer a secret society, the Communist League wanted to make its aims and intentions clear to the general public rather than hiding its beliefs as the League of the Just had been doing. The opening lines of the pamphlet set forth the principal basis of Marxism: "The history of all hitherto existing society is the history of class struggles". It goes on to examine the antagonisms that Marx claimed were arising in the clashes of interest between the bourgeoisie (the wealthy capitalist class) and the proletariat (the industrial working class). Proceeding on from this, the Manifesto presents the argument for why the Communist League, as opposed to other socialist and liberal political parties and groups at the time, was truly acting in the interests of the proletariat to overthrow capitalist society and to replace it with socialism. Later that year, Europe experienced a series of protests, rebellions, and often violent upheavals that became known as the Revolutions of 1848. In France, a revolution led to the overthrow of the monarchy and the establishment of the French Second Republic. Marx was supportive of such activity, and having recently received a substantial inheritance of either 6,000 or 5,000 francs from his father's estate (withheld by his uncle Lion Philips since his father's death in 1838), he allegedly used a third of it to arm Belgian workers who were planning revolutionary action. Although the veracity of these allegations is disputed, the Belgian Ministry of Justice accused Marx of the scheme and arrested him, and he was forced to flee back to France, where, with a new republican government in power, he believed that he would be safe.
Temporarily settling down in Paris, Marx transferred the Communist League executive headquarters to the city and also set up a German Workers' Club with various German socialists living there. Hoping to see the revolution spread to Germany, in 1848 Marx moved back to Cologne where he began issuing a handbill entitled the Demands of the Communist Party in Germany, in which he argued for only four of the ten points of the Communist Manifesto, believing that in Germany at that time the bourgeoisie must overthrow the feudal monarchy and aristocracy before the proletariat could overthrow the bourgeoisie. On 1 June, Marx started the publication of a daily newspaper, the Neue Rheinische Zeitung, which he helped to finance through his recent inheritance from his father. Designed to put forward news from across Europe with his own Marxist interpretation of events, the newspaper featured Marx as a primary writer and the dominant editorial influence. Despite contributions by fellow members of the Communist League, according to Friedrich Engels it remained "a simple dictatorship by Marx". While Marx was editor of the paper, he and the other revolutionary socialists were regularly harassed by the police, and Marx was brought to trial on several occasions, facing various allegations including insulting the Chief Public Prosecutor, committing a press misdemeanour and inciting armed rebellion through tax boycotting, although each time he was acquitted. Meanwhile, the democratic parliament in Prussia collapsed and the king, Frederick William IV, introduced a new cabinet of his reactionary supporters, who implemented counterrevolutionary measures to expunge left-wing and other revolutionary elements from the country. Consequently, the Neue Rheinische Zeitung was soon suppressed, and Marx was ordered to leave the country on 16 May 1849. Marx returned to Paris, which was then under the grip of both a reactionary counterrevolution and a cholera epidemic, and was soon expelled by the city authorities, who considered him a political threat. With his wife Jenny expecting their fourth child and with Marx not able to move back to Germany or Belgium, in August 1849 he sought refuge in London. Marx would remain based in London for the rest of his life. The headquarters of the Communist League also moved to London. However, in the winter of 1849–1850, a split within the ranks of the Communist League occurred when a faction within it led by August Willich and Karl Schapper began agitating for an immediate uprising. Willich and Schapper believed that once the Communist League had initiated the uprising, the entire working class from across Europe would rise "spontaneously" to join it, thus creating revolution across Europe. Marx and Engels protested that such an unplanned uprising on the part of the Communist League was "adventuristic" and would be suicide for the Communist League. Such an uprising as that recommended by the Schapper/Willich group would easily be crushed by the police and the armed forces of the reactionary governments of Europe. Marx maintained that this would spell doom for the Communist League itself, arguing that changes in society are not achieved overnight through the efforts and will power of a handful of men. They are instead brought about through a scientific analysis of economic conditions of society and by moving toward revolution through different stages of social development.
In the present stage of development (circa 1850), following the defeat of the uprisings across Europe in 1848, he felt that the Communist League should encourage the working class to unite with progressive elements of the rising bourgeoisie to defeat the feudal aristocracy on issues involving demands for governmental reforms, such as a constitutional republic with freely elected assemblies and universal (male) suffrage. In other words, the working class must join with bourgeois and democratic forces to bring about the successful conclusion of the bourgeois revolution before stressing the working-class agenda and a working-class revolution. After a long struggle that threatened to ruin the Communist League, Marx's opinion prevailed, and eventually the Willich/Schapper group left the Communist League. Meanwhile, Marx also became heavily involved with the socialist German Workers' Educational Society. The Society held its meetings in Great Windmill Street, Soho, central London's entertainment district. This organisation was also racked by an internal struggle among its members, some of whom followed Marx while others followed the Schapper/Willich faction. The issues in this internal split were the same issues raised in the internal split within the Communist League, but Marx lost the fight with the Schapper/Willich faction within the German Workers' Educational Society and on 17 September 1850 resigned from the Society. In the early period in London, Marx committed himself almost exclusively to his studies, such that his family endured extreme poverty. His main source of income was Engels, whose own source was his wealthy industrialist father. In Prussia, as editor of his own newspaper and contributor to others that were ideologically aligned, Marx could reach his audience, the working classes. In London, without finances to run a newspaper themselves, he and Engels turned to international journalism. At one stage they were being published by six newspapers from England, the United States, Prussia, Austria, and South Africa. Marx's principal earnings came from his work as European correspondent, from 1852 to 1862, for the New-York Daily Tribune, and from producing articles for more "bourgeois" newspapers. Marx had his articles translated from German by Wilhelm Pieper, until his proficiency in English had become adequate. The New-York Daily Tribune had been founded in April 1841 by Horace Greeley. Its editorial board contained progressive bourgeois journalists and publishers, among them George Ripley and the journalist Charles Dana, who was editor-in-chief. Dana, a Fourierist and an abolitionist, was Marx's contact. The Tribune was a vehicle for Marx to reach a transatlantic public, such as for his "hidden warfare" against Henry Charles Carey. The journal had wide working-class appeal from its foundation; at two cents, it was inexpensive; and, with about 50,000 copies per issue, its circulation was the widest in the United States. Its editorial ethos was progressive and its anti-slavery stance reflected Greeley's. Marx's first article for the paper, on the British parliamentary elections, was published on 21 August 1852. On 21 March 1857, Dana informed Marx that due to the economic recession only one article a week would be paid for, published or not; the others would be paid for only if published. Marx had sent his articles on Tuesdays and Fridays, but, that October, the Tribune discharged all its correspondents in Europe except Marx and B.
Taylor, and reduced Marx to a weekly article. Between September and November 1860, only five were published. After a six-month interval, Marx resumed contributions from September 1861 until March 1862, when Dana wrote to inform him that there was no longer space in the Tribune for reports from London, due to American domestic affairs. In 1868, Dana took over a rival newspaper, the New York Sun, as its editor-in-chief. In April 1857, Dana invited Marx to contribute articles, mainly on military history, to the New American Cyclopedia, an idea of George Ripley, Dana's friend and literary editor of the Tribune. In all, 67 Marx-Engels articles were published, of which 51 were written by Engels, although Marx did some research for them in the British Museum. By the late 1850s, American popular interest in European affairs waned, and Marx's articles turned to topics such as the "slavery crisis" and, with the outbreak of the American Civil War in 1861, the "War Between the States". Between December 1851 and March 1852, Marx worked on his theoretical work about the French Revolution of 1848, titled The Eighteenth Brumaire of Louis Bonaparte. In this he explored concepts in historical materialism, class struggle, dictatorship of the proletariat, and victory of the proletariat over the bourgeois state. The 1850s and 1860s may be said to mark a philosophical boundary distinguishing the young Marx's Hegelian idealism from the more mature Marx's scientific ideology associated with structural Marxism. However, not all scholars accept this distinction. For Marx and Engels, the experience of the Revolutions of 1848 to 1849 was formative in the development of their theory of economics and historical progression. After the "failures" of 1848, the revolutionary impetus appeared spent and not to be renewed without an economic recession. Contention arose between Marx and his fellow communists, whom he denounced as "adventurists". Marx deemed it fanciful to propose that "will power" could be sufficient to create the revolutionary conditions when in reality the economic component was the necessary requisite. The recession in the United States' economy in 1852 gave Marx and Engels grounds for optimism for revolutionary activity, yet this economy was seen as too immature for a capitalist revolution. Open territories on America's western frontier dissipated the forces of social unrest. Moreover, any economic crisis arising in the United States would not lead to revolutionary contagion of the older economies of individual European nations, which were closed systems bounded by their national borders. When the so-called Panic of 1857 in the United States spread globally, it broke all economic theory models, and was the first truly global economic crisis. Marx continued to write articles for the New-York Daily Tribune as long as he was sure that the Tribune's editorial policy was still progressive. However, the departure of Charles Dana from the paper in late 1861 and the resultant change in the editorial board brought about a new editorial policy. No longer was the Tribune to be a strong abolitionist paper dedicated to a complete Union victory. The new editorial board supported an immediate peace between the Union and the Confederacy in the Civil War in the United States with slavery left intact in the Confederacy. Marx strongly disagreed with this new political position and in 1863 was forced to withdraw as a writer for the Tribune.
In 1864, Marx became involved in the International Workingmen's Association (known as the First International), to whose General Council he was elected at its inception in 1864. In that organisation, Marx was involved in the struggle against the anarchist wing centred on Mikhail Bakunin. Although Marx won this contest, the transfer of the seat of the General Council from London to New York in 1872, which Marx supported, led to the decline of the International. The most important political event during the existence of the International was the Paris Commune of 1871 when the citizens of Paris rebelled against their government and held the city for two months. In response to the bloody suppression of this rebellion, Marx wrote one of his most famous pamphlets, "The Civil War in France", a defence of the Commune. Given the repeated failures and frustrations of workers' revolutions and movements, Marx also sought to understand and provide a critique suitable for the capitalist mode of production, and hence spent a great deal of time in the reading room of the British Museum studying. By 1857, Marx had accumulated over 800 pages of notes and short essays on capital, landed property, wage labour, the state, and foreign trade, and the world market, though this work did not appear in print until 1939, under the title Grundrisse der Kritik der Politischen Ökonomie (English: Outlines of the Critique of Political Economy). In 1859, Marx published A Contribution to the Critique of Political Economy, his first serious critique of political economy. This work was intended merely as a preview of his three-volume Das Kapital (English title: Capital: Critique of Political Economy), which he intended to publish at a later date. In A Contribution to the Critique of Political Economy, Marx began to critically examine axioms and categories of economic thinking. The work was enthusiastically received, and the edition sold out quickly. The successful sales of A Contribution to the Critique of Political Economy stimulated Marx in the early 1860s to finish work on the three large volumes that would compose his major life's work – Das Kapital and the Theories of Surplus Value, which discussed and critiqued the theoreticians of political economy, particularly Adam Smith and David Ricardo. Theories of Surplus Value is often referred to as the fourth volume of Das Kapital and constitutes one of the first comprehensive treatises on the history of economic thought. In 1867, the first volume of Das Kapital was published, a work which critically analysed capital. Das Kapital proposes an explanation of the "laws of motion" of the mode of production from its origins to its future by describing the dynamics of the accumulation of capital, with topics such as the growth of wage labour, the transformation of the workplace, capital accumulation, competition, the banking system, the tendency of the rate of profit to fall and land-rents, as well as how waged labour continually reproduce the rule of capital. Marx proposes that the driving force of capital is in the exploitation of labour, whose unpaid work is the ultimate source of surplus value. Demand for a Russian language edition of Das Kapital soon led to the printing of 3,000 copies of the book in the Russian language, which was published on 27 March 1872. By the autumn of 1871, the entire first edition of the German-language edition of Das Kapital had been sold out and a second edition was published. 
Volumes II and III of Das Kapital remained mere manuscripts upon which Marx continued to work for the rest of his life. Both volumes were published by Engels after Marx's death. Volume II of Das Kapital was prepared and published by Engels in 1885 under the name Capital II: The Process of Circulation of Capital. Volume III of Das Kapital was published in 1894 under the name Capital III: The Process of Capitalist Production as a Whole. Theories of Surplus Value derived from the sprawling Economic Manuscripts of 1861–1863, a second draft for Das Kapital, the latter spanning volumes 30–34 of the Collected Works of Marx and Engels. Specifically, Theories of Surplus Value runs from the latter part of the Collected Works' thirtieth volume through the end of their thirty-second volume; meanwhile, the larger Economic Manuscripts of 1861–1863 run from the start of the Collected Works' thirtieth volume through the first half of their thirty-fourth volume. The latter half of the Collected Works' thirty-fourth volume consists of the surviving fragments of the Economic Manuscripts of 1863–1864, which represented a third draft for Das Kapital, and a large portion of which is included as an appendix to the Penguin edition of Das Kapital, volume I. A German-language abridged edition of Theories of Surplus Value was published in 1905 and in 1910. This abridged edition was translated into English and published in 1951 in London, but the complete unabridged edition of Theories of Surplus Value was published as the "fourth volume" of Das Kapital in 1963 and 1971 in Moscow. During the last decade of his life, Marx's health declined, and he became incapable of the sustained effort that had characterised his previous work. He did manage to comment substantially on contemporary politics, particularly in Germany and Russia. His Critique of the Gotha Programme opposed the tendency of his followers Wilhelm Liebknecht and August Bebel to compromise with the state socialist ideas of Ferdinand Lassalle in the interests of a united socialist party. This work is also notable for another famous Marx quote: "From each according to his ability, to each according to his needs". In a letter to Vera Zasulich dated 8 March 1881, Marx contemplated the possibility of Russia's bypassing the capitalist stage of development and building communism on the basis of the common ownership of land characteristic of the village mir. While admitting that Russia's rural "commune is the fulcrum of social regeneration in Russia", Marx also warned that in order for the mir to operate as a means for moving straight to the socialist stage without a preceding capitalist stage it "would first be necessary to eliminate the deleterious influences which are assailing it [the rural commune] from all sides". Given the elimination of these pernicious influences, Marx allowed that "normal conditions of spontaneous development" of the rural commune could exist. However, in the same letter to Vera Zasulich he points out that "at the core of the capitalist system ... lies the complete separation of the producer from the means of production". In one of the drafts of this letter, Marx reveals his growing passion for anthropology, motivated by his belief that future communism would be a return on a higher level to the communism of our prehistoric past.
He wrote: the historical trend of our age is the fatal crisis which capitalist production has undergone in the European and American countries where it has reached its highest peak, a crisis that will end in its destruction, in the return of modern society to a higher form of the most archaic type – collective production and appropriation. He added that "the vitality of primitive communities was incomparably greater than that of Semitic, Greek, Roman, etc. societies, and, a fortiori, that of modern capitalist societies". Before he died, Marx asked Engels to write up these ideas, which were published in 1884 under the title The Origin of the Family, Private Property and the State, partially based on Marx's notes to Lewis H. Morgan's book Ancient Society. Personal life Marx and von Westphalen had seven children together, but partly owing to the poor conditions in which they lived whilst in London, only three survived to adulthood. Their children were: Jenny Caroline (m. Longuet; 1844–1883); Jenny Laura (m. Lafargue; 1845–1911); Edgar (1847–1855); Henry Edward Guy ("Guido"; 1849–1850); Jenny Eveline Frances ("Franziska"; 1851–1852); Jenny Julia Eleanor (1855–1898) and one more who died before being named (July 1857). According to his son-in-law, Paul Lafargue, Marx was a loving father. In 1962, there were allegations that Marx fathered a son, Freddy, out of wedlock by his housekeeper, Helene Demuth, but the claim is disputed for lack of documented evidence. Helene Demuth was also largely entrusted as a confidante. In her obituary, penned by Friedrich Engels, her role is revealed as: "Marx took counsel of Helena Demuth, not only in difficult and intricate party matters, but even in respect of his economical writings". Marx frequently used pseudonyms, often when renting a house or flat, apparently to make it harder for the authorities to track him down. While in Paris, he used that of "Monsieur Ramboz", whilst in London, he signed off his letters as "A. Williams". His friends referred to him as "Moor", owing to his dark complexion and black curly hair, while he encouraged his children to call him "Old Nick" and "Charley". He also bestowed nicknames and pseudonyms on his friends and family, referring to Friedrich Engels as "General", his housekeeper Helene as "Lenchen" or "Nym", while one of his daughters, Jennychen, was referred to as "Qui Qui, Emperor of China" and another, Laura, was known as "Kakadou" or "the Hottentot". Marx drank heavily after joining the Trier Tavern Club drinking society in the 1830s, and continued to do so until his death. Marx was afflicted by poor health, what he himself described as "the wretchedness of existence", and various authors have sought to describe and explain it. His biographer Werner Blumenberg attributed it to liver and gall problems which Marx had in 1849 and from which he was never afterward free, exacerbated by an unsuitable lifestyle. The attacks often came with headaches, eye inflammation, neuralgia in the head, and rheumatic pains. A serious nervous disorder appeared in 1877 and protracted insomnia was a consequence, which Marx fought with narcotics. The illness was aggravated by excessive nocturnal work and faulty diet. Marx was fond of highly seasoned dishes, smoked fish, caviare, pickled cucumbers, "none of which are good for liver patients", but he also liked wine and liqueurs and smoked an enormous amount "and since he had no money, it was usually bad-quality cigars". 
From 1863, Marx complained a lot about boils: "These are very frequent with liver patients and may be due to the same causes". The abscesses were so bad that Marx could neither sit nor work upright. According to Blumenberg, Marx's irritability is often found in liver patients: The illness emphasised certain traits in his character. He argued cuttingly, his biting satire did not shrink at insults, and his expressions could be rude and cruel. Though in general Marx had blind faith in his closest friends, nevertheless he himself complained that he was sometimes too mistrustful and unjust even to them. His verdicts, not only about enemies but even about friends, were sometimes so harsh that even less sensitive people would take offence ... There must have been few whom he did not criticize like this ... not even Engels was an exception. According to Princeton historian Jerrold Seigel, in his late teens, Marx may have had pneumonia or pleurisy, the effects of which led to his being exempted from Prussian military service. In later life whilst working on Das Kapital (which he never completed), Marx suffered from a trio of afflictions. A liver ailment, probably hereditary, was aggravated by overwork, a bad diet, and lack of sleep. Inflammation of the eyes was induced by too much work at night. A third affliction, eruption of carbuncles or boils, "was probably brought on by general physical debility to which the various features of Marx's style of life – alcohol, tobacco, poor diet, and failure to sleep – all contributed. Engels often exhorted Marx to alter this dangerous regime". In Seigel's thesis, what lay behind this punishing sacrifice of his health may have been guilt about self-involvement and egoism, originally induced in Karl Marx by his father. In 2007, a retrodiagnosis of Marx's skin disease was made by dermatologist Sam Shuster of Newcastle University. For Shuster, the most probable explanation was that Marx suffered not from liver problems, but from hidradenitis suppurativa, a recurring infective condition arising from blockage of apocrine ducts opening into hair follicles. Shuster went on to consider the potential psychosocial effects of the disease, noting that the skin is an organ of communication and that hidradenitis suppurativa produces much psychological distress, including loathing and disgust and depression of self-image, mood, and well-being, feelings for which Shuster found "much evidence" in the Marx correspondence. Professor Shuster went on to ask himself whether the mental effects of the disease affected Marx's work and even helped him to develop his theory of alienation. Following the death of his wife Jenny in December 1881, Marx developed a catarrh that kept him in ill health for the last 15 months of his life. It eventually brought on the bronchitis and pleurisy that killed him in London on 14 March 1883, when he died a stateless person at age 64. Family and friends in London buried his body in Highgate Cemetery (East), London, on 17 March 1883 in an area reserved for agnostics and atheists. According to Francis Wheen, there were between nine and eleven mourners at his funeral. Research from contemporary sources identifies thirteen named individuals attending the funeral: Friedrich Engels, Eleanor Marx, Edward Aveling, Paul Lafargue, Charles Longuet, Helene Demuth, Wilhelm Liebknecht, Gottlieb Lemke, Frederick Lessner, G Lochner, Sir Ray Lankester, Carl Schorlemmer and Ernest Radford. 
A contemporary newspaper account claims that twenty-five to thirty relatives and friends attended the funeral. A writer in The Graphic noted: By a strange blunder ... his death was not announced for two days, and then as having taken place at Paris. The next day the correction came from Paris; and when his friends and followers hastened to his house in Haverstock Hill, to learn the time and place of burial, they learned that he was already in the cold ground. But for this secresy [sic] and haste, a great popular demonstration would undoubtedly have been held over his grave. Several of his closest friends spoke at his funeral, including Wilhelm Liebknecht and Friedrich Engels. Engels' speech included the passage: On the 14th of March, at a quarter to three in the afternoon, the greatest living thinker ceased to think. He had been left alone for scarcely two minutes, and when we came back we found him in his armchair, peacefully gone to sleep – but forever. Marx's surviving daughters Eleanor and Laura, as well as Charles Longuet and Paul Lafargue, Marx's two French socialist sons-in-law, were also in attendance. He had been predeceased by his wife and his eldest daughter, the latter dying a few months earlier in January 1883. Liebknecht, a founder and leader of the German Social Democratic Party, gave a speech in German, and Longuet, a prominent figure in the French working-class movement, made a short statement in French. Two telegrams from workers' parties in France and Spain were read out: from Jose Mesa y Leompart on behalf of the Madrid branch of the Partido Socialista Obrero Español, and from 'The Secretary, Lipine' of the Paris branch of the French Workers' Party. Together with Engels's speech, this constituted the entire programme of the funeral. Non-relatives attending the funeral included three communist associates of Marx: Friedrich Lessner, imprisoned for three years after the Cologne Communist Trial of 1852; G. Lochner, whom Engels described as "an old member of the Communist League"; and Carl Schorlemmer, a professor of chemistry in Manchester, a member of the Royal Society, and a communist activist involved in the 1848 Baden revolution. Another attendee of the funeral was Ray Lankester, a British zoologist who would later become a prominent academic. Marx left a personal estate valued for probate at £250, equivalent to £38,095 in 2024. Upon his own death in 1895, Engels left Marx's two surviving daughters a "significant portion" of his considerable estate, valued in 2024 at US$6.8 million. Marx and his family were reburied on a new site nearby in November 1954. The tomb at the new site, unveiled on 14 March 1956, bears the carved message: "Workers of All Lands Unite", the final line of The Communist Manifesto; and, from the 11th "Thesis on Feuerbach" (as edited by Engels), "The philosophers have only interpreted the world in various ways—the point however is to change it". The Communist Party of Great Britain (CPGB) had the monument, with a portrait bust by Laurence Bradshaw, erected; Marx's original tomb had borne only humble adornment. The Marxist historian Eric Hobsbawm remarked: "One cannot say Marx died a failure." Although he had not achieved a large following of disciples in Britain, his writings had already begun to make an impact on the left-wing movements in Germany and Russia.
Within twenty-five years of his death, the continental European socialist parties that acknowledged Marx's influence on their politics had made significant gains in representative democratic elections. Thought Marx's thought demonstrates influence from many sources. His view of history, which came to be called historical materialism (controversially adapted as the philosophy of dialectical materialism by Engels and Lenin), certainly shows the influence of Hegel's claim that one should view reality (and history) dialectically. However, whereas Hegel had thought in idealist terms, putting ideas in the forefront, Marx sought to conceptualise dialectics in materialist terms, arguing for the primacy of matter over idea. Where Hegel saw the "spirit" as driving history, Marx saw this as an unnecessary mystification, obscuring the reality of humanity and its physical actions shaping the world. He wrote that Hegelianism stood the movement of reality on its head, and that one needed to set it upon its feet. Despite his dislike of mystical terms, Marx used Gothic language in several of his works: in The Communist Manifesto he proclaims "A spectre is haunting Europe – the spectre of communism. All the powers of old Europe have entered into a holy alliance to exorcise this spectre", and in Das Kapital he refers to capital as "necromancy that surrounds the products of labour". Though inspired by French socialist and sociological thought, Marx criticised utopian socialists, arguing that their favoured small-scale socialistic communities would be bound to marginalisation and poverty and that only a large-scale change in the economic system could bring about real change. Other important contributions to Marx's revision of Hegelianism came from Engels's book, The Condition of the Working Class in England in 1844, which led Marx to conceive of the historical dialectic in terms of class conflict and to see the modern working class as the most progressive force for revolution, as well as from the social democrat Friedrich Wilhelm Schulz, who in Die Bewegung der Produktion described the movement of society as "flowing from the contradiction between the forces of production and the mode of production". Marx believed that he could study history and society scientifically, discerning tendencies of history and thereby predicting the outcome of social conflicts. Some followers of Marx, therefore, concluded that a communist revolution would inevitably occur. However, Marx famously asserted in the eleventh of his "Theses on Feuerbach" that "philosophers have only interpreted the world, in various ways; the point however is to change it" and he clearly dedicated himself to trying to alter the world. Marx's theories inspired a number of later theories and academic disciplines. Marx has been called "the first great user of critical method in social sciences", a characterisation stemming from his frequent use of polemics throughout his work to effect critiques of other thinkers. He criticised speculative philosophy, equating metaphysics with ideology. By adopting this approach, Marx attempted to separate key findings from ideological biases. This set him apart from many contemporary philosophers. Like Tocqueville, who described a faceless and bureaucratic despotism with no identifiable despot, Marx also broke with classical thinkers who spoke of a single tyrant and with Montesquieu, who discussed the nature of the single despot.
Instead, Marx set out to analyse "the despotism of capital". Fundamentally, Marx assumed that human history involves transforming human nature, which encompasses both human beings and material objects. Humans recognise that they possess both actual and potential selves. For both Marx and Hegel, self-development begins with an experience of internal alienation stemming from this recognition, followed by a realisation that the actual self, as a subjective agent, renders its potential counterpart an object to be apprehended. Marx further argues that by moulding nature in desired ways the subject takes the object as its own and thus permits the individual to be actualised as fully human. For Marx, human nature – Gattungswesen, or species-being – exists as a function of human labour. Fundamental to Marx's idea of meaningful labour is the proposition that for a subject to come to terms with its alienated object it must first exert influence upon literal, material objects in the subject's world. Marx acknowledges that Hegel "grasps the nature of work and comprehends objective man, authentic because actual, as the result of his own work", but characterises Hegelian self-development as unduly "spiritual" and abstract. Marx thus departs from Hegel by insisting that "the fact that man is a corporeal, actual, sentient, objective being with natural capacities means that he has actual, sensuous objects for his nature as objects of his life-expression, or that he can only express his life in actual sensuous objects". Consequently, Marx revises Hegelian "work" into material "labour" and, in the context of the human capacity to transform nature, into the term "labour power". The history of all hitherto existing society is the history of class struggles. — Karl Marx, The Communist Manifesto Marx had a special concern with how people relate to their own labour power. He wrote extensively about this in terms of the problem of alienation. As with the dialectic, Marx began with a Hegelian notion of alienation but developed a more materialist conception. Capitalism mediates social relationships of production (such as among workers or between workers and capitalists) through commodities, including labour, that are bought and sold on the market. For Marx, the possibility that one may give up ownership of one's own labour – one's capacity to transform the world – is tantamount to being alienated from one's own nature and it is a spiritual loss. Marx described this loss as commodity fetishism, in which the things that people produce, commodities, appear to have a life and movement of their own to which humans and their behaviour merely adapt. Commodity fetishism provides an example of what Engels called "false consciousness", which relates closely to the understanding of ideology. By "ideology", Marx and Engels meant ideas that reflect the interests of a particular class at a particular time in history, but which contemporaries see as universal and eternal. Marx and Engels's point was not only that such beliefs are at best half-truths but that they also serve an important political function. Put another way, the control that one class exercises over the means of production includes not only the production of food or manufactured goods but also the production of ideas (this provides one possible explanation for why members of a subordinate class may hold ideas contrary to their own interests).
Marx was an outspoken opponent of child labour, saying that British industries "could but live by sucking blood, and children's blood too", and that U.S. capital was financed by the "capitalized blood of children". Marx agreed with Ludwig Feuerbach that religion is a human construct reflecting human conditions ("man creates religion, religion does not create man"), but analysed this in historical, not abstract terms. He saw religion as both an expression of suffering and a protest against it. In his 1843 essay Critique of Hegel's Philosophy of Right, Marx sought to distance himself from Young Hegelians like Bruno Bauer, whose religion-focused critique, in his view, could not be a solution to human suffering without a transformative critique of society. Critique of religion would be ineffective without changing the real social conditions of which religion is only an expression. According to Shlomo Avineri, the famous passage from the introduction to this essay is, though often only partially quoted, "both more complex and more profound" than it would seem, and Marx here expressed "empathy, not scorn" for religious feelings: Religious suffering is, at one and the same time, the expression of real suffering and a protest against real suffering. Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people. The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up a condition that requires illusions. Similar to the later views of Max Weber, Marx believed that religion plays a legitimating function for the dominant classes by providing a divine sanction for inequality and existing social conditions, and that for subordinate classes religion offers an escape: like an opiate, alleviating pain but not offering a cure. Marx's gymnasium senior thesis at the Gymnasium zu Trier argued that religion had as its primary social aim the promotion of solidarity. But you Communists would introduce community of women, screams the whole bourgeoisie in chorus. The bourgeois sees in his wife a mere instrument of production. He hears that the means of production are to be exploited in common, and, naturally, can come to no other conclusion than that the lot of being common to all will likewise fall to the women. He has not even a suspicion that the real point aimed at is to do away with the status of women as mere instruments of production. Marx's thoughts on labour and its function in reproducing capital were related to the primacy he gave to social relations in determining the society's past, present and future. Critics have called this economic determinism. Labour is the precondition for the existence and accumulation of capital, which both shape the social system. For Marx, social change was driven by conflict between opposing interests, by parties situated in the historical situation of their mode of production. This became the inspiration for the body of work known as conflict theory. In his evolutionary model of history, he argued that human history began with free, productive and creative activities that were over time coerced and dehumanised, a trend most apparent under capitalism. 
Marx noted that this was not an intentional process, but rather due to the immanent logic of the current mode of production which demands more human labour (abstract labour) to reproduce the social relationships of capital. The organisation of society depends on means of production. The means of production are all things required to produce material goods, such as land, natural resources, and technology but not human labour. The relations of production are the social relationships people enter into as they acquire and use the means of production. Together, these compose the mode of production and Marx distinguished historical eras in terms of modes of production. Marx differentiated between base and superstructure, where the base (or substructure) is the economic system and superstructure is the cultural and political system. Marx regarded this mismatch between economic base and social superstructure as a major source of social conflict. Despite Marx's stress on the critique of capitalism and discussion of the new communist society that should replace it, his explicit critique is guarded, as he saw capitalism as an improvement over the societies of the past (slavery and feudalism). Marx never clearly discusses issues of morality and justice, but scholars agree that his work contained implicit discussion of those concepts. Marx's view of capitalism was two-sided. On one hand, in the 19th century's deepest critique of the dehumanising aspects of this system he noted that defining features of capitalism include alienation, exploitation and recurring, cyclical depressions leading to mass unemployment. On the other hand, he characterised capitalism as having "revolutionising, industrialising and universalising qualities of development, growth and progressivity" (by which Marx meant industrialisation, urbanisation, technological progress, increased productivity and growth, rationality, and scientific revolution) that are responsible for progress, in contrast to earlier forms of society. Marx considered the capitalist class to be one of the most revolutionary in history because it constantly improved the means of production, more so than any other class in history and was responsible for the overthrow of feudalism. Capitalism can stimulate considerable growth because the capitalist has an incentive to reinvest profits in new technologies and capital equipment. According to Marx, capitalists take advantage of the difference between the labour market and the market for whatever commodity the capitalist can produce. Marx observed that in practically every successful industry, input unit-costs are lower than output unit-prices. Marx called the difference "surplus value" and argued that it was based on surplus labour, the difference between what it costs to keep workers alive, and what they can produce. Although Marx describes capitalists as vampires sucking workers' blood, he notes that drawing profit is "by no means an injustice" since Marx, according to Allen W. Wood, "excludes any trans-epochal standpoint from which one can comment" on the morals of such particular arrangements. Marx also noted that even the capitalists themselves cannot go against the system. The problem is the "cancerous cell" of capital, understood not as property or equipment, but as the social relations between workers and owners (the selling and purchasing of labour power) – the societal system, or rather mode of production, in general. At the same time, Marx stressed that capitalism was unstable and prone to periodic crises. 
He suggested that over time capitalists would invest more and more in new technologies and less and less in labour. Since Marx believed that profit derived from surplus value appropriated from labour, he concluded that the rate of profit would fall as the economy grows. Marx believed that increasingly severe crises would punctuate this cycle of growth and collapse. Moreover, he believed that in the long-term, this process would enrich and empower the capitalist class and impoverish the proletariat. In section one of The Communist Manifesto, Marx describes feudalism, capitalism, and the role internal social contradictions play in the historical process: We see then: the means of production and of exchange, on whose foundation the bourgeoisie built itself up, were generated in feudal society. At a certain stage in the development of these means of production and of exchange, the conditions under which feudal society produced and exchanged ... the feudal relations of property became no longer compatible with the already developed productive forces; they became so many fetters. They had to be burst asunder; they were burst asunder. Into their place stepped free competition, accompanied by a social and political constitution adapted in it, and the economic and political sway of the bourgeois class. A similar movement is going on before our own eyes ... The productive forces at the disposal of society no longer tend to further the development of the conditions of bourgeois property; on the contrary, they have become too powerful for these conditions, by which they are fettered, and so soon as they overcome these fetters, they bring disorder into the whole of bourgeois society, endanger the existence of bourgeois property. Marx believed that those structural contradictions within capitalism necessitate its end, giving way to socialism, or a post-capitalistic, communist society: The development of modern industry, therefore, cuts from under its feet the very foundation on which the bourgeoisie produces and appropriates products. What the bourgeoisie, therefore, produces, above all, are its own grave-diggers. Its fall and the victory of the proletariat are equally inevitable. Thanks to various processes overseen by capitalism, such as urbanisation, the working class, the proletariat, should grow in numbers and develop class consciousness, in time realising that they can and must change the system. Marx believed that if the proletariat were to seize the means of production, they would encourage social relations that would benefit everyone equally, abolishing the exploiting class and introducing a system of production less vulnerable to cyclical crises. Marx argued in The German Ideology that capitalism will end through the organised actions of an international working class: Communism is for us not a state of affairs which is to be established, an ideal to which reality will have to adjust itself. We call communism the real movement which abolishes the present state of things. The conditions of this movement result from the premises now in existence. In this new society, alienation would end and humans would be free to act without being bound by selling their labour. It would be a democratic society, enfranchising the entire population. In such a utopian world, there would also be little need for a state, whose goal was previously to enforce alienation. 
Marx theorised that between capitalism and the establishment of a socialist/communist system there would exist a period of dictatorship of the proletariat – where the working class holds political power and forcibly socialises the means of production. As he wrote in his Critique of the Gotha Program, "between capitalist and communist society there lies the period of the revolutionary transformation of the one into the other. Corresponding to this is also a political transition period in which the state can be nothing but the revolutionary dictatorship of the proletariat". While he allowed for the possibility of peaceful transition in some countries with strong democratic institutional structures (such as Britain, the United States, and the Netherlands), he suggested that in other countries in which workers cannot "attain their goal by peaceful means" the "lever of our revolution must be force". Marx viewed Russian Tsarism as the main threat to European revolutions. During the Crimean War, Marx backed the Ottoman Empire and its allies Britain and France against Russia. He was absolutely opposed to Pan-Slavism, viewing it as an instrument of Russian foreign policy. Marx considered the Slavic nations except the Poles as 'counter-revolutionary'. Marx and Engels published in the Neue Rheinische Zeitung in February 1849: To the sentimental phrases about brotherhood which we are being offered here on behalf of the most counter-revolutionary nations of Europe, we reply that hatred of Russians was and still is the primary revolutionary passion among Germans; that since the revolution [of 1848] hatred of Czechs and Croats has been added, and that only by the most determined use of terror against these Slav peoples can we, jointly with the Poles and Magyars, safeguard the revolution. We know where the enemies of the revolution are concentrated, viz. in Russia and the Slav regions of Austria, and no fine phrases, no allusions to an undefined democratic future for these countries can deter us from treating our enemies as enemies. Then there will be a struggle, an "inexorable life-and-death struggle", against those Slavs who betray the revolution; an annihilating fight and ruthless terror – not in the interests of Germany, but in the interests of the revolution! Marx and Engels sympathised with the Narodnik revolutionaries of the 1860s and 1870s. When the Russian revolutionaries assassinated Tsar Alexander II of Russia, Marx expressed the hope that the assassination foreshadowed 'the formation of a Russian commune'. Marx supported the Polish uprisings against tsarist Russia. He said in a speech in London in 1867: In the first place the policy of Russia is changeless... Its methods, its tactics, its manoeuvres may change, but the polar star of its policy – world domination – is a fixed star. In our times only a civilised government ruling over barbarian masses can hatch out such a plan and execute it. ... There is but one alternative for Europe. Either Asiatic barbarism, under Muscovite direction, will burst around its head like an avalanche, or else it must re-establish Poland, thus putting twenty million heroes between itself and Asia and gaining a breathing spell for the accomplishment of its social regeneration. Marx supported the cause of Irish independence. In 1867, he wrote Engels: "I used to think the separation of Ireland from England impossible. I now think it inevitable. The English working class will never accomplish anything until it has got rid of Ireland. ... 
English reaction in England had its roots ... in the subjugation of Ireland." Marx spent some time in French Algeria, which had been invaded and made a French colony in 1830, and had the opportunity to observe life in colonial North Africa. He wrote about the colonial justice system, in which "a form of torture has been used (and this happens 'regularly') to extract confessions from the Arabs; naturally it is done (like the English in India) by the 'police'; the judge is supposed to know nothing at all about it." Marx was surprised by the arrogance of many European settlers in Algiers and wrote in a letter: when a European colonist dwells among the 'lesser breeds,' either as a settler or even on business, he generally regards himself as even more inviolable than handsome William I [a Prussian king]. Still, when it comes to bare-faced arrogance and presumptuousness vis-à-vis the 'lesser breeds,' the British and Dutch outdo the French. According to the Stanford Encyclopedia of Philosophy: Marx's analysis of colonialism as a progressive force bringing modernization to a backward feudal society sounds like a transparent rationalization for foreign domination. His account of British domination, however, reflects the same ambivalence that he shows towards capitalism in Europe. In both cases, Marx recognizes the immense suffering brought about during the transition from feudal to bourgeois society while insisting that the transition is both necessary and ultimately progressive. He argues that the penetration of foreign commerce will cause a social revolution in India. Marx discussed British colonial rule in India in the New-York Daily Tribune in 1853: There cannot remain any doubt but that the misery inflicted by the British on Hindostan [India] is of an essentially different and infinitely more intensive kind than all Hindostan had to suffer before. England has broken down the entire framework of Indian society, without any symptoms of reconstitution yet appearing... [however], we must not forget that these idyllic village communities, inoffensive though they may appear, had always been the solid foundation of Oriental despotism, that they restrained the human mind within the smallest possible compass, making it the unresisting tool of superstition. Legacy Marx's ideas have had a profound impact on world politics and intellectual thought, in particular in the aftermath of the 1917 Russian Revolution. Followers of Marx have often debated among themselves over how to interpret Marx's writings and apply his concepts to the modern world. The legacy of Marx's thought has become contested between numerous tendencies, each of which sees itself as Marx's most accurate interpreter. In the political realm, these tendencies include political theories such as Leninism, Marxism–Leninism, Trotskyism, Maoism, Luxemburgism, libertarian Marxism, and Open Marxism. Various currents have also developed in academic Marxism, often under the influence of other views, resulting in structuralist Marxism, historical materialism, phenomenological Marxism, analytical Marxism, and Hegelian Marxism. From an academic perspective, Marx's work contributed to the birth of modern sociology. He has been cited as one of the 19th century's three masters of the "school of suspicion", and as one of the three principal architects of modern social science. In contrast to other philosophers, Marx offered theories that could often be tested with the scientific method. 
Both Marx and Auguste Comte set out to develop scientifically justified ideologies in the wake of European secularisation and new developments in the philosophies of history and science. Working in the Hegelian tradition, Marx rejected Comtean sociological positivism in an attempt to develop a science of society. Karl Löwith considered Marx and Søren Kierkegaard to be the two greatest philosophical successors of Hegel. In modern sociological theory, Marxist sociology is recognised as one of the main classical perspectives. Isaiah Berlin considers Marx the true founder of modern sociology "in so far as anyone can claim the title". Beyond social science, he has also had a lasting legacy in philosophy, literature, the arts, and the humanities. Social theorists of the 20th and 21st centuries have pursued two main strategies in response to Marx. One move has been to reduce his social theory to its analytical core, known as analytical Marxism. Another, more common move has been to dilute the explanatory claims of Marx's social theory and emphasise the "relative autonomy" of aspects of social and economic life not directly related to Marx's central narrative of interaction between the development of the "forces of production" and the succession of "modes of production". This has been the neo-Marxist theorising adopted by historians inspired by Marx's social theory such as E. P. Thompson and Eric Hobsbawm. It has also been a line of thinking pursued by thinkers and activists such as Antonio Gramsci who have sought to understand the opportunities and the difficulties of transformative political practice, seen in the light of Marxist social theory. Marx's ideas had a profound influence on subsequent artists and art history, with avant-garde movements across literature, visual art, music, film, and theatre. Politically, Marx's legacy is more complex. Throughout the 20th century, revolutions in dozens of countries labelled themselves "Marxist"—most notably the Russian Revolution, which led to the founding of the Soviet Union. Major world leaders including Vladimir Lenin, Mao Zedong, Fidel Castro, Salvador Allende, Josip Broz Tito, Kwame Nkrumah, Jawaharlal Nehru, Nelson Mandela, Xi Jinping, Joseph Stalin and Thomas Sankara have all cited Marx as an influence. Beyond where Marxist revolutions took place, Marx's ideas have informed political parties worldwide. Many prominent communist revolutionaries and activists throughout the world such as Rosa Luxemburg, Bhagat Singh, Ernst Thälmann, Che Guevara, Chandra Shekhar Azad, Antonio Gramsci and Fred Hampton were deeply influenced by Marxist ideology. In countries associated with Marxism, political opponents have blamed Marx for millions of deaths, while others argue for a distinction between the legacy and influence of Marx specifically, and the legacy and influence of those who have shaped his ideas for political purposes. Arthur Lipow describes Marx and his collaborator Friedrich Engels as "the founders of modern revolutionary democratic socialism." The cities of Marks, Russia and Karl-Marx-Stadt, Germany, now known as Chemnitz, were named after Marx. In May 2018, to mark the bicentenary of his birth, a statue of him by leading Chinese sculptor Wu Weishan and donated by the Chinese government was unveiled in his birthplace of Trier, Germany. 
The then-European Commission president Jean-Claude Juncker defended Marx's memory, saying that today Marx "stands for things which he is not responsible for and which he didn't cause because many of the things he wrote down were redrafted into the opposite". In 2013, UNESCO added two documents with Marx's handwriting to its Memory of the World International Register. These are his annotated first edition of Das Kapital Volume 1 and a manuscript page from The Communist Manifesto. These are held among more of Marx's papers at the International Institute of Social History in Amsterdam. As well as influencing 20th century cinema, Marx's life and times and his principal works have all been represented in film as subjects in their own right. Films depicting Marx and his ideas range from documentary to fictional drama, art house and comedy. In 2017, The Young Karl Marx received good reviews for both its historical accuracy and its brio in dealing with intellectual life. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Famicom] | [TOKENS: 11992] |
Contents Nintendo Entertainment System The Nintendo Entertainment System (NES) is a home video game console developed and marketed by Nintendo. It was released as the Family Computer (Famicom)[note 1] in Japan on July 15, 1983, and as the NES in test markets in the United States on October 18, 1985, followed by a nationwide launch on September 27, 1986. The NES was distributed in Europe, Australia, and parts of Asia throughout the 1980s. It was Nintendo's first programmable home console, succeeding the Color TV-Game line of dedicated consoles, and primarily competed with Sega's Master System in the third generation of video game consoles. The engineer Masayuki Uemura designed the 8-bit Famicom after Nintendo's president Hiroshi Yamauchi called for a simple, cheap console that could run arcade games from cartridges. Its hardware was based on that of Nintendo's arcade game Donkey Kong (1981) and its controller design was reused from Nintendo's portable Game & Watch hardware. For Western markets, Lance Barr and Don James redesigned it as the NES to resemble a video cassette recorder. To aid its acceptance in stores, Nintendo released add-ons such as the Zapper, a light gun for shooting games, and R.O.B., a toy robot. In Japan, Nintendo released the Famicom Disk System add-on to distribute games on floppy disks, although it gradually reduced support for the peripheral due to hardware constraints. Nintendo released the NES in the aftermath of the video game crash of 1983. In Japan and North America, it quickly dominated and gave Nintendo a near-monopoly on the home console market. Unlike the previous market leader, Atari, Nintendo sought the support of third-party developers, such as Capcom, Hudson Soft, Konami, Namco, Enix, and Square. Its restrictive licensing terms, which included platform exclusivity and a five-game-per-year limit, led to accusations of antitrust violations in the US, culminating in a 1991 settlement with the Federal Trade Commission. Nintendo sold 61.91 million consoles; though dominant in Japan and North America, the NES performed less well in Europe,[g] where it faced strong competition from the Master System and home computers such as the Commodore 64 and ZX Spectrum. Nintendo released the Super Nintendo Entertainment System in 1990, although it continued to support the NES well into the 16-bit era. It ceased NES production in 1995 and Famicom production in 2003. The NES is regarded as one of the most influential consoles, as it helped revitalize the American gaming industry following the 1983 crash and pioneered the now-standard business model of licensing third-party developers to produce and distribute games. Several games released for the NES, including Super Mario Bros. (1985), The Legend of Zelda (1986), Dragon Warrior (1986), and Final Fantasy (1987), became major franchises. History The video game industry experienced rapid growth and popularity from the late 1970s to the early 1980s, marked by the golden age of arcade games and the second generation of consoles. Games like Space Invaders (1978) became a phenomenon across arcades worldwide, while home consoles such as the Atari 2600 and the Intellivision, and home computers such as the Commodore 64, gained a foothold in the American market. Many companies emerged to capitalize on the growing industry, including the card and toy company Nintendo. Nintendo president Hiroshi Yamauchi realized that breakthroughs in the electronics industry meant that entertainment products could be produced at lower prices. 
Companies such as Atari and Magnavox were already selling gaming devices for use with television sets to moderate success. Yamauchi negotiated a license with Magnavox for the patents on the technology used in the Magnavox Odyssey. Since Nintendo's operation was not yet sophisticated enough to design its own hardware, Yamauchi forged an alliance with Mitsubishi Electric and hired several Sharp Electronics employees to assist in developing the Color TV-Game 6 and the Color TV-Game 15 in Japan. This was followed by the handheld Game & Watch series. The successes of these consoles gave Yamauchi the confidence to expand Nintendo's influence in the fledgling video game industry. In 1978, Yamauchi split Nintendo into separate research and development divisions. He appointed Masayuki Uemura as head of Nintendo Research & Development 2. Yamauchi, through extensive discussions with Uemura and other engineers, recognized the potential of the developing console beyond gaming. He envisioned a home computer system disguised as a toy, which could significantly expand Nintendo's reach if it became popular with children. This popularity would drive demand for games, with Nintendo as the sole provider. Indeed, by 1980 several systems had already been released in Japan by both American and Japanese companies. Yamauchi tasked Uemura with developing a system that would be superior to its competitors and difficult to replicate for at least a year. Uemura's main challenge was economic rather than technological; Yamauchi wanted the system to be affordable enough for widespread household adoption, aiming for a price of ¥9,800 (less than $75) compared to existing machines priced at ¥30,000 to ¥50,000 ($200 to $350). The new system had to outperform other systems, both Japanese and American, while being significantly more affordable. As development progressed on the new video game system, engineers sought Yamauchi's guidance on its features. They questioned whether to include a disk drive, keyboard, or data port, as well as the potential for a modem, expanded memory, and other computer-like capabilities. Yamauchi ultimately instructed Uemura to prioritize simplicity and affordability, omitting these peripherals entirely. Game cartridges, which Uemura saw as "less intimidating" to consumers, were chosen as the format. The team designed the system with 2,000 bytes of random-access memory (RAM). The console's hardware was largely based on arcade video games, particularly the hardware for Namco's Galaxian (1979) and Nintendo's own Donkey Kong (1981), with the goal of matching their powerful sprite and scrolling capabilities in a home system. A test model was constructed in October 1982 to verify the functionality of the hardware, and work began on programming tools. Because 65xx CPUs had not been manufactured or sold in Japan by that time, no cross-development software was available, and it had to be developed from scratch. Early Famicom games were written on a PC-8001 computer. LEDs on a grid were used with a digitizer to design graphics, as no such software design tools existed at the time. The codename for the project was GameCom, but Masayuki Uemura's wife proposed the name Famicom, arguing that "In Japan, 'pasokon' is used to mean a personal computer, but it is neither a home nor personal computer. 
Perhaps we could say it is a family computer."[h] Meanwhile, Yamauchi decided that the console should use a red and white color scheme after seeing a hoarding for DX Antenna (a Japanese antenna manufacturer) that used those colors. The Famicom was influenced by the ColecoVision, Coleco's competition against the Atari 2600 in the United States; the ColecoVision's top-seller was a port of Nintendo's Donkey Kong. The project's chief manager Takao Sawano brought a ColecoVision home to his family, who were impressed by its smooth graphics, which contrasted with the flicker and slowdown commonly seen on Atari 2600 games. Uemura said the ColecoVision set the bar for the Famicom. The team, wanting to surpass the ColecoVision and match the more powerful Donkey Kong arcade hardware, took a Donkey Kong arcade cabinet to chip manufacturer Ricoh for analysis, which led to Ricoh producing the Picture Processing Unit (PPU) chip for the Famicom. During development, Yamauchi directed engineers to reduce costs by removing non-essential components. However, he insisted on including a low-cost circuit and connector that allowed the CPU to send or receive unmodified signals, enabling future hardware expansions such as modems or keyboards. This built-in capability led some within Nintendo to refer to the console as "Yamauchi's Trojan Horse": it entered homes as a simple gaming device with two controllers, and yet contained features far beyond its apparent function. A 1989 corporate report later acknowledged, "In the initial stages of [the system's] development, we foresaw these possibilities... we built a data communications function into the system." Lead engineer Masayuki Uemura credited luck for this foresight, while colleague Genyo Takeda remarked that Uemura's lack of experience allowed him to attempt what others might have deemed unfeasible. Design decisions were also carefully considered. Yamauchi took a hands-on role in determining the controller layout, casing shape, and overall aesthetic. The final design featured a directional pad and two buttons on the right controller, a microphone on the left controller, rounded edges, and a red and white color scheme deliberately made to appear more like a toy than a computer. Original plans called for the Famicom's cartridges to be the size of a cassette tape, but they ultimately ended up being twice as large. Careful design attention was paid to the cartridge connectors, as loose and faulty connections often plagued arcade machines. Because the design required 60 connection lines for memory and expansion, Nintendo decided to produce its own connectors. Each cartridge typically contained two primary chips: one for the game's program code (up to 32 kilobytes), and another for graphical data used to render on-screen characters (up to 8 kilobytes). Nintendo's R&D3 team designed the "UNROM" cartridge, which enabled larger memory capacities and the use of bank switching. This technique let a game store program and graphics data beyond what the console could address at once and switch it in dynamically as needed, thereby significantly expanding gameplay possibilities. At Gunpei Yokoi's suggestion, a cartridge eject lever was also added, not for functionality, but to amuse children. The Famicom design team initially considered arcade-style joysticks, and even dismantled existing models from American consoles, but ultimately rejected them due to concerns about durability and the risk of children stepping on them. 
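Bank switching of this kind is easiest to see in a short sketch. The C program below is a simplified, assumed model of how UNROM-style mappers are commonly described (a 16 KB switchable window in the CPU's address space plus a fixed final bank, with writes to the cartridge acting as the bank-select register); the names, bank count, and layout are illustrative and are not taken from the article.

```c
/* Illustrative sketch of cartridge bank switching: the CPU always sees the
 * same fixed-size window, but the cartridge can point that window at
 * different banks of a larger ROM. Details are simplified assumptions. */
#include <stdint.h>
#include <stdio.h>

#define BANK_SIZE 0x4000u                    /* 16 KB per PRG bank          */
#define NUM_BANKS 8u                         /* e.g. a 128 KB cartridge     */

typedef struct {
    uint8_t prg_rom[NUM_BANKS][BANK_SIZE];   /* entire cartridge ROM        */
    uint8_t bank_select;                     /* bank currently mapped in    */
} Cartridge;

/* CPU read from $8000-$FFFF: $8000-$BFFF is the switchable window,
 * $C000-$FFFF stays fixed to the last bank. */
static uint8_t cart_cpu_read(const Cartridge *c, uint16_t addr)
{
    if (addr < 0xC000)
        return c->prg_rom[c->bank_select][addr - 0x8000];
    return c->prg_rom[NUM_BANKS - 1][addr - 0xC000];
}

/* A CPU write into ROM space is interpreted by the mapper as a bank select. */
static void cart_cpu_write(Cartridge *c, uint16_t addr, uint8_t value)
{
    (void)addr;                              /* any address in $8000-$FFFF  */
    c->bank_select = value % NUM_BANKS;
}

int main(void)
{
    static Cartridge cart;                   /* zero-initialized ROM image  */
    cart.prg_rom[0][0] = 0x11;               /* marker bytes in two banks   */
    cart.prg_rom[3][0] = 0x33;

    cart_cpu_write(&cart, 0x8000, 0);        /* map bank 0 into the window  */
    printf("bank 0: %02X\n", cart_cpu_read(&cart, 0x8000));
    cart_cpu_write(&cart, 0x8000, 3);        /* switch to bank 3            */
    printf("bank 3: %02X\n", cart_cpu_read(&cart, 0x8000));
    return 0;
}
```

The point of the scheme is that the CPU's address space never grows; only the bank of cartridge data visible through the fixed window changes.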
Instead, they adopted the D-pad and two action button layout developed by R&D1 for their handheld Game & Watch series. As an early prototype, Katsuya Nakagawa attached a Game & Watch D-pad to the Famicom and found it comfortable and easy to use. To reduce costs, the controllers were hardwired to the console and stored in molded pockets on the case. A 15-pin expansion port was added to the front of the console so that an optional arcade-style joystick could be used. The second controller also included a microphone, which Uemura envisioned being used to make players' voices come through the TV speaker. On July 15, 1983, the console was released in Japan as the Home Cassette Type Video Game: Family Computer,[note 2] priced at ¥14,800 (¥20,718 in 2025) with three launch games, all of which were ports of popular Nintendo arcade games: Donkey Kong (1981), Donkey Kong Jr. (1982), and Popeye (1982). Although it was priced higher than originally intended, the Famicom remained less than half the cost of rival consoles. Backed by a robust marketing campaign, 500,000 units were sold within the first two months. However, a major fault emerged ahead of the critical Japanese New Year season, as reports began surfacing of consoles crashing during gameplay. Uemura and engineer Gunpei Yokoi traced the issue to a defective integrated circuit that could lock under specific data conditions. Upon reporting the issue to Yamauchi, staff proposed selectively replacing affected units. However, they were warned that a partial response could damage consumer trust and jeopardize Nintendo's first-mover advantage before competitors could respond. Yamauchi considered their input, then issued a decisive directive: "Recall them all." After a product recall and the release of a revised model with a new motherboard, the system's popularity soared. By the end of 1984, the Famicom had become the best-selling game console in Japan in what came to be called the "Famicom Boom".: 279, 285 Following the sale of the first million units, demand showed no signs of slowing. Japanese retailers inundated Nintendo with urgent requests for stock. Anticipation for new game releases reached unprecedented levels, with children lining up outside shops and games selling out almost immediately. This phenomenon, soon dubbed "Nintendomania", overwhelmed the supply chain and further increased demand. The Famicom's success quickly cleared the field of competition in Japan. Fourteen rival console manufacturers exited the market, and Sega's SG-1000, launched in Japan on the same day as the Famicom, failed to gain traction. At launch, Nintendo released only first-party games for the Famicom. However, in 1984, after being approached by Namco and Hudson Soft, the company agreed to allow third-party titles. Developers paid a 30% fee to cover console licensing and production costs, a revenue model that would later influence the video game industry for decades. Nintendo initially planned for the console to enter the North American market through a distribution agreement with Atari. The agreement was expected to be finalized at the Summer Consumer Electronics Show (CES) in June 1983. However, during the show, Atari discovered that Coleco was demonstrating an unlicensed port of Nintendo's Donkey Kong on its Adam computer system. Atari, believing this violated its exclusive license for the game, delayed the deal. 
Shortly afterward, Atari CEO Ray Kassar was fired, the deal fell apart, and Nintendo decided to market its system on its own.: 283–286 Nintendo believed that the Famicom name might not resonate with American consumers, and initially rebranded the console as the Advanced Video System (AVS). The AVS resembled a home computer rather than a "toy", featuring a built-in keyboard, a cassette-based data drive, and infrared wireless controllers.: 287 By positioning the console as a more "sophisticated" consumer electronics product, Nintendo aimed to distance itself from the recent failures of companies such as Atari, Coleco, and Mattel. The AVS was publicly demonstrated at the Winter CES in January 1985, but the reaction was lukewarm. While the hardware and games were praised, there was deep skepticism that the console could succeed in the United States, as the industry there was still recovering from the video game crash of 1983. Electronic Games magazine reported in March 1985 that the video game market in America "[had] virtually disappeared", and believed "[it] could be a miscalculation on Nintendo's part". With American retailers still wary of stocking game consoles after the 1983 crash, Yamauchi saw an opportunity to introduce the Famicom's hardware to North America through arcades. In 1984, Nintendo launched the VS. System, an arcade conversion system that featured ports of select Famicom games, with a focus on two-player competitive play. The VS. System became a major success, selling nearly 100,000 cabinets and becoming the highest-grossing arcade machine of 1985 in the United States. This success gave Nintendo the confidence to pursue a home console launch in North America, and provided a platform to test new titles to help shape the launch line-up. Nintendo of America designers Lance Barr and Don James were disappointed with the prototype console they received from Japan, which they nicknamed "the lunchbox". For the console's western redesign, they added a two-tone gray color scheme with a black stripe and red lettering, as well as a front-loading, zero insertion force slot modeled after a videocassette recorder which concealed the game cartridge once inserted. The redesigned console, now called the Nintendo Entertainment System (NES), was unveiled by Nintendo at the June 1985 Summer CES, and dropped the home computer features of the earlier AVS prototype while retaining its gray color scheme and boxy form factor. It also replaced the Famicom's hardwired controllers and the AVS's wireless ones with detachable wired controllers using proprietary 7-pin connectors. To avoid the language used by earlier game consoles, marketing manager Gail Tilden coined alternative terms for the NES's hardware, calling the cartridges "Game Paks" and the console itself the "Control Deck", which would later aid its acceptance in toy stores. To further distance the NES from previous consoles, Nintendo heavily promoted optional accessories, such as the Zapper light gun and the Robotic Operating Buddy (R.O.B.), to position the system as cutting-edge and sophisticated. While initial consumer interest in the console was limited, its peripherals drew significant attention. The NES launched in a limited test market in New York City on October 18, 1985, followed by Los Angeles in February 1986, and finally a full North American release on September 27, 1986. 
The launch line-up included 17 games: 10-Yard Fight, Baseball, Clu Clu Land, Duck Hunt, Excitebike, Golf, Gyromite, Hogan's Alley, Ice Climber, Kung Fu, Pinball, Soccer, Stack-Up, Super Mario Bros., Tennis, Wild Gunman, and Wrecking Crew.[i] Nintendo contracted with toy company Worlds of Wonder (WoW) to get the NES distributed in stores. WoW's aggressive sales tactics, which included requiring retailers to carry the NES in order to sell WoW's other popular toys, helped secure shelf space for the console. WoW salesman Jim Whims distinctly recalled delivering an ultimatum: "if you want to sell Teddy Ruxpin and you want to sell Lazer Tag, you're gonna sell Nintendo as well." WoW's efforts led to a successful first year for the NES; afterwards, Nintendo of America ended the distribution deal and hired WoW's sales team, taking over distribution directly. With the launch of the NES, Nintendo redefined the home video game market in North America. The 1983 crash had been fueled by misleading marketing, lack of quality control, and hardware fragmentation. In contrast, Nintendo introduced strict standards for software approval, packaging, and quality. It used consistent branding with genre icons, box art that reflected in-game graphics, and the "Official Nintendo Seal of Quality". To enforce its standards, the company used the 10NES lock-out chip to deter production of unlicensed games. In Europe and Oceania, the NES was released in two separate marketing regions. The first consisted of mainland Europe (excluding Italy), where distribution was handled by several different companies, with Nintendo responsible for manufacturing. The NES saw an early launch in Europe in 1986, although most European countries received the console in 1987. In Scandinavia, it was released on September 1, 1986, and was distributed by Bergsala. In the Netherlands, it was released in the last quarter of 1987, and was distributed by Bandai BV. In France, it was released in October 1987, and in Spain most likely in 1988 through distributor Spaco. In 1987, Mattel handled distribution for the second region, consisting of the United Kingdom, Ireland, Italy, Australia and New Zealand. In other European countries, distribution was handled by smaller companies like Bienengräber in Germany, ASD in France, Concentra in Portugal, Itochu in Greece and Cyprus, and Stadlbauer in Austria, Switzerland, and the former Eastern Bloc. In Poland, the NES had its release on October 6, 1994, along with the SNES and the Game Boy. In November 1994, Nintendo signed an agreement with Steepler to permit the continued sale of the Dendy, an unauthorized hardware clone of the Famicom, in Russia in exchange for also distributing the SNES. Nintendo anticipated that the NES would have a 25 percent market share in Europe, and saw particular potential in the United Kingdom. The console struggled to gain a foothold in the region, however, in part due to the widespread popularity of the ZX Spectrum, which had already established a strong home computing and gaming culture. The affordability, local software support, and versatility of the Spectrum also made it a dominant choice among British consumers, which severely limited the NES's market penetration. The console would see an increase in share in 1990 with the release of the Teenage Mutant Hero Turtles bundle, which was released against Nintendo of America's wishes but ultimately allowed the console's European sales to overtake those of the Master System. 
In Brazil, the console was officially released in late 1993 by Playtronic. However, the Brazilian market had been dominated by unlicensed NES clones that were either locally made or smuggled from Taiwan. One of the most successful Brazilian NES clones was the Phantom System by Gradiente, which licensed Nintendo products in the country for the following decade. The sales of officially licensed products in the region were low due to the abundance of clones, the console's official launch coming after the SNES, and the high prices of Nintendo's licensed products. Outside of Japan, Bangladesh, Indonesia, Malaysia, Nepal, Pakistan, Singapore, Sri Lanka and Thailand received an "Asian version" of the front-loader NES, although imported Famicom systems were still prevalent. Due to import restrictions, NES consoles in India and South Korea were rebranded and distributed by local licensees. The Korean version is known as the Hyundai Comboy, and the Indian version is known as the Samurai Electronic TV Game System.[j] India was the third region outside of North America and Japan to officially receive the NES. It was produced locally by Samurai Electronics in North India, and was released to strong initial sales in the region, selling 3,000 units per month. However, in the early 1990s, when retailers began promoting bootleg consoles, the console saw a significant drop in sales, selling 300 units per month. The NES was released in several retail bundles throughout its commercial life. For its 1985 American test launch, the initial offering was the Deluxe Set, which retailed for US$179.99 (equivalent to $540 in 2025) and included the Control Deck, two controllers, the NES Zapper light gun, the R.O.B. robotic accessory, and two Game Paks: Gyromite and Duck Hunt. Ahead of the console's nationwide launch in 1986, Nintendo introduced a basic Control Deck set with two controllers, bundled with Super Mario Bros. for US$99.99 (equivalent to $280 in 2025). In 1988, the Deluxe Set was replaced by the Action Set, which retailed for US$99.99 (equivalent to $270 in 2025) and bundled the Control Deck with two controllers, the NES Zapper, and a dual Game Pak containing Super Mario Bros. and Duck Hunt. 1988 also saw the introduction of the Power Set, which added the Power Pad floor mat game controller and replaced the dual cartridge with a triple Game Pak featuring Super Mario Bros., Duck Hunt, and World Class Track Meet. In 1990, Nintendo released the Sports Set, which included the Control Deck, four controllers, an NES Satellite infrared wireless multitap adapter, and a dual Game Pak containing Super Spike V'Ball and Nintendo World Cup. In 1992, the Challenge Set debuted at US$89.99 (equivalent to $210 in 2025), featuring the Control Deck and two controllers, bundled with Super Mario Bros. 3. Finally, in October 1993, Nintendo released a redesigned version of the console, known as the New-Style NES or NES-101, in North America, Australia, and Japan. This version included a single redesigned "dogbone"-shaped controller, and retailed for US$49.99 (equivalent to $110 in 2025) in North America before its discontinuation in 1995. In Australia, the console was bundled with a triple Game Pak featuring Super Mario Bros., Tetris, and Nintendo World Cup, and sold for A$79.99, or A$69.99 without the bundled Game Pak. On August 14, 1995, Nintendo discontinued the Nintendo Entertainment System in both North America and Europe. 
In North America, replacements for the original front-loading NES were available for $25 in exchange for a broken system until at least December 1996, under Nintendo's Power Swap program. In September 2003, Nintendo discontinued the Famicom in Japan, alongside the Super Famicom and disk rewriting services for the Famicom Disk System. The last Famicom model, serial number HN11033309, was manufactured on September 25, 2003; it was kept by Nintendo and subsequently loaned to the organizers of Level X, a video game exhibition held from December 4, 2003, to February 8, 2004, at the Tokyo Metropolitan Museum of Photography, for a Famicom retrospective in commemoration of the console's 20th anniversary. Nintendo offered repair services for the Famicom in Japan until 2007, when it was discontinued due to a shortage of available parts. Hardware Although all versions of the Famicom and NES include essentially similar hardware, they vary in physical characteristics. The original Famicom's design is predominantly white plastic with a dark red trim; it featured a top-loading cartridge slot, grooves on both sides of the deck in which the hardwired game controllers could be placed when not in use, and a 15-pin expansion port located on the unit's front panel for accessories. In contrast, the design of the original NES features a more subdued gray, black, and red color scheme, with a front-loading cartridge slot covered by a small, hinged door that can be opened to insert or remove a cartridge and closed at other times, and an expansion port on the bottom of the unit. The NES also includes the 10NES lock-out chip, and incorporates a matching chip validation check in its cartridge connector. In late 1993, Nintendo introduced a redesigned version of the Famicom and NES (known officially as the New Famicom in Japan and the New-Style NES in the US) to complement the Super Famicom and SNES, to prolong interest in the console, and to reduce costs. The redesigned NES features a top-loading cartridge slot and omits the 10NES lock-out chip to avoid reliability issues with the original console; the redesign also omits AV output. Conversely, the redesigned Famicom features AV output, and introduces detachable game controllers, which ultimately omitted microphone functionality as a result. The redesigned Famicom and NES models are cosmetically similar, aside from the presence of a cartridge "bump" on the NES model, which the Famicom model lacks to accommodate its shorter cartridges and the RAM Adapter for the Famicom Disk System. Sharp Corporation produced three licensed variants of the Famicom in Japan, all of which prominently display the shortened moniker rather than the official name, Family Computer.[h] One variant was a television set with an integrated Famicom; originally released in 1983 as the My Computer TV in 14-inch (36 cm) and 19-inch (48 cm) models, it was later released in the United States in 1989 as a 19-inch model named the Video Game Television. Another variant is the Twin Famicom console, which was released in 1986 and combines a Famicom with a Famicom Disk System. Sharp then produced the Famicom Titler in 1989; intended for video capture and production, it features internal RGB video generation and video output via S-Video, as well as inputs for adding subtitles and voice-overs. A thriving market of unlicensed NES hardware clones emerged during the climax of the console's popularity. 
Initially, such clones were popular in markets with weak copyright laws and countries in which Nintendo issued its systems after "famiclones" became well-known, making legal products difficult to market or create brand awareness for. In particular, the Dendy (Russian: Де́нди), an unlicensed hardware clone produced in Taiwan and sold in the former Soviet Union by Steepler, emerged as the most popular console of its time, eventually selling six million units. In Poland, the Pegasus clone, distributed by Bobmark International, sold more than a million units. In China, a reported 30 million units were sold until late 1995. A range of Famicom clones was marketed in Latin America during the late 1980s and 1990s under the name "Family Game", resembling the original hardware design. The Ending-Man Terminator clone enjoyed popularity in the Eastern Bloc, as well as in parts of Africa, Asia, and Latin America. The unlicensed clone market flourished following Nintendo's discontinuation of the NES. Some of these surpass the functionality of the original hardware, such as PocketFami, a portable system with a color LCD screen. Others have been produced for certain specialized markets, such as a personal computer with a keyboard and basic word processing software. These unauthorised clones have been helped by the invention of the so-called NES-on-a-chip. Nintendo's design styling for the NES's North American release was made deliberately different from that of other game consoles. The company wished to distinguish their product from those of competitors and avoid the generally poor reputation that game consoles had acquired following the video game crash of 1983. One result of this philosophy was to disguise the cartridge slot design as a front-loading zero-insertion force (ZIF) cartridge socket, designed to resemble the front-loading mechanism of a videocassette recorder. However, when a user inserts the cartridge, the force of pressing it into place bends the contact pins slightly and presses the cartridge's ROM board back into the cartridge. Frequent insertion and removal of cartridges can wear out the pins, and the ZIF design has proven to be more prone to interference by dirt and dust than an industry-standard card edge connector. The design problems were exacerbated by Nintendo's choice of materials. The console slot nickel connector springs wear out due to their design, and the game cartridge's brass plated nickel connectors are also prone to tarnishing and oxidation. Nintendo sought to fix these problems by redesigning the next generation Super Nintendo Entertainment System (SNES) as a top loader similar to the Famicom. Many users reportedly tried to alleviate issues caused by corrosion by blowing into the cartridges and then reinserting them, which conversely sped up the tarnishing due to moisture. The Famicom as released in Japan contains no lock-out hardware, which led to unlicensed cartridges (both legitimate and bootleg) becoming extremely common in Japan and East Asia. To combat bootlegs, Nintendo attempted to promote its "Seal of Quality" in these regions to identify licensed games, but bootleg Famicom games continued to be produced even after Nintendo moved production onto the Super Famicom, effectively extending the lifetime of the original Famicom. The original NES, released for western countries in 1985, contains the 10NES lock-out chip, which prevents the console from running cartridges unapproved by Nintendo. 
The inclusion of the 10NES chip was a result of the 1983 North American video game crash, which was partially caused by a market flooded with uncontrolled publishing of poor-quality home console games. Nintendo sought to use the lock-out chip to restrict games to only those they licensed for the system. This means of protection worked in combination with Nintendo's "Seal of Quality", which a developer had to acquire before they would be able to have access to the required 10NES information prior to publication of a game. Original NES consoles sold in different regions have different lock-out chips, thereby enforcing regional lock-out regardless of TV signal compatibility. Such regions include North America; most of continental Europe (PAL-B); Asia; and the British Isles, Italy, and Australasia (PAL-A). Problems with the 10NES lock-out chip frequently result in one of the console's most common issues: the blinking red power light, in which the system appears to turn itself on and off repeatedly because the 10NES would reset the console once per second. The lock-out chip requires constant communication with the chip in the game to work.: 247 The console's main central processing unit (CPU) was produced by Ricoh, which manufactured different versions for NTSC and PAL regions; NTSC consoles have a 2A03 clocked at 1.79 MHz, and PAL consoles have a 2A07 clocked at 1.66 MHz. Both CPUs are unlicensed variants of the MOS Technology 6502, an 8-bit microprocessor prevalent in contemporary home computers and consoles. Nintendo ostensibly disabled the 6502's binary-coded decimal mode on them to avoid patent infringement against or licensing fees towards MOS Technology, which was owned by then-rival Commodore International. The CPU has access to 2 KB of onboard work RAM. The console's graphics are handled by a Ricoh 2C02, a processor known as the Picture Processing Unit (PPU) that is clocked at 5.37 MHz. A derivative of the Texas Instruments TMS9918 (a video display controller used in the ColecoVision), the PPU features 2 KB of video RAM, 256 bytes of on-die "object attribute memory" (OAM) to store sprite display information on up to 64 sprites, and 28 bytes of RAM to store information on the YIQ-based color palette; the console can display up to 25 colors simultaneously out of 54 usable colors. The console's standard display resolution is 256 × 240 pixels, though video output options vary between models. The original Famicom features only radio frequency (RF) modulator output, and the NES additionally supports composite video via RCA connectors.[k] The redesigned Famicom omits the RF modulator entirely, only outputting composite video via a proprietary "multi-out" connector first introduced on the Super Famicom/SNES; conversely, the redesigned NES features RF modulator output only, though a version of the model including the "multi-out" connector was produced in rare quantities. The console produces sound via an audio processing unit (APU) integrated into the processor. It supports a total of five sound channels: two pulse wave channels, one triangle wave channel, one white noise channel, and one DPCM (differential pulse-code modulation) channel for sample playback. Audio playback speed is dependent on the CPU clock rate, which is set by a crystal oscillator. 
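The 256-byte OAM figure works out to 64 sprites of four bytes each. The sketch below shows one common way to model that layout in C; the field meanings follow widely available 2C02 documentation and are offered as an assumed illustration, not as a definition drawn from the article.

```c
/* Illustrative model of the PPU's 256-byte object attribute memory (OAM)
 * as 64 four-byte sprite entries. Field layout follows commonly documented
 * 2C02 behaviour and is an assumption for illustration. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t y;          /* sprite top edge (scanline)                        */
    uint8_t tile;       /* index of the 8x8 pattern used for the sprite      */
    uint8_t attributes; /* palette (bits 0-1), priority (bit 5),
                           horizontal flip (bit 6), vertical flip (bit 7)    */
    uint8_t x;          /* sprite left edge (pixel column)                   */
} SpriteEntry;

#define MAX_SPRITES 64
static SpriteEntry oam[MAX_SPRITES];        /* 64 * 4 = 256 bytes of OAM     */

int main(void)
{
    /* Place one example sprite near the centre of the 256 x 240 screen. */
    oam[0] = (SpriteEntry){ .y = 116, .tile = 0x01, .attributes = 0x02, .x = 124 };

    printf("OAM size: %zu bytes\n", sizeof oam);            /* prints 256 */
    printf("sprite 0 at (%u, %u) using tile %u\n",
           (unsigned)oam[0].x, (unsigned)oam[0].y, (unsigned)oam[0].tile);
    return 0;
}
```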
The game controller for both the NES and the Famicom has an oblong brick-like design with a simple four button layout: two round buttons labelled "A" and "B", a "START" button, and a "SELECT" button. Additionally, the controllers use the cross-shaped D-pad, designed by Nintendo employee Gunpei Yokoi for Game & Watch systems, to replace the bulkier joysticks of controllers used by earlier gaming consoles.: 279 The original model Famicom features two game controllers, both of which are hardwired to the back of the console.[l] The second controller lacks the Start and Select buttons, and instead features a small microphone; however, few games use this feature. The earliest produced Famicom units have square A and B buttons; issues with them getting stuck when pressed down led Nintendo to change their shape to a circular design in subsequent units following the console's recall. In contrast to the Famicom's hardwired controllers, the NES has two proprietary seven-pin ports on the front of the console to support detachable controllers and third-party peripherals. The controllers bundled with the NES are identical and include the Start and Select buttons, lacking the microphone on the original Famicom's second controller. The cables for NES controllers are also generally three times longer than their Famicom counterparts. Several special controllers are intended for use with specific games but not commonly used. Such peripherals include the NES Zapper (a light gun), R.O.B. (a toy robot),: 297 and the Power Pad (a dance pad).: 226 The original Famicom has a deepened DA-15 expansion port on the front of the unit to accommodate them. Two official advanced controllers were produced for the NES: the NES Advantage, an arcade controller produced by Asciiware and licensed by Nintendo of America; and the NES Max, a controller with grip handles and a "cycloid" sliding-disc D-pad in place of the traditional one. Both controllers have a "Turbo" feature that simulates multiple rapid presses for the A and B buttons; the NES Max has manually pressed Turbo buttons, and the NES Advantage offers toggle buttons for Turbo functionality, along with knobs that adjust the firing rate of each button. The latter also includes a "Slow" button that rapidly pauses games; however, this function is not intended for games that invoke a pause menu or screen. The standard controller was redesigned for the introduction of the New-Style NES in 1993. This version retained detachable controller ports and the original button layout, but the shape was changed to loosely resemble that of the Super Famicom/SNES controller; its shape has led to it being nicknamed the "dog bone" controller. Nintendo created a knitting machine that interfaced with the NES and showed it at CES in 1987 for "business feedback", although the accessory was ultimately not released as a product. Nintendo spokesperson Howard Phillips demoed it for Toys "R" Us in the late 1980s, and an advertisement used the headline "Now you're knitting with power!" in reference to the slogan used by Nintendo at the time. Few of the numerous peripheral devices and software packages for the Famicom were released outside Japan. The Famicom 3D System, an active shutter 3D headset peripheral released in 1987, enabled the ability to play stereoscopic video games. It was a commercial failure and never released outside Japan; users described the headset as bulky and uncomfortable. 
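Because the standard pad has exactly eight inputs (four D-pad directions plus A, B, Select, and Start), its state is conventionally handled as a single byte that the console reads out one bit at a time. The C sketch below illustrates that idea; the bit order and the latch-then-shift reading pattern follow common NES programming references and are assumptions here, not something stated in the article.

```c
/* Sketch of reading the eight controller buttons as one byte: latch all
 * button states at once, then shift them out one bit per read in the order
 * A, B, Select, Start, Up, Down, Left, Right. Ordering is an assumption
 * taken from common NES programming documentation. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

typedef struct {
    bool a, b, select, start, up, down, left, right;
} Pad;

/* Pack the current button states into a single byte, bit 0 = A. */
static uint8_t pad_latch(const Pad *p)
{
    return (uint8_t)(p->a      << 0 | p->b     << 1 |
                     p->select << 2 | p->start << 3 |
                     p->up     << 4 | p->down  << 5 |
                     p->left   << 6 | p->right << 7);
}

int main(void)
{
    Pad pad = { .a = true, .right = true };  /* A and Right held down */
    uint8_t shift = pad_latch(&pad);

    /* The console performs eight successive reads, one button per read. */
    static const char *names[8] =
        { "A", "B", "Select", "Start", "Up", "Down", "Left", "Right" };
    for (int i = 0; i < 8; i++) {
        printf("%-6s %s\n", names[i], (shift & 1) ? "pressed" : "-");
        shift >>= 1;                         /* shift out the next bit */
    }
    return 0;
}
```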
Seven games are compatible with the glasses, with three of them developed by Square; two titles received worldwide releases as Rad Racer and The 3-D Battles of WorldRunner. Family BASIC is an implementation of BASIC for the Famicom, packaged with a keyboard. Similar in concept to the Atari 2600's BASIC Programming cartridge, it allows the user to write programs, especially games, which can be saved on an included cassette recorder. Nintendo of America rejected releasing Famicom BASIC in the US, due to the NES's primary marketing demographic being children. The Family Computer Network System connected a Famicom to a now-defunct proprietary network in Japan which provided content such as financial services. A dial-up modem was reportedly being produced for the NES in a partnership with Fidelity Investments, but was ultimately not released. By 1986, the cost and size limitations of ROM chips used in the Famicom's ROM cartridges were apparent, with no new advancements present to address them. With this in mind, Nintendo looked at the personal computer (PC) market, where the floppy disk was gaining wide adoption as a computer data storage medium. Partnering with Mitsumi to develop a floppy disk add-on for the Famicom based on the latter's Quick Disk format, Nintendo officially released it as the Family Computer Disk System (or Famicom Disk System) in Japan on February 21, 1986, at a retail price of ¥15,000. The advantages of the format (called "Disk Card") were apparent on launch. It had more than triple the data storage capacity of the then-largest cartridge (used for Super Mario Bros.), introduced game save capability, and had lower production costs compared to cartridges, which resulted in lower retail prices. The add-on also provided a new wavetable synthesis sound channel and more data storage for the Famicom's audio sample channel. Taking advantage of the disk's re-writability, Nintendo set up Disk Writer interactive kiosks at retail stores throughout Japan; at each kiosk, consumers could buy new games to rewrite onto their old disks or onto new disks. Disk Fax kiosks allowed players to submit their high scores on special blue disks for contests and rankings, predating online leaderboards by several years. Although Nintendo committed to exclusively releasing games on the Disk System after its release, numerous external issues plagued its long-term viability. Just four months after launch, Capcom released a Famicom port of Makaimura (known as Ghosts 'n Goblins in the US) on a cartridge with more data storage capacity than what was possible on Disk Cards, nullifying one of the Disk System's major advantages by using discrete logic chips to perform bank switching. Nintendo also demanded half of the copyright ownership for each game it selected for release on the Disk System, resulting in developers electing to remain on cartridge instead as the latter gained functionality previously considered unique to the former. Developers disliked the lower profit margin of the Disk Writer kiosks, and retailers complained of their use of valuable space as demand for the format waned. Usage of a floppy disk-based medium brought about further complications; Disk Cards were more fragile than cartridges and were prone to data corruption from magnetic exposure. Their unreliability was exacerbated by their lack of a shutter, which Nintendo substituted with a wax sleeve and clear keep case to reduce costs; blue disks and later Disk Cards included shutters.
The rubber belt-based disk drives were also unreliable, with cryptic error codes complicating troubleshooting; even when fully functional, players accustomed to cartridges were annoyed with the introduction of loading times and disk flipping. Furthermore, the rewritable nature of the format resulted in rampant software piracy, with Nintendo's attempts at anti-piracy measures quickly defeated. Despite selling close to two million Disk System units in 1986, Nintendo only managed to increase the total to 4.4 million units by 1990, falling well short of internal projections. By then, the Disk System was rendered obsolete due to advancements in ROM cartridge production, such as memory mapping chips[m] for expanded data storage capacity, battery-backed SRAM (static random-access memory) for game saving, and declining overall production costs. Nintendo alluded to a western release for the Disk System, going so far as to successfully file a US patent for it and having the Famicom's cartridge pins used by its RAM Adapter for enhanced audio rerouted to the NES's seldom-used bottom expansion port. However, such a release never materialized due to the Disk System's lackluster reception in Japan. Most of its games were re-released with workarounds on cartridge for both the Famicom and NES, without the enhanced audio. Although the last game for the Disk System was released in December 1992, Nintendo continued offering repair and rewrite services for it until September 2003. The NES Test Station diagnostics machine, an NES-based unit designed for testing NES hardware, components, and games, was introduced in 1988. It was only provided for use in World of Nintendo boutiques as part of the Nintendo World Class Service program. Visitors were to bring items to test with the station, and could be assisted by a store technician or employee. The NES Test Station's front has a Game Pak slot and connectors for testing various components (AC adapter, RF switch, Audio/Video cable, NES Control Deck, accessories and games), with a centrally located selector knob to choose which component to test. The unit itself weighs approximately 5.3 kilograms (11.7 lb) and connects to a television via a combined A/V and RF Switch cable. By actuating the green button, a user can toggle between an A/V Cable or RF Switch connection. The television it is connected to (typically 11" to 14") is meant to be placed atop it. Games The NES uses a 72-pin design, compared to 60 pins on the Famicom. To reduce costs and inventory, some early games released in North America are simply Famicom cartridges attached to an adapter to fit inside the NES hardware. Early NES cartridges are held together with five small slotted screws. The back of the cartridge bears a label with handling instructions. Production and software revision codes were imprinted as stamps on the back label to correspond with the software version and producer. All licensed NTSC and PAL cartridges are a standard shade of gray plastic, with the exceptions of The Legend of Zelda and Zelda II: The Adventure of Link, which were manufactured in gold plastic cartridges. Unlicensed cartridges were produced in black, robin egg blue, and gold, and are all slightly different shapes than standard NES cartridges. Nintendo also produced yellow plastic cartridges for internal use at Nintendo Service Centers, although these "test carts" were never made available for purchase.
All licensed US cartridges were manufactured by Nintendo, Konami, and Acclaim.[citation needed] Famicom cartridges are shaped slightly differently. Unlike NES games, official Famicom cartridges were produced in many colors of plastic. Adapters, similar in design to the popular accessory Game Genie, are available that allow Famicom games to be played on an NES. In Japan, several companies manufactured the cartridges for the Famicom. This allowed these companies to develop customized chips designed for specific purposes, such as superior sound and graphics. Nintendo's near-monopoly on the home video game market left it with a dominant influence over the industry. Unlike Atari, which never actively pursued third-party developers (and even went to court in an attempt to force Activision to cease production of Atari 2600 games), Nintendo had anticipated and encouraged the involvement of third-party software developers, albeit strictly on its own terms. To this end, a 10NES authentication chip is in every console and licensed cartridge. If the console's chip cannot detect a counterpart chip inside the cartridge, the game does not load. Nintendo portrayed these measures as intended to protect consumers from what it saw as poor-quality games, and placed a golden seal of approval on all licensed games released for the system. Nintendo was not as restrictive as Sega, which did not permit third-party publishers until Mediagenic in late summer 1988. Nintendo's intention was to reserve a large part of NES game revenue for itself. The company required that it be the sole manufacturer of all cartridges, and that the publisher had to pay in full before the cartridges for that game were produced. Cartridges could not be returned to Nintendo, so publishers assumed all the risk. As a result, some publishers lost more money due to distress sales of remaining inventory at the end of the NES era than they ever earned in profits from sales of the games. Because Nintendo controlled the production of all cartridges, it was able to enforce strict rules on its third-party developers, who were required to sign a contract that would obligate them to develop exclusively for the system, order at least 10,000 cartridges, and only make five games per year. The global 1988 shortage of DRAM and ROM chips reportedly caused Nintendo to only permit an average of 25% of publishers' requests for cartridges, with some receiving much higher amounts and others receiving almost none. GameSpy noted that Nintendo's "iron-clad terms" made the company many enemies during the 1980s. Some developers tried to circumvent the five game limit by creating additional company brands like Konami's Ultra Games label; others tried circumventing the 10NES chip. Due to its strict licensing requirements, Nintendo was accused of antitrust violations. The United States Department of Justice and several states began probing the company's business practices, leading to the involvement of Congress and the Federal Trade Commission (FTC). The FTC conducted an extensive investigation which included interviewing hundreds of retailers. During the FTC probe, Nintendo changed the terms of its publisher licensing agreements to eliminate the two-year rule and other restrictive terms. Nintendo and the FTC settled the case in April 1991, with Nintendo being required to send vouchers giving a $5 discount off a new game to every person who had purchased an NES game between June 1988 and December 1990.
GameSpy remarked that Nintendo's punishment was particularly weak given the case's findings, although it has been speculated that the FTC did not want to damage the video game industry in the United States. With the NES near the end of its life, many third-party publishers such as Electronic Arts supported upstart competing consoles with less strict licensing terms, such as the Sega Genesis and the PlayStation, which respectively eroded and took over Nintendo's dominance in the home console market. Consoles from Nintendo's rivals in the post-SNES era had always enjoyed much stronger third-party support than Nintendo, which relied more heavily on first-party games. Companies that refused to pay the licensing fee or were rejected by Nintendo found ways to circumvent the console's authentication system. Most of these companies created circuits that use a voltage spike to temporarily disable the 10NES chip.: 286 A few unlicensed games released in Europe and Australia are in the form of a dongle to connect to a licensed game and use its 10NES chip for authentication. To combat this, Nintendo of America threatened to revoke the supply of licensed games from retailers who sold unlicensed games, and multiple revisions were made to the NES PCBs to prevent unlicensed cartridges from working. Atari Games took a different approach with its console game subsidiary Tengen, who attempted to reverse engineer the lock-out chip to develop its own "Rabbit" chip. Tengen also obtained a description of the lock-out chip from the United States Patent and Trademark Office by falsely claiming that it was required to defend against present infringement claims. Nintendo successfully sued Tengen for copyright infringement; however, Tengen's antitrust claims against Nintendo were never decided. Color Dreams made Christian video games under the subsidiary name Wisdom Tree. Historian Steven Kent wrote that "Wisdom Tree presented Nintendo with a prickly situation. The general public did not seem to pay close attention to the court battle with Atari Games, and industry analysts were impressed with Nintendo's legal acumen; but going after a tiny company that published innocuous religious games was another story.": 400 As the NES grew in popularity and entered millions of American homes, some small video rental shops began buying their own copies of NES games and renting them out to customers for around the same price as a video cassette rental for a few days. Nintendo received no profit from the practice, beyond the initial cost of their game; unlike movie rentals, a newly released game could circulate and be available for rent on the same day. Nintendo took steps to stop game rentals, but did not take any formal legal action until Blockbuster Video began to make game rentals a large-scale service. Nintendo claimed that allowing customers to rent games would significantly hurt sales and drive up the cost of games. Nintendo notably lost the lawsuit, but did win on a claim of copyright infringement. Blockbuster was banned from including photocopies of original, copyrighted instruction booklets with its rented games. In compliance with the ruling, Blockbuster printed its own short instructions, usually in the form of a small booklet, card, or label on the back of the rental box, which explained a game's basic premise and controls. Other video rental shops, however, continued the practice of renting video games. 
Reception By 1988, industry observers stated that the NES's popularity had grown so quickly that the market for Nintendo cartridges was larger than all home computer software combined.: 347 Compute! reported in 1989 that Nintendo had sold seven million NES systems in 1988 alone, almost as many as the number of Commodore 64s sold in that system's first five years on the market. "Computer game makers [are] scared stiff", the magazine said, stating that Nintendo's popularity caused most competitors to have poor sales during the previous holiday season, and resulted in serious financial problems for some. In June 1989, Peter Main, Nintendo of America's vice president of marketing, said that the Famicom was present in 37% of households in Japan. By 1990, the NES was present in 30% of households in the United States, compared to 23% for all personal computers. By 1990, the NES had outsold all previously released consoles worldwide. In the early 1990s, some predicted that competition from technologically superior systems such as the 16-bit Mega Drive would mean the immediate end of the NES's dominance. Instead, during the first year of the Famicom's successor, the Super Famicom (named Super Nintendo Entertainment System outside Japan), the Famicom was the second highest-selling video game console in Japan, outselling the newer and more powerful PC Engine and Mega Drive by a wide margin. The console remained popular in Japan and North America until late 1993, when the demand for new NES software abruptly plummeted. The final licensed games for the console were Adventure Island IV in Japan (released on June 24, 1994), Wario's Woods in North America (December 10, 1994), and The Lion King in Europe (May 25, 1995). In the wake of ever decreasing sales and the lack of new games, Nintendo of America officially discontinued the NES in 1995. Nintendo produced new Famicom units in Japan until September 25, 2003, and continued to repair Famicom consoles until October 31, 2007, attributing the discontinuation of support to insufficient supplies of parts. The NES was initially not as successful in Europe during the late 1980s, when it was outsold by the Master System and ZX Spectrum in the United Kingdom. By 1990, the Master System was the highest-selling console in Europe, even as the NES was beginning to have a fast-growing user base in the UK. During the early 1990s, NES sales caught up with and narrowly overtook the Master System overall in Western Europe; however, the Master System maintained its lead in several markets such as the UK, Belgium, and Spain. Legacy The NES was released two years after the video game crash of 1983, when many retailers and adult consumers regarded electronic games as a passing fad,: 280 so many believed at first that the NES would soon fade. Before the NES and Famicom, Nintendo was known as a moderately successful Japanese toy and playing card manufacturer, but the console's popularity helped the company grow into an internationally recognized name almost synonymous with video games as Atari had been, and also set the stage for Japanese dominance of the video game industry in the 1980s and 1990s. With the NES, Nintendo also changed the relationship between console manufacturers and third-party software developers by restricting developers from publishing and distributing software without licensed approval. 
This led to higher-quality games, which helped change the attitude of a public that had grown weary from poorly produced games for earlier systems.: 306–307 The hardware design of the NES is also very influential. Nintendo chose the name "Nintendo Entertainment System" for the US market and redesigned the system so it would not give the appearance of a child's toy. The front-loading cartridge input allowed it to be used more easily in a TV stand with other entertainment devices such as a videocassette recorder. The system's hardware limitations led to design principles that still influence the development of modern video games. Many prominent game franchises originated on the NES, including Nintendo's own Super Mario Bros.,: 57 The Legend of Zelda,: 353 and Metroid,: 357 as well as Capcom's Mega Man, Konami's Castlevania,: 358 Square's Final Fantasy,: 95 and Enix's Dragon Quest.: 222 The imagery of the NES, especially its controller, has become a popular motif for a variety of products, including Nintendo's Game Boy Advance. The original NES controller has become one of the most recognizable symbols of the console. Nintendo has mimicked the look of the controller in several other products, from promotional merchandise to limited edition versions of the Game Boy Advance. At the Tokyo Game Show in 2023, the Famicom was bestowed "The Minister of Economy, Trade and Industry Award" in honor of the console's influence and for laying down the foundations for the game industry. In 2011, IGN named the NES the greatest video game console of all time. The NES can be emulated on many other systems. The earliest known NES emulator was known simply as the Family Computer Emulator. Developed by Haruhisa Udagawa, it was made available in 1990 for the FM Towns computer. The earliest emulator for IBM PC compatibles was the Japanese-only Pasofami. It was soon followed by iNES, which is available in English and is cross-platform, in 1996. It was described as being the first NES emulation software that could be used by a non-expert. The first version of NESticle, an unofficial MS-DOS-based emulator, was released on April 3, 1997. Nintendo offers licensed emulation of select NES games via its Virtual Console service for the Wii, Nintendo 3DS, and Wii U, and via its Nintendo Classics service for Nintendo Switch and Nintendo Switch 2. On July 14, 2016, Nintendo announced the November 2016 launch of a miniature replica of the NES, known as the Nintendo Entertainment System: NES Classic Edition in the United States and as the Nintendo Classic Mini: Nintendo Entertainment System in Europe and Australia. The emulation-based console, released on November 10, 2016, includes 30 pre-installed games from the NES library, including the Super Mario Bros. and The Legend of Zelda series. The system has HDMI display output and a new replica controller, which can also connect to the Wii Remote for use with Virtual Console games. It was discontinued in North America on April 13, 2017, followed by the rest of the world on April 15, 2017. However, Nintendo announced in September 2017 that the NES Classic Mini would return to production on June 29, 2018, only to be discontinued again permanently by December of that year. See also Notes References Bibliography External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Gini_coefficient] | [TOKENS: 8110] |
Contents Gini coefficient Empirical methods Prescriptive and policy In economics, the Gini coefficient (/ˈdʒiːni/ JEE-nee), also known as the Gini index or Gini ratio, is a measure of statistical dispersion intended to represent the income inequality, the wealth inequality, or the consumption inequality within a nation or a social group. It was developed by Italian statistician and sociologist Corrado Gini. The Gini coefficient measures the inequality among the values of a frequency distribution, such as income levels. A Gini coefficient of 0 reflects perfect equality, where all income or wealth values are the same. In contrast, a Gini coefficient of 1 (or 100%) reflects maximal inequality among values, where a single individual has all the income while all others have none. Corrado Gini proposed the Gini coefficient as a measure of inequality of income or wealth. For OECD countries in the late 20th century, considering the effect of taxes and transfer payments, the income Gini coefficient ranged between 0.24 and 0.49, with Slovakia being the lowest and Mexico the highest. African countries had the highest pre-tax Gini coefficients in 2008–2009, with South Africa having the world's highest, estimated to be 0.63 to 0.7. However, this figure drops to 0.52 after social assistance is taken into account and drops again to 0.47 after taxation. Slovakia has the lowest Gini coefficient, with a Gini coefficient of 0.232. Various sources have estimated the Gini coefficient of the global income in 2005 to be between 0.61 and 0.68. There are multiple issues in interpreting a Gini coefficient, as the same value may result from many different distribution curves. The demographic structure should be taken into account to mitigate this. Countries with an aging population or those with an increased birth rate experience an increasing pre-tax Gini coefficient even if real income distribution for working adults remains constant. Many scholars have devised over a dozen variants of the Gini coefficient. History The Italian statistician Corrado Gini developed the Gini coefficient and published it in his 1912 paper Variabilità e mutabilità (English: variability and mutability). Building on the work of American economist Max Lorenz, Gini proposed using the difference between the hypothetical straight line depicting perfect equality and the actual line depicting people's incomes as a measure of inequality. In this paper, he introduced the concept of simple mean difference as a measure of variability. He then applied the simple mean difference of observed variables to income and wealth inequality in his work On the measurement of concentration and variability of characters in 1914. Here, he presented the concentration ratio, which further developed into today's Gini coefficient. Secondly, Gini observed that improving methods introduced by Lorenz, Chatelain, or Séailles could also achieve his proposed ratio. In 1915, Gaetano Pietra introduced a geometrical interpretation between Gini's proposed ratio and between the observed area of concentration and maximum concentration. This altered version of the Gini coefficient became the most commonly used inequality index in upcoming years. According to data from the OECD, the Gini coefficient was first officially used country-wide in Canada in the 1970s. Canadian index of income inequality ranged from 0.303 to 0.284 from 1976 to the end of the 1980s. The OECD has published more data on countries since the start of the 21st century. 
The Central European countries of Slovenia, Czechia, and Slovakia have had the lowest inequality index of all OECD countries ever since the 2000s. Scandinavian countries also frequently appeared at the top of the equality list in recent decades. Definition The Gini coefficient is an index for the degree of inequality in the distribution of income/wealth, used to estimate how far a country's wealth or income distribution deviates from an equal distribution. The Gini coefficient is usually defined mathematically based on the Lorenz curve, which plots the proportion of the total income of the population (y-axis) that is cumulatively earned by the bottom x% of the population (see diagram). The line at 45 degrees thus represents perfect equality of incomes. The Gini coefficient can then be thought of as the ratio of the area that lies between the line of equality and the Lorenz curve (marked A in the diagram) over the total area under the line of equality (marked A and B in the diagram); i.e., G = A/(A + B). If there are no negative incomes, it is also equal to 2A and 1 − 2B due to the fact that A + B = 0.5. Assuming non-negative income or wealth for all, the Gini coefficient's theoretical range is from 0 (total equality) to 1 (absolute inequality). This measure is often rendered as a percentage, spanning 0 to 100. However, if negative values are factored in, as in cases of debt, the Gini index could exceed 1. Typically, we presuppose a positive mean or total, precluding a Gini coefficient below zero. An alternative approach is to define the Gini coefficient as half of the relative mean absolute difference, which is equivalent to the definition based on the Lorenz curve. The mean absolute difference is the average absolute difference of all pairs of items of the population, and the relative mean absolute difference is the mean absolute difference divided by the average, x̄, to normalize for scale. If xi is the wealth or income of person i, and there are n persons, then the Gini coefficient G is given by: G = (∑i ∑j |xi − xj|) / (2n²x̄). When the income (or wealth) distribution is given as a continuous probability density function p(x), the Gini coefficient is again half of the relative mean absolute difference: G = (1/(2μ)) ∫∫ |x − y| p(x) p(y) dx dy, where μ = ∫ x p(x) dx is the mean of the distribution, the integrals run from −∞ to ∞, and the lower limits of integration may be replaced by zero when all incomes are positive. Calculation While the income distribution of any particular country will not correspond perfectly to the theoretical models, these models can provide a qualitative explanation of the income distribution in a nation given the Gini coefficient. The extreme cases are represented by the most equal possible society in which every person receives the same income (G = 0), and the most unequal society (with N individuals) where a single person receives 100% of the total income and the remaining N − 1 people receive none (G = 1 − 1/N). A simple case assumes just two levels of income, low and high. If the high income group is a proportion u of the population and earns a proportion f of all income, then the Gini coefficient is f − u. A more graded distribution with these same values u and f will always have a higher Gini coefficient than f − u. For example, if the wealthiest u = 20% of the population has f = 80% of all income (see Pareto principle), the income Gini coefficient is at least 60%.
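As a rough illustration of the definitions above, the sample Gini coefficient can be computed directly as half of the relative mean absolute difference. The following sketch is not from the article; the function name and the example values are purely illustrative, and it reproduces the extreme cases and the two-level example just described:

```python
def gini(values):
    """Sample Gini coefficient: half of the relative mean absolute difference.

    `values` are the non-negative incomes or wealths of n individuals.
    """
    values = list(values)
    n = len(values)
    mean = sum(values) / n
    if mean == 0:
        return 0.0
    # Mean absolute difference over all ordered pairs (O(n^2); fine for small samples).
    mad = sum(abs(x - y) for x in values for y in values) / (n * n)
    return mad / (2 * mean)

print(gini([1, 1, 1, 1]))            # perfect equality -> 0.0
print(gini([0, 0, 0, 100]))          # one person has everything -> 1 - 1/N = 0.75
# Two-level case: the richest u = 20% earn f = 80% of all income -> G = f - u = 0.6
print(gini([2.5] * 8 + [40.0] * 2))  # 0.6
```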
In another example, if u = 1% of the world's population owns f = 50% of all wealth, the wealth Gini coefficient is at least 49%. In some cases, this equation can be applied to calculate the Gini coefficient without direct reference to the Lorenz curve. For example (taking y to indicate the income or wealth of a person or household), the Gini coefficient can also be considered as half the relative mean absolute difference of the y values. For a random sample S with values y1 ≤ y2 ≤ … ≤ yn, the sample Gini coefficient is a consistent estimator of the population Gini coefficient, but is not in general unbiased. In simplified form, for such a sorted sample, G = (2 ∑i i yi)/(n ∑i yi) − (n + 1)/n. There does not exist a sample statistic that is always an unbiased estimator of the population Gini coefficient. For a discrete probability distribution with probability mass function f(yi), i = 1, …, n, where f(yi) is the fraction of the population with income or wealth yi > 0, the Gini coefficient is G = (1/(2μ)) ∑i ∑j f(yi) f(yj) |yi − yj|, where μ = ∑i f(yi) yi. If the points with non-zero probabilities are indexed in increasing order (yi < yi+1), then G = 1 − (∑i f(yi)(Si−1 + Si))/Sn, where Si = ∑j≤i f(yj) yj and S0 = 0. When the population is large, the income distribution may be represented by a continuous probability density function f(x) where f(x) dx is the fraction of the population with wealth or income in the interval dx about x. If F(x) = ∫0x f(t) dt is the cumulative distribution function for f(x) and L(x) = (1/μ) ∫0x t f(t) dt is the Lorenz function, then the Lorenz curve L(F) may be represented as a function parametric in L(x) and F(x), and the value of B can be found by integration: B = ∫01 L(F) dF. The Gini coefficient can also be calculated directly from the cumulative distribution function of the distribution F(y). Defining μ as the mean of the distribution, then specifying that F(y) is zero for all negative values, the Gini coefficient is given by G = (1/μ) ∫0∞ F(y)(1 − F(y)) dy = 1 − (1/μ) ∫0∞ (1 − F(y))² dy. The latter result comes from integration by parts. (Note that this formula can be applied when there are negative values if the integration is taken from minus infinity to plus infinity.) The Gini coefficient may be expressed in terms of the quantile function Q(F) (inverse of the cumulative distribution function: Q(F(x)) = x). Since the Gini coefficient is independent of scale, if the distribution function can be expressed in the form f(x,φ,a,b,c...) where φ is a scale factor and a, b, c... are dimensionless parameters, then the Gini coefficient will be a function only of a, b, c.... For example, for the exponential distribution, which is a function of only x and a scale parameter, the Gini coefficient is a constant, equal to 1/2. For some functional forms, the Gini index can be calculated explicitly. For example, if y follows a log-normal distribution with the standard deviation of logs equal to σ, then G = erf(σ/2), where erf is the error function (since G = 2Φ(σ/√2) − 1, where Φ is the cumulative distribution function of a standard normal distribution).
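The log-normal closed form can be checked numerically against the sample estimator. The sketch below is illustrative only; the value of σ, the sample size, and the seed are arbitrary choices, not figures from the article:

```python
import math
import random

sigma = 0.8
closed_form = math.erf(sigma / 2)   # G = erf(sigma / 2) for a log-normal distribution

random.seed(0)
sample = sorted(random.lognormvariate(0.0, sigma) for _ in range(20_000))
n = len(sample)
mean = sum(sample) / n
# Equivalent O(n log n) form of the pairwise-difference estimator for sorted data.
sample_gini = sum((2 * i - n - 1) * y for i, y in enumerate(sample, start=1)) / (n * n * mean)

print(round(closed_form, 3), round(sample_gini, 3))  # both approximately 0.43
```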
In the table below, some examples for probability density functions with support on [0, ∞) are shown. The Dirac delta distribution represents the case where everyone has the same wealth (or income); it implies no variations between incomes.[citation needed] Sometimes the entire Lorenz curve is not known, and only values at certain intervals are given. In that case, the Gini coefficient can be approximated using various techniques for interpolating the missing values of the Lorenz curve. If (Xk, Yk) are the known points on the Lorenz curve, with the Xk indexed in increasing order (Xk−1 < Xk), and the Lorenz curve is approximated on each interval as a line between consecutive points, then the area B can be approximated with trapezoids, and G ≈ 1 − ∑k (Xk − Xk−1)(Yk + Yk−1) is the resulting approximation for G. More accurate results can be obtained using other methods to approximate the area B, such as approximating the Lorenz curve with a quadratic function across pairs of intervals or building an appropriately smooth approximation to the underlying distribution function that matches the known data. If the population mean and boundary values for each interval are also known, these can also often be used to improve the accuracy of the approximation. The Gini coefficient calculated from a sample is a statistic, and its standard error, or confidence intervals for the population Gini coefficient, should be reported. These can be calculated using bootstrap techniques, which are mathematically complicated and computationally demanding even in an era of fast computers. Economist Tomson Ogwang made the process more efficient by setting up a "trick regression model" in which respective income variables in the sample are ranked, with the lowest income being allocated rank 1. The model then expresses the rank (dependent variable) as the sum of a constant A and a normal error term whose variance is inversely proportional to yk. G can then be expressed as a function of the weighted least squares estimate of the constant A, and this can be used to speed up the calculation of the jackknife estimate for the standard error. Economist David Giles argued that the standard error of the estimate of A can be used to derive the estimate of G directly without using a jackknife. This method only requires using ordinary least squares regression after ordering the sample data. The results compare favorably with the estimates from the jackknife with agreement improving with increasing sample size. However, it has been argued that this depends on the model's assumptions about the error distributions and the independence of error terms. These assumptions are often not valid for real data sets. There is still ongoing debate surrounding this topic. Guillermina Jasso and Angus Deaton independently proposed the following formula for the Gini coefficient: G = (N + 1)/(N − 1) − (2/(N(N − 1)μ)) ∑i Pi Xi, where μ is the mean income of the population, Pi is the income rank P of person i, with income X, such that the richest person receives a rank of 1 and the poorest a rank of N. This effectively gives higher weight to poorer people in the income distribution, which allows the Gini to meet the Transfer Principle. Note that the Jasso-Deaton formula rescales the coefficient so that its value is one if all the Xi are zero except one. Note, however, Allison's reply arguing for division by N² instead. FAO explains another version of the formula.
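Returning to the grouped-data case, the trapezoid approximation described above can be sketched in a few lines. This is a minimal illustration, not from the article; the function name and the quintile shares used in the example are assumptions:

```python
def gini_from_lorenz_points(points):
    """Approximate G from known Lorenz-curve points (X_k, Y_k).

    X_k is the cumulative population share and Y_k the cumulative income share,
    both increasing and ending at (1, 1). The curve is treated as piecewise
    linear, so the area B is a sum of trapezoids and
    G ~ 1 - sum_k (X_k - X_{k-1}) * (Y_k + Y_{k-1}).
    """
    pts = [(0.0, 0.0)] + sorted(points)
    twice_area_b = sum((x1 - x0) * (y1 + y0) for (x0, y0), (x1, y1) in zip(pts, pts[1:]))
    return 1.0 - twice_area_b

# Hypothetical quintile income shares, poorest to richest.
shares = [0.05, 0.10, 0.15, 0.25, 0.45]
lorenz, cum = [], 0.0
for k, s in enumerate(shares, start=1):
    cum += s
    lorenz.append((k / len(shares), cum))
print(round(gini_from_lorenz_points(lorenz), 3))  # 0.38; a lower bound if incomes vary within quintiles
```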
Generalized inequality indices The Gini coefficient and other standard inequality indices reduce to a common form. Perfect equality—the absence of inequality—exists when and only when the inequality ratio, rj = xj/x̄, equals 1 for all j units in some population (for example, there is perfect income equality when everyone's income xj equals the mean income x̄, so that rj = 1 for everyone). Measures of inequality, then, are measures of the average deviations of the rj from 1; the greater the average deviation, the greater the inequality. Based on these observations, the inequality indices have this common form: Inequality = ∑j pj f(rj), where pj weights the units by their population share, and f(rj) is a function of the deviation of each unit's rj from 1, the point of equality. The insight of this generalized inequality index is that inequality indices differ because they employ different functions of the distance of the inequality ratios (the rj) from 1. Of income distributions Gini coefficients of income are calculated on a market income and a disposable income basis. The Gini coefficient on market income—sometimes referred to as a pre-tax Gini coefficient—is calculated on income before taxes and transfers. It measures inequality in income without considering the effect of taxes and social spending already in place in a country. The Gini coefficient on disposable income—sometimes referred to as the after-tax Gini coefficient—is calculated on income after taxes and transfers. It measures inequality in income after considering the effect of taxes and social spending already in place in a country. For OECD countries over the 2008–2009 period, the Gini coefficient (pre-taxes and transfers) for a total population ranged between 0.34 and 0.53, with South Korea the lowest and Italy the highest. The Gini coefficient (after-taxes and transfers) for a total population ranged between 0.25 and 0.48, with Denmark the lowest and Mexico the highest. For the United States, the country with the largest population among OECD countries, the pre-tax Gini index was 0.49, and the after-tax Gini index was 0.38 in 2008–2009. The OECD average for total populations in OECD countries was 0.46 for the pre-tax income Gini index and 0.31 for the after-tax income Gini index. Taxes and social spending that were in place in the 2008–2009 period in OECD countries significantly lowered effective income inequality, and in general, "European countries—especially Nordic and Continental welfare states—achieve lower levels of income inequality than other countries." Using the Gini can help quantify differences in welfare and compensation policies and philosophies. However, it should be borne in mind that the Gini coefficient can be misleading when used to make political comparisons between large and small countries or those with different immigration policies (see limitations section). The Gini coefficient for the entire world has been estimated by various parties to be between 0.61 and 0.68. The graph shows the values expressed as a percentage in their historical development for a number of countries. According to UNICEF, Latin America and the Caribbean region had the highest net income Gini index in the world at 48.3, on an unweighted average basis in 2008. The remaining regional averages were: sub-Saharan Africa (44.2), Asia (40.4), Middle East and North Africa (39.2), Eastern Europe and Central Asia (35.4), and High-income Countries (30.9).
Using the same method, the United States is claimed to have a Gini index of 36, while South Africa had the highest income Gini index score of 67.8. Taking the income distribution of all human beings, worldwide income inequality rose steadily from the early 19th century onward. There was a steady increase in the global income inequality Gini score from 1820 to 2002, with a significant increase between 1980 and 2002. This trend appears to have peaked and begun a reversal with rapid economic growth in emerging economies, particularly in the large populations of BRIC countries. The table below presents the estimated world income Gini coefficients over the last 200 years, as calculated by Milanovic. More detailed data from similar sources plot a continuous decline since 1988. This is attributed to globalization increasing incomes for billions of poor people, mostly in countries like China and India. Developing countries like Brazil have also improved basic services like health care, education, and sanitation; others like Chile and Mexico have enacted more progressive tax policies. Of social development The Gini coefficient is widely used in fields as diverse as sociology, economics, health science, ecology, engineering, and agriculture. For example, in social sciences and economics, in addition to income Gini coefficients, scholars have published education Gini coefficients and opportunity Gini coefficients. The education Gini index estimates the inequality in education for a given population. It is used to discern trends in social development through educational attainment over time. A study across 85 countries by three World Bank economists, Vinod Thomas, Yan Wang, and Xibo Fan, estimated Mali had the highest education Gini index of 0.92 in 1990 (implying very high inequality in educational attainment across the population), while the United States had the lowest education inequality Gini index of 0.14. Between 1960 and 1990, China, India and South Korea had the fastest drop in education inequality Gini Index. They also claim the education Gini index for the United States slightly increased over the 1980–1990 period. Though India's education Gini Index fell from 1960 through 1990, most of the population still had not received any education, while 10 percent of the population received more than 40% of the total educational hours in the nation. This means that a large portion of capable children in the country are not receiving the support necessary to allow them to become positive contributors to society. This will lead to a deadweight loss to the national society because there are many people who are underdeveloped and underutilized. Similar in concept to the Gini income coefficient, the Gini opportunity coefficient measures inequality in opportunities. The concept builds on Amartya Sen's suggestion that inequality coefficients of social development should be premised on the process of enlarging people's choices and enhancing their capabilities, rather than on the process of reducing income inequality. Kovacevic, in a review of the Gini opportunity coefficient, explained that the coefficient estimates how well a society enables its citizens to achieve success in life, where success is based on a person's choices, efforts, and talents, not on a background defined by a set of predetermined circumstances at birth, such as gender, race, place of birth, and parents' income, which are beyond the control of that individual.
In 2003, Roemer reported Italy and Spain exhibited the largest opportunity inequality Gini index amongst advanced economies. In 1978, Anthony Shorrocks introduced a measure based on income Gini coefficients to estimate income mobility. This measure, generalized by Maasoumi and Zandvakili, is now generally referred to as Shorrocks index, sometimes as Shorrocks mobility index or Shorrocks rigidity index. It attempts to estimate whether the income inequality Gini coefficient is permanent or temporary and to what extent a country or region enables economic mobility to its people so that they can move from one (e.g., bottom 20%) income quantile to another (e.g., middle 20%) over time. In other words, the Shorrocks index compares inequality of short-term earnings, such as the annual income of households, to inequality of long-term earnings, such as 5-year or 10-year total income for the same households. Shorrocks index is calculated in several different ways, a common approach being from the ratio of income Gini coefficients between short-term and long-term for the same region or country. A 2010 study using social security income data for the United States since 1937 and Gini-based Shorrock's indices concludes that income mobility in the United States has had a complicated history, primarily due to the mass influx of women into the American labor force after World War II. Income inequality and income mobility trends have been different for men and women workers between 1937 and the 2000s. When men and women are considered together, the Gini coefficient-based Shorrocks index trends imply long-term income inequality has been substantially reduced among all workers, in recent decades for the United States. Other scholars, using just 1990s data or other short periods have come to different conclusions. For example, Sastre and Ayala conclude from their study of income Gini coefficient data between 1993 and 1998 for six developed economies that France had the least income mobility, Italy the highest, and the United States and Germany intermediate levels of income mobility over those five years. Features The Gini coefficient has features that make it useful as a measure of dispersion in a population, and inequalities in particular. The coefficient ranges from 0, for perfect equality, to 1, indicating perfect inequality. The Gini is based on the comparison of cumulative proportions of the population against cumulative proportions of income they receive. Limitations The Gini coefficient is a relative measure. The Gini coefficient of a developing country can rise (due to increasing inequality of income) even when the number of people in absolute poverty decreases. This is because the Gini coefficient measures relative, not absolute, wealth. Gini coefficients are simple, and this simplicity can lead to oversights and can confuse the comparison of different populations; for example, while both Bangladesh (per capita income of $1,693) and the Netherlands (per capita income of $42,183) had an income Gini coefficient of 0.31 in 2010, the quality of life, economic opportunity and absolute income in these countries are very different, i.e. countries may have identical Gini coefficients, but differ greatly in wealth. Basic necessities may be available to all in a developed economy, while in an undeveloped economy with the same Gini coefficient, basic necessities may be unavailable to most or unequally available due to lower absolute wealth. Gini has some mathematical limitations as well. 
It is not additive and different sets of people cannot be averaged to obtain the Gini coefficient of all the people in the sets. Even when the total income of a population is the same, in certain situations two countries with different income distributions can have the same Gini index (e.g. cases when income Lorenz Curves cross). Table A illustrates one such situation. Both countries have a Gini coefficient of 0.2, but the average income distributions for household groups are different. As another example, in a population where the lowest 50% of individuals have no income, and the other 50% have equal income, the Gini coefficient is 0.5; whereas for another population where the lowest 75% of people have 25% of income and the top 25% have 75% of the income, the Gini index is also 0.5. Economies with similar incomes and Gini coefficients can have very different income distributions. Bellù and Liberati claim that ranking income inequality between two populations is not always possible based on their Gini indices. Similarly, computational social scientist Fabian Stephany illustrates that income inequality within the population, e.g., in specific socioeconomic groups of same age and education, also remains undetected by conventional Gini indices. A Gini index does not contain information about absolute national or personal incomes. Populations can simultaneously have very low income Gini indices and very high wealth Gini indexes. By measuring inequality in income, the Gini ignores the differential efficiency of the use of household income. By ignoring wealth (except as it contributes to income), the Gini can create the appearance of inequality when the people compared are at different stages in their life. Wealthy countries such as Sweden can show a low Gini coefficient for the disposable income of 0.31, thereby appearing equal, yet have a very high Gini coefficient for wealth of 0.79 to 0.86, suggesting an extremely unequal wealth distribution in its society. These factors are not assessed in income-based Gini. Gini index has a downward-bias for small populations. Counties or states or countries with small populations and less diverse economies will tend to report small Gini coefficients. For economically diverse large population groups, a much higher coefficient is expected than for each of its regions. For example, taking the world economy as a whole and income distribution for all human beings, different scholars estimate the global Gini index to range between 0.61 and 0.68. As with other inequality coefficients, the Gini coefficient is influenced by the granularity of the measurements. For example, five 20% quantiles (low granularity) will usually yield a lower Gini coefficient than twenty 5% quantiles (high granularity) for the same distribution. Philippe Monfort has shown that using inconsistent or unspecified granularity limits the usefulness of Gini coefficient measurements. Changing income inequality, measured by Gini coefficients, can be due to structural changes in a society such as growing population (increased birth rates, aging populations, emigration, immigration) and income mobility. Another limitation of the Gini coefficient is that it is not a proper measure of egalitarianism, as it only measures income dispersion. For example, suppose two equally egalitarian countries pursue different immigration policies. 
In that case, the country accepting a higher proportion of low-income or impoverished migrants will report a higher Gini coefficient and, therefore, may exhibit more income inequality. The Gini coefficient measure gives different results when applied to individuals instead of households, for the same economy and same income distributions. If household data is used, the measured value of income Gini depends on how the household is defined. The comparison is not meaningful when different populations are not measured with consistent definitions. Furthermore, changes to the household income Gini can be driven by changes in household formation, such as increased divorce rates or extended family households splitting into nuclear families. Deininger and Squire (1996) show that the income Gini coefficient based on individual income rather than household income is different. For example, for the United States, they found that the individual income-based Gini index was 0.35, while for France, 0.43. According to their individual-focused method, in the 108 countries they studied, South Africa had the world's highest Gini coefficient at 0.62, Malaysia had Asia's highest Gini coefficient at 0.5, Brazil the highest at 0.57 in Latin America and the Caribbean region, and Turkey the highest at 0.5 in OECD countries. Billionaire Thomas Kwok claimed the income Gini coefficient for Hong Kong has been high (0.434 in 2010), in part because of structural changes in its population. Over recent decades, Hong Kong has witnessed increasing numbers of small households, elderly households, and elderly living alone. The combined income is now split into more households. Many older people live separately from their children in Hong Kong. These social changes have caused substantial changes in household income distribution. The income Gini coefficient, claims Kwok, does not discern these structural changes in its society. Household money income distribution for the United States, summarized in Table C of this section, confirms that this issue is not limited to just Hong Kong. According to the US Census Bureau, between 1979 and 2010, the population of the United States experienced structural changes in overall households; the income for all income brackets increased in inflation-adjusted terms, household income distributions shifted into higher income brackets over time, while the income Gini coefficient increased. The Gini coefficient is unable to discern the effects of structural changes in populations. Expanding on the importance of life-span measures, the Gini coefficient as a point-estimate of equality at a certain time ignores life-span changes in income. Typically, increases in the proportion of young or old members of a society will drive apparent changes in equality simply because people generally have lower incomes and wealth when they are young than when they are old. Because of this, factors such as age distribution within a population and mobility within income classes can create the appearance of inequality when none exist, taking into account demographic effects. Thus a given economy may have a higher Gini coefficient at any timepoint compared to another, while the Gini coefficient calculated over individuals' lifetime income is lower than the apparently more equal (at a given point in time) economy's.[clarification needed] Essentially, what matters is not just inequality in any particular year but the distribution composition over time. 
Inaccuracies in assigning monetary value to income in kind reduce the accuracy of Gini as a measurement of true inequality. While taxes and cash transfers are relatively straightforward to account for, other government benefits can be difficult to value. Benefits such as subsidized housing, medical care, and education are difficult to value objectively, as their value depends on the quality and extent of the benefit. In the absence of a free market, valuing these income transfers as household income is subjective. The theoretical model of the Gini coefficient can only accept these subjective assumptions, whether they are correct or not. In subsistence-driven and informal economies, people may have significant income in other forms than money, for example, through subsistence farming or bartering. These forms of income tend to accrue to poor segments of populations in emerging and transitional economy countries such as those in sub-Saharan Africa, Latin America, Asia, and Eastern Europe. The informal economy accounts for over half of global employment and as much as 90 percent of employment in some of the poorer sub-Saharan countries with high official Gini inequality coefficients. Schneider et al., in their 2010 study of 162 countries, report about 31.2%, or about $20 trillion, of the world's GDP is informal. In developing countries, the informal economy predominates for all income brackets except the richer, urban upper-income brackets. Even in developed economies, 8% (United States) to 27% (Italy) of each nation's GDP is informal. The resulting informal income predominates as a livelihood activity for those in the lowest income brackets. The value and distribution of incomes from the informal or underground economy are difficult to quantify, making true income Gini coefficient estimates difficult. Different assumptions and quantifications of these incomes will yield different Gini coefficients. Alternatives Given the limitations of the Gini coefficient, other statistical methods are used in combination or as an alternative measure of population dispersion. For example, entropy measures are frequently used (e.g. the Atkinson index or the Theil Index and Mean log deviation as special cases of the generalized entropy index). These measures attempt to compare the distribution of resources by intelligent agents in the market with a maximum entropy random distribution, which would occur if these agents acted like non-interacting particles in a closed system following the laws of statistical physics. The Ortego two-parameter model may be superior to the Gini index. Relation to other statistical measures There is a summary measure of the diagnostic ability of a binary classifier system that is also called the Gini coefficient, which is defined as twice the area between the receiver operating characteristic (ROC) curve and its diagonal. It is related to the AUC (Area Under the ROC Curve) measure of performance given by AUC = (G + 1)/2 and to Mann–Whitney U. Although both Gini coefficients are defined as areas between certain curves and share certain properties, there is no simple direct relationship between the Gini coefficient of statistical dispersion and the Gini coefficient of a classifier. The Gini index is also related to the Pietra index — both of which measure statistical heterogeneity and are derived from the Lorenz curve and the diagonal line.
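Since AUC = (G + 1)/2, the classifier Gini is simply 2·AUC − 1. The following is a minimal, self-contained sketch, not a reference implementation; the names are illustrative, and AUC is computed here from pairwise comparisons in the Mann–Whitney style, counting ties as half:

```python
def classifier_gini(y_true, y_score):
    """Gini coefficient of a binary classifier: G = 2*AUC - 1."""
    pos = [s for s, t in zip(y_score, y_true) if t == 1]
    neg = [s for s, t in zip(y_score, y_true) if t == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative label")
    # Fraction of (positive, negative) pairs ranked correctly, ties counted as 0.5.
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0 for p in pos for q in neg)
    auc = wins / (len(pos) * len(neg))
    return 2.0 * auc - 1.0

labels = [0, 0, 1, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.65, 0.9]
print(classifier_gini(labels, scores))  # about 0.33 for this toy data; 1.0 would mean perfect ranking
```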
In certain fields such as ecology, inverse Simpson's index 1/λ is used to quantify diversity, and this should not be confused with the Simpson index λ. These indicators are related to Gini. The inverse Simpson index increases with diversity, unlike the Simpson index and Gini coefficient, which decrease with diversity. The Simpson index is in the range [0, 1], where 0 means maximum and 1 means minimum diversity (or heterogeneity). Since diversity indices typically increase with increasing heterogeneity, the Simpson index is often transformed into inverse Simpson, or using the complement 1 − λ, known as the Gini-Simpson Index. The Lorenz curve is another method of graphical representation of wealth distribution. It was developed 9 years before the Gini coefficient, which quantifies the extent to which the Lorenz curve deviates from the perfect equality line (with slope of 1). The Hoover index (also known as Robin Hood index) presents the percentage of total population's income that would have to be redistributed to make the Gini coefficient equal to 0 (perfect equality). Gini coefficients for pre-modern societies In recent decades, researchers have attempted to estimate Gini coefficients for pre-20th century societies. In the absence of household income surveys and income taxes, scholars have relied on proxy variables. These include wealth taxes in medieval European city states, patterns of landownership in Roman Egypt, variation of the size of houses in societies from ancient Greece to Aztec Mexico, and inheritance and dowries in Babylonian society. Other data do not directly document variations in wealth or income but are known to reflect inequality, such as the ratio of rents to wages or of labor to capital. Other uses Although the Gini coefficient is most popular in economics, it can, in theory, be applied in any field of science that studies a distribution. For example, in ecology, the Gini coefficient has been used as a measure of biodiversity, where the cumulative proportion of species is plotted against the cumulative proportion of individuals. In health, it has been used as a measure of the inequality of health-related quality of life in a population. In education, it has been used as a measure of the inequality of universities. In chemistry it has been used to express the selectivity of protein kinase inhibitors against a panel of kinases. In engineering, it has been used to evaluate the fairness achieved by Internet routers in scheduling packet transmissions from different flows of traffic. In machine learning, it has been used as a unified metric for evaluating many-versus-many (all-to-all) similarity in vector spaces across various data types, including images and text, and to show their effectiveness in guiding machine learning training sample selection, especially in sparse information settings. The Gini coefficient is sometimes used for the measurement of the discriminatory power of rating systems in credit risk management. In 2004, a paper by Jennifer Lotz used the Gini coefficient as a measure of "the relative distribution of the galaxy pixel flux values," finding it to be a reliable way to separate ULIRGs from normal galaxies. The Gini coefficient has since been used extensively in galaxy morphological classification. A 2005 study accessed US census data to measure home computer ownership and used the Gini coefficient to measure inequalities amongst whites and African Americans.
Results indicated that although decreasing overall, home computer ownership inequality was substantially smaller among white households. A 2016 peer-reviewed study titled "Employing the Gini coefficient to measure participation inequality in treatment-focused Digital Health Social Networks" illustrated that the Gini coefficient was helpful and accurate in measuring shifts in inequality; however, as a standalone metric it failed to incorporate overall network size. Discriminatory power refers to a credit risk model's ability to differentiate between defaulting and non-defaulting clients. The formula G1, given in the calculation section above, may be used for the final model and at the individual model factor level to quantify the discriminatory power of individual factors. It is related to the accuracy ratio in population assessment models. The Gini coefficient has also been applied to analyze inequality in dating apps. Kaminskiy and Krivtsov extended the concept of the Gini coefficient from economics to reliability theory and proposed a Gini-type coefficient that helps to assess the degree of aging of non-repairable systems, or the aging and rejuvenation of repairable systems. The coefficient is defined between −1 and 1 and can be used with both empirical and parametric life distributions. It takes negative values for the class of decreasing-failure-rate distributions and point processes with decreasing failure intensity rate, and is positive for increasing-failure-rate distributions and point processes with increasing failure intensity rate. The value of zero corresponds to the exponential life distribution or the homogeneous Poisson process.
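As a small numeric companion to the Simpson and Gini–Simpson indices discussed earlier in this article, the sketch below computes the Simpson index λ, the inverse Simpson index 1/λ, and the Gini–Simpson index 1 − λ from a set of species abundances. The community counts are invented purely for illustration.

```python
from collections import Counter

def simpson_indices(abundances):
    """Return (simpson, inverse_simpson, gini_simpson) for a list of counts."""
    total = sum(abundances)
    # Simpson's lambda: probability that two individuals drawn at random
    # (with replacement) belong to the same species.
    lam = sum((n / total) ** 2 for n in abundances)
    return lam, 1.0 / lam, 1.0 - lam

# Hypothetical community: counts of individuals per species.
community = Counter(oak=50, birch=30, pine=15, maple=5)

lam, inv_simpson, gini_simpson = simpson_indices(list(community.values()))
print(f"Simpson index (lambda): {lam:.3f}")              # low lambda = high diversity
print(f"Inverse Simpson (1/lambda): {inv_simpson:.3f}")  # rises with diversity
print(f"Gini-Simpson (1 - lambda): {gini_simpson:.3f}")  # rises with diversity
```

Note how the two transformed indices move in the opposite direction from λ itself, which is exactly the point made in the comparison with the Gini coefficient above.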
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Atmospheric_pressure] | [TOKENS: 2426] |
Contents Atmospheric pressure Atmospheric pressure, also known as air pressure or barometric pressure (after the barometer), is the pressure within the atmosphere of Earth. The standard atmosphere (symbol: atm) is a unit of pressure defined as 101,325 Pa (1,013.25 hPa), which is equivalent to 1,013.25 millibars, 760 torr (or about 760 mmHg), about 29.9212 inHg, or about 14.696 psi. The atm unit is roughly equivalent to the mean sea-level atmospheric pressure on Earth; that is, the Earth's atmospheric pressure at sea level is approximately 1 atm. In most circumstances, atmospheric pressure is closely approximated by the hydrostatic pressure caused by the weight of air above the measurement point. As elevation increases, there is less overlying atmospheric mass, so atmospheric pressure decreases with increasing elevation. Because the atmosphere is thin relative to the Earth's radius—especially the dense atmospheric layer at low altitudes—the Earth's gravitational acceleration as a function of altitude can be approximated as constant and contributes little to this fall-off. Pressure measures force per unit area, with SI units of pascals (1 pascal = 1 newton per square metre, 1 N/m2). On average, a column of air with a cross-sectional area of 1 square centimetre (cm2), measured from the mean (average) sea level to the top of Earth's atmosphere, has a mass of about 1.03 kilogram and exerts a force or "weight" of about 10.1 newtons, resulting in a pressure of 10.1 N/cm2 or 101 kN/m2 (101 kilopascals, kPa). A column of air with a cross-sectional area of 1 in2 would have a weight of about 14.7 lbf, resulting in a pressure of 14.7 lbf/in2. Mechanism Atmospheric pressure is caused by the gravitational attraction of the planet on the atmospheric gases above the surface and is a function of the mass of the planet, the radius of the surface, and the amount and composition of the gases and their vertical distribution in the atmosphere. It is modified by the planetary rotation and local effects such as wind velocity, density variations due to temperature and variations in composition. Mean sea-level pressure The mean sea-level pressure (MSLP) is the atmospheric pressure at mean sea level. This is the atmospheric pressure normally given in weather reports via meteorologists on radio, television, and newspapers or on the Internet. The altimeter setting in aviation is an atmospheric pressure adjustment. Average sea-level pressure is 1,013.25 hPa (29.921 inHg; 760.00 mmHg). In aviation weather reports (METAR), QNH is transmitted around the world in hectopascals or millibars (1 hectopascal = 1 millibar). In the United States, Canada, and Japan altimeter setting is reported in inches of mercury (to two decimal places). The United States and Canada also report sea-level pressure SLP, which is adjusted to sea level by a different method, in the remarks section, not in the internationally transmitted part of the code, in hectopascals or millibars. However, in Canada's public weather reports, sea level pressure is instead reported in kilopascals. In the US weather code remarks, three digits are all that are transmitted; decimal points and the one or two most significant digits are omitted: 1,013.2 hPa (14.695 psi) is transmitted as 132; 1,000 hPa (100 kPa) is transmitted as 000; 998.7 hPa is transmitted as 987; etc. A system transmitting the last three digits transmits the same code (800) for 1080.0 hPa as for 980.0 hPa. 
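The three-digit sea-level-pressure remark described above can be sketched in code. The encoder simply keeps the last three digits of the pressure expressed in tenths of hectopascals; the decoder has to guess the dropped leading digits, and the 55.0 hPa cut-off used below is a common decoding convention rather than part of the transmitted code, so treat it as an assumption.

```python
def encode_slp(hpa: float) -> str:
    """Encode sea-level pressure as the 3-digit remark (tenths of hPa, last 3 digits)."""
    tenths = round(hpa * 10)          # e.g. 1013.2 hPa -> 10132
    return f"{tenths % 1000:03d}"     # keep the last three digits -> "132"

def decode_slp(code: str) -> float:
    """Recover hPa from a 3-digit remark; the leading '10' or '9' must be inferred."""
    value = int(code) / 10.0          # "132" -> 13.2
    # Common convention (an assumption here): values below ~55.0 are read as
    # 10xx.x hPa, otherwise as 9xx.x hPa.
    return 1000.0 + value if value < 55.0 else 900.0 + value

for pressure in (1013.2, 1000.0, 998.7, 1080.0, 980.0):
    code = encode_slp(pressure)
    print(f"{pressure:7.1f} hPa -> {code} -> decoded {decode_slp(code):7.1f} hPa")
# 1080.0 and 980.0 both encode to "800", illustrating the ambiguity noted above.
```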
The highest sea-level pressure on Earth occurs in Siberia, where the Siberian High often attains a sea-level pressure above 1,050 hPa (15.2 psi; 31 inHg), with record highs close to 1,085 hPa (15.74 psi; 32.0 inHg). The lowest measurable sea-level pressure is found at the centres of tropical cyclones and tornadoes, with a record low of 870 hPa (12.6 psi; 26 inHg). Surface pressure Surface pressure is the atmospheric pressure at a location on Earth's surface (terrain and oceans). It is directly proportional to the mass of air over that location. For numerical reasons, atmospheric models such as general circulation models (GCMs) usually predict the nondimensional logarithm of surface pressure. The average value of surface pressure on Earth is 985 hPa. This is in contrast to mean sea-level pressure, which involves the extrapolation of pressure to sea level for locations above or below sea level. The average pressure at mean sea level (MSL) in the International Standard Atmosphere (ISA) is 1,013.25 hPa, or 1 atmosphere (atm), or 29.92 inches of mercury. Pressure (P), mass (m), and acceleration due to gravity (g) are related by P = F/A = (m·g)/A, where A is the surface area. Atmospheric pressure is thus proportional to the weight per unit area of the atmospheric mass above that location. Altitude variation Pressure on Earth varies with the altitude of the surface, so air pressure on mountains is usually lower than air pressure at sea level. Pressure varies smoothly from the Earth's surface to the top of the mesosphere. Although the pressure changes with the weather, NASA has averaged the conditions for all parts of the Earth year-round. As altitude increases, atmospheric pressure decreases. One can calculate the atmospheric pressure at a given altitude. Temperature and humidity also affect the atmospheric pressure. Pressure is proportional to temperature and inversely related to humidity, and both of these are necessary to compute an accurate figure. The reference graph (not reproduced here) was developed for a temperature of 15 °C and a relative humidity of 0%. At low altitudes above sea level, the pressure decreases by about 1.2 kPa (12 hPa) for every 100 metres. For higher altitudes within the troposphere, the following equation (the barometric formula) relates atmospheric pressure p to altitude h: p = p0 · (1 + L·h/T0)^(−g·M/(R0·L)) = p0 · (1 + g·h/(cp·T0))^(−cp·M/R0). The values in these equations are: p0, the sea-level standard atmospheric pressure; L, the temperature lapse rate; cp, the constant-pressure specific heat; T0, the sea-level standard temperature; g, the gravitational acceleration; M, the molar mass of dry air; and R0, the universal gas constant. Local variation Atmospheric pressure varies widely on Earth, and differences in pressure are important in studying weather and climate. Some variations in pressure are very regular. One important source of variation is atmospheric tides. Atmospheric tides are strongest in tropical zones, with an amplitude of a few hectopascals, and almost zero in polar areas. Tropical tidal variations in pressure principally consist of two superimposed harmonics: a circadian (24 h) cycle and a semi-circadian (12 h) cycle. Records The highest adjusted-to-sea level barometric pressure ever recorded on Earth (above 750 meters) was 1,084.8 hPa (32.03 inHg; 1.0706 atm) measured in Tosontsengel, Mongolia on 19 December 2001.
The highest adjusted-to-sea level barometric pressure ever recorded (below 750 meters) was 1,083.8 hPa (32.005 inHg), measured at Agata in Evenk Autonomous Okrug, Russia (66°53' N, 93°28' E, elevation: 261 m, 856 ft) on 31 December 1968. The distinction between the two records is made because of the problematic assumptions (a standard lapse rate is assumed) involved in reducing pressures measured at high elevations to sea level. The Dead Sea, the lowest place on Earth at 430 metres (1,410 ft) below sea level, has a correspondingly high typical atmospheric pressure of 1,065 hPa. A below-sea-level surface pressure record of 1,081.8 hPa (31.95 inHg; 1.0677 atm) was set on 21 February 1961. The lowest non-tornadic atmospheric pressure ever measured was 870 hPa (26 inHg; 0.86 atm), set on 12 October 1979, during Typhoon Tip in the western Pacific Ocean. The measurement was based on an instrumental observation made from a reconnaissance aircraft. Measurement based on the depth of water One atmosphere (101.325 kPa or 14.7 psi) is also the pressure caused by the weight of a column of freshwater of approximately 10.3 m (33.8 ft). Thus, a diver 10.3 m under water experiences a pressure of about 2 atmospheres (1 atm of air plus 1 atm of water). Conversely, 10.3 m is the maximum height to which water can be raised using suction under standard atmospheric conditions. Low pressures, such as those in natural gas lines, are sometimes specified in inches of water, typically written as w.c. (water column) gauge or w.g. (inches water) gauge. A typical gas-using residential appliance in the US is rated for a maximum of 1⁄2 psi (3.4 kPa; 34 mbar), which is approximately 14 w.g. Similar metric units with a wide variety of names and notation based on millimetres, centimetres or metres are now less commonly used. Boiling point of liquids Pure water boils at 100 °C (212 °F) at Earth's standard atmospheric pressure. The boiling point is the temperature at which the vapour pressure is equal to the atmospheric pressure around the liquid. Because of this, the boiling point of liquids is lower at lower pressure and higher at higher pressure. Cooking at high elevations, therefore, requires adjustments to recipes or pressure cooking. A rough approximation of elevation can be obtained by measuring the temperature at which water boils; in the mid-19th century, this method was used by explorers. Conversely, if one wishes to evaporate a liquid at a lower temperature, for example in distillation, the atmospheric pressure may be lowered by using a vacuum pump, as in a rotary evaporator. Measurement and maps An important application of the fact that atmospheric pressure varies predictably with altitude was in determining the height of hills and mountains, thanks to reliable pressure measurement devices. In 1774, Nevil Maskelyne was confirming Newton's theory of gravitation on and around Schiehallion mountain in Scotland, and he needed to measure elevations on the mountain's sides accurately. This event is known as the Schiehallion experiment. William Roy, using barometric pressure, was able to confirm Maskelyne's height determinations; the agreement was within one meter (3.28 feet). This method became and continues to be useful for survey work and map making.
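As a worked example of the barometric formula given in the altitude-variation section above, the sketch below evaluates tropospheric pressure as a function of altitude. The numerical constants are the commonly tabulated International Standard Atmosphere values, written here with a positive lapse rate (temperature falling 0.0065 K per metre), which is algebraically the same as the signed-L form of the formula; treat the constants as assumed reference values.

```python
# International Standard Atmosphere constants (commonly tabulated values).
P0 = 101_325.0      # sea-level standard pressure, Pa
T0 = 288.15         # sea-level standard temperature, K
L  = 0.0065         # temperature lapse rate, K/m (temperature falls with height)
G  = 9.80665        # gravitational acceleration, m/s^2
M  = 0.0289644      # molar mass of dry air, kg/mol
R0 = 8.31446        # universal gas constant, J/(mol*K)

def pressure_at_altitude(h_m: float) -> float:
    """Tropospheric pressure (Pa) from the barometric formula."""
    return P0 * (1.0 - L * h_m / T0) ** (G * M / (R0 * L))

for h in (0, 100, 1000, 3000, 5500, 8848):
    p = pressure_at_altitude(h)
    print(f"{h:5d} m : {p / 100:8.1f} hPa")
# Near sea level the fall-off is roughly 12 hPa per 100 m, matching the rule of
# thumb quoted in the altitude-variation section.
```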
======================================== |
[SOURCE: https://techcrunch.com/video/a16z-just-raised-1-7b-for-ai-infrastructure-heres-where-its-going/] | [TOKENS: 804] |
A16z just raised $1.7B for AI infrastructure. Here's where it's going. Andreessen Horowitz just raised a whopping new $15 billion in funding. And a $1.7 billion chunk of that is going to its infrastructure team, the one responsible for some of its biggest, most prominent AI investments including Black Forest Labs, Cursor, OpenAI, ElevenLabs, Ideogram, Fal and dozens of others. A16z general partner Jennifer Li, who works with the infra team and oversees such investments as ElevenLabs (just valued at $11 billion), Ideogram and Fal, has a clear thesis on where the team is looking to spend its latest chunk of cash. Watch as Venture and Startups editor Julie Bort talks with Li on Equity about where a16z sees this AI super cycle going next, including the talent crunch hitting AI-native startups, why search infrastructure matters more than people think, and what kinds of companies are actually getting funded right now. Subscribe to Equity on YouTube, Apple Podcasts, Overcast, Spotify and all the casts. You also can follow Equity on X and Threads, at @EquityPod. Theresa Loconsolo is an audio producer at TechCrunch focusing on Equity, the network's flagship podcast. Before joining TechCrunch in 2022, she was one of two producers at a four-station conglomerate where she wrote, recorded, voiced and edited content, and engineered live performances and interviews from guests like lovelytheband. Theresa is based in New Jersey and holds a bachelor's degree in Communication from Monmouth University. You can contact or verify outreach from Theresa by emailing theresa.loconsolo@techcrunch.com.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Private_network] | [TOKENS: 1287] |
Contents Private network In Internet networking, a private network is a computer network that uses a private address space of IP addresses. These addresses are commonly used for local area networks (LANs) in residential, office, and enterprise environments. Both the IPv4 and the IPv6 specifications define private IP address ranges. Most Internet service providers (ISPs) allocate only a single publicly routable IPv4 address to each residential customer, but many homes have more than one computer, smartphone, or other Internet-connected device. In this situation, a network address translator (NAT/PAT) gateway is usually used to provide Internet connectivity to multiple hosts. Private addresses are also commonly used in corporate networks which, for security reasons, are not connected directly to the Internet. Often a proxy, SOCKS gateway, or similar device is used to provide restricted Internet access to network-internal users. Private network addresses are not allocated to any specific organization. Anyone may use these addresses without approval from regional or local Internet registries. Private IP address spaces were originally defined to assist in delaying IPv4 address exhaustion. IP packets originating from or addressed to a private IP address cannot be routed through the public Internet. Private addresses are often seen as enhancing network security for the internal network since use of private addresses internally makes it difficult for an external host to initiate a connection to an internal system. Private IPv4 addresses The Internet Engineering Task Force (IETF) has directed the Internet Assigned Numbers Authority (IANA) to reserve the following IPv4 address ranges for private networks: 10.0.0.0/8 (10.0.0.0 to 10.255.255.255), 172.16.0.0/12 (172.16.0.0 to 172.31.255.255), and 192.168.0.0/16 (192.168.0.0 to 192.168.255.255). In practice, it is common to subdivide these ranges into smaller subnets. Dedicated space for carrier-grade NAT deployment In April 2012, IANA allocated the 100.64.0.0/10 block of IPv4 addresses specifically for use in carrier-grade NAT scenarios. This address block should not be used on private networks or on the public Internet. The size of the address block was selected to be large enough to uniquely number all customer access devices for all of a single operator's points of presence in a large metropolitan area such as Tokyo. Private IPv6 addresses The concept of private networks has been extended in the next generation of the Internet Protocol, IPv6, and special address blocks are reserved. The address block fc00::/7 is reserved by IANA for unique local addresses (ULAs). They are unicast addresses, but contain a 40-bit random number in the routing prefix to prevent collisions when two private networks are interconnected. Despite being inherently local in usage, the IPv6 address scope of unique local addresses is global. The first block defined is fd00::/8, designed for /48 routing blocks, in which users can create multiple subnets as needed. A former standard proposed the use of site-local addresses in the fec0::/10 block, but because of scalability concerns and poor definition of what constitutes a site, its use has been deprecated since September 2004. Link-local addresses Another type of private networking uses the link-local address range. The validity of link-local addresses is limited to a single link; e.g. to all computers connected to a switch, or to one wireless network. Hosts on different sides of a network bridge are also on the same link, whereas hosts on different sides of a network router are on different links.
In IPv4, the utility of link-local addresses is in zero-configuration networking when Dynamic Host Configuration Protocol (DHCP) services are not available and manual configuration by a network administrator is not desirable. The block 169.254.0.0/16 was allocated for this purpose. If a host on an IEEE 802 (Ethernet) network cannot obtain a network address via DHCP, an address from 169.254.1.0 to 169.254.254.255[Note 2] may be assigned pseudorandomly. The standard prescribes that address collisions must be handled gracefully. In IPv6, the block fe80::/10 is reserved for IP address autoconfiguration. The implementation of these link-local addresses is mandatory, as various functions of the IPv6 protocol depend on them. A special case of private link-local addresses is the loopback interface. These addresses are private and link-local by definition since packets never leave the host device. IPv4 reserves the entire class A address block 127.0.0.0/8 for use as private loopback addresses. IPv6 reserves the single address ::1. Some are advocating reducing 127.0.0.0/8 to 127.0.0.0/16. Misrouting It is common for packets originating in private address spaces to be misrouted onto the Internet. Private networks often do not properly configure DNS services for addresses used internally and attempt reverse DNS lookups for these addresses, causing extra traffic to the Internet root nameservers. The AS112 project attempted to mitigate this load by providing special black hole anycast nameservers for private address ranges which only return negative result codes (not found) for these queries. Organizational edge routers are usually configured to drop ingress IP traffic for these networks, which can occur either by misconfiguration or from malicious traffic using a spoofed source address. Less commonly, ISP edge routers drop such egress traffic from customers, which reduces the impact on the Internet of such misconfigured or malicious hosts on the customer's network. Merging private networks Since the private IPv4 address space is relatively small, many private IPv4 networks unavoidably use the same address ranges. This can create a problem when merging such networks, as some addresses may be duplicated for multiple devices. In this case, networks or hosts must be renumbered, often a time-consuming task, or a network address translator must be placed between the networks to translate or masquerade one of the address ranges. IPv6 defines unique local addresses, providing a very large private address space from which each organization can randomly or pseudo-randomly allocate a 40-bit prefix, each of which allows 65,536 organizational subnets. With space for about one trillion (10^12) prefixes, it is unlikely that two network prefixes in use by different organizations would be the same, provided each of them was selected randomly, as specified in the standard. When two such private IPv6 networks are connected or merged, the risk of an address conflict is therefore virtually absent.
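A short sketch of the address ranges discussed in this article, using Python's standard ipaddress module: it classifies an address against the RFC 1918 private blocks, the carrier-grade NAT block, the link-local ranges, loopback, and IPv6 unique local addresses, and then generates a random ULA /48 prefix. The random-prefix generator is a simplification; RFC 4193 actually derives the 40-bit global ID from a timestamp and an interface identifier, so treat that helper as illustrative only.

```python
import ipaddress
import secrets

# Ranges named in the article.
RANGES = {
    "RFC 1918 private": ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"],
    "carrier-grade NAT": ["100.64.0.0/10"],
    "IPv4 link-local": ["169.254.0.0/16"],
    "loopback": ["127.0.0.0/8", "::1/128"],
    "IPv6 unique local (ULA)": ["fc00::/7"],
    "IPv6 link-local": ["fe80::/10"],
}

def classify(address: str) -> str:
    """Return the name of the first reserved range containing the address."""
    addr = ipaddress.ip_address(address)
    for name, blocks in RANGES.items():
        if any(addr in ipaddress.ip_network(block) for block in blocks):
            return name
    return "public / other"

def random_ula_prefix() -> ipaddress.IPv6Network:
    """Pick a ULA /48 under fd00::/8 with a random 40-bit global ID (simplified)."""
    global_id = secrets.randbits(40)
    prefix_int = (0xFD << 120) | (global_id << 80)   # fd + global ID, rest zero
    return ipaddress.IPv6Network((prefix_int, 48))

for example in ("192.168.1.10", "100.64.3.1", "169.254.7.7", "8.8.8.8", "fd12:3456:789a::1"):
    print(f"{example:>20} -> {classify(example)}")
print("example ULA prefix:", random_ula_prefix())
```

Because each organization draws its 40-bit global ID independently, two ULA prefixes generated this way are very unlikely to collide, which is the property the merging discussion above relies on.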
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Grey_alien#cite_note-LevyMendlesohn2019-8] | [TOKENS: 2835] |
Contents Grey alien Grey aliens, also referred to as Zeta Reticulans, Roswell Greys, or simply Greys,[a] are purported extraterrestrial beings. They are frequently featured in claims of close encounters and alien abduction. Greys are typically described as having small, humanoid bodies, smooth, grey skin, disproportionately large, hairless heads, and large, black, almond-shaped eyes. The 1961 Barney and Betty Hill abduction claim was key to the popularization of Grey aliens. Precursor figures have been described in science fiction, and similar descriptions appeared in later accounts of the 1947 Roswell UFO incident and early accounts of the 1948 Aztec UFO hoax. The Grey alien is cited as an archetypal image of an intelligent non-human creature and extraterrestrial life in general, as well as an iconic trope of popular culture in the age of space exploration. Description Greys are typically depicted as grey-skinned, diminutive humanoid beings that possess reduced forms of, or completely lack, external human body parts such as noses, ears, or sex organs. Their bodies are usually depicted as being elongated, having a small chest, and lacking in muscular definition and visible skeletal structure. Their legs are depicted as shorter and jointed differently from those of humans, with limbs proportioned differently from a human's. Greys are depicted as having unusually large heads in proportion to their bodies, and as having no hair, no noticeable outer ears or noses, and small orifices for ears, nostrils, and mouths. In drawings, Greys are almost always shown with very large, opaque, black eyes, without eye whites. They are frequently described as shorter than average adult humans. The association between Grey aliens and Zeta Reticuli originated with the interpretation, by a school-teacher named Marjorie Fish sometime in 1969, of a map drawn by Betty Hill. Betty Hill, under hypnosis, had claimed to have been shown a map that displayed the aliens' home system and nearby stars. Upon learning of this, Fish attempted to create a model from a drawing produced by Hill, eventually determining that the stars marked as the aliens' home were Zeta Reticuli, a binary star system. History In literature, descriptions of beings similar to Grey aliens predate claims of supposed encounters with them. In 1893, H. G. Wells presented a description of humanity's future appearance in the article "The Man of the Year Million", describing humans as having no mouths, noses, or hair, and with large heads. In 1895, Wells also depicted the Eloi, a successor species to humanity, in similar terms in the novel The Time Machine. Both share many characteristics with future perceptions of Greys. As early as 1917, the occultist Aleister Crowley described a meeting with a "preternatural entity" named Lam that was similar in appearance to a modern Grey. Crowley claimed to have contacted Lam through a process called the "Amalantrah Workings," which he believed allowed humans to contact beings from outer space and across dimensions. Other occultists and ufologists, many of whom have retroactively linked Lam to later Grey encounters, have since described their own visitations from him, with one describing the being as a "cold, computer-like intelligence," and utterly beyond human comprehension. ...the creatures did not resemble any race of humans. They were short, shorter than the average Japanese, and their heads were big and bald, with strong, square foreheads, and very small noses and mouths, and weak chins.
What was most extraordinary about them were the eyes—large, dark, gleaming, with a sharp gaze. They wore clothes made of soft grey fabric, and their limbs seemed to be similar to those of humans. In 1933, the Swedish novelist Gustav Sandgren, using the pen name Gabriel Linde, published a science fiction novel called Den okända faran (The Unknown Danger), in which he describes a race of extraterrestrials who wore clothes made of soft grey fabric and were short, with big bald heads, and large, dark, gleaming eyes. The novel, aimed at young readers, included illustrations of the imagined aliens. This description would become the template upon which the popular image of grey aliens is based. The conception remained a niche one until 1965, when newspaper reports of the Betty and Barney Hill abduction made the archetype famous. The alleged abductees, Betty and Barney Hill, claimed that in 1961, humanoid alien beings with greyish skin had abducted them and taken them to a flying saucer. In his 1990 article "Entirely Unpredisposed", Martin Kottmeyer suggested that Barney's memories revealed under hypnosis might have been influenced by an episode of the science-fiction television show The Outer Limits titled "The Bellero Shield", which was broadcast 12 days before Barney's first hypnotic session. The episode featured an extraterrestrial with large eyes, who says, "In all the universes, in all the unities beyond the universes, all who have eyes have eyes that speak." The report from the regression featured a scenario that was in some respects similar to the television show. In part, Kottmeyer wrote: Wraparound eyes are an extreme rarity in science fiction films. I know of only one instance. They appeared on the alien of an episode of an old TV series The Outer Limits entitled "The Bellero Shield." A person familiar with Barney's sketch in "The Interrupted Journey" and the sketch done in collaboration with the artist David Baker will find a "frisson" of "déjà vu" creeping up his spine when seeing this episode. The resemblance is much abetted by an absence of ears, hair, and nose on both aliens. Could it be by chance? Consider this: Barney first described and drew the wraparound eyes during the hypnosis session dated 22 February 1964. "The Bellero Shield" was first broadcast on 10 February 1964. Only twelve days separate the two instances. If the identification is admitted, the commonness of wraparound eyes in the abduction literature falls to cultural forces. — Martin Kottmeyer, Entirely Unpredisposed: The Cultural Background of UFO Reports Carl Sagan echoed Kottmeyer's suspicions in his 1997 book, The Demon Haunted World: Science as a Candle in the Dark, where Invaders from Mars was cited as another potential inspiration. After the Hills' encounter, Greys would go on to become an integral part of ufology and other extraterrestrial-related folklore. This is particularly true in the case of the United States: according to journalist C. D. B. Bryan, 73% of all reported alien encounters in the United States describe Grey aliens, a significantly higher proportion than other countries.: 68 During the early 1980s, Greys were linked to the alleged crash-landing of a flying saucer in Roswell, New Mexico, in 1947. A number of publications contained statements from individuals who claimed to have seen the U.S. military handling a number of unusually proportioned, bald, child-sized beings. 
These individuals claimed, during and after the incident, that the beings had oversized heads and slanted eyes, but scant other distinguishable facial features. In 1987, novelist Whitley Strieber published the book Communion, which, unlike his previous works, was categorized as non-fiction, and in which he describes a number of close encounters he alleges to have experienced with Greys and other extraterrestrial beings. The book became a New York Times bestseller, and New Line Cinema released a 1989 film adaption that starred Christopher Walken as Strieber. In 1988, Christophe Dechavanne interviewed the French science-fiction writer and ufologist Jimmy Guieu on TF1's Ciel, mon mardi !. Besides mentioning Majestic 12, Guieu described the existence of what he called "the little greys", which later on became better known in French under the name: les Petits-Gris. Guieu later wrote two docudramas, using as a plot the Grey aliens / Majestic-12 conspiracy theory as described by John Lear and Milton William Cooper: the series "E.B.E." (for "Extraterrestrial Biological Entity"): E.B.E.: Alerte rouge (first part) (1990) and E.B.E.: L'entité noire d'Andamooka (second part) (1991).[citation needed] Greys have since become the subject of many conspiracy theories. Many conspiracy theorists believe that Greys represent part of a government-led disinformation or plausible deniability campaign, or that they are a product of government mind-control experiments. During the 1990s, popular culture also began to increasingly link Greys to a number of military-industrial complex and New World Order conspiracy theories. In 1995, filmmaker Ray Santilli claimed to have obtained 22 reels of 16 mm film that depicted the autopsy of a "real" Grey supposedly recovered from the site of the 1947 incident in Roswell. In 2006, though, Santilli announced that the film was not original, but was instead a "reconstruction" created after the original film was found to have degraded. He maintained that a real Grey had been found and autopsied on camera in 1947, and that the footage released to the public contained a percentage of that original footage. Analysis Greys are often involved in alien abduction claims. Among reports of alien encounters, Greys make up about 50% in Australia, 73% in the United States, 48% in continental Europe, and around 12% in the United Kingdom.: 68 These reports include two distinct groups of Greys that differ in height.: 74 Abduction claims are often described as extremely traumatic, similar to an abduction by humans or even a sexual assault in the level of trauma and distress. The emotional impact of perceived abductions can be as great as that of combat, sexual abuse, and other traumatic events. The eyes are often a focus of abduction claims, which often describe a Grey staring into the eyes of an abductee when conducting mental procedures. This staring is claimed to induce hallucinogenic states or directly provoke different emotions. Neurologist Steven Novella proposes that Grey aliens are a byproduct of the human imagination, with the Greys' most distinctive features representing everything that modern humans traditionally link with intelligence. "The aliens, however, do not just appear as humans, they appear like humans with those traits we psychologically associate with intelligence." In 2005, Frederick V. Malmstrom, writing in Skeptic magazine, Volume 11, issue 4, presents his idea that Greys are actually residual memories of early childhood development. 
Malmstrom reconstructs the face of a Grey through transformation of a mother's face based on our best understanding of early-childhood sensation and perception. Malmstrom's study offers another alternative to the existence of Greys, the intense instinctive response many people experience when presented an image of a Grey, and the act of regression hypnosis and recovered-memory therapy in "recovering" memories of alien abduction experiences, along with their common themes. According to biologist Jack Cohen, the typical image of a Grey, assuming that it would have evolved from a world with different environmental and ecological conditions from Earth, is too physiologically similar to a human to be credible as a representation of an alien. The interdimensional hypothesis, the cryptoterrestrial hypothesis, and the time-traveller hypothesis attempt to provide an alternative explanation to the humanoid anatomy and behavior of these alleged beings. In popular culture Depictions of Grey aliens have gone on to appear in a number of films and television shows, supplanting the previously popular little green men. As early as 1966, for example, the superhero character Ultraman was explicitly based on them, and in 1977 they were featured in Close Encounters of the Third Kind. Greys have also been worked into space opera and other interstellar settings: in Babylon 5, the Greys are referred to as the "Vree", and are depicted as being allies and trade partners of 23rd-century Earth, while in the Stargate franchise they are called the "Asgard" and depicted as ancient astronauts allied with modern-day Earth.[citation needed] South Park refers to them as "visitors". During the 1990s, plotlines wherein Greys were linked to conspiracy theories became common. A well-known example is the Fox television series The X-Files, which first aired in 1993. It combined the quest to find proof of the existence of Grey-like extraterrestrials with a number of UFO conspiracy theory subplots, to form its primary story arc. Other notable examples include the XCOM video game franchise (where they are called "Sectoids"); Dark Skies, first broadcast in 1996, which expanded upon the MJ-12 conspiracy;[citation needed] and American Dad!, which features a Grey-like alien named Roger, whose backstory draws from both the Roswell incident and Area 51 conspiracy theories. The 2011 film Paul tells the story of a Grey named Paul who attributes the Greys' frequent presence in science fiction pop culture to the US government deliberately inserting the stereotypical Grey alien image into mainstream media; this is done so that if humanity came into contact with Paul's species, no immediate shock would occur as to their appearance. Child abduction by Greys is a key plot point in the 2013 film, Dark Skies. Greys appear in Syfy's 2021 science fiction dramedy series Resident Alien. The Greys appear as the main antagonistic faction in the 2023 independent game Greyhill Incident. See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Electronic_communication] | [TOKENS: 8291] |
Contents Telecommunications Telecommunication, often used in its plural form or abbreviated as telecom, is the transmission of information over a distance using electrical or electronic means, typically through cables, radio waves, or other communication technologies. These means of transmission may be divided into communication channels for multiplexing, allowing for a single medium to transmit several concurrent communication sessions. Long-distance technologies invented during the 19th, 20th and 21st centuries generally use electric power, and include the electrical telegraph, telephone, television, and radio. Early telecommunication networks used metal wires as the medium for transmitting signals. These networks were used for telegraphy and telephony for many decades. In the first decade of the 20th century, a revolution in wireless communication began with breakthroughs including those made in radio communications by Guglielmo Marconi, who won the 1909 Nobel Prize in Physics. Other early pioneers in electrical and electronic telecommunications include co-inventors of the telegraph Charles Wheatstone and Samuel Morse, numerous inventors and developers of the telephone including Antonio Meucci, Philipp Reis, Elisha Gray and Alexander Graham Bell, inventors of radio Edwin Armstrong and Lee de Forest, as well as inventors of television like Vladimir K. Zworykin, John Logie Baird and Philo Farnsworth. Since the 1960s, the proliferation of digital technologies has meant that voice communications have gradually been supplemented by data. The physical limitations of metallic media prompted the development of optical fibre. The Internet, a technology independent of any given medium, has provided global access to services for individual users and further reduced location and time limitations on communications. Definition At the 1932 Plenipotentiary Telegraph Conference and the International Radiotelegraph Conference in Madrid, the two organizations behind those conferences merged to form the International Telecommunication Union (ITU). They defined telecommunication as "any telegraphic or telephonic communication of signs, signals, writing, facsimiles and sounds of any kind, by wire, wireless or other systems or processes of electric signaling or visual signaling (semaphores)." The definition was later reconfirmed in Article 1.3 of the ITU Radio Regulations, which defined it as "Any transmission, emission or reception of signs, signals, writings, images and sounds or intelligence of any nature by wire, radio, optical, or other electromagnetic systems". As such, slow communications technologies like postal mail and pneumatic tubes are excluded from the definition of telecommunication. The term telecommunication was coined in 1904 by the French engineer and novelist Édouard Estaunié, who defined it as "remote transmission of thought through electricity". Telecommunication is a compound noun formed from the Greek prefix tele- (τῆλε), meaning distant, far off, or afar, and the Latin verb communicare, meaning to share. Communication was first used as an English word in the late 14th century. It comes from Old French comunicacion (14c., Modern French communication), from Latin communicationem (nominative communicatio), a noun of action from the past participle stem of communicare, "to share, divide out; communicate, impart, inform; join, unite, participate in," literally "to make common", from communis.
History Many transmission media have been used for long-distance communication throughout history, from smoke signals, beacons, semaphore telegraphs, signal flags, and optical heliographs to wires and empty space made to carry electromagnetic signals. Long distance communication was used long before the discovery of electricity and electromagnetism enabled the invention of telecommunications. A few of the many ingenious methods for communicating over distances prior to that are described here. Homing pigeons have been used throughout history by different cultures. Pigeon post had Persian roots and was later used by the Romans to aid their military. Frontinus claimed Julius Caesar used pigeons as messengers in his conquest of Gaul. The Greeks also conveyed the names of the victors at the Olympic Games to various cities using homing pigeons. In the early 19th century, the Dutch government used the system in Java and Sumatra. And in 1849, Paul Julius Reuter started a pigeon service to fly stock prices between Aachen and Brussels, a service that operated for a year until the gap in the telegraph link was closed. In the Middle Ages, chains of beacons were commonly used on hilltops as a means of relaying a signal. Beacon chains suffered the drawback that they could only pass a single bit of information, so the meaning of the message such as "the enemy has been sighted" had to be agreed upon in advance. One notable instance of their use was during the Spanish Armada, when a beacon chain relayed a signal from Plymouth to London. In 1792, Claude Chappe, a French engineer, built the first fixed visual telegraphy system (or semaphore line) between Lille and Paris. However semaphore suffered from the need for skilled operators and expensive towers at intervals of ten to thirty kilometres (six to nineteen miles). As a result of competition from the electrical telegraph, the last commercial line was abandoned in 1880. On July 25, 1837, the first commercial electrical telegraph was demonstrated by English inventor Sir William Fothergill Cooke and English scientist Sir Charles Wheatstone. Both inventors viewed their device as "an improvement to the [existing] electromagnetic telegraph" and not as a new device. Samuel Morse independently developed a version of the electrical telegraph that he unsuccessfully demonstrated on September 2, 1837. His code was an important advance over Wheatstone's signaling method. The first transatlantic telegraph cable was successfully completed on July 27, 1866, allowing transatlantic telecommunication for the first time. After early attempts to develop a talking telegraph by Antonio Meucci and a telefon by Johann Philipp Reis, a patent for the conventional telephone was filed by Alexander Bell in February 1876 (just a few hours before Elisha Gray filed a patent caveat for a similar device). The first commercial telephone services were set up by the Bell Telephone Company in 1878 and 1879 on both sides of the Atlantic in the cities of New Haven and London. In 1894, Italian inventor Guglielmo Marconi began developing a wireless communication using the then-newly discovered phenomenon of radio waves, demonstrating, by 1901, that they could be transmitted across the Atlantic Ocean. This was the start of wireless telegraphy by radio. On 17 December 1902, a transmission from the Marconi station in Glace Bay, Nova Scotia, Canada, became the world's first radio message to cross the Atlantic from North America. 
In 1904, a commercial service was established to transmit nightly news summaries to subscribing ships, which incorporated them into their onboard newspapers. World War I accelerated the development of radio for military communications. After the war, commercial radio AM broadcasting began in the 1920s and became an important mass medium for entertainment and news. World War II again accelerated the development of radio for the wartime purposes of aircraft and land communication, radio navigation, and radar. Development of stereo FM broadcasting of radio began in the 1930s in the United States and the 1940s in the United Kingdom, displacing AM as the dominant commercial standard in the 1970s. On March 25, 1925, John Logie Baird demonstrated the transmission of moving pictures at the London department store Selfridges. Baird's device relied upon the Nipkow disk by Paul Nipkow and thus became known as the mechanical television. It formed the basis of experimental broadcasts done by the British Broadcasting Corporation beginning on 30 September 1929. Vacuum tubes use thermionic emission of electrons from a heated cathode for a number of fundamental electronic functions such as signal amplification and current rectification. The simplest vacuum tube, the diode invented in 1904 by John Ambrose Fleming, contains only a heated electron-emitting cathode and an anode. Electrons can only flow in one direction through the device—from the cathode to the anode. Adding one or more control grids within the tube enables the current between the cathode and anode to be controlled by the voltage on the grid or grids. These devices became a key component of electronic circuits for the first half of the 20th century and were crucial to the development of radio, television, radar, sound recording and reproduction, long-distance telephone networks, and analogue and early digital computers. While some applications had used earlier technologies such as the spark gap transmitter for radio or mechanical computers for computing, it was the invention of the thermionic vacuum tube that made these technologies widespread and practical, leading to the creation of electronics. For most of the 20th century, televisions depended on a kind of vacuum tube — the cathode ray tube — invented by Karl Ferdinand Braun. The first version of such a television to show promise was produced by Philo Farnsworth and demonstrated to his family on 7 September 1927. After World War II, interrupted experiments resumed and television became an important home entertainment broadcast medium. Also in the 1940s, the invention of semiconductor devices made it possible to produce solid-state devices, which are smaller, cheaper, and more efficient, reliable, and durable than vacuum tubes. Starting in the mid-1960s, vacuum tubes were replaced with the transistor. Vacuum tubes still have some applications for certain high-frequency amplifiers. On 11 September 1940, George Stibitz transmitted problems for his Complex Number Calculator in New York using a teletype and received the computed results back at Dartmouth College in New Hampshire. This configuration of a centralized computer (mainframe) with remote dumb terminals remained popular well into the 1970s. In the 1960s, Paul Baran and, independently, Donald Davies started to investigate packet switching, a technology that sends a message in portions to its destination asynchronously without passing it through a centralized mainframe. 
A four-node network emerged on 5 December 1969, constituting the beginnings of the ARPANET, which by 1981 had grown to 213 nodes. ARPANET eventually merged with other networks to form the Internet. While Internet development was a focus of the Internet Engineering Task Force (IETF), which published a series of Request for Comments documents, other networking advancements occurred in industrial laboratories, such as the local area network (LAN) developments of Ethernet (1983), Token Ring (1984)[citation needed] and the star network topology. The effective capacity to exchange information worldwide through two-way telecommunication networks grew from 281 petabytes (PB) of optimally compressed information in 1986 to 471 PB in 1993, to 2.2 exabytes (EB) in 2000, to 65 EB in 2007. This is the informational equivalent of two newspaper pages per person per day in 1986, and six entire newspapers per person per day by 2007. Given this growth, telecommunications play an increasingly important role in the world economy, and the global telecommunications industry was about a $4.7 trillion sector in 2012. The service revenue of the global telecommunications industry was estimated to be $1.5 trillion in 2010, corresponding to 2.4% of the world's gross domestic product (GDP). Technical concepts Modern telecommunication is founded on a series of key concepts that experienced progressive development and refinement over a period of well over a century. Telecommunication technologies may primarily be divided into wired and wireless methods. Overall, a basic telecommunication system consists of three main parts that are always present in some form or another: a transmitter that takes information and converts it into a signal; a transmission medium, also called the physical channel, that carries the signal; and a receiver that converts the signal back into usable information. In a radio broadcasting station, the station's large power amplifier is the transmitter and the broadcasting antenna is the interface between the power amplifier and the free space channel. The free space channel is the transmission medium and the receiver's antenna is the interface between the free space channel and the receiver. Next, the radio receiver is the destination of the radio signal, where it is converted from electricity to sound. Telecommunication systems are occasionally "duplex" (two-way systems) with a single box of electronics working as both the transmitter and a receiver, or a transceiver (e.g., a mobile phone). The transmission electronics and the receiver electronics within a transceiver are quite independent of one another. This can be explained by the fact that radio transmitters contain power amplifiers that operate with electrical powers measured in watts or kilowatts, but radio receivers deal with radio powers measured in microwatts or nanowatts. Hence, transceivers have to be carefully designed and built to isolate their high-power circuitry and their low-power circuitry from each other to avoid interference. Telecommunication over fixed lines is called point-to-point communication because it occurs between a transmitter and a receiver. Telecommunication through radio broadcasts is called broadcast communication because it occurs between a powerful transmitter and numerous low-power but sensitive radio receivers. Telecommunications in which multiple transmitters and multiple receivers have been designed to cooperate and share the same physical channel are called multiplex systems. The sharing of physical channels using multiplexing often results in significant cost reduction. Multiplexed systems are laid out in telecommunication networks and multiplexed signals are switched at nodes through to the correct destination terminal receiver.
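As a toy illustration of the multiplexing idea just described, the sketch below interleaves several independent message streams into fixed, recurring time slots on one shared channel and then separates them again at the receiving end. The stream contents and one-symbol slot size are invented for the example; real time-division systems frame and synchronize the slots much more carefully.

```python
from itertools import zip_longest

def tdm_multiplex(streams, pad=" "):
    """Round-robin the streams into recurring time slots on one shared channel."""
    frames = zip_longest(*streams, fillvalue=pad)
    return "".join("".join(frame) for frame in frames)

def tdm_demultiplex(channel, n_streams, pad=" "):
    """Give every n-th symbol back to its owning stream."""
    streams = [channel[i::n_streams] for i in range(n_streams)]
    return [s.rstrip(pad) for s in streams]

# Three hypothetical senders sharing one physical channel, one symbol per slot.
senders = ["HELLO", "WORLD!", "TDM"]
channel = tdm_multiplex(senders)
print("on the wire:", repr(channel))
print("recovered:  ", tdm_demultiplex(channel, len(senders)))
```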
Communications can be encoded as analogue or digital signals, which may in turn be carried by analogue or digital communication systems. Analogue signals vary continuously with respect to the information, while digital signals encode information as a set of discrete values (e.g., a set of ones and zeroes). During propagation and reception, information contained in analogue signals is degraded by undesirable noise. Commonly, the noise in a communication system can be expressed as adding or subtracting from the desirable signal via a random process. This form of noise is called additive noise, with the understanding that the noise can be negative or positive at different instances. Unless the additive noise disturbance exceeds a certain threshold, the information contained in digital signals will remain intact. Their resistance to noise represents a key advantage of digital signals over analogue signals. However, digital systems fail catastrophically when noise exceeds the system's ability to autocorrect. On the other hand, analogue systems fail gracefully: as noise increases, the signal becomes progressively more degraded but still usable. Also, digital transmission of continuous data unavoidably adds quantization noise to the output. This can be reduced, but not eliminated, only at the expense of increasing the channel bandwidth requirement. The term channel has two different meanings. In one meaning, a channel is the physical medium that carries a signal between the transmitter and the receiver. Examples of this include the atmosphere for sound communications, glass optical fibres for some kinds of optical communications, coaxial cables for communications by way of the voltages and electric currents in them, and free space for communications using visible light, infrared waves, ultraviolet light, and radio waves. Coaxial cable types are classified by RG type or radio guide, terminology derived from World War II. The various RG designations are used to classify the specific signal transmission applications. This last channel is called the free space channel. The sending of radio waves from one place to another has nothing to do with the presence or absence of an atmosphere between the two. Radio waves travel through a perfect vacuum just as easily as they travel through air, fog, clouds, or any other kind of gas. The other meaning of the term channel in telecommunications is seen in the phrase communications channel, which is a subdivision of a transmission medium so that it can be used to send multiple streams of information simultaneously. For example, one radio station can broadcast radio waves into free space at frequencies in the neighbourhood of 94.5 MHz (megahertz) while another radio station can simultaneously broadcast radio waves at frequencies in the neighbourhood of 96.1 MHz. Each radio station would transmit radio waves over a frequency bandwidth of about 180 kHz (kilohertz), centred at frequencies such as the above, which are called the "carrier frequencies". Each station in this example is separated from its adjacent stations by 200 kHz, and the difference between 200 kHz and 180 kHz (20 kHz) is an engineering allowance for the imperfections in the communication system. In the example above, the free space channel has been divided into communications channels according to frequencies, and each channel is assigned a separate frequency bandwidth in which to broadcast radio waves. 
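A minimal numeric sketch of the frequency plan described in this paragraph: carriers spaced 200 kHz apart, each transmission occupying about 180 kHz, leaving a 20 kHz engineering allowance between neighbours. The specific band edges and the number of channels below are chosen for illustration only.

```python
SPACING_HZ   = 200_000   # separation between adjacent carrier frequencies
BANDWIDTH_HZ = 180_000   # bandwidth actually occupied by each station

def channel_plan(first_carrier_hz, n_channels):
    """List (carrier, lower edge, upper edge) for evenly spaced channels."""
    plan = []
    for k in range(n_channels):
        carrier = first_carrier_hz + k * SPACING_HZ
        plan.append((carrier, carrier - BANDWIDTH_HZ / 2, carrier + BANDWIDTH_HZ / 2))
    return plan

# Illustrative slice of the FM broadcast band starting at 94.5 MHz.
for carrier, lo, hi in channel_plan(94_500_000, 5):
    print(f"carrier {carrier/1e6:6.2f} MHz occupies {lo/1e6:7.3f}-{hi/1e6:7.3f} MHz")

# Guard band between neighbouring channels = spacing minus occupied bandwidth.
print("guard band:", (SPACING_HZ - BANDWIDTH_HZ) / 1e3, "kHz")
# Adjacent channels never overlap because the upper edge of one channel sits
# 20 kHz below the lower edge of the next.
```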
This system of dividing the medium into channels according to frequency is called frequency-division multiplexing. Another term for the same concept is wavelength-division multiplexing, which is more commonly used in optical communications when multiple transmitters share the same physical medium. Another way of dividing a communications medium into channels is to allocate each sender a recurring segment of time (a time slot, for example, 20 milliseconds out of each second), and to allow each sender to send messages only within its own time slot. This method of dividing the medium into communication channels is called time-division multiplexing (TDM), and is used in optical fibre communication. Some radio communication systems use TDM within an allocated FDM channel. Hence, these systems use a hybrid of TDM and FDM. The shaping of a signal to convey information is known as modulation. Modulation can be used to represent a digital message as an analogue waveform. This is commonly called "keying", a term derived from the older use of Morse code in telecommunications, and several keying techniques exist (these include phase-shift keying, frequency-shift keying, and amplitude-shift keying). The Bluetooth system, for example, uses phase-shift keying to exchange information between various devices. In addition, there are combinations of phase-shift keying and amplitude-shift keying, called (in the jargon of the field) quadrature amplitude modulation (QAM), that are used in high-capacity digital radio communication systems. Modulation can also be used to transmit the information of low-frequency analogue signals at higher frequencies. This is helpful because low-frequency analogue signals cannot be effectively transmitted over free space. Hence the information from a low-frequency analogue signal must be impressed into a higher-frequency signal (known as the carrier wave) before transmission. There are several different modulation schemes available to achieve this, two of the most basic being amplitude modulation (AM) and frequency modulation (FM). An example of this process is a disc jockey's voice being impressed into a 96 MHz carrier wave using frequency modulation (the voice would then be received on a radio as the channel 96 FM). In addition, modulation has the advantage that it may use frequency-division multiplexing (FDM). A telecommunications network is a collection of transmitters, receivers, and communications channels that send messages to one another. Some digital communications networks contain one or more routers that work together to transmit information to the correct user. An analogue communications network consists of one or more switches that establish a connection between two or more users. For both types of networks, repeaters may be necessary to amplify or recreate the signal when it is being transmitted over long distances. This is to combat attenuation that can render the signal indistinguishable from the noise. Another advantage of digital systems over analogue is that their output is easier to store in memory, i.e., two voltage states (high and low) are easier to store than a continuous range of states. Societal impact Telecommunication has a significant social, cultural and economic impact on modern society. In 2008, estimates placed the telecommunication industry's revenue at US$4.7 trillion, or just under three per cent of the gross world product (official exchange rate). Several following sections discuss the impact of telecommunication on society.
On the microeconomic scale, companies have used telecommunications to help build global business empires. This is self-evident in the case of online retailer Amazon.com but, according to academic Edward Lenert, even the conventional retailer Walmart has benefited from better telecommunication infrastructure compared to its competitors. In cities throughout the world, homeowners use their telephones to order and arrange a variety of home services ranging from pizza deliveries to electricians. Even relatively poor communities have been noted to use telecommunication to their advantage. In Bangladesh's Narsingdi District, isolated villagers use cellular phones to speak directly to wholesalers and arrange a better price for their goods. In Côte d'Ivoire, coffee growers share mobile phones to follow hourly variations in coffee prices and sell at the best price. On the macroeconomic scale, Lars-Hendrik Röller and Leonard Waverman suggested a causal link between good telecommunication infrastructure and economic growth. Few dispute the existence of a correlation, although some argue it is wrong to view the relationship as causal. Because of the economic benefits of good telecommunication infrastructure, there is increasing worry about the inequitable access to telecommunication services amongst various countries of the world—this is known as the digital divide. A 2003 survey by the International Telecommunication Union (ITU) revealed that roughly a third of countries have fewer than one mobile subscription for every 20 people and one-third of countries have fewer than one land-line telephone subscription for every 20 people. In terms of Internet access, roughly half of all countries have fewer than one out of 20 people with Internet access. From this information, as well as educational data, the ITU was able to compile an index that measures the overall ability of citizens to access and use information and communication technologies. Using this measure, Sweden, Denmark and Iceland received the highest ranking while the African countries Niger, Burkina Faso and Mali received the lowest. Telecommunication has played a significant role in social relationships. Nevertheless, devices like the telephone system were originally advertised with an emphasis on the practical dimensions of the device (such as the ability to conduct business or order home services) as opposed to the social dimensions. It was not until the late 1920s and 1930s that the social dimensions of the device became a prominent theme in telephone advertisements. New promotions started appealing to consumers' emotions, stressing the importance of social conversations and staying connected to family and friends. Since then, the role that telecommunications has played in social relations has become increasingly important. In recent years, the popularity of social networking sites has increased dramatically. These sites allow users to communicate with each other as well as post photographs, events and profiles for others to see. The profiles can list a person's age, interests, sexual preference and relationship status. In this way, these sites can play an important role in everything from organising social engagements to courtship. Prior to social networking sites, technologies like short message service (SMS) and the telephone also had a significant impact on social interactions. 
In 2000, market research group Ipsos MORI reported that 81% of 15- to 24-year-old SMS users in the United Kingdom had used the service to coordinate social arrangements and 42% to flirt. In cultural terms, telecommunication has increased the public's ability to access music and film. With television, people can watch films they have not seen before in their own home without having to travel to the video store or cinema. With radio and the Internet, people can listen to music they have not heard before without having to travel to the music store. Telecommunication has also transformed the way people receive their news. In a 2006 survey of slightly more than 3,000 Americans by the non-profit Pew Internet and American Life Project, the majority of respondents named television or radio over newspapers as their source of news. Telecommunication has had an equally significant impact on advertising. TNS Media Intelligence reported that in 2007, 58% of advertising expenditure in the United States was spent on media that depend upon telecommunication. Regulation Many countries have enacted legislation which conforms to the International Telecommunication Regulations established by the International Telecommunication Union (ITU), which is the "leading UN agency for information and communication technology issues". In 1947, at the Atlantic City Conference, the ITU decided to "afford international protection to all frequencies registered in a new international frequency list and used in conformity with the Radio Regulation". According to the ITU's Radio Regulations adopted in Atlantic City, all frequencies referenced in the International Frequency Registration Board, examined by the board and registered on the International Frequency List "shall have the right to international protection from harmful interference". From a global perspective, there have been political debates and legislation regarding the management of telecommunication and broadcasting. The history of broadcasting discusses some debates in relation to balancing conventional communication such as printing and telecommunication such as radio broadcasting. The onset of World War II brought on the first explosion of international broadcasting propaganda. Countries, their governments, insurgents, terrorists, and militiamen have all used telecommunication and broadcasting techniques to promote propaganda. Patriotic propaganda for political movements and colonization started in the mid-1930s. In 1936, the BBC broadcast propaganda to the Arab World to partly counter similar broadcasts from Italy, which also had colonial interests in North Africa. Modern political debates in telecommunication include the reclassification of broadband Internet service as a telecommunications service (also called net neutrality), regulation of phone spam, and expanding affordable broadband access. Modern media Gartner and Ars Technica have collected data on worldwide sales of the main categories of consumer telecommunication equipment, measured in millions of units sold per year. In a telephone network, the caller is connected to the person to whom they wish to talk by switches at various telephone exchanges. The switches form an electrical connection between the two users and the setting of these switches is determined electronically when the caller dials the number. Once the connection is made, the caller's voice is transformed to an electrical signal using a small microphone in the caller's handset. 
This electrical signal is then sent through the network to the user at the other end where it is transformed back into sound by a small speaker in that person's handset. As of 2015, the landline telephones in most residential homes are analogue—that is, the speaker's voice directly determines the signal's voltage. Although short-distance calls may be handled from end-to-end as analogue signals, increasingly telephone service providers are transparently converting the signals to digital signals for transmission. The advantage of this is that digitized voice data can travel side by side with data from the Internet and can be perfectly reproduced in long-distance communication (as opposed to analogue signals that are inevitably impacted by noise). Mobile phones have had a significant impact on telephone networks. Mobile phone subscriptions now outnumber fixed-line subscriptions in many markets. Sales of mobile phones in 2005 totalled 816.6 million, with that figure being almost equally shared amongst the markets of Asia/Pacific (204 m), Western Europe (164 m), CEMEA (Central Europe, the Middle East and Africa) (153.5 m), North America (148 m) and Latin America (102 m). In terms of new subscriptions over the five years from 1999, Africa has outpaced other markets with 58.2% growth. Increasingly these phones are being serviced by systems where the voice content is transmitted digitally, such as GSM or W-CDMA, with many markets choosing to deprecate analogue systems such as AMPS. There have also been dramatic changes in telephone communication behind the scenes. Starting with the operation of TAT-8 in 1988, the 1990s saw the widespread adoption of systems based on optical fibres. The benefit of communicating with optical fibres is that they offer a drastic increase in data capacity. TAT-8 itself was able to carry 10 times as many telephone calls as the last copper cable laid at that time, and today's optical fibre cables are able to carry 25 times as many telephone calls as TAT-8. This increase in data capacity is due to several factors: First, optical fibres are physically much smaller than competing technologies. Second, they do not suffer from crosstalk, which means several hundred of them can be easily bundled together in a single cable. Lastly, improvements in multiplexing have led to an exponential growth in the data capacity of a single fibre. Assisting communication across many modern optical fibre networks is a protocol known as Asynchronous Transfer Mode (ATM). The ATM protocol allows for the side-by-side data transmission mentioned above. It is suitable for public telephone networks because it establishes a pathway for data through the network and associates a traffic contract with that pathway. The traffic contract is essentially an agreement between the client and the network about how the network is to handle the data; if the network cannot meet the conditions of the traffic contract it does not accept the connection. This is important because telephone calls can negotiate a contract so as to guarantee themselves a constant bit rate, something that will ensure a caller's voice is not delayed in parts or cut off completely. There are competitors to ATM, such as Multiprotocol Label Switching (MPLS), that perform a similar task and are expected to supplant ATM in the future. In a broadcast system, the central high-powered broadcast tower transmits a high-frequency electromagnetic wave to numerous low-powered receivers. 
The high-frequency wave sent by the tower is modulated with a signal containing visual or audio information. The receiver is then tuned so as to pick up the high-frequency wave and a demodulator is used to retrieve the signal containing the visual or audio information. The broadcast signal can be either analogue (signal is varied continuously with respect to the information) or digital (information is encoded as a set of discrete values). The broadcast media industry is at a critical turning point in its development, with many countries moving from analogue to digital broadcasts. This move is made possible by the production of cheaper, faster and more capable integrated circuits. The chief advantage of digital broadcasts is that they avoid a number of problems common to traditional analogue broadcasts. For television, this includes the elimination of problems such as snowy pictures, ghosting and other distortion. These occur because of the nature of analogue transmission, which means that perturbations due to noise will be evident in the final output. Digital transmission overcomes this problem because digital signals are reduced to discrete values upon reception and hence small perturbations do not affect the final output. In a simplified example, if a binary message 1011 was transmitted with signal amplitudes [1.0 0.0 1.0 1.0] and received with signal amplitudes [0.9 0.2 1.1 0.9], it would still decode to the binary message 1011, a perfect reproduction of what was sent. From this example, a problem with digital transmissions can also be seen: if the noise is great enough, it can significantly alter the decoded message. Using forward error correction, a receiver can correct a handful of bit errors in the resulting message, but too much noise will lead to incomprehensible output and hence a breakdown of the transmission. In digital television broadcasting, there are three competing standards that are likely to be adopted worldwide: the ATSC, DVB and ISDB standards, whose adoption so far varies by region. All three standards use MPEG-2 for video compression. ATSC uses Dolby Digital AC-3 for audio compression, ISDB uses Advanced Audio Coding (MPEG-2 Part 7) and DVB has no standard for audio compression but typically uses MPEG-1 Part 3 Layer 2. The choice of modulation also varies between the schemes. In digital audio broadcasting, standards are much more unified, with practically all countries choosing to adopt the Digital Audio Broadcasting standard (also known as the Eureka 147 standard). The exception is the United States, which has chosen to adopt HD Radio. HD Radio, unlike Eureka 147, is based upon a transmission method known as in-band on-channel transmission that allows digital information to piggyback on normal AM or FM analogue transmissions. However, despite the pending switch to digital, analogue television continues to be transmitted in most countries. An exception is the United States, which ended analogue television transmission (by all but the very low-power TV stations) on 12 June 2009 after twice delaying the switchover deadline. Kenya also ended analogue television transmission in December 2014 after multiple delays. For analogue television, there were three standards in use for broadcasting colour TV: PAL (German designed), NTSC (American designed), and SECAM (French designed). For analogue radio, the switch to digital radio is made more difficult by the higher cost of digital receivers. 
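Returning to the simplified 1011 example above, the following minimal Python sketch traces threshold decoding and adds a toy three-fold repetition code standing in for forward error correction; the 0.5 decision threshold and the repetition scheme are illustrative assumptions introduced here, not a description of any particular broadcast standard.

```python
# Sketch of threshold decoding for the binary example given above, plus a toy
# three-fold repetition code standing in for forward error correction.
# The 0.5 decision threshold and the repetition scheme are illustrative only.

def threshold_decode(amplitudes, threshold=0.5):
    """Map received amplitudes back to bits: anything above the threshold is a 1."""
    return [1 if a > threshold else 0 for a in amplitudes]

# The example from the text: 1011 sent, mildly perturbed by noise, still decodes cleanly.
received = [0.9, 0.2, 1.1, 0.9]
print(threshold_decode(received))          # [1, 0, 1, 1]

# Stronger noise can flip a bit...
badly_received = [0.9, 0.6, 1.1, 0.9]
print(threshold_decode(badly_received))    # [1, 1, 1, 1] -- second bit corrupted

# ...which a simple repetition code can sometimes correct: each bit is sent three
# times and the receiver takes a majority vote.
def repetition_encode(bits, n=3):
    return [b for b in bits for _ in range(n)]

def repetition_decode(bits, n=3):
    return [1 if sum(bits[i:i + n]) > n // 2 else 0
            for i in range(0, len(bits), n)]

sent = repetition_encode([1, 0, 1, 1])
corrupted = sent.copy()
corrupted[4] = 1                           # noise flips one of the three copies of bit 2
print(repetition_decode(corrupted))        # [1, 0, 1, 1] -- error corrected
```

As the text notes, this correction only goes so far: if noise corrupts a majority of the copies of a bit, even the repetition code decodes it incorrectly.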
The choice of modulation for analogue radio is typically between amplitude modulation (AM) and frequency modulation (FM). To achieve stereo playback, an amplitude modulated subcarrier is used for stereo FM, and quadrature amplitude modulation is used for stereo AM or C-QUAM. The Internet is a worldwide network of computers and computer networks that communicate with each other using the Internet Protocol (IP). Any computer on the Internet has a unique IP address that can be used by other computers to route information to it. Hence, any computer on the Internet can send a message to any other computer using its IP address. These messages carry with them the originating computer's IP address, allowing for two-way communication. The Internet is thus an exchange of messages between computers. It is estimated that 51% of the information flowing through two-way telecommunications networks in the year 2000 was flowing through the Internet (most of the rest (42%) through the landline telephone). By 2007 the Internet clearly dominated and captured 97% of all the information in telecommunication networks (most of the rest (2%) through mobile phones). As of 2008, an estimated 21.9% of the world population has access to the Internet with the highest access rates (measured as a percentage of the population) in North America (73.6%), Oceania/Australia (59.5%) and Europe (48.1%). In terms of broadband access, Iceland (26.7%), South Korea (25.4%) and the Netherlands (25.3%) led the world. The Internet works in part because of protocols that govern how the computers and routers communicate with each other. The nature of computer network communication lends itself to a layered approach where individual protocols in the protocol stack run more-or-less independently of other protocols. This allows lower-level protocols to be customized for the network situation while not changing the way higher-level protocols operate. A practical example of why this is important is that it allows a web browser to run the same code regardless of whether the computer it is running on is connected to the Internet through an Ethernet or Wi-Fi connection. Protocols are often talked about in terms of their place in the OSI reference model, which emerged in 1983 as the first step in an unsuccessful attempt to build a universally adopted networking protocol suite. For the Internet, the physical medium and data link protocol can vary several times as packets traverse the globe. This is because the Internet places no constraints on what physical medium or data link protocol is used. This leads to the adoption of media and protocols that best suit the local network situation. In practice, most intercontinental communication will use the Asynchronous Transfer Mode (ATM) protocol (or a modern equivalent) on top of optical fibre. This is because for most intercontinental communication the Internet shares the same infrastructure as the public switched telephone network. At the network layer, things become standardized with the Internet Protocol (IP) being adopted for logical addressing. For the World Wide Web, these IP addresses are derived from the human-readable form using the Domain Name System (e.g., an address such as 72.14.207.99 is derived from a name such as google.com). At the moment, the most widely used version of the Internet Protocol is version four, but a move to version six is imminent. At the transport layer, most communication adopts either the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP). 
TCP is used when it is essential every message sent is received by the other computer whereas UDP is used when it is merely desirable. With TCP, packets are retransmitted if they are lost and placed in order before they are presented to higher layers. With UDP, packets are not ordered nor retransmitted if lost. Both TCP and UDP packets carry port numbers with them to specify what application or process the packet should be handled by. Because certain application-level protocols use certain ports, network administrators can manipulate traffic to suit particular requirements. Examples are to restrict Internet access by blocking the traffic destined for a particular port or to affect the performance of certain applications by assigning priority. Above the transport layer, there are certain protocols that are sometimes used and loosely fit in the session and presentation layers, most notably the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols. These protocols ensure that data transferred between two parties remains completely confidential. Finally, at the application layer, are many of the protocols Internet users would be familiar with such as HTTP (web browsing), POP3 (e-mail), FTP (file transfer), IRC (Internet chat), BitTorrent (file sharing) and XMPP (instant messaging). Voice over Internet Protocol (VoIP) allows data packets to be used for synchronous voice communications. The data packets are marked as voice-type packets and can be prioritized by the network administrators so that the real-time, synchronous conversation is less subject to contention with other types of data traffic which can be delayed (i.e., file transfer or email) or buffered in advance (i.e., audio and video) without detriment. That prioritization is fine when the network has sufficient capacity for all the VoIP calls taking place at the same time and the network is enabled for prioritization, i.e., a private corporate-style network, but the Internet is not generally managed in this way and so there can be a big difference in the quality of VoIP calls over a private network and over the public Internet. Despite the growth of the Internet, the characteristics of local area networks (LANs)—computer networks that do not extend beyond a few kilometres—remain distinct. This is because networks on this scale do not require all the features associated with larger networks and are often more cost-effective and efficient without them. When they are not connected with the Internet, they also have the advantages of privacy and security. However, purposefully lacking a direct connection to the Internet does not provide assured protection from hackers, military forces, or economic powers. These threats exist if there are any methods for connecting remotely to the LAN. Wide area networks (WANs) are private computer networks that may extend for thousands of kilometres. Once again, some of their advantages include privacy and security. Prime users of private LANs and WANs include armed forces and intelligence agencies that must keep their information secure and secret. In the mid-1980s, several sets of communication protocols emerged to fill the gaps between the data-link layer and the application layer of the OSI reference model. These included AppleTalk, IPX, and NetBIOS with the dominant protocol set during the early 1990s being IPX due to its popularity with MS-DOS users. TCP/IP existed at this point, but it was typically only used by large government and research facilities. 
As the Internet grew in popularity and its traffic was required to be routed into private networks, the TCP/IP protocols replaced existing local area network technologies. Additional technologies, such as DHCP, allowed TCP/IP-based computers to self-configure in the network. Such functions also existed in the AppleTalk, IPX, and NetBIOS protocol sets. Whereas Asynchronous Transfer Mode (ATM) and Multiprotocol Label Switching (MPLS) are typical data-link protocols for larger networks such as WANs, Ethernet and Token Ring are the typical data-link protocols for LANs. These protocols differ from the former protocols in that they are simpler (omitting features such as quality-of-service guarantees, for example) and they offer medium access control. Both of these differences allow for more economical systems. Despite the modest popularity of Token Ring in the 1980s and 1990s, virtually all LANs now use either wired or wireless Ethernet facilities. At the physical layer, most wired Ethernet implementations use copper twisted-pair cables (including the common 10BASE-T networks). However, some early implementations used heavier coaxial cables and some recent implementations (especially high-speed ones) use optical fibres. When optical fibres are used, the distinction must be made between multimode fibres and single-mode fibres. Multimode fibres can be thought of as thicker optical fibres that are cheaper to manufacture devices for, but that suffer from less usable bandwidth and worse attenuation, implying poorer long-distance performance. 
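To tie together the addressing and transport-layer points made earlier, the short sketch below resolves a hostname to an IP address through the Domain Name System and then addresses a UDP datagram to a specific port. It is a minimal illustration using only the Python standard library; the hostname and port number are illustrative choices, not recommendations.

```python
# A small end-to-end illustration of the layering described earlier: the Domain
# Name System maps a human-readable name to an IP address, and a transport-layer
# protocol (UDP here) plus a port number identify the receiving application.
# Requires network access; the hostname and port below are examples only.
import socket

hostname = "example.org"

# Network layer: resolve the name to an IP address via DNS.
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")

# Transport layer: a UDP socket sends a datagram to a specific port on that host.
# UDP does not retransmit or reorder; the datagram either arrives or it does not.
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.sendto(b"hello", (ip_address, 9))   # port 9 is the historical "discard" service
```

A TCP socket would differ in that it first establishes a connection and retransmits lost segments, matching the reliability contrast between TCP and UDP described above.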
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-father-23] | [TOKENS: 10728] |
Contents PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions following thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo was derived from both his admiration of the Famicom and conviction in video game consoles becoming the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". 
The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole benefactor of licensing related to music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research, but decided to develop what it had developed with Nintendo and Sega into a console based on the SNES. 
Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, there remained opposition from a majority present at the meeting. Older Sony executives also opposed it, who saw Nintendo and Sega as "toy" manufacturers. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he suffered from Nintendo, Ohga retained the project and became one of Kutaragi's most staunch supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development as the process of manufacturing games on CD-ROM format was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise redbook audio from the CD-ROM format in its games alongside high quality visuals and gameplay. 
Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European division and North American division, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995. The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. The console was not marketed with Sony's name in contrast to Nintendo's consoles. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. Recent consoles like the Atari Jaguar and 3DO suffered low sales due to a lack of developer support, prompting Sony to redouble their efforts in gaining the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for PlayStation since Namco rivalled Sega in the arcade market. Attaining these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995), Ridge Racer being one of the most popular arcade games at the time, and it was already confirmed behind closed doors that it would be the PlayStation's first game by December 1993, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shegeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of their own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. 
In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon their plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and a debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2 and was bought out by Sony in 2005. Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Sony did not favour their own over non-Sony products, unlike Nintendo; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs was beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should developers decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted due to the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction. 
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system to be balancing the conflicting goals of high performance, low cost, and being easy to program for, and felt he and his team were successful in this regard. Its technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began as a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important PlayStation had become for Sony when friends and relatives begged for consoles for their children. PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units were sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At their keynote presentation, Sega of America CEO Tom Kalinske revealed that their Saturn console would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage, who said "$299" and left the stage to a round of applause. The attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. One retail account of the launch recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games. 
The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million. Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles, though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of games sold to consoles sold was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was test-marketed through Sony showrooms during 1999–2000, selling 100 units. Sony launched the console countrywide (as the PS One model) on 24 January 2002, priced at Rs 7,990 with 26 games available at launch. The PlayStation also performed well in markets where it was never officially released. In Brazil, a third company had registered the trademark, so the console could not be launched officially; the officially distributed Sega Saturn dominated the market at first, but as the Saturn withdrew, PlayStation imports and widespread piracy of its games increased. In China, the Sega Saturn was likewise the most popular 32-bit console initially, but after it left the market the PlayStation's installed base grew to about 300,000 users by January 2000, even though Sony China had no plans to release the console there. The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans in which letters were replaced by the controller's geometric button symbols, stylised renderings of "Live in Your World. Play in Ours." and "U R NOT E" (with a red "E"), the latter read as "you are not ready". The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. 
Let me show you how ready I am.'" As the console's appeal enlarged, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical over Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate PlayStation's emerging identity. Sony partnered with prominent nightclub owners such as Ministry of Sound and festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be tested. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, encouraged by their declining market share and significant financial losses, launched the Dreamcast as a last-ditch attempt to stay in the industry. Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. 
The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PSOne, a smaller, redesigned variant which went on to outsell all other consoles in that year, including the PlayStation 2. In 2005, PlayStation became the first console to ship 100 million units with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001, and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006—over eleven years after its release, and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is a R3000 CPU made by LSI Logic operating at a clock rate of 33.8688 MHz and 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the necessary speed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels and offers a sampling rate of up to 44.1 kHz and music sequencing. It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, MDEC, which is integrated into the CPU and allows for the presentation of full motion video at a higher quality than other consoles of its generation. Unusual for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. While running, the GPU can also generate a total of 4,000 sprites and 180,000 polygons per second, in addition to 360,000 per second flat-shaded. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors from the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model that had an S-Video port, as it was removed from the next model. Subsequent models saw a reduction in number of parallel ports, with the final version only retaining one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan, and following public interest, was released the next year in other countries. The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was only available to buy through an ordering service and with the necessary documentation and software to program PlayStation games and applications through C programming compilers. 
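As a rough, back-of-the-envelope illustration of the video memory figures quoted above, the sketch below estimates how much of the console's 1 MB of video RAM a single framebuffer occupies at the lowest and highest display resolutions. The assumption of 16 bits per pixel is a simplification introduced here, not a figure from the source.

```python
# Back-of-the-envelope check of the video memory figures quoted above: how much
# of the PlayStation's 1 MB of video RAM a single framebuffer occupies at the
# console's lowest and highest display resolutions. 16 bits per pixel is assumed
# here as a typical display depth; this is a simplification, not a statement
# about any particular game.

VRAM_BYTES = 1 * 1024 * 1024          # 1 MB of video RAM
BYTES_PER_PIXEL = 2                   # assumed 16-bit pixels

for width, height in [(256, 224), (640, 480)]:
    framebuffer = width * height * BYTES_PER_PIXEL
    share = framebuffer / VRAM_BYTES
    print(f"{width}x{height}: {framebuffer:,} bytes per buffer "
          f"({share:.0%} of VRAM), double-buffered: {2 * framebuffer:,} bytes")
```

Under this assumption, a double-buffered 640×480 display alone would exceed the 1 MB of video RAM, which is one plausible reason lower resolutions were common in practice.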
On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles—including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo Pack". It also included a car cigarette lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo Pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first controller, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and a pink square. Rather than depicting the letters or numbers traditionally used on controller buttons, the PlayStation controller established a trademark design that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view and the square is equated to a sheet of paper to be used to access menus. The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used for instances when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking in the analogue sticks), the Dual Analog controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release. 
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, it features analogue sticks with textured rubber grips, longer handles, slightly different shoulder buttons, and rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but the release was cancelled, despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without inserting a game or closing the CD tray, thereby bringing up a graphical user interface (GUI) for the PlayStation BIOS. The GUIs of the PS One and the PlayStation differ depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator which was released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSes on a Sega console. Bleem!
was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could have left games vulnerable to piracy, due to the growing popularity of CD-R and optical disc drives with burning capability. To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, the disc drive could not detect the wobble frequency (and copies were therefore duplicated without it), since the laser pick-up system of any optical disc drive would interpret this wobble as an oscillation of the disc surface and compensate for it in the reading process. Early PlayStations, particularly early 1000 models, experience skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out, usually unevenly, due to friction. The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled will become so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises.
Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (known in the West as Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! heralded as an ancestor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, which was the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo.
Famicom Tsūshin scored the console a 19 out of 40, lower than the Saturn's 24 out of 40, in May 1995. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5; for all five editors, this was the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third-party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985 and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, with profits from its video game division coming to account for roughly 23% of the company's operating profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as one of the key factors in its mass success, and lauding it as a "game-changer in every sense possible".
In 2009, IGN ranked the PlayStation the seventh best console in their list, noting its appeal towards older audiences to be a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it as the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising to bring a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk quantities at a much faster rate than ROM cartridges, a week compared to two to three months. Further, the cost of production per unit was far lower, allowing Sony to offer games to the user at about 40% lower cost than ROM cartridges while still making the same net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64 (Konami, for example, released only thirteen N64 games but over fifty on the PlayStation). Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second-parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games; the games run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly. See also Notes References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Sukkot] | [TOKENS: 4916] |
Contents Sukkot Sukkot,[a] also known as the Feast of Tabernacles or Feast of Booths, is a Torah-commanded Jewish holiday celebrated for seven days, beginning on the 15th day of the month of Tishrei. It is one of the Three Pilgrimage Festivals on which Israelites were commanded to make a pilgrimage to the Temple in Jerusalem. Biblically an autumn harvest festival and a commemoration of the Exodus from Egypt, Sukkot is observed in modern times with festive meals in a sukkah, a temporary wood-covered hut, and the taking of the four species. The names used in the Torah are "Festival of Ingathering" (or "Harvest Festival", Hebrew: חַג הָאָסִיף, romanized: ḥag hāʾāsif) and "Festival of Booths" (Hebrew: חג הסכות, romanized: Ḥag hasSukkōṯ). This corresponds to the double significance of Sukkot. The one mentioned in the Book of Exodus is agricultural in nature, the "Festival of Ingathering at the year's end" (Exodus 34:22), and marks the end of the harvest time and thus of the agricultural year in the Land of Israel. The more elaborate religious significance from the Book of Leviticus is that of commemorating the Exodus and the dependence of the Israelites on the will of God (Leviticus 23:42–43). In the Torah's description of the holiday, the idea of welcoming all guests and extending hospitality is intrinsic to the celebration. Actual and symbolic "guests" (Aramaic: ushpizin) are invited to participate by visiting the sukkah. Specifically, according to the Zohar, seven "forefathers" of the Jewish people are to be welcomed during the seven days of the festival, in this order: Day 1: Abraham; Day 2: Isaac; Day 3: Jacob; Day 4: Moses; Day 5: Aaron; Day 6: Joseph; Day 7: David. The holiday lasts seven days. The first day (and second day in the diaspora) is a Shabbat-like holiday when work is forbidden. This is followed by intermediate days called Chol HaMoed, during which certain work is permitted. The festival is closed with another Shabbat-like holiday called Shemini Atzeret (one day in the Land of Israel, two days in the diaspora, where the second day is called Simchat Torah). The Hebrew word sukkoṯ is the plural of sukkah ('booth' or 'tabernacle') which is a walled structure covered with s'chach (plant material, such as overgrowth or palm leaves). A sukkah is the temporary dwelling in which farmers would live during harvesting, reinforcing the agricultural significance of the holiday introduced in the Book of Exodus. As stated in Leviticus, it is also reminiscent of the type of fragile dwellings in which the Israelites dwelled during their 40 years of travel in the desert after the Exodus from slavery in Egypt. Throughout the holiday, meals are eaten inside the sukkah and many people sleep there as well. On each day of the holiday it is a mitzvah, or commandment, to 'dwell' in the sukkah and to perform a shaking ceremony with a lulav (a palm frond bound with myrtle and willow) and an etrog (the fruit of a citron tree) (collectively known as the four species). The fragile shelter, the 'now-three-item' lulav, the etrog, the revived Simchat Beit HaShoeivah celebration's focus on water and rainfall, and the holiday's harvest-festival roots draw attention to people's dependence on the natural environment. Origins The traditional origins of the holiday date back to the Israelites' time in the desert, where they were told to commemorate God's protection, and the harvest season they would have upon arriving in the Land of Israel, by building huts and taking the Four Species.
Additionally, Sukkot shares similarities with older Canaanite new-year/harvest festivals, which included a seven-day celebration with sacrifices reminiscent of those in Num. 29:13–38 and "dwellings of branches", as well as processions with branches. The earliest references in the Bible (Ex. 23:16 and Ex. 34:22) make no mention of Sukkot, instead referring to it as "the festival of ingathering (hag ha'asif) at the end of the year, when you gather in the results of your work from the field," suggesting an agricultural origin. (The Hebrew term asif is also mentioned in the Gezer calendar as a two-month period in the autumn.) The booths aspect of the festival may come from the shelters that were built in the fields by those involved in the harvesting process. Alternatively, it may come from the booths which pilgrims would stay in when they came in for the festivities at the cultic sanctuaries. Finally, Lev. 23:40 talks about the taking of various branches (and a fruit); this too is characteristic of ancient agricultural festivals, which frequently included processions with branches. Later, the festival was historicized by symbolic connection with the desert sojourn of the Exodus (Lev. 23:42–43). The narratives of the exodus trek do not describe the Israelites building booths, but they indicate that most of the trek was spent encamped at oases rather than traveling, and "sukkot" roofed with palm branches were a popular and convenient form of housing at such Sinai desert oases. Laws and customs Sukkot is a seven-day festival. Inside the Land of Israel, the first day is celebrated as a full festival with special prayer services and holiday meals. Outside the Land of Israel, the first two days are celebrated as full festivals. The seventh day of Sukkot is called Hoshana Rabbah ("Great Hoshana", referring to the tradition that worshippers in the synagogue walk around the perimeter of the sanctuary during morning services) and has a special observance of its own. The intermediate days are known as Chol HaMoed ("festival weekdays"). According to Halakha, some types of work are forbidden during Chol HaMoed. In Israel many businesses are closed during this time. Throughout the week of Sukkot, meals are eaten in the sukkah. If a brit milah (circumcision ceremony) or Bar Mitzvah falls during Sukkot, the seudat mitzvah (obligatory festive meal) is served in the sukkah. Similarly, the father of a newborn boy greets guests to his Friday-night Shalom Zachar in the sukkah. Males sleep in the sukkah, provided the weather is tolerable. If it rains, the requirement of eating and sleeping in the sukkah is waived, except for eating there on the first night, when every effort needs to be made to at least say kiddush (the sanctification prayer on wine) and eat an egg-sized piece of bread before going inside the house to finish the meal if the rain does not stop. Every day except the Sabbath, a blessing is recited over the Lulav and the Etrog. The keeping of Sukkot is detailed in the Hebrew Bible (Nehemiah 8:13–18, Zechariah 14:16–19 and Leviticus 23:34–44); the Mishnah (Sukkah 1:1–5:8); the Tosefta (Sukkah 1:1–4:28); and the Jerusalem Talmud (Sukkah 1a–) and Babylonian Talmud (Sukkah 2a–56b). The sukkah walls can be constructed of any material that blocks wind (wood, canvas, aluminum siding, sheets). The walls can be free-standing or include the sides of a building or porch. There must be at least three walls, with one permitted to be a partial wall.
The roof must be of organic material, known as s'chach, such as leafy tree overgrowth, schach mats or palm fronds – plant material that is no longer connected with the earth. It is customary to decorate the interior of the sukkah with hanging decorations of the four species as well as with attractive artwork. In Leviticus 23:40, the Torah says to take four species and celebrate before God for seven days. The Torah describes the species but does not identify all of them; the Talmud in Tractate Sukkah derives the identity of the four species as a Citron, a Palm branch, two Willow branches, and three Myrtle branches. These are referred to in Hebrew as the Lulav (palm branch) and Etrog (Citron) or just Lulav. The palm branch, myrtle, and willows are tied together, usually with palm leaves, and the Citron is held next to the others. These are taken all seven days of Sukkot except for Shabbat. The blessing is recited and the Lulav and Etrog are held together, and shaken in the four directions and up and down. They are also held during the Hallel prayer and during Hoshanot.[better source needed] In 1953 the Lubavitcher Rebbe instituted the public Lulav campaign to encourage observance of this Mitzvah amongst all Jews, regardless of religious affiliation. It soon grew into an international phenomenon, with Jews holding their Lulav and Etrog and approaching complete strangers to offer to help them with the Mitzvah becoming an iconic sight in many large cities.[better source needed] Every day of Sukkot, a special regimen of animals was sacrificed in honor of the holiday as prescribed in the Torah[better source needed]. One of the iconic parts of these sacrifices, known as the Mussaf offerings, was the daily offering of bulls. Starting at thirteen on the first day and decreasing by one each day until reaching seven on the seventh day, the total number of bulls offered over the holiday was 70. The symbolism was that each bull was offered in honor of one of the nations listed in Genesis Chapter 10[better source needed]. Prayers during Sukkot include the reading of the Torah every day, reciting the Mussaf (additional) service after morning prayers, reciting Hallel, and adding special additions to the Amidah and Grace after Meals. In addition, the service includes rituals involving the Four Species. The lulav and etrog are not used on the Sabbath. On the Festival days, as well as the Sabbath of Chol Hamoed, some communities recite piyyutim. On each day of the festival, worshippers walk around the synagogue carrying the Four Species while reciting special prayers known as Hoshanot. This takes place either between Hallel and the morning's Torah reading or at the end of Mussaf. This ceremony commemorates the willow ceremony at the Temple in Jerusalem, in which willow branches were piled beside the altar with worshippers parading around the altar reciting prayers.[better source needed] A custom originating with Lurianic Kabbalah is to recite the ushpizin prayer to "invite" one of seven "exalted guests" into the sukkah. These ushpizin (Jewish Babylonian Aramaic: אושפיזין "guests", a loanword from Middle Persian špinza "lodging") represent the "seven shepherds of Israel": Abraham, Isaac, Jacob, Moses, Aaron, Joseph and David, each of whom correlates with one of the seven lower sefirot (this is why Joseph, associated with Yesod, follows Moses and Aaron, associated with Netzach and Hod respectively, even though he precedes them in the narrative).
According to tradition, a different guest enters the sukkah each night, followed by the other six. Each ushpiz has a lesson to teach that parallels the spiritual focus of the day on which they visit based on the sefira associated with that character.[better source needed] Some streams of Reconstructionist Judaism also recognize a set of seven female shepherds of Israel, called variously Ushpizot (using the Modern Hebrew feminine plural), or Ushpizātā (using the Aramaic feminine plural). Several lists of seven have been proposed. The Ushpizata are sometimes coidentified with the seven prophetesses of Judaism: Sarah, Miriam, Deborah, Hannah, Abigail, Hulda, and Esther. Some lists seek to relate each female leader to one of the sefirot to parallel their male counterparts. One such list in the order they would be invoked each evening is Ruth, Sarah, Rebecca, Miriam, Deborah, Tamar, and Rachel.[better source needed] The second through seventh days of Sukkot (third through seventh days outside the Land of Israel) are called Chol HaMoed (חול המועד – lit. "festival weekdays")[better source needed]. These days are considered by halakha to be more than regular weekdays but less than festival days. In practice, this means that all activities that are needed for the holiday—such as buying and preparing food, cleaning the house in honor of the holiday, or traveling to visit other people's sukkot or on family outings—are permitted by Jewish law. Activities that will interfere with relaxation and enjoyment of the holiday—such as laundering, mending clothes, engaging in labor-intensive activities—are not permitted.[better source needed] Religious Jews often treat Chol HaMoed as a vacation period, eating nicer than usual meals in their sukkah, entertaining guests, visiting other families in their sukkot, and taking family outings. Many synagogues and Jewish centers also offer events and meals in their sukkot during this time to foster community and goodwill.[better source needed] On the Shabbat which falls during the week of Sukkot (or in the event when the first day of Sukkot is on Shabbat in the Land of Israel), the Book of Ecclesiastes is read during morning synagogue services in Ashkenazic communities. (Diaspora Ashkenazic communities read it the second Shabbat {eighth day} when the first day of sukkot is on Shabbat.) This Book's emphasis on the ephemeralness of life ("Vanity of vanities, all is vanity...") echoes the theme of the sukkah, while its emphasis on death reflects the time of year in which Sukkot occurs (the "autumn" of life). The penultimate verse reinforces the message that adherence to God and His Torah is the only worthwhile pursuit. (Cf. Ecclesiastes 12:13,14.)[better source needed] In the days of the Temple in Jerusalem, all Israelite, and later Jewish men, women, and children on pilgrimage to Jerusalem for the festival would gather in the Temple courtyard on the first day of Chol HaMoed Sukkot to hear the Jewish king read selections from the Torah. This ceremony, which was mandated in Deuteronomy 31:10–13, was held every seven years, in the year following the Shmita (Sabbatical) year. This ceremony was discontinued after the destruction of the Temple, but it has been revived in Israel since 1952 on a smaller scale. The Simchat Beit HaShoeivah, meaning “The Celebration of the House of Drawing Water,” was historically considered the most joyous event during the Second Temple period. 
It was such a renowned celebration that the Talmud states, "One who never saw the Water-Drawing Celebration has never seen joy in his life". The celebration drew Jewish families, including scholars, farmers, and merchants, from distant lands such as Syria, Egypt, and Babylonia, who converged upon the Temple Mount for eight days of non-stop celebration. The festivities began on the close of the first day of Sukkot, following the afternoon offering, and lasted through the night until the morning offerings. To accommodate the crowds, Temple workers constructed large wooden bleachers on the courtyard walls, creating separation for women on the higher levels and men below. The celebration was famously lit by candelabras, whose enormous lanterns filled all of Jerusalem with light like day. The atmosphere was defined by loud music provided by Priests sounding trumpets and Levites playing instruments like lyres, flutes, and cymbals. A major spectacle involved distinguished elders, recognized by their long white beards, who sang, danced wildly, performed acrobatic feats, and juggled. The most illustrious sage, Rabban Shimon ben Gamliel, who presided over the supreme court, would famously juggle eight flaming torches. The actual water-drawing ritual that gave the celebration its name occurred at dawn. Fresh water was drawn from the Siloam Spring, located outside Jerusalem. As the flasks of water were brought into the Temple through the Water Gate, trumpets sounded fanfare. On Sukkot, a kohen (priest) would pour a flask of this freshly drawn water onto the corner of the altar, along with the regular morning offering. Following the Temple's destruction, Jews continued to remember the event by gathering to sing and tell stories. A powerful modern renewal began in Brooklyn, N.Y., in the fall of 1980, when the Lubavitcher Rebbe instructed that the celebration could start on the first night of Sukkot, accompanied by voices, since there was no Temple or Levite orchestra. This sparked a movement where hundreds of Jews danced and sang in the streets until dawn. The Rebbe endorsed the movement, establishing a new institution of Jewish life that continues yearly, providing a little taste of the celebration in the Temple[better source needed]. The seventh day of Sukkot is known as Hoshana Rabbah (Great Supplication). This day is marked by a special synagogue service in which seven circuits are made by worshippers holding their Four Species, reciting additional prayers. In addition, a bundle of five willow branches is beaten on the ground.: 859 The holiday immediately following Sukkot is known as Shemini Atzeret (lit. "Eighth [Day] of Assembly"). Shemini Atzeret is usually viewed as a separate holiday. In the diaspora a second additional holiday, Simchat Torah ("Joy of the Torah"), is celebrated. In the Land of Israel, Simchat Torah is celebrated on Shemini Atzeret. On Shemini Atzeret people leave their sukkah and eat their meals inside the house. Outside the Land of Israel, many eat in the sukkah without making the blessing. The sukkah is not used on Simchat Torah.[better source needed] Symbolism of the holiday The symbolism of protection embedded in Sukkot finds its initial expression in the historical context of the Israelite exodus from Egypt, emphasizing the Divine watchfulness over the Jewish people even during their lowest spiritual state. The command to reside in the temporary sukkah for seven days is intended so that future generations know that God sheltered His people when they departed Egypt. 
The sukkah structure itself serves as a crucial component of the protective symbolism, particularly in relation to the harvest season. During the precarious period when the harvest is still in the field and vulnerable to natural threats like frost, flooding, and heat, the need for Divine protection is easily recognized. However, once the fruits and grain are safely brought indoors, there is a risk that the farmer will lose awareness of God’s constant involvement. To counteract this shift, Jews are commanded to forsake their solid, permanent residence and reside in the temporary sukkah. Unlike a house with a solid roof, a hut with a flimsy roof, through which the wind wafts and the stars are visible, forces the resident to be fully cognizant of God’s Divine protection. One of the core symbolisms of the festival of Sukkot centers on Jewish unity, a theme expressed through its three major precepts: the taking of the Four Kinds, dwelling in the sukkah, and joy. Sukkot is uniquely defined in prayer as “The Time of Our Joy,” emphasizing a communal happiness that transcends selfish boundaries. The Torah commands one to rejoice with "your son, your daughter, your servant, your maid, the Levite, the stranger, the orphan and the widow". The joy is meant to unite all segments of society, connecting the wealthy and the pauper, or the master and the servant. To inspire a deeper unity, the Jew acquires the Four Kinds, which symbolize four different spiritual classes within the community based on knowledge (taste) and good deeds (scent): the etrog (taste and scent), the lulav (taste but no scent), the hadas (scent but no taste), and the aravah (no taste and no scent). When these Four Kinds are bound together, they reiterate the underlying oneness of a diverse people, integrating the scholarly and the ignorant into a single entity, thereby moving unity beyond mere connection to integration. A yet higher form of unity is embodied by the sukkah itself. The Talmud states that "The entire nation of Israel may, and ought to, dwell in a single sukkah," because the structure represents a oneness so deep that all distinctions pale into insignificance. The sukkah encompasses the entirety of a person, from their mind to their "muddy boots," equally. When the whole nation dwells in a single sukkah, the unity expressed transcends individual differences and is deeper than the compassionate unity of joy or the complementary integration of the Four Kinds.[better source needed] Sukkot in the generations of Israel According to 1 Kings 12:32–33, King Jeroboam, first king of the rebellious northern kingdom, instituted a feast on the fifteenth day of the eighth month in imitation of the feast of Sukkot in Judah, and pilgrims went to Bethel instead of Jerusalem to make thanksgiving offerings. Jeroboam feared that continued pilgrimages from the northern kingdom to Jerusalem could lead to pressure for reunion with Judah: If these people go up to offer sacrifices in the house of the Lord at Jerusalem, then the heart of this people will turn back to their lord, Rehoboam king of Judah, and they will kill me and go back to Rehoboam king of Judah. — 1 Kings 12:27 In Christianity Sukkot is celebrated by a number of Christian denominations that observe holidays from the Old Testament. These groups base this on the belief that Jesus celebrated Sukkot (see the Gospel of John 7). The holiday is celebrated according to its Hebrew calendar dates. 
The first mention of observing the holiday by Christian groups dates to the 17th century, among the sect of the Subbotniks in Russia. Academic views De Moor has suggested that there are links between Sukkot and the Ugaritic New Year festival, in particular the Ugaritic custom of erecting two rows of huts built of branches on the temple roof as temporary dwelling houses for their gods. Some have pointed out that the original Thanksgiving holiday had many similarities with Sukkot in the Bible. See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/XAI_(company)#cite_note-raises-36] | [TOKENS: 1856] |
Contents xAI (company) X.AI Corp., doing business as xAI, is an American company working in the areas of artificial intelligence (AI), social media and technology, and a wholly owned subsidiary of the American aerospace company SpaceX. Founded by Elon Musk in 2023, the company's flagship products are the generative AI chatbot named Grok and the social media platform X (formerly Twitter), the latter of which it acquired in March 2025. History xAI was founded on March 9, 2023, by Musk. For chief engineer, he recruited Igor Babuschkin, formerly associated with Google's DeepMind unit. Musk officially announced the formation of xAI on July 12, 2023. As of July 2023, xAI was headquartered in the San Francisco Bay Area. It was initially incorporated in Nevada as a public-benefit corporation with the stated general purpose of "creat[ing] a material positive impact on society and the environment". By May 2024, it had dropped the public-benefit status. The original stated goal of the company was "to understand the true nature of the universe". In November 2023, Musk stated that "X Corp investors will own 25% of xAI". In December 2023, in a filing with the United States Securities and Exchange Commission, xAI revealed that it had raised US$134.7 million in outside funding out of a total of up to $1 billion. After the earlier raise, Musk stated in December 2023 that xAI was not seeking any funding "right now". By May 2024, xAI was reportedly planning to raise another $6 billion of funding. Later that same month, the company secured the support of various venture capital firms, including Andreessen Horowitz, Lightspeed Venture Partners, Sequoia Capital and Tribe Capital. As of August 2024, Musk was diverting a large number of Nvidia chips that had been ordered by Tesla, Inc. to X and xAI. On December 23, 2024, xAI raised an additional $6 billion in a private funding round supported by Fidelity, BlackRock, and Sequoia Capital, among others, making its total funding to date over $12 billion. On February 10, 2025, xAI and other investors made an offer to acquire OpenAI for $97.4 billion. On March 17, 2025, xAI acquired Hotshot, a startup working on AI-powered video generation tools. On March 28, 2025, Musk announced that xAI had acquired sister company X Corp., the developer of social media platform X (formerly known as Twitter), which was previously acquired by Musk in October 2022. The deal, an all-stock transaction, valued X at $33 billion, with a full valuation of $45 billion when factoring in $12 billion in debt. Meanwhile, xAI itself was valued at $80 billion. Both companies were combined into a single entity called X.AI Holdings Corp. On July 1, 2025, Morgan Stanley announced that it had raised $5 billion in debt for xAI and that xAI had separately raised $5 billion in equity. The debt consists of secured notes and term loans. Morgan Stanley took no stake in the debt. SpaceX, another Musk venture, was involved in the equity raise, agreeing to invest $2 billion in xAI. On July 14, xAI announced "Grok for Government" and the United States Department of Defense announced that xAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and OpenAI. On September 12, xAI laid off 500 data annotation workers. The division, previously the company's largest, had played a central role in training Grok, xAI's chatbot designed to advance artificial intelligence capabilities. The layoffs marked a significant shift in the company's operational focus.
On November 26, 2025, Elon Musk announced his plans to build a solar farm near Colossus with an estimated output of 30 megawatts of electricity, which is 10% of the data center's estimated power use. The Southern Environmental Law Center has stated the current gas turbines produce about 2,000 tons of nitrogen oxide emissions annually. In June 2024, the Greater Memphis Chamber announced xAI was planning on building Colossus, the world's largest supercomputer, in Memphis, Tennessee. After 122 days of construction, the supercomputer went fully operational in December 2024. Local government in Memphis has voiced concerns regarding the increased usage of electricity, 150 megawatts of power at peak; while the agreement with the city is being worked out, the company has deployed 14 VoltaGrid portable methane-gas-powered generators to temporarily enhance the power supply. Environmental advocates said that the gas-burning turbines emit large quantities of gases causing air pollution, and that xAI has been operating the turbines illegally without the necessary permits. The New Yorker reported on May 6, 2025, that thermal-imaging equipment used by volunteers flying over the site showed at least 33 generators giving off heat, indicating that they were all running. The truck-mounted generators produce about the same amount of power as the Tennessee Valley Authority's large gas-fired power plant nearby. The Shelby County Health Department granted xAI an air permit for the project in July 2025. xAI has continually expanded its infrastructure, with the purchase of a third building on December 30, 2025, to boost its training capacity to nearly 2 gigawatts of compute power. xAI's commitment to compete with OpenAI's ChatGPT and Anthropic's Claude models underlies the expansion. Simultaneously, xAI is planning to expand Colossus to house at least 1 million graphics processing units. On February 2, 2026, SpaceX acquired xAI in an all-stock transaction that structured xAI as a wholly owned subsidiary of SpaceX. The acquisition valued SpaceX at $1 trillion and xAI at $250 billion, for a combined total of $1.25 trillion. On February 11, 2026, xAI was restructured following the SpaceX acquisition, leading to some layoffs. The restructuring reorganised xAI into four primary development teams, one for the Grok app and others for its other features such as Grok Imagine; Grokipedia, X and API features would fall under more minor teams. Products In July 2023, Musk said that a politically correct AI would be "incredibly dangerous" and misleading, citing as an example the fictional HAL 9000 from the 1968 film 2001: A Space Odyssey. Musk instead said that xAI would be "maximally truth-seeking". Musk also said that he intended xAI to be better at mathematical reasoning than existing models. On November 4, 2023, xAI unveiled Grok, an AI chatbot that is integrated with X. xAI stated that when the bot is out of beta, it will only be available to X's Premium+ subscribers. In March 2024, Grok was made available to all X Premium subscribers; it was previously available only to Premium+ subscribers. On March 17, 2024, xAI released Grok-1 as open source. On March 29, 2024, Grok-1.5 was announced, with "improved reasoning capabilities" and a context length of 128,000 tokens. On April 12, 2024, Grok-1.5 Vision (Grok-1.5V) was announced.[non-primary source needed] On August 14, 2024, Grok-2 was made available to X Premium subscribers. It is the first Grok model with image generation capabilities.
On October 21, 2024, xAI released an application programming interface (API). On December 9, 2024, xAI released a text-to-image model named Aurora. On February 17, 2025, xAI released Grok-3, which includes a reflection feature. xAI also introduced a web-search function called DeepSearch. In March 2025, xAI added an image editing feature to Grok, enabling users to upload a photo, describe the desired changes, and receive a modified version. Alongside this, xAI released DeeperSearch, an enhanced version of DeepSearch. On July 9, 2025, xAI unveiled Grok-4. A high-performance version of the model, called Grok Heavy, was also unveiled, with access at the time costing $300 per month. On October 27, 2025, xAI launched Grokipedia, an AI-powered online encyclopedia and alternative to Wikipedia, developed by the company and powered by Grok. Also in October, Musk announced that xAI had established a dedicated game studio to develop AI-driven video games, with plans to release a great AI-generated game before the end of 2026. Valuation See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Atmosphere_(unit)] | [TOKENS: 388] |
Contents Standard atmosphere (unit) The standard atmosphere (symbol: atm) is a unit of pressure defined as 101325 Pa. It is sometimes used as a reference pressure or standard pressure. It is approximately equal to Earth's average atmospheric pressure at sea level. History The standard atmosphere was originally defined as the pressure exerted by a 760 mm column of mercury at 0 °C (32 °F) and standard gravity (gn = 9.80665 m/s2). It was used as a reference condition for physical and chemical properties, and the definition of the centigrade temperature scale set 100 °C as the boiling point of water at this pressure. In 1954, the 10th General Conference on Weights and Measures (CGPM) adopted the standard atmosphere for general use and affirmed its definition as being precisely equal to 1013250 dynes per square centimetre (101325 Pa). This defined pressure in a way that is independent of the properties of any particular substance. In addition, the CGPM noted that there had been some misapprehension: the previous definition (from the 9th CGPM) had "led some physicists to believe that this definition of the standard atmosphere was valid only for accurate work in thermometry." In chemistry and in various industries, the reference pressure referred to in standard temperature and pressure was commonly 1 atm (101.325 kPa) prior to 1982, but standards have since diverged; in 1982, the International Union of Pure and Applied Chemistry recommended that for the purposes of specifying the physical properties of substances, standard pressure should be precisely 100 kPa (1 bar). Pressure units and equivalencies A pressure of 1 atm can also be stated as 101.325 kPa, 1.01325 bar, exactly 760 Torr, or approximately 14.696 pounds per square inch. The notation ata has been used to indicate an absolute pressure measured in either standard atmospheres (atm)[better source needed] or technical atmospheres (at). See also References |
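As a quick numerical sketch of the relationships above, the following plain-Python snippet converts pressures given in atmospheres into the other units listed; the constants follow the standard definitional values, and the function and variable names are chosen here purely for illustration.

```python
# Standard-atmosphere conversion factors (the psi figure is a rounded approximation).
PA_PER_ATM = 101_325        # 1 atm = 101325 Pa, exact by definition
BAR_PER_ATM = 1.01325       # since 1 bar = 100000 Pa
TORR_PER_ATM = 760          # the torr is defined as 1/760 atm
PSI_PER_ATM = 14.6959       # approximate pounds per square inch

def atm_to(p_atm: float, factor: float) -> float:
    """Convert a pressure in standard atmospheres using the given factor."""
    return p_atm * factor

print(atm_to(1, PA_PER_ATM))     # 101325
print(atm_to(0.5, BAR_PER_ATM))  # 0.506625
```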
======================================== |
[SOURCE: https://techcrunch.com/video/uber-is-literally-in-the-drivers-seat-when-it-comes-to-av-bets/] | [TOKENS: 760] |
Uber is literally in the driver’s seat when it comes to AV bets Self-driving truck startup Waabi’s billion-dollar fundraise isn’t just about trucks. The deal, for $750 million up front plus another $250 million from Uber tied to deployment milestones, marks a major expansion into robotaxis for the company founded by former Uber AI chief Raquel Urtasun. It also feels like another chip from Uber on the autonomous vehicle roulette table. With more than 20 AV partners worldwide, the question isn’t just whether Waabi can deliver on its plans to deploy over 25,000 robotaxis, but whether Uber’s bet-on-everything strategy actually works. Watch as Equity podcast hosts Kirsten Korosec, Sean O’Kane and Anthony Ha discuss Uber’s AV partnership strategy, why Waabi’s “simulation-first” approach might be different, and more of the week’s headlines. Subscribe to Equity on YouTube, Apple Podcasts, Overcast, Spotify and all the casts. You also can follow Equity on X and Threads, at @EquityPod. Theresa Loconsolo is an audio producer at TechCrunch focusing on Equity, the network’s flagship podcast. Before joining TechCrunch in 2022, she was one of two producers at a four-station conglomerate where she wrote, recorded, voiced and edited content, and engineered live performances and interviews from guests like lovelytheband. Theresa is based in New Jersey and holds a bachelor’s degree in Communication from Monmouth University. You can contact or verify outreach from Theresa by emailing theresa.loconsolo@techcrunch.com.
© 2025 TechCrunch Media LLC. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Biopython] | [TOKENS: 1737] |
Contents Biopython Biopython is an open-source collection of non-commercial Python modules for computational biology and bioinformatics. It makes robust and well-tested code easily accessible to researchers. Python is an object-oriented programming language and is a suitable choice for the automation of common tasks. The availability of reusable libraries saves development time and lets researchers focus on addressing scientific questions. Biopython is constantly updated and maintained by a large team of volunteers across the globe. Biopython contains parsers for diverse bioinformatic sequence, alignment, and structure formats. Sequence formats include FASTA, FASTQ, GenBank, and EMBL. Alignment formats include Clustal, BLAST, PHYLIP, and NEXUS. Structural formats include PDB, which contains the 3D atomic coordinates of macromolecules. It has provisions to access information from biological databases like NCBI, Expasy, PDB, and BioSQL. These facilities can be used in scripts or incorporated into researchers' own software. Biopython contains a standard sequence class, sequence alignment, and motif analysis tools. It also has clustering algorithms, a module for structural biology, and a module for phylogenetics analysis. History The development of Biopython began in 1999, and it was first released in July 2000. The first "semi-complete" and "semi-stable" releases followed in March 2001 and December 2002, respectively. It was developed during a similar time frame and with analogous goals to other projects that added bioinformatics capabilities to their respective programming languages, including BioPerl, BioRuby and BioJava. Early developers on the project included Jeff Chang, Andrew Dalke and Brad Chapman, though over 100 people have made contributions to date. In 2007, a similar Python project, PyCogent, was established. The initial scope of Biopython involved accessing, indexing and processing biological sequence files. Data retrieved from common biological databases would then be parsed into Python data structures. While this is still a major focus, over the following years added modules have extended its functionality to cover additional areas of biology. The key challenge in the design of parsers for bioinformatics file formats is the frequency at which the data formats change. This is due to inadequate curation of the structure of the data, and changes in the database contents. This problem is overcome by the application of a standard event-oriented parser design (see Key features and examples). As of version 1.77, Biopython no longer supports Python 2. The current stable release, Biopython 1.85, was released on 15 January 2025. It supports only Python 3, and recent releases of Biopython require NumPy (rather than the older Numeric package). Design Wherever possible, Biopython follows the conventions used by the Python programming language to make it easier for users familiar with Python. For example, Seq and SeqRecord objects can be manipulated via slicing, in a manner similar to Python's strings and lists. It is also designed to be functionally similar to other Bio* projects, such as BioPerl. It is organized into modular sub-packages, e.g., Bio.Seq, Bio.Align, Bio.PDB and Bio.Entrez, each of them useful in a different bioinformatics domain. It uses object-oriented principles such as encapsulation and polymorphism, notably in the Seq, SeqRecord, and Bio.PDB.Structure classes. It can also interoperate with other Python tools (Pandas, Matplotlib and SciPy).
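To make the string-like behaviour of Seq objects concrete, here is a minimal sketch; the nucleotide sequence is an arbitrary made-up example, and the expected outputs in the comments are indicative only.

```python
from Bio.Seq import Seq

dna = Seq("ATGGCCATTGTAATGGGCCGC")   # an arbitrary example sequence

print(dna[0:6])                      # slicing works as on a Python string -> ATGGCC
print(dna + Seq("TAA"))              # concatenation returns a new Seq object
print(dna.reverse_complement())      # biological methods beyond plain string behaviour
print(dna.translate())               # translate the reading frame into a protein sequence
```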
Biopython can read and write most common file formats for each of its functional areas, and its license is permissive and compatible with most other software licenses, which allows Biopython to be used in a variety of software projects. Requirements Biopython is currently supported and tested with the following Python implementations: Key features and examples Biopython can read and write to a number of common formats. When reading files, descriptive information in the file is used to populate the members of Biopython classes, such as SeqRecord. This allows records of one file format to be converted into others. Very large sequence files can exceed a computer's memory resources, so Biopython provides various options for accessing records in large files. They can be loaded entirely into memory in Python data structures, such as lists or dictionaries, providing fast access at the cost of memory usage. Alternatively, the files can be read from disk as needed, with slower performance but lower memory requirements. A core concept in Biopython is the biological sequence, and this is represented by the Seq class. A Biopython Seq object is similar to a Python string in many respects: it supports the Python slice notation, can be concatenated with other sequences and is immutable. This object includes both general string-like and biological sequence-specific methods. It is best to store information about the biological type (DNA, RNA, protein) separately from the sequence, rather than using an explicit alphabet argument. The SeqRecord class describes sequences, along with information such as name, description and features in the form of SeqFeature objects. Each SeqFeature object specifies the type of the feature and its location. Feature types can be ‘gene’, ‘CDS’ (coding sequence), ‘repeat_region’, ‘mobile_element’ or others, and the position of features in the sequence can be exact or approximate. Through the Bio.Entrez module, users of Biopython can download biological data from NCBI databases. Each of the functions provided by the Entrez search engine is available through functions in this module, including searching for and downloading records. The Bio.Phylo module provides tools for working with and visualising phylogenetic trees. A variety of file formats are supported for reading and writing, including Newick, NEXUS and phyloXML. Common tree manipulations and traversals are supported via the Tree and Clade objects. Examples include converting and collating tree files, extracting subsets from a tree, changing a tree's root, and analysing branch features such as length or score. Rooted trees can be drawn in ASCII or using matplotlib (see Figure 1), and the Graphviz library can be used to create unrooted layouts (see Figure 2). The GenomeDiagram module provides methods of visualising sequences within Biopython. Sequences can be drawn in a linear or circular form (see Figure 3), and many output formats are supported, including PDF and PNG. Diagrams are created by making tracks and then adding sequence features to those tracks. By looping over a sequence's features and using their attributes to decide if and how they are added to the diagram's tracks, one can exercise much control over the appearance of the final diagram. Cross-links can be drawn between different tracks, allowing one to compare multiple sequences in a single diagram. The Bio.PDB module can load molecular structures from PDB and mmCIF files, and was added to Biopython in 2003. 
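The format handling and memory trade-offs described above can be sketched with the Bio.SeqIO module; the file names and the record identifier below are placeholders rather than real data:

from Bio import SeqIO

# Stream over records without loading the whole file into memory.
for record in SeqIO.parse("reads.fastq", "fastq"):           # hypothetical input file
    print(record.id, len(record.seq))

# Convert records from one format to another in a single call.
count = SeqIO.convert("genes.gb", "genbank", "genes.fasta", "fasta")
print(count, "records converted")

# For very large files, build an on-disk index for lazy, low-memory access.
indexed = SeqIO.index("large_genome.fasta", "fasta")
record = indexed["chr1"]                                       # fetched from disk on demand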
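Likewise, a minimal sketch of downloading a record through Bio.Entrez might look as follows; the e-mail address and search term are placeholders, and network access to NCBI is assumed:

from Bio import Entrez, SeqIO

Entrez.email = "you@example.org"    # NCBI asks users to identify themselves

# Search a database, then fetch one matching record in GenBank format.
handle = Entrez.esearch(db="nucleotide", term="opuntia[Organism] AND rpl16", retmax=1)
search = Entrez.read(handle)
handle.close()

if search["IdList"]:
    handle = Entrez.efetch(db="nucleotide", id=search["IdList"][0],
                           rettype="gb", retmode="text")
    record = SeqIO.read(handle, "genbank")
    handle.close()
    print(record.id, record.description)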
The Structure object is central to this module, and it organises macromolecular structure in a hierarchical fashion: Structure objects contain Model objects, which contain Chain objects, which contain Residue objects, which contain Atom objects. Disordered residues and atoms get their own classes, DisorderedResidue and DisorderedAtom, that describe their uncertain positions. Using Bio.PDB, one can navigate through individual components of a macromolecular structure file, such as examining each atom in a protein. Common analyses can be carried out, such as measuring distances or angles, comparing residues and calculating residue depth. The Bio.PopGen module adds support to Biopython for Genepop, a software package for statistical analysis of population genetics. This allows for analyses of Hardy–Weinberg equilibrium, linkage disequilibrium and other features of a population's allele frequencies. This module can also carry out population genetic simulations using coalescent theory with the fastsimcoal2 program. Biopython previously included command-line wrappers for tools such as BLAST, Clustal, EMBOSS, and SAMtools. This option allowed users to run external tool commands from within the code using specialized Biopython classes. However, the Bio.Application modules and their wrappers have been deprecated and will be removed in future Biopython releases. The main reason for this is the high maintenance burden of keeping them up to date with the evolving external tools. The recommended approach is to directly construct and execute command-line tool commands using Python's built-in subprocess module. This method provides flexibility and removes the dependency on the Biopython wrappers. subprocess is a standard Python module for running external commands and programs and capturing their output. See also References External links |
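To illustrate the Structure/Model/Chain/Residue/Atom hierarchy described for Bio.PDB, a small sketch follows; the structure identifier and file name are placeholders:

from Bio.PDB import PDBParser

parser = PDBParser(QUIET=True)                              # suppress parser warnings
structure = parser.get_structure("example", "example.pdb")  # hypothetical PDB file

# Structure -> Model -> Chain -> Residue -> Atom
for model in structure:
    for chain in model:
        for residue in chain:
            for atom in residue:
                print(chain.id, residue.get_resname(), atom.get_name())

# Subtracting two Atom objects returns the distance between them in angstroms.
atoms = list(structure.get_atoms())
if len(atoms) >= 2:
    print(atoms[0] - atoms[1])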
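As a sketch of the recommended subprocess-based approach to external tools (the blastn command line is illustrative only and assumes that BLAST+ is installed and that the input files exist):

import subprocess

# Build the command as a list of arguments; no Biopython wrapper is involved.
cmd = [
    "blastn",
    "-query", "query.fasta",   # hypothetical input file
    "-db", "nt",
    "-out", "results.xml",
    "-outfmt", "5",            # XML output
]

# check=True raises CalledProcessError on a non-zero exit status;
# capture_output collects stdout/stderr for logging or error reporting.
result = subprocess.run(cmd, capture_output=True, text=True, check=True)
print("BLAST finished with exit code", result.returncode)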
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Western_Europe] | [TOKENS: 1969] |
Contents Western Europe Western Europe is the western region of Europe. The region's extent varies depending on context. The concept of "the West" appeared in Europe in juxtaposition to "the East" and originally applied to the Western half of the ancient Mediterranean world, the Latin West of the Roman Empire, and "Western Christendom". Beginning with the Renaissance and the Age of Discovery, roughly from the 15th century, the concept of Europe as "the West" slowly became distinguished from and eventually replaced the dominant use of "Christendom" as the preferred endonym within the area. By the Age of Enlightenment and the Industrial Revolution, the concepts of "Eastern Europe" and "Western Europe" were more regularly used. The distinctiveness of Western Europe became most apparent during the Cold War, when Europe was divided for 40 years by the Iron Curtain into the Western Bloc and Eastern Bloc, each characterised by distinct political and economic systems. Historical divisions Prior to the Roman conquest, a large part of Western Europe had adopted the newly developed La Tène culture. As the Roman domain expanded, a cultural and linguistic division appeared between the mainly Greek-speaking eastern provinces, which had formed the highly urbanised Hellenistic civilisation, and the western territories, which in contrast largely adopted the Latin language. This cultural and linguistic division was eventually reinforced by the later political east–west division of the Roman Empire. The Western Roman Empire and the Eastern Roman Empire controlled the two divergent regions between the 3rd and the 5th centuries. The division between these two was enhanced during late antiquity and the Middle Ages by a number of events. The Western Roman Empire collapsed, starting the Early Middle Ages. By contrast, the Eastern Roman Empire, mostly known as the Greek or Byzantine Empire, survived and even thrived for another 1000 years. The rise of the Carolingian Empire in the west, and in particular the Great Schism between Eastern Orthodoxy and Roman Catholicism, enhanced the cultural and religious distinctiveness between Eastern and Western Europe. After the conquest of the Byzantine Empire, centre of the Eastern Orthodox Church, by the Muslim Ottoman Empire in the 15th century, and the gradual fragmentation of the Holy Roman Empire (which had replaced the Carolingian Empire), the division between Roman Catholic and Protestant became more important in Europe than that with Eastern Orthodoxy. In East Asia, Western Europe was historically known as taixi in China and taisei in Japan, which literally translates as the "Far West". The term Far West became synonymous with Western Europe in China during the Ming dynasty. The Italian Jesuit priest Matteo Ricci was one of the first writers in China to use the Far West as an Asian counterpart to the European concept of the Far East. In his writings, Ricci referred to himself as "Matteo of the Far West". The term was still in use in the late 19th and early 20th centuries. Christianity is the largest religion in Western Europe. According to a 2018 study by the Pew Research Center, 71.0% of Western Europeans identified as Christians. In 1054, the East–West Schism divided Christianity into Western Christianity and Eastern Christianity. This split Europe in two, with Western Europe primarily under the Catholic Church, and much of Eastern Europe under the Eastern Orthodox Church.
Ever since the Reformation in the 16th century, Protestantism has also been a major denomination in Europe, mostly in the West. During the four decades of the Cold War, the definition of East and West was simplified by the existence of the Eastern Bloc. A number of historians and social scientists view the Cold War definition of Western and Eastern Europe as outdated or relegating. During the final stages of World War II, the future of Europe was decided by the Allies at the 1945 Yalta Conference, between the British Prime Minister, Winston Churchill, the U.S. President, Franklin D. Roosevelt, and the Premier of the Soviet Union, Joseph Stalin. Post-war Europe was divided into two major spheres: the Western Bloc, influenced by the United States, and the Eastern Bloc, influenced by the Soviet Union. With the onset of the Cold War, Europe was divided by the Iron Curtain. This term had been used during World War II by German Propaganda Minister Joseph Goebbels and, later, Count Lutz Schwerin von Krosigk in the last days of the war; however, its use was hugely popularised by Winston Churchill, who used it in his famous "Sinews of Peace" address on 5 March 1946 at Westminster College in Fulton, Missouri: "From Stettin in the Baltic to Trieste in the Adriatic an iron curtain has descended across the Continent. Behind that line lie all the capitals of the ancient states of Central and Eastern Europe. Warsaw, Berlin, Prague, Vienna, Budapest, Belgrade, Bucharest and Sofia; all these famous cities and the populations around them lie in what I must call the Soviet sphere, and all are subject, in one form or another, not only to Soviet influence but to a very high and in some cases increasing measure of control from Moscow." Although some countries were officially neutral, they were classified according to the nature of their political and economic systems. This division largely defines the popular perception and understanding of Western Europe and its border with Eastern Europe to the east; to the west lies the Atlantic Ocean. The world changed dramatically with the fall of the Iron Curtain in 1989. West Germany peacefully absorbed East Germany in the German reunification. Comecon and the Warsaw Pact were dissolved, and in 1991, the Soviet Union ceased to exist. Several countries which had been part of the Soviet Union regained full independence. In 1948, the Treaty of Brussels was signed between Belgium, France, Luxembourg, the Netherlands and the United Kingdom. It was revised in 1954 at the Paris Conference, when the Western European Union was established. The Western European Union was declared defunct in 2011 after the Treaty of Lisbon, and the Treaty of Brussels was terminated. When the Western European Union was dissolved, it had 10 member countries, 6 associate member countries, 7 associate partner countries and 5 observer countries. Modern divisions The United Nations geoscheme is a system devised by the United Nations Statistics Division (UNSD) which divides the countries of the world into regional and subregional groups, based on the M49 coding classification. The partition is for statistical convenience and does not imply any assumption regarding political or other affiliation of countries or territories.
In the UN geoscheme, the following countries are classified as Western Europe: The CIA classifies seven countries as belonging to "Western Europe": The CIA also classifies three countries as belonging to "Southwestern Europe": EuroVoc is a multilingual thesaurus maintained by the Publications Office of the European Union. In this thesaurus, the countries of Europe are grouped into sub-regions. The following countries are included in the sub-group Western Europe: The Western European and Others Group is one of several unofficial Regional Groups in the United Nations that act as voting blocs and negotiation forums. Regional voting blocs were formed in 1961 to encourage voting for various UN bodies from different regional groups. The European members of the group are: In addition, Australia, Canada, Israel and New Zealand are members of the group, with the United States as an observer. Population Using the CIA classification strictly would give the following calculation of Western Europe's population. All figures are based on the projections for 2018 by the Population Division of the United Nations Department of Economic and Social Affairs. Using the CIA classification a little more liberally and including "South-Western Europe" would give the following calculation of Western Europe's population. Climate The climate of Western Europe varies from Mediterranean on the coasts of Italy, Portugal and Spain to alpine in the Pyrenees and the Alps. The Mediterranean climate of the south is dry and warm. The western and northwestern parts have a mild, generally humid climate, influenced by the North Atlantic Current. Western Europe is a heatwave hotspot, exhibiting upward trends that are three to four times faster than in the rest of the northern midlatitudes. Languages Western European languages mostly fall within two Indo-European language families: the Romance languages, descended from the Latin of the Roman Empire; and the Germanic languages, whose ancestor language (Proto-Germanic) came from southern Scandinavia. Romance languages are spoken primarily in the southern and central part of Western Europe, Germanic languages in the northern part (the British Isles and the Low Countries), as well as a large part of Northern and Central Europe. Other Western European languages include the Celtic group (that is, Irish, Scottish Gaelic, Manx, Welsh, Cornish and Breton) and Basque, the only currently living Old European language isolate. Multilingualism and the protection of regional and minority languages are recognised political goals in Western Europe today. The Council of Europe Framework Convention for the Protection of National Minorities and the Council of Europe's European Charter for Regional or Minority Languages set up a legal framework for language rights in Europe. Economy Western Europe is one of the richest regions of the world. Germany has the highest gross domestic product in Europe, the largest financial surplus of any country, and the highest net national wealth of any European state, while Luxembourg has the world's highest GDP per capita. Switzerland and Luxembourg have the highest average wages in the world, in nominal and PPP terms, respectively. Norway ranks highest in the world on the Social Progress Index. Global impact See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Clipper_(programming_language)] | [TOKENS: 981] |
Contents Clipper (programming language) Clipper is an xBase compiler that implements a variant of the xBase computer programming language. It was used to create or extend software programs that originally ran mostly on DOS. Although it is a powerful general-purpose programming language, it was used mainly to create database business programs. One major dBase feature not implemented in Clipper is the dot-prompt (. prompt) interactive command set, which was an important part of the original dBase implementation. Clipper, from Nantucket Corp and later Computer Associates, started out as a native code compiler for dBase III databases and evolved from there into a more general-purpose language. History Clipper was created by Nantucket Corporation, a company that was started in 1984 by Barry ReBell (management) and Brian Russell (technical). Larry Heimendinger was Nantucket's president. In 1992, the company was sold to Computer Associates for $190 million and the product was renamed CA-Clipper. Clipper was created as a replacement programming language for Ashton-Tate's dBASE III, a very popular database language at the time. The advantage of Clipper over dBASE was that it could be compiled and executed on DOS as a standalone application. In the years between 1985 and 1992, millions of Clipper applications were built, typically for small businesses dealing with databases covering many aspects of client management and inventory management. For many smaller businesses, having a Clipper application designed to their specific needs was their first experience with software development. Many applications were also developed for banking and insurance companies, especially in cases where the application was considered too small to be developed and run on traditional mainframes. In these environments Clipper also served as a front end for existing mainframe applications. As the product matured, it added elements of the programming languages C and Pascal, object-oriented programming (OOP), and the code-block data type (hybridizing the concepts of dBase macros, or string evaluation, and function pointers), becoming far more powerful than the original. Nantucket's Aspen project later matured into the Windows native-code CA-Visual Objects compiler. Market penetration Nantucket sold well in Western markets. In November 1991, the New York Times reported the company's success in "painstakingly convincing Soviet software developers that buying is preferable to pirating". According to the article, Clipper had sold 2,000 copies in the Soviet Union (compared to 250,000 worldwide). In the early 1990s, under new ownership, Clipper failed to transition from DOS to Windows. As a result, almost no new commercial applications were written in Clipper after 1995. By then, the "classically trained programmer" commonly used strong typing, in contrast to the original dBASE language. An evolution of Clipper, named Visual Objects, added strong typing but made it optional, to remain compatible with existing code. Four of the more important languages that took over from Clipper were Visual Basic, Microsoft Access, Delphi, and PowerBuilder. All provided strong typing. The Clipper language is being actively implemented and extended by multiple organizations and vendors, such as XBase++ from Alaska Software and FlagShip, and by free (GPL-licensed) projects like Harbour and xHarbour.
Many of the current implementations are portable (DOS, Windows, Linux (32- and 64-bit), Unix (32- and 64-bit), and macOS), supporting many language extensions, with greatly extended runtime libraries and various Replaceable Database Drivers (RDD) supporting many popular database formats, like DBF, DBTNTX, DBFCDX (FoxPro, Apollo, Comix, and Advantage Database Server), MachSix (SIx Driver and Apollo), SQL, and more. These newer implementations all strive for full compatibility with the standard dBase/xBase syntax, while also offering OOP approaches and target-based syntax such as SQLExecute(). The Clipper Usenet newsgroups are comp.lang.clipper and comp.lang.clipper.visual-objects. Programming A simple "hello world" application (fully functional after compiling): A simple database input mask (without actual function or procedure start/end): Version history The various Clipper versions, and release dates, were: From Nantucket Corporation; the "seasonal versions", billed as "dBase compilers": From Nantucket Corporation; Clipper 5: and from Computer Associates; CA-Clipper 5: After buying Nantucket, CA developed a second library, named Clipper Tools, alongside the standard Clipper library. Three versions of this library were released, alongside Clipper versions. This library became a de facto standard among Clipper clones such as xHarbour, and was itself cloned by several of them. References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Vilfredo_Pareto] | [TOKENS: 2769] |
Contents Vilfredo Pareto Vilfredo Federico Damaso Pareto (/pəˈreɪtoʊ/; Italian: [paˈreːto]; born Wilfried Fritz Pareto; 15 July 1848 – 19 August 1923) was an Italian polymath, whose areas of interest included sociology, civil engineering, economics, political science, and philosophy. He made several important contributions to economics, particularly in the study of income distribution and in the analysis of individuals' choices, and was one of the minds behind the Lausanne School of economics. He was also responsible for popularising the use of the term elite in social analysis and contributed to elite theory. He has been described as "one of the last Renaissance scholars. Trained in physics and mathematics, he became a polymath whose genius radiated into nearly all other major fields of knowledge." He introduced the concept of Pareto efficiency and helped develop the field of microeconomics. He was also the first to claim that income follows a Pareto distribution, which is a power law probability distribution. The Pareto principle was named after him, and it was built on his observations that 80% of the wealth in Italy belonged to about 20% of the population. He also contributed to the fields of mathematics and sociology. Biography Pareto was born of an exiled noble Genoese family on 15 July 1848 in Paris, the centre of the popular revolutions of that year. His father, Raffaele Pareto (1812–1882), was an Italian civil engineer and Ligurian marquis who had left Italy much as Giuseppe Mazzini and other Italian nationalists had. His mother, Marie Metenier, was a French woman. Enthusiastic about the revolutions of 1848 in the German states, his parents named him Wilfried Fritz, which became Vilfredo Federico upon his family's move back to Italy in 1858. In his childhood, Pareto lived in a middle-class environment, receiving a high standard of education, attending the newly created Istituto Tecnico Leardi where Ferdinando Pio Rosellini was his mathematics professor. In 1869, he earned a doctorate in engineering from what is now the Polytechnic University of Turin, then known as the Technical School for Engineers, with a dissertation entitled "The Fundamental Principles of Equilibrium in Solid Bodies". His later interest in equilibrium analysis in economics and sociology can be traced back to this dissertation. Pareto was among the contributors to the Rome-based magazine La Ronda between 1919 and 1922. For some years after graduation, Pareto worked as a civil engineer, first for the state-owned Italian Railway Company and later in private industry. He was manager of the Iron Works of San Giovanni Valdarno and later general manager of Italian Iron Works. He did not begin serious work in economics until his mid-forties. He started his career as a fiery advocate of classical liberalism, besetting the most ardent British liberals with his attacks on any form of government intervention in the free market. In 1886, he became a lecturer on economics and management at the University of Florence. His stay in Florence was marked by political activity, much of it fueled by his own frustrations with government regulators. In 1889, after the death of his parents, Pareto changed his lifestyle, quitting his job and marrying a Russian woman, Alessandrina Bakunina. In 1893, Pareto succeeded Léon Walras to the chair of Political Economy at the University of Lausanne in Switzerland where he remained for the rest of his life. 
There, in 1896–1897, he published a textbook presenting the Pareto distribution of wealth, which he believed held constant "through any human society, in any age, or country". In 1906, he made the famous observation that twenty per cent of the population owned eighty per cent of the property in Italy, later generalised by Joseph M. Juran into the Pareto principle, also termed the 80–20 rule. Pareto maintained cordial personal relationships with individual socialists but always thought their economic ideas were severely flawed. He later became suspicious of their motives and denounced socialist leaders as an "aristocracy of brigands" who threatened to despoil the country, and criticized the government of the Italian statesman Giovanni Giolitti for not taking a tougher stance against worker strikes. Growing unrest among labour in the Kingdom of Italy led him to the anti-socialist and anti-democratic camp. His attitude towards Italian fascism in his last years is a matter of controversy. Pareto's relationship with the emerging discipline of scientific sociology took shape when, starting from political economy, he criticized positivism as a totalizing and metaphysical system devoid of a rigorous logical-experimental method. In this sense, Pareto's output can be read within a history of the social sciences that continues to show its distinctiveness and its relevance for contributions in the 21st century. Pareto's story is also part of a multidisciplinary search for a scientific model that treats sociology both as a critique of cumulative models of knowledge and as a discipline oriented towards relational models of science. In 1889, Pareto married Alessandrina Bakunina, a Russian woman. She left him in 1902 for a young servant. Twenty years later, in 1923, he married Jeanne Regis, a French woman, just before his death in Geneva, Switzerland, on 19 August 1923. Sociology Pareto's later years were spent collecting the material for his best-known work, Trattato di sociologia generale (1916) (The Mind and Society, published in 1935). His final work was Compendio di sociologia generale (1920). In his Trattato di Sociologia Generale (1916, rev. French trans. 1917), published in English by Harcourt, Brace in a four-volume edition edited by Arthur Livingston under the title The Mind and Society (1935), Pareto developed the notion of the circulation of elites, the first social cycle theory in sociology. He is famous for saying "history is a graveyard of aristocracies". Pareto might have turned to sociology for an understanding of why his mathematical economic theories did not always predict the actions of individuals in practice, in the belief that unforeseen or uncontrollable social factors intervened. His sociology holds that much social action is nonlogical and that much personal action is designed to give spurious logicality to non-rational actions. We are driven, he taught, by certain "residues" and by "derivations" from these residues. The more important of these have to do with conservatism and risk-taking, and human history is the story of the alternate dominance of these sentiments in the ruling elite, which comes into power strong in conservatism but gradually changes over to the philosophy of the "foxes" or speculators. A catastrophe results, with a return to conservatism; the "lion" mentality follows.
This cycle might be broken by the use of force, says Pareto, but the elite becomes weak and humanitarian and shrinks from violence. Among those who introduced Pareto's sociology to the United States were George C. Homans and Lawrence Joseph Henderson at Harvard, and Paretian ideas gained considerable influence, especially on Harvard sociologist Talcott Parsons, who developed a systems approach to society and economics that argues the status quo is usually functional. The American historian Bernard DeVoto played an important role in introducing Pareto's ideas to these Cambridge intellectuals and other Americans in the 1930s. Wallace Stegner, in his biography of DeVoto, recounts these developments and writes this about the often misunderstood distinction between "residues" and "derivations": "Basic to Pareto's method is the analysis of society through its non-rational 'residues,' which are persistent and unquestioned social habits, beliefs, and assumptions, and its 'derivations,' which are the explanations, justifications, and rationalizations we make of them. One of the commonest errors of social thinkers is to assume rationality and logic in social attitudes and structures; another is to confuse residues and derivations." Fascism and power distribution Renato Cirillo wrote that Pareto had frequently been considered a predecessor of fascism as a result of his support for the movement when it began. Cirillo disagreed with this interpretation, suggesting that Pareto was critical of fascism in his private letters. Pareto argued that democracy was an illusion and that a ruling class always emerged and enriched itself. For him, the key question was how actively the rulers ruled. For this reason, he called for a drastic reduction of the state and welcomed Benito Mussolini's rule as a transition to this minimal state so as to liberate the perceived pure economic forces. As a young student, Mussolini had attended some of Pareto's lectures at the University of Lausanne in 1904. It has been argued that Mussolini's move away from socialism towards a form of elitism may be attributed to Pareto's ideas. Franz Borkenau, a biographer, argued that Mussolini followed Pareto's policy ideas at the beginning of his tenure as prime minister. Karl Popper dubbed Pareto the "theoretician of totalitarianism"; according to Cirillo, there is no evidence in Popper's published work that he read Pareto in any detail before repeating what was then a common but dubious judgement in anti-fascist circles. Economic concepts Pareto turned his interest to economic matters, and he became an advocate of free trade, finding himself in conflict with the Italian government. His writings reflected the ideas of Léon Walras that economics is essentially a mathematical and natural science. He tried to sketch economics in analogy to mechanics, explicitly linking pure (and applied) economics to pure (and applied) mechanics, presenting a concordance table relating the two sciences. Pareto was a leader of the "Lausanne School" and represents the second generation of the Neoclassical Revolution. His "tastes-and-obstacles" approach to general equilibrium theory was resurrected during the great "Paretian Revival" of the 1930s and has influenced theoretical economics since. In his Manual of Political Economy (1906), the focus is on equilibrium in terms of solutions to individual problems of "objectives and constraints".
He made extensive use of the indifference curves of Edgeworth (1881), both in the theory of the consumer and, as another great novelty, in his theory of the producer. He gave the first presentation of the trade-off box now known as the "Edgeworth-Bowley" box. Pareto was the first to realize that cardinal utility could be dispensed with, and economic equilibrium thought of in terms of ordinal utility; that is, it was not necessary to know how much a person valued this or that, only that he preferred X of this to Y of that. Utility was a preference-ordering. With this, Pareto not only inaugurated modern microeconomics but also attacked the alliance of economics and utilitarian philosophy, which calls for the greatest good for the greatest number; Pareto said good cannot be measured. He replaced it with the notion of Pareto-optimality, the idea that a system is enjoying maximum economic satisfaction when no one can be made better off without making someone else worse off. Pareto optimality is widely used in welfare economics and game theory. A standard theorem is that a perfectly competitive market creates distributions of wealth that are Pareto optimal. Some economic concepts based on Pareto's work are still in use in the 21st century. The Pareto chart is a special type of histogram, used to view the causes of a problem in order of severity from largest to smallest. It is a statistical tool that graphically demonstrates the Pareto principle or the 80–20 rule. The Pareto principle concerns the distribution of income, while the Pareto distribution is a probability distribution used, among other things, as a mathematical realization of Pareto's law. Ophelimity is a measure of purely economic satisfaction. The Pareto index is a measure of the inequality of income distribution. Pareto argued that in all countries and times the distribution of income and wealth is highly skewed, with a few holding most of the wealth. He argued that all observed societies follow a regular logarithmic pattern: N = Ax^m, where N is the number of people with wealth higher than x, and A and m are constants. Over the years, Pareto's law proved remarkably close to observed data, with economists typically finding it plausible according to the Encyclopædia Britannica. Pareto efficiency is generally not very discriminating, while the concept of potential Pareto-efficiency, also known as Kaldor-Hicks efficiency, is more discriminating and is widely used in economics. A common criticism outside of economics is that it relies on subjective preferences. According to Oxford Reference, the Pareto principle can be controversial in welfare economics since its assumptions are empirically questionable, may embody value-judgements, and tend to favour the status quo. As a result of its silence on the initial distribution of resources, most sociologists are also critical of Paretian welfare economics. Major works References Further reading External links |
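As a numerical illustration of the power-law form N = Ax^m discussed above and its connection to the 80–20 rule (a minimal sketch; the shape parameter is chosen for illustration, not estimated from data), one can sample a Pareto distribution with NumPy and compute the share of total wealth held by the richest fifth:

import numpy as np

rng = np.random.default_rng(0)

# Illustrative shape parameter; a value of about 1.16 is often quoted as
# corresponding to an 80/20 split, but it is not an empirical estimate here.
alpha = 1.16
wealth = rng.pareto(alpha, size=100_000) + 1.0   # classical Pareto with minimum wealth 1

wealth_sorted = np.sort(wealth)[::-1]            # richest individuals first
top_fifth = wealth_sorted[: len(wealth_sorted) // 5]

share = top_fifth.sum() / wealth.sum()
print(f"Share of total wealth held by the top 20%: {share:.1%}")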
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Birthday#cite_note-8] | [TOKENS: 4101] |
Contents Birthday A birthday is the anniversary of the birth of a person or the figurative birth of an institution. Birthdays of people are celebrated in numerous cultures, often with birthday gifts, birthday cards, a birthday party, or a rite of passage. Many religions celebrate the birth of their founders or religious figures with special holidays (e.g. Christmas, Mawlid, Buddha's Birthday, Krishna Janmashtami, and Gurpurb). There is a distinction between birthday and birthdate (also known as date of birth): the former, except for February 29, occurs each year (e.g. January 15), while the latter is the complete date when a person was born (e.g. January 15, 2001). Coming of age In most legal systems, one becomes a legal adult on a particular birthday when they reach the age of majority (usually between 12 and 21), and reaching age-specific milestones confers particular rights and responsibilities. At certain ages, one may become eligible to leave full-time education, become subject to military conscription or to enlist in the military, to consent to sexual intercourse, to marry with parental consent, to marry without parental consent, to vote, to run for elected office, to legally purchase (or consume) alcohol and tobacco products, to purchase lottery tickets, or to obtain a driver's licence. The age of majority is when minors cease to legally be considered children and assume control over their persons, actions, and decisions, thereby terminating the legal control and responsibilities of their parents or guardians over and for them. Most countries set the age of majority at 18, though it varies by jurisdiction. Many cultures celebrate a coming-of-age birthday when a person reaches a particular year of life. Some cultures celebrate landmark birthdays in early life or old age. In many cultures and jurisdictions, if a person's real birthday is unknown (for example, if they are an orphan), their birthday may be adopted or assigned to a specific day of the year, such as January 1. Racehorses are reckoned to become one year old in the year following their birth on January 1 in the Northern Hemisphere and August 1 in the Southern Hemisphere. Birthday parties In certain parts of the world, an individual's birthday is celebrated by a party featuring a specially made cake. Guests bestow presents on the individual, appropriate to their age. Other birthday activities may include entertainment (sometimes by a hired professional, e.g., a clown, magician, or musician) and a special toast or speech by the birthday celebrant. The last stanza of Patty Hill's and Mildred Hill's famous song, "Good Morning to You" (unofficially titled "Happy Birthday to You"), is typically sung by the guests at some point in the proceedings. In some countries, a piñata takes the place of a cake. The birthday cake may be decorated with lettering and the person's age, or studded with the same number of lit candles as the age of the individual. The celebrated individual may make a silent wish and attempt to blow out the candles in one breath; if successful, superstition holds that the wish will be granted. In many cultures, the wish must be kept secret or it will not "come true". Birthdays as holidays The birthdays of historically significant people, such as national heroes or founders, are often commemorated by an official holiday marking the anniversary of their birth.
Some notables, particularly monarchs, have an official birthday on a fixed day of the year, which may not necessarily match the day of their birth, but on which celebrations are held. In Mahayana Buddhism, many monasteries celebrate the anniversary of Buddha's birth, usually in a highly formal, ritualized manner. They treat Buddha's statue as if it were Buddha himself and alive, bathing and "feeding" him. Jesus Christ's traditional birthday is celebrated as Christmas Eve or Christmas Day around the world, on December 24 or 25, respectively. As some Eastern churches use the Julian calendar, their December 25 falls on January 7 in the Gregorian calendar. These dates are traditional and have no connection with Jesus's actual birthday, which is not recorded in the Gospels. Similarly, the birthdays of the Virgin Mary and John the Baptist are liturgically celebrated on September 8 and June 24, especially in the Roman Catholic and Eastern Orthodox traditions (although for those Eastern Orthodox churches using the Julian calendar the corresponding Gregorian dates are September 21 and July 7, respectively). As with Christmas, the dates of these celebrations are traditional and probably have no connection with the actual birthdays of these individuals. Catholic saints are remembered by a liturgical feast on the anniversary of their "birth" into heaven, i.e., their day of death. In Hinduism, Ganesh Chaturthi is a festival celebrating the birth of the elephant-headed deity Ganesha with extensive community celebrations and at home. Figurines of Ganesha are made for the holiday and are widely sold. Sikhs celebrate the anniversary of the birth of Guru Nanak and other Sikh gurus, which is known as Gurpurb. Mawlid is the anniversary of the birth of Muhammad and is celebrated on the 12th or 17th day of Rabi' al-awwal by adherents of Sunni and Shia Islam, respectively. These are the two most commonly accepted dates of birth of Muhammad. However, there is much controversy regarding the permissibility of celebrating Mawlid, as some Muslims judge the custom an unacceptable practice according to Islamic tradition. In Iran, Mother's Day is celebrated on the birthday of Fatima al-Zahra, the daughter of Muhammad. Banners reading Ya Fatima ("O Fatima") are displayed on government buildings, private buildings, public streets and car windows. Religious views In Judaism, rabbis are divided about celebrating birthdays, although the majority of the faithful accept the custom. In the Torah, the only mention of a birthday is the celebration of Pharaoh's birthday in Egypt (Genesis 40:20). Although the birthday of Jesus of Nazareth is celebrated as a Christian holiday on December 25, historically the celebration of an individual person's birthday has been subject to theological debate. Early Christians, notes The World Book Encyclopedia, "considered the celebration of anyone's birth to be a pagan custom." Origen, in his commentary "On Levites," wrote that Christians should not only refrain from celebrating their birthdays but should look at them with disgust as a pagan custom. A saint's day was typically celebrated on the anniversary of their martyrdom or death, considered the occasion of or preparation for their entrance into Heaven or the New Jerusalem.
Ordinary folk in the Middle Ages celebrated their saint's day (the saint they were named after), but nobility celebrated the anniversary of their birth.[citation needed] The "Squire's Tale", one of Chaucer's Canterbury Tales, opens as King Cambuskan proclaims a feast to celebrate his birthday. In the Modern era, the Catholic Church, the Eastern Orthodox Church and Protestantism, i.e. the three main branches of Christianity, as well as almost all Christian religious denominations, consider celebrating birthdays acceptable or at most a choice of the individual. An exception is Jehovah's Witnesses, who do not celebrate them for various reasons: in their interpretation this feast has pagan origins, was not celebrated by early Christians, is negatively expounded in the Holy Scriptures and has customs linked to superstition and magic. In some historically Roman Catholic and Eastern Orthodox countries,[a] it is common to have a 'name day', otherwise known as a 'Saint's day'. It is celebrated in much the same way as a birthday, but it is held on the official day of a saint with the same Christian name as the birthday person; the difference being that one may look up a person's name day in a calendar, or easily remember common name days (for example, John or Mary); however, in pious traditions, the two were often made to concur by giving a newborn the name of a saint celebrated on its day of confirmation, more seldom one's birthday. Some are given the name of the religious feast of their christening day or birthday, for example, Noel or Pascal (French for Christmas and "of Easter"); as another example, Togliatti was given Palmiro as his first name because he was born on Palm Sunday. Birthday celebration does not reflect Islamic tradition, and because of this, the majority of Muslims refrain from celebrating birthdays. Others do not object, as long as it is not accompanied by behavior contrary to Islamic tradition. A good portion of Muslims (and Arab Christians) who have emigrated to the United States and Europe celebrate birthdays as customary, especially for children, while others abstain. Hindus celebrate the birth anniversary every year on the day that falls in the lunar or solar month (Sun Signs Nirayana System – Sourava Mana Masa) of birth and has the same asterism (Star/Nakshatra) as the date of birth. The age is reckoned whenever the Janma Nakshatra of the same month passes. Hindus regard death as more auspicious than birth, since the person is liberated from the bondages of material society. Also, traditionally, rituals and prayers for the departed are observed on the 5th and 11th days, with many relatives gathering. Historical and cultural perspectives According to Herodotus (5th century BC), of all the days in the year, the one which the Persians celebrate most is their birthday. It was customary to have the board furnished on that day with an ampler supply than common: the richer people eat wholly baked cow, horse, camel, or donkey (Greek: ὄνον), while the poorer classes use instead the smaller kinds of cattle. On his birthday, the king anointed his head and presented gifts to the Persians. According to the law of the Royal Supper, on that day "no one should be refused a request". The rule for drinking was "No restrictions". In ancient Rome, a birthday (dies natalis) was originally an act of religious cultivation (cultus).
A dies natalis was celebrated annually for a temple on the day of its founding, and the term is still used sometimes for the anniversary of an institution such as a university. The temple founding day might become the "birthday" of the deity housed there. March 1, for example, was celebrated as the birthday of the god Mars. Each human likewise had a natal divinity, the guardian spirit called the Genius, or sometimes the Juno for a woman, who was owed religious devotion on the day of birth, usually in the household shrine (lararium). The decoration of a lararium often shows the Genius in the role of the person carrying out the rites. A person marked their birthday with ritual acts that might include lighting an altar, saying prayers, making vows (vota), anointing and wreathing a statue of the Genius, or sacrificing to a patron deity. Incense, cakes, and wine were common offerings. Celebrating someone else's birthday was a way to show affection, friendship, or respect. In exile, the poet Ovid, though alone, celebrated not only his own birthday rite but that of his far distant wife. Birthday parties affirmed social as well as sacred ties. One of the Vindolanda tablets is an invitation to a birthday party from the wife of one Roman officer to the wife of another. Books were a popular birthday gift, sometimes handcrafted as a luxury edition or composed especially for the person honored. Birthday poems are a minor but distinctive genre of Latin literature. The banquets, libations, and offerings or gifts that were a regular part of most Roman religious observances thus became part of birthday celebrations for individuals. A highly esteemed person would continue to be celebrated on their birthday after death, in addition to the several holidays on the Roman calendar for commemorating the dead collectively. Birthday commemoration was considered so important that money was often bequeathed to a social organization to fund an annual banquet in the deceased's honor. The observance of a patron's birthday or the honoring of a political figure's Genius was one of the religious foundations for imperial cult or so-called "emperor worship." The Chinese word for "year(s) old" (t 歲, s 岁, suì) is entirely different from the usual word for "year(s)" (年, nián), reflecting the former importance of Chinese astrology and the belief that one's fate was bound to the stars imagined to be in opposition to the planet Jupiter at the time of one's birth. The importance of this duodecennial orbital cycle only survives in popular culture as the 12 animals of the Chinese zodiac, which change each Chinese New Year and may be used as a theme for some gifts or decorations. Because of the importance attached to the influence of these stars in ancient China and throughout the Sinosphere, East Asian age reckoning previously began with one at birth and then added years at each Chinese New Year, so that it formed a record of the suì one had lived through rather than of the exact amount of time from one's birth. This method—which can differ by as much as two years of age from other systems—is increasingly uncommon and is not used for official purposes in the PRC or on Taiwan, although the word suì is still used for describing age. Traditionally, Chinese birthdays—when celebrated—were reckoned using the lunisolar calendar, which varies from the Gregorian calendar by as much as a month forward or backward depending on the year. 
Celebrating the lunisolar birthday remains common on Taiwan while growing increasingly uncommon on the mainland. Birthday traditions reflect the culture's deep-seated focus on longevity and wordplay. From the homophony in some dialects between 酒 ("rice wine") and 久 (meaning "long" in the sense of time passing), osmanthus and other rice wines are traditional gifts for birthdays in China. Longevity noodles are another traditional food consumed on the day, although western-style birthday cakes are increasingly common among urban Chinese. Hongbaos—red envelopes stuffed with money, now especially the red 100 RMB notes—are the usual gift from relatives and close family friends for most children. Gifts for adults on their birthdays are much less common, although the birthday for each decade is a larger occasion that might prompt a large dinner and celebration. The Japanese reckoned their birthdays by the Chinese system until the Meiji Reforms. Celebrations remained uncommon or muted until after the American occupation that followed World War II.[citation needed] Children's birthday parties are the most important, typically celebrated with a cake, candles, and singing. Adults often just celebrate with their partner. In North Korea, the Day of the Sun, Kim Il Sung's birthday, is the most important public holiday of the country, and Kim Jong Il's birthday is celebrated as the Day of the Shining Star. North Koreans are not permitted to celebrate birthdays on July 8 and December 17 because these were the dates of the deaths of Kim Il Sung and Kim Jong Il, respectively. More than 100,000 North Koreans celebrate displaced birthdays on July 9 and December 18 instead to avoid these dates. A person born on July 8 before 1994 may change their birthday, with official recognition. South Korea was one of the last countries to use a form of East Asian age reckoning for many official purposes. Prior to June 2023, three systems were used together—"Korean ages" that start with 1 at birth and increase every January 1st with the Gregorian New Year, "year ages" that start with 0 at birth and otherwise increase the same way, and "actual ages" that start with 0 at birth and increase each birthday. The first birthday was heavily celebrated, despite usually having little to do with the child's actual age. In June 2023, all Korean ages were set back at least one year, and official ages henceforth are reckoned only by birthdays. In Ghana, children wake up on their birthday to a special treat called oto, which is a patty made from mashed sweet potato and eggs fried in palm oil. Later they have a birthday party where they usually eat stew and rice and a dish known as kelewele, which is fried plantain chunks. Distribution through the year Birthdays are fairly evenly distributed throughout the year, with some seasonal effects. In the United States, there tend to be more births in September and October. This may be because there is a holiday season nine months before (the human gestation period is about nine months), or because the longest nights of the year also occur in the Northern Hemisphere nine months before. However, the holidays affect birth rates more than the winter: New Zealand, a Southern Hemisphere country, has the same September and October peak with no corresponding peak in March and April. The least common birthdays tend to fall around public holidays, such as Christmas, New Year's Day and fixed-date holidays such as Independence Day in the US, which falls on July 4.
Between 1973 and 1999, September 16 was the most common birthday in the United States, and December 25 was the least common birthday (other than February 29 because of leap years). In 2011, October 5 and 6 were reported as the most frequently occurring birthdays. New Zealand's most common birthday is September 29, and the least common birthday is December 25. The ten most common birthdays all fall within a thirteen-day period, between September 22 and October 4. The ten least common birthdays (other than February 29) are December 24–27, January 1–2, February 6, March 22, April 1, and April 25. This is based on all live births registered in New Zealand between 1980 and 2017. Positive and negative associations with culturally significant dates may influence birth rates. One study shows a 5.3% decrease in spontaneous births and a 16.9% decrease in Caesarean births on Halloween, compared to dates occurring within one week before and one week after the October holiday. In contrast, on Valentine's Day, there is a 3.6% increase in spontaneous births and a 12.1% increase in Caesarean births. In Sweden, 9.3% of the population is born in March and 7.3% in November, whereas a uniform distribution would give about 8.3% per month. In the Gregorian calendar (a common solar calendar), February in a leap year has 29 days instead of the usual 28, so the year lasts 366 days instead of the usual 365. A person born on February 29 may be called a "leapling" or a "leaper". In common years, they usually celebrate their birthdays on February 28. In some situations, March 1 is used as the birthday in a non-leap year since it is the day following February 28. Technically, a leapling will have fewer birthday anniversaries than their age in years. This phenomenon is exploited when a person claims to be only a quarter of their actual age, by counting their leap-year birthday anniversaries only. In Gilbert and Sullivan's 1879 comic opera The Pirates of Penzance, Frederic the pirate apprentice discovers that he is bound to serve the pirates until his 21st birthday rather than until his 21st year. For legal purposes, birthdays depend on how local laws count time intervals. An individual's Beddian birthday, named in tribute to firefighter Bobby Beddia, occurs during the year in which their age matches the last two digits of the year they were born. Some studies show people are more likely to die on their birthdays, with explanations including excessive drinking, suicide, cardiovascular events due to high stress or happiness, efforts to postpone death for major social events, and death certificate paperwork errors. See also References Notes External links
======================================== |