[SOURCE: https://en.wikipedia.org/wiki/Post-industrial_economy]
Post-industrial economy

A post-industrial economy is a period of growth within an industrialized economy or nation in which the relative importance of manufacturing declines and that of services, information, and research grows. Such economies are often marked by a declining manufacturing sector, resulting in deindustrialization, and a large service sector, as well as an increase in the amount of information technology, often leading to an "Information Age"; information, knowledge, and creativity are the new raw materials of such an economy. The manufacturing activity of a post-industrial economy is outsourced to less developed nations, which produce the needed goods at lower cost. This pattern is typical of nations that industrialized in the past, such as the United Kingdom (the first industrialized nation), most of Western Europe, and the United States.
========================================
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-188]
PlayStation (console)

The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced to the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006, over eleven years after it had been released and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. 
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. 
Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. 
At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them with the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted its research, but decided to turn the work it had done with Nintendo and Sega into a console based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to the company's involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. 
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Although the project gained Ohga's enthusiasm, a majority of those present at the meeting remained opposed, as did older Sony executives who saw Nintendo and Sega as "toy" manufacturers. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded Ohga of the humiliation Sony had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM format was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. 
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay. Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. The console was not marketed with Sony's name, in contrast to Nintendo's consoles. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. 
Recent consoles such as the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble their efforts in gaining the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since it rivalled Sega in the arcade market. Attracting these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and despite Namco being a longstanding Nintendo developer, it had already been confirmed behind closed doors by December 1993 that the game would be the PlayStation's first. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of their own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. 
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other systems such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon their plans for a workstation-based development system in favour of SN Systems', thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and a debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought out by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. 
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Sony did not favour their own over non-Sony products, unlike Nintendo; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectivity with PCs was beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should Sony decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted due to the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction. 
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system to be balancing the conflicting goals of high performance, low cost, and being easy to program for, and felt he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important PlayStation had become for Sony when friends and relatives begged for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units were sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. One retailer later recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close; Playstations flew out the door as fast as we could get them in stock." Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. 
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that their Saturn console would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage; Race simply said "$299" and left the stage to a round of applause. The attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million. 
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported an attach rate of four games sold per console. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched in a test market during 1999–2000 across Sony showrooms, selling 100 units. Sony finally launched the console (the PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, for example, the console could not be released officially because a third company had registered the trademark; the market was initially taken over by the officially distributed Sega Saturn, but as the Sega console was withdrawn, PlayStation imports and widespread piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market, the PlayStation grew to a base of 300,000 users by January 2000, even though Sony China had no plans to release it. 
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans stylised with the controller's button shapes in place of certain letters: "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (with a red "E"). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say, "Bullshit. Let me show you how ready I am." As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate PlayStation's emerging identity. 
Sony partnered with prominent nightclubs such as Ministry of Sound, and with festival promoters, to organise dedicated PlayStation areas where select games could be demonstrated and played. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, spurred by their declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry. 
Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell all other consoles in that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering about 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the necessary speed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels and offers a sampling rate of up to 44.1 kHz and music sequencing. 
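As an illustrative aside (a back-of-the-envelope arithmetic check, not a claim made by the article): the CPU's 33.8688 MHz clock is an exact whole-number multiple of the 44.1 kHz CD-audio sampling rate quoted for the sound chip, which is consistent with the console's CD-based design.

```python
# Quick arithmetic check (illustrative): is the PlayStation's CPU clock
# an exact multiple of the CD-audio sampling rate?
cpu_clock_hz = 33_868_800   # 33.8688 MHz, from the specifications above
cd_sample_rate_hz = 44_100  # 44.1 kHz, the sound chip's maximum sampling rate

# The division leaves no remainder: 768 x 44,100 Hz = 33,868,800 Hz.
assert cpu_clock_hz % cd_sample_rate_hz == 0
print(cpu_clock_hz // cd_sample_rate_hz)  # prints 768
```

Whether Sony actually derived the clock from CD timing is not stated in the article; the snippet only verifies the numerical relationship between the two published figures.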
It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, MDEC, which is integrated into the CPU and allows for the presentation of full motion video at a higher quality than other consoles of its generation. Unusual for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. While running, the GPU can generate up to 4,000 sprites and 180,000 texture-mapped polygons per second, in addition to 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model that had an S-Video port, as it was removed from the next model. Subsequent models saw a reduction in the number of parallel ports, with the final version only retaining one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan, and following public interest, was released the next year in other countries. 
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software needed to program PlayStation games and applications using C compilers. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles, including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square. Rather than depicting traditionally used letters or numbers on its buttons, the PlayStation controller established a trademark set of symbols that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no" respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper to be used to access menus.
The European and North American models of the original PlayStation controller are roughly 10% larger than their Japanese counterpart, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used for instances when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom of movement in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons, mapped to clicking in the sticks), the Dual Analog Controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, its analogue sticks feature textured rubber grips, longer handles, slightly different shoulder buttons and has rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but the release was cancelled, despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. 
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without inserting a game or closing the CD tray, thereby accessing a graphical user interface (GUI) for the PlayStation BIOS. The GUI for the PS One and PlayStation differ depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSes on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R discs and optical disc drives with burning capability.
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in the Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so the actual content of PlayStation discs could still be read by a conventional disc drive; however, such a drive could not detect the wobble frequency, so duplicated discs omitted it: the laser pick-up system of any optical disc drive interprets the wobble as an oscillation of the disc surface and compensates for it during reading. Early PlayStations, particularly early 1000 models, exhibit skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out, usually unevenly, due to friction.
The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as an ancestor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, which was the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel, rivalling that of Sega and Nintendo. Famicom Tsūshin scored the console 19 out of 40, lower than the Saturn's 24 out of 40, in May 1995. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5; for all five editors, this was the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities, in addition to Sony revising its stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third-party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year lifespan, the second-most games ever produced for a console. Its success resulted in a significant financial boon for Sony, as profits from its video game division accounted for 23% of the company's operating income. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third-best console, crediting its sophisticated 3D capabilities as one of the key factors in its mass success, and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh-best console in its list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh-best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the proprietary cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared to two to three months. Further, the cost of production per unit was far cheaper, allowing Sony to offer games at about 40% lower cost to the user compared to ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything – the whole PlayStation format – is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
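The pricing arithmetic described above can be sketched with hypothetical figures. All of the dollar amounts below are illustrative assumptions, chosen only so that the result matches the roughly 40% price gap the text mentions; they are not sourced numbers.

```python
# Hypothetical sketch of the cartridge-vs-CD pricing logic described in
# the text. Every dollar figure here is an illustrative assumption.

def retail_price(net_revenue: float, media_cost: float) -> float:
    """Retail price that preserves a fixed net revenue per unit."""
    return net_revenue + media_cost

NET_PER_UNIT = 40.0    # assumed fixed net revenue per unit (hypothetical)
CARTRIDGE_COST = 30.0  # assumed ROM cartridge manufacturing cost
CD_COST = 2.0          # assumed CD-ROM pressing cost

cartridge_price = retail_price(NET_PER_UNIT, CARTRIDGE_COST)  # 70.0
cd_price = retail_price(NET_PER_UNIT, CD_COST)                # 42.0

# The cheaper medium supports a ~40% lower retail price at the same net.
discount = 1 - cd_price / cartridge_price
print(f"{discount:.0%}")  # 40%
```

The same low unit cost is also what made the budget-priced reissues mentioned above economical: once development costs were recouped, nearly all of a cheaper disc's price could go to margin.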
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly. See also Notes References
========================================
[SOURCE: https://en.wikipedia.org/wiki/BBC_News#cite_note-32] | [TOKENS: 8810]
BBC News BBC News is an operational business division of the British Broadcasting Corporation (BBC) responsible for the gathering and broadcasting of news and current affairs in the UK and around the world. The department is the world's largest broadcast news organisation and generates about 120 hours of radio and television output each day, as well as online news coverage. The service has over 5,500 journalists working across its output, including in 50 foreign news bureaus where more than 250 foreign correspondents are stationed. Deborah Turness has been the CEO of news and current affairs since September 2022. In 2019, an Ofcom report stated that the BBC had spent £136m on news during the period April 2018 to March 2019. BBC News' domestic, global and online news divisions are housed within the largest live newsroom in Europe, in Broadcasting House in central London. Parliamentary coverage is produced and broadcast from studios in London. Through BBC English Regions, the BBC also has regional centres across England and national news centres in Northern Ireland, Scotland and Wales. All nations and English regions produce their own local news programmes and other current affairs and sport programmes. The BBC is a quasi-autonomous corporation authorised by royal charter, making it operationally independent of the government. As of 2024, the BBC reaches an average of 450 million people per week, with the BBC World Service accounting for 320 million people. History This is London calling – 2LO calling. Here is the first general news bulletin, copyright by Reuters, Press Association, Exchange Telegraph and Central News. – BBC news programme opening during the 1920s The British Broadcasting Company broadcast its first radio bulletin from radio station 2LO on 14 November 1922.
Wishing to avoid competition, newspaper publishers persuaded the government to ban the BBC from broadcasting news before 7 pm, and to force it to use wire service copy instead of reporting on its own. The BBC gradually gained the right to edit the copy and, in 1934, created its own news operation. However, it could not broadcast news before 6 p.m. until World War II. In addition to news, Gaumont British and Movietone cinema newsreels had been broadcast on the TV service since 1936, with the BBC producing its own equivalent Television Newsreel programme from January 1948. A weekly Children's Newsreel was inaugurated on 23 April 1950, to around 350,000 receivers. The network began simulcasting its radio news on television in 1946, with a still picture of Big Ben. Televised bulletins began on 5 July 1954, broadcast from leased studios within Alexandra Palace in London. The public's interest in television and live events was stimulated by Elizabeth II's coronation in 1953. It is estimated that up to 27 million people viewed the programme in the UK, overtaking radio's audience of 12 million for the first time. Those live pictures were fed from 21 cameras in central London to Alexandra Palace for transmission, and then on to other UK transmitters opened in time for the event. That year, there were around two million TV Licences held in the UK, rising to over three million the following year, and four and a half million by 1955. Television news, although physically separate from its radio counterpart, was still firmly under radio news' control in the 1950s. Correspondents provided reports for both outlets, and the first televised bulletin, shown on 5 July 1954 on the then BBC television service and presented by Richard Baker, involved his providing narration off-screen while stills were shown. This was then followed by the customary Television Newsreel with a recorded commentary by John Snagge (and on other occasions by Andrew Timothy). 
On-screen newsreaders were introduced a year later, in 1955 – Kenneth Kendall (the first to appear in vision), Robert Dougall, and Richard Baker – three weeks before ITN's launch on 21 September 1955. Mainstream television production had started to move out of Alexandra Palace in 1950 to larger premises – mainly at Lime Grove Studios in Shepherd's Bush, west London – taking Current Affairs (then known as Talks Department) with it. It was from here that the first Panorama, a new documentary programme, was transmitted on 11 November 1953, with Richard Dimbleby becoming anchor in 1955. In 1958, Hugh Carleton Greene became head of News and Current Affairs. On 1 January 1960, Greene became Director-General. Greene made changes aimed at making BBC reporting more similar to its competitor ITN, which had been highly rated by study groups held by Greene. A newsroom was created at Alexandra Palace, and television reporters were recruited and given the opportunity to write and voice their own scripts, without having to cover stories for radio too. On 20 June 1960, Nan Winton, the first female BBC network newsreader, appeared in vision. 19 September 1960 saw the start of the radio news and current affairs programme The Ten O'clock News. BBC2 started transmission on 20 April 1964 and began broadcasting a new show, Newsroom. The World at One, a lunchtime news programme, began on 4 October 1965 on the then Home Service, and the year before, News Review had started on television. News Review was a summary of the week's news, first broadcast on Sunday, 26 April 1964 on BBC 2, harking back to the weekly Newsreel Review of the Week, produced from 1951 to open programming on Sunday evenings – the difference being that this incarnation had subtitles for the deaf and hard of hearing.
As this was the decade before electronic caption generation, each superimposition ("super") had to be produced on paper or card, synchronised manually to studio and news footage, committed to tape during the afternoon, and broadcast early evening. Thus Sundays were no longer a quiet day for news at Alexandra Palace. The programme ran until the 1980s – by then using electronic captions, known as Anchor – to be superseded by Ceefax subtitling (a similar Teletext format) and the signing of such programmes as See Hear (from 1981). On Sunday 17 September 1967, The World This Weekend, a weekly news and current affairs programme, launched on what was then the Home Service, but soon to be Radio 4. Preparations for colour began in the autumn of 1967, and on Thursday 7 March 1968 Newsroom on BBC2 moved to an early evening slot, becoming the first UK news programme to be transmitted in colour – from Studio A at Alexandra Palace. News Review and Westminster (the latter a weekly review of Parliamentary happenings) were "colourised" shortly after. However, much of the insert material was still in black and white, as initially only a part of the film coverage shot in and around London was on colour reversal film stock, and all regional and many international contributions were still in black and white. Colour facilities at Alexandra Palace were technically very limited for the next eighteen months, as it had only one RCA colour Quadruplex videotape machine and, eventually, two Pye Plumbicon colour telecines – although the news colour service started with just one. Black and white national bulletins on BBC 1 continued to originate from Studio B on weekdays, along with Town and Around, the London regional "opt out" programme broadcast throughout the 1960s (and the BBC's first regional news programme for the South East), until it started to be replaced by Nationwide on Tuesday to Thursday from Lime Grove Studios early in September 1969.
Town and Around was never to make the move to Television Centre – instead it became London This Week, which aired on Mondays and Fridays only, from the new TVC studios. The BBC moved production out of Alexandra Palace in 1969. BBC Television News resumed operations the next day with a lunchtime bulletin on BBC1 – in black and white – from Television Centre, where it remained until March 2013. This move to a smaller studio with better technical facilities allowed Newsroom and News Review to replace back projection with colour-separation overlay. During the 1960s, satellite communication had become possible; however, it was some years before digital line-store conversion was able to undertake the process seamlessly. On 14 September 1970, the first Nine O'Clock News was broadcast on television. Robert Dougall presented the first week from studio N1 – described by The Guardian as "a sort of polystyrene padded cell" – the bulletin having been moved from the earlier time of 20.50 as a response to the ratings achieved by ITN's News at Ten, introduced three years earlier on the rival ITV. Richard Baker and Kenneth Kendall presented subsequent weeks, thus echoing those first television bulletins of the mid-1950s. Angela Rippon became the first female news presenter of the Nine O'Clock News in 1975. Her work outside the news was controversial at the time, appearing on The Morecambe and Wise Christmas Show in 1976 singing and dancing. The first edition of John Craven's Newsround, initially intended only as a short series and later renamed simply Newsround, came from studio N3 on 4 April 1972. Afternoon television news bulletins during the mid-to-late 1970s were broadcast from the BBC newsroom itself, rather than one of the three news studios. The newsreader would present to camera while sitting on the edge of a desk; behind him, staff would be seen working busily at their desks.
This period coincided with the Nine O'Clock News's next makeover, which used a CSO background of the newsroom from that very same camera each weekday evening. Also in the mid-1970s, the late night news on BBC2 was briefly renamed Newsnight, but this was not to last, nor was it the same programme as the one known today – that would be launched in 1980 – and it soon reverted to being just a news summary, with the early evening BBC2 news expanded to become Newsday. News on radio was to change in the 1970s, and on Radio 4 in particular, brought about by the arrival of new editor Peter Woon from television news and the implementation of the Broadcasting in the Seventies report. The changes included the introduction of correspondents into news bulletins, where previously only a newsreader would present, as well as the inclusion of content gathered in the preparation process. New programmes, PM and The World Tonight, were also added to the daily schedule as part of the plan for the station to become a "wholly speech network". Newsbeat launched as the news service on Radio 1 on 10 September 1973. On 23 September 1974, a teletext system was launched to bring text-only news content to television screens. Engineers had originally begun developing such a system to bring news to deaf viewers, but it was expanded. The Ceefax service became much more diverse before it ceased on 23 October 2012: it not only provided subtitling for all channels, it also gave information such as weather, flight times and film reviews. By the end of the decade, the practice of shooting on film for inserts in news broadcasts was declining, with the introduction of ENG technology into the UK. The equipment would gradually become less cumbersome – the BBC's first attempts, in the latter half of the decade, had used a Philips colour camera with a backpack base station and a separate portable Sony U-matic recorder.
In 1980, the Iranian Embassy Siege had been shot electronically by the BBC Television News outside broadcast team, and the work of reporter Kate Adie, broadcasting live from Prince's Gate, was nominated for the BAFTA for actuality coverage, but was beaten by ITN for the 1980 award. Newsnight, the news and current affairs programme, was due to go on air on 23 January 1980, although trade union disagreements meant that its launch from Lime Grove was postponed by a week. On 27 August 1981, Moira Stuart became the first African-Caribbean female newsreader to appear on British television. By 1982, ENG technology had become sufficiently reliable for Bernard Hesketh to use an Ikegami camera to cover the Falklands War, coverage for which he won the Royal Television Society's "Cameraman of the Year" award and a BAFTA nomination – the first time that BBC News had relied upon an electronic camera, rather than film, in a conflict zone. BBC News won the BAFTA for its actuality coverage; however, the event is remembered in television terms for Brian Hanrahan's reporting, in which he coined the phrase "I'm not allowed to say how many planes joined the raid, but I counted them all out and I counted them all back" to circumvent reporting restrictions – since cited as an example of good reporting under pressure. The first BBC breakfast television programme, Breakfast Time, also launched during the 1980s, on 17 January 1983 from Lime Grove Studio E, two weeks before its ITV rival TV-am. Frank Bough, Selina Scott, and Nick Ross helped to wake viewers with a relaxed style of presenting. The Six O'Clock News first aired on 3 September 1984, eventually becoming the most watched news programme in the UK (though since 2006 it has been overtaken by the BBC News at Ten). In October 1984, images of millions of people starving to death in the Ethiopian famine were shown in Michael Buerk's Six O'Clock News reports.
The BBC News crew were the first to document the famine, with Buerk's report on 23 October describing it as "a biblical famine in the 20th century" and "the closest thing to hell on Earth". The BBC News report shocked Britain, motivating its citizens to inundate relief agencies, such as Save the Children, with donations, and brought global attention to the crisis in Ethiopia. The news report was also watched by Bob Geldof, who would organise the charity single "Do They Know It's Christmas?" to raise money for famine relief, followed by the Live Aid concert in July 1985. Starting in 1981, the BBC gave a common theme to its main news bulletins with new electronic titles – a set of computer-animated "stripes" forming a circle on a red background with a "BBC News" typescript appearing below the circle graphics, and a theme tune consisting of brass and keyboards. The Nine used a similar (striped) number 9. The red background was replaced by a blue one from 1985 until 1987. By 1987, the BBC had decided to re-brand its bulletins and established individual styles again for each one, with differing titles and music; the weekend and holiday bulletins were branded in a similar style to the Nine, although the "stripes" introduction continued to be used until 1989 on occasions where a news bulletin was screened outside the running order of the schedule. In 1987, John Birt resurrected the practice of correspondents working for both TV and radio with the introduction of bi-media journalism. During the 1990s, a wider range of services began to be offered by BBC News, with the split of BBC World Service Television into BBC World (news and current affairs) and BBC Prime (light entertainment). Content for a 24-hour news channel was thus required, followed in 1997 by the launch of the domestic equivalent, BBC News 24. Rather than set bulletins, ongoing reports and coverage were needed to keep both channels functioning, which meant a greater emphasis on budgeting for both was necessary.
In 1998, after 66 years at Broadcasting House, the BBC Radio News operation moved to BBC Television Centre. New technology, provided by Silicon Graphics, came into use in 1993 for a re-launch of the main BBC1 bulletins, creating a virtual set which appeared to be much larger than it was physically. The relaunch also brought all bulletins into the same style of set, with only small changes in colouring, titles, and music to differentiate each. A computer-generated cut-glass sculpture of the BBC coat of arms was the centrepiece of the programme titles until the large-scale corporate rebranding of news services in 1999. In November 1997, BBC News Online was launched, following individual webpages for major news events such as the 1996 Olympic Games, the 1997 general election, and the death of Princess Diana. In 1999, the biggest relaunch occurred, with BBC One bulletins, BBC World, BBC News 24, and BBC News Online all adopting a common style. One of the most significant changes was the gradual adoption of the corporate image by the BBC regional news programmes, giving a common style across local, national and international BBC television news. This also included Newyddion, the main news programme of the Welsh-language channel S4C, produced by BBC News Wales. Following the relaunch of BBC News in 1999, regional headlines were included at the start of the BBC One news bulletins in 2000. The English regions did, however, lose five minutes at the end of their bulletins, due to a new headline round-up at 18:55. 2000 also saw the Nine O'Clock News moved to the later time of 22:00. This was in response to ITN, which had just moved its popular News at Ten programme to 23:00. ITN briefly brought back News at Ten, but following poor ratings when head-to-head against the BBC's Ten O'Clock News, the ITN bulletin was moved to 22:30, where it remained until 14 January 2008.
The departure of Peter Sissons and Michael Buerk from the Ten O'Clock News led to changes in the BBC One bulletin presenting team on 20 January 2003. The Six O'Clock News became double-headed with George Alagiah and Sophie Raworth after Huw Edwards and Fiona Bruce moved to present the Ten. A new set design featuring a projected fictional newsroom backdrop was introduced, followed on 16 February 2004 by new programme titles to match those of BBC News 24. BBC News 24 and BBC World introduced a new style of presentation in December 2003, which was slightly altered on 5 July 2004 to mark 50 years of BBC Television News. On 7 March 2005, director-general Mark Thompson launched the "Creative Futures" project to restructure the organisation. The individual positions of editor of the One and Six O'Clock News were replaced by a new daytime position in November 2005. Kevin Bakhurst became the first Controller of BBC News 24, replacing the position of editor. Amanda Farnsworth became daytime editor, while Craig Oliver was later named editor of the Ten O'Clock News. Bulletins received new titles and a new set design in May 2006, to allow Breakfast to move into the main studio for the first time since 1997. The new set featured Barco videowall screens, with a background of the London skyline used for main bulletins and originally an image of cirrus clouds against a blue sky for Breakfast; the latter was replaced following viewer criticism. The studio bore similarities to that of the ITN-produced ITV News in 2004, though ITN uses a CSO virtual studio rather than the actual screens at BBC News. BBC News became part of a new BBC Journalism group in November 2006 as part of a restructuring of the BBC. The then-Director of BBC News, Helen Boaden, reported to the then-Deputy Director-General and head of the journalism group, Mark Byford, until he was made redundant in 2010.
On 18 October 2007, Director-General Mark Thompson announced a six-year plan, "Delivering Creative Futures" (based on his project begun in March 2005), merging the television current affairs department into a new "News Programmes" division. Thompson's announcement, in response to a £2 billion shortfall in funding, would, he said, deliver "a smaller but fitter BBC" in the digital age, by cutting its payroll and, in 2013, selling Television Centre. The various separate newsrooms for television, radio and online operations were merged into a single multimedia newsroom. Programme making within the newsrooms was brought together to form a multimedia programme making department. BBC World Service director Peter Horrocks said that the changes would achieve efficiency at a time of cost-cutting at the BBC. In his blog, he wrote that using the same resources across the various broadcast media meant that either fewer stories could be covered or, if more stories were followed, there would be fewer ways to broadcast them. A new graphics and video playout system was introduced for the production of television bulletins in January 2007. This coincided with a new structure for BBC World News bulletins, with editors favouring a section devoted to analysing the news stories reported on. The first new BBC News bulletin since the Six O'Clock News was announced in July 2007, following a successful trial in the Midlands. The summary, lasting 90 seconds, has been broadcast at 20:00 on weekdays since December 2007 and bears similarities with 60 Seconds on BBC Three, but also includes headlines from the various BBC regions and a weather summary. As part of a long-term cost-cutting programme, the bulletins were renamed the BBC News at One, Six and Ten respectively in April 2008, while BBC News 24 was renamed BBC News and moved into the same studio as the bulletins at BBC Television Centre. BBC World was renamed BBC World News, and regional news programmes were also updated with the new presentation style, designed by Lambie-Nairn.
2008 also saw tri-media introduced across TV, radio, and online. The studio moves also meant that Studio N9, previously used for BBC World, was closed, and operations moved to the previous studio of BBC News 24. Studio N9 was later refitted to match the new branding, and was used for the BBC's UK local elections and European elections coverage in early June 2009. A strategy review of the BBC in March 2010 confirmed that having "the best journalism in the world" would form one of five key editorial policies, as part of changes subject to public consultation and BBC Trust approval. After a period of suspension in late 2012, Helen Boaden ceased to be the Director of BBC News. On 16 April 2013, incoming BBC Director-General Tony Hall named James Harding, a former editor of The Times of London, as Director of News and Current Affairs. From August 2012 to March 2013, all news operations moved from Television Centre to new facilities in the refurbished and extended Broadcasting House, in Portland Place. The move began in October 2012, and also included the BBC World Service, which moved from Bush House following the expiry of the BBC's lease. This new extension to the north and east, referred to as "New Broadcasting House", includes several new state-of-the-art radio and television studios centred around an 11-storey atrium. The move began with the domestic programme The Andrew Marr Show on 2 September 2012, and concluded with the move of the BBC News channel and domestic news bulletins on 18 March 2013. The newsroom houses all domestic bulletins and programmes on both television and radio, as well as the BBC World Service international radio networks and the BBC World News international television channel. BBC News and CBS News established an editorial and newsgathering partnership in 2017, replacing an earlier long-standing partnership between BBC News and ABC News.
In an October 2018 Simmons Research survey of 38 news organisations, BBC News was ranked the fourth most trusted news organisation by Americans, behind CBS News, ABC News and The Wall Street Journal. In January 2020, the BBC announced a BBC News savings target of £80 million per year by 2022, involving about 450 staff reductions from the then-current 6,000. BBC director of news and current affairs Fran Unsworth said there would be further moves toward digital broadcasting, in part to attract back a youth audience, and more pooling of reporters to stop separate teams covering the same news. A further 70 staff reductions were announced in July 2020. BBC Three began airing the news programme The Catch Up in February 2022. It is presented by Levi Jouavel, Kirsty Grant, and Callum Tulley, and aims to help the channel's target audience (16 to 34-year-olds) make sense of the world around them while also highlighting optimistic stories. Compared to its predecessor 60 Seconds, The Catch Up is three times longer, running for about three minutes, and does not air at weekends. According to its annual report, as of December 2021 India has the largest number of people using BBC services in the world. In May 2025, following the earthquake that hit Myanmar and Thailand, the Burmese service began broadcasting a television news bulletin (BBC News Myanmar) on a vacated Voice of America satellite frequency.
Programming and reporting In November 2023, BBC News joined with the International Consortium of Investigative Journalists, Paper Trail Media and 69 media partners, including Distributed Denial of Secrets and the Organised Crime and Corruption Reporting Project (OCCRP), and more than 270 journalists in 55 countries and territories to produce the 'Cyprus Confidential' report on the financial network which supports the regime of Vladimir Putin, mostly with connections to Cyprus. The report showed Cyprus to have strong links with high-up figures in the Kremlin, some of whom have been sanctioned. Government officials, including Cyprus president Nikos Christodoulides and European lawmakers, began responding to the investigation's findings in less than 24 hours, calling for reforms and launching probes. BBC News is responsible for the news programmes and documentary content on the BBC's general television channels, as well as the news coverage on the BBC News Channel in the UK, and 22 hours of programming for the corporation's international BBC World News channel. Coverage for BBC Parliament is carried out on behalf of the BBC at Millbank Studios, though BBC News provides editorial and journalistic content. BBC News content is also output onto the BBC's digital interactive television services under the BBC Red Button brand, and until 2012, on the Ceefax teletext system. The music on all BBC television news programmes was composed by David Lowe and introduced as part of the re-branding which commenced in 1999; it features the 'BBC Pips'. The general theme was used on bulletins on BBC One, News 24, BBC World and local news programmes in the BBC's Nations and Regions. Lowe was also responsible for the music on Radio 1's Newsbeat. The theme has had several changes since 1999, the latest in March 2013.
The BBC Arabic Television news channel launched on 11 March 2008, and a Persian-language channel followed on 14 January 2009, broadcasting from the Peel wing of Broadcasting House; both include news, analysis, interviews, sport and cultural programmes, and are run by the BBC World Service and funded from a grant-in-aid from the British Foreign Office (and not the television licence). The BBC Verify service was launched in 2023 to fact-check news stories, followed by BBC Verify Live in 2025. BBC Radio News produces bulletins for the BBC's national radio stations and provides content for local BBC radio stations via the General News Service (GNS), a BBC-internal news distribution service. BBC News does not produce the BBC's regional news bulletins, which are produced individually by the BBC nations and regions themselves. The BBC World Service broadcasts to some 150 million people in English and 27 other languages across the globe. BBC Radio News is a patron of the Radio Academy. BBC News Online is the BBC's news website. Launched in November 1997, it is one of the most popular news websites, with 1.2 billion website visits in April 2021, as well as being used by 60% of the UK's internet users for news. The website contains international news coverage as well as entertainment, sport, science, and political news. Mobile apps for Android, iOS and Windows Phone systems have been provided since 2010. Many television and radio programmes are also available to view on the BBC iPlayer and BBC Sounds services. The BBC News channel is also available to view 24 hours a day, while video and radio clips are also available within online news articles. In October 2019, BBC News Online launched a mirror on the dark web anonymity network Tor in an effort to circumvent censorship. Criticism The BBC is required by its charter to be free from both political and commercial influence and answers only to its viewers and listeners. This political objectivity is sometimes questioned.
For instance, The Daily Telegraph (3 August 2005) carried a letter from the KGB defector Oleg Gordievsky referring to it as "The Red Service". Books have been written on the subject, including anti-BBC works such as Truth Betrayed by W. J. West and The Truth Twisters by Richard Deacon. The BBC has been accused of bias by Conservative MPs. The BBC's Editorial Guidelines on Politics and Public Policy state that while "the voices and opinions of opposition parties must be routinely aired and challenged", "the government of the day will often be the primary source of news". The BBC is regularly accused by the government of the day of bias in favour of the opposition and, by the opposition, of bias in favour of the government. Similarly, during times of war, the BBC is often accused by the UK government, or by strong supporters of British military campaigns, of being overly sympathetic to the view of the enemy. An edition of Newsnight at the start of the Falklands War in 1982 was described as "almost treasonable" by John Page, MP, who objected to Peter Snow saying "if we believe the British". During the first Gulf War, critics of the BBC took to using the satirical name "Baghdad Broadcasting Corporation". During the Kosovo War, the BBC was labelled the "Belgrade Broadcasting Corporation" (suggesting favouritism towards the FR Yugoslavia government over ethnic Albanian rebels) by British ministers, although Slobodan Milošević (then FRY president) claimed that the BBC's coverage had been biased against his nation. Conversely, some of those who style themselves anti-establishment in the United Kingdom, or who oppose foreign wars, have accused the BBC of pro-establishment bias or of refusing to give an outlet to "anti-war" voices.
Following the 2003 invasion of Iraq, a study by the Cardiff University School of Journalism of the reporting of the war found that nine out of 10 references to weapons of mass destruction during the war assumed that Iraq possessed them, and only one in 10 questioned this assumption. It also found that, of the main British broadcasters covering the war, the BBC was the most likely to use the British government and military as its source. It was also the least likely to use independent sources, such as the Red Cross, which were more critical of the war. When it came to reporting Iraqi casualties, the study found fewer reports on the BBC than on the other three main channels. The report's author, Justin Lewis, wrote: "Far from revealing an anti-war BBC, our findings tend to give credence to those who criticised the BBC for being too sympathetic to the government in its war coverage. Either way, it is clear that the accusation of BBC anti-war bias fails to stand up to any serious or sustained analysis." Prominent BBC appointments are constantly assessed by the British media and political establishment for signs of political bias. The appointment of Greg Dyke as Director-General was highlighted by press sources because Dyke was a Labour Party member and former activist, as well as a friend of Tony Blair. The BBC's former political editor Nick Robinson was, some years before his appointment, a chairman of the Young Conservatives, and as a result attracted informal criticism from the former Labour government; his predecessor Andrew Marr faced similar claims from the right because he had been editor of The Independent, a liberal-leaning newspaper, before his appointment in 2000. Mark Thompson, former Director-General of the BBC, admitted the organisation had been biased "towards the left" in the past. He said, "In the BBC I joined 30 years ago, there was, in much of current affairs, in terms of people's personal politics, which were quite vocal, a massive bias to the left".
He then added, "The organization did struggle then with impartiality. Now it is a completely different generation. There is much less overt tribalism among the young journalists who work for the BBC." Following the EU referendum in 2016, some critics suggested that the BBC was biased in favour of leaving the EU. For instance, in 2018, the BBC received complaints from people who took issue with the BBC not sufficiently covering anti-Brexit marches while giving smaller-scale events hosted by former UKIP leader Nigel Farage more airtime. On the other hand, a poll released by YouGov showed that 45% of people who voted to leave the EU thought that the BBC was "actively anti-Brexit", compared to 13% of the same kinds of voters who thought the BBC was pro-Brexit. In 2008, BBC Hindi was criticised by some Indian outlets for referring to the terrorists who carried out the 2008 Mumbai attacks as "gunmen". The response to this added to prior criticism from some Indian commentators suggesting that the BBC may have an Indophobic bias. In March 2015, the BBC was criticised for a BBC Storyville documentary interviewing one of the rapists convicted of the 2012 Delhi gang rape. In spite of a ban ordered by the Indian High Court, the BBC still aired the documentary, "India's Daughter", outside India. BBC News was at the centre of a political controversy following the 2003 invasion of Iraq. Three BBC News reports (Andrew Gilligan's on Today, Gavin Hewitt's on The Ten O'Clock News and another on Newsnight) quoted an anonymous source who stated that the British government (particularly the Prime Minister's office) had embellished the September Dossier with misleading exaggerations of Iraq's weapons of mass destruction capabilities. The government denounced the reports and accused the corporation of poor journalism. In subsequent weeks, the corporation stood by the reports, saying that it had a reliable source.
Following intense media speculation, David Kelly was named in the press as the source for Gilligan's story on 9 July 2003. Kelly was found dead, by suicide, in a field close to his home early on 18 July. An inquiry led by Lord Hutton was announced by the British government the following day to investigate the circumstances leading to Kelly's death; it concluded that "Dr. Kelly took his own life." In his report on 28 January 2004, Lord Hutton also concluded that Gilligan's original accusation was "unfounded" and the BBC's editorial and management processes were "defective". In particular, it specifically criticised the chain of management that caused the BBC to defend its story. The BBC Director of News, Richard Sambrook, the report said, had accepted Gilligan's word that his story was accurate in spite of his notes being incomplete. Davies had then told the BBC Board of Governors that he was happy with the story, and told the Prime Minister that a satisfactory internal inquiry had taken place. The Board of Governors, under the guidance of its chairman, Gavyn Davies, accepted that further investigation of the Government's complaints was unnecessary. Because of the criticism in the Hutton report, Davies resigned on the day of publication. BBC News faced an important test, reporting on itself with the publication of the report, but by common consent (of the Board of Governors) managed this "independently, impartially and honestly". Davies' resignation was followed by that of the Director-General, Greg Dyke, the following day, and the resignation of Gilligan on 30 January. While this was undoubtedly a traumatic experience for the corporation, an ICM poll in April 2003 indicated that it had sustained its position as the best and most trusted provider of news. The BBC has faced accusations of holding both anti-Israel and anti-Palestine bias.
Douglas Davis, the London correspondent of The Jerusalem Post, has described the BBC's coverage of the Arab–Israeli conflict as "a relentless, one-dimensional portrayal of Israel as a demonic, criminal state and Israelis as brutal oppressors [which] bears all the hallmarks of a concerted campaign of vilification that, wittingly or not, has the effect of delegitimising the Jewish state and pumping oxygen into a dark old European hatred that dared not speak its name for the past half-century". However, two large independent studies, one conducted by Loughborough University and the other by Glasgow University's Media Group, concluded that Israeli perspectives are given greater coverage. Critics of the BBC argue that the Balen Report proves systematic bias against Israel in headline news programming. The Daily Mail and The Daily Telegraph criticised the BBC for spending hundreds of thousands of British taxpayers' pounds on preventing the report from being released to the public. Jeremy Bowen, the Middle East editor for BBC News, was singled out specifically for bias by the BBC Trust, which concluded that he violated "BBC guidelines on accuracy and impartiality". An independent panel appointed by the BBC Trust was set up in 2006 to review the impartiality of the BBC's coverage of the Israeli–Palestinian conflict. The panel's assessment was that "apart from individual lapses, there was little to suggest deliberate or systematic bias." While noting a "commitment to be fair accurate and impartial" and praising much of the BBC's coverage, the independent panel concluded "that BBC output does not consistently give a full and fair account of the conflict. In some ways the picture is incomplete and, in that sense, misleading." It noted that "the failure to convey adequately the disparity in the Israeli and Palestinian experience, [reflects] the fact that one side is in control and the other lives under occupation".
Writing in the Financial Times, Philip Stephens, one of the panellists, later accused the BBC's director-general, Mark Thompson, of misrepresenting the panel's conclusions. He further opined: "My sense is that BBC news reporting has also lost a once iron-clad commitment to objectivity and a necessary respect for the democratic process. If I am right, the BBC, too, is lost". Mark Thompson published a rebuttal in the FT the next day. The description by one BBC correspondent reporting on the funeral of Yasser Arafat that she had been left with tears in her eyes led to other questions of impartiality, particularly from Martin Walker in a guest opinion piece in The Times, who picked out the apparent case of Fayad Abu Shamala, the BBC Arabic Service correspondent, who told a Hamas rally on 6 May 2001 that journalists in Gaza were "waging the campaign shoulder to shoulder together with the Palestinian people". Walker argued that the independent inquiry was flawed for two reasons. Firstly, the time period over which it was conducted (August 2005 to January 2006) surrounded the Israeli withdrawal from Gaza and Ariel Sharon's stroke, which produced more positive coverage than usual. Furthermore, he wrote, the inquiry only looked at the BBC's domestic coverage, and excluded output on the BBC World Service and BBC World. Tom Gross accused the BBC of glorifying Hamas suicide bombers, and condemned its policy of inviting guests such as Jenny Tonge and Tom Paulin, who have compared Israeli soldiers to Nazis. Writing for the BBC, Paulin said Israeli soldiers should be "shot dead" like Hitler's SS, and said he could "understand how suicide bombers feel". The BBC also faced criticism for not airing a Disasters Emergency Committee aid appeal for Palestinians who suffered in Gaza during the 22-day war there between late 2008 and early 2009. Most other major UK broadcasters did air this appeal, but rival Sky News did not.
British journalist Julie Burchill has accused the BBC of creating a "climate of fear" for British Jews over its "excessive coverage" of Israel compared to other nations. In light of the Gaza war, the BBC suspended seven Arab journalists over allegations of expressing support for Hamas via social media. BBC and ABC shared video segments and reporters as needed in producing their newscasts, with the BBC showing ABC World News Tonight with David Muir in the UK. In July 2017, however, the BBC announced a new partnership with CBS News that allows both organisations to share video, editorial content, and additional newsgathering resources in New York, London, Washington and around the world. BBC News subscribes to wire services from leading international agencies including PA Media (formerly the Press Association), Reuters, and Agence France-Presse. In April 2017, the BBC dropped the Associated Press in favour of an enhanced service from AFP. BBC News reporters and broadcasts are now, and have in the past been, banned in several countries, primarily for reporting which has been unfavourable to the ruling government. For example, correspondents were banned by the former apartheid regime of South Africa. The BBC was banned in Zimbabwe under Mugabe for eight years as a terrorist organisation, until being allowed to operate again over a year after the 2008 elections. The BBC was banned in Burma (officially Myanmar) after its coverage and commentary on anti-government protests there in September 2007. The ban was lifted four years later, in September 2011. Other cases have included Uzbekistan, China, and Pakistan. BBC Persian, the BBC's Persian-language news site, was blocked from the Iranian internet in 2006. The BBC News website was made available in China again in March 2008, but as of October 2014 was blocked again.
In June 2015, the Rwandan government placed an indefinite ban on BBC broadcasts following the airing of a controversial documentary regarding the 1994 Rwandan genocide, Rwanda's Untold Story, broadcast on BBC2 on 1 October 2014. The UK's Foreign Office recognised "the hurt caused in Rwanda by some parts of the documentary". In February 2017, reporters from the BBC (as well as the Daily Mail, The New York Times, Politico, CNN, and others) were denied access to a United States White House briefing. In 2017, BBC India was banned for a period of five years from covering all national parks and sanctuaries in India. Following the withdrawal of CGTN's UK broadcasting licence on 4 February 2021 by Ofcom, China banned BBC News from airing in China. See also References External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/HTC_First] | [TOKENS: 1512]
Contents HTC First The HTC First is an Android smartphone released by HTC on April 12, 2013. It was unveiled on April 4, 2013, as part of a press event held by Facebook. Serving as a successor to a pair of Facebook-oriented devices HTC released in 2011, it was the first and only Android device to be pre-loaded with Facebook's own user interface layer, Facebook Home, in lieu of HTC's own Sense. While critics considered it compelling for a mid-range phone due to its display quality and its optional use of stock Android beneath the default Facebook Home overlay, the HTC First was panned for its poor camera and lack of removable storage, and was also hurt by the similarly underwhelming reception of the Facebook Home software. AT&T, the exclusive U.S. carrier of the First, reportedly sold only around 15,000 units of the device, and both ReadWrite and Time named it among the biggest failures in the technology industry for 2013. Development In 2011, HTC released two low-end smartphones that provided integration with the social networking service Facebook: the keyboard-equipped HTC Status and the larger slate HTC Salsa. The two phones featured Facebook's apps pre-loaded, along with Facebook integration within the HTC Sense interface and a dedicated Facebook key that could be used to provide quick access to sharing functions. Facebook founder Mark Zuckerberg endorsed the two devices in a pre-taped statement during their unveiling, and promised the possibility of more "Facebook phones" in the near future. Later that year, details began surfacing about a collaboration between Facebook and HTC known as "Buffy" (after the television series Buffy the Vampire Slayer), a fork of Android that would be "deeply social". 
The phone's specifications were first leaked by an HTC insider towards the end of 2012, who claimed that the device was then called "Opera UL" and featured a 1280x720 screen and a 1.4 GHz Snapdragon system on chip with an Adreno 305 graphics processor, details that were later found to match the actual specifications of the final device. In early 2013, reports indicated that HTC was preparing to unveil a Facebook-oriented smartphone with the revised codename "Myst"; this mid-range device was reportedly pre-loaded with a new Facebook-developed user interface layer known as "Facebook Home". Renderings of the new device leaked on April 3, 2013, revealing the design of the phone and its official name, the HTC First. Both Facebook Home and the HTC First were unveiled at a Facebook news conference held the following day on April 4, 2013. AT&T exclusively released the device in the United States on April 12, 2013. Specifications The HTC First is a mid-range smartphone which uses a dual-core, 1.4 GHz Qualcomm Snapdragon 400 processor with support for LTE, a 2000 mAh battery, 1 GB of RAM and 16 GB of non-expandable storage. The First uses a 4.3-inch 720p Super LCD display, and includes a 5-megapixel rear-facing camera and a 1.6-megapixel front-facing camera. The HTC First's exterior uses a rounded, minimal design with three capacitive buttons below its screen, and was available in black, light blue, red, and white color finishes. The HTC First runs Android 4.1.2 "Jelly Bean" and Facebook Home, a new interface layer developed by Facebook that heavily integrates with the service. Facebook Home consists of a home screen and lock screen replacement known as Cover Feed (which aggregates content posted by friends on Facebook along with notifications from other apps), the ability to message users (via either Facebook or SMS) from any app using the "Chat Heads" overlay, and an overall experience that is oriented towards social interaction. 
The HTC First was also the first smartphone to include the recently acquired Instagram as a pre-loaded app. Although support for Facebook Home is not limited to the HTC First (it was also released for select HTC and Samsung models, and the Chat Heads feature was added to the standalone Facebook Messenger app), integration with certain system functions (such as the ability to display non-Facebook notifications on the lock screen) is exclusive to the First due to technical limitations. If Facebook Home is disabled, the device reverts to a stock Android 4.1 experience; the First was the first HTC device since the T-Mobile G2 to offer a stock Android interface and not ship with the company's HTC Sense software. Reception The HTC First was released to mixed reviews. Dieter Bohn of The Verge gave the HTC First a 7.9 out of 10, awarding high marks in most categories except its camera. Its design was considered to have a comfortable size and shape by contrast to larger flagship Android phones, and a display that was noted for its high resolution, good color reproduction and "ridiculous" viewing angles (but still being hard to use in direct sunlight). The First's camera was considered to be better than expected for a low-end phone, but produced "muddy" photos and "[felt] like a throwback to an earlier age when smartphones were nigh-useless in the dark." Facebook Home's functionality was considered to be good for casual users, but the ability to switch back to a stock Android 4.1 interface was considered "stunning" and a good compromise for the lack of Nexus devices with LTE support at the time. 
Alex Roth of TechRadar gave the HTC First a 3.5 out of 5, praising its build quality and operating system (feeling that the First could "develop a sort of second life as the mid-range of choice among Google geeks" due to its use of stock Android and LTE support) and considering it to possibly be "the last decent dual-core handset ever made", but considered Facebook Home to be a "glorified screensaver", and also criticized the camera's low quality and the lack of a dedicated shutter button (which Roth believed would have made sense on a Facebook-oriented phone). On May 13, 2013, reports surfaced that AT&T had sold only 15,000 units of the First since its launch, and was planning to discontinue the device in response to the poor reception of Facebook Home from both users and AT&T's sales representatives. The reports came shortly after AT&T had lowered the First's price from $99.99 to $0.99 on a two-year contract as a promotion; however, AT&T denied any connection between the promotion and a possible discontinuation of the device. In response to the issues, the release of the First on the British carriers EE and Orange was indefinitely delayed so Facebook could focus on making improvements to the Home software. In December 2013, Time named the HTC First as one of the 47 "lamest moments in tech" for 2013, and ReadWrite similarly named it one of the "Top 10 Tech Failures" of 2013, stating that "like Carrie Underwood in the remade Sound of Music Live!, the HTC First smartphone started out as an intriguing concept that attempted to shoehorn something very popular (Facebook) into a familiar vehicle (a smartphone). And like that live television event, it wound up being an undeniable disaster." References External links
========================================
[SOURCE: https://www.mako.co.il/hix-special/Article-b4a84e783867c91026.htm] | [TOKENS: 8689]
The fiancé called the emergency line – but suspicion turned toward him: what became of Katelyn Markham? She was 21, engaged, and weeks away from finishing her studies when she disappeared from her home in Ohio. For years no one knew whether she had run away, been abducted, or been murdered. Finally, when her remains were found by a remote creek, investigators began turning the spotlight on the man who was supposed to be closest to her. Gin Wright | Hix | Published: 20.02.26, 06:00 | Updated: 20.02.26, 18:02 | Photo: from social media, in accordance with Section 27A of the law
"My fiancée is missing. I can't find her anywhere." That is how John Carter sounded on the emergency call of 14 August 2011, when he reported the disappearance of Katelyn Markham. He sounded pressured, almost confused, but from that moment the investigation turned against him. When officers arrived at Markham's apartment in Fairfield, Ohio, they encountered a highly suspicious scene. Her car was parked outside, and in the bedroom lay her purse with her driver's license and $300 in cash. There were no signs of forced entry or of an obvious struggle. Even her dog, Murphy, was there. It was as if she had stepped out for a moment and intended to return. Every sign indicated that she had not disappeared voluntarily. In fact, the only item missing was her mobile phone. Something about the whole picture did not add up.
Katelyn, a student at the Art Institute of Cincinnati, was known as a creative, optimistic young woman. She had been adopted as a baby by Dave and Sheri Markham, who described her as a child full of light. In high school she met John, and their relationship developed quickly. "They fell in love so fast," said John's stepsister, Megan Kuntz. After five years together, in August 2010, he got down on one knee and proposed. "He gave her a ring, and she thought it was the most beautiful thing in the world," Megan recalled. In August 2011, a few weeks before her graduation and before the couple planned to move together to Colorado, everything stopped. On the night before the disappearance the two spent time with a mutual friend named Brad Von Bargen. Later, John went out to meet other friends at around 23:30, while Katelyn stayed home.
The last message Katelyn sent was to John on 13 August at 23:36. Shortly after midnight, her phone went dead. According to John's account, he returned to his mother's house at around 1:30 a.m., watched television, and sent her a "good morning" message at four in the morning. When he woke at noon and got no reply, he grew anxious. John drove to her house, saw the car outside and understood that something was wrong. Shortly afterwards he called the emergency line. During his questioning, officers noticed scratches on the left side of his neck. John claimed they were shaving injuries. Paul Newton, a Butler County investigator, was not convinced. "That's not a usual place for a man to shave," he said. "It looked as if the marks were made by someone else's hands."
Despite the suspicion, there was no evidence tying him directly to the disappearance. The possibility that Katelyn had left of her own accord still hung in the air. But time passed, and the young woman was not found. On 7 April 2013, almost two years after her disappearance, the remains of a body were found near a creek in Franklin County, Indiana. "There was a plastic bag over the skull," said Vance Patton, an Indiana police detective. The remains were those of a young person, but could not be identified immediately. Shortly afterwards, dental records provided the answer: the remains were Katelyn Markham's. "I heard a knock at the door," said her father, Dave. "I opened it and there stood two police officers and a chaplain. They didn't need to say a word. I knew. My heart sank and I began to weep." The student's death was ruled a homicide, but the exact cause of death was never determined. The discovery shifted the investigation into another gear. It soon emerged that John's family had a farm in the same Indiana county where the remains were found. The distance no longer looked so coincidental.
John continued to deny any involvement in the affair. "I don't know how she got there. I try not to think about it," he told a detective. In the absence of conclusive evidence, the case remained open for years. The Butler County prosecutor's office reopened the case in 2020. Investigator Newton located a witness who claimed that a day before the disappearance she had seen Katelyn and John arguing at a festival, and had the impression they were about to break up. In parallel, two teenagers were found who said they had seen John's car and another vehicle stopping near Markham's house at around 1:30 a.m. According to their testimony, the cars left after a few minutes. John claimed the teenagers were lying and repeated his account that he had been at his mother's house watching a series on the computer. The computer logs did show that the episodes had been played. But the investigators discovered an anomalous detail.
The next morning, a check of his browser showed, John had googled summaries of the very episodes he had supposedly watched. To investigator Newton, it looked like an attempt to fill in details he had never actually seen. The theory that took shape was chilling. According to the suspicion, John killed Katelyn earlier that night, went out to establish an alibi, started the computer at his mother's house, and then returned to dispose of the body with the help of an unknown accomplice. It was all circumstantial, but the picture began to come together. In March 2023, nearly a decade after the murder, John was arrested. The case remained largely circumstantial, and the fear of a prolonged trial without a clear verdict hung over the family. In the end, the prosecutors agreed to a plea deal. John pleaded guilty to manslaughter only and was sentenced to three years in prison. To many, it was too light a punishment. For Dave Markham, it was a painful compromise. "Three years won't be enough for what he did and took from me," the father said. "I could have gone to trial, and I believe he would have been found guilty. But at this point I just wanted him to admit it, to sit in prison, to suffer a bit."
Katelyn Markham was supposed to finish her studies, move to a new state, and begin another life. Instead, her life was cut short on one August night. The man who called to report her disappearance became, years later, the man who admitted responsibility for her death.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_ref-150] | [TOKENS: 9291]
Contents Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. 
The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. 
History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. 
In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988โ€“89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. 
Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use; one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. 
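The HTTP 0.9 mentioned above was strikingly minimal compared with later versions: the client sent a single GET line and the server replied with raw HTML, with no status line and no headers. A sketch of that request framing (illustrative only; the example path is hypothetical):

```python
# HTTP/0.9, the protocol Berners-Lee first implemented, frames a request
# as a single line: "GET <path>" terminated by CRLF. No headers, no version.
def http09_request(path):
    """Build an HTTP/0.9 request: a bare GET line terminated by CRLF."""
    return f"GET {path}\r\n".encode("ascii")

# For contrast, HTTP/1.0 (1996) added a protocol version, request headers,
# and a blank line separating headers from the (empty) body.
def http10_request(host, path):
    """Build a minimal HTTP/1.0 request with a Host header."""
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii")

print(http09_request("/hypertext/WWW/TheProject.html"))
print(http10_request("example.org", "/index.html"))
```

The server side was equally bare: after the response document was sent, the connection was simply closed, which is why HTTP/0.9 needed no Content-Length mechanism.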
As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began growing at a rate similar to the scaling of MOS transistors, as exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser lightwave systems, and noise performance. Since 1995, the Internet has had a tremendous impact on culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet.
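An 18-month doubling period of the kind attributed above to Internet traffic compounds very quickly. A small arithmetic sketch of the implied growth factor (the doubling period is the only input; the decade-long horizon is just an example):

```python
# Growth factor implied by a quantity that doubles every 18 months,
# as the passage attributes to Internet traffic via Edholm's law.

def growth_factor(months: float, doubling_period_months: float = 18.0) -> float:
    return 2.0 ** (months / doubling_period_months)

# Over one decade (120 months), an 18-month doubling multiplies
# traffic by roughly a hundredfold.
print(round(growth_factor(120)))  # ~102
```

By the same arithmetic, the "100 percent per year" late-1990s estimate corresponds to a doubling period of 12 months, slightly faster still.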
Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost.
Social impact
The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas.
However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. 
Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s, the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications.
Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. 
Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023, Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to easily form, cheaply communicate, and share ideas.
A prominent example of such collaboration is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice).[citation needed] Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work.[citation needed] The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work.[citation needed] The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, they may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites.[citation needed] Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated with users' loneliness.
Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread.[citation needed] Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading while interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and -consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick and mortar businesses resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people.
At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing for carrying out their mission, having given rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring, by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region.
E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves: highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards. In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.[citation needed]
Applications and services
The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide.
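The role of URIs as structured, named references can be illustrated with Python's standard urllib.parse module, which splits a URI into the components the Web's naming scheme defines (the example URL, including its query and fragment, is invented for illustration):

```python
# Splitting a URI into scheme, authority, path, query, and fragment.
from urllib.parse import urlsplit

uri = "https://en.wikipedia.org/wiki/Internet?action=history#Governance"
parts = urlsplit(uri)

print(parts.scheme)    # https            -> which protocol to use
print(parts.netloc)    # en.wikipedia.org -> which server to contact
print(parts.path)      # /wiki/Internet   -> which resource on that server
print(parts.query)     # action=history   -> parameters for the resource
print(parts.fragment)  # Governance       -> a location within the resource
```

The same component structure underlies non-web URIs as well (mailto:, ftp:, and so on); only the scheme and the interpretation of the remaining parts change.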
HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer, and for sharing and exchanging business data and logistics; it is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enable users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP).[citation needed] VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone, while having substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet.
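Alongside encryption and authentication, shared files are commonly checked for integrity by comparing a cryptographic hash of the received bytes against a published checksum. A minimal sketch with Python's standard hashlib (the file contents are invented for illustration):

```python
# Verifying that a downloaded file matches a published checksum.
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"example file contents"  # what the publisher hashed
received = b"example file contents"  # what the downloader got

published_digest = sha256_hex(original)
assert sha256_hex(received) == published_digest  # transfer intact

# Even a single changed byte produces a completely different digest.
assert sha256_hex(b"example file contentz") != published_digest
```

A checksum alone only detects corruption; binding the digest to the publisher's identity is the job of a digital signature over that digest.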
The origin and authenticity of the file received may be checked by a digital signature.
Governance
The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters.
Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region: AFRINIC (Africa), ARIN (North America), APNIC (Asia-Pacific), LACNIC (Latin America and the Caribbean), and RIPE NCC (Europe, the Middle East, and Central Asia).[citation needed] The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including: the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.[citation needed]
Infrastructure
The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems.
However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high-speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks.
Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber optic submarine communication cables, connecting the internet.
Internet Protocol Suite
The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, based on its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123.[citation needed] The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations. They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via Dynamic Host Configuration Protocol, or are configured.[citation needed] The Domain Name System (DNS) converts user-entered domain names (e.g.
"en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (10^9) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier.
The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix, and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2^96 addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet.
Security
Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information.
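The CIDR and netmask arithmetic described in the subnetting discussion above can be checked with Python's standard ipaddress module, using the same documentation prefixes (198.51.100.0/24 and 2001:db8::/32) the article cites:

```python
# CIDR prefixes, netmasks, and host ranges from the subnetting discussion.
import ipaddress

net = ipaddress.ip_network("198.51.100.0/24")
print(net.netmask)        # 255.255.255.0 (24 one-bits, then 8 zero-bits)
print(net.num_addresses)  # 256: the range 198.51.100.0 - 198.51.100.255
print(ipaddress.ip_address("198.51.100.200") in net)  # True

# A /32 IPv6 prefix leaves 128 - 32 = 96 bits of host space:
v6 = ipaddress.ip_network("2001:db8::/32")
print(v6.num_addresses == 2**96)  # True
```

Applying the netmask with a bitwise AND to any address in the block, as the text describes, recovers the network number 198.51.100.0.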
Malware is malicious software used and distributed via the Internet. It includes computer viruses which are copied with the help of humans, computer worms which copy themselves automatically, software for denial of service attacks, ransomware, botnets, and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare with similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S. 
telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by German Siemens AG and Finnish Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. 
Many free or commercially available software programs, called content-control software, are available to users to block specific offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Figure: global Internet traffic volume in petabytes per month, 1990–2015] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests. 
Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade that differed by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smart phones and 100 million servers worldwide as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed to over 300 million tons of CO2 emission per year, and argued for new "digital sobriety" regulations restricting the use and size of video files. See also Notes References Sources Further reading External links
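The factor-of-20,000 spread between the published energy-intensity figures follows directly from the two endpoint estimates; a trivial check, using only the values quoted above:

```python
# Endpoint estimates of Internet energy intensity quoted in the 2014 survey (kWh/GB).
low_kwh_per_gb = 0.0064
high_kwh_per_gb = 136.0

ratio = high_kwh_per_gb / low_kwh_per_gb
# 136 / 0.0064 is a factor of roughly 21,250 -- the "factor of 20,000" order of magnitude.
print(f"published claims differ by a factor of roughly {ratio:,.0f}")
```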
========================================
[SOURCE: https://en.wikipedia.org/wiki/Footage] | [TOKENS: 670]
Contents Footage In filmmaking and video production, footage is raw, unedited material as originally filmed by a movie camera or recorded by a digital video camera, which typically must be edited to create a motion picture, video clip, television show, or similar completed work.[not in body] Footage can also refer to sequences used in film and video editing, such as special effects and archive material (for special cases of this, see stock footage and B roll).[not in body] Since the term originates in film, footage is only used for recorded images, such as film stock, videotapes, or digitized clips. For live television feeds, the signals from video cameras are instead called sources.[not in body] History The origin of the term "footage" comes from early 35 mm silent film, which is traditionally measured in feet and frames. The fact that film was measured by length in cutting rooms, and that there are 16 frames (4-perf film format) in a foot of 35 mm film (52.5 frames/meter), which roughly represented 1 second of screen time (frame rate) in some early silent films, made footage a natural unit of measure for film. The term then became used figuratively to describe moving image material of any kind.[citation needed] In recent years, neutral terms such as "recorded material" are becoming more popular, especially in English-speaking countries other than the United States, although footage is still widely used.[citation needed] Types of footage Sometimes film projects will also sell or trade footage, usually second unit material not used in the final cut. For example, the end of the non-director's cut version of Blade Runner used landscape views that were originally shot for The Shining before the script was modified after shooting had finished. Television footage, especially news footage, is often traded between television networks, but good footage usually commands a high price. 
The actual sum depends on duration, age, size of intended audience, duration of licensing, and other factors. Amateur footage is the low-budget hobbyist art of film practised for passion and enjoyment and not for business purposes. Amateur video footage of current events, for instance from camcorders, smart phones or closed-circuit television, can also often fetch a high price on the market – scenes shot inside the World Trade Center during the September 11, 2001 attacks were reportedly sold in 2001 for US$45,000 (equivalent to $82,000 in 2025). Stock footage is film or video footage that can be used again in other films. Stock footage is beneficial to filmmakers as it saves shooting new material. A single piece of stock footage is called a "stock shot" or a "library shot". Stock footage may have appeared in previous productions but may also be outtakes or footage shot for previous productions and not used. Examples of stock footage that might be utilized are moving images of cities and landmarks, wildlife in their natural environments, and historical footage. Suppliers of stock footage may be either rights managed or royalty-free. Many websites offer direct downloads of clips in various formats.[citation needed] Footage broker A footage broker is an agent who deals in footage by promoting it to footage purchasers or producers, while taking a profit in the sales transaction.[citation needed] See also References External links
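The feet-and-frames arithmetic behind the term, as described in the History section, can be sketched as a small conversion. The 16 frames per foot of 4-perf 35 mm film is from the text above; the specific frame rates are illustrative assumptions, since silent-era projection speeds varied:

```python
FRAMES_PER_FOOT = 16  # 4-perf 35 mm film, as noted in the History section

def screen_time_seconds(feet: float, fps: float) -> float:
    """Screen time for a given length of 35 mm film at a given frame rate."""
    return feet * FRAMES_PER_FOOT / fps

# At 16 frames per second, one foot of film is one second of screen time,
# which is why film length became a natural proxy for duration.
print(screen_time_seconds(90, fps=16))  # 90.0
# At sound speed (24 fps), the same 90 feet runs only 60 seconds.
print(screen_time_seconds(90, fps=24))  # 60.0

# Metric equivalent quoted in the text: 16 frames/ft divided by 0.3048 m/ft.
print(round(FRAMES_PER_FOOT / 0.3048, 1))  # 52.5 frames per metre
```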
========================================
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_ref-71] | [TOKENS: 10515]
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026,[update] Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated in 1989 to Canada; he holds Canadian citizenship through his mother, who was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and their leadership in the AI boom in the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. 
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. 
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997: a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk, the largest shareholder with 11.72% of shares, received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and Chief Engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, it was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. 
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025[update], over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. 
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trialsโ€”which have caused the deaths of some monkeysโ€”have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By April 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase of the company on October 27, 2022. 
Immediately after the acquisition, Musk fired several top Twitter executives including CEO Parag Agrawal; Musk became the CEO instead. Under Musk, Twitter instituted monthly subscriptions for a "blue check", and laid off a significant portion of the company's staff. Musk loosened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of Hunter Biden's laptop controversy in the lead-up to the 2020 presidential election. Musk also promised to step down as CEO after a Twitter poll, and five months later, Musk stepped down as CEO and transitioned his role to executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics such as Twitch streamer Asmongold, who criticized him during one of his streams. Musk has been accused of removing critics' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or of suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain, and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. 
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter, and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. 
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 special election in Texas's 34th congressional district. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement at a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and speculated that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, a gathering dubbed "Tech Bilderberg" organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, which Musk accepted. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was never clearly defined.
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when the special government employee's 130-day deadline expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. In a June 5, 2025 post on X, Musk wrote: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public." After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit.
A feud began between Musk and Trump, with its most notable event being Musk alleging on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have drawn criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, which he argues would make humanity an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While he has described himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently accused of spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S.
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger syndrome at a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong – [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression, dosing "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine, and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs and that if drugs somehow improved his productivity, "I would definitely take them!"
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has stated have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child, Nevada Musk, died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office.
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair in 2021 with Nicole Shanahan, the wife of Google co-founder Sergey Brin, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognised as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I've been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I'm looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded, "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don't recall introducing Epstein to anyone, as I don't know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. First listed on the Forbes Billionaires List in 2012, Musk derived around 75% of his wealth from Tesla stock in November 2020, although he has described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package for Musk worth potentially $1 trillion was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries since the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, in contrast to other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denouncement for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Edward Felsenthal, then Time's editor-in-chief, wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
========================================
[SOURCE: https://www.mako.co.il/hix-special/Article-6dc9c2e48467c91027.htm] | [TOKENS: 4179]
"She told me to go": The mountaineer accused in the death of Kerstin Gortner responds. Thomas Plamberger, a 39-year-old mountaineer, is standing trial in Austria after his partner died of hypothermia near the summit of the country's highest mountain. According to the indictment, he left her alone in extreme cold and darkness. Published: 20.02.26, 06:00 | Updated: 21.02.26, 10:43. Photo: social media. Kerstin Gortner (33) was found dead in January 2024, lying just some 150 meters below the summit of the Grossglockner, which rises to 3,798 meters. According to the indictment, Plamberger left her while she was exhausted, disoriented, and struggling with hypothermia. It is further alleged that he failed to call for help in time. Plamberger denied the charge of negligent homicide at a hearing that opened in Innsbruck; the offense carries a maximum sentence of three years in prison. His lawyer argued that he stayed at Gortner's side for about an hour and a half, in extreme conditions, and that at some point she shouted at him, "Go!". The investigation lasted 11 months and included examination of mobile phones, sports watches, and footage from the climb itself. An independent alpine expert also submitted an opinion on the sequence of events. According to the prosecution, the two set out on the climbing route two hours late, in harsh winter conditions and without adequate emergency equipment.
It is further alleged that Plamberger knew that Gortner, his partner at the time, had never completed such a long and demanding route, yet he pressed on regardless. The investigation's findings revealed that the two battled fierce winds and extreme cold during the ascent. Around midnight, Plamberger called the Austrian alpine police (Alpinpolizei). According to investigators, he should have turned back earlier, while a safe retreat was still possible. The defense maintains that Plamberger asked the authorities for assistance and denies that he said everything was under control. Police claim that after the call with the defendant, he switched his phone to silent mode and did not answer follow-up calls. According to the indictment, Plamberger left Gortner alone, without rescue blankets or protective equipment, at around 2:00 a.m. A webcam allegedly recorded him continuing his descent alone about half an hour later. A rescue alert went out only at 3:30 a.m., but the winds prevented rescue helicopters from taking off. Kerstin's body was located only at 10:00 a.m., on the frozen slope. Fifteen witnesses, including family members, rescue teams, the rescue helicopter pilot, and a pathologist, are expected to testify at Plamberger's trial. The mountaineer continues to maintain that he descended in order to summon help and that this was a "tragic accident". Surprisingly, on the eve of the trial, Gortner's mother came to Plamberger's defense. "It angers me that Kerstin is being portrayed as a naive girl who was dragged up the mountain," she said. "A witch hunt is being waged against him in the media and online."
The verdict is expected later this week.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Portal:Israel] | [TOKENS: 878]
Portal:Israel Welcome to the Israel Portal ืžึฐื“ึดื™ื ึทืช ื™ึดืฉึฐื‚ืจึธืึตืœ Israel, officially the State of Israel, is a country in the Southern Levant region of West Asia. It is bordered by Lebanon to the north, Syria to the northeast, Jordan to the east, and Egypt to the southwest. Israel occupies the West Bank and the Gaza Strip of the Palestinian territories, as well as the Syrian Golan Heights. Israel's western coast lies on the Mediterranean Sea, its southern tip reaching the Red Sea, and the east includes the Earth's lowest point near the Dead Sea. Jerusalem is the government seat and proclaimed capital, while Tel Aviv is Israel's largest urban area and economic centre. The Land of Israel, also called Palestine or the Holy Land, was home to the ancient Canaanites and later the kingdoms of Israel and Judah. Located near continental crossroads, its demographics shifted under various empires. 19th-century European antisemitism fuelled the Zionist movement for a Jewish homeland, which gained British support with the 1917 Balfour Declaration. After World War I, Britain occupied the region and established Mandatory Palestine. British rule and Jewish immigration intensified Arab-Jewish tensions, and the 1947 United Nations (UN) Partition Plan led to a civil war. Selected article: Tel Aviv, officially Tel Aviv-Yafo, and also known as Tel Aviv-Jaffa, is the most populous city in the Gush Dan metropolitan area of Israel. Located on the Israeli Mediterranean coastline and with a population of 495,230, it is the economic and technological center of the country and a global high-tech hub. If East Jerusalem is considered part of Israel, Tel Aviv is the country's second-most-populous city, after Jerusalem; if not, Tel Aviv is the most populous city, ahead of West Jerusalem. Tel Aviv is governed by the Tel Aviv-Yafo Municipality, headed by Mayor Ron Huldai, and is home to most of Israel's foreign embassies.
It is a beta+ world city and is ranked 53rd in the 2022 Global Financial Centres Index. Tel Aviv has the third- or fourth-largest economy and the largest economy per capita in the Middle East. Tel Aviv is ranked the 4th top global startup ecosystem hub. The city currently has the highest cost of living in the world. Tel Aviv receives over 2.5 million international visitors annually. Tel Aviv is home to Tel Aviv University, the largest university in the country with more than 30,000 students. Good article: Ir Ovot (Hebrew: ืขื™ืจ ืื•ื‘ื•ืช, lit. 'City of Oboth') is a small community settlement in southern Israel. Located in the northeastern Arava valley, it falls under the jurisdiction of Central Arava Regional Council. It operated as a kibbutz from 1967 until the 1980s. In 2019 it had a population of 54. It is the site of an extensive archaeological complex known as Tamar Fortress or Hatzevah Fortress (Hebrew: ืžืฆื•ื“ืช ื—ืฆื‘ื”) which dates to the 10th century BCE (United Monarchy/First Temple period). Selected fare or cuisine: Kugel (Yiddish: ืงื•ื’ืœ kugl, pronounced [หˆkสŠษกlฬฉ] or [หˆkษชษกlฬฉ]) is a baked casserole, most commonly made from egg noodles (lokshen) or shredded potato. It is a traditional Ashkenazi Jewish dish, often served on Sabbath and Jewish holidays. American Jews also serve it for Thanksgiving dinner. In Hungary it is known as "vargabรฉles" and served as a sweet dish.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_ref-151] | [TOKENS: 9291]
Contents Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. 
The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF). Terminology The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. 
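The split between the two principal name spaces can be observed from any networked host; the following is a minimal Python sketch, not part of the source text. It uses `localhost` so it runs without network access, but any registered domain name (e.g. `example.com`) resolves the same way.

```python
import socket

# Resolve a name to addresses: the DNS name space maps human-readable
# names onto the IP address space. getaddrinfo returns one entry per
# (address family, socket type) combination the host supports.
infos = socket.getaddrinfo("localhost", 80, proto=socket.IPPROTO_TCP)
addresses = sorted({sockaddr[0] for _family, _type, _proto, _canon, sockaddr in infos})
print(addresses)  # typically ['127.0.0.1'], ['::1'], or both
```

The same call is what a web browser performs, via the local resolver, before it can open a TCP connection to a site.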
History In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. 
In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988โ€“89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. 
Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use; one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than was possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic.
As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began to exhibit growth characteristics similar to those of the scaling of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet.
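The doubling periods cited above imply exponential growth; a small arithmetic sketch (illustrative numbers only, not measurements from the article):

```python
# Exponential growth from a fixed doubling period:
# traffic(t) = T0 * 2 ** (t / doubling_period), with t in years.
def traffic(t_years, t0=1.0, doubling_years=1.5):
    return t0 * 2 ** (t_years / doubling_years)

# Two 18-month doubling periods -> 4x the baseline after three years;
# "100 percent per year" is the special case of a 12-month doubling.
print(traffic(3.0))                      # 4.0
print(traffic(3.0, doubling_years=1.0))  # 8.0
```

The comparison of the two calls shows why the per-year growth rate and the doubling period are just two expressions of the same exponential curve.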
Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost. Social impact The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas.
However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. 
Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general or surrounding political affairs and rights such as free speech, Internaut refers to operators or technically highly capable users of the Internet, digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications.
Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. 
Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023, Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to easily form, cheaply communicate, and share ideas.
An example of collaborative software is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice).[citation needed] Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work.[citation needed] The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work.[citation needed] The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, including insults and hate speech, to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, they may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites.[citation needed] Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated to users' loneliness. 
Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread.[citation needed] Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading and interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and -consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people.
At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing for carrying out their mission, having given rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring, by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region.
E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves: highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in the collective study via email, online chat rooms, and web-based message boards. In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.[citation needed] Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. 
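The way a URI names a scheme, host, and resource, as described above, can be illustrated with Python's standard urllib.parse module. This is a minimal sketch; the URL below is an illustrative example, not a required endpoint.

```python
from urllib.parse import urlparse

# Break a URI into the named parts that make up the Web's global
# system of references: scheme, host, path, query, and fragment.
uri = urlparse("https://en.wikipedia.org/wiki/Internet?action=view#History")

print(uri.scheme)    # https
print(uri.netloc)    # en.wikipedia.org
print(uri.path)      # /wiki/Internet
print(uri.query)     # action=view
print(uri.fragment)  # History
```

Each component plays the role described above: the netloc identifies the web server, while the path and query identify the document or resource that the server can provide.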
HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer and for sharing and exchanging business data and logistics; it is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP).[citation needed] VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone, while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed (usually fully encrypted) across the Internet.
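The HTTP exchange described above, in which a browser sends a textual request and the server answers with a status line, can be sketched in plain text. The host and path below are illustrative placeholders, and no network connection is made.

```python
def build_get_request(host: str, path: str = "/") -> str:
    """Compose a minimal HTTP/1.1 GET request as a browser would send it."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

def parse_status_line(response: str) -> tuple[str, int, str]:
    """Split the first line of an HTTP response into version, code, and reason."""
    status_line = response.split("\r\n", 1)[0]
    version, code, reason = status_line.split(" ", 2)
    return version, int(code), reason

request = build_get_request("www.example.com", "/index.html")
print(request.splitlines()[0])  # GET /index.html HTTP/1.1

print(parse_status_line("HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n"))
```

The same request/response pattern underlies both browsers and the machine-to-machine web services mentioned above; only the payloads differ.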
The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. 
Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region: AFRINIC for Africa, APNIC for the Asia-Pacific region, ARIN for North America, LACNIC for Latin America and the Caribbean, and RIPE NCC for Europe, the Middle East, and Central Asia.[citation needed] The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including: the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.[citation needed] Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems.
However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high speed fiber-optic cables and governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information, represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks. 
Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber optic submarine communication cables, connecting the internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, based on its first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123: the link layer, the internet layer, the transport layer, and the application layer.[citation needed] The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations. They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol (DHCP) or configured manually.[citation needed] The Domain Name System (DNS) converts human-readable domain names (e.g.
"en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (about 10^9) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier.
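The address widths and the network/host split described above can be checked with Python's standard-library ipaddress module. This is a minimal sketch; the addresses come from the reserved documentation ranges (TEST-NET-2 and 2001:db8::/32), chosen only for illustration.

```python
import ipaddress

# IPv4 addresses are 32-bit numbers; IPv6 addresses are 128-bit.
v4 = ipaddress.IPv4Address("198.51.100.7")   # documentation (TEST-NET-2) address
v6 = ipaddress.IPv6Address("2001:db8::1")    # documentation address

print(v4.max_prefixlen)        # 32
print(v6.max_prefixlen)        # 128

# 32 bits allow about 4.3 billion distinct addresses.
print(2 ** 32)                 # 4294967296

# In a /24 subnet, the first 24 bits are the network number (routing
# prefix) and the remaining 8 bits form the host identifier.
net = ipaddress.ip_network("198.51.100.0/24")
host_bits = net.max_prefixlen - net.prefixlen
print(host_bits)               # 8

# Masking the address with the netmask yields the routing prefix.
print(int(v4) & int(net.netmask) == int(net.network_address))  # True
```

The same module accepts IPv6 networks, so the identical two-field split applies to 128-bit addresses.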
The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix, and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2^96 addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information.
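The routing-table behavior described above, choosing the most specific matching prefix and falling back to a default route when nothing else matches, can be sketched with the ipaddress module. This is a toy model: the prefixes and next-hop labels are made-up illustrations, not a real router implementation.

```python
import ipaddress

# A toy routing table of (destination prefix, next hop) pairs.
# 0.0.0.0/0 is the default route, which matches every address.
routing_table = [
    (ipaddress.ip_network("198.51.100.0/24"), "local-subnet"),
    (ipaddress.ip_network("198.51.0.0/16"), "regional-router"),
    (ipaddress.ip_network("0.0.0.0/0"), "default-gateway"),
]

def next_hop(destination: str) -> str:
    """Return the next hop for the longest (most specific) matching prefix."""
    addr = ipaddress.ip_address(destination)
    matches = [(net, hop) for net, hop in routing_table if addr in net]
    # Longest prefix wins; the /0 default route only wins if nothing else matches.
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(next_hop("198.51.100.42"))  # local-subnet
print(next_hop("198.51.7.9"))     # regional-router
print(next_hop("203.0.113.5"))    # default-gateway
```

Real routers perform this longest-prefix match in hardware over tables learned via protocols such as BGP, but the selection rule is the same.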
Malware is malicious software used and distributed via the Internet. It includes computer viruses which are copied with the help of humans, computer worms which copy themselves automatically, software for denial of service attacks, ransomware, botnets, and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibilities of hackers waging cyber warfare using similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S.
telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by German Siemens AG and Finnish Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. 
Many free or commercially available software programs, called content-control software, are available to users to block offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Figure: global Internet traffic volume in petabytes per month, 1990–2015] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests.
Estimates of the Internet's electricity usage have been the subject of controversy, according to a 2014 peer-reviewed research paper that found claims differing by a factor of 20,000 published in the literature during the preceding decade, ranging from 0.0064 kilowatt hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smart phones and 100 million servers worldwide as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed to over 300 million tons of CO2 emission per year, and argued for new "digital sobriety" regulations restricting the use and size of video files. See also Notes References Sources Further reading External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-Collins-2023-357] | [TOKENS: 17273]
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S.
economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. 
plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America.
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718).
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though effectively in force since their drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S.
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation.
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, and by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created.
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. The expansion continued into the early 20th century, by which time the U.S. economy outpaced those of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D.
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression.
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast.
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change in the country, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to residents are often also the most vulnerable to these hazards. The U.S.
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. 
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic relations with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan.
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. Total strength of the entire military is about 1.3 million active duty, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S.
possesses 42% of the world's nuclear weapons, the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies, from the local to the national level, in the United States. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. State police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S.
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and addressing interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined.
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world (531 people per 100,000 inhabitants) and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest by nominal GDP since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasuries market, and its linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including under the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's.
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. 
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. 
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, among them Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight.
In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity, and has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in per capita vehicle ownership, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards.
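The energy-mix shares quoted at the start of the preceding paragraph are each rounded to whole percentages, which is why the fossil-fuel components (petroleum, natural gas, and coal) sum to 83 rather than exactly the stated approximate 84%, and why all five shares together exceed 100. A minimal sketch of the arithmetic:

```python
# 2023 U.S. energy shares as given in the text, rounded to whole percent
shares = {
    "petroleum": 38,
    "natural_gas": 36,
    "renewables": 9,
    "coal": 9,
    "nuclear": 9,
}

# Fossil fuels are petroleum, natural gas, and coal.
fossil_share = shares["petroleum"] + shares["natural_gas"] + shares["coal"]

print(fossil_share)          # 83, consistent with "approximately 84%" after rounding
print(sum(shares.values()))  # 101, an artifact of rounding each share independently
```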
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Sixteen of the world's 50 busiest airports, including five of the top 10, are in the United States. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China.
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. De facto, English is the official language of the United States, and in 2025, Executive Order 14224 declared English official. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. 
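The Population Clock figure quoted above is straightforward arithmetic: one net gain every 16 seconds across an 86,400-second day yields the stated 5,400 people per day. A minimal check:

```python
# Net population gain implied by the U.S. Population Clock figure (July 1, 2024)
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400
seconds_per_net_gain = 16        # one net new person every 16 seconds

net_gain_per_day = SECONDS_PER_DAY // seconds_per_net_gain
print(net_gain_per_day)  # 5400, matching "about 5400 people per day"
```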
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities, New York City, Los Angeles, Chicago, and Houston, had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023.
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of the population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education.
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates who have won 413 prizes. U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditures on higher education, the U.S. spends more per student than the OECD average, and in combined public and private spending Americans spend more than any other nation. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees, including the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022.
Culture and society

The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity, the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present-day Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression.
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789.
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora.
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT, all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that employed around 220,000 people and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater.
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did, and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States.
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. 
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. One study found that proximity to Manhattan's Garment District has been closely associated with American fashion since the industry's inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels also cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry.
The major film studios of the United States are the primary source of the most commercially successful movies selling the most tickets in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, and potatoes, together with turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion.
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar.
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. See also Notes References This article incorporates text from a free content work. Licensed under CC BY-SA IGO 3.0 (license statement/permission). Text taken from World Food and Agriculture – Statistical Yearbook 2023, FAO.
External links 40°N 100°W / 40; -100 (United States of America)
========================================
[SOURCE: https://en.wikipedia.org/wiki/Template_talk:History_of_Israel] | [TOKENS: 331]
Contents Template talk:History of Israel Seriously? This section was originally located at User talk:Triggerhippie4 I'd spent some time reading the history of Israel to get an idea of what had happened in the area of modern Israel. Then what was left was to create the history template. But now that I've looked at your edits, I must say you have COMPLETELY destroyed all my efforts. To me, you had COMPLETELY ruined the template. My original is perfect in date, layout, and format. But your version looks somewhat like a mess. Your version also makes it seem as if there are a lot of gaps in the history. My intention was to create a template focusing on historical governments/periods, NOT things like conflicts, the UN Plan or the Declaration of Independence. Just tell me, what is wrong with my original contents and formats? เผ† (talk) 02:57, 17 December 2012 (UTC)[reply] I made edits because I think the template should link to existing articles, not subsections. It already has a link to History of the Jews in the Land of Israel. No need to repeat links to its subsections. --Triggerhippie4 (talk) 13:37, 24 December 2012 (UTC)[reply] I've started rearranging this. There are certain themes that need to be covered: Jewish presence post the destruction; history of the Bible; history of Judaism as a religion in Israel; Arab history and the Crusades. Trying to get it into shape. I haven't yet finished but I'm at work and have no time.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Huwara_rampage] | [TOKENS: 2505]
Contents Huwara shooting and rampage On 26 February 2023, hundreds of Israeli settlers went on a violent late-night rampage in Huwara and other Palestinian villages in the Israeli-occupied West Bank, leaving one civilian dead and 100 other Palestinians injured, four critically, and the town ablaze. It was the worst attack stemming from Israeli settler violence in the northern West Bank in decades. The rampage followed the deadly attack in which two Israelis were murdered the same day by an unidentified attacker in the area. Israeli soldiers were in the area while the rampage by the settlers unfolded and did not intervene. The rampage was called a pogrom by an Israeli commander in charge of the area. The same day, Israeli and Palestinian officials issued a joint declaration in Aqaba, Jordan to counter the recent round of Israeli–Palestinian violence. In the rampage's aftermath, Israeli Finance Minister Bezalel Smotrich, a settler leader in charge of the administration of the West Bank, called for Huwara to be "wiped out" by the Israeli army. Condemnations from the United States, European Union, and Arab countries led to Smotrich retracting his comments and claiming they were said in the heat of the moment. Shooting On 26 February 2023, Hamas operative Abdel Fattah Hussein Kharousha shot and killed two Israeli settlers in their car in Huwara, a town south of Nablus in the Israeli-occupied West Bank. Following the shooting, Israeli settlers carried out revenge attacks on Palestinians, which killed at least one Palestinian and injured around 100 others. The Israel Defense Forces killed Kharousha on 7 March during an operation in Jenin. On 26 February 2023, four days after the incursion in Nablus, an unidentified attacker shot and killed two Israelis in a car near the Einbus intersection along Highway 60 in Huwara, south of Nablus. The attacker shot the two Israelis with an M16 rifle while they were driving and then fled the scene on foot.
During a 3 April raid in Nablus, the IDF arrested Izz al-Din Touqan and Nidal Tabanja, whom it accused of assisting the Huwara attacker. The IDF killed two Palestinian militants, Mohammad Nasser Al-Saeed (also rendered Mohammad Saeed Nasser) and Mohammad Abu Baker Al-Junaidi, during the operation after militants opened fire on the IDF. Palestinian militant group The Lions' Den identified Al-Saeed as one of its fighters. The attack stood in stark contrast to a joint declaration made earlier that day in Aqaba, Jordan, by Israeli and Palestinian officials expressing a desire to work towards calming the latest round of violence. Some 700,000 Israeli settlers now live in the West Bank and East Jerusalem, in settlements viewed as illegal by the international community. Hardline settlers in the West Bank frequently commit violence against Palestinians and vandalize Palestinian land and property, but rarely on the scale of the rampage in Huwara. No group initially claimed responsibility for the shooting. On March 7, an Israeli raid in Jenin killed the attacker, 49-year-old Abdel Fattah Hussein Kharousha, three militants of the Al-Aqsa Martyrs Brigades, and two other Palestinian militants. Kharousha's three sons were also arrested. After his death, Hamas claimed responsibility for the attack and identified Kharousha, a fighter in the Qassam Brigades, its military wing, as the perpetrator. Large crowds accompanied Kharousha's coffin from the Rafidia Surgical Hospital in Nablus to his burial in the Askar Camp, east of Nablus. Kharousha was born in 1974. After joining the Qassam Brigades, he was arrested multiple times by Israel, and was released at the end of 2022 after serving 40 months in prison. Later on the same day, groups of Israeli settlers rioted in the region, carrying out revenge attacks. One Palestinian man was fatally shot in the abdomen in neighboring Za'tara.
An analysis by journalists for +972 Magazine of 14 videos of the assault conducted by 40–50 settlers, who had returned to Za'tara after being repulsed the first time, concluded that the simultaneous attack on Za'tara in which Sameh Aqtesh was shot dead was conducted under Israeli army escort. In Huwara itself, 98 Palestinians were injured as settlers torched Palestinian homes. The two Israelis killed in the shooting were brothers named Hillel Menachem Yaniv and Yagel Ya'acov Yaniv. They were from the Har Brakha settlement, and were described as yeshiva students. One of the brothers had just completed his service in the Israeli Navy. Following a lawsuit by the Yaniv family, Israel seized $5.2 million in funds for the Palestinian Authority (PA) under the 2024 Compensation for Terror Victims Law. Under the law, heirs of victims are entitled to compensation of $2.6 million per victim. The lawsuit targeted the PA's payments of stipends to militants and their families. Rampage The rampage in Huwara followed shortly after the deadly shooting by a Palestinian gunman of two Israeli settlers from Har Brakha, an Israeli settlement near Huwara, earlier the same day. Settler violence had been steadily rising in the West Bank in the preceding months, with Huwara previously having been subjected to an October blockade imposed by settlers and backed by Israeli soldiers. In the first two months of 2023, intercommunal violence led to the killings of 62 Palestinians and 14 Israelis. 2022 was the deadliest year for Palestinians living in the West Bank and East Jerusalem since 2004, according to B'Tselem, with nearly 150 Palestinians and 30 Israelis killed. On the night of 26 February 2023, hundreds of Israeli settlers attacked Huwara and three nearby villages, torching hundreds of Palestinian homes (some with people in them), businesses, a school, and numerous vehicles. One Palestinian man was shot dead.
Although the violence had been anticipated, and the Israeli military had cordoned off the area, the soldiers remained on the sidelines during the siege and did not intervene. Israel's West Bank army commander defended the inaction, saying that they had not anticipated the ferocity of the attack and were not prepared to deal with it. He described the event as "a pogrom done by outlaws", a term commonly applied to eastern European mob attacks upon Jews during the 1800s and early 1900s. The Palestinian health ministry said a 37-year-old man was shot to death by the settlers, while the Palestinian Red Crescent medical service said another two people were wounded by gunfire, a third was stabbed and a fourth beaten with an iron bar. Ninety-five others were said to be suffering from tear gas inhalation. Alongside the physical violence against local residents, the settlers set fire to approximately 30 homes and cars, according to one source. Other sources say 200 buildings were set ablaze in four Palestinian villages. Social media showed large blazes burning across the town, and the violence reportedly lasted throughout the night and continued into Monday morning. Aftermath The day after the rampage, the rampagers were still in control of the area, and groups of masked Jews checked vehicles looking for Palestinians. Israeli soldiers nearby did not intervene. "Reports of a large Israeli army presence in the town existed only on paper". Settler attacks on Palestinian communities continued through 28 February. By 28 February, only eight suspects had been detained in connection with the rampage, and all had been released, three of them to house arrest. On 1 March, eight further suspects believed to have taken part in the assault were arrested, one of whom was released the same day. The IDF commander responsible for the area, Major-General Yehuda Fox, described the event as a "pogrom carried out by outlaws."
On 2 March, the Israeli Defense Ministry placed two of the individuals, one a minor, in administrative detention after a Jerusalem court ordered the police to release the remaining seven detained suspects. In June 2023, a joint FakeReporter and CNN investigation of the rampage revealed that the IDF took little action during the rampage. Reactions Israeli Prime Minister Benjamin Netanyahu appealed for calm on Sunday evening as footage of the violence emerged and spoke out against vigilantism. Israeli President Isaac Herzog condemned the rampage, stating, "This is not our way. It is criminal violence against innocents." Israeli Finance Minister Bezalel Smotrich, a settler leader now largely in charge of the administration of the West Bank, who had earlier called for "striking the cities of terror and its instigators without mercy, with tanks and helicopters", appealed to the settlers not to take the law into their own hands, and to let the army and government do their jobs. However, Smotrich later called on Twitter for Huwara to be "wiped out" by the Israeli government. Smotrich's remark drew international condemnation, including from the U.N. Secretary General, Jordan, the United Arab Emirates, and others. The United States State Department demanded Netanyahu reject the remark. Smotrich's remark came just ahead of a major fundraising event for Israel in the U.S., where Smotrich was set to appear. The White House indicated U.S. officials would not meet with him, and various Jewish rights organizations called for the State Department to deny him entry. Other members of Israel's ruling coalition offered other sentiments. Zvika Fogel, of the ultra-nationalist Otzma Yehudit, said he saw the violence "in a very good light" in response to a question on Army Radio in which the interviewer referred to the rampage as a 'pogrom'. After the IDF killed Kharousha and others on 7 March, Palestinian factions called for an escalation of "armed resistance" and revenge. 
On 1 March, Israeli Attorney General Gali Baharav-Miara opened an investigation into lawmaker Zvika Fogel of the far-right Otzma Yehudit party, a member of the Israeli government coalition, on "suspicion of incitement to terrorism"; Fogel had publicly supported the rampage. Twenty-two Israeli legal experts wrote to the attorney general urging her to investigate pro-settler government MKs, including far-right minister Bezalel Smotrich, for "inducing war crimes" by their public support for the riots. Going further, Smotrich, when asked why he liked a tweet by Samaria Regional Council deputy mayor Davidi Ben Zion calling "to wipe out the village of Huwara today", said "Because I think the village of Huwara needs to be wiped out. I think the State of Israel should do it." Palestinian President Mahmoud Abbas criticized the events as "the terrorist acts carried out by settlers under the protection of the occupation forces tonight," and blamed the Israeli government. One Palestinian witness said the event was "one of the most shocking attacks on Palestinian civilians from Israeli settlers in recent years", referred to it as a "pogrom", and called for the international community "to stop supporting Israel until it ends its military occupation." By the evening of 26 February, Israeli and Palestinian officials released a joint statement emphasizing "the importance of de-escalation on the ground and preventing further violence." Israel pledged not to approve new housing units in the West Bank for four months. The two parties agreed to examine the renewal of security cooperation and to establish a joint committee to explore economic measures Israel could take for the Palestinians. The European Union noted its alarm at the violence and called on authorities on both sides to "stop this endless cycle of violence." The UK ambassador to Israel called on Israel to address the settler violence and bring those responsible to justice.
US State Department spokesman Ned Price said the violence underscored "the imperative to immediately de-escalate tensions in words and deeds." CNN published an investigation on June 15 finding that Israeli forces failed both to stop the riots and to protect residents against settler violence. A soldier told the network that soldiers and border police did nothing: "We just let them continue to advance". See also References
========================================
[SOURCE: https://news.ycombinator.com/item?id=47045194] | [TOKENS: 595]
There's surprising stuff on every page of this site. On the Contact Us page: "If you're looking for the debt relief company […]" The Featured Forth Applications page answered a lot of my questions. 1: https://www.forth.com/resources/forth-apps/

The cost is stupidly high, though. Look at the source code of . The only good page to take from the OOP book is the automatic and implicit pseudo-variable "self" or "this", that can reduce stack juggling significantly. I've implemented that in my (yet to be published) dialect and it works like a charm. In my experience, you can have that for cheap, and anything more is not worth it from the point of view of a byte-counting Forth programmer. https://vfxforth.com/flag/swoop/index.html

This one for example looks like a well-rounded and user-friendly option. Would anyone care to comment about this?

Swift Forth is literally a professional Forth and is well regarded. The other often recommended Forth is the FOSS GForth. They are good for starting out because they are popular and standard, so you'll find help easily. Other "smaller" Forths are often non-standard dialects and are more-or-less mature experiments.

If you are working with specific hardware (e.g. microcontrollers) it depends on which Forth dialects are available, but for the Raspberry Pico and Pico 2 I recently found zeptoforth: https://github.com/tabemann/zeptoforth Or you know, you can always bootstrap your own :) https://gforth.org
========================================
[SOURCE: https://en.wikipedia.org/wiki/Meerkat_(app)] | [TOKENS: 359]
Contents Meerkat (app) Meerkat was a mobile app that allowed users to broadcast live streaming video from their mobile devices. Upon registration, Meerkat users had the choice to connect their Facebook and Twitter accounts, facilitating direct streaming to their followers immediately upon going live. The app was available for both iOS and Android. The app was launched in February 2015 and gained rapid popularity following its debut on Product Hunt. It also saw widespread use during the South by Southwest Interactive Festival in March 2015. On October 4, 2016, Meerkat was shut down and subsequently replaced by Houseparty. Development Meerkat was developed by Life On Air, Inc., a team led by founder and CEO Ben Rubin. The back-end that powered the app was developed over the course of two years for a previous video product of theirs. The company raised $12 million in venture capital funding from Greylock Partners in March 2015. On October 4, 2016, the app was withdrawn from the App Store and Play Store, and all associated services stopped. The Meerkat website now redirects to the creator's new app, Houseparty. According to the creators, the new app "has been in development for 10 months, and moves away from public broadcasts in favour of private chats." Twitter reaction In March 2015, weeks after the release of Meerkat, Twitter cut off Meerkat's access to its social graph, then announced the acquisition of the competing app Periscope. Twitter publicly launched Periscope on March 26, 2015. Apart from providing the similar functionality of live-streaming to users' Twitter followers, Periscope also gave users an option to let anyone play the stream back. References
========================================
[SOURCE: https://en.wikipedia.org/wiki/Japanese_Alps] | [TOKENS: 1790]
Contents Japanese Alps The Japanese Alps (Japanese: 日本アルプス, Hepburn: Nihon Arupusu) is a series of mountain ranges in Japan which bisect the main island of Honshu. The peaks that tower over central Honshu have long been the object of veneration and pilgrimage. These mountains had long been exploited by local people for raw materials, including timber, fuel, fertilizer, fodder, meat, minerals, and medicines. Many visitors have come to the mountains for pilgrimage, especially to the Buddhist temples located within them and the sacred peak of Mount Tate. The name was coined by English archaeologist William Gowland, and later popularized by Reverend Walter Weston (1861–1940), an English missionary for whom a memorial plaque is located at Kamikōchi, a tourist destination known for its alpine climate. When Gowland coined the phrase, he was referring only to the Hida Mountains, but it now also applies to the Kiso Mountains and Akaishi Mountains. History The Japanese Alps have a long history predating William Gowland's coinage of the name. The Japanese Alps have been used as a place of ascetic practice by Buddhist monks and shugenja since ancient times. From the 1600s to the 1800s, samurai officers of the Kaga domain travelled deep into the Hida Mountains with local hunters and farmers as guides to preserve the timber of the mountains, and continued to create maps recording ridges, valleys and vegetation. This survey is called Okuyama-mawari (奥山廻り). Even now, it is very difficult to cross the steep Hida Mountains, one of the world's heaviest snowfall areas, in winter. It is therefore considered a historic event in Japan that in the winter of 1584, daimyō Sassa Narimasa's forces crossed the mountain range over Zara Pass and Harinoki Pass. This event is called "Sarasara-goe" (さらさら越え), a name derived from Sassa and Zara Pass.
However, these Hida Mountains surveys do not seem to have been inherited by modern Japanese mountaineers who trekked through the mountains as a sport. As Kojima Usui later recalled, "in those days,... no one knew even the names of the mountains, much less their locations or elevations. To go mountaineering was literally to strike out into the unknown country." The first modern geological survey sheets were issued in 1890. The report mentioned major peaks, but the topography was mostly guesswork. From 1891, foreign travelers were able to find useful information in Basil Hall Chamberlain and W.B. Mason's Handbook for Travellers in Japan. However, for decades, the Japanese were climbing these mountains without a comparable guidebook. Japanese mountaineers physically explored the ranges over the course of a decade in the 1890s, dividing the mountains into northern, central, and southern groups according to how they were conventionally grouped. William Gowland, an English geologist, first thought of this swath of terrain as forming a single coherent landscape, comparable to the European Alps. Gowland's view was further developed by another Englishman and Christian missionary, Walter Weston, who was able "to canonize Gowland's geographical conception, deploying it as a de facto proper noun". Gowland explored several parts of the ranges in the 1860s, being the first documented foreigner to climb two peaks in the Alps, Mount Yari and Mount Norikura. Gowland was an archaeologist, and he explored these ranges for archaeological reasons. While Gowland was the first foreigner to explore the ranges, Reverend Walter Weston, a Christian missionary, was the first foreigner to document his experiences. About twenty years after Gowland's explorations, Weston explored the ranges himself with Gowland's notes on his explorations. Weston was led up many mountains by Kamijō Kamonji, a mountain guide living in Kamikōchi.
Weston explored the same ranges that Gowland had previously traversed, ascending Mount Shirouma, Mount Jōnen, Mount Kasa, Mount Hotaka, and other minor mountains. Weston first documented the two main mountain systems distinguishable by geological structure. The first of these he called the "China system", because of its connection with southeast China from just south of the Japanese archipelago. The second he called the "Karafuto system", because it enters Japan from Karafuto to the north and runs southwest. The two are considered the first Western explorers of the range, and as a result Weston, building on Gowland's work, popularized and documented different parts of the ranges in great depth for others to expand on. In 1907, Yoshitaro Shibasaki and others succeeded in climbing Mount Tsurugi, said to be the last unexplored peak in Japan and the most difficult to climb. On this occasion they found the ornamental head of a metal shugenja staff and a sword on the top of the mountain. A scientific investigation later dated the staff head and the sword to between the late Nara period and the early Heian period; it turned out that Mount Tsurugi had already been climbed by shugenja more than 1,000 years earlier. From the 1960s to the 1970s, the transportation infrastructure of the Japanese Alps was improved, and access to some popular mountain areas became dramatically easier, bringing not only more climbers but also tourists. The Komagatake Ropeway opened in 1967, the Shinhotaka Ropeway in 1970, and the Tateyama Kurobe Alpine Route fully opened in 1971. Ranges Today, the Japanese Alps encompass the Hida Mountains (飛騨山脈), the Kiso Mountains (木曽山脈) and the Akaishi Mountains (赤石山脈). These towering ranges include several peaks exceeding 3,000 m (9,843 ft) in height, the tallest peaks in Japan after Mount Fuji.
The highest are Mount Hotaka at 3,190 m (10,466 ft) in the northern area and Mount Kita at 3,193 m (10,476 ft) in the southern area. Mount Ontake, being far from the Hida Mountains, is generally not counted among them, but it is often mentioned together with the Japanese Alps in mountain guidebooks. Mount Ontake is well known as an active volcano, having erupted most recently in 2014. The Northern Alps, also known as the Hida Mountains, stretch through Nagano, Toyama and Gifu prefectures, with a small portion reaching into Niigata Prefecture. They include Mount Norikura, Mount Yake, Mount Kasa, Mount Hotaka, Mount Yari, Mount Jōnen, Mount Washiba, Mount Suisho, Mount Yakushi, Mount Kurobegorō, Mount Tate, Mount Tsurugi, Kashima Yarigatake (鹿島槍ヶ岳), Goryū dake (五竜岳), and Mount Shirouma, among others. The Central Alps, also known as the Kiso Mountains, are located in Nagano Prefecture. They include Mount Ena, Anpaiji mountain (安平路山), Mount Kusumoyama (越百山), Mount Minamikoma, Mount Utsugi, Mount Hōken, Mount Kisokoma, and Kyogatake (経ヶ岳), among others. The Southern Alps, also known as the Akaishi Mountains, span Nagano, Yamanashi, and Shizuoka prefectures. They include Mount Tekari, Mount Hijiri, Mount Akaishi, Mount Arakawa, Mount Shiomi, Mount Nōtori, Mount Aino, Mount Kita, Mount Hōō, Mount Kaikoma, Mount Senjō, and Mount Nokogiri (Akaishi), among others. Glaciers Geographers previously believed that no active glaciers existed in Japan. In May 2012, the Japanese Society of Snow and Ice found this to be false. By studying surface flow velocity and snow patches on Mount Tsurugi, researchers found that certain perennial snow patches contain large masses of ice, upwards of 30 meters in thickness, qualifying them as active glaciers. As of 2019, seven active glaciers had been identified in the Japanese Alps, the only ones in all of Japan.
========================================
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-189] | [TOKENS: 10728]
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. He convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home entertainment systems. Although Kutaragi was nearly fired because he had worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges, in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony wanted to use its experience in consumer electronics to produce its own video game hardware.
Although the initial agreement between Nintendo and Sony concerned producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible, Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving it a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software, which it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over its licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station.
At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop its own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as the company had broken an "unwritten law" that native companies do not turn against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted this research but decided to develop what it had built with Nintendo and Sega into its own console based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony should not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992.
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992 with Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, there remained opposition from a majority present at the meeting, including older Sony executives who saw Nintendo and Sega as "toy" manufacturers. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded Ohga of the humiliation Sony had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama said that Sony further wanted to emphasise the new console's ability to use Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay. Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback on the "PlayStation" name in focus group studies. Early advertising prior to the console's North American launch referenced PSX, but the term was scrapped before launch. Unlike Nintendo's consoles, the PlayStation was not marketed under Sony's name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since it rivalled Sega in the arcade market. Signing these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of its own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation, recalling that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as the company played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of a condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony abandoned its plans for a workstation-based development system in favour of SN Systems', securing a cheaper and more efficient method of designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems went on to produce development kits for future PlayStation systems, including the PlayStation 2, and was bought by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo.
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour its own products over non-Sony ones; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, whereas inexpensive compact disc manufacturing was available at dozens of locations around the world. The PlayStation's architecture and its interconnectability with PCs were beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded the future compatibility of the machine should Sony decide to make further hardware revisions. Despite this inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC, and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM in the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt that he and his team succeeded in this regard. The technical specifications were finalised in 1993 and the design during 1994. The PlayStation name and its final design were confirmed at a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged him for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold it in the first few weeks owing to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. A US games retailer later recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995.
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, called Steve Race, head of development, to the conference stage, who simply said "$299" and walked off to a round of applause. The attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer contributed to the PlayStation's early success, with some critics considering it superior to Sega's arcade counterpart Daytona USA (1994), as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million.
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of games to consoles sold was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched as a test market during 1999–2000 through Sony showrooms, selling 100 units. Sony finally launched the console (in its PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, a third company's registration of the trademark prevented an official release, so the officially distributed Sega Saturn initially took over the market; as the Sega console withdrew, however, PlayStation imports, along with large-scale piracy, increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation grew to a base of 300,000 users by January 2000, although Sony China had no plans to release it.
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people growing into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans in which the controller's button symbols stood in for missing letters: "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (with a red "E", read as "you are not ready"). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'" As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence that early-1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity.
Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be tested. The Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded its CD production facilities in the United States due to the high demand for PlayStation games, increasing monthly output from 4 million to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. In 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry.
Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in its new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PSOne, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led Sega to retire the Dreamcast in 2001 and abandon the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix-maths coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, offers sampling rates of up to 44.1 kHz, and supports music sequencing.
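As an arithmetic aside (an observation derived from the figures above, not a claim made in the article): the CPU clock of 33.8688 MHz is an exact whole-number multiple of the 44.1 kHz CD-audio sampling rate, which is easy to verify:

```python
# Verify that the PlayStation CPU clock (33.8688 MHz) is an exact
# multiple of the CD-audio sampling rate (44.1 kHz); both figures
# are taken from the hardware description above.
cpu_clock_hz = 33_868_800   # 33.8688 MHz
cd_sample_rate_hz = 44_100  # 44.1 kHz

# divmod returns (quotient, remainder); a zero remainder means
# the clock divides evenly by the sample rate.
ratio, remainder = divmod(cpu_clock_hz, cd_sample_rate_hz)
print(ratio, remainder)  # 768 0
```

The clock is exactly 768 times the sampling rate, consistent with the console's CD-ROM-centric design.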
It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, the MDEC, which is integrated into the CPU and allows for the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can render up to 360,000 flat-shaded polygons per second, or 180,000 texture-mapped and light-sourced polygons per second, and can also generate up to 4,000 sprites. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model. Subsequent models also dropped the parallel port, with the final versions retaining only a serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries.
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was only available through an ordering service and came with the documentation and software needed to program PlayStation games and applications using C compilers. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles – including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), two shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square. Rather than depicting traditionally used letters or numbers on its buttons, the PlayStation controller established a trademark scheme that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper, to be used to access menus.
The European and North American models of the original PlayStation controller are roughly 10% larger than their Japanese counterpart, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design, giving users more freedom of movement in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons, mapped to clicking the sticks in), the Dual Analog Controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller, its name deriving from its use of two (dual) vibration motors (shock). Unlike its predecessor, it features textured rubber grips on its analogue sticks, longer handles and slightly different shoulder buttons, and includes rumble feedback as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra functionality to the PlayStation. These include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio.
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without a game disc inserted or before closing the CD tray, which brings up a graphical user interface (GUI) for the PlayStation BIOS. The GUI differs depending on the model and firmware version: the original PlayStation GUI has a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI has a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing the use of PlayStation BIOSes on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R discs and optical disc drives with burning capability.
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so a PlayStation disc's actual content could still be read by a conventional disc drive; however, such a drive could not detect the wobble frequency, and duplicated discs therefore omitted it, because the laser pick-up system of an ordinary optical drive interprets the wobble as an oscillation of the disc surface and compensates for it in the reading process. Early PlayStations, particularly early 1000-series models, experience skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects on the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out – usually unevenly – due to friction.
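The disc-authentication scheme described above can be sketched conceptually. This is a simplified illustration: the function and field names are invented for the sketch, and the real check is performed by the drive's pick-up hardware and the BIOS rather than by software like this.

```python
# Conceptual sketch of the wobble-based boot check described above.
# All names here are illustrative; the real check lives in the drive
# hardware and BIOS, not in game-visible software.

REGION_CODES = {"SCEI": "Japan", "SCEA": "North America", "SCEE": "Europe"}

def read_pregap_wobble(disc):
    """Decode the deliberate wobble signature from the disc's pregap.

    A conventional CD burner treats the wobble as disc-surface oscillation
    and compensates for it, so burned copies carry no signature (None).
    """
    return disc.get("wobble_string")

def boot_check(disc, console_region_code):
    code = read_pregap_wobble(disc)
    if code is None:
        return False  # burned copy: wobble signature missing
    if code != console_region_code:
        return False  # genuine disc, but wrong region (regional lockout)
    return True

genuine_us_disc = {"wobble_string": "SCEA"}
burned_copy = {}  # duplicated disc lacks the wobble signature

print(boot_check(genuine_us_disc, "SCEA"))  # True
print(boot_check(burned_copy, "SCEA"))      # False
```

The key property the sketch captures is that the signature survives pressing but not burning, so the same mechanism serves as both copy protection and region lockout.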
The placement of the laser unit close to the power supply accelerates wear, as the additional heat makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser tilts and no longer points directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment stood at 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives from this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred it. Reception The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console. The staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4.5 out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo. Famicom Tsūshin scored the console 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5 – for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years as developers mastered the system's capabilities and Sony revised its stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily because third-party developers almost unanimously favoured it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, whose video game division came to contribute 23% of the company's profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, continuing the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third-best console, crediting its sophisticated 3D capabilities as a key factor in its mass success and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh-best console in its list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh-best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising to bring a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64; it was likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given its substantial reliance on licensing and exclusive games for revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared to two to three months. Further, the cost of production per unit was far cheaper, allowing Sony to offer games at about 40% lower cost to the consumer than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published a wide variety of games in smaller quantities as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could quickly produce larger volumes of popular games to get onto the market, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything – the whole PlayStation format – is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
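The cost comparison above can be made concrete with a toy calculation. The dollar figures below are hypothetical round numbers chosen only to illustrate the relationship between media cost, retail price, and net revenue; they are not Sony's or Nintendo's actual costs.

```python
# Toy per-unit economics for the cartridge-vs-CD comparison described above.
# All dollar figures are hypothetical illustrations, not historical data.

def publisher_net(retail_price, media_cost):
    """Net revenue per unit, ignoring retail margin and licensing for simplicity."""
    return retail_price - media_cost

cartridge_price, cartridge_media = 70.0, 30.0  # assumed ROM cartridge game
cd_price, cd_media = 42.0, 2.0                 # assumed CD-ROM game

discount = 1 - cd_price / cartridge_price
print(f"CD game is {discount:.0%} cheaper at retail")  # 40% cheaper
print(publisher_net(cartridge_price, cartridge_media))  # 40.0
print(publisher_net(cd_price, cd_media))                # 40.0 -- same net revenue
```

Under these assumed numbers, cheap disc pressing lets the shelf price drop by roughly 40% while the per-unit net revenue stays the same, which is the relationship the paragraph above describes.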
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (the two companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo itself or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a system on a chip with four Cortex-A35 central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly. See also Notes References
========================================
[SOURCE: https://en.wikipedia.org/wiki/BBC_News#cite_note-33] | [TOKENS: 8810]
BBC News BBC News is an operational business division of the British Broadcasting Corporation (BBC) responsible for the gathering and broadcasting of news and current affairs in the UK and around the world. The department is the world's largest broadcast news organisation and generates about 120 hours of radio and television output each day, as well as online news coverage. The service has over 5,500 journalists working across its output, including in 50 foreign news bureaus where more than 250 foreign correspondents are stationed. Deborah Turness has been the CEO of news and current affairs since September 2022. A 2019 Ofcom report found that the BBC spent £136m on news between April 2018 and March 2019. BBC News' domestic, global and online news divisions are housed in the largest live newsroom in Europe, in Broadcasting House in central London. Parliamentary coverage is produced and broadcast from studios in London. Through BBC English Regions, the BBC also has regional centres across England and national news centres in Northern Ireland, Scotland and Wales. All nations and English regions produce their own local news programmes and other current affairs and sport programmes. The BBC is a quasi-autonomous corporation authorised by royal charter, making it operationally independent of the government. As of 2024, the BBC reaches an average of 450 million people per week, with the BBC World Service accounting for 320 million. History This is London calling – 2LO calling. Here is the first general news bulletin, copyright by Reuters, Press Association, Exchange Telegraph and Central News. – BBC news programme opening during the 1920s The British Broadcasting Company broadcast its first radio bulletin from radio station 2LO on 14 November 1922.
Wishing to avoid competition, newspaper publishers persuaded the government to ban the BBC from broadcasting news before 7 p.m., and to force it to use wire service copy instead of reporting on its own. The BBC gradually gained the right to edit the copy and, in 1934, created its own news operation; however, it could not broadcast news before 6 p.m. until World War II. In addition to news, Gaumont British and Movietone cinema newsreels had been broadcast on the TV service since 1936, with the BBC producing its own equivalent, the Television Newsreel programme, from January 1948. A weekly Children's Newsreel was inaugurated on 23 April 1950, broadcasting to around 350,000 receivers. The network began simulcasting its radio news on television in 1946, accompanied by a still picture of Big Ben. Televised bulletins began on 5 July 1954, broadcast from leased studios within Alexandra Palace in London. The public's interest in television and live events was stimulated by Elizabeth II's coronation in 1953. It is estimated that up to 27 million people viewed the programme in the UK, overtaking radio's audience of 12 million for the first time. Those live pictures were fed from 21 cameras in central London to Alexandra Palace for transmission, and then on to other UK transmitters opened in time for the event. That year, there were around two million TV Licences held in the UK, rising to over three million the following year, and four and a half million by 1955. Television news, although physically separate from its radio counterpart, was still firmly under radio news' control in the 1950s. Correspondents provided reports for both outlets, and the first televised bulletin, shown on 5 July 1954 on the then BBC television service and presented by Richard Baker, involved his providing narration off-screen while stills were shown. This was then followed by the customary Television Newsreel, with a recorded commentary by John Snagge (and on other occasions by Andrew Timothy).
On-screen newsreaders were introduced a year later in 1955 – Kenneth Kendall (the first to appear in vision), Robert Dougall, and Richard Baker – three weeks before ITN's launch on 21 September 1955. Mainstream television production had started to move out of Alexandra Palace in 1950 to larger premises – mainly at Lime Grove Studios in Shepherd's Bush, west London – taking Current Affairs (then known as Talks Department) with it. It was from here that the first Panorama, a new documentary programme, was transmitted on 11 November 1953, with Richard Dimbleby becoming anchor in 1955. In 1958, Hugh Carleton Greene became head of News and Current Affairs. On 1 January 1960, Greene became Director-General. Greene made changes that were aimed at making BBC reporting more similar to its competitor ITN, which had been highly rated by study groups held by Greene. A newsroom was created at Alexandra Palace, television reporters were recruited and given the opportunity to write and voice their own scripts, without having to cover stories for radio too. On 20 June 1960, Nan Winton, the first female BBC network newsreader, appeared in vision. 19 September 1960 saw the start of the radio news and current affairs programme The Ten O'Clock News. BBC2 started transmission on 20 April 1964 and began broadcasting a new show, Newsroom. The World at One, a lunchtime news programme, began on 4 October 1965 on the then Home Service, and the year before News Review had started on television. News Review was a summary of the week's news, first broadcast on Sunday, 26 April 1964 on BBC2 and harking back to the weekly Newsreel Review of the Week, produced from 1951 to open programming on Sunday evenings – the difference being that this incarnation had subtitles for the deaf and hard-of-hearing.
As this was the decade before electronic caption generation, each superimposition ("super") had to be produced on paper or card, synchronised manually to studio and news footage, committed to tape during the afternoon, and broadcast early evening. Thus Sundays were no longer a quiet day for news at Alexandra Palace. The programme ran until the 1980s – by then using electronic captions, known as Anchor – to be superseded by Ceefax subtitling (a similar teletext format) and the signing of such programmes as See Hear (from 1981). On Sunday 17 September 1967, The World This Weekend, a weekly news and current affairs programme, launched on what was then the Home Service, but soon to be Radio 4. Preparations for colour began in the autumn of 1967, and on Thursday 7 March 1968 Newsroom on BBC2 moved to an early evening slot, becoming the first UK news programme to be transmitted in colour – from Studio A at Alexandra Palace. News Review and Westminster (the latter a weekly review of parliamentary happenings) were "colourised" shortly after. However, much of the insert material was still in black and white, as initially only a part of the film coverage shot in and around London was on colour reversal film stock, and all regional and many international contributions were still in black and white. Colour facilities at Alexandra Palace were technically very limited for the next eighteen months, as it had only one RCA colour Quadruplex videotape machine and, eventually, two Pye Plumbicon colour telecines – although the news colour service started with just one. Black and white national bulletins on BBC1 continued to originate from Studio B on weekdays, along with Town and Around, the London regional "opt-out" programme broadcast throughout the 1960s (and the BBC's first regional news programme for the South East), until it started to be replaced by Nationwide on Tuesday to Thursday from Lime Grove Studios early in September 1969.
Town and Around never made the move to Television Centre – instead it became London This Week, which aired on Mondays and Fridays only, from the new TVC studios. The BBC moved news production out of Alexandra Palace in 1969, and BBC Television News resumed operations the following day with a lunchtime bulletin on BBC1 – in black and white – from Television Centre, where it remained until March 2013. This move to a smaller studio with better technical facilities allowed Newsroom and News Review to replace back projection with colour-separation overlay. During the 1960s, satellite communication had become possible; however, it was some years before digital line-store conversion was able to undertake the process seamlessly. On 14 September 1970, the first Nine O'Clock News was broadcast on television. Robert Dougall presented the first week from studio N1 – described by The Guardian as "a sort of polystyrene padded cell" – the bulletin having been moved from the earlier time of 20:50 as a response to the ratings achieved by ITN's News at Ten, introduced three years earlier on the rival ITV. Richard Baker and Kenneth Kendall presented subsequent weeks, thus echoing those first television bulletins of the mid-1950s. Angela Rippon became the first female news presenter of the Nine O'Clock News in 1975. Her work outside the news was controversial at the time: she appeared on The Morecambe and Wise Christmas Show in 1976, singing and dancing. The first edition of John Craven's Newsround, initially intended only as a short series and later renamed simply Newsround, came from studio N3 on 4 April 1972. Afternoon television news bulletins during the mid to late 1970s were broadcast from the BBC newsroom itself, rather than one of the three news studios. The newsreader would present to camera while sitting on the edge of a desk; behind him, staff would be seen working busily at their desks.
This period coincided with the next makeover of the Nine O'Clock News, which used a CSO background of the newsroom from that very same camera each weekday evening. Also in the mid-1970s, the late night news on BBC2 was briefly renamed Newsnight, but this was not to last, nor was it the same programme as the one known today – that would be launched in 1980 – and it soon reverted to being just a news summary, with the early evening BBC2 news expanded to become Newsday. News on radio also changed in the 1970s, on Radio 4 in particular, brought about by the arrival of new editor Peter Woon from television news and the implementation of the Broadcasting in the Seventies report. The changes included the introduction of correspondents into news bulletins, where previously only a newsreader would present, as well as the inclusion of content gathered in the preparation process. New programmes, PM and The World Tonight, were also added to the daily schedule as part of the plan for the station to become a "wholly speech network". Newsbeat launched as the news service on Radio 1 on 10 September 1973. On 23 September 1974, a teletext system was launched to bring news content to television screens using text only. Engineers had originally begun developing such a system to bring news to deaf viewers, but the system was expanded. The Ceefax service became much more diverse before it ceased on 23 October 2012: it not only provided subtitling for all channels, it also gave information such as weather, flight times and film reviews. By the end of the decade, the practice of shooting on film for inserts in news broadcasts was declining with the introduction of ENG technology into the UK. The equipment would gradually become less cumbersome – the BBC's first attempts, in the latter half of the decade, had used a Philips colour camera with a backpack base station and a separate portable Sony U-matic recorder.
In 1980, the Iranian Embassy siege was shot electronically by the BBC Television News outside broadcast team, and the work of reporter Kate Adie, broadcasting live from Prince's Gate, was nominated for the BAFTA for actuality coverage, though beaten by ITN for the 1980 award. Newsnight, the news and current affairs programme, was due to go on air on 23 January 1980, although trade union disagreements meant that its launch from Lime Grove was postponed by a week. On 27 August 1981, Moira Stuart became the first African-Caribbean female newsreader to appear on British television. By 1982, ENG technology had become sufficiently reliable for Bernard Hesketh to use an Ikegami camera to cover the Falklands War, coverage for which he won the Royal Television Society Cameraman of the Year award and a BAFTA nomination – the first time that BBC News had relied upon an electronic camera, rather than film, in a conflict zone. BBC News won the BAFTA for its actuality coverage; however, the event is remembered in television terms for Brian Hanrahan's reporting, in which he coined the phrase "I'm not allowed to say how many planes joined the raid, but I counted them all out and I counted them all back" to circumvent reporting restrictions, and which has been cited as an example of good reporting under pressure. The first BBC breakfast television programme, Breakfast Time, also launched during the 1980s, on 17 January 1983, from Lime Grove Studio E and two weeks before its ITV rival TV-am. Frank Bough, Selina Scott, and Nick Ross helped to wake viewers with a relaxed style of presenting. The Six O'Clock News first aired on 3 September 1984, eventually becoming the most watched news programme in the UK (although since 2006 it has been overtaken by the BBC News at Ten). In October 1984, images of millions of people starving to death in the Ethiopian famine were shown in Michael Buerk's Six O'Clock News reports.
The BBC News crew were the first to document the famine, with Buerk's report on 23 October describing it as "a biblical famine in the 20th century" and "the closest thing to hell on Earth". The BBC News report shocked Britain, motivating its citizens to inundate relief agencies, such as Save the Children, with donations, and bringing global attention to the crisis in Ethiopia. The news report was also watched by Bob Geldof, who went on to organise the charity single "Do They Know It's Christmas?" to raise money for famine relief, followed by the Live Aid concert in July 1985. Starting in 1981, the BBC gave a common theme to its main news bulletins with new electronic titles – a set of computer-animated "stripes" forming a circle on a red background, with a "BBC News" typescript appearing below the circle graphics, and a theme tune consisting of brass and keyboards. The Nine used a similar (striped) number 9. The red background was replaced by a blue one from 1985 until 1987. By 1987, the BBC had decided to re-brand its bulletins and established individual styles again for each one, with differing titles and music; the weekend and holiday bulletins were branded in a similar style to the Nine, although the "stripes" introduction continued to be used until 1989 on occasions where a news bulletin was screened outside the running order of the schedule. In 1987, John Birt resurrected the practice of correspondents working for both TV and radio with the introduction of bi-media journalism. During the 1990s, a wider range of services began to be offered by BBC News, with the split of BBC World Service Television into BBC World (news and current affairs) and BBC Prime (light entertainment). Content for a 24-hour news channel was thus required, followed in 1997 by the launch of the domestic equivalent, BBC News 24. Rather than set bulletins, ongoing reports and coverage were needed to keep both channels functioning, which meant a greater emphasis on budgeting for both was necessary.
In 1998, after 66 years at Broadcasting House, the BBC Radio News operation moved to BBC Television Centre. New technology, provided by Silicon Graphics, came into use in 1993 for a relaunch of the main BBC One bulletins, creating a virtual set which appeared to be much larger than it was physically. The relaunch also brought all bulletins into the same style of set, with only small changes in colouring, titles, and music to differentiate each. A computer-generated cut-glass sculpture of the BBC coat of arms was the centrepiece of the programme titles until the large-scale corporate rebranding of news services in 1999. In November 1997, BBC News Online was launched, following individual webpages for major news events such as the 1996 Olympic Games, the 1997 general election, and the death of Princess Diana. In 1999, the biggest relaunch occurred, with BBC One bulletins, BBC World, BBC News 24, and BBC News Online all adopting a common style. One of the most significant changes was the gradual adoption of the corporate image by the BBC regional news programmes, giving a common style across local, national and international BBC television news. This also included Newyddion, the main news programme of the Welsh-language channel S4C, produced by BBC News Wales. Following the relaunch of BBC News in 1999, regional headlines were included at the start of the BBC One news bulletins in 2000. The English regions did, however, lose five minutes at the end of their bulletins, due to a new headline round-up at 18:55. 2000 also saw the Nine O'Clock News moved to the later time of 22:00. This was in response to ITN, which had just moved its popular News at Ten programme to 23:00. ITN briefly brought back News at Ten, but following poor ratings when head to head against the BBC's Ten O'Clock News, the ITN bulletin was moved to 22:30, where it remained until 14 January 2008.
The departure of Peter Sissons and Michael Buerk from the Ten O'Clock News led to changes in the BBC One bulletin presenting team on 20 January 2003. The Six O'Clock News became double-headed, with George Alagiah and Sophie Raworth, after Huw Edwards and Fiona Bruce moved to present the Ten. A new set design featuring a projected fictional newsroom backdrop was introduced, followed on 16 February 2004 by new programme titles to match those of BBC News 24. BBC News 24 and BBC World introduced a new style of presentation in December 2003, which was slightly altered on 5 July 2004 to mark 50 years of BBC Television News. On 7 March 2005, director-general Mark Thompson launched the "Creative Futures" project to restructure the organisation. The individual positions of editor of the One and Six O'Clock News were replaced by a new daytime position in November 2005. Kevin Bakhurst became the first Controller of BBC News 24, replacing the position of editor; Amanda Farnsworth became daytime editor, while Craig Oliver was later named editor of the Ten O'Clock News. Bulletins received new titles and a new set design in May 2006, to allow Breakfast to move into the main studio for the first time since 1997. The new set featured Barco videowall screens, with a background of the London skyline used for the main bulletins and originally an image of cirrus clouds against a blue sky for Breakfast; the latter was replaced following viewer criticism. The studio bore similarities to that of the ITN-produced ITV News from 2004, though ITN uses a CSO virtual studio rather than the actual screens at BBC News. BBC News became part of a new BBC Journalism group in November 2006 as part of a restructuring of the BBC. The then Director of BBC News, Helen Boaden, reported to the then Deputy Director-General and head of the journalism group, Mark Byford, until he was made redundant in 2010.
On 18 October 2007, Director-General Mark Thompson announced a six-year plan, "Delivering Creative Futures" (based on his project begun in March 2005), merging the television current affairs department into a new "News Programmes" division. Thompson's announcement, in response to a £2 billion shortfall in funding, would, he said, deliver "a smaller but fitter BBC" in the digital age, by cutting its payroll and, in 2013, selling Television Centre. The various separate newsrooms for television, radio and online operations were merged into a single multimedia newsroom, and programme making within the newsrooms was brought together to form a multimedia programme-making department. BBC World Service director Peter Horrocks said that the changes would achieve efficiency at a time of cost-cutting at the BBC. In his blog, he wrote that using the same resources across the various broadcast media meant either that fewer stories could be covered or that, in following more stories, there would be fewer ways to broadcast them. A new graphics and video playout system was introduced for the production of television bulletins in January 2007. This coincided with a new structure for BBC World News bulletins, with editors favouring a section devoted to analysing the news stories reported on. The first new BBC News bulletin since the Six O'Clock News was announced in July 2007, following a successful trial in the Midlands. The summary, lasting 90 seconds, has been broadcast at 20:00 on weekdays since December 2007 and bears similarities to 60 Seconds on BBC Three, but also includes headlines from the various BBC regions and a weather summary. As part of a long-term cost-cutting programme, the bulletins were renamed the BBC News at One, Six and Ten respectively in April 2008, while BBC News 24 was renamed BBC News and moved into the same studio as the bulletins at BBC Television Centre. BBC World was renamed BBC World News, and regional news programmes were also updated with the new presentation style, designed by Lambie-Nairn.
2008 also saw tri-media working introduced across TV, radio, and online. The studio moves also meant that Studio N9, previously used for BBC World, was closed, with operations moving to the previous studio of BBC News 24. Studio N9 was later refitted to match the new branding and was used for the BBC's coverage of the UK local elections and European elections in early June 2009. A strategy review of the BBC in March 2010 confirmed that having "the best journalism in the world" would form one of five key editorial policies, as part of changes subject to public consultation and BBC Trust approval. After a period of suspension in late 2012, Helen Boaden ceased to be the Director of BBC News. On 16 April 2013, incoming BBC Director-General Tony Hall named James Harding, a former editor of The Times, as Director of News and Current Affairs. From August 2012 to March 2013, all news operations moved from Television Centre to new facilities in the refurbished and extended Broadcasting House in Portland Place. The move began in October 2012 and also included the BBC World Service, which moved from Bush House following the expiry of the BBC's lease. The new extension to the north and east, referred to as "New Broadcasting House", includes several new state-of-the-art radio and television studios centred around an 11-storey atrium. The move began with the domestic programme The Andrew Marr Show on 2 September 2012 and concluded with the move of the BBC News channel and domestic news bulletins on 18 March 2013. The newsroom houses all domestic bulletins and programmes on both television and radio, as well as the BBC World Service international radio networks and the BBC World News international television channel. BBC News and CBS News established an editorial and newsgathering partnership in 2017, replacing an earlier long-standing partnership between BBC News and ABC News.
In an October 2018 Simmons Research survey of 38 news organisations, BBC News was ranked the fourth most trusted news organisation by Americans, behind CBS News, ABC News and The Wall Street Journal. In January 2020, the BBC announced a BBC News savings target of £80 million per year by 2022, involving about 450 staff reductions from the then 6,000. BBC director of news and current affairs Fran Unsworth said there would be further moves toward digital broadcasting, in part to attract back a youth audience, and more pooling of reporters to stop separate teams covering the same news. A further 70 staff reductions were announced in July 2020. BBC Three began airing the news programme The Catch Up in February 2022. It is presented by Levi Jouavel, Kirsty Grant, and Callum Tulley and aims to help the channel's target audience (16 to 34-year-olds) make sense of the world around them, while also highlighting optimistic stories. Compared to its predecessor 60 Seconds, The Catch Up is three times longer, running for about three minutes, and does not air at weekends. According to its annual report, as of December 2021 India has the largest number of people using BBC services in the world. In May 2025, following the earthquake that hit Myanmar and Thailand, a television news bulletin (BBC News Myanmar) from the Burmese service began broadcasting, using a vacated Voice of America satellite frequency.
Programming and reporting
In November 2023, BBC News joined with the International Consortium of Investigative Journalists, Paper Trail Media and 69 media partners, including Distributed Denial of Secrets and the Organised Crime and Corruption Reporting Project (OCCRP), and more than 270 journalists in 55 countries and territories to produce the 'Cyprus Confidential' report on the financial network which supports the regime of Vladimir Putin, mostly with connections to Cyprus; it showed Cyprus to have strong links with senior figures in the Kremlin, some of whom have been sanctioned. Government officials, including Cyprus president Nikos Christodoulides and European lawmakers, began responding to the investigation's findings in less than 24 hours, calling for reforms and launching probes. BBC News is responsible for the news programmes and documentary content on the BBC's general television channels, as well as the news coverage on the BBC News Channel in the UK, and 22 hours of programming for the corporation's international BBC World News channel. Coverage for BBC Parliament is carried out on behalf of the BBC at Millbank Studios, though BBC News provides editorial and journalistic content. BBC News content is also output onto the BBC's digital interactive television services under the BBC Red Button brand, and until 2012, on the Ceefax teletext system. The music used on all BBC television news programmes was introduced in 1999 and composed by David Lowe. It was part of the rebranding which commenced in 1999 and features the 'BBC pips'. The general theme was used on bulletins on BBC One, News 24, BBC World and local news programmes in the BBC's Nations and Regions. Lowe was also responsible for the music for Radio 1's Newsbeat. The theme has had several changes since 1999, the latest in March 2013.
The BBC Arabic Television news channel launched on 11 March 2008, and a Persian-language channel followed on 14 January 2009, broadcasting from the Peel wing of Broadcasting House; both include news, analysis, interviews, sport and cultural programmes, and are run by the BBC World Service and funded from a grant-in-aid from the British Foreign Office (and not the television licence). The BBC Verify service was launched in 2023 to fact-check news stories, followed by BBC Verify Live in 2025. BBC Radio News produces bulletins for the BBC's national radio stations and provides content for local BBC radio stations via the General News Service (GNS), a BBC-internal news distribution service. BBC News does not produce the BBC's regional news bulletins, which are produced individually by the BBC nations and regions themselves. The BBC World Service broadcasts to some 150 million people in English as well as in 27 other languages across the globe. BBC Radio News is a patron of the Radio Academy. BBC News Online is the BBC's news website. Launched in November 1997, it is one of the most popular news websites, with 1.2 billion website visits in April 2021, and is used by 60% of the UK's internet users for news. The website contains international news coverage as well as entertainment, sport, science, and political news. Mobile apps for Android, iOS and Windows Phone systems have been provided since 2010. Many television and radio programmes are also available to view on the BBC iPlayer and BBC Sounds services. The BBC News channel is also available to view 24 hours a day, while video and radio clips are also available within online news articles. In October 2019, BBC News Online launched a mirror on the Tor anonymity network (the dark web) in an effort to circumvent censorship.
Criticism
The BBC is required by its charter to be free from both political and commercial influence and answers only to its viewers and listeners. This political objectivity is sometimes questioned.
For instance, The Daily Telegraph (3 August 2005) carried a letter from the KGB defector Oleg Gordievsky referring to it as "The Red Service". Books have been written on the subject, including anti-BBC works such as Truth Betrayed by W. J. West and The Truth Twisters by Richard Deacon. The BBC has been accused of bias by Conservative MPs. The BBC's Editorial Guidelines on Politics and Public Policy state that while "the voices and opinions of opposition parties must be routinely aired and challenged", "the government of the day will often be the primary source of news". The BBC is regularly accused by the government of the day of bias in favour of the opposition and, by the opposition, of bias in favour of the government. Similarly, during times of war, the BBC is often accused by the UK government, or by strong supporters of British military campaigns, of being overly sympathetic to the view of the enemy. An edition of Newsnight at the start of the Falklands War in 1982 was described as "almost treasonable" by John Page, MP, who objected to Peter Snow saying "if we believe the British". During the first Gulf War, critics of the BBC took to using the satirical name "Baghdad Broadcasting Corporation". During the Kosovo War, the BBC was labelled the "Belgrade Broadcasting Corporation" (suggesting favouritism towards the FR Yugoslavia government over ethnic Albanian rebels) by British ministers, although Slobodan Milošević (then FRY president) claimed that the BBC's coverage had been biased against his nation. Conversely, some of those who style themselves anti-establishment in the United Kingdom, or who oppose foreign wars, have accused the BBC of pro-establishment bias or of refusing to give an outlet to "anti-war" voices.
Following the 2003 invasion of Iraq, a study by the Cardiff University School of Journalism of the reporting of the war found that nine out of 10 references to weapons of mass destruction during the war assumed that Iraq possessed them, and only one in 10 questioned this assumption. It also found that, out of the main British broadcasters covering the war, the BBC was the most likely to use the British government and military as its source. It was also the least likely to use independent sources, like the Red Cross, who were more critical of the war. When it came to reporting Iraqi casualties, the study found fewer reports on the BBC than on the other three main channels. The report's author, Justin Lewis, wrote "Far from revealing an anti-war BBC, our findings tend to give credence to those who criticised the BBC for being too sympathetic to the government in its war coverage. Either way, it is clear that the accusation of BBC anti-war bias fails to stand up to any serious or sustained analysis." Prominent BBC appointments are constantly assessed by the British media and political establishment for signs of political bias. The appointment of Greg Dyke as Director-General was highlighted by press sources because Dyke was a Labour Party member and former activist, as well as a friend of Tony Blair. The BBC's former Political Editor, Nick Robinson, was some years ago a chairman of the Young Conservatives and did, as a result, attract informal criticism from the former Labour government, but his predecessor Andrew Marr faced similar claims from the right because he was editor of The Independent, a liberal-leaning newspaper, before his appointment in 2000. Mark Thompson, former Director-General of the BBC, admitted the organisation has been biased "towards the left" in the past. He said, "In the BBC I joined 30 years ago, there was, in much of current affairs, in terms of people's personal politics, which were quite vocal, a massive bias to the left". 
He then added, "The organization did struggle then with impartiality. Now it is a completely different generation. There is much less overt tribalism among the young journalists who work for the BBC." Following the EU referendum in 2016, some critics suggested that the BBC was biased in favour of leaving the EU. For instance, in 2018, the BBC received complaints from people who took issue with the BBC not sufficiently covering anti-Brexit marches while giving smaller-scale events hosted by former UKIP leader Nigel Farage more airtime. On the other hand, a poll released by YouGov showed that 45% of people who voted to leave the EU thought that the BBC was 'actively anti-Brexit', compared to 13% of the same voters who thought the BBC was pro-Brexit. In 2008, BBC Hindi was criticised by some Indian outlets for referring to the terrorists who carried out the 2008 Mumbai attacks as "gunmen". The response to this added to prior criticism from some Indian commentators suggesting that the BBC may have an Indophobic bias. In March 2015, the BBC was criticised for a BBC Storyville documentary, India's Daughter, which interviewed one of the perpetrators of the 2012 Delhi gang rape. In spite of a ban ordered by the Indian High Court, the BBC still aired the documentary outside India. BBC News was at the centre of a political controversy following the 2003 invasion of Iraq. Three BBC News reports (Andrew Gilligan's on Today, Gavin Hewitt's on The Ten O'Clock News and another on Newsnight) quoted an anonymous source who stated that the British government (particularly the Prime Minister's office) had embellished the September Dossier with misleading exaggerations of Iraq's weapons of mass destruction capabilities. The government denounced the reports and accused the corporation of poor journalism. In subsequent weeks the corporation stood by the report, saying that it had a reliable source.
Following intense media speculation, David Kelly was named in the press as the source for Gilligan's story on 9 July 2003. Kelly was found dead, by suicide, in a field close to his home early on 18 July. An inquiry led by Lord Hutton was announced by the British government the following day to investigate the circumstances leading to Kelly's death; it concluded that "Dr. Kelly took his own life." In his report on 28 January 2004, Lord Hutton concluded that Gilligan's original accusation was "unfounded" and the BBC's editorial and management processes were "defective". In particular, it specifically criticised the chain of management that caused the BBC to defend its story. The BBC Director of News, Richard Sambrook, the report said, had accepted Gilligan's word that his story was accurate in spite of his notes being incomplete. The chairman of the Board of Governors, Gavyn Davies, had then told the Board that he was happy with the story and told the Prime Minister that a satisfactory internal inquiry had taken place. The Board of Governors, under Davies' guidance, accepted that further investigation of the Government's complaints was unnecessary. Because of the criticism in the Hutton report, Davies resigned on the day of publication. BBC News faced an important test, reporting on itself with the publication of the report, but by common consent (of the Board of Governors) managed this "independently, impartially and honestly". Davies' resignation was followed by that of the Director-General, Greg Dyke, the following day, and the resignation of Gilligan on 30 January. While undoubtedly a traumatic experience for the corporation, an ICM poll indicated that it had sustained its position as the best and most trusted provider of news.
Douglas Davis, the London correspondent of The Jerusalem Post, has described the BBC's coverage of the Arab–Israeli conflict as "a relentless, one-dimensional portrayal of Israel as a demonic, criminal state and Israelis as brutal oppressors [which] bears all the hallmarks of a concerted campaign of vilification that, wittingly or not, has the effect of delegitimising the Jewish state and pumping oxygen into a dark old European hatred that dared not speak its name for the past half-century." However, two large independent studies, one conducted by Loughborough University and the other by Glasgow University's Media Group, concluded that Israeli perspectives are given greater coverage. Critics of the BBC argue that the Balen Report proves systematic bias against Israel in headline news programming. The Daily Mail and The Daily Telegraph criticised the BBC for spending hundreds of thousands of British taxpayers' pounds on preventing the report from being released to the public. Jeremy Bowen, the Middle East editor for BBC World News, was singled out specifically for bias by the BBC Trust, which concluded that he violated "BBC guidelines on accuracy and impartiality." An independent panel appointed by the BBC Trust was set up in 2006 to review the impartiality of the BBC's coverage of the Israeli–Palestinian conflict. The panel's assessment was that "apart from individual lapses, there was little to suggest deliberate or systematic bias." While noting a "commitment to be fair, accurate and impartial" and praising much of the BBC's coverage, the independent panel concluded "that BBC output does not consistently give a full and fair account of the conflict. In some ways the picture is incomplete and, in that sense, misleading." It notes that "the failure to convey adequately the disparity in the Israeli and Palestinian experience, [reflects] the fact that one side is in control and the other lives under occupation".
Writing in the Financial Times, Philip Stephens, one of the panellists, later accused the BBC's director-general, Mark Thompson, of misrepresenting the panel's conclusions. He further opined: "My sense is that BBC news reporting has also lost a once iron-clad commitment to objectivity and a necessary respect for the democratic process. If I am right, the BBC, too, is lost". Mark Thompson published a rebuttal in the FT the next day. The description by one BBC correspondent reporting on the funeral of Yasser Arafat that she had been left with tears in her eyes led to further questions of impartiality, particularly from Martin Walker in a guest opinion piece in The Times, who picked out the apparent case of Fayad Abu Shamala, the BBC Arabic Service correspondent, who told a Hamas rally on 6 May 2001 that journalists in Gaza were "waging the campaign shoulder to shoulder together with the Palestinian people". Walker argues that the independent inquiry was flawed for two reasons. Firstly, the period over which it was conducted (August 2005 to January 2006) surrounded the Israeli withdrawal from Gaza and Ariel Sharon's stroke, which produced more positive coverage than usual. Furthermore, he wrote, the inquiry only looked at the BBC's domestic coverage, excluding output on the BBC World Service and BBC World. Tom Gross accused the BBC of glorifying Hamas suicide bombers, and condemned its policy of inviting guests such as Jenny Tonge and Tom Paulin, who have compared Israeli soldiers to Nazis. Writing for the BBC, Paulin said Israeli soldiers should be "shot dead" like Hitler's SS, and said he could "understand how suicide bombers feel". The BBC also faced criticism for not airing a Disasters Emergency Committee aid appeal for Palestinians who suffered in Gaza during the 22-day war there between late 2008 and early 2009. Most other major UK broadcasters did air the appeal, though rival Sky News did not.
British journalist Julie Burchill has accused the BBC of creating a "climate of fear" for British Jews over its "excessive coverage" of Israel compared to other nations. In light of the Gaza war, the BBC suspended seven Arab journalists over allegations of expressing support for Hamas via social media. The BBC and ABC share video segments and reporters as needed in producing their newscasts, with the BBC showing ABC World News Tonight with David Muir in the UK. However, in July 2017, the BBC announced a new partnership with CBS News that allows both organisations to share video, editorial content, and additional newsgathering resources in New York, London, Washington and around the world. BBC News subscribes to wire services from leading international agencies including PA Media (formerly the Press Association), Reuters, and Agence France-Presse. In April 2017, the BBC dropped the Associated Press in favour of an enhanced service from AFP. BBC News reporters and broadcasts are now, and have in the past been, banned in several countries, primarily for reporting that has been unfavourable to the ruling government. For example, correspondents were banned by the former apartheid regime of South Africa. The BBC was banned in Zimbabwe under Mugabe for eight years as a terrorist organisation until being allowed to operate again over a year after the 2008 elections. The BBC was banned in Burma (officially Myanmar) after its coverage and commentary on anti-government protests there in September 2007. The ban was lifted four years later in September 2011. Other cases have included Uzbekistan, China, and Pakistan. BBC Persian, the BBC's Persian-language news site, was blocked from the Iranian internet in 2006. The BBC News website was made available in China again in March 2008, but as of October 2014 it was blocked again.
In June 2015, the Rwandan government placed an indefinite ban on BBC broadcasts following the airing of a controversial documentary regarding the 1994 Rwandan genocide, Rwanda's Untold Story, broadcast on BBC2 on 1 October 2014. The UK's Foreign Office recognised "the hurt caused in Rwanda by some parts of the documentary". In February 2017, reporters from the BBC (as well as the Daily Mail, The New York Times, Politico, CNN, and others) were denied access to a United States White House briefing. In 2017, BBC India was banned for a period of five years from covering all national parks and sanctuaries in India. Following the withdrawal of CGTN's UK broadcasting licence on 4 February 2021 by Ofcom, China banned BBC News from airing in China. See also References External links
========================================
[SOURCE: https://www.mako.co.il/hix-bizarre/Article-61dbcae00b57c91026.htm] | [TOKENS: 2610]
Dozens of young people in a basement: police footage reveals a humiliating initiation ritual. Body-camera footage captured the dramatic moment when police officers broke up an unusual initiation ceremony at a fraternity house at the University of Iowa. In the basement of the Alpha Delta Phi house, 56 young people were found, some shirtless and others blindfolded, with unidentified white, brown and yellow substances on their bodies. Published: 19.02.26, 13:43. Photo: from social media. Officers were called to the scene after a fire alarm was triggered. One of them was heard asking the fraternity students: "Does someone want to explain to me what the hell I'm looking at here?" He then announced: "It's over. We're the police. Who's in charge here?" The officer instructed those present to remove their blindfolds, start cleaning the place and clear the room; despite this, none of them moved. "They're clearly taking this very seriously," the officer said in frustration. He even warned the students that the stunt could lead to the closure of the Alpha Delta Phi (ΑΔΦ) chapter on campus, but they still did not cooperate. The officers later spoke with fraternity members outside the building. "We came because of a fire alarm and tried to evacuate people, but from what I understand you refused," one of them said.
He added that the incident would be documented in a report and that the findings would be forwarded to the university administration. The incident took place in November 2024, and the body-camera footage was only recently cleared for publication. The University of Iowa responded to the case, stressing that humiliating initiation rituals (hazing) are strictly prohibited, both under the institution's code of conduct and under state law. The Alpha Delta Phi chapter was suspended for four years over the incident, and one person was arrested on suspicion of obstructing the investigation; however, the charges against him were later dropped.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Teletext] | [TOKENS: 6284]
Contents Teletext Teletext, or broadcast teletext, is a standard for displaying text and rudimentary graphics on suitably equipped television sets. Teletext sends data in the broadcast signal, hidden in the invisible vertical blanking interval (VBI) area at the top and bottom of the screen. The teletext decoder in the television buffers this information as a series of "pages", each given a number. The user can display chosen pages using their remote control. Teletext was uni-directional- the user could only receive, and not respond or send data of their own. Teletext was created in the United Kingdom in the early 1970s by John Adams, Philips' lead designer for video display units to provide closed captioning to television shows for the hearing impaired. Public teletext information services were introduced by major broadcasters in the UK, starting with the BBC's Ceefax service in 1974. It offered a range of text-based information, typically including news, weather and TV schedules. Similar systems were subsequently introduced by other television broadcasters in various countries, with launches in West Germany (ARD and ZDF's Videotext) and the Netherlands (NOS's Teletekst) in 1980. Teletext formed the basis for the World System Teletext standard (CCIR Teletext System B), an extended version of the original system. This standard saw widespread use across Europe starting in the 1980s, with almost all television sets including a decoder. Other standards were developed around the world, notably NABTS (CCIR Teletext System C) in the United States, Antiope (CCIR Teletext System A) in France and JTES (CCIR Teletext System D) in Japan, but these were never as popular as their European counterpart and most closed by the early 1990s. Teletext inspired the later Videotex system that enabled bi-directional communication in a format later recognised as a prototype of the World Wide Web (WWW). Implementations included the French Minitel and the British Prestel. 
Introduced by the General Post Office, Prestel used Teletext's display standards but instead ran over bi-directional telephone lines using modems. Most European teletext services continued to exist in one form or another until well into the 2000s when the expansion of the Internet precipitated a closure of some of them. However, many European television stations continue to provide teletext services and even make teletext content available via web and dedicated apps. The recent availability of digital television has led to more advanced systems being provided that perform the same task, such as MHEG-5 and Multimedia Home Platform. History Teletext is a means of sending text and simple geometric shapes to a properly equipped television screen by use of one of the "vertical blanking interval" lines that together form the dark band dividing pictures horizontally on the television screen. Transmitting and displaying subtitles was relatively easy. It requires limited bandwidth; at a rate of perhaps a few words per second. However, it was found that by combining even a slow data rate with a suitable memory, whole pages of information could be sent and stored on the TV for later recall. In the early 1970s, work was in progress in Britain to develop such a system. The goal was to provide UK rural homes with electronic hardware that could download pages of up-to-date news, reports, facts and figures targeting UK agriculture. The original idea was the brainchild of Philips (CAL) Laboratories in 1970. In 1971, CAL engineer John Adams created a design and proposal for UK broadcasters. His configuration contained all the fundamental elements of classic teletext including pages of 24 rows with 40 characters each, page selection, sub-pages of information and vertical blanking interval data transmission. A major objective for Adams during the concept development stage was to make teletext affordable to the home user. 
In reality, there was no scope to make an economic teletext system with 1971 technology. However, as the low cost was essential to the project's long-term success, this obstacle had to be overcome. Meanwhile, the General Post Office (GPO), whose telecommunications division later became British Telecom, had been researching a similar concept since the late 1960s, known as Viewdata. Unlike Teledata, a one-way service carried in the existing TV signal, Viewdata was a two-way system using telephones. Since the Post Office owned the telephones, this was considered to be an excellent way to drive more customers to use the phones. In 1972, the BBC demonstrated its system, now known as Ceefax ("seeing facts", the departmental stationery used the "Cx" logo), on various news shows; this was not actual broadcast teletext but was transmitted using a wired connection. The Independent Television Authority (ITA) announced its own service in 1973, known as ORACLE (Optional Reception of Announcements by Coded Line Electronics). Not to be outdone, the GPO immediately announced a 1200/75 baud videotext service under the name Prestel (this system was based on teletext protocols, but telephone-based). The TV-broadcast based systems were originally incompatible; Ceefax displayed pages of 24 lines with 32 characters each, while ORACLE offered pages of 22 lines with 40 characters each. In other ways the standards overlapped; for instance, both used 7-bit ASCII characters and other basic details. In 1974, all the services agreed on a standard for displaying the information. The display would be a simple 24 ร— 40 grid of text, with some graphics characters for constructing simple graphics. The standard did not define the delivery system, so both Viewdata-like and Teledata-like services could at least share the TV-side hardware (which at that time was quite expensive). 
The world's first demonstration of live broadcast teletext was a 50-page experimental transmission of ORACLE from the Crystal Palace transmitter to an invited audience on 9 April 1973. The BBC began occasional test transmissions of Ceefax using dummy pages in 1973-4, and by June 1974 it had requested government authorisation to begin formal testing. Approval was granted and the BBC news department put together an editorial team of nine, led by editor Colin McIntyre, to develop a news and information service which began on 23 September 1974. Initially limited to 30 pages, Ceefax expanded to 50 pages in early 1975 and was expected to grow to a "full magazine" of 100 pages later that year. Despite having been demonstrated first, ORACLE did not begin an experimental service until 30 June 1975. There was no industrial commitment to make consumer decoders until May 1975, when the UK arm of Texas Instruments was announced as the first company to build decoders that would be sold to TV manufacturers for incorporation into their own sets by early 1976. Before that point, the only existing teletext decoders were custom-built standalone experimental units, although there was enough technical information about the spec publicly available that the BBC acknowledged in late June 1974 that "a few gifted amateurs" had built their own. Wireless World magazine ran a series of articles between November 1975 and June 1976 describing the design and construction of a teletext decoder using mainly TTL devices; however, development was limited until the first TV sets with built-in decoders started appearing in 1976. By May 1976, volume production of teletext TVs had not yet started and while a handful of sets with teletext decoders were on sale, they cost about ยฃ1000 each (ยฃ11,350 in 2026). 
To try and reduce the cost of entry for consumers, the manufacturer Labgear produced a prototype standalone decoder box that could be used with any existing 625-line TV set, with an expected price point of ยฃ200 (ยฃ2260 in 2026), but this was still too expensive to prompt widespread adoption. By August 1977, it was estimated that up to 3000 teletext TVs had been sold, and both Ceefax and ORACLE were still officially considered "experimental" services. Neither service ever received a formal public launch to separate their experimental phase from full operation; the word "experimental" was simply dropped from references to them in the late 70s. From October 1977 to April 1978, an industrial dispute meant that ORACLE was blacked out nationally. From 1975 until 1977, ORACLE had operated for 12.5 hours a day Monday to Friday. A planned extension of its operating hours to cover the weekend prompted engineering staff at LWT, the national origination point for ORACLE, to request more money for the additional duties; this was refused, which led to the affected staff refusing to maintain the equipment during the week. A test page was broadcast instead. By December 1979, it was estimated that there were about 14,000 teletext TVs in use, and that was the first year that the Queen's Christmas speech was publicly announced as being subtitled on Ceefax. The "Broadcast Teletext Specification" was published in September 1976 jointly by the IBA, the BBC and the British Radio Equipment Manufacturers' Association. The new standard also made the term "teletext" generic, describing any such system. The standard was internationalised as World System Teletext (WST) by CCIR. Other systems entered commercial service, like Prestel (in 1979). Teletext became popular in the United Kingdom when Ceefax, Oracle and the British government promoted teletext through a massive campaign in 1981. 
By 1982, there were two million such sets, and by the mid-1980s they were available as an option for almost every European TV set, typically by means of a plug-in circuit board. It took another decade before the decoders became a standard feature on almost all sets with a screen size above 15 inches (teletext is still usually only an option for smaller "portable" sets). From the mid-1980s, both Ceefax and ORACLE were broadcasting several hundred pages on every channel, slowly changing them throughout the day. In 1986, WST was formalised as an international standard as CCIR Teletext System B. It was also adopted in many other European countries. A number of similar teletext services were developed in other countries, some of which attempted to address the limitations of the initial British-developed system by adding extended character sets or improving graphic abilities. For example, in Italy the state-owned RAI launched its teletext service, called Televideo, in 1984, with support for the Latin character set. Mediaset, the main commercial broadcaster, launched its Mediavideo teletext in 1993, and La7Video followed in 2001 as the heir to TMCvideo, the teletext service of TMC Telemontecarlo launched in the mid-1990s. The Rete A and Rete Mia teletext services also arrived in the 1990s. Rete Mia's teletext has not been operational since 2000, Rete A's since 2006, La7Video since 2014 and Mediavideo since 2022. These developments are covered by the different World System Teletext Levels. In France, where the SECAM standard is used in television broadcasting, a teletext system was developed in the late 1970s under the name Antiope. It had a higher data rate and was capable of dynamic page sizes, allowing more sophisticated graphics. It was phased out in favour of World System Teletext in 1991. In North America, NABTS, the North American Broadcast Teletext Specification, was developed to encode NAPLPS teletext pages, as well as other types of digital data.
NABTS was the standard used for both CBS's ExtraVision and NBC's NBC Teletext services in the mid-1980s. Japan developed its own JTES teletext system with support for Chinese, Katakana and Hiragana characters; NHK started broadcasts in 1983. In 1986, the four existing teletext systems were adopted into the international standard CCIR 653 (now ITU-R BT.653) as CCIR Teletext System A (Antiope), B (World System Teletext), C (NABTS) and D (JTES). In 2023, the Dutch public broadcasting organization NOS replaced the original underlying system for teletext that had been in use since the 1980s with a new system. The reason behind the replacement was that the original Cyclone system became harder to maintain over the years, and the NOS sometimes even had to consult retired British teletext experts to deal with issues. For example, a recent issue was that a Windows update was incompatible with the old Cyclone system. Since NOS Teletekst is still popular in the Netherlands (with 3.5 million people using it weekly on televisions and 1 million people using it weekly as an app on other devices), NOS decided to build a modern new underlying system to replace Cyclone. To make Teletekst look visually the same as on the old Cyclone system, the developers of the new system made use of reverse engineering. The World Wide Web began to take over some of the functions of teletext from the late 1990s. However, due to its broadcast nature, teletext remained a reliable source of information during times of crisis, for example during the September 11 attacks, when webpages of major news sites became inaccessible because of the high demand. As the web matured, many broadcasters ceased broadcasting teletext: CNN in 2006 and the BBC in 2012. In the UK the decline of teletext was hastened by the introduction of digital television, though an aspect of teletext continues in closed captioning. In other countries the system is still widely used on standard-definition DVB broadcasts.
A number of broadcast authorities have ceased the transmission of teletext services. Subtitling still continues to use teletext in Australia, New Zealand, and Singapore, with some providers switching to image-based DVB subtitling for HD broadcasts. New Zealand solely uses DVB subtitling on terrestrial transmissions, despite teletext still being used on internal SDI links. Technology Teletext information is broadcast in the vertical blanking interval between image frames in a broadcast television signal, in numbered "pages". For example, a list of news headlines might appear on page 110; a teletext user would type "110" into the TV's remote control to view this page. The broadcaster constantly sends out pages in a sequence. There will typically be a delay of a few seconds between requesting a page and its being broadcast and displayed, the time being entirely dependent on the number of pages being broadcast. More sophisticated receivers use a memory buffer to store some or all of the teletext pages as they are broadcast, allowing almost instant display from the buffer. This basic architecture separates teletext from other digital information systems, such as the Internet, whereby pages are 'requested' and then 'sent' to the user – a method not possible given the one-way nature of broadcast teletext. Unlike the Internet, teletext is broadcast, so it does not slow down as the number of users increases, although the greater the number of pages, the longer one is likely to wait for each to be found in the cycle. For this reason, some pages (e.g. common index pages) are broadcast more than once in each cycle. Teletext is also used for carrying special packets interpreted by TVs and video recorders, containing information about subjects such as channels and programming. Teletext allows up to eight 'magazines' to be broadcast, identified by the first digit of the three-digit page number (1–8).
Within each magazine there may theoretically be up to 256 pages at a given time, numbered in hexadecimal and prefixed with the magazine number – for example, magazine 2 may contain pages numbered 200-2FF. In practice, however, non-decimal page numbers are rarely used, as domestic teletext receivers do not have options to select the hex values A-F, with such numbered pages only occasionally used for 'special' pages of interest to the broadcaster and not intended for public view. The broadcaster constantly sends out pages in sequence in one of two modes: serial mode broadcasts every page sequentially, whilst parallel mode divides VBI lines amongst the magazines, enabling one page from each magazine to be broadcast simultaneously. There will typically be a delay of a few seconds between requesting a page and its being broadcast and displayed; the time is entirely dependent on the number of pages being broadcast in the magazine (parallel mode) or in total (serial mode) and the number of VBI lines allocated. In parallel mode, therefore, some magazines will load faster than others. A standard PAL signal contains 625 lines of video data per screen, broken into two "fields" containing half the lines of the whole image, divided as every odd line, then every even line number. Lines near the top of the screen are used to synchronize the display to the signal and are not seen on-screen. Data formatted in accordance with the CEPT presentation layer protocol and data syntax standard is stored in these lines, where they are not visible, using lines 6–22 on the first field and 318–335 on the second field. The system does not have to use all of these lines; a unique pattern of bits allows the decoder to identify which lines contain data. Unused lines must not be used for other services, as that would prevent teletext transmission. Some teletext services use a large number of lines; others, for reasons of bandwidth and technical issues, use fewer.
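The magazine-and-page addressing scheme above can be sketched in a few lines of Python. The function names are illustrative, not taken from any real decoder implementation:

```python
def parse_page(page: str):
    """Split a teletext page number like '110' or '2FF' into
    (magazine, page-within-magazine), per the scheme described above."""
    if len(page) != 3:
        raise ValueError("teletext page numbers have three digits")
    magazine = int(page[0])
    if not 1 <= magazine <= 8:
        raise ValueError("the magazine digit must be 1-8")
    # The last two digits are hexadecimal (00-FF), giving up to 256
    # pages per magazine.
    return magazine, int(page[1:], 16)

def is_selectable_on_remote(page: str) -> bool:
    """Pages containing hex digits A-F cannot be entered on a domestic
    remote, which only offers the keys 0-9."""
    return page[1:].isdigit()

print(parse_page("2FF"))               # (2, 255): magazine 2, page 0xFF
print(is_selectable_on_remote("2FF"))  # False: would need hex keys
```

This also makes concrete why the hex pages 2A0-2FF exist in the address space yet stay reserved for broadcaster-internal use.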
Teletext in the PAL B system can use VBI lines 6–22 in the first field and 318–334 in the second to transmit 360 data bits per line, including the clock run-in and framing code, during the active video period at a rate of 6.9375 Mbit/s ±25 bit/s using binary NRZ line coding. The amplitude for a "0" is black level ±2%, and for a "1" is 66 ±6% of the difference between black and peak white level. The clock run-in consists of eight repetitions of "10", and the framing code is "11100100". The last two bits of the clock run-in must start within 12 (+0.4/−1.0) μs of the negative flank of the line synchronization pulse. The 6.9375 Mbit/s rate is 444 × the nominal line frequency fH; thus 625 × 25 × 444 = 6,937,500 Hz, and each bit is 144 ns long. The signal amplitude falls to 50% at 3.5 MHz and 0% at 6 MHz. If the horizontal sync pulse during the vertical synchronization starts in the middle of the horizontal scan line, the first interlaced field is sent; otherwise, if the vertical synchronization lets the full video line complete, the second interlaced field is sent. Like EIA-608, bits are transmitted in the order of LSB to MSB, with odd-parity coding of 7-bit character codes; unlike EIA-608, the DVB version is transmitted the same way. For single-bit error recovery during transmission, the packet address (page row and magazine numbers) and header bytes (page number, subtitle flag, etc.) use Hamming 8/4 code, with extended packets (header extensions) using Hamming 24/18, which essentially doubles the number of bits used. The commonly used System B uses a fixed PAL subtitling bandwidth of 8,600 (7,680 without page/packet header) bits/s per field for a maximum of 32 characters per line per caption (maximum three captions, lines 19–21) for a 25-frame broadcast. While the bandwidth is greater than EIA-608's, so is the error rate, with more bits encoded per field.
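The rate arithmetic quoted above is easy to check directly; this sketch simply recomputes the figures from the constants given in the text:

```python
LINES_PER_FRAME = 625        # PAL lines per frame
FRAMES_PER_SECOND = 25       # PAL frame rate
BITS_PER_LINE_PERIOD = 444   # multiplier of the line frequency fH

f_h = LINES_PER_FRAME * FRAMES_PER_SECOND    # line frequency: 15,625 Hz
bit_rate = BITS_PER_LINE_PERIOD * f_h        # 6,937,500 bit/s = 6.9375 Mbit/s
bit_duration_ns = 1e9 / bit_rate             # length of one bit, ~144 ns

print(bit_rate, round(bit_duration_ns))      # 6937500 144
```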
Subtitling packets use a lot of non-boxed spacing to control the horizontal positioning of a caption and to pad out the fixed packet. The vertical caption position is determined by the packet address. In the case of the Ceefax and ORACLE systems and their successors in the UK, the teletext signal is transmitted as part of the ordinary analog TV signal but concealed from view in the Vertical Blanking Interval (VBI) television lines which do not carry picture information. The teletext signal is digitally coded as 45-byte packets, so the resulting rate is 7,175 bits per second per line (41 7-bit 'bytes' per line, on each of 25 frames per second). A teletext page comprises one or more frames, each containing a screen-full of text. The pages are sent out one after the other in a continual loop. When the user requests a particular page the decoder simply waits for it to be sent, and then captures it for display. In order to keep the delays reasonably short, services typically only transmit a few hundred frames in total. Even with this limited number, waits can be up to 30 seconds, although teletext broadcasters can control the speed and priority with which various pages are broadcast. Modern television sets, however, usually have built-in memory, often for a few thousand different pages. This way, the teletext decoder captures every page sent out and stores it in memory, so when a page is requested by the user it can be loaded directly from memory instead of having to wait for the page to be transmitted. When the page is transmitted again, the decoder updates the page in memory. The text can be displayed instead of the television image, or superimposed on it (a mode commonly called mix). Some pages, such as subtitles (closed captioning), are in-vision, meaning that text is displayed in a block on the screen covering part of the television image. The original standard provides a monospaced 40ร—24 character grid. 
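A rough model of the carousel timing described above can be written down directly. It assumes one 41-byte payload per VBI line, 25 frames per second, and about 25 packets per page; the page counts and VBI-line allocation are illustrative, not any broadcaster's real configuration:

```python
BYTES_PER_PACKET = 41      # usable 7-bit 'bytes' per VBI line
BITS_PER_BYTE = 7
FRAMES_PER_SECOND = 25

# Per-line data rate quoted in the text: 41 x 7 x 25 = 7,175 bit/s.
rate_per_line = BYTES_PER_PACKET * BITS_PER_BYTE * FRAMES_PER_SECOND

def carousel_cycle_seconds(pages, rows_per_page=25, vbi_lines_per_frame=4):
    """Worst-case wait for a page: one full trip around the carousel,
    ignoring page priorities and repeated index pages."""
    packets = pages * rows_per_page
    packets_per_second = vbi_lines_per_frame * FRAMES_PER_SECOND
    return packets / packets_per_second

print(rate_per_line)                # 7175
print(carousel_cycle_seconds(120))  # 120 pages on 4 VBI lines -> 30.0 s
```

The second figure shows how a carousel of a hundred-odd pages produces the "up to 30 seconds" waits mentioned in the text, and why broadcasters repeat index pages within the cycle.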
Characters are sent using a 7-bit code, with an 8th bit employed for error detection. The standard was improved in 1976 (World System Teletext Level 1) to allow for improved appearance and the ability to individually select the color of each character from a palette of eight. The proposed higher-resolution Level 2 (1981) was not adopted in Britain (in-vision services from Ceefax and ORACLE did use it at various times, though the BBC ceased even this in 1996), although transmission rates were doubled from two to four lines a frame. In the early 1980s, a number of higher extension levels were envisaged for the specification, based on ideas then being promoted for worldwide videotex standards (telephone dial-up services offering a similar mix of text and graphics). The most common implementation is Level 1.5, which supports languages other than English. Virtually any TV sold in Europe since the 1990s has support for this level. After 1994 some stations adopted Level 2.5 Teletext, or Hi-Text, which allows for a larger color palette and higher-resolution graphics. The proposed higher content levels included geometrically specified graphics (Level 4) and higher-resolution photographic-type images (Level 5), to be conveyed using the same underlying mechanism at the transport layer. No TV sets currently implement the two most sophisticated levels. Decoders The Mullard SAA5050 was a character generator chip used in UK teletext-equipped television sets. In addition to the UK version, several variants of the chip existed with slightly different character sets for particular localizations and/or languages. These had part numbers SAA5051 (German), SAA5052 (Swedish), SAA5053 (Italian), SAA5054 (Belgian), SAA5055 (U.S. ASCII), SAA5056 (Hebrew) and SAA5057 (Cyrillic). The type of decoder circuitry is sometimes marked on televisions as CCT (Computer-Controlled Teletext) or ECCT (Enhanced Computer-Controlled Teletext).
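The odd-parity protection mentioned above (a 7-bit character code plus an 8th parity bit) can be sketched as follows; the function names are illustrative:

```python
def add_odd_parity(code7: int) -> int:
    """Return the 8-bit byte transmitted for a 7-bit character code:
    the top bit is set so the byte contains an odd number of 1s."""
    assert 0 <= code7 < 128
    parity_bit = 0 if bin(code7).count("1") % 2 == 1 else 1
    return code7 | (parity_bit << 7)

def check_odd_parity(byte8: int):
    """Return the 7-bit code, or None if the parity check fails
    (i.e. an odd number of bits were flipped in transmission)."""
    if bin(byte8).count("1") % 2 == 1:
        return byte8 & 0x7F
    return None

sent = add_odd_parity(ord("A"))       # 0x41 has two 1-bits -> 0xC1
print(check_odd_parity(sent))         # 65: parity OK, code recovered
print(check_odd_parity(sent ^ 0x04))  # None: single-bit error caught
```

Note that parity only detects an odd number of flipped bits and cannot correct them, which is why the more critical address and header bytes use Hamming codes instead.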
Besides the hardware implementations, it is also possible to decode teletext using a PC and video capture or DVB board, as well as recover historical teletext from self-recorded VHS tapes. The Acorn BBC Micro's default graphics mode (mode 7) was based on teletext display, and the computer could be used to create and serve teletext-style pages over a modem connection. With a suitable adapter, the computer could receive and display teletext pages, as well as software over the BBC's Ceefax service, for a time. The Philips P2000 home computer's video logic was also based on a chip designed to provide teletext services on television sets. Uses Some TV channels offer a service called interactive teletext to remedy some of the shortcomings of standard teletext. To use interactive teletext, the user calls a special telephone number with a push-button telephone. A computer then instructs them to go to a teletext page which is assigned to them for that session. Usually, the page initially contains a menu of options, from which the user chooses using the telephone keypad. When a choice has been made, the selected page is immediately broadcast for viewing. This is in contrast with usual teletext where the user has to wait for the selected page to be broadcast. This technology enables teletext to be used for games, chat, access to databases, etc. It overcomes the limitations on the number of available pages. On the other hand, only a limited number of users can be serviced at the same time, since one page number is allocated per user. Some channels solve this by taking into account where the user is calling from and by broadcasting different teletext pages in different geographical regions. In that way, two different users can be assigned the same page number at the same time as long as they do not receive the TV signals from the same source. 
Another drawback of the technology is privacy: because the interactive pages are received by all viewers, others can see what a given user is doing. Also, the user usually has to pay for the telephone call to the TV station. Spanish prisons have banned or deactivated TV sets with teletext capabilities after finding that inmates received coded messages from accomplices outside through the bulletin board sections. The same phenomenon has been observed in Finland, where inmates received messages from smugglers through the family bulletin board. The ability to display colored characters and pixels is also used to create teletext art. A teletext page in World System Teletext Level 1 format offers 7 colors and a canvas of 40x24 sixel cells, each containing either a text character or 2x3 pixels. Specific control codes switch between text and graphics cells and add effects such as blinking or double line height. The coarse raster of the working area and the limited display options result in the typical teletext aesthetic. In cooperation with Finnish state television YLE, the Museum of Teletext Art has been presenting and archiving international teletext art online, on air and in exhibitions since 2014.

Legacy and successors

While the basic teletext format has remained unchanged for more than 30 years, a number of improvements and additions have been made. Prestel was a British information-retrieval system based on teletext protocols. However, it was essentially a different system, using a modem and the telephone network to transmit and receive data, comparable to systems such as France's Minitel. The modem was asymmetric, with data sent at 75 bit/s and received at 1,200 bit/s. This two-way nature allowed pages to be served on request, in contrast to the TV-based systems' sequential rolling method.
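The 2x3 block-graphics cells used for teletext art map directly onto character codes. A rough sketch of the contiguous-mosaic encoding, assuming the conventional G1 bit layout in which the sixth (bottom-right) pixel is carried by bit 6 rather than bit 5:

```python
def mosaic_char(pixels):
    """Map a 2x3 pixel block (six booleans, row-major: top-left,
    top-right, mid-left, mid-right, bottom-left, bottom-right)
    to a G1 contiguous-mosaic character code."""
    # Bits 0-4 carry the first five cells; the bottom-right cell
    # uses bit 6 (0x40), skipping bit 5, which is what separates
    # the two mosaic code ranges from the ordinary text rows.
    value = 0
    for i, on in enumerate(pixels[:5]):
        if on:
            value |= 1 << i
    if pixels[5]:
        value |= 0x40
    return 0x20 + value
```

An all-off block yields 0x20 (blank) and an all-on block yields 0x7F (solid), the two extremes of the mosaic range.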
It also meant that a limited number of extra services were available, such as booking events or train tickets, and a limited amount of online banking. A number of teletext services have been syndicated to web viewers, which mimic the look and feel of broadcast teletext. RSS feeds of news and information from the BBC are presented in Ceefax format in the web viewer. In 2016, the Teefax teletext service was launched in the United Kingdom, attracting coverage from the BBC, ITV and others. Using a Raspberry Pi single-board computer as a set-top box, it feeds its service to standard televisions. Teefax content is a mix of crowdsourcing, syndication and contributions from media professionals who contributed heavily to broadcast teletext services. Teefax is also syndicated to a web viewer. With the advent of digital television, some countries adopted the name "digital teletext" for newer standards, despite the digital nature of the older teletext standards. Digital teletext is encoded with standards including MHEG-5 and Multimedia Home Platform (MHP). Other countries carry the same teletext streams as before on DVB transmissions, using the DVB-TXT and DVB-VBI sub-standards. These allow the emulation of analogue teletext on digital TV platforms, either directly on the TV or set-top box, or by recreating an analog output that reproduces the vertical blanking interval data in which teletext is carried.

Similar systems

A closely related service is the Video Program System (VPS), introduced in Germany in 1985. Like teletext, this signal is broadcast in the vertical blanking interval. It consists of only 32 bits of data, primarily the date and time for which the broadcast of the currently running TV programme was originally scheduled. Video recorders can use this information (instead of a simple timer) to automatically record a scheduled programme, even if the broadcast time changes after the user programmes the VCR.
VPS also provides a PAUSE code, which broadcasters can use to mark interruptions and pause recorders; however, advertisement-financed broadcasters tend not to use it during their ad breaks. The VPS (line 16) definition is now included in the Programme Delivery Control (PDC) standard from ETSI.
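The scheduled date and time inside the 32-bit VPS word form a Programme Identification Label (PIL). A sketch of how such a label packs into 20 bits, using the PDC field widths (5-bit day, 4-bit month, 5-bit hour, 6-bit minute); the exact bit ordering here is illustrative rather than taken from the standard:

```python
def pack_pil(day, month, hour, minute):
    """Pack a Programme Identification Label: the 20-bit
    scheduled-start field carried within the 32-bit VPS word.
    Field widths follow PDC; bit order is an assumption."""
    assert 1 <= day <= 31 and 1 <= month <= 12
    assert 0 <= hour <= 23 and 0 <= minute <= 59
    return (day << 15) | (month << 11) | (hour << 6) | minute

def unpack_pil(pil):
    """Recover (day, month, hour, minute) from a packed label."""
    return ((pil >> 15) & 0x1F, (pil >> 11) & 0x0F,
            (pil >> 6) & 0x1F, pil & 0x3F)
```

A VCR simply compares the label broadcast on line 16 with the one the user programmed, so recording follows the schedule change rather than the clock.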
========================================
[SOURCE: https://en.wikipedia.org/wiki/Oculus_Rift_CV1] | [TOKENS: 1439]
Contents Oculus Rift CV1 The Oculus Rift CV1, also known simply as the Oculus Rift, is a virtual reality headset developed by Oculus VR, a subsidiary of Meta Platforms (known at the time as Facebook Inc.). It was announced in January 2016 and released in March of the same year. The device was the first commercial release in the Oculus Rift lineup. Production of the CV1 concluded in March 2019, when it was succeeded by the Oculus Rift S. Facebook stated that it would continue to provide software support for the CV1 "for the foreseeable future".

History

After the DK1 and DK2 prototypes, Oculus VR announced on May 6, 2015, that the consumer version of the Rift would ship in the first quarter of 2016, with pre-orders starting on January 6, 2016, at 8 am PST. On January 5, 2016, the day before pre-orders went live, it was announced in an update posted to the original Kickstarter page that all Kickstarter backers who had pledged for a Rift development kit would receive a free Kickstarter Edition Oculus Rift. On January 6, 2016, pre-orders started at US$599.99. At the same time, the shipment date was announced as March 28, 2016. On January 16, 2016, shipping dates for new orders of the Rift were delayed until July 2016 due to the number of pre-orders on day one. On March 25, 2016, the first batch of Oculus Rift headsets began shipping to consumers. In March 2017, at the Game Developers Conference (GDC), Oculus lowered the price of the headset to US$499 and lowered the price of the Oculus Touch motion controller accessory from $200 to $99. In July 2017, Oculus introduced a Rift + Touch SKU at a $499 price point to succeed the original headset-only SKU. In March 2019, shortly before the Rift S was announced, the original Rift started disappearing from physical third-party stores, leaving it available only via the official Facebook listing. During the Rift S announcement, it was said that the new headset would replace the original Rift.
Shortly afterwards, most sellers stopped replenishing their Rift stock. In March 2019, Oculus VR stated that they planned to support the original Rift with software updates for "the foreseeable future."

Hardware

The CV1 is an improved version of the Crescent Bay prototype, featuring per-eye displays running at 90 Hz with a higher combined resolution than the DK2, 360-degree positional tracking, integrated audio, a vastly increased positional tracking volume, and a heavy focus on consumer ergonomics and aesthetics. The device features two PenTile OLED displays with a resolution of 1080x1200 per eye, a 90 Hz refresh rate, and a 110° field of view. The separation of the lenses is adjustable via a slider on the bottom of the device, in order to accommodate a wide range of interpupillary distances. The Fresnel lenses are not interchangeable; however, there are multiple facial interfaces so that the device can be positioned at different distances from the user's eyes. This also allows users wearing glasses to use the Rift, as well as users with widely varying facial shapes. The Rift CV1 features integrated headphones that provide real-time 3D audio effects, developed from technology licensed from RealSpace 3D Audio by Visisonics. The headphones are user-replaceable, with Oculus and other manufacturers such as JBL providing aftermarket audio accessories. The CV1 suffered from a design flaw in which the headphones could lose sound on one side after a cable running through the headband at the back of the headset wore through with regular use. In light of this issue, Palmer Luckey announced that he would ship a free custom-made fix for the flaw to affected users, but the plan never materialized. Constellation is the headset's rotational and positional tracking system, used to track the position of the user's head as well as other VR devices, with low latency and sub-millimeter accuracy.
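As a rough sense of scale for the display specification above, the two 1080x1200 panels at 90 Hz together push roughly 233 million pixels per second:

```python
# Back-of-envelope pixel throughput for the Rift CV1's displays.
WIDTH, HEIGHT, EYES, HZ = 1080, 1200, 2, 90

pixels_per_frame = WIDTH * HEIGHT * EYES   # 2,592,000 pixels per refresh
pixels_per_second = pixels_per_frame * HZ  # 233,280,000 pixels each second
print(pixels_per_frame, pixels_per_second)
```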
The system consists of external infrared tracking sensors that optically track specially designed VR devices, providing the Rift with full six-degree-of-freedom rotational and positional tracking. The Rift, or any other device being tracked by the system, is fitted with a series of precisely positioned infrared LEDs under or above the surface, set to blink in a specific pattern. By knowing the configuration of the LEDs on the objects and their blink patterns, the system can determine the precise position of the device with sub-millimeter accuracy and near-zero latency. The system also includes one or more stationary USB infrared sensors, which originally came with a stand in a desk-lamp form factor; standard screw holes allow the stand to be removed and the sensor to be mounted anywhere the user sees fit. The sensors normally sit on the user's desk, creating a 3D space that allows the user to use the device while sitting, standing, walking, or even jumping around the room. In its initial presentation, before the Touch controllers were released, the system was only used to track the head-mounted display, and a single sensor was included with the device, which was sufficient since there was little chance of the user's hands blocking the headset from it. When the Touch controllers were released, two-sensor setups became the baseline in order to guarantee proper tracking of the headset and controllers. For "room scale" virtual reality, three or more sensors are required. When the controllers were sold separately, a second sensor was included with them. Later, the standard Rift bundle was updated to include the controllers and the additional sensor. Additional sensors, placed behind the user, allow for 360° rotation without the sensors being occluded by the user themselves.
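The blink-pattern identification described above can be pictured as matching an observed on/off sequence against known per-LED IDs. This is an illustrative simplification, not Oculus's actual modulation scheme or pattern length:

```python
def identify_led(observed, led_patterns):
    """Match a tracked blob's observed brightness sequence (tuple
    of 0/1 per camera frame) against known per-LED blink IDs.
    Hypothetical sketch of Constellation-style LED identification."""
    matches = [led_id for led_id, pattern in led_patterns.items()
               if pattern == observed]
    # A unique match identifies the LED; ambiguity or no match means
    # more frames are needed before trusting the blob's identity.
    return matches[0] if len(matches) == 1 else None
```

Once enough blobs are identified, the known 3D positions of those LEDs on the device plus their 2D image positions give the pose-estimation step everything it needs.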
Three- and four-sensor configurations became the standard for this scenario. As a result of a partnership with Microsoft, early Oculus Rift units were bundled with an Xbox Wireless Controller and a USB wireless adapter, since gamepads were what the majority of virtual reality games used at that point, before motion controllers were available. They also included the Oculus Remote, a wireless remote that provides basic navigation functionality: four directional inputs, Enter and Back buttons, volume up/down, and an Oculus button. On December 6, 2016, Oculus released Oculus Touch, a motion controller system which uses the Constellation sensors for positional tracking. Touch was originally distributed as a standalone accessory; in July 2017, Oculus replaced the original headset-only SKU with a "Rift + Touch" SKU that replaced the Xbox controller and Oculus Remote with Oculus Touch.

Rift for Business

During Oculus Connect in June 2017, Oculus VR announced and released its Oculus Rift for Business bundle for US$900, which included the Rift CV1 HMD, Oculus Touch controllers, three Constellation sensors, an Oculus Remote, and three Rift Fits (the name given to the piece that cushions the device against the user's face). The bundle also includes an expanded warranty, preferential customer service, and a commercial-use license.
========================================