https://en.wikipedia.org/wiki/RCU
RCU may refer to:

Science and technology
- Read-copy-update, a computer operating system synchronization mechanism
- Remote concentrator unit, in telephony
- Organocopper complexes (RCu), in reactions of organocopper reagents

Organizations
- Radio Club Uruguayo
- Rogue Credit Union, a federal credit union in Medford, Oregon
- Royal College Union, the alumni association of Royal College Colombo, Sri Lanka
- Regional Cadet Units of the Australian Army Cadets
- Regional Coordinating Unit in the Northwest Pacific Action Plan
- Regional Crime Unit of the Hong Kong Police Force

Other uses
- Rocket City United, an American soccer team
- RC Unterföhring, a German rugby union club
- Las Higueras Airport, Argentina (IATA code RCU)
- A remote control unit
https://en.wikipedia.org/wiki/Attack%20vector
In computer security, an attack vector is a specific path, method, or scenario that can be exploited to break into an IT system, thus compromising its security. The term was derived from the corresponding notion of vector in biology. An attack vector may be exploited manually, automatically, or through a combination of manual and automatic activity, and exploitation is often a multi-step process. For instance, malicious code (code that the user did not consent to running and that performs actions the user would not consent to) often operates by being added to a harmless-seeming document made available to an end user. When the unsuspecting end user opens the document, the malicious code in question (known as the payload) is executed and performs the abusive tasks it was programmed to execute, which may include spreading itself further, opening unauthorized access to the IT system, or stealing or encrypting the user's documents. To limit the chance of discovery once installed, the code in question is often obfuscated by layers of seemingly harmless code.

Some common attack vectors:
- exploiting buffer overflows; this is how the Blaster worm was able to propagate
- exploiting webpages and email that support the loading and subsequent execution of JavaScript or other scripts without properly limiting their powers
- exploiting networking protocol flaws to perform unauthorized actions at the other end of a network connection
- phishing: sending deceptive messages to end users to entice them to reveal confidential information, such as passwords
https://en.wikipedia.org/wiki/CBR
CBR may refer to:

Business and organizations
- CBR Innovation Network (or Canberra Innovation Network), a government innovation
- Central Bank of Russia
- Center for Bio-Ethical Reform, an anti-abortion non-profit organization
- Centre for Blood Research, at the University of British Columbia
- Central Board of Revenue, the Pakistan department for revenue collection
- Championship Bull Riding, a rodeo organization in Texas, USA
- Cimenteries et Briqueteries Réunies, now HeidelbergCement

Comics
- .cbr, a file extension for comic book archive files
- Comic Book Resources, a news and discussion website

Radio
- CBR (AM), a radio station in Calgary, Alberta, Canada
- CBR-FM, a radio station in Calgary, Alberta, Canada
- CBR, former call sign of radio station CBU in Vancouver, British Columbia, Canada

Science, medicine, and engineering
- California bearing ratio, a strength measurement of material under a paved area
- Cannabinoid receptor, a type of cell membrane receptor
- Carbonyl reductase genes: CBR1 and CBR3
- Case-based reasoning, an artificial intelligence technique
- Central benzodiazepine receptor, the receptor for benzodiazepines in the central nervous system
- Crude birth rate, a measure of live births
- Community-based rehabilitation, programs for the disabled
- Cosmic background radiation
- Critical body residue, the measure of toxicity in tissue residue

Telecommunications
- Constant bitrate, in telecommunication, sound and music formats
- Constraint-based routing, in telecommunication and computer networks
- Content-based routing or router, a type of application-oriented networking

Transport
- Canberra Airport, IATA airport code CBR
- Canberra MRT station, Singapore
- Cleburne (Amtrak station), Texas, USA
- Honda CBR series, a line of sports motorbikes
- Chesapeake Beach Railway, a former railroad from Washington, D.C. to Maryland, USA
- Cooksbridge railway station, in Sussex, England

Other uses
- CBR Building, an office building in Watermael-Boitsfort, Brussels, Belgium
- Cost-benefit ratio, in economics
- Captive bead ring, a type of body piercing jewelry
- Chesapeake Bay Retriever, a dog breed
- Crash Bandicoot Racing, the Japanese name for the Crash Team Racing video game
- Carpathian Biosphere Reserve, a nature reserve in eastern Europe
- CBR, a sports and marketing term for Canberra, Australia

See also
- Chemical, biological, radiological, and nuclear (CBRN), a category of hazardous incidents or weapons
https://en.wikipedia.org/wiki/Creatures%20%28video%20game%20series%29
Creatures is an artificial life video game series created in the mid-1990s by English computer scientist Steve Grand while working for the Cambridge video game developer Millennium Interactive. The gameplay focuses on raising alien creatures known as Norns, teaching them to survive, helping them explore their world, defending them against other species, and breeding them. Words can be taught to the creatures by a learning computer (for verbs) or by repeating the name of an object while the creature looks at it. Once a creature understands language, the player can instruct their creature by typing in instructions, which the creature can choose to obey. A complete life cycle is modeled for the creatures: childhood, adolescence, adulthood, and senescence, each with its own unique needs. The gameplay is designed to foster an emotional bond between the player and their creatures. Rather than taking a scripted approach, the games in the Creatures series were driven by detailed biological and neurological simulation and its unexpected results. There have been six major Creatures releases from Creature Labs: between 1996 and 2001 there were three main games, the Docking Station add-on (generally referred to as a separate game), and two children's games; three further games were created for console systems.

Overview

The program was one of the first commercial titles to code artificial life organisms from the genetic level upwards, using sophisticated biochemistry and neural-network brains, including simulated senses of sight, hearing, and touch. This meant that the Norns and their DNA could develop and "evolve" in increasingly diverse ways, unpredicted by the makers. By breeding certain Norns with others, some traits could be passed on to following generations. The Norns turned out to behave similarly to living creatures.
Norns respond to external stimuli, such as interaction with the player, and internal stimuli, such as changes in chemical concentrations or neural activities. Sight is simulated by having a group of neurons representing each type of object in the world. When an object belonging to this type is in front of the creature ('within eyesight'), the neuron becomes active and the creature can 'see' the object. The Norns possess simulated biological drives which give punishment when they are raised, and reward when they are lowered. The model for the Norns' decision-making process is behaviorist and based on the Norns learning how to reduce their drives. Dickinson and Balleine state that while this stimulus-response/reinforcement process makes the creatures seem goal-directed, they are instead 'habit machines' responding in a learned fashion to particular stimuli. Mutations in the genome can occur, allowing new characteristics to appear in the population and potentially be inherited by a future generation. Because the Norns have a lifespan of roughly 40 hours, users could observe the changes that occur over a large number of generations.
https://en.wikipedia.org/wiki/EDonkey
eDonkey may refer to:
- eDonkey network (also known as eDonkey2000 network or eD2k), a popular file sharing network
- eDonkey2000, a discontinued file sharing program that used the eDonkey network
https://en.wikipedia.org/wiki/Intranet%20strategies
In business, an intranet strategy is the use of an intranet and associated hardware and software to obtain one or more organizational objectives. An intranet is an access-restricted network used internally in an organization. An intranet uses the same concepts and technologies as the World Wide Web and Internet, including web browsers and servers running on the Internet protocol suite (TCP/IP), application protocols such as FTP and the Simple Mail Transfer Protocol (SMTP), and documents written in HTML.

Role of intranets

Intranets are generally used for four types of applications:

1) Communication and collaboration
- send and receive e-mail, faxes, voice mail, and paging
- discussion rooms and chat rooms
- audio and video conferencing
- virtual team meetings and project collaboration
- online company discussions as events (e.g., IBM Jams)
- in-house blogs

2) Web publishing
- develop and publish hyperlinked multimedia documents such as policy manuals, company newsletters, product catalogs, technical drawings, training material, and telephone directories

3) Business operations and management
- order processing
- inventory control
- production setup and control
- management information systems
- database access

4) Intranet portal management
- centrally administer all network functions including servers, clients, security, directories, and traffic
- give users access to a variety of internal and external business tools/applications
- integrate different technologies
- conduct regular user research to identify and confirm strategy (random sample surveys, usability testing, focus groups, in-depth interviews with wireframes, etc.)

Why have an intranet strategy?
Having a strategy presupposes a planned, orderly process with proper costing and budgeting; it involves consulting the parties who will use the intranet; allows for efficient integration with existing systems and phasing-out of older ones; has long-term benefits when the intranet needs to be scaled or made more secure; keeps control and quality in the hands of the designated department that "owns" it; and provides for feedback to monitor whether the "investment" is living up to the organization's expectations.

Potential advantages of using intranets
- reduces printing, distribution, and paper costs, particularly for policy manuals, company newsletters, product catalogs, technical drawings, training material, and telephone directories
- easy to use; no specialized training required
- inexpensive to use once it is set up, with moderate initial setup costs (hardware and software)
- standardized network protocol (TCP/IP), document format (HTML), and file transfer protocol (FTP) are already well established, suitable for all platforms, and usable throughout the enterprise
- reduces employee training costs
- reduces sales and marketing costs
- reduces office administration and accounting costs
- ease of access results in a more integrated company, with employees communicating and collaborating more freely and more productively
https://en.wikipedia.org/wiki/Head%20%28Unix%29
head is a program on Unix and Unix-like operating systems used to display the beginning of a text file or piped data.

Syntax

The command syntax is:

head [options] <file_name>

By default, head prints the first 10 lines of its input to the standard output. The number of lines printed may be changed with a command line option. The following example shows the first 20 lines of filename:

head -n 20 filename

This displays the first 5 lines of all files starting with foo:

head -n 5 foo*

Most versions allow omitting the n and directly specifying the number instead: -5. GNU head also accepts a negative argument to the -n option, meaning print all but that many final lines of each input file.

Flags

-c <x>  Copy the first x bytes.

Other

Many early versions of Unix and Plan 9 did not have this command, and documentation and books used sed instead:

sed 5q filename

This prints every line (the implicit action) and quits after the fifth. Equivalently, awk may be used to print the first five lines of a file:

awk 'NR < 6' filename

However, neither sed nor awk was available in early versions of BSD, which were based on Version 6 Unix and which included head.

Implementations

A head command is also part of ASCII's MSX-DOS2 Tools for MSX-DOS version 2. The command has also been ported to the IBM i operating system.

See also: tail (Unix), dd (Unix), List of Unix commands
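The head, sed, and awk forms above can be checked against one another; here is a minimal shell sketch (sample.txt is a hypothetical file name introduced for the example):

```shell
# Build a 10-line sample file (sample.txt is a hypothetical name).
printf 'line%d\n' 1 2 3 4 5 6 7 8 9 10 > sample.txt

# Three equivalent ways to show the first five lines:
head -n 5 sample.txt       # explicit line count
sed 5q sample.txt          # print every line (implicit), quit after line 5
awk 'NR < 6' sample.txt    # print while the record number is below 6

# First 12 bytes rather than lines (the -c flag):
head -c 12 sample.txt

# GNU head only: everything except the last 3 lines, i.e. lines 1-7
# (left commented out because non-GNU implementations reject a negative count):
# head -n -3 sample.txt
```

All three line-based commands produce identical output for any input; they differ only in which tool happens to be available, which is exactly why older documentation fell back on sed.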
https://en.wikipedia.org/wiki/Peter%20Funt
Peter Funt (born 1947) is an American actor, host, and producer for the hit TV show Candid Camera. He worked for Denver radio station KHOW, the ABC Radio Network, The New York Times, and various other media organizations. He is a University of Denver graduate.

Early life

Peter Funt grew up in New York, where he worked summers on the set of his father Allen Funt's show, Candid Camera. He graduated from the University of Denver, earning his Bachelor of Arts in mass communications and journalism. During his time at the university he worked on the newspaper, The Clarion, as well as the radio station, KVDU. While a student, he interviewed Martin Luther King Jr. on his radio show in 1967. After he graduated, he worked at the Denver radio station KHOW and at the ABC Radio Network in New York. In 1970, he won the Silurian's Award for that year's best news reporting, based upon his coverage on ABC News of racial disturbances in Asbury Park, New Jersey. Afterwards he got a job as an arts and leisure writer for The New York Times. During his earlier career, he also authored a book titled Gotcha! on the lost art of practical joking and was the editor and publisher of the television magazine On Cable.

Career

Involvement with Candid Camera

Candid Camera first aired in 1948 and was created by Peter Funt's father, Allen. Peter first appeared on Candid Camera at age 3, posing as a shoeshine boy who charged ten dollars per shoe. He joined the show professionally in 1987, when he became a co-host with his father. During this time the show was broadcast on the CBS television network. In 1993, Allen Funt had a serious stroke, from which he never fully recovered; this required Peter to host the show full-time. In 1996, he hosted and was the executive producer of the Candid Camera 50th anniversary special, and the show returned that year in a revamped version. In 1997, Peter co-hosted the show with Suzanne Somers.
In 2001, the show moved to the PAX network, and Peter co-hosted with Dina Eastwood until 2004; he began another revival of the show in 2014 with Mayim Bialik. During his time on the show, Peter served as producer and host and also acted in segments, producing and hosting over 200 episodes. In 2001, Philip Zelnick, a passenger at Laughlin/Bullhead International Airport, claimed that he was injured during one of the sequences on the show. Funt posed as an airport security guard and had passengers go through a fake x-ray machine. During the skit, Zelnick received a bruise on his thigh when getting off a conveyor belt. Zelnick sued Funt and the show and won the case, but when Funt appealed, the matter was settled out of court. The Mohave County Airport Authority settled during the trial and paid Zelnick $95,000. PAX TV also gave him $7,500 out of court.

Career after Candid Camera

Following his work with Candid Camera, Funt has written frequent op-eds for many nationally recognized news outlets, including The New York Times and USA Today.
https://en.wikipedia.org/wiki/Bubblegum%20Crisis%20Tokyo%202040
Bubblegum Crisis Tokyo 2040 is a 1998 cyberpunk anime television series produced by AIC. It is a reboot of the 1987 OVA series Bubblegum Crisis, which focuses on the Knight Sabers, a rogue vigilante group made up of four women who use powered suits to fight rogue Boomer robots made by the megacorporation Genom. Bubblegum Crisis Tokyo 2040 premiered on TV Tokyo on October 8, 1998, and ran until its conclusion on March 31, 1999. Toshiba EMI released the series on VHS and Laserdisc across 13 volumes, each containing two episodes. The first volume was released on January 21, 1999; the final volume was released on July 26, 2000. The series was later released on DVD; however, the Japanese versions were simply the American DVD releases encoded to play in Region 2. The series was positively received by critics, with some deeming it an improvement over the original OVA series.

Plot

Like its predecessor, Bubblegum Crisis Tokyo 2040 takes place primarily in Tokyo. Much of the manual labor in the city is done by robots called Boomers, which are run by the high-tech mega-corporation Genom. While spending a night out, Linna Yamazaki, a new employee at the Hugh-Geit Corporation (a Genom subsidiary), observes a Boomer that has “gone rogue”, causing destruction and attacking people. Although the AD Police arrive to stop the rogue Boomer, they struggle to restore order. Then a renegade group called the Knight Sabers, dressed in cybernetic, armored Hardsuits, appears and saves the day. Impressed by their work, Linna decides to join the group, which consists of: Priscilla “Priss” Asagiri, a rock star; Sylia Stingray, a boutique store owner and the group's leader; and Nene Romanova, a computer hacker who also works within the AD Police as a dispatch operator, serving as the group's mole in that organization. Over the course of the series, the Knight Sabers go after rogue Boomers, which frustrates AD Police officers Leon McNichol and his partner Daley Wong.
Genom is also frustrated by their actions, as its chairman, Quincy Rosenkreutz, and his advisor Brian J. Mason seek to unlock more Boomer technology to defeat and capture the Knight Sabers. Later on, Sylia's younger brother Mackie, who is part-Boomer, joins the Knight Sabers as an assistant alongside their lead mechanic, Nigel Kirkland. Leon later pursues Priss with romantic intentions, and despite eventually learning that she is a member of the Knight Sabers, the two confess their feelings for one another anyway. Mason uncovers and reactivates Galatea, a humanoid based on Sylia's DNA who is able to control all Boomers and wipe out the human race. The Knight Sabers' hardsuits break down as a result of Galatea's reactivation, and Sylia is forced to have Nigel and Mackie build new ones based on liquid metal. Mason has Quincy killed and takes over Genom. He then cuts the AD Police's funding, resulting in a strike. Galatea's influence causes Boomers across Tokyo to go rogue, trapping the Knight Sabers and the AD Police officers inside their
https://en.wikipedia.org/wiki/F22%20%28disambiguation%29
F22 and F-22 usually refer to:
- Lockheed Martin F-22 Raptor, an American stealth fighter aircraft
- F-22 (series), a series of computer games by Novalogic
- F-22 Raptor (video game), a 1998 video game
- F-22: Air Dominance Fighter, a 1997 video game by Digital Image Design
- F-22 Interceptor, a 1991 video game by Electronic Arts for the Sega Mega Drive/Genesis
- F-22 Total Air War, a 1998 computer game from Infogrames for Windows 95

F22, F-22, F 22 or F.XXII may also refer to:
- F-22 (psychedelic), a drug
- Farrier F-22, a production trimaran sailboat designed in New Zealand by Ian Farrier of Farrier Marine
- Fokker F.XXII, a 1935 Dutch four-engined 22-passenger airliner
- General Avia F 22, a 1998 Italian two-seat monoplane
- F-22P Zulfiquar-class frigate of the Pakistan Navy
- 76 mm divisional gun M1936 (F-22), a Soviet 76.2 mm divisional gun of the World War II era
- A 1938 British Royal Navy J-class destroyer with the pennant number F22
- BMW 2 Series (F22)
- PRR F22, a Pennsylvania Railroad locomotive classification
- F 22 Kongo, a Swedish Air Force squadron participating in the United Nations peace-keeping operations in Congo 1961–1964
- f/22, an f-number measure of a photographic lens's aperture size
- F22, the ICD-10 code for persistent delusional disorders
- Fluorine-22 (F-22 or 22F), an isotope of fluorine
https://en.wikipedia.org/wiki/CFF
CFF may refer to:

Arts, entertainment, and media
- Celebrity Family Feud, a 2008 NBC game show hosted by Al Roker
- Charcoal Feather Federation, an anime television series by Yoshitoshi ABe

Computing
- Common File Format, a video file format that is part of the UltraViolet digital rights authentication and licensing system
- Compact Font Format, a font technology

Events
- Chattanooga Film Festival, an annual film festival in Chattanooga, Tennessee
- Chicago Fringe Festival, an annual performing arts festival in Chicago, Illinois

Organizations and enterprises
- Cambodian Freedom Fighters, a militant rebel group
- Central Facility for Funds, a post-trade service by Clearstream
- Swiss Federal Railways (French: Chemins de fer fédéraux suisses)
- Children First Foundation
- Children's Film Foundation
- Cornish Fighting Fund, a campaign for Cornish recognition
- Croatian Football Federation, the governing body of football in Croatia
- Cystic Fibrosis Foundation

Other uses
- Cafunfo Airport, an Angolan airport with this IATA code
- Certified in Financial Forensics, a specialty designation for Certified Public Accountants awarded by the AICPA
- Consistent Force Field, a force field in molecular mechanics
- Critical flicker fusion rate or threshold, a concept in the psychophysics of vision
https://en.wikipedia.org/wiki/FFS
FFS may refer to:

Computing
- Feige–Fiat–Shamir identification scheme, in cryptography
- Flash file system
- Formatted File System
- Find first set, a type of bit operation
- FreeFileSync, a software package
- Amiga Fast File System
- Berkeley Fast File System

Music
- FFS (band), a rock supergroup of Franz Ferdinand and Sparks
- FFS (album), their debut album

Political organizations
- For our Future's Sake, a UK campaign group calling for a public vote (People's Vote) on the final Brexit deal
- Social Forces Front (French: ), a political party in Burkina Faso
- Socialist Forces Front (French: ), a political party in Algeria

Other organizations
- Frankford Friends School, in Philadelphia, Pennsylvania, United States
- French Federation of Speleology
- Swiss Federal Railways (Italian: )

Places
- FFS Arena, in Lund, Sweden
- Frankfurt South station, in Germany
- French Frigate Shoals, an island in the Northwestern Hawaiian Islands

Science and technology
- Facial feminization surgery
- Free-fall sensor
- Fringe field switching
- Full flight simulator

Other uses
- Farmer Field School, a community development process
- Fee-for-service, a payment model
- For Film's Sake, an Australian women's film festival
- Fully fashioned stockings, stockings with reinforced toes and tops
https://en.wikipedia.org/wiki/GSI
GSI may refer to:

Science and technology
- Geological Strength Index
- Gonadosomatic index
- UK Government Secure Intranet
- Grid Security Infrastructure, a computer networking specification

Businesses
- Gemological Science International, gemstone identification, grading, and appraisal services
- Geophysical Service Incorporated, an American petroleum exploration corporation
- Guangzhou Shipyard International, a Chinese state-owned shipbuilder
- GSI Commerce, now eBay Enterprise, an American e-commerce company

Scientific organizations
- GSI Helmholtz Centre for Heavy Ion Research, Germany
- Geographical Society of Ireland
- Geospatial Information Authority of Japan
- Geological Survey of India
- Geological Survey of Ireland

Other organizations
- Global Security Initiative
- Gabinete de Segurança Institucional da Presidência da República (Institutional Security Cabinet), Brazil
- Gustav Stresemann Institute, a German educational charity

Other uses
- Graduate student instructor, a teaching fellow at some US universities
- Grand-Santi Airport, French Guiana (IATA code GSI)
https://en.wikipedia.org/wiki/David%20A.%20Wagner
David A. Wagner (born 1974) is a professor of computer science at the University of California, Berkeley and a well-known researcher in cryptography and computer security. He is a member of the Election Assistance Commission's Technical Guidelines Development Committee, tasked with assisting the EAC in drafting the Voluntary Voting System Guidelines. He is also a member of the ACCURATE project.

Wagner received an A.B. in mathematics from Princeton University in 1995, an M.S. in computer science from Berkeley in 1999, and a Ph.D. in computer science from Berkeley in 2000. He has published two books and over 90 peer-reviewed scientific papers.

His notable achievements include:
- 2007: Served as principal investigator for the source code review, and also the documentation review, of the historic California state Top-to-Bottom review of electronic voting systems certified for use. Flaws found in vendor-supplied voting machines resulted in decertification and provisional recertification by the Secretary of State.
- 2001: Cryptanalysis of WEP, the security protocol used in 802.11 "WiFi" networks (with Nikita Borisov and Ian Goldberg).
- 2000: Cryptanalysis of the A5/1 stream cipher used in GSM cellphones (with Alex Biryukov and Adi Shamir).
- 1999: Cryptanalysis of Microsoft's PPTP tunnelling protocol (with Bruce Schneier and "Mudge").
- 1999: Invention of the slide attack, a new form of cryptanalysis (with Alex Biryukov); also the boomerang attack and mod n cryptanalysis (the latter with Bruce Schneier and John Kelsey).
- 1998: Development of the Twofish block cipher, a finalist in NIST's Advanced Encryption Standard competition (with Bruce Schneier, John Kelsey, Doug Whiting, Chris Hall, and Niels Ferguson).
- 1997: Cryptanalysis of the CMEA algorithm used in many U.S. cellphones (with Bruce Schneier).
- 1995: Discovery of a flaw in the implementation of SSL in Netscape Navigator (with Ian Goldberg).
https://en.wikipedia.org/wiki/Neil%20Postman
Neil Postman (March 8, 1931 – October 5, 2003) was an American author, educator, media theorist, and cultural critic who eschewed digital technology, including personal computers, mobile devices, and cruise control in cars, and was critical of uses of technology such as personal computers in school. He is best known for his twenty books on technology and education, including Amusing Ourselves to Death (1985), Conscientious Objections (1988), Technopoly: The Surrender of Culture to Technology (1992), The Disappearance of Childhood (1982), and The End of Education: Redefining the Value of School (1995).

Biography

Postman was born in New York City, where he spent most of his life. In 1953, he graduated from the State University of New York at Fredonia and enlisted in the military, but was released less than five months later. At Teachers College, Columbia University, he was awarded a master's degree in 1955 and an Ed.D. (Doctor of Education) degree in 1958. Postman took a position with San Francisco State University's English Department in 1958. Soon after, in 1959, he began teaching at New York University (NYU). In 1971, at NYU's Steinhardt School of Education, he founded a graduate program in media ecology. He became the School of Education's only University Professor in 1993, and was chairman of the Department of Culture and Communication until 2002. Postman died at age 72 of lung cancer at a hospital in Flushing, Queens, on October 5, 2003. At the time, he had been married to his wife, Shelley Ross Postman, for 48 years. They had three children and were longtime residents of Flushing.

Works

Postman wrote 20 books and more than 200 magazine and newspaper articles in publications such as The New York Times Magazine, The Atlantic Monthly, Harper's Magazine, Time, Saturday Review, Harvard Educational Review, The Washington Post, the Los Angeles Times, Stern, and Le Monde. He was the editor of the quarterly journal ETC: A Review of General Semantics from 1976 to 1986.
In 1976, Postman taught a course for NYU credit on CBS-TV's Sunrise Semester called "Communication: the Invisible Environment". He was also a contributing editor at The Nation. Several of his articles were reprinted after his death in the quarterly journal ETC.: A Review of General Semantics as part of a 75th anniversary edition in October 2013.

On education

In 1969 and 1970, Postman collaborated with the New Rochelle educator Alan Shapiro on the development of a model school based on the principles expressed in Teaching as a Subversive Activity. In that book, Postman and co-author Charles Weingartner suggest that many schools have curricula that are trivial and irrelevant to students' lives. The result of Postman and Weingartner's critiques was the "Program for Inquiry, Involvement, and Independent Study" within New Rochelle High School. This "open school" experiment survived for 15 years, and in subsequent years many programs fo
https://en.wikipedia.org/wiki/Incompatible%20Timesharing%20System
Incompatible Timesharing System (ITS) is a time-sharing operating system developed principally by the MIT Artificial Intelligence Laboratory, with help from Project MAC. The name is the jocular complement of the MIT Compatible Time-Sharing System (CTSS). ITS, and the software developed on it, were technically and culturally influential far beyond their core user community. Remote "guest" or "tourist" access was easily available via the early ARPAnet, allowing many interested parties to informally try out features of the operating system and application programs. The wide-open ITS philosophy and collaborative online community were a major influence on the hacker culture, as described in Steven Levy's book Hackers, and were the direct forerunners of the free and open-source software, open-design, and wiki movements.

History

ITS development was initiated in the late 1960s by those (the majority of the MIT AI Lab staff at that time) who disagreed with the direction taken by Project MAC's Multics project (which had started in the mid-1960s), particularly such decisions as the inclusion of powerful system security. The name was chosen by Tom Knight as a joke on the name of the earliest MIT time-sharing operating system, the Compatible Time-Sharing System, which dated from the early 1960s. By simplifying their system compared to Multics, ITS's authors were able to quickly produce a functional operating system for their lab. ITS was written in assembly language, originally for the Digital Equipment Corporation PDP-6 computer, but the majority of ITS development and use was on the later, largely compatible, PDP-10. Although not used as intensively after about 1986, ITS continued to operate on original hardware at MIT until 1990, and then until 1995 at Stacken Computer Club in Sweden. Today, some ITS implementations continue to be remotely accessible, via emulation of PDP-10 hardware running on modern, low-cost computers supported by interested hackers.
Significant technical features

ITS introduced many then-new features:
- The first device-independent graphics terminal output; programs generated generic commands to control screen content, which the system automatically translated into the appropriate character sequences for the particular type of terminal operated by the user.
- A general mechanism for implementing virtual devices in software running in user processes (which were called "jobs" in ITS).
- Using the virtual-device mechanism, ITS provided transparent inter-machine filesystem access. The ITS machines were all connected to the ARPAnet, and a user on one machine could perform the same operations with files on other ITS machines as if they were local files.
- Sophisticated process management; user processes were organized in a tree, and a superior process could control a large number of inferior processes. Any inferior process could be frozen at any point in its operation and its state (including contents of the registers) examined; the process
https://en.wikipedia.org/wiki/Tracey%20Curro
Tracey Ilana Curro (born 27 November 1963) is an Australian journalist. Curro has previously been a news presenter on GMV-6, QTQ-9 and ATV-10, a reporter on the Seven Network's Beyond 2000, a science-technology show, and a correspondent on 60 Minutes. Career Curro was born and grew up in Ingham, Queensland; her father, Phillip, was a descendant of first-generation immigrants from Sicily. She is a graduate of the Queensland University of Technology (Bachelor of Business – Communications) and the Institute of Strategic Leadership, New Zealand. She was embroiled in a court case when she broke her contract with the producers of Beyond 2000 to join 60 Minutes: Curro v Beyond Productions Pty Ltd (1993) 30 NSWLR 337, decided 7 May 1993. She can occasionally be heard filling in for regular presenters on 774 ABC Melbourne radio, notably for a two-week period in 2005 following the departure of Virginia Trioli, and has written for The Australian Women's Weekly. One of her prized moments of television occurred when she asked Pauline Hanson whether she was xenophobic. The famous response, "Please explain", has become an Australian classic and is a line for which Hanson is remembered. Curro was also the Communications Manager for Sustainability Victoria, the greenhouse reduction arm of the Victorian Government, and later a consultant with the executive search firm SHK, specialising in marketing and communications, corporate and public affairs, government relations, internal communication and sustainability. Curro previously filled in for National Nine News Melbourne weekend presenter Jo Hall; she also used to present weekly Crimestopper reports on the Nine Network. She has also been a fill-in presenter for Carrie Bickmore on The Project, and was particularly prominent on the show in 2010–11. 
References External links 1963 births Living people Australian people of Sicilian descent Australian television presenters Australian television journalists Australian women television presenters Australian women journalists People from Ingham, Queensland Queensland University of Technology alumni 60 Minutes (Australian TV program) correspondents
https://en.wikipedia.org/wiki/Michael%20O.%20Rabin
Michael Oser Rabin (born September 1, 1931) is an Israeli mathematician, computer scientist, and recipient of the Turing Award. Biography Early life and education Rabin was born in 1931 in Breslau, Germany (today Wrocław, in Poland), the son of a rabbi. In 1935, he emigrated with his family to Mandate Palestine. As a young boy, he was very interested in mathematics and his father sent him to the best high school in Haifa, where he studied under mathematician Elisha Netanyahu, who was then a high school teacher. Rabin graduated from the Hebrew Reali School in Haifa in 1948, and was drafted into the army during the 1948 Arab–Israeli War. The mathematician Abraham Fraenkel, who was a professor of mathematics in Jerusalem, intervened with the army command, and Rabin was discharged to study at the university in 1949. Afterwards, he received an M.Sc. from the Hebrew University of Jerusalem. He began graduate studies at the University of Pennsylvania before receiving a Ph.D. from Princeton University in 1956. Career Rabin became Associate Professor of Mathematics at the University of California, Berkeley (1961–62) and MIT (1962–63). Before moving to Harvard University as Gordon McKay Professor of Computer Science in 1981, he was a professor at the Hebrew University. In the late 1950s, he was invited for a summer to do research for IBM at the Lamb Estate in Westchester County, New York with other promising mathematicians and scientists. It was there that he and Dana Scott wrote the paper "Finite Automata and Their Decision Problems". Soon, using nondeterministic automata, they were able to re-prove Kleene's result that finite state machines exactly accept regular languages. As to the origins of what was to become computational complexity theory, the next summer Rabin returned to the Lamb Estate. 
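The nondeterminism result mentioned above rests on a simple idea: an NFA can be simulated (or determinized, via what became known as the Rabin–Scott subset construction) by tracking the set of all states it could currently be in. A minimal simulation sketch:

```python
def nfa_accepts(delta, start, accepting, s):
    """Simulate a nondeterministic finite automaton by tracking the
    set of all states it could be in -- the key idea behind the
    Rabin–Scott subset construction, which shows NFAs accept exactly
    the regular languages. delta maps (state, symbol) -> set of states."""
    current = {start}
    for ch in s:
        current = {q for state in current
                   for q in delta.get((state, ch), set())}
    return bool(current & accepting)
```

Determinizing the same automaton replaces each reachable set of states with a single DFA state, which is where an exponential blow-up in the number of states can appear.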
John McCarthy posed a puzzle to him about spies, guards, and passwords, which Rabin studied and soon after he wrote an article, "Degree of Difficulty of Computing a Function and Hierarchy of Recursive Sets." Nondeterministic machines have become a key concept in computational complexity theory, particularly with the description of the complexity classes P and NP. Rabin then returned to Jerusalem, researching logic, and working on the foundations of what would later be known as computer science. He was an associate professor and the head of the Institute of Mathematics at the Hebrew University at 29 years old, and a full professor by 33. Rabin recalls, "There was absolutely no appreciation of the work on the issues of computing. Mathematicians did not recognize the emerging new field". In 1960, he was invited by Edward F. Moore to work at Bell Labs, where Rabin introduced probabilistic automata that employ coin tosses in order to decide which state transitions to take. He showed examples of regular languages that required a very large number of states, but for which you get an exponential reduction of the number of states if you go over to probabilistic au
https://en.wikipedia.org/wiki/Blaster%20%28computer%20worm%29
Blaster (also known as Lovsan, Lovesan, or MSBlast) was a computer worm that spread on computers running the operating systems Windows XP and Windows 2000 during August 2003. The worm was first noticed and started spreading on August 11, 2003. The rate at which it spread increased until the number of infections peaked on August 13, 2003. Once a network (such as a company or university) was infected, it spread more quickly within the network because firewalls typically did not prevent internal machines from using a certain port. Filtering by ISPs and widespread publicity about the worm curbed the spread of Blaster. In September 2003, Jeffrey Lee Parson, an 18-year-old from Hopkins, Minnesota, was indicted for creating the B variant of the Blaster worm; he admitted responsibility and was sentenced to an 18-month prison term in January 2005. The author of the original A variant remains unknown. Creation and effects According to court papers, the original Blaster was created after security researchers from the Chinese group reverse engineered the original Microsoft patch that allowed for execution of the attack. The worm spread by exploiting a buffer overflow, discovered by the Polish security research group Last Stage of Delirium, in the DCOM RPC service on the affected operating systems, for which a patch had been released one month earlier in MS03-026 and later in MS03-039. This allowed the worm to spread without users opening attachments, simply by spamming itself to large numbers of random IP addresses. Four versions have been detected in the wild. These are the most well-known exploits of the original flaw in RPC, but there were in fact another 12 different vulnerabilities that did not see as much media attention. The worm was programmed to start a SYN flood against port 80 of windowsupdate.com if the system date is after August 15 and before December 31, or after the 15th day of other months, thereby creating a distributed denial-of-service (DDoS) attack against the site. 
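The trigger condition described above can be paraphrased in a few lines. This is a simplified sketch of the reported schedule, not the worm's actual code:

```python
import datetime

def syn_flood_active(date):
    """Simplified sketch of Blaster's reported DDoS schedule: the SYN
    flood against windowsupdate.com ran on any day after the 15th of
    a month, and on every day from September through December."""
    return date.day > 15 or date.month > 8
```

So an infected machine on August 10 stayed quiet, while one on August 16 (or any day in October) joined the flood.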
The damage to Microsoft was minimal as the site targeted was windowsupdate.com, rather than windowsupdate.microsoft.com, to which the former was redirected. Microsoft temporarily shut down the targeted site to minimize potential effects from the worm. The worm's executable, MSBlast.exe, contains two messages. The first reads: I just want to say LOVE YOU SAN!! This message gave the worm the alternative name of Lovesan. The second reads: billy gates why do you make this possible ? Stop making money and fix your software!! This is a message to Bill Gates, the co-founder of Microsoft and the target of the worm. The worm also creates the following registry entry so that it is launched every time Windows starts: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Run\ windows auto update=msblast.exe Timeline May 28, 2003: Microsoft releases a patch that would protect users from an exploit in WebDAV that Welchia used. (Welchia used the same exploit as MSBlast but ha
https://en.wikipedia.org/wiki/Downstream%20%28networking%29
In a telecommunications network or computer network, downstream refers to data sent from a network service provider to a customer. Downloading is one process that sends data primarily in the downstream direction. However, the overall download speed depends on the downstream speed of the user, the upstream speed of the server, and the network between them. In the client–server model, downstream can refer to the direction from the server to the client. References Data transmission Orientation (geometry)
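The point about overall download speed can be made concrete: the achievable rate is capped by the slowest element on the path. A trivial illustration (the figures in the example are hypothetical):

```python
def effective_download_rate(client_downstream, server_upstream, path_capacity):
    """The end-to-end download rate is bounded by the slowest of: the
    client's downstream link, the server's upstream link, and the
    narrowest link on the network path between them (all in Mbit/s)."""
    return min(client_downstream, server_upstream, path_capacity)
```

A user on a 100 Mbit/s downstream line still downloads at roughly 20 Mbit/s from a server whose upstream link is 20 Mbit/s.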
https://en.wikipedia.org/wiki/HP%20Multi-Programming%20Executive
MPE (Multi-Programming Executive) is a discontinued business-oriented mainframe computer real-time operating system made by Hewlett-Packard. While the early systems were mini-mainframes, the final high-end systems supported 12 CPUs and over 2000 simultaneous users. Description It runs on the HP 3000 family of computers, which originally used HP custom 16-bit stack architecture CISC CPUs and were later migrated to PA-RISC, where the operating system was called MPE XL. The original version of MPE was written in a language called SPL (System Programming Language). MPE XL was written primarily in Pascal, with some assembly language and some of the old SPL code. In 1992, the OS name was changed to MPE/iX to indicate Unix interoperability with the addition of POSIX compatibility. The discontinuance of the product line was announced in late 2001, with support from HP terminating at the end of 2010. A number of third-party companies still support both the hardware and software. In 2002, HP released the final version, MPE/iX 7.5. Commands Among others, MPE/iX supports the following list of common commands and programs. =SHUTDOWN BASIC CHDIR COPY DEBUG ECHO ELSE EXIT FORTRAN HELP IF PASCAL PRINT RENAME SH WHILE See also HP 3000 References External links Allegro Consultants, Inc. Free HP 3000 Software, MPE Software Support Beechglen Development Inc. MPE Software Support HP MPE/iX homepage HP MPE/iX Command reference openMPE Advocates of continued MPE and IMAGE source code access beyond 2010 Discontinued operating systems Multi-Programming Executive Proprietary operating systems 1974 software
https://en.wikipedia.org/wiki/Richard%20M.%20Karp
Richard Manning Karp (born January 3, 1935) is an American computer scientist and computational theorist at the University of California, Berkeley. He is most notable for his research in the theory of algorithms, for which he received a Turing Award in 1985, The Benjamin Franklin Medal in Computer and Cognitive Science in 2004, and the Kyoto Prize in 2008. Karp was elected a member of the National Academy of Engineering (1992) for major contributions to the theory and application of NP-completeness, constructing efficient combinatorial algorithms, and applying probabilistic methods in computer science. Biography Born to parents Abraham and Rose Karp in Boston, Massachusetts, Karp has three younger siblings: Robert, David, and Carolyn. His family was Jewish, and he grew up in a small apartment, in a then mostly Jewish neighborhood of Dorchester in Boston. Both his parents were Harvard graduates (his mother eventually obtaining her Harvard degree at age 57 after taking evening courses), while his father had had ambitions to go to medical school after Harvard, but became a mathematics teacher as he could not afford the medical school fees. He attended Harvard University, where he received his bachelor's degree in 1955, his master's degree in 1956, and his Ph.D. in applied mathematics in 1959. He started working at IBM's Thomas J. Watson Research Center. In 1968, he became professor of computer science, mathematics, and operations research at the University of California, Berkeley. Karp was the first associate chair of the Computer Science Division within the Department of Electrical Engineering and Computer Science. Apart from a 4-year period as a professor at the University of Washington, he has remained at Berkeley. From 1988 to 1995 and 1999 to the present he has also been a research scientist at the International Computer Science Institute in Berkeley, where he currently leads the Algorithms Group. 
Richard Karp was awarded the National Medal of Science, and was the recipient of the Harvey Prize of the Technion and the 2004 Benjamin Franklin Medal in Computer and Cognitive Science for his insights into computational complexity. In 1994 he was inducted as a Fellow of the Association for Computing Machinery. He was elected to the 2002 class of Fellows of the Institute for Operations Research and the Management Sciences. He is the recipient of several honorary degrees and a member of the U.S. National Academy of Sciences, the American Academy of Arts and Sciences, and the American Philosophical Society. In 2012, Karp became the founding director of the Simons Institute for the Theory of Computing at the University of California, Berkeley. Work Karp has made many important discoveries in computer science, combinatorial algorithms, and operations research. His major current research interests include bioinformatics. In 1962 he co-developed with Michael Held the Held–Karp algorithm, an exact exponential-time algorithm for the travelling sale
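The Held–Karp algorithm mentioned above can be sketched as a dynamic program over subsets of cities. This is a minimal illustrative implementation; as an exact TSP method it is exponential, running in O(n² · 2ⁿ) time:

```python
from itertools import combinations

def held_karp(dist):
    """Held–Karp dynamic program for the travelling salesman problem.
    dist[i][j] is the cost from city i to city j; returns the cost of
    the optimal tour that starts and ends at city 0."""
    n = len(dist)
    # C[(S, j)] = minimum cost of a path that starts at city 0, visits
    # every city in frozenset S exactly once, and ends at j (j in S).
    C = {}
    for j in range(1, n):
        C[(frozenset([j]), j)] = dist[0][j]
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in subset:
                C[(S, j)] = min(C[(S - {j}, k)] + dist[k][j]
                                for k in subset if k != j)
    full = frozenset(range(1, n))
    # Close the tour by returning to city 0.
    return min(C[(full, j)] + dist[j][0] for j in range(1, n))
```

On the classic four-city symmetric example the optimal tour cost is 80, which exhaustive enumeration confirms.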
https://en.wikipedia.org/wiki/Stored-value%20card
A stored-value card (SVC) is a payment card with a monetary value stored on the card itself, not in an external account maintained by a financial institution. This means no network access is required by the payment collection terminals, as funds can be withdrawn and deposited straight from the card. Like cash, such cards can be used anonymously, as the person holding the card can use the funds. They are an electronic development of token coins and are typically used in low-value payment systems or where network access is difficult or expensive to implement, such as parking machines, public transport systems, and closed payment systems in locations such as ships. Stored-value cards differ from debit cards, where money is on deposit with the issuer, and from credit cards, which are subject to credit limits set by the issuer and are connected to accounts at financial institutions. Another difference is that debit and credit cards are usually issued in the name of individual account holders, while stored-value cards may be anonymous, as in the case of gift cards. Stored-value cards are prepaid money cards and may be disposed of when the value is used, or the card value may be topped up, as in the case of telephone calling cards or when used as a fare card. The term closed-loop means the funds and/or data are physically stored on the token or card in the form of binary-coded data. This is unlike payment cards, where data is maintained on the card issuer's computers. Like payment cards, value can be accessed using a magnetic stripe, chip or radio-frequency identification (RFID) embedded in the card; or by entering a code number, printed on the card, into a telephone or other numeric keypad. Names There is no common name for stored-value cards, which are country or company specific. 
Names for stored-value cards include APPH in the US, Mondex in Canada, Chipknip in the Netherlands, Geldkarte in Germany, Quick in Austria, Moneo in France, Proton in Belgium, Carta prepagata ("Prepaid card") in Italy, FeliCa cards such as Suica in Japan, China T-Union in mainland China, EZ-Link and NETS (CashCard and FlashPay) in Singapore, Papara Card in Turkey, Octopus card in Hong Kong, SUBE card in Argentina, T-Cash in the Philippines, and Touch 'n Go and MyRapid Card in Malaysia. The U.S. Department of the Treasury manages three stored-value card programs: EZpay, EagleCash, and Navy Cash. Non-government stored-value cards include Aramark GuestExpress, Compass Zipthru, and Freedompay FreetoGo. Uses Stored-value cards are most commonly used for low-value transactions, such as transit system farecards, telephone prepaid calling cards, cafeterias, or for micropayments in shops or vending machines. They also have an advantage over most other payment cards in that, when making a purchase, telecommunication facilities are not needed, which may be important in situations where the availability or reliability of these facilitie
https://en.wikipedia.org/wiki/W.%20Ross%20Ashby
William Ross Ashby (6 September 1903 – 15 November 1972) was an English psychiatrist and a pioneer in cybernetics, the study of the science of communications and automatic control systems in both machines and living things. His first name was not used: he was known as Ross Ashby. His two books, Design for a Brain and An Introduction to Cybernetics, introduced exact and logical thinking into the brand-new discipline of cybernetics and were highly influential. These "missionary works", along with his technical contributions, made Ashby "the major theoretician of cybernetics after Wiener". Early life and education William Ross Ashby was born in 1903 in London, where his father was working at an advertising agency. From 1921 he studied at Sidney Sussex College, Cambridge, where he received his B.A. in 1924 and his M.B. and B.Ch. in 1928. From 1924 to 1928 he worked at St. Bartholomew's Hospital in London. He later also received a Diploma in Psychological Medicine in 1931, an M.A. in 1930, and an M.D. from Cambridge in 1935. Career Ashby started working in 1930 as a Clinical Psychiatrist at the London County Council. From 1936 until 1947 he was a Research Pathologist at St Andrew's Hospital in Northampton, England. From 1945 to 1947 he served in India, where he was a Major in the Royal Army Medical Corps. When he returned to England, he served as Director of Research of the Barnwood House Hospital in Gloucester from 1947 until 1959. For a year, he was Director of the Burden Neurological Institute in Bristol. In 1960, he went to the United States and became Professor in the Departments of Biophysics and Electrical Engineering at the University of Illinois at Urbana–Champaign, until his retirement in 1970. Ashby was president of the Society for General Systems Research from 1962 to 1964. After retiring in August 1970, he became an Honorary Professorial Fellow at the University of Wales and a fellow of the Royal College of Psychiatrists in 1971. 
In June 1972 he was diagnosed with an inoperable brain tumor, and he died on 15 November. Work Despite being widely influential within cybernetics, systems theory and, more recently, complex systems, Ashby is not as well known as many of the notable scientists his work influenced, including Herbert A. Simon, Norbert Wiener, Ludwig von Bertalanffy, Stafford Beer, Stanley Milgram, and Stuart Kauffman. Journal Ashby kept a journal for over 44 years in which he recorded his ideas about new theories. He started it in May 1928, when he was a medical student at St. Bartholomew's Hospital in London. Over the years, he filled a series of 25 volumes totaling 7,189 pages. In 2003, these journals were given to The British Library, London, and in 2008, they were made available online as The W. Ross Ashby Digital Archive. Ashby initially considered his theorizing a private hobby, and his later decision to begin publishing his work caused him some distress. He wrote: My fear is now that I may become conspicuous, for a book of mi
https://en.wikipedia.org/wiki/LDAP%20Application%20Program%20Interface
The LDAP Application Program Interface, described by RFC 1823, is an Informational RFC that specifies an application programming interface in the C programming language for version 2 of the Lightweight Directory Access Protocol. Version 2 of LDAP is historic. Commonly available LDAP C APIs do not strictly adhere to this specification. A draft standard is under development for LDAP version 3. References External links - The LDAP Application Program Interface - IETF C (programming language) libraries
https://en.wikipedia.org/wiki/William%20Crowther%20%28programmer%29
William Crowther (born 1936) is an American computer programmer, caver, and rock climber. He is the co-creator (from 1975 onward) of Colossal Cave Adventure, a seminal computer game that influenced the first decade of video game design and inspired the text adventure game genre. Biography During the early 1970s, Crowther worked at defense contractor and internet pioneer Bolt Beranek and Newman (BBN), where he was part of the original small ARPAnet development team. His implementation of a distributed distance-vector routing system for the ARPAnet was an important step in the evolution of the internet. Crowther met and married Pat Crowther while studying at the Massachusetts Institute of Technology, where he received a B.S. in physics in 1958. Adventure Following his divorce, Crowther used his spare time to develop a text-based adventure game in Fortran on BBN's PDP-10. He created it as a diversion that his daughters Sandy and Laura could enjoy when they came to visit. Crowther wrote: In Colossal Cave, more simply called Adventure, the player moves around an imaginary cave system by entering simple, two-word commands and reading text describing the result. Crowther used his extensive knowledge of cave exploration as a basis for the gameplay, and there are many similarities between the locations in the game and those in Mammoth Cave, particularly its Bedquilt section. In 1975, Crowther released the game on the early ARPAnet system, of which BBN was a prime contractor. In the spring of 1976, he was contacted by Stanford researcher Don Woods, seeking his permission to enhance the game. Crowther agreed, and Woods developed several enhanced versions on a PDP-10 housed in the Stanford Artificial Intelligence Laboratory (SAIL) where he worked. Over the following decade the game gained in popularity, being ported to many operating systems, including the personal-computer platform CP/M. 
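The distance-vector routing mentioned above can be illustrated with a toy simulation: each node starts knowing only its direct links and repeatedly relaxes its estimates using its neighbours' tables, Bellman–Ford style, until nothing changes. This is a single-machine sketch of the idea, not the ARPAnet protocol itself:

```python
INF = float("inf")

def distance_vector(links):
    """Toy simulation of distributed distance-vector routing: every
    node starts knowing only its direct links, then repeatedly relaxes
    its estimates using its neighbours' tables (Bellman–Ford) until
    nothing changes. links maps (u, v) -> cost of a directed link."""
    nodes = sorted({n for edge in links for n in edge})
    table = {u: {v: (0 if u == v else links.get((u, v), INF))
                 for v in nodes} for u in nodes}
    changed = True
    while changed:
        changed = False
        for (u, v), cost in links.items():
            # Node u hears neighbour v's vector and relaxes its own.
            for dest in nodes:
                if cost + table[v][dest] < table[u][dest]:
                    table[u][dest] = cost + table[v][dest]
                    changed = True
    return table
```

With links A–B of cost 1, B–C of cost 2, and a direct A–C link of cost 10, the tables converge so that A routes to C via B at total cost 3.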
The basic game structure invented by Crowther (and based in part on the example of the ELIZA text parser) was carried forward by the designers of later adventure games. Marc Blank and the team that created the Zork adventures cite Adventure as the title that inspired them to create their game. They later founded Infocom and published a series of popular text adventures. Caving The location of the game in Colossal Cave was not a coincidence. Crowther and his first wife Pat were active and dedicated cavers in the 1960s and early 1970s—both were part of many expeditions to connect the Mammoth and Flint Ridge cave systems. Pat played a key role in the September 9, 1972 expedition that finally made the connection. Indeed, even during his time working at BBN, his colleagues noticed that Crowther spent a fair amount of time doing chin-ups in doorframes, which apparently helped him concentrate. As a member of the MIT Outing Club during the late 1950s and early 1960s, Crowther also played an important role in the development of rock climbing in the Shawang
https://en.wikipedia.org/wiki/John%20Hopcroft
John Edward Hopcroft (born October 7, 1939) is an American theoretical computer scientist. His textbooks on the theory of computation (including the one known as the Cinderella book) and on data structures are regarded as standards in their fields. He is the IBM Professor of Engineering and Applied Mathematics in Computer Science at Cornell University, Co-Director of the Center on Frontiers of Computing Studies at Peking University, and the Director of the John Hopcroft Center for Computer Science at Shanghai Jiao Tong University. Education He received his bachelor's degree from Seattle University in 1961. He received his master's degree and Ph.D. from Stanford University in 1962 and 1964, respectively. He worked for three years at Princeton University and since then has been at Cornell University. Hopcroft is the grandson of Jacob Nist, founder of the Seattle-Tacoma Box Company. Career In addition to his research work, he is well known for his books on algorithms and formal languages coauthored with Jeffrey Ullman and Alfred Aho, regarded as classic texts in the field. In 1986 he received the Turing Award (jointly with Robert Tarjan) "for fundamental achievements in the design and analysis of algorithms and data structures." Along with his work with Tarjan on planar graphs, he is also known for the Hopcroft–Karp algorithm for finding matchings in bipartite graphs. In 1994 he was inducted as a Fellow of the Association for Computing Machinery. In 2005 he received the Harry H. Goode Memorial Award "for fundamental contributions to the study of algorithms and their applications in information processing." In 2008 he received the Karl V. 
Karlstrom Outstanding Educator Award "for his vision of and impact on computer science, including co-authoring field-defining texts on theory and algorithms, which continue to influence students 40 years later, advising PhD students who themselves are now contributing greatly to computer science, and providing influential leadership in computer science research and education at the national and international level." Hopcroft was elected a member of the National Academy of Engineering in 1989 for fundamental contributions to computer algorithms and for authorship of outstanding computer science textbooks. In 1992, Hopcroft was nominated to the National Science Board by George H. W. Bush. In 2005, he was awarded an honorary doctorate by the University of Sydney, in Sydney, Australia. In 2009, he received an honorary doctorate from Saint Petersburg State University of Information Technologies, Mechanics and Optics. In 2017, Shanghai Jiao Tong University launched a John Hopcroft Center for Computer Science. In 2020 the Chinese University of Hong Kong, Shenzhen opened a Hopcroft Institute for Advanced Information Sciences and designated him as an Einstein professor. Hopcroft is also the co-recipient (with Jeffrey Ullman) of the 2010 IEEE John von Neumann Medal "for laying the foundations for the fields of automata and language theory
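The Hopcroft–Karp bipartite-matching algorithm mentioned above alternates a layered breadth-first search with depth-first augmenting-path passes, achieving O(E√V) time. A compact illustrative implementation:

```python
from collections import deque

def hopcroft_karp(adj, n_left, n_right):
    """Hopcroft–Karp maximum bipartite matching. adj[u] lists the
    right-side vertices adjacent to left vertex u. Returns the size
    of a maximum matching."""
    INF = float("inf")
    match_l = [-1] * n_left    # right partner of each left vertex
    match_r = [-1] * n_right   # left partner of each right vertex

    def bfs():
        # Layer the graph from all free left vertices at distance 0.
        dist = {u: (0 if match_l[u] == -1 else INF) for u in range(n_left)}
        q = deque(u for u in range(n_left) if match_l[u] == -1)
        found = False
        while q:
            u = q.popleft()
            for v in adj[u]:
                w = match_r[v]
                if w == -1:
                    found = True          # reached a free right vertex
                elif dist[w] == INF:
                    dist[w] = dist[u] + 1
                    q.append(w)
        return found, dist

    def dfs(u, dist):
        # Follow the BFS layering to find a shortest augmenting path.
        for v in adj[u]:
            w = match_r[v]
            if w == -1 or (dist[w] == dist[u] + 1 and dfs(w, dist)):
                match_l[u], match_r[v] = v, u
                return True
        dist[u] = INF                     # dead end for this phase
        return False

    matching = 0
    while True:
        found, dist = bfs()
        if not found:
            return matching
        for u in range(n_left):
            if match_l[u] == -1 and dfs(u, dist):
                matching += 1
```

Each phase augments along a maximal set of shortest augmenting paths, which is what improves on the simpler one-path-at-a-time approach.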
https://en.wikipedia.org/wiki/Andrew%20Yao
Andrew Chi-Chih Yao (born December 24, 1946) is a Chinese computer scientist and computational theorist. He is currently a professor and the dean of the Institute for Interdisciplinary Information Sciences (IIIS) at Tsinghua University. Yao used the minimax theorem to prove what is now known as Yao's Principle. Yao was a naturalized U.S. citizen and worked for many years in the U.S. In 2015, together with Yang Chen-Ning, he renounced his U.S. citizenship and became an academician of the Chinese Academy of Sciences. Early life and education Yao was born in Shanghai, China. He completed his undergraduate education in physics at the National Taiwan University, before completing a Doctor of Philosophy in physics at Harvard University in 1972, and then a second PhD in computer science from the University of Illinois at Urbana–Champaign in 1975. Academic career Yao was an assistant professor at the Massachusetts Institute of Technology (1975–1976), assistant professor at Stanford University (1976–1981), and professor at the University of California, Berkeley (1981–1982). From 1982 to 1986, he was a full professor at Stanford University. From 1986 to 2004, Yao was the William and Edna Macaleer Professor of Engineering and Applied Science at Princeton University, where he continued to work on algorithms and complexity. In 2004, Yao became a professor of the Center for Advanced Study, Tsinghua University (CASTU) and the director of the Institute for Theoretical Computer Science (ITCS), Tsinghua University in Beijing. Since 2010, he has served as the Dean of the Institute for Interdisciplinary Information Sciences (IIIS) at Tsinghua University. In 2010, he initiated the Conference on Innovations in Theoretical Computer Science (ITCS). Yao is also a Distinguished Professor-at-Large at the Chinese University of Hong Kong. Awards In 1996, Yao was awarded the Knuth Prize. 
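Yao's Principle, mentioned above, can be stated informally as follows (a standard formulation, paraphrased here): writing c(A, x) for the cost of deterministic algorithm A on input x,

```latex
\min_{A}\; \mathbb{E}_{x \sim \mu}\,[\,c(A, x)\,] \;\le\; \max_{x}\; \mathbb{E}\,[\,c(R, x)\,]
\qquad \text{for every input distribution } \mu \text{ and every randomized algorithm } R .
```

That is, the expected cost of the best deterministic algorithm against any fixed input distribution lower-bounds the worst-case expected cost of every randomized algorithm, which is why the principle is a standard tool for proving lower bounds on randomized computation.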
Yao also received the Turing Award in 2000, one of the most prestigious awards in computer science, "in recognition of his fundamental contributions to the theory of computation, including the complexity-based theory of pseudorandom number generation, cryptography, and communication complexity". In 2021, Yao received the Kyoto Prize in Advanced Technology. Yao is a member of U.S. National Academy of Sciences, a fellow of the American Academy of Arts and Sciences, a fellow of the American Association for the Advancement of Science, a fellow of the Association for Computing Machinery, and an academician of Chinese Academy of Sciences. His wife, Frances Yao, is also a theoretical computer scientist. See also Yao's principle Dolev-Yao model Important publications in cryptography Yao's test Yao's Millionaires' Problem Yao graph Garbled circuit References External links Andrew Yao at CASTU 1946 births Living people 20th-century American scientists 20th-century Chinese scientists 21st-century American scientists 21st-century Chinese scientists American computer scientists American e
https://en.wikipedia.org/wiki/Optical%20disc%20drive
In computing, an optical disc drive is a disc drive that uses laser light or electromagnetic waves within or near the visible light spectrum as part of the process of reading or writing data to or from optical discs. Some drives can only read from certain discs, but recent drives can both read and record; such drives are also called burners or writers (since they physically burn the organic dye on write-once CD-R, DVD-R and BD-R LTH discs). Compact discs, DVDs, and Blu-ray discs are common types of optical media which can be read and recorded by such drives. Although laptop manufacturers no longer bundle optical drives with their products, external drives are still available for purchase separately. Drive types Most of the optical disc drives on the market are DVD-ROM drives and BD-ROM drives which read and record from those formats, along with having backward compatibility with CD, CD-R and CD-ROM discs; compact disc drives are no longer manufactured outside of audio devices. Read-only DVD and Blu-ray drives are also manufactured, but are less commonly found in the consumer market and mainly limited to media devices such as game consoles and disc media players. Over the last ten years, laptop computers have increasingly shipped without optical disc drives in order to reduce costs and make devices lighter, requiring consumers to purchase external optical drives. Appliances and functionality Optical disc drives are an integral part of standalone appliances such as CD players, DVD players, Blu-ray Disc players, DVD recorders, and video game consoles. As of 2017, the PlayStation and Xbox consoles are the only home video game consoles still using optical discs as their primary storage format, as the Wii U's successor, the Nintendo Switch, began using game cartridges, while the PlayStation Portable is the only handheld console to use optical discs (UMDs). 
They are also very commonly used in computers to read software and media distributed on disc and to record discs for archival and data exchange purposes. Floppy disk drives, with a capacity of 1.44 MB, have been made obsolete: optical media are cheap and have vastly higher capacity to handle the large files used since the days of floppy disks, and the vast majority of computers and much consumer entertainment hardware have optical writers. USB flash drives, which are high-capacity, small, and inexpensive, are suitable where read/write capability is required. Disc recording is restricted to storing files playable on consumer appliances (films, music, etc.), relatively small volumes of data (e.g. a standard DVD holds 4.7 gigabytes, though higher-capacity formats such as multi-layer Blu-ray Discs exist) for local use, and data for distribution, but only on a small scale; mass-producing large numbers of identical discs by pressing (replication) is cheaper and faster than individual recording (duplication). To support 8 centimetre diameter discs, drives with mechanical tray loading (desktop computer drive
https://en.wikipedia.org/wiki/Compound%20document
In computing, a compound document is a document that “combines multiple document formats, either by reference, by inclusion, or both.” Compound documents are often produced using word processing software, and may include text and non-text elements such as barcodes, spreadsheets, pictures, digital videos, digital audio, and other multimedia features. Compound document technologies are commonly built on top of a software componentry framework, but the idea of software componentry includes several other concepts apart from compound documents, and software components alone do not enable compound documents. Well-known technologies for compound documents include: ActiveX Documents Bonobo by Ximian (primarily used by GNOME) KParts in KDE Mixed Object Document Content Architecture Multipurpose Internet Mail Extensions (MIME) Object linking and embedding (OLE) by Microsoft; see Compound File Binary Format Open Document Architecture from ITU-T (not used) OpenDoc by IBM and Apple Computer (now defunct) RagTime Verdantium XML and XSL, encapsulation formats used for compound documents of all kinds The first public implementation of compound documents was on the Xerox Star workstation, released in 1981. See also COM Structured Storage Multiple-document interface Transclusion References Electronic documents Multimedia
https://en.wikipedia.org/wiki/Guy%20L.%20Steele%20Jr.
Guy Lewis Steele Jr. (; born October 2, 1954) is an American computer scientist who has played an important role in designing and documenting several computer programming languages and technical standards. Biography Steele was born in Missouri and graduated from the Boston Latin School in 1972. He received a Bachelor of Arts (BA) in applied mathematics from Harvard University (1975) and a Master's degree (MS) and Doctor of Philosophy (PhD) from Massachusetts Institute of Technology (MIT) in computer science (1977, 1980). He then worked as an assistant professor of computer science at Carnegie Mellon University and a compiler implementer at Tartan Laboratories. Then he joined the supercomputer company Thinking Machines, where he helped define and promote a parallel computing version of the Lisp programming language named *Lisp (Star Lisp) and a parallel version of the language C named C*. In 1994, Steele joined Sun Microsystems and was invited by Bill Joy to become a member of the Java team after the language had been designed, since he had a track record of writing good specifications for extant languages. He was named a Sun Fellow in 2003. Steele joined Oracle in 2010 when Oracle acquired Sun Microsystems. Works While at MIT, Steele published more than two dozen papers with Gerald Jay Sussman on the subject of the language Lisp and its implementation (the Lambda Papers). One of their most notable contributions was the design of the language Scheme. Steele also designed the original command set of Emacs and was the first to port TeX (from WAITS to ITS). He has published papers on other subjects, including compilers, parallel processing, and constraint languages. One song he composed has been published in the official journal of the Association for Computing Machinery Communications of the ACM (CACM) ("The Telnet Song", April 1984, a parody of the behavior of a series of PDP-10 TELNET implementations written by Mark Crispin). 
Steele has served on accredited technical standards committees, including: Ecma International (formerly European Computer Manufacturers Association (ECMA)) TC39 (for the language ECMAScript, for which he was editor of the first edition), X3J11 (for C), and X3J3 (for Fortran), and has served as chairman of X3J13 (for Common Lisp). He was also a member of the Institute of Electrical and Electronics Engineers (IEEE) working group that produced the IEEE Standard for the language Scheme, IEEE Std 1178-1990. He represented Sun Microsystems in the High Performance Fortran Forum, which produced the High Performance Fortran specification in May 1993. In addition to specifications of the language Java, Steele's work at Sun Microsystems has included research in parallel algorithms, implementation strategies, and architecture and software support. In 2005, Steele began leading a team of researchers at Sun developing a new language named Fortress, a high-performance language designed to supersede Fortran. Books In 1982, Steele edited The Ha
https://en.wikipedia.org/wiki/Point-to-Point%20Protocol%20over%20Ethernet
The Point-to-Point Protocol over Ethernet (PPPoE) is a network protocol for encapsulating Point-to-Point Protocol (PPP) frames inside Ethernet frames. It appeared in 1999, in the context of the boom of DSL, as the solution for tunneling packets over the DSL connection to the ISP's IP network, and from there to the rest of the Internet. A 2005 networking book noted that "Most DSL providers use PPPoE, which provides authentication, encryption, and compression." Typical use of PPPoE involves leveraging the PPP facilities for authenticating the user with a username and password, predominantly via the PAP protocol and less often via CHAP. Around 2000, PPPoE was also starting to become a replacement method for talking to a modem connected to a computer or router over an Ethernet LAN, displacing the older method, which had been USB. This use case, connecting routers to modems over Ethernet, is still extremely common today. On the customer-premises equipment, PPPoE may be implemented either in a unified residential gateway device that handles both DSL modem and IP routing functions or, in the case of a simple DSL modem (without routing support), PPPoE may be handled behind it on a separate Ethernet-only router or even directly on a user's computer. (Support for PPPoE is present in most operating systems, including Windows XP, Linux, and Mac OS X.) More recently, some GPON-based (instead of DSL-based) residential gateways also use PPPoE, although the status of PPPoE in the GPON standards is marginal. PPPoE was developed by UUNET, Redback Networks (now Ericsson) and RouterWare (now Wind River Systems) and is available as an informational RFC 2516. In the world of DSL, PPPoE was commonly understood to be running on top of ATM (or DSL) as the underlying transport, although no such limitation exists in the PPPoE protocol itself. Other usage scenarios are sometimes distinguished by tacking on another underlying transport as a suffix.
For example, PPPoEoE, when the transport is Ethernet itself, as in the case of Metro Ethernet networks. (In this notation, the original use of PPPoE would be labeled PPPoEoA, although it should not be confused with PPPoA, which is a different encapsulation protocol.) PPPoE has been described in some books as a "layer 2.5" protocol, in some rudimentary sense similar to MPLS because it can be used to distinguish different IP flows sharing an Ethernet infrastructure, although the lack of PPPoE switches making routing decisions based on PPPoE headers limits applicability in that respect. Original rationale In late 1998, the DSL service model had yet to reach the large scale that would bring prices down to household levels. ADSL technology had been proposed a decade earlier. Potential equipment vendors and carriers alike recognized that broadband such as cable modem or DSL would eventually replace dialup service, but the hardware (both customer premises and LEC) faced a significant low-quantity cost barrier. Initial estimates for
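The MTU reduction mentioned above follows directly from the frame layout in RFC 2516: each Ethernet payload carries a 6-byte PPPoE session header plus a 2-byte PPP protocol identifier before the IP packet begins. A minimal sketch of that arithmetic (the helper function and session ID are illustrative, not from any real implementation):

```python
import struct

ETHERNET_PAYLOAD_MAX = 1500  # standard Ethernet MTU

def pppoe_session_header(session_id: int, ppp_payload: bytes) -> bytes:
    """Build a PPPoE session-stage frame payload per RFC 2516.

    Header layout: version/type (1 byte, 0x11), code (1 byte, 0x00 for
    session data), session ID (2 bytes), payload length (2 bytes).
    """
    return struct.pack("!BBHH", 0x11, 0x00, session_id, len(ppp_payload)) + ppp_payload

# The PPP frame inside PPPoE starts with a 2-byte protocol ID (0x0021 = IPv4),
# followed here by a maximal 1492-byte IP packet.
ppp_frame = struct.pack("!H", 0x0021) + b"\x00" * 1492

frame_payload = pppoe_session_header(session_id=0x0001, ppp_payload=ppp_frame)
overhead = len(frame_payload) - 1492      # 6-byte PPPoE header + 2-byte PPP ID
print(overhead)                           # 8
print(ETHERNET_PAYLOAD_MAX - overhead)    # 1492, the familiar reduced IP MTU
```

This is why hosts behind PPPoE links often advertise an MTU of 1492 rather than 1500, and why path-MTU issues were a common symptom of early DSL deployments.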
https://en.wikipedia.org/wiki/Point-to-Point%20Protocol%20over%20ATM
In computer networking, the Point-to-Point Protocol over ATM (PPPoA) is a layer 2 data-link protocol typically used to connect domestic broadband modems to ISPs via phone lines. It is used mainly with DOCSIS and DSL carriers, by encapsulating PPP frames in ATM AAL5. Point-to-Point Protocol over Asynchronous Transfer Mode (PPPoA) is specified by the Internet Engineering Task Force (IETF) in RFC 2364. It offers standard PPP features such as authentication, encryption, and compression. It supports two encapsulation types, VC-MUX and LLC (see RFC 2364). When used as the connection encapsulation method on an ATM-based network, it can reduce overhead significantly compared with PPPoEoA – by between 0 and ~3.125% for long packets, depending on the packet length and on the header options chosen in PPPoEoA (see PPPoEoA protocol overheads). This is because its headers are short and impose minimal overhead: 2 bytes for PPP plus 8 bytes for PPPoA (with the RFC 2364 VC-MUX option), 10 bytes in total. It also avoids the issues that PPPoE suffers from, related to sometimes needing to use an IP MTU of 1492 bytes or less, lower than the standard 1500 bytes. The choice of PPPoA over PPPoE is not geographically significant; rather, it varies by provider preference. Configuration Configuration of a PPPoA requires PPP configuration and ATM configuration. These data are generally stored in a cable modem or DSL modem, and may or may not be visible to—or configurable by—an end-user. PPP configuration generally includes: user credentials, user name and password, and is unique to each user.
ATM configuration includes: a virtual channel link (VCL), i.e. the virtual path identifier and virtual channel identifier (VPI/VCI), such as 0/32 (analogous to a phone number); the modulation type, such as G.dmt; and the multiplexing method, such as VC-MUX or LLC. ATM configuration can either be performed manually, or it may be hard-coded (or pre-set) into the firmware of a DSL modem provided by the user's ISP; it cannot be automatically negotiated. See also PPPoE PPPoX L2TP ATM DSL Notes External links A typical PPPoA architecture diagram (out of date and no longer maintained) Telecommunication protocols Tunneling protocols
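The "0 to ~3.125%" figure quoted above comes from ATM's fixed cell size: an AAL5 PDU is padded up to a whole number of 48-byte cell payloads, so a larger encapsulation header only costs anything when it pushes a packet into one extra 53-byte cell. A sketch of that cell arithmetic, using the 10-byte PPPoA (VC-MUX) figure from the article; the 18-byte PPPoEoA overhead used for comparison is an illustrative assumption, since the real value depends on the PPPoEoA header options:

```python
import math

ATM_CELL_PAYLOAD = 48   # bytes of payload per ATM cell
ATM_CELL_SIZE = 53      # 5-byte cell header + 48-byte payload

def atm_cells(ip_packet_len: int, encap_overhead: int) -> int:
    """Number of ATM cells needed to carry one IP packet plus its
    encapsulation headers/trailer (AAL5 pads to a whole number of cells)."""
    return math.ceil((ip_packet_len + encap_overhead) / ATM_CELL_PAYLOAD)

for packet in (1500, 1478):
    pppoa = atm_cells(packet, 10)      # PPPoA VC-MUX: 2-byte PPP + 8-byte AAL5
    pppoeoa = atm_cells(packet, 18)    # assumed PPPoEoA overhead, for comparison
    extra_bytes = (pppoeoa - pppoa) * ATM_CELL_SIZE
    print(packet, pppoa, pppoeoa, extra_bytes)
```

For a 1500-byte packet both fit in 32 cells (0% difference); for a 1478-byte packet the larger header forces a 32nd cell, costing 53 extra bytes out of roughly 1643 on the wire, about 3%, which matches the upper bound quoted above.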
https://en.wikipedia.org/wiki/ISDN%20digital%20subscriber%20line
ISDN Digital Subscriber Line (IDSL) uses ISDN-based digital subscriber line technology to provide a data communication channel across existing copper telephone lines at a rate of 144 kbit/s, slightly higher than a bonded dual-channel ISDN connection at 128 kbit/s. The digital transmission bypasses the telephone company's central office equipment that handles analogue signals. IDSL uses the ISDN-grade loop without the Basic Rate Interface, in ISDN transmission mode. The benefits of IDSL over ISDN are that IDSL provides always-on connections and transmits data via a data network rather than the carrier's voice network. IDSL also avoids per-call fees by generally being billed at a flat rate. IDSL is not available in all countries. ISDN digital subscriber line (IDSL) is a cross between ISDN and xDSL. It is like ISDN in that it uses a single wire pair to transmit full-duplex data at 128 kbit/s and at distances of up to RRD range. Like ISDN, IDSL uses a 2B1Q line code to enable transparent operation through the ISDN U interface. Finally, the user continues to use existing CPE (ISDN BRI terminal adapters, bridges, and routers) to make the CO connections. The big difference is from the carrier's point of view. Unlike ISDN, IDSL does not connect through the voice switch. A new piece of data communications equipment terminates the IDSL connection and shunts it off to a router or data switch. This is a key feature because the overloading of central office voice switches by data users is a growing problem for telcos. The limitation of IDSL is that the customer no longer has access to ISDN signaling or voice services. But for Internet service providers, who do not provide a public voice service, IDSL is an alternative to POTS dial service for offering higher-speed Internet access, targeting the embedded base of more than five million ISDN users as an initial market.
References Digital subscriber line Integrated Services Digital Network Telecommunications-related introductions in 2000
https://en.wikipedia.org/wiki/RC%20circuit
A resistor–capacitor circuit (RC circuit), or RC filter or RC network, is an electric circuit composed of resistors and capacitors. It may be driven by a voltage or current source and these will produce different responses. A first-order RC circuit is composed of one resistor and one capacitor and is the simplest type of RC circuit. RC circuits can be used to filter a signal by blocking certain frequencies and passing others. The two most common RC filters are the high-pass filters and low-pass filters; band-pass filters and band-stop filters usually require RLC filters, though crude ones can be made with RC filters. Introduction There are three basic, linear passive lumped analog circuit components: the resistor (R), the capacitor (C), and the inductor (L). These may be combined in the RC circuit, the RL circuit, the LC circuit, and the RLC circuit, with the acronyms indicating which components are used. These circuits, among them, exhibit a large number of important types of behaviour that are fundamental to much of analog electronics. In particular, they are able to act as passive filters. This article considers the RC circuit, in both series and parallel forms, as shown in the diagrams below. Natural response The simplest RC circuit consists of a resistor and a charged capacitor connected to one another in a single loop, without an external voltage source. Once the circuit is closed, the capacitor begins to discharge its stored energy through the resistor. The voltage across the capacitor, which is time-dependent, can be found by using Kirchhoff's current law. The current through the resistor must be equal in magnitude (but opposite in sign) to the time derivative of the accumulated charge on the capacitor. This results in the linear differential equation C dV/dt + V/R = 0, where C is the capacitance of the capacitor. Solving this equation for V yields the formula for exponential decay: V(t) = V0 e^(−t/RC), where V0 is the capacitor voltage at time t = 0.
The time required for the voltage to fall to V0/e is called the RC time constant and is given by τ = RC. In this formula, τ is measured in seconds, R in ohms and C in farads. Complex impedance The complex impedance Z_C (in ohms) of a capacitor with capacitance C (in farads) is Z_C = 1/(sC). The complex frequency s = σ + jω is, in general, a complex number, where j represents the imaginary unit: j² = −1, σ is the exponential decay constant (in nepers per second), and ω is the sinusoidal angular frequency (in radians per second). Sinusoidal steady state Sinusoidal steady state is a special case in which the input voltage consists of a pure sinusoid (with no exponential decay). As a result, σ = 0, so s = jω, and the impedance becomes Z_C = 1/(jωC). Series circuit By viewing the circuit as a voltage divider, the voltage across the capacitor is: V_C(s) = V_in(s)/(1 + RCs), and the voltage across the resistor is: V_R(s) = V_in(s)·RCs/(1 + RCs). Transfer functions The transfer function from the input voltage to the voltage across the capacitor is H_C(s) = V_C(s)/V_in(s) = 1/(1 + RCs). Similarly, the transfer function from the input to the voltage across the resistor is H_R(s) = V_R(s)/V_in(s) = RCs/(1 + RCs). Poles and zeros Both tran
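The natural-response and transfer-function formulas above are easy to check numerically. A short sketch (component values chosen for illustration): after one time constant the capacitor voltage has fallen to V0/e, and at the cutoff frequency the low-pass response across the capacitor has magnitude 1/√2 (the −3 dB point).

```python
import math

R = 10_000.0     # ohms
C = 100e-9       # farads (100 nF)
tau = R * C      # RC time constant, in seconds (here 1 ms)

def v_cap(t: float, v0: float = 5.0) -> float:
    """Natural response: V(t) = V0 * e^(-t/RC)."""
    return v0 * math.exp(-t / tau)

# After one time constant the voltage has fallen to V0/e (~36.8%).
print(v_cap(tau) / 5.0)          # ~0.3679

def h_lowpass(freq_hz: float) -> complex:
    """Sinusoidal steady-state transfer function across the capacitor:
    H(jw) = 1 / (1 + j*w*R*C)."""
    w = 2 * math.pi * freq_hz
    return 1 / (1 + 1j * w * R * C)

fc = 1 / (2 * math.pi * tau)     # cutoff frequency (~159 Hz here)
print(abs(h_lowpass(fc)))        # ~0.707, the -3 dB point
```

At f = fc the product ωRC equals exactly 1, so |H| = 1/√2 regardless of the particular R and C chosen.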
https://en.wikipedia.org/wiki/Year%202038%20problem
The year 2038 problem (also known as Y2038, Y2K38, the Y2K38 superbug or the Epochalypse) is a time formatting bug in computer systems when representing times after 03:14:07 UTC on 19 January 2038. The problem exists in systems which measure Unix time – the number of seconds elapsed since the Unix epoch (00:00:00 UTC on 1 January 1970) – and store it in a signed 32-bit integer. The data type is only capable of representing integers between −2³¹ and 2³¹ − 1, meaning the latest time that can be properly encoded is 2³¹ − 1 seconds after epoch (03:14:07 UTC on 19 January 2038). Attempting to increment to the following second (03:14:08) will cause the integer to overflow, setting its value to −2³¹, which systems will interpret as 2³¹ seconds before epoch (20:45:52 UTC on 13 December 1901). The problem is similar in nature to the year 2000 problem. Computer systems that use time for critical computations may encounter fatal errors if the Y2038 problem is not addressed. Some applications that use future dates have already encountered the bug. The most vulnerable systems are those which are infrequently or never updated, such as legacy and embedded systems. There is no universal solution to the problem, though many modern systems have been upgraded to measure Unix time with signed 64-bit integers, which will not overflow for 292 billion years—approximately 21 times the estimated age of the universe. Cause Many computer systems measure time and date as Unix time, an international standard for digital timekeeping. Unix time is defined as the number of seconds elapsed since 00:00:00 UTC on 1 January 1970 (an arbitrarily chosen time based on the creation of the first Unix system), which has been dubbed the Unix epoch. Unix time has historically been encoded as a signed 32-bit integer, a data type composed of 32 binary digits (bits) which represent an integer value, with 'signed' meaning that the number is stored in two's complement format.
Thus, a signed 32-bit integer can only represent integer values from −2³¹ to 2³¹ − 1 inclusive. Consequently, if a signed 32-bit integer is used to store Unix time, the latest time that can be stored is 2³¹ − 1 (2,147,483,647) seconds after epoch, which is 03:14:07 UTC on 19 January 2038. Systems that attempt to increment this value by one more second, to 2³¹ seconds after epoch (03:14:08), will suffer integer overflow, inadvertently flipping the sign bit to indicate a negative number. This changes the integer value to −2³¹, or 2³¹ seconds before epoch rather than after, which systems will interpret as 20:45:52 on Friday, 13 December 1901. From here, systems will continue to count up, toward zero, and then up through the positive integers again. As many computer systems use time computations to run critical functions, the bug may introduce fatal errors. Vulnerable systems Any system using data structures with 32-bit time representations has an inherent risk of failure. A full list of these data structures is virtually impossible to derive, but there are well-known
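The wraparound described above can be reproduced directly. The sketch below mimics a signed 32-bit time_t by reinterpreting an integer's low 32 bits as a two's-complement value (the helper function is illustrative; Python's own integers do not overflow):

```python
import struct
from datetime import datetime, timezone

def as_signed_32bit(n: int) -> int:
    """Reinterpret the low 32 bits of n as a signed (two's-complement) value,
    mimicking what happens when a C time_t is a signed 32-bit int."""
    return struct.unpack("<i", struct.pack("<I", n & 0xFFFFFFFF))[0]

latest = 2**31 - 1                     # 2,147,483,647 seconds after the epoch
print(datetime.fromtimestamp(latest, timezone.utc))
# 2038-01-19 03:14:07+00:00

wrapped = as_signed_32bit(latest + 1)  # one more second flips the sign bit
print(wrapped)                         # -2147483648
print(datetime.fromtimestamp(wrapped, timezone.utc))
# 1901-12-13 20:45:52+00:00
```

The same wrap happens silently in any C program storing Unix time in a 32-bit `int` or 32-bit `time_t`, which is why the fix is to widen the type to 64 bits rather than to patch individual computations.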
https://en.wikipedia.org/wiki/Power%20Computing%20Corporation
Power Computing Corporation (often referred to as Power Computing) was the first company selected by Apple Inc to create Macintosh-compatible computers ("Mac clones"). Stephen “Steve” Kahng, a computer engineer best known for his design of the Leading Edge Model D, founded the company in November 1993. Power Computing started out with financial backing from Olivetti and Kahng. The first Mac-compatible clone PC shipped in May 1995. Like Dell Computer, Power Computing followed a direct, build-to-order sales model. In its first year, Power Computing shipped 100,000 units, with revenues of $250 million. Power Computing was the first company to sell $1,000,000 of products on the Internet. Power Computing released upgraded models until 1997, with revenues reaching $400 million a year. The Mac clone business was stopped after Steve Jobs returned as interim CEO of Apple in July 1997. In September, Apple bought the core assets of Power Computing for $100 million in Apple stock and terminated the Mac cloning business. History Power Computing Corporation was founded on 11 November 1993 in Milpitas, California, backed by $5 million from Olivetti and $4 million from Kahng. At the MacWorld Expo in January 1995, just days after receiving notice that he had the license to clone Macintosh computers, Kahng enlisted Mac veteran Michael Shapiro to help build the company. Shapiro helped to develop the original logo and brand and worked with Kahng to build the initial management team. Power Computing opened manufacturing and operations offices in Austin, Texas, at the recently abandoned facilities of CompuAdd, and engineering offices in Cupertino, California, staffed largely by members of Apple's original Power Macintosh team. In 1997, PCC relocated its headquarters to a location directly across I-35 from Dell's main campus, and remained there until Apple acquired PCC's assets in 1997.
Kahng set out to create a simplified Mac design that made it cheaper and faster to produce the machines. He then targeted the mail-order market, where Power Computing could get a quicker return on its money than it could by selling through distributors. "With direct mail, you get your money back in days by credit card instead of the 30 to 60 days it takes for the resale channel to repay," Kahng said. At that time, Apple was leaning towards giving licenses to big time computer makers. Initially, even with Kahng's reputation as a "master cloner", getting Apple to take him seriously was a challenge. He ended up bringing Olivetti people with him to meetings. Apple engineers gave him the help he needed to make a Mac prototype. The team reduced the size of the Apple main circuit board so that it could fit into a standard PC box. They also used off-the-shelf PC power supplies and monitors. A few days before the end of the year, it was announced that Apple Computer picked Power Computing to be its first Macintosh clone maker. Jim Gable, Apple's director of Mac licensing w
https://en.wikipedia.org/wiki/NOAA%20Weather%20Radio
NOAA Weather Radio (NWR), also known as NOAA Weather Radio All Hazards, is an automated 24-hour network of VHF FM weather radio stations in the United States that broadcast weather information directly from a nearby National Weather Service office. The routine programming cycle includes local or regional weather forecasts, synopsis, climate summaries or zone/lake/coastal waters forecasts (when applicable). During severe conditions the cycle is shortened into: hazardous weather outlooks, short-term forecasts, special weather statements or tropical weather summaries (the first two are not normally broadcast in most offices). It occasionally broadcasts other non-weather related events such as national security statements, natural disaster information, environmental and public safety statements (such as an AMBER Alert), civil emergencies, fires, evacuation orders, and other hazards sourced from the Federal Communications Commission's (FCC) Emergency Alert System. NOAA Weather Radio uses automated broadcast technology (since 2016: Broadcast Message Handler) that allows for the recycling of segments featured in one broadcast cycle into another and more regular updating of segments to each of the transmitters. It also speeds up the warning transmitting process. Weather radios are widely sold online and in retail stores that specialize in consumer electronics in Canada and the US. Additionally, they are readily available in many supermarkets and drug stores in the southern and midwestern US, which are particularly susceptible to severe weather—large portions of these regions are commonly referred to as "Tornado Alley". History The U.S. Weather Bureau first began broadcasting marine weather information in Chicago and New York City on two VHF radio stations in 1960 as an experiment. Proving to be successful, the broadcasts expanded to serve the general public in coastal regions in the 1960s and early 1970s. By early 1970, ESSA listed 20 U.S. 
cities using 162.55 MHz and one using 163.275 MHz for "ESSA VHF Radio Weather." Later, the U.S. Weather Bureau adopted its current name, National Weather Service (NWS), and was operating 29 VHF-FM weather-radio transmitters under the National Oceanic and Atmospheric Administration (NOAA), which replaced ESSA in 1970. The service was designed with boaters, fishermen, travelers and more in mind, allowing listeners to quickly receive a "life-saving" weather bulletin from their local weather forecast office (WFO), along with routinely updated forecasts and other climatological data in a condensed format at any time of the day or night. The general public could have the latest weather updates when they needed them, and the benefit of more lead time to prepare during severe conditions. In 1974, NOAA Weather Radio (NWR), as it was now called, reached about 44 percent of the U.S. population over 66 nationwide transmitters. NWR grew to over 300 stations by the late 1970s. Local NWS staff were the voices heard on NWR stations from i
https://en.wikipedia.org/wiki/Alan%20Perlis
Alan Jay Perlis (April 1, 1922 – February 7, 1990) was an American computer scientist and professor at Purdue University, Carnegie Mellon University and Yale University. He is best known for his pioneering work in programming languages and was the first recipient of the Turing Award. Biography Perlis was born to a Jewish family in Pittsburgh, Pennsylvania. He graduated from Taylor Allderdice High School in 1939. In 1943, he received his bachelor's degree in chemistry from the Carnegie Institute of Technology (later renamed Carnegie Mellon University). During World War II, he served in the U.S. Army, where he became interested in mathematics. He then earned both a master's degree (1949) and a Ph.D. (1950) in mathematics at Massachusetts Institute of Technology (MIT). His doctoral dissertation was titled "On Integral Equations, Their Solution by Iteration and Analytic Continuation". In 1952, he participated in Project Whirlwind. He joined the faculty at Purdue University and in 1956, moved to the Carnegie Institute of Technology. He was chair of mathematics and then the first head of the computer science department. In 1962, he was elected president of the Association for Computing Machinery. He was awarded the inaugural Turing Award in 1966, according to the citation, "for his influence in the area of advanced programming techniques and compiler construction." This is a reference to the work he had done on Internal Translator in 1956 (described by Donald Knuth as the first successful compiler), and as a member of the team that developed the programming language ALGOL. In 1971, Perlis moved to Yale University to take the chair of computer science and hold the Eugene Higgins chair. In 1977, he was elected to the National Academy of Engineering. 
In 1982, he wrote an article, "Epigrams on Programming", for the Association for Computing Machinery's (ACM) SIGPLAN journal, describing in one-sentence distillations many of the things he had learned about programming over his career. The epigrams have been widely quoted. He remained at Yale until his death in 1990. Publications Publications, a selection: 1957. Internal Translator (IT): A Compiler for the 650. With J. W. Smith and H. R. Van Zoeren. 1965. An introductory course in computer programming. With Robert T. Braden. 1970. A view of programming languages. With Bernard A. Galler 1975. Introduction to computer science 1977. In Praise of APL: A Language for Lyrical Programming 1978. Almost Perfect Artifacts Improve only in Small Ways: APL is more French than English 1981. Software Metrics: An Analysis and Evaluation. With Frederick Sayward and Mary Shaw 1986. FAC: A Functional APL Language. With Tu Hai-Chen. About Alan Perlis See also List of pioneers in computer science References External links Oral history interview with Allen Newell at Charles Babbage Institute, University of Minnesota, Minneapolis. Newell discusses the development of the Computer Science Department at Car
https://en.wikipedia.org/wiki/Datasheet
A datasheet, data sheet, or spec sheet is a document that summarizes the performance and other characteristics of a product, machine, component (e.g., an electronic component), material, subsystem (e.g., a power supply), or software in sufficient detail that allows a buyer to understand what the product is and a design engineer to understand the role of the component in the overall system. Typically, a datasheet is created by the manufacturer and begins with an introductory page describing the rest of the document, followed by listings of specific characteristics, with further information on the connectivity of the devices. In cases where there is relevant source code to include, it is usually attached near the end of the document or separated into another file. Datasheets are created, stored, and distributed via product information management or product data management systems. Depending on the specific purpose, a datasheet may offer an average value, a typical value, a typical range, engineering tolerances, or a nominal value. The type and source of data are usually stated on the datasheet. A datasheet is usually used for commercial or technical communication to describe the characteristics of an item or product. It can be published by the manufacturer to help people choose products or to help use the products. By contrast, a technical specification is an explicit set of requirements to be satisfied by a material, product, or service. The ideal datasheet specifies characteristics in a formal structure, according to a strict taxonomy, that allows the information to be processed by a machine. Such machine readable descriptions can facilitate information retrieval, display, design, testing, interfacing, verification, system discovery, and e-commerce. Examples include Open Icecat data-sheets, transducer electronic data sheets for describing sensor characteristics, and Electronic device descriptions in CANopen or descriptions in markup languages, such as SensorML. 
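The idea of a machine-readable datasheet with a strict taxonomy, as described above, can be sketched as a simple structured record. Everything below, the field names, the hypothetical part, and the values, is illustrative and not taken from any real standard or component; it only shows how min/typ/max characteristics become machine-processable:

```python
import json

# Hypothetical machine-readable datasheet entry for a voltage regulator.
# Part number, field names and values are invented for illustration.
datasheet = {
    "manufacturer": "ExampleCo",
    "part_number": "EX1117-3.3",
    "characteristics": [
        {"name": "output_voltage", "unit": "V",
         "min": 3.235, "typ": 3.3, "max": 3.365},
        {"name": "dropout_voltage", "unit": "V",
         "typ": 1.1, "max": 1.3, "conditions": "I_out = 800 mA"},
    ],
}

def within_spec(sheet: dict, name: str, measured: float) -> bool:
    """Check a measurement against the datasheet's min/max engineering tolerances."""
    for ch in sheet["characteristics"]:
        if ch["name"] == name:
            return ch.get("min", float("-inf")) <= measured <= ch.get("max", float("inf"))
    raise KeyError(name)

print(within_spec(datasheet, "output_voltage", 3.30))   # True
print(json.dumps(datasheet["characteristics"][0]))      # machine-exchangeable form
```

Because the structure is formal rather than free text, the same record can drive automated testing, parametric search, and e-commerce listings, which is exactly the benefit the article attributes to formats like Open Icecat and SensorML.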
Product datasheet information A product data sheet (PDS), like any datasheet, has a different data model per category. It typically contains: Identifiers like manufacturer and manufacturer product code, GTIN Classification data, such as UNSPSC Descriptions such as marketing texts Specifications Product images Feature logos Reasons-to-buy Leaflets, typically as PDFs Manuals, typically in PDF Product videos, 3D objects, and other rich media assets In Open Icecat, the global open catalog or open content project in which hundreds of manufacturers and thousands of e-commerce sellers participate, the data models of tens of thousands of taxonomy classes are defined, and millions of free PDSs can be found conforming to these data-sheet data models. Material Safety Data Sheets A Material Safety Data Sheet (MSDS), Safety Data Sheet (SDS), or Product Safety Data Sheet (PSDS) is an important component of product stewardship and occupational safety and health. These are required by agenci
https://en.wikipedia.org/wiki/Internet%20access
Internet access is a facility or service that provides connectivity for a computer, a computer network, or other network device to the Internet, and for individuals or organizations to access or use applications such as email and the World Wide Web. Internet access is offered for sale by an international hierarchy of Internet service providers (ISPs) using various networking technologies. At the retail level, many organizations, including municipal entities, also provide cost-free access to the general public. Availability of Internet access to the general public began with the commercialization of the early Internet in the early 1990s, and has grown with the availability of useful applications, such as the World Wide Web. In 1995, only a small fraction of the world's population had access, with well over half of those living in the United States, and consumer use was through dial-up. By the first decade of the 21st century, many consumers in developed nations used faster broadband technology, and by 2014, 41 percent of the world's population had access, broadband was almost ubiquitous worldwide, and global average connection speeds exceeded one megabit per second. History The Internet developed from the ARPANET, which was funded by the US government to support projects within the government and at universities and research laboratories in the US – but grew over time to include most of the world's large universities and the research arms of many technology companies. Use by a wider audience only came in 1995, when restrictions on the use of the Internet to carry commercial traffic were lifted. In the early to mid-1980s, most Internet access was from personal computers and workstations directly connected to local area networks (LANs) or from dial-up connections using modems and analog telephone lines. LANs typically operated at 10 Mbit/s, while modem data-rates grew from 1200 bit/s in the early 1980s, to 56 kbit/s by the late 1990s.
Initially, dial-up connections were made from terminals or computers running terminal emulation software to terminal servers on LANs. These dial-up connections did not support end-to-end use of the Internet protocols and only provided terminal-to-host connections. The introduction of network access servers supporting the Serial Line Internet Protocol (SLIP) and later the Point-to-Point Protocol (PPP) extended the Internet protocols and made the full range of Internet services available to dial-up users, although more slowly because of the lower data rates available over dial-up. An important factor in the rapid rise of Internet access speed has been advances in MOSFET (MOS transistor) technology. The MOSFET, originally invented by Mohamed Atalla and Dawon Kahng in 1959, is the building block of the Internet telecommunications networks. The laser, originally demonstrated by Charles H. Townes and Arthur Leonard Schawlow in 1960, was adopted for MOS light wave systems around 1980, which led to exponential growth of Internet bandwidth.
https://en.wikipedia.org/wiki/Compis
Compis (COMPuter I Skolan, also a pun on the colloquial Swedish word kompis, meaning comrade or buddy) was a computer system intended for the general educational system in Sweden and sold to Swedish schools beginning in 1984 through the distributor Esselte Studium, which was also responsible for the software packages. The computers were also used in Danish, Finnish and Norwegian schools under the name Scandis. History In 1980, the ABC 80 used in the schools was regarded as becoming obsolete, and Styrelsen för teknisk utveckling (the board for technical development) was tasked with finding a replacement. In 1981, the procurement Tudis (Teknikupphandlingsprojekt Datorn i Skolan) was launched, and while the decision was controversial, Svenska Datorer AB was awarded the contract, with development beginning in 1982. After Svenska Datorer went bankrupt, production was transferred to TeliDatorer/Telenova under Televerket (Sweden). The computer was distributed by Esselte and exclusively marketed towards, and sold to, Swedish, Norwegian and Finnish schools, mainly at the high stage (years 7–9) and gymnasium level. The computer was based on the Intel 80186 CPU with CP/M-86 as the operating system in ROM (although it could also run MS-DOS from disk). The computer had a wide selection of ports, including one for a light pen. The Compis project was criticized from the start, and as the move to IBM PC compatibility came it was left behind and finally cancelled in 1988, although it remained in use well into the 1990s. Applications Notable applications run on the Compis in an educational environment included: COMAL interpreter Turbo Pascal 3.0 compiler, under the name Scandis-Pascal WordStar word processor Harmony software: word processing, spreadsheet and database. The name was a pun on Lotus Symphony, the dominant productivity software at the time. Some schools had simple local area networks of Compis/Scandis computers, in which 10–20 machines shared one hard disk with a typical capacity of 10 MB.
See also Education in Sweden Unisys ICON External links Compis Info: A site dedicated to the Compis Telenova Compis: some documentation available here (page in Swedish). References Nationalencyclopedins nätupplaga, "Compis" Swedish Internet museum Personal computers Goods manufactured in Sweden
https://en.wikipedia.org/wiki/Pan-American%20Highway
The Pan-American Highway is a network of roads stretching across the Americas and measuring tens of thousands of kilometres in total length. Except for a break across the border between northwest Colombia and southeast Panama, called the Darién Gap, the roads link almost all of the Pacific coastal countries of the Americas in a connected highway system. According to Guinness World Records, the Pan-American Highway is the world's longest "motorable road". It is only possible to cross by land between South America and Central America (from the last town in Colombia to the first outpost in Panama) by a difficult and dangerous hike of at least four days through the Darién Gap, one of the rainiest areas of the planet. The Pan-American Highway passes through many diverse climates and ecological types, ranging from dense jungles to arid deserts and barren tundra. Some areas are fully passable only during the dry season. The Pan-American Highway system is physically mostly complete and extends in de facto terms from Prudhoe Bay, Alaska, in North America, to the southernmost reaches of South America. Several southern highway termini are claimed, including the cities of Puerto Montt and Quellón in Chile, and Ushuaia in Argentina. West and north of the Darién Gap, this roadway is also known as the Inter-American Highway through Central America and Mexico. There it splits into several spurs leading to the Mexico–United States border. Concept of the highway The notion that there could and should be an inter-American highway, linking the nations of North, Central, and South America, is an idea originating from the United States. It was built in stages. The first, not long after one could drive across the United States on a paved road, was the highway from Laredo, Texas, to Mexico City. The second stage was the Inter-American Highway to Panama City; previously there were no roads, and little commerce, between most Central American countries.
There was no road between Costa Rica and Panama until, concerned about access to the Panama Canal in a war situation, the U.S. Army Corps of Engineers began a highway in 1941. The third stage, which has not been completed and may never be, continues onward to the southern tip of South America at Tierra del Fuego National Park, near Ushuaia, Argentina. Both Panama and Colombia, and environmentalists as well, are opposed to building a highway through the Darién Gap that separates the two continents. The Cuban proposal, forgotten today since nothing came of it, was to create a "circuito del Caribe" (Caribbean circuit). They would have expanded the highway to Puerto Juárez, México (Cancún), and from there by ferry to Pinar del Río, Cuba, from there by road to Havana, and by ferry again to Key West, Florida, and the Overseas Highway. The deterioration of relations between Cuba and the U.S. after the Cuban Revolution of 1959 ended talk of this project. Development and construction The concept of an overland route from one tip of the Ameri
https://en.wikipedia.org/wiki/Pan-American%20Highway%20%28North%20America%29
The Pan-American Highway route in North America is the portion of a network of roads that travels through the mainland nations of the Americas. No definitive length of the Pan-American Highway exists, because the Canadian government has never officially defined any specific route as being part of the Pan-American Highway, while in the United States, the Federal Highway Administration (FHWA) has designated the entire Interstate Highway System part of the Pan-American Highway System, although this has not yet been reinforced by any official highway signage. Mexico officially has many branches connecting to various interstate highways at the U.S. border. United States (Alaska) The Pan-American Highway unofficially begins in Prudhoe Bay, Alaska, near Deadhorse. Traveling south to Fairbanks, Alaska, the Highway follows the length of the Dalton Highway (Alaska Route 11) and Alaska Route 2. (The Dalton Highway was the subject of the first episode of the BBC's World's Most Dangerous Roads.) From Fairbanks, Alaska's third largest city, the Pan-American Highway and the Alaska Highway are one and the same, following Alaska Route 2 southeast to the Canada–United States border southeast of Northway, Alaska, and adjacent to the Tetlin National Wildlife Refuge. Note: The Pan-American Highway reenters the U.S., potentially in several locations along the U.S.–Canada border. Canada Yukon Crossing the border into Canada, Alaska Highway 2 turns into Yukon Highway 1. The first significant settlement along the way is Beaver Creek, Yukon. At Haines Junction, where it meets Yukon Highway 3, Yukon Highway 1 turns east toward Whitehorse, the capital of the Yukon Territory. Through most of Whitehorse, Yukon Highway 2 and Yukon Highway 1 share an alignment. Yukon Highway 1 cuts southeast toward Marsh Lake, Yukon, while Yukon Highway 2 cuts south to Skagway, Alaska.
Eventually, Yukon Highway 1 intersects with Yukon Highway 8 and Yukon Highway 7 at Jake's Corner, Yukon; the Pan-American Highway continues on Yukon 1 east-northeast from this junction. At Johnson's Crossing, Yukon Highway 1 meets Yukon Highway 6 and travels southeast through Teslin, Yukon. The Pan-American Highway continues on Yukon 1 as it crosses over into British Columbia (B.C.). After several miles, the Highway reenters the Yukon (once again as Highway 1) and continues southeast of Watson Lake until it, once again, enters British Columbia as B.C. Highway 97. British Columbia Shortly past the British Columbia–Yukon border, the Pan-American Highway reaches the first settlement in British Columbia at Lower Post. Continuing east, the highway briefly re-enters the Yukon once more. The Highway then re-enters British Columbia (as BC 97) for the final time. The Pan-American Highway continues south to southeast through a long uninhabited stretch until it passes through the villages of Fireside and Coal River, then runs east parallel to the Liard River. The Pan-Am
https://en.wikipedia.org/wiki/RSA%20Security
RSA Security LLC, formerly RSA Security, Inc. and trading as RSA, is an American computer and network security company with a focus on encryption and encryption standards. RSA was named after the initials of its co-founders, Ron Rivest, Adi Shamir and Leonard Adleman, after whom the RSA public key cryptography algorithm was also named. Among its products is the SecurID authentication token. The BSAFE cryptography libraries were also initially owned by RSA. RSA is known for incorporating backdoors developed by the NSA in its products. It also organizes the annual RSA Conference, an information security conference. Founded as an independent company in 1982, RSA Security was acquired by EMC Corporation in 2006 for US$2.1 billion and operated as a division within EMC. When EMC was acquired by Dell Technologies in 2016, RSA became part of the Dell Technologies family of brands. On 10 March 2020, Dell Technologies announced that it would sell RSA Security to a consortium led by Symphony Technology Group (STG), Ontario Teachers' Pension Plan Board (Ontario Teachers') and AlpInvest Partners (AlpInvest) for US$2.1 billion, the same price EMC had paid for it in 2006. RSA is based in Bedford, Massachusetts, with regional headquarters in Bracknell (UK) and Singapore, and numerous international offices. History Ron Rivest, Adi Shamir and Leonard Adleman, who developed the RSA encryption algorithm in 1977, founded RSA Data Security in 1982. The company acquired a "worldwide exclusive license" from the Massachusetts Institute of Technology to a patent on the RSA cryptosystem technology granted in 1983. In 1994, RSA campaigned against the Clipper chip during the Crypto Wars. In 1995, RSA sent a handful of people across the hall to found Digital Certificates International, better known as VeriSign. The company, then called Security Dynamics, acquired RSA Data Security in July 1996 and DynaSoft AB in 1997.
In January 1997, it proposed the first of the DES Challenges which led to the first public breaking of a message based on the Data Encryption Standard. In February 2001, it acquired Xcert International, Inc., a privately held company that developed and delivered digital certificate-based products for securing e-business transactions. In May 2001, it acquired 3-G International, Inc., a privately held company that developed and delivered smart card and biometric authentication products. In August 2001, it acquired Securant Technologies, Inc., a privately held company that produced ClearTrust, an identity management product. In December 2005, it acquired Cyota, a privately held Israeli company specializing in online security and anti-fraud solutions for financial institutions. In April 2006, it acquired PassMark Security. On September 14, 2006, RSA stockholders approved the acquisition of the company by EMC Corporation for $2.1 billion. In 2007, RSA acquired Valyd Software, a Hyderabad-based Indian company specializing in file and data secur
https://en.wikipedia.org/wiki/Liquid%20War
Liquid War is a free software multiplayer action game based on a particle flow mechanic. Thomas Colcombet developed the core concept and the original shortest-path algorithm, and the software was programmed by Christian Mauduit. Liquid War 6 is a GNU package distributed as free software and part of the GNU project. Gameplay Gameplay takes place on a 2D battlefield, usually with some obstacles. Each player (2 to 6, computer or human) has an army of particles and a cursor. The objective of the game is to assimilate all enemy particles. The players can only move their cursors and cannot directly control the particles. Each particle follows the shortest path around the obstacles to its team's cursor. A player may have several thousand particles at a time, giving the collection of particles the look of a liquid blob. When a particle moves into a particle from a different team, it will fight, and if the opponent particle fails to fight back (that is, it is not moving in the opposite direction) it will eventually be assimilated by its attacker. As particles cannot die but only change teams, the total number of particles on the map remains constant. Since a particle can only fight in one direction at a time (towards its team's cursor), a player that surrounds its opponents has a distinct advantage. The game ends when one player controls all of the particles or when the time runs out. When the time runs out, the player with the most particles wins. There are multiple maps, which determine the obstacles in the battlefield. These obstacles may affect the strategies of the game. Liquid War is a multiplayer game and can be played by up to 6 people on one computer, or over the Internet or a LAN. A single-player mode is available in which the opponents are controlled by the computer. The computer AI's "strategy" is to constantly choose a random point inside an enemy blob and move its cursor to it. History The shortest-path algorithm was invented by Thomas Colcombet before the game itself, in spring 1995.
The game came as a result of the algorithm, when Colcombet realized its applicability to gaming. Colcombet's friend, Christian Mauduit, enhanced the algorithm and coded the game. Liquid War 3.0 was released on 1 July 1995. It was a "barely usable" MS-DOS game with no network support. Version 5.0 was released on 26 September 1998. It was a complete rewrite and used the Allegro library. Network support was introduced in version 5.4.0, released on 7 July 2001. The current stable version of the 5.x series is 5.6.4, available for MS-DOS, Microsoft Windows, Mac OS X, Linux and FreeBSD. Its author, Christian Mauduit, has announced that a complete rewrite is in progress to produce version 6.0, which will abandon the Allegro library used for the 5.x releases in favor of a full OpenGL implementation. Version 6.0 is a part of the GNU project and was expected to be released in 2008. Version 0.0.7 beta, a testing version, was released in October 2009. Version 0.0.8 beta was released in 2010. Most parts of the game and engine are finished, and playing hotseat
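The particle-pathing mechanic described above can be sketched with a breadth-first search: a BFS from the cursor yields a distance field, and each particle steps to the neighboring free cell with the smallest distance, i.e. along a shortest path around obstacles. This is only an illustrative sketch with a made-up map; the actual gradient algorithm by Colcombet and Mauduit differs in its details.

```python
from collections import deque

# Toy map: '#' is an obstacle, '.' is free space.
grid = [
    "#######",
    "#.....#",
    "#.###.#",
    "#.....#",
    "#######",
]
cursor = (1, 5)  # (row, col) of the team's cursor

def distance_field(grid, cursor):
    """BFS from the cursor: shortest-path distance of every free cell."""
    dist = {cursor: 0}
    q = deque([cursor])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if grid[nr][nc] == "." and (nr, nc) not in dist:
                dist[(nr, nc)] = dist[(r, c)] + 1
                q.append((nr, nc))
    return dist

def step(particle, dist):
    """Move one cell toward the cursor by descending the distance field."""
    r, c = particle
    reachable = [n for n in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1))
                 if n in dist]
    return min(reachable, key=dist.get, default=particle)

dist = distance_field(grid, cursor)
print(step((3, 1), dist))  # a particle at (3,1) takes one step toward (1,5)
```

Because every particle consults the same precomputed field, moving thousands of particles per frame costs only one BFS per cursor move, which is what makes the "liquid" scale of the game feasible.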
https://en.wikipedia.org/wiki/James%20H.%20Wilkinson
James Hardy Wilkinson FRS (27 September 1919 – 5 October 1986) was a prominent figure in the field of numerical analysis, a field at the boundary of applied mathematics and computer science that is particularly useful to physics and engineering. Education Born in Strood, England, he won a Foundation Scholarship to Sir Joseph Williamson's Mathematical School in Rochester. He studied the Cambridge Mathematical Tripos at Trinity College, Cambridge, where he graduated as Senior Wrangler. Career Taking up war work in 1940, he began working on ballistics but transferred to the National Physical Laboratory in 1946, where he worked with Alan Turing on the ACE computer project. Later, Wilkinson's interests took him into the numerical analysis field, where he discovered many significant algorithms. Awards and honours Wilkinson received the Turing Award in 1970 "for his research in numerical analysis to facilitate the use of the high-speed digital computer, having received special recognition for his work in computations in linear algebra and 'backward' error analysis." In the same year, he also gave the Society for Industrial and Applied Mathematics (SIAM) John von Neumann Lecture. Wilkinson also received an Honorary Doctorate from Heriot-Watt University in 1973. He was elected a Distinguished Fellow of the British Computer Society in 1974 for his pioneering work in computer science. The James H. Wilkinson Prize in Numerical Analysis and Scientific Computing, established in 1982 by SIAM, and the J. H. Wilkinson Prize for Numerical Software, established in 1991, are named in his honour. In 1987, Wilkinson posthumously won the Chauvenet Prize of the Mathematical Association of America for his paper "The Perfidious Polynomial". Personal life Wilkinson married Heather Ware in 1945. He died at home of a heart attack on 5 October 1986. His wife and their son survived him, a daughter having predeceased him. Selected works Rounding Errors in Algebraic Processes (REAP). Reprinted by SIAM in 2023, ISBN 978-1-61197-751-6.
The Algebraic Eigenvalue Problem (AEP). With Christian Reinsch: Handbook for Automatic Computation, Volume II, Linear Algebra, Springer-Verlag, 1971. The Perfidious Polynomial. In: Studies in Numerical Analysis, pp. 1–28, MAA Stud. Math., 24, Math. Assoc. America, Washington, DC, 1984 References External links 1919 births 1986 deaths 20th-century British mathematicians British computer scientists Turing Award laureates Alumni of Trinity College, Cambridge Fellows of the British Computer Society Fellows of the Royal Society People from Strood People educated at Sir Joseph Williamson's Mathematical School Senior Wranglers Numerical analysts Scientists of the National Physical Laboratory (United Kingdom)
https://en.wikipedia.org/wiki/Charles%20Bachman
Charles William Bachman III (December 11, 1924 – July 13, 2017) was an American computer scientist, who spent his entire career as an industrial researcher, developer, and manager rather than in academia. He was particularly known for his work in the early development of database management systems. His techniques of layered architecture include his namesake Bachman diagrams. Biography Charles Bachman was born in Manhattan, Kansas, in 1924, where his father, Charles Bachman Jr., was the head football coach at Kansas State College. He attended high school in East Lansing, Michigan, where his father served as head football coach at Michigan State College from 1933 to 1946. During World War II he joined the United States Army and spent March 1944 through February 1946 in the South West Pacific Theater, serving in the Anti-Aircraft Artillery Corps in New Guinea, Australia, and the Philippine Islands. There he was first exposed to and used fire-control computers for aiming 90 mm guns. After his discharge in 1946 he attended Michigan State College and graduated in 1948 with a bachelor's degree in mechanical engineering; he was a member of Tau Beta Pi there. In mid-1949 he married Connie Hadley. He then attended the University of Pennsylvania. In 1950, he graduated with a master's degree in mechanical engineering, and had also completed three-quarters of the requirements for an MBA from the university's Wharton School of Business. Bachman died on July 13, 2017, at his home in Lexington, Massachusetts, of Parkinson's disease at the age of 92. Work Bachman spent his entire career as a practicing software engineer or manager in industry rather than in academia. In 1950 he started working at Dow Chemical in Midland, Michigan. In 1957 he became Dow's first data processing manager. He worked with the IBM user group SHARE on developing a new version of report generator software, which became known as 9PAC. However, the planned IBM 709 order was cancelled before it arrived.
In 1960 he joined General Electric, where by 1963 he developed the Integrated Data Store (IDS), one of the first database management systems using what came to be known as the navigational database model, in the Manufacturing Information And Control System (MIACS) product. Working for customer Weyerhaeuser Lumber, he developed the first multiprogramming network access to the IDS database, an early online transaction processing system called WEYCOS in 1965. Later at GE he developed the "dataBasic" product that offered database support to Basic language timesharing users. In 1970, GE sold its computer business to Honeywell Information Systems, so he and his family moved from Phoenix, Arizona to Lexington, Massachusetts. In 1981, he joined a smaller firm, Cullinane Information Systems (later Cullinet), which offered a version of IDS that was called IDMS and supported IBM mainframes. Bachman Information Systems In 1983, he founded Bachman Information Systems, which developed a line of compu
https://en.wikipedia.org/wiki/Ecological%20footprint
The ecological footprint is a method promoted by the Global Footprint Network to measure human demand on natural capital, i.e. the quantity of nature it takes to support people and their economies. It tracks this demand through an ecological accounting system. The accounts contrast the biologically productive area people use for their consumption to the biologically productive area available within a region, nation, or the world (biocapacity, the productive area that can regenerate what people demand from nature). In short, it is a measure of human impact on the environment and whether that impact is sustainable. Footprint and biocapacity can be compared at the individual, regional, national or global scale. Both footprint and demands on biocapacity change every year with number of people, per person consumption, efficiency of production, and productivity of ecosystems. At a global scale, footprint assessments show how big humanity's demand is compared to what Earth can renew. Global Footprint Network estimates that, as of 2019, humanity has been using natural capital 75% faster than Earth can renew it, which they describe as meaning humanity's ecological footprint corresponds to 1.75 planet Earths. This overuse is called ecological overshoot. Ecological footprint analysis is widely used around the world in support of sustainability assessments. It enables people to measure and manage the use of resources throughout the economy and explore the sustainability of individual lifestyles, goods and services, organizations, industry sectors, neighborhoods, cities, regions, and nations. Overview The first academic publication about ecological footprints was written by William Rees in 1992. The ecological footprint concept and calculation method was developed as the PhD dissertation of Mathis Wackernagel, under Rees' supervision at the University of British Columbia in Vancouver, Canada, from 1990 to 1994. 
Originally, Wackernagel and Rees called the concept "appropriated carrying capacity". To make the idea more accessible, Rees came up with the term "ecological footprint", inspired by a computer technician who praised his new computer's "small footprint on the desk". In 1996, Wackernagel and Rees published the book Our Ecological Footprint: Reducing Human Impact on the Earth. The simplest way to define an ecological footprint is the amount of environmental resources necessary to produce the goods and services that support an individual's lifestyle, a nation's prosperity, or the economic activity of humanity as a whole. The model is a means of comparing lifestyles, per capita consumption, and population numbers, and checking these against biocapacity. The tool can inform policy by examining to what extent a nation uses more (or less) than is available within its territory, or to what extent the nation's lifestyle and population density would be replicable worldwide. The footprint can be a useful tool to educate people about overconsumption and ov
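The overshoot figure quoted above reduces to a simple ratio: footprint (demand) divided by biocapacity (supply), both expressed in global hectares. The per-person numbers below are illustrative placeholders chosen to reproduce the 1.75-Earths result, not official Global Footprint Network data.

```python
# Illustrative overshoot calculation: demand / supply in global
# hectares (gha) per person. Both figures are hypothetical.
footprint_gha_per_person = 2.8    # assumed world-average demand
biocapacity_gha_per_person = 1.6  # assumed world-average supply
earths_needed = footprint_gha_per_person / biocapacity_gha_per_person
print(f"Humanity would need {earths_needed:.2f} planet Earths")
```

The same ratio can be formed for a single country or city by substituting that region's footprint and biocapacity accounts.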
https://en.wikipedia.org/wiki/Fork%20%28software%20development%29
In software engineering, a project fork happens when developers take a copy of source code from one software package and start independent development on it, creating a distinct and separate piece of software. The term often implies not merely a development branch, but also a split in the developer community; as such, it is a form of schism. Grounds for forking include varying user preferences and stagnated or discontinued development of the original software. Free and open-source software is that which, by definition, may be forked from the original development team without prior permission, and without violating copyright law. However, licensed forks of proprietary software (e.g. Unix) also happen. Etymology The word "fork" has been used to mean "to divide in branches, go separate ways" as early as the 14th century. In the software environment, the word evokes the fork system call, which causes a running process to split itself into two (almost) identical copies that (typically) diverge to perform different tasks. In the context of software development, "fork" was used in the sense of creating a revision control "branch" by Eric Allman as early as 1980, in the context of the Source Code Control System. The term was in use on Usenet by 1983 for the process of creating a subgroup to move topics of discussion to. "Fork" is not known to have been used in the sense of a community schism during the origins of Lucid Emacs (now XEmacs) (1991) or the Berkeley Software Distributions (BSDs) (1993–1994); Russ Nelson used the term "shattering" for this sort of fork in 1993, attributing it to John Gilmore. However, "fork" was in use in the present sense by 1995 to describe the XEmacs split, and was an understood usage in the GNU Project by 1996.
Forking of free and open-source software Free and open-source software may be legally forked without prior approval of those currently developing, managing, or distributing the software, per both The Free Software Definition and The Open Source Definition. In free software, forks often result from a schism over different goals or personality clashes. In a fork, both parties assume nearly identical code bases, but typically only the larger group, or whoever controls the web site, will retain the full original name and the associated user community. Thus, there is a reputation penalty associated with forking. The relationship between the different teams can be cordial or very bitter. On the other hand, a friendly fork or a soft fork is a fork that does not intend to compete, but wants to eventually merge with the original. Eric S. Raymond, in his essay Homesteading the Noosphere, stated that "The most important characteristic of a fork is that it spawns competing projects that cannot later exchange code, splitting the potential developer community". He makes a similar point in the Jargon File. David A. Wheeler notes four possible outcomes of a fork, with examples: The death of the fork. This is by far the most common case. It is ea
https://en.wikipedia.org/wiki/Fork%20%28system%20call%29
In computing, particularly in the context of the Unix operating system and its workalikes, fork is an operation whereby a process creates a copy of itself. It is an interface which is required for compliance with the POSIX and Single UNIX Specification standards. It is usually implemented as a C standard library wrapper to the fork, clone, or other system calls of the kernel. Fork is the primary method of process creation on Unix-like operating systems. Overview In multitasking operating systems, processes (running programs) need a way to create new processes, e.g. to run other programs. Fork and its variants are typically the only way of doing so in Unix-like systems. For a process to start the execution of a different program, it first forks to create a copy of itself. Then, the copy, called the "child process", calls the exec system call to overlay itself with the other program: it ceases execution of its former program in favor of the other. The fork operation creates a separate address space for the child. The child process has an exact copy of all the memory segments of the parent process. In modern UNIX variants that follow the virtual memory model from SunOS-4.0, copy-on-write semantics are implemented and the physical memory need not be actually copied. Instead, virtual memory pages in both processes may refer to the same pages of physical memory until one of them writes to such a page: then it is copied. This optimization is important in the common case where fork is used in conjunction with exec to execute a new program: typically, the child process performs only a small set of actions before it ceases execution of its program in favour of the program to be started, and it requires very few, if any, of its parent's data structures. When a process calls fork, it is deemed the parent process and the newly created process is its child. 
After the fork, both processes not only run the same program, but they resume execution as though both had called the system call. They can then inspect the call's return value to determine their status, child or parent, and act accordingly. History One of the earliest references to a fork concept appeared in A Multiprocessor System Design by Melvin Conway, published in 1962. Conway's paper motivated the implementation by L. Peter Deutsch of fork in the GENIE time-sharing system, where the concept was borrowed by Ken Thompson for its earliest appearance in Research Unix. Fork later became a standard interface in POSIX. Communication The child process starts off with a copy of its parent's file descriptors. For interprocess communication, the parent process will often create one or several pipes, and then after forking the processes will close the ends of the pipes that they don't need. Variants Vfork Vfork is a variant of fork with the same calling convention and much the same semantics, but only to be used in restricted situations. It originated in the 3BSD version of Unix, the first Unix to suppo
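The fork-then-inspect-return-value pattern, combined with the pipe setup just described, can be sketched on a Unix-like system. The message contents are made up for the example; the structure (fork, close unused pipe ends, communicate, reap the child) follows the text above.

```python
import os

# Parent creates a pipe, then forks. fork() returns 0 in the child and
# the child's PID in the parent; each copy uses that value to learn its
# role, exactly as described above.
r, w = os.pipe()
pid = os.fork()
if pid == 0:                        # child: fork returned 0
    os.close(r)                     # child only writes
    os.write(w, b"hello from child")
    os._exit(0)                     # child skips the parent-only code below
os.close(w)                         # parent only reads
msg = os.read(r, 64)
os.waitpid(pid, 0)                  # reap the child to avoid a zombie
print(msg.decode())                 # prints "hello from child"
```

In the common fork/exec idiom, the child would call one of the `os.exec*` functions instead of writing to the pipe, replacing its program image entirely.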
https://en.wikipedia.org/wiki/M-expression
In computer programming, M-expressions (or meta-expressions) were an early proposed syntax for the Lisp programming language, inspired by contemporary languages such as Fortran and ALGOL. The notation was never implemented into the language and, as such, it was never finalized. Compared to S-expressions, M-expressions introduce function notation, infix operators, and shorthands for conditional expressions and lists into the language. Background John McCarthy published the first paper on Lisp in 1960 while a research fellow at the Massachusetts Institute of Technology. In it he described a language of symbolic expressions (S-expressions) that could represent complex structures as lists. Then he defined a set of primitive operations on the S-expressions, and a language of meta-expressions (M-expressions) that could be used to define more complex operations. Finally, he showed how the meta-language itself could be represented with S-expressions, resulting in a system that was potentially self-hosting. The draft version of this paper is known as "AI Memo 8". McCarthy had planned to develop an automatic Lisp compiler (LISP 2) using M-expressions as the language syntax and S-expressions to describe the compiler's internal processes. Stephen B. Russell read the paper and suggested to him that S-expressions were a more convenient syntax. Although McCarthy disapproved of the idea, Russell and colleague Daniel J. Edwards hand-coded an interpreter program that could execute S-expressions. This program was adopted by McCarthy's research group, establishing S-expressions as the dominant form of Lisp. McCarthy reflected on the fate of M-expressions in 1979. Implementations A form of sugared M-expressions has been implemented in the Wolfram language of Wolfram Mathematica since 1988. For LISP MLisp was a contemporary (1968–1973) project to implement an M-expression-like frontend for Lisp. A few extra features like hygienic macros, pattern matching, and backtracking were incorporated.
It eventually evolved into an abandoned LISP70 draft. M-LISP (MetaLISP) from 1989 was another attempt to blend M-expressions with Scheme. A parser for the "AI Memo 8" M-expressions is available in Common Lisp, but the author intends it as a case against M-expressions due to their perceived inability to cope with macros. For K The K programming language also includes M-expressions, in addition to the more terse notation of the APL tradition. Further development CGOL (1977) was implemented in MacLisp and follows a similar goal of introducing Algol-like syntax with infix operators. It is known to work on Armed Bear Common Lisp. A more recent (circa 2003) variant is the I-expression, which uses indentation to indicate parentheses implicitly and is thus in some ways intermediate between S-expressions and M-expressions. I-expressions were introduced in Scheme Request For Implementation 49 as an auxiliary syntax for Scheme, but they have not been widely adopted. A further
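As an illustration of the notation described above, a few M-expressions are shown here alongside their S-expression equivalents. This is a sketch following the "AI Memo 8" conventions (square brackets and semicolons for function application, arrows for conditional clauses), not the output of any implementation:

```
M-expression                         S-expression equivalent
car[x]                               (car x)
cons[car[x]; cdr[x]]                 (cons (car x) (cdr x))
[atom[x] -> x; T -> ff[car[x]]]      (cond ((atom x) x) (T (ff (car x))))
```

The last line is the conditional form from McCarthy's paper: clauses are tried left to right, with T serving as the catch-all case.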
https://en.wikipedia.org/wiki/Robert%20W.%20Floyd
Robert W. Floyd (June 8, 1936 – September 25, 2001) was a computer scientist. His contributions include the design of the Floyd–Warshall algorithm (independently of Stephen Warshall), which efficiently finds all shortest paths in a graph, and his work on parsing; Floyd's cycle-finding algorithm for detecting cycles in a sequence is also attributed to him. In one isolated paper he introduced the important concept of error diffusion for rendering images, also called Floyd–Steinberg dithering (though he distinguished dithering from diffusion). He pioneered the field of program verification using logical assertions with the 1967 paper Assigning Meanings to Programs. This was a contribution to what later became Hoare logic. Floyd received the Turing Award in 1978. Life Born in New York City, Floyd finished high school at age 14. At the University of Chicago, he received a Bachelor of Arts (B.A.) in liberal arts in 1953 (when still only 17) and a second bachelor's degree in physics in 1958. Floyd was a college roommate of Carl Sagan. Floyd became a staff member of the Armour Research Foundation (now IIT Research Institute) at Illinois Institute of Technology in the 1950s. Becoming a computer operator in the early 1960s, he began publishing many papers, including on compilers (particularly parsing). He was a pioneer of operator-precedence grammars, and is credited with initiating the field of programming language semantics. He was appointed an associate professor at Carnegie Mellon University by the time he was 27 and became a full professor at Stanford University six years later. He obtained this position without a Doctor of Philosophy (Ph.D.) degree. He was a member of the International Federation for Information Processing (IFIP) Working Group 2.1 on Algorithmic Languages and Calculi, which specified, maintains, and supports the programming languages ALGOL 60 and ALGOL 68. He was elected a Fellow of the American Academy of Arts and Sciences in 1974. 
He received the Turing Award in 1978 "for having a clear influence on methodologies for the creation of efficient and reliable software, and for helping to found the following important subfields of computer science: the theory of parsing, the semantics of programming languages, automatic program verification, automatic program synthesis, and analysis of algorithms". Floyd worked closely with Donald Knuth, in particular as the major reviewer for Knuth's seminal book The Art of Computer Programming, and is the person most cited in that work. He was co-author, with Richard Beigel, of the textbook The Language of Machines: an Introduction to Computability and Formal Languages. Floyd supervised seven Ph.D. graduates. Floyd married and divorced twice, first to Jana M. Mason and then to computer scientist Christiane Floyd, and he had four children. In his last years he suffered from Pick's disease, a neurodegenerative disease, and thus retired early in 1994. His hobbies included hiking, an
https://en.wikipedia.org/wiki/John%20Cocke%20%28computer%20scientist%29
John Cocke (May 30, 1925 – July 16, 2002) was an American computer scientist recognized for his large contribution to computer architecture and optimizing compiler design. He is considered by many to be "the father of RISC architecture." Biography He was born in Charlotte, North Carolina, US. He attended Duke University, where he received his bachelor's degree in mechanical engineering in 1946 and his Ph.D. in mathematics in 1956. Cocke spent his entire career as an industrial researcher for IBM, from 1956 to 1992. Perhaps the project where his innovations were most noted was the IBM 801 minicomputer, where he realized that matching the design of the architecture's instruction set to the relatively simple instructions actually emitted by compilers could allow high performance at a low cost. He is one of the inventors of the CYK algorithm (C for Cocke). He was also involved in the pioneering speech recognition and machine translation work at IBM in the 1970s and 1980s, and is credited by Frederick Jelinek with originating the idea of using a trigram language model for speech recognition. Cocke was appointed IBM Fellow in 1972. He won the Eckert-Mauchly Award in 1985, the ACM Turing Award in 1987, the National Medal of Technology in 1991, the National Medal of Science in 1994, the IEEE John von Neumann Medal in 1994, The Franklin Institute's Certificate of Merit in 1996, the Seymour Cray Computer Engineering Award in 1999, and The Benjamin Franklin Medal in 2000. He was a member of the American Academy of Arts and Sciences, the American Philosophical Society, and the National Academy of Sciences. In 2002, he was made a Fellow of the Computer History Museum "for his development and implementation of reduced instruction set computer architecture and program optimization technology." He died in Valhalla, New York, US. 
References External links IBM obituary Duke profile from 1988 by Eileen Bryn Interview transcript IEEE John von Neumann Medal Recipients
https://en.wikipedia.org/wiki/Original%20programming
Original programming (also called originals or original programs, and subcategorized as "original series", "original movies", "original documentaries" and "original specials") is a term used for in-house television, film or web series productions whose exclusive domestic broadcast rights (and, if the originating service operates versions of the service outside its home country, international broadcast rights) are held by traditional and over-the-top content providers. The term was coined by HBO in 1983, when the premium service began producing its slate of in-house series and film productions. HBO initially branded the original series on the network as "HBOriginal" until 1986, and by 1993, the "originals" term had expanded to encompass most of its original productions. The term eventually expanded into use by various cable-originated television networks (including, among others, Disney Channel, TNT and USA Network) to identify their in-house productions and to advertise them as being distinct from the acquired content offered to fill out the remainder of their programming schedule. Most original made-for-cable or made-for-streaming productions are produced in conjunction with independent production companies that also hold day-to-day management responsibilities for the program, although some series (such as The Larry Sanders Show, Queer as Folk, The Leftovers and Power) share financial interests with major television studios—such as 20th Television, Warner Bros. Television and Lionsgate Television—that may also handle distribution responsibilities for domestic and international syndication on behalf of the originating network. Television networks and digital content providers that produce original programming include Cinemax, Netflix, Showtime, Amazon Prime, and YouTube. In October 2020, CBS became the first U.S. 
broadcast network to identify all of its entertainment programming under the term, branding them as "CBS Originals"; however, the network uses the "original" term for both series produced by sister production company CBS Studios and series produced by third-party production companies. (ABC had previously marketed the 2016 miniseries Madoff and the 2019 made-for-TV movie Same Time, Next Christmas as "ABC Originals".) See also Streaming media Video on demand References
https://en.wikipedia.org/wiki/Natural%20language%20generation
Natural language generation (NLG) is a software process that produces natural language output. A widely cited survey of NLG methods describes NLG as "the subfield of artificial intelligence and computational linguistics that is concerned with the construction of computer systems that can produce understandable texts in English or other human languages from some underlying non-linguistic representation of information". While it is widely agreed that the output of any NLG process is text, there is some disagreement about whether the inputs of an NLG system need to be non-linguistic. Common applications of NLG methods include the production of various reports, for example weather and patient reports; image captions; and chatbots. Automated NLG can be compared to the process humans use when they turn ideas into writing or speech. Psycholinguists prefer the term language production for this process, which can also be described in mathematical terms, or modeled in a computer for psychological research. NLG systems can also be compared to translators of artificial computer languages, such as decompilers or transpilers, which also produce human-readable code generated from an intermediate representation. Human languages tend to be considerably more complex and allow for much more ambiguity and variety of expression than programming languages, which makes NLG more challenging. NLG may be viewed as complementary to natural-language understanding (NLU): whereas in natural-language understanding, the system needs to disambiguate the input sentence to produce the machine representation language, in NLG the system needs to make decisions about how to put a representation into words. The practical considerations in building NLU vs. NLG systems are not symmetrical. NLU needs to deal with ambiguous or erroneous user input, whereas the ideas the system wants to express through NLG are generally known precisely. 
NLG needs to choose a specific, self-consistent textual representation from many potential representations, whereas NLU generally tries to produce a single, normalized representation of the idea expressed. NLG has existed since ELIZA was developed in the mid-1960s, but the methods were first used commercially in the 1990s. NLG techniques range from simple template-based systems like a mail merge that generates form letters, to systems that have a complex understanding of human grammar. NLG can also be accomplished by training a statistical model using machine learning, typically on a large corpus of human-written texts. Example The Pollen Forecast for Scotland system is an example of a simple NLG system that is essentially a template. This system takes as input six numbers, which give predicted pollen levels in different parts of Scotland. From these numbers, the system generates a short textual summary of pollen levels as its output. For example, using the historical data for July 1, 2005, the software produces: Grass pollen levels fo
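A generator in the template-based style just described can be sketched in Python. The region names, thresholds, and phrasing here are invented for illustration and are not those of the actual Pollen Forecast for Scotland system:

```python
# Minimal sketch of a template-based NLG system in the spirit of the
# pollen-forecast example. Regions, thresholds, and wording are
# hypothetical stand-ins, not the real system's parameters.

def describe(level):
    """Map a numeric pollen level (0-10) to a verbal category."""
    if level < 3:
        return "low"
    if level < 6:
        return "moderate"
    return "high"

def pollen_report(levels):
    """Generate a short textual summary from per-region pollen levels.

    `levels` maps region name -> predicted pollen level (0-10).
    """
    worst_region = max(levels, key=levels.get)
    worst = levels[worst_region]
    parts = [f"Grass pollen levels are {describe(worst)} "
             f"at up to {worst} in {worst_region}."]
    low_regions = [r for r, v in levels.items() if v < 3]
    if low_regions:
        parts.append("Levels remain low in " + ", ".join(low_regions) + ".")
    return " ".join(parts)

print(pollen_report({"north": 2, "central": 4, "south": 7}))
# prints: Grass pollen levels are high at up to 7 in south. Levels remain low in north.
```

Such systems fill slots in fixed sentence patterns from the input data; the "complex understanding of human grammar" end of the spectrum would instead vary syntax, aggregation, and word choice.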
https://en.wikipedia.org/wiki/Timeline%20of%20audio%20formats
An audio format is a medium for sound recording and reproduction. The term is applied to both the physical recording media and the recording formats of the audio content—in computer science it is often limited to the audio file format, but its wider use usually refers to the physical method used to store the data. A note on the use of analog compared to digital in this list: the definition of digital used here for early formats is representation using discrete values rather than fluctuating variables. A piano roll is digital as it has discrete values (a hole for each key), unlike a phonograph record, which is analog, with a fluctuating groove. Music is recorded and distributed using a variety of audio formats, some of which store additional information. Timeline of audio format developments See also Timeline of video formats Format war Audio data compression References External links History of Recording Technologies Museum Of Obsolete Media – Audio Formats
https://en.wikipedia.org/wiki/Dive%20computer
A dive computer, personal decompression computer or decompression meter is a device used by an underwater diver to measure the elapsed time and depth during a dive and use this data to calculate and display an ascent profile which, according to the programmed decompression algorithm, will give a low risk of decompression sickness. Most dive computers use real-time ambient pressure input to a decompression algorithm to indicate the remaining time to the no-stop limit, and after that has passed, the minimum decompression required to surface with an acceptable risk of decompression sickness. Several algorithms have been used, and various personal conservatism factors may be available. Some dive computers allow for gas switching during the dive. Audible alarms may be available to warn the diver when exceeding the no-stop limit, the maximum operating depth for the gas mixture, the recommended ascent rate or other limit beyond which risk increases significantly. The display provides data to allow the diver to avoid decompression, or to decompress relatively safely, and includes depth and duration of the dive. Several additional functions and displays may be available for interest and convenience, such as water temperature and compass direction, and it may be possible to download the data from the dives to a personal computer via cable or wireless connection. Data recorded by a dive computer may be of great value to the investigators in a diving accident, and may allow the cause of an accident to be discovered. Dive computers may be wrist-mounted or fitted to a console with the submersible pressure gauge. A dive computer is perceived by recreational scuba divers and service providers to be one of the most important items of safety equipment. Use by professional scuba divers is also common, but use by surface-supplied divers is less widespread, as the diver's depth is monitored at the surface by pneumofathometer and decompression is controlled by the diving supervisor. 
Purpose The primary purpose of a decompression computer is to facilitate safe decompression by an underwater diver breathing a suitable gas at ambient pressure, by providing information based on the recent pressure exposure history of the diver that allows an ascent with acceptably low risk of developing decompression sickness. Dive computers address the same problem as decompression tables, but are able to perform a continuous calculation of the partial pressure of inert gases in the body based on the actual depth and time profile of the diver. As the dive computer automatically measures depth and time, it is able to warn of excessive ascent rates and missed decompression stops and the diver has less reason to carry a separate dive watch and depth gauge. Many dive computers also provide additional information to the diver including air and water temperature, data used to help prevent oxygen toxicity, a computer-readable dive log, and the pressure of the remaining breathing gas in th
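Dive computers typically model the continuous inert-gas calculation described above with tissue compartments that load and unload exponentially. The Python sketch below shows the core update for a single compartment; the half-time, constants, and dive profile are illustrative assumptions only, not any manufacturer's certified algorithm, and are unsuitable for real dive planning:

```python
import math

# Hedged sketch of exponential (Haldanean-style) inert-gas uptake.
# All parameters are illustrative placeholders. Do not use for diving.

SURFACE_PRESSURE = 1.0   # ambient pressure at the surface, in bar
N2_FRACTION = 0.79       # inert-gas (nitrogen) fraction of air

def ambient_pressure(depth_m):
    """Ambient pressure in bar at a given seawater depth (10 m ~ 1 bar)."""
    return SURFACE_PRESSURE + depth_m / 10.0

def tissue_loading(p_tissue, depth_m, minutes, half_time):
    """Update one tissue compartment's inert-gas tension (bar).

    P(t) = P_alv + (P0 - P_alv) * exp(-k * t),  with  k = ln 2 / half-time
    """
    p_alv = N2_FRACTION * ambient_pressure(depth_m)
    k = math.log(2) / half_time
    return p_alv + (p_tissue - p_alv) * math.exp(-k * minutes)

# A 20-minute stay at 30 m, tracked in a fast (5-minute) compartment:
p = N2_FRACTION * SURFACE_PRESSURE      # start saturated at the surface
p = tissue_loading(p, 30.0, 20.0, 5.0)
print(round(p, 3))  # prints 3.012
```

A real dive computer repeats an update like this every few seconds across many compartments with different half-times, then compares each tissue tension against depth-dependent limits to derive no-stop times and decompression ceilings.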
https://en.wikipedia.org/wiki/Curry%20%28programming%20language%29
Curry is an experimental functional logic programming language, based on the Haskell language. It merges elements of functional and logic programming, including constraint programming integration. It is nearly a superset of Haskell, lacking support mostly for overloading using type classes, which some implementations provide anyway as a language extension, such as the Münster Curry Compiler. Foundations of functional logic programming Basic concepts A functional program is a set of functions defined by equations or rules. A functional computation consists of replacing subexpressions by equal (with regard to the function definitions) subexpressions until no more replacements (or reductions) are possible and a value or normal form is obtained. For instance, consider the function double defined by double x = x+x The expression “double 1” is replaced by 1+1. The latter can be replaced by 2 if we interpret the operator “+” to be defined by an infinite set of equations, e.g., 1+1 = 2, 1+2 = 3, etc. In a similar way, one can evaluate nested expressions (where the subexpressions to be replaced are quoted): 'double (1+2)' → '(1+2)'+(1+2) → 3+'(1+2)' → '3+3' → 6 There is also another order of evaluation if we replace the arguments of operators from right to left: 'double (1+2)' → (1+2)+'(1+2)' → '(1+2)'+3 → '3+3' → 6 In this case, both derivations lead to the same result, a property known as confluence. This follows from a fundamental property of pure functional languages, termed referential transparency: the value of a computed result does not depend on the order or time of evaluation, due to the absence of side effects. This simplifies reasoning about, and maintaining, pure functional programs. As many functional languages like Haskell do, Curry supports the definition of algebraic data types by enumerating their constructors. 
For instance, the type of Boolean values consists of the constructors True and False that are declared as follows: data Bool = True | False Functions on Booleans can be defined by pattern matching, i.e., by providing several equations for different argument values: not True = False not False = True The principle of replacing equals by equals is still valid provided that the actual arguments have the required form, e.g.: not '(not False)' → 'not True' → False More complex data structures can be obtained by recursive data types. For instance, a list of elements, where the type of elements is arbitrary (denoted by the type variable a), is either the empty list “[]” or the non-empty list “x:xs” consisting of a first element x and a list xs: data List a = [] | a : List a The type “List a” is usually written as [a] and finite lists x1:x2:...:xn:[] are written as [x1,x2,...,xn]. We can define operations on recursive types by inductive definitions where pattern matching supports the convenient separation of the different cases. For instance, the concatenation operation “++” on polymorphic lists can be defined as follows (the optional type declaration in the first line specifies th
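The inductive, pattern-matching style of definition described above can be mirrored in Python (an illustrative sketch; Curry's actual evaluation is based on rewriting and narrowing, which plain Python does not model):

```python
def conc(xs, ys):
    """List concatenation defined by cases, mirroring the two equations
         conc []         ys = ys
         conc (x : rest) ys = x : conc rest ys
    """
    if not xs:                     # corresponds to the "[]" equation
        return ys
    x, rest = xs[0], xs[1:]        # corresponds to the "x : rest" pattern
    return [x] + conc(rest, ys)

print(conc([1, 2], [3, 4]))  # prints [1, 2, 3, 4]
```

Each recursive call peels one constructor off the first argument, so the definition terminates for every finite list, exactly as the case analysis on [] and x:xs guarantees in the functional setting.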
https://en.wikipedia.org/wiki/Commodore%2016
The Commodore 16 is a home computer made by Commodore International with a 6502-compatible 7501 or 8501 CPU, released in 1984 and intended to be an entry-level computer to replace the VIC-20. A cost-reduced version, the Commodore 116, was mostly sold in Europe. The C16 and C116 belong to the same family as the higher-end Plus/4 and are internally very similar to it (albeit with less RAM – 16 KB rather than 64 KB – and lacking the Plus/4's user port and Three-Plus-One software). Software is generally compatible among all three provided it can fit within the C16's smaller RAM and does not utilize the user port on the Plus/4. While the C16 was a failure on the US market, it enjoyed some success in certain European countries and Mexico. Intention The C16 was intended to compete with other sub-$100 computers from Timex Corporation, Mattel, and Texas Instruments (TI). Timex's and Mattel's computers were less expensive than the VIC-20, and although the VIC-20 offered better expandability, a full-travel keyboard, and in some cases more memory, the C16 offered a chance to improve upon those advantages. The TI-99/4A was priced in-between Commodore's VIC-20 and Commodore 64, and is somewhat between them in capability, but TI was lowering its prices. On paper, the C16 was a closer match for the TI-99/4A than the aging VIC-20. Commodore president Jack Tramiel feared that one or more Japanese companies would introduce a consumer-oriented computer and undercut everyone's prices. Although Japanese companies would soon dominate the U.S. video game console market, their feared dominance of the home computer field never materialized. Additionally, Timex, Mattel, and TI departed the computer market before the C16 was released. Description Outwardly the C16 resembles the VIC-20 and the Commodore 64, but with a dark-gray or dark-brown case and light-gray keys. 
The keyboard layout differs slightly from the earlier models, adding an escape key and four cursor keys replacing the shifted-key arrangement the C-64 and VIC inherited from the PET series. The C16 is in some respects faster than the Commodore 64 and VIC-20; the processor runs at a speed roughly 75% faster, and the BASIC interpreter contains dedicated graphics commands, making drawing images considerably faster. The system was designed around the TED chip which included NTSC and PAL video, sound and DRAM refresh functionality. Though according to the designer it "was supposed to be as close to a single-chip computer as we could get in the 1980s," the CPU, RAM, ROM and some glue logic were still on their own separate chips. (This was considerably less integrated than microcontrollers of the day, but those did not generally offer video and sound functionality.) The C16 has 16 KB of RAM with 12 KB available to its built-in BASIC interpreter. The TED chip offered a palette of 121 colors, which was considerably more than the 16 colors available on the Commodore 64's VIC-II video chip, but it lacked the VIC-II'
https://en.wikipedia.org/wiki/Chipset
In a computer system, a chipset is a set of electronic components on one or more ULSI integrated circuits known as a "Data Flow Management System" that manages the data flow between the processor, memory and peripherals. It is usually found on the motherboard of computers. Chipsets are usually designed to work with a specific family of microprocessors. Because it controls communications between the processor and external devices, the chipset plays a crucial role in determining system performance. Computers In computing, the term chipset commonly refers to a set of specialized chips on a computer's motherboard or an expansion card. In personal computers, the first chipset for the IBM PC AT of 1984 was the NEAT chipset developed by Chips and Technologies for the Intel 80286 CPU. In home computers, game consoles, and arcade hardware of the 1980s and 1990s, the term chipset was used for the custom audio and graphics chips. Examples include the Original Amiga chipset and Sega's System 16 chipset. In x86-based personal computers, the term chipset often refers to a specific pair of chips on the motherboard: the northbridge and the southbridge. The northbridge links the CPU to very high-speed devices, especially RAM and graphics controllers, and the southbridge connects to lower-speed peripheral buses (such as PCI or ISA). In many modern chipsets, the southbridge contains some on-chip integrated peripherals, such as Ethernet, USB, and audio devices. Motherboards and their chipsets often come from different manufacturers. Manufacturers of chipsets for x86 motherboards include AMD, Intel, VIA Technologies and Zhaoxin. In the 1990s, a major designer and manufacturer of chipsets was VLSI Technology in Tempe, Arizona. The early Apple Power Macintosh PCs (that used the Motorola 68030 and 68040) had chipsets from VLSI Technology. 
Some of their innovations included the integration of PCI bridge logic, the GraphiCore 2D graphics accelerator and direct support for synchronous DRAM, the forerunner of DDR SDRAM memory. In the 1980s, Chips and Technologies pioneered the manufacturing of chipsets for PC-compatible computers. Computer systems produced since then often share commonly used chipsets, even across widely disparate computing specialties. For example, the NCR 53C9x, a low-cost chipset implementing a SCSI interface to storage devices, could be found in Unix machines such as the MIPS Magnum, embedded devices, and personal computers. Move toward processor integration in PCs Traditionally in x86 computers, the processor's primary connection to the rest of the machine was through the motherboard chipset's northbridge. The northbridge was directly responsible for communications with high-speed devices (system memory and primary expansion buses, such as PCIe, AGP, and PCI cards, being common examples) and conversely any system communication back to the processor. This connection between the processor and northbridge is commonly designated the front-side bu
https://en.wikipedia.org/wiki/Castle%20Wolfenstein
Castle Wolfenstein is a 1981 action-adventure game that was developed by Muse Software for the Apple II home computer. It is one of the earliest games to be based on stealth mechanics. An Atari 8-bit family port was released in 1982 and was followed by versions for the Commodore 64 (1983) and MS-DOS (1984). The game takes place during World War II. The player takes the role of an Allied prisoner of war who is held captive in the fictional Castle Wolfenstein. After escaping from the cell, the player's objective is to find the Nazis' secret war plans and escape from the castle. Nazi soldier enemies can be dealt with by impersonating, sneaking past, or killing them. The game was received positively amongst critics and became one of the best-selling games of the early 1980s. It is considered to have had a direct influence on modern stealth games. The game was praised for its graphics and gameplay, but criticized for its long waiting times when opening chests. Gameplay Castle Wolfenstein is a two-dimensional action-adventure game that is played from a top-down perspective using a keyboard, joystick, or paddles. It has also been described as a maze game. There are eight difficulty levels in the game, determined by the player's rank. The player takes the role of an Allied spy who has been captured by Nazis and imprisoned in a dungeon within Castle Wolfenstein for interrogation by the SS Stormtroopers. While the spy is waiting for interrogation, a dying prisoner emerges from a hiding place and hands the player a fully loaded pistol with 10 rounds and three grenades before passing away. The objective is to escape from the castle, and if the player finds the battle plans before escaping, they will be promoted and the complexity of the subsequent run will be increased, while the castle's layout changes and the game starts again. 
The game takes place in a procedurally generated castle of approximately 60 rooms that house standard Nazi guards and SS Stormtroopers, the latter identified by their bulletproof vests marked with the SS insignia. Standard guards can be eliminated with a pistol, and have a chance to surrender if the player points a pistol at them, even if the player has no ammunition; SS Stormtroopers must be dealt with using grenades, because they usually wear body armor. Enemies can be looted once they have surrendered or been eliminated, and may carry ammunition, grenades, and keys which can be used on doors and chests. Doors and chests can be opened more quickly by shooting at them, but this will attract the guards in the room, and if the chest contains ammunition or grenades, it will explode, resulting in immediate death. Chests may contain bulletproof vests, uniforms, and secret documents, or sauerkraut, sausages, and schnapps that do not affect the gameplay. Uniforms allow the player character to pass guards unnoticed, but they are ineffective against SS Stormtroopers. If the player dies from enemy gunfire, the game restarts with the castle's layout preserved and the sa
https://en.wikipedia.org/wiki/CTEC
The Microsoft Certified Technical Education Centre (Microsoft CTEC) channel provides training for computer professionals in the use of Microsoft products. The term Microsoft Certified Technical Education Centre was introduced by Microsoft in the early 1990s. Microsoft CTECs are authorised to deliver Microsoft Official Curriculum courses to computer professionals using Microsoft Certified Trainers (MCTs). Through Microsoft CTECs, Microsoft Certified Solution Providers learn about Microsoft BackOffice, Internet and developer tools technology. Currently the program involves more than 900 training centers in North America and more than 1,900 Microsoft CTECs internationally. Microsoft Official Curriculum courses are taught by Microsoft Certified Trainers (MCTs), professionals certified by Microsoft for a particular product or technology. The Microsoft CTEC program was succeeded by a new program, Certified Partners for Learning Solutions. External links Microsoft Official Learning Web site Microsoft Solution Finder
https://en.wikipedia.org/wiki/Commodore%20Plus/4
The Commodore Plus/4 is a home computer released by Commodore International in 1984. The "Plus/4" name refers to the four-application ROM-resident office suite (word processor, spreadsheet, database, and graphing); it was billed as "the productivity computer with software built in". Internally, the Plus/4 shared the same basic architecture as the lower-end Commodore 16 and 116 models, and was able to use software and peripherals designed for them. The Plus/4 was incompatible with the Commodore 64's software and some of its hardware. Although the Commodore 64 was more established, the Plus/4 was aimed at the more business-oriented part of the personal computer market. History In the early 1980s, Commodore found itself engaged in a price war in the home computer market. Companies like Texas Instruments and Timex Corporation were releasing computers that undercut the price of Commodore's PET line. Commodore's MOS Technology division had designed a video chip but could not find any third-party buyers. The VIC-20 resulted from the confluence of these events and it was introduced in 1980 at a list price of $299.95. Later, spurred by the competition, Commodore was able to reduce the VIC's street price to $99, and it became the first computer (of any kind) to sell over 1 million units. The Commodore 64, the first 64 KB computer to sell for under $600 in the US, was another salvo in the price war, but it was far more expensive to make than the VIC-20 because it used discrete chips for video, sound, and I/O. Still, the C64 went on to become a best-seller and was selling for $199 at the time of the Plus/4's introduction. Even while C64 sales were rising, Commodore president Jack Tramiel wanted a new computer line that would use fewer chips and at the same time address some of the user complaints about the VIC and C64. 
Rumors spread in late 1983 of a new computer in 1984 called the "Commodore 444" or "Ted", with built-in word processing and spreadsheet software, and that it would be one of four new computers that would replace the VIC-20 and C64, which the company would discontinue. The company's third salvo, which, as it turned out, was fired just as most of Commodore's competition was leaving the home computer market, was the C116, C16, and 264, which became the Plus/4. There were also prototypes of a 232, basically a version of the Plus/4 without the software ROMs, and a V364, which had a numeric keypad and built-in voice synthesis. The latter two models never made it to production. All these computers used a 6502-compatible MOS 7501 or 8501 that was clocked approximately 75% faster than the CPUs used in the VIC-20 and C64, and a MOS Technology TED all-in-one video, sound, and I/O chip. The Plus/4's design is thus philosophically closer to that of the VIC-20 than that of the C64. The Plus/4 was introduced in June 1984. The Plus/4 was the flagship computer of the line, featuring 64 KB of RAM while the C16 and C116 had 16 KB. The Plus/4 had built-in
https://en.wikipedia.org/wiki/No%20Silver%20Bullet
"No Silver Bullet—Essence and Accident in Software Engineering" is a widely discussed paper on software engineering written by Turing Award winner Fred Brooks in 1986. Brooks argues that "there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity." He also states that "we cannot expect ever to see two-fold gains every two years" in software development, as there are in hardware development (Moore's law). Summary Brooks distinguishes between two types of complexity: accidental complexity and essential complexity, a distinction that goes back to Aristotle's categories of essence and accident. Accidental complexity relates to problems which engineers create and can fix. For example, modern programming languages have abstracted away the details of writing and optimizing assembly code, and eliminated the delays caused by batch processing, though other sources of accidental complexity remain. Essential complexity is caused by the problem to be solved, and nothing can remove it; if users want a program to do 30 different things, then those 30 things are essential and the program must do those 30 different things. Brooks claims that accidental complexity has decreased substantially, and today's programmers spend most of their time addressing essential complexity. He argues that, as a consequence, shrinking all the accidental activities to zero will not, by itself, give an order-of-magnitude improvement; that would require reducing essential complexity. While Brooks insists that there is no one silver bullet, he believes that a series of innovations attacking essential complexity could lead to significant improvements. One technology that had made significant improvement in the area of accidental complexity was the invention of high-level programming languages, such as Ada. Brooks advocates "growing" software organically through incremental development. 
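The distinction can be illustrated with a small, invented Python example: the index bookkeeping in the first version is accidental complexity that a higher-level construct removes, while the requirement itself (sum the values) is essential and survives in both versions.

```python
# Accidental complexity: manual bookkeeping the programmer must get right.
def total_manual(values):
    total = 0
    i = 0
    while i < len(values):
        total = total + values[i]
        i = i + 1  # forgetting this increment is a classic accidental bug
    return total

# Essential complexity only: the requirement "sum the values" remains,
# but the index bookkeeping has been absorbed by the language.
def total_builtin(values):
    return sum(values)

print(total_manual([1, 2, 3]))   # 6
print(total_builtin([1, 2, 3]))  # 6
```

Both functions meet the same essential requirement; only the second is free of the accidental part, which is Brooks's point about where high-level languages help.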
He suggests devising and implementing the main program and subprograms right at the beginning, then filling in the working sub-sections later. He believes that programming this way excites the engineers and provides a working system at every stage of development. Brooks goes on to argue that there is a difference between "good" designers and "great" designers. He postulates that as programming is a creative process, some designers are inherently better than others. He suggests that there is as much as a tenfold difference between an ordinary designer and a great one. He then advocates treating star designers as well as star managers are treated, providing them not just with equal remuneration, but also all the perks of higher status: large office, staff, travel funds, etc. The article, and Brooks's later reflections on it, "'No Silver Bullet' Refined", can be found in the anniversary edition of The Mythical Man-Month. Related concepts Brooks's paper has sometimes been cited in connection with Wirth
https://en.wikipedia.org/wiki/Information%20Processing%20Language
Information Processing Language (IPL) is a programming language created by Allen Newell, Cliff Shaw, and Herbert A. Simon at RAND Corporation and the Carnegie Institute of Technology about 1956. Newell had the job of language specifier-application programmer, Shaw was the system programmer, and Simon had the job of application programmer-user. The language includes features intended to support programs that perform simple problem-solving actions: lists, dynamic memory allocation, data types, recursion, functions as arguments, generators, and cooperative multitasking. IPL invented the concept of list processing, albeit in an assembly-language style. Basics of IPL An IPL computer has: A set of symbols. All symbols are addresses, and name cells. Unlike symbols in later languages, symbols consist of a character followed by a number, and are written H1, A29, 9-7, 9-100. Cell names beginning with a letter are regional, and are absolute addresses. Cell names beginning with "9-" are local, and are meaningful within the context of a single list. One list's 9-1 is independent of another list's 9-1. Other symbols (e.g., pure numbers) are internal. A set of cells. Lists are built from several cells that refer to one another. Cells have several fields: P, a 3-bit field used for an operation code when the cell is used as an instruction, and unused when the cell is data. Q, a 3-valued field used for indirect reference when the cell is used as an instruction, and unused when the cell is data. SYMB, a symbol used as the value in the cell. LINK, the symbol naming the next cell of the list (0 marks the end). A set of primitive processes, which would be termed primitive functions in modern languages. The data structure of IPL is the list, but lists are more intricate structures than in many languages. A list consists of a singly linked sequence of symbols, as might be expected—plus some description lists, which are subsidiary singly linked lists interpreted as alternating attribute names and values. 
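A loose Python sketch of this cell model (not IPL itself; the memory layout and helper functions are invented for illustration): a list L1 holds the symbols S4 and S5, and its head cell names a description list pairing attribute A1 with value V1 and A2 with V2. Treating the head cell's SYMB as the name of the description list is an assumption of this sketch.

```python
# Each cell has SYMB (its value) and LINK (the name of the next cell);
# 0 marks the end of a list. Names like "100" stand in for the
# automatically generated internal symbols.
memory = {
    # main list L1: the head cell names the description list 9-1
    "L1":  {"SYMB": "9-1", "LINK": "100"},
    "100": {"SYMB": "S4",  "LINK": "101"},
    "101": {"SYMB": "S5",  "LINK": 0},
    # description list: alternating attribute names and values
    "9-1": {"SYMB": None,  "LINK": "200"},   # local head cell
    "200": {"SYMB": "A1",  "LINK": "201"},
    "201": {"SYMB": "V1",  "LINK": "202"},
    "202": {"SYMB": "A2",  "LINK": "203"},
    "203": {"SYMB": "V2",  "LINK": 0},
}

def symbols(name):
    """Walk the LINK chain past the head cell, collecting SYMBs."""
    out, cell = [], memory[name]["LINK"]
    while cell != 0:
        out.append(memory[cell]["SYMB"])
        cell = memory[cell]["LINK"]
    return out

def get_attribute(list_name, attr):
    """Look up an attribute's value on a list's description list."""
    pairs = symbols(memory[list_name]["SYMB"])
    for a, v in zip(pairs[0::2], pairs[1::2]):
        if a == attr:
            return v
    return None

print(symbols("L1"))              # ['S4', 'S5']
print(get_attribute("L1", "A2"))  # V2
```

The point of the structure is that the cells can be scattered anywhere in memory; only the regional name L1 needs a globally known location.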
IPL provides primitives to access and mutate attribute values by name. The description lists are given local names (of the form 9-1). So, a list named L1 containing the symbols S4 and S5, and described by associating value V1 to attribute A1 and V2 to A2, would be stored as follows. 0 indicates the end of a list; the cell names 100, 101, etc. are automatically generated internal symbols whose values are irrelevant. These cells can be scattered throughout memory; only L1, which uses a regional name that must be globally known, needs to reside in a specific place. IPL is an assembly language for manipulating lists. It has a few cells which are used as special-purpose registers. H1, for example, is the program counter. The SYMB field of H1 is the name of the current instruction. However, H1 is interpreted as a list; the LINK of H1 is, in modern terms, a pointer to the beginning of the call stack. For example, subroutine calls push the SYMB of H1 onto this stack. H2 is the free-list. Procedures which need
https://en.wikipedia.org/wiki/Infomercial
An infomercial is a form of television commercial that resembles regular TV programming yet is intended to promote or sell a product, service or idea. It generally includes a toll-free telephone number or website. Most often used as a form of direct response television (DRTV), they are often program-length commercials (long-form infomercials), and are typically 28:30 or 58:30 minutes in length. Infomercials are also known as paid programming (or teleshopping in Europe). This phenomenon started in the United States, where infomercials were typically shown overnight (usually 1:00 a.m. to 6:00 a.m.), outside peak prime time hours for commercial broadcasters. Some television stations chose to air infomercials as an alternative to the former practice of signing off, while other channels air infomercials 24 hours a day. Some stations also choose to air infomercials during the daytime hours, mostly on weekends, to fill in for unscheduled network or syndicated programming. By 2009, most infomercial spending in the U.S. occurred outside of the traditional overnight hours. Stations in most countries around the world have instituted similar media structures. The infomercial industry is worth over $200 billion. The Washington DC-based National Infomercial Marketing Association was formed in late 1990; by 1993 "it had more than 200" members committed to standards "with teeth". While the term "infomercial" was originally applied only to television advertising, it is now sometimes used to refer to any presentation (often on video) which presents a significant amount of information in an actual, or perceived, attempt to promote a point of view. When used this way, the term may be meant to carry an implication that the party making the communication or political speech is exaggerating truths or hiding important facts. The New York Times cited a professional in the field as saying that "infomercial companies tend to do well during recessions." 
Format The word "infomercial" is a portmanteau of the words "information" and "commercial". As in any other form of advertisement, the content is a commercial message designed to represent the viewpoints and to serve the interest of the sponsor. Infomercials are often made to closely resemble standard television programs. Some imitate talk shows and try to downplay the fact that the program is actually a commercial message. A few are developed around storylines and have been called "storymercials". However, most do not have specific television formats but craft different elements to tell what their creators hope is a compelling story about the product offered. The term infomercial, by 2007, had come to refer to the format, even when used in a live presentation. Infomercials are designed to solicit quantifiable immediate direct response (a form of direct response marketing not to be confused with direct marketing); they generally feature between two and four internal commercials of 30 to 120 seconds which invite the vie
https://en.wikipedia.org/wiki/Mockingboard
The Mockingboard (a pun on "Mockingbird") is a sound card built by Sweet Micro Systems for the Apple II series of microcomputers. It improves on the Apple II's limited sound capabilities, as did other Apple II sound cards. In 1981, Sweet Micro Systems began designing products not only for creating music, but speech and general sound effects as well, culminating in the release of the Mockingboard in 1983. The Sound II was introduced at , and the Sound/Speech I at . The Mockingboard's hardware allowed programmers to create complex, high-quality sound without need for constant CPU attention. The Mockingboard could be connected to the Apple's built-in speaker or to external speakers. However, as the quality of the built-in speaker was not high, the instruction manual recommended obtaining external speakers. The Mockingboard was available in various models for either the slot-based Apple II / Apple II Plus / Apple IIe systems or in one special model for the Apple IIc. Sound was generated through one or more AY-3-8910 or compatible sound chips, with one chip offering three square-wave synthesis channels. The boards could also be equipped with an optional speech chip (a Votrax SC-01 or compatible chips such as the Arctic-02, SSI 263P, SSI 263AP or 78A263A-P). Some software products supported more than one Mockingboard. Ultima V supported two boards, for a total of 12 voices, of which it used eight. Most other programs supported at most one board with six voices. Applied Engineering's Phasor was compatible with the Mockingboard. It had 4 sound chips and thus provided 12 audio channels. Few programs supported using it for more than six voices, however. An IBM PC-compatible version was developed, but was only distributed with Bank Street Music Writer. 
Models

Early models
Sound I: one AY-3-8910 chip for three audio channels
Speech I: one SC-01 chip
Sound II: two AY-3-8910 chips for six audio channels
Sound/Speech I: one AY-3-8910 and one SC-01

Later models
Mockingboard A: two AY-3-8913 chips for six audio channels and two open sockets for SSI-263 speech chips
Mockingboard B: SSI-263 speech chip upgrade for Mockingboard A
Mockingboard C: two AY-3-8913 and one SSI-263 ("A+B=C", essentially a Mockingboard A with the Mockingboard B upgrade pre-installed; only one speech chip allowed)
Mockingboard D: for Apple IIc only, not software compatible with the other Mockingboards; two AY-3-8913 and one SSI-263
Mockingboard M: bundled with Mindscape's Bank Street Music Writer, with two AY-3-8913 chips and an open socket for one speech chip. This model included a headphone jack and a jumper to permit sound to be played through the Apple's built-in speaker.

Other compatible cards
Echo+: emulates one Mockingboard card; two AY-3-8913 for six channels; speech is provided by a Texas Instruments TMS5220NL speech synthesizer; compatibility with SC-01 or SSI-263 unknown.
Mustalgame Card: Mockingboard clone from Capital Computer Co (Hong Kong) with two AY-3-
https://en.wikipedia.org/wiki/IP%20address%20blocking
IP address blocking or IP banning is a configuration of a network service that blocks requests from hosts with certain IP addresses. IP address blocking is commonly used to protect against brute force attacks and to prevent access by a disruptive address. It can also be used to restrict access to or from a particular geographic area; for example, syndicating content to a specific region through the use of Internet geolocation. IP address blocking can be implemented with a hosts file (e.g., on Windows, Android, or OS X) or with a TCP wrapper (for Unix-like operating systems). It can be bypassed using methods such as proxy servers, or by obtaining a new address through DHCP lease renewal. How it works Every device connected to the Internet is assigned a unique IP address, which is needed to enable devices to communicate with each other. With appropriate software on the host website, the IP address of visitors to the site can be logged and can also be used to determine the visitor's geographical location. Logging the IP address can, for example, reveal whether a person has visited the site before (say, to vote more than once), monitor their viewing pattern, and show how long it has been since they performed any activity on the site (allowing a timeout limit to be set), among other things. Knowing the visitor's geolocation indicates, among other things, the visitor's country. In some cases, requests from or responses to a certain country are blocked entirely. Geo-blocking has been used, for example, to block shows in certain countries, such as censoring shows deemed inappropriate. This is especially frequent in places such as China. Internet users may circumvent geo-blocking and censorship and protect their personal identity using a Virtual Private Network. On a website, an IP address block can prevent a disruptive address from access, though a warning and/or account block may be used first. 
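A minimal sketch of address blocking using Python's standard ipaddress module; the block list is invented for illustration and uses documentation (RFC 5737) addresses. Blocking a whole /24 range shows how one rule can cover every host in that range:

```python
import ipaddress

# Hypothetical block list: single addresses and whole ranges (CIDR blocks).
BLOCKED = [
    ipaddress.ip_network("203.0.113.7/32"),   # one disruptive host
    ipaddress.ip_network("198.51.100.0/24"),  # an entire address range
]

def is_blocked(addr: str) -> bool:
    """Return True if addr falls inside any blocked network."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLOCKED)

print(is_blocked("203.0.113.7"))    # True
print(is_blocked("198.51.100.42"))  # True (caught by the /24 range)
print(is_blocked("192.0.2.1"))      # False
```

Range rules like the /24 above are exactly what produces the collateral damage that dynamic IP allocation forces on blockers: every host in the range is refused, not just the offender.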
Dynamic allocation of IP addresses by ISPs can complicate blocking: it becomes difficult to block a specific user without blocking a whole range of IP addresses, thereby creating collateral damage. For websites with a small enough audience (sometimes deliberately so, with most potential visitors considered out of scope), this large-scale collateral damage is often tolerable: because ISPs assign IP ranges by provider, country, city, and district, most accesses from the same range come from users who share those characteristics and merely hold changing dynamic addresses. On a site with a low enough total visitor count, it is improbable that all these characteristics match more than a single person. For large websites, terms of service usually reserve the admins' right to block access at their own discretion, permitting such collateral damage. Implementations Unix-like operating systems commonly implement IP address blocking using a TCP wrapper,
https://en.wikipedia.org/wiki/William%20Kahan
William "Velvel" Morton Kahan (born June 5, 1933) is a Canadian mathematician and computer scientist, who received the Turing Award in 1989 for "his fundamental contributions to numerical analysis", was named an ACM Fellow in 1994, and inducted into the National Academy of Engineering in 2005. Biography Born to a Canadian Jewish family, he attended the University of Toronto, where he received his bachelor's degree in 1954, his master's degree in 1956, and his Ph.D. in 1958, all in the field of mathematics. Kahan is now emeritus professor of mathematics and of electrical engineering and computer sciences (EECS) at the University of California, Berkeley. Kahan was the primary architect behind the IEEE 754-1985 standard for floating-point computation (and its radix-independent follow-on, IEEE 854). He has been called "The Father of Floating Point", since he was instrumental in creating the original IEEE 754 specification. Kahan continued his contributions to the IEEE 754 revision that led to the current IEEE 754 standard. In the 1980s he developed the program "paranoia", a benchmark that tests for a wide range of potential floating-point bugs. He also developed the Kahan summation algorithm, an important algorithm for minimizing error introduced when adding a sequence of finite-precision floating-point numbers. He coined the term "Table-maker's dilemma" for the unknown cost of correctly rounding transcendental functions to some preassigned number of digits. The Davis–Kahan–Weinberger dilation theorem is one of the landmark results in the dilation theory of Hilbert space operators and has found applications in many different areas. He is an outspoken advocate of better education of the general computing population about floating-point issues and regularly denounces decisions in the design of computers and programming languages that he believes would impair good floating-point computations. 
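The Kahan summation algorithm mentioned above carries a small compensation term that recovers the low-order bits lost in each addition. A straightforward Python rendering (the test values are illustrative):

```python
def kahan_sum(values):
    """Sum floats while compensating for lost low-order bits."""
    total = 0.0
    c = 0.0                  # running compensation for rounding error
    for x in values:
        y = x - c            # fold in the error carried from the last step
        t = total + y        # low-order bits of y may be lost here...
        c = (t - total) - y  # ...but are recovered algebraically here
        total = t
    return total

# Adding 0.1 ten thousand times: a naive running sum drifts measurably,
# while the compensated sum stays within an ulp or two of 1000.0.
data = [0.1] * 10_000
naive = 0.0
for x in data:
    naive += x
print(abs(naive - 1000.0) > abs(kahan_sum(data) - 1000.0))  # True
```

The subtraction `(t - total) - y` would be algebraically zero; in floating-point it captures exactly the part of `y` that the addition discarded, which is why the compensated error stays bounded instead of growing with the number of terms.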
When Hewlett-Packard (HP) introduced the original HP-35 pocket scientific calculator, its numerical accuracy in evaluating transcendental functions for some arguments was not optimal. HP worked extensively with Kahan to enhance the accuracy of the algorithms, which led to major improvements. This was documented at the time in the Hewlett-Packard Journal. He also contributed substantially to the design of the algorithms in the HP Voyager series and wrote part of their intermediate and advanced manuals. See also Intel 8087 References External links William Kahan's home page An oral history of William Kahan, Revision 1.1, March, 2016 A Conversation with William Kahan, Dr. Dobb's Journal , November 1, 1997 An Interview with the Old Man of Floating-Point, February 20, 1998 IEEE 754 An Interview with William Kahan April, 1998 Paranoia source code in multiple languages Paranoia for modern graphics processing units (GPUs) 754-1985 - IEEE Standard for Binary Floating-Point Arithmetic, 1985, Superseded by IEEE Std 754-2008 1933 births Living people 20
https://en.wikipedia.org/wiki/SCID
SCID may stand for: Computing Shane's Chess Information Database, a chess database to maintain, view and replay chess games Source Code in Database, program source code stored in a database with structural relations reflecting the language syntax and program structure Synchronous optical networking, carrier identification Health Severe combined immunodeficiency, a genetic disorder in which the immune system fails to develop Severe combined immunodeficiency (non-human), a variation in nonhumans Structured Clinical Interview for DSM-IV SCI/D, referring to spinal cord injury and spinal cord disorder
https://en.wikipedia.org/wiki/Fernando%20J.%20Corbat%C3%B3
Fernando José "Corby" Corbató (July 1, 1926 – July 12, 2019) was an American computer scientist, notable as a pioneer in the development of time-sharing operating systems. Career Corbató was born on July 1, 1926, in Oakland, California, to Hermenegildo Corbató, a Spanish literature professor from Villarreal, Spain, and Charlotte (née Carella Jensen) Corbató. In 1930 the Corbató family moved to Los Angeles for Hermenegildo's job at the University of California, Los Angeles. In 1943, Corbató enrolled at UCLA, but due to World War II he was recruited by the Navy during his first year. During the war, Corbató "debug[ged] an incredible array of equipment", inspiring his future career. Corbató left the Navy in 1946, enrolled at the California Institute of Technology, and received a bachelor's degree in physics in 1950. He then earned a PhD in physics from the Massachusetts Institute of Technology in 1956. He joined MIT's Computation Center immediately upon graduation, became a professor in 1965, and stayed at MIT until he retired. The first time-sharing system he was associated with was known as the MIT Compatible Time-Sharing System (CTSS), an early version of which was demonstrated in 1961. Corbató is credited with the first use of passwords to secure access to files on a large computer system, though he later claimed that this rudimentary security method had proliferated and become unmanageable. The experience with developing CTSS led to a second project, Multics, which was adopted by General Electric for its high-end computer systems (a business later acquired by Honeywell). Multics pioneered many concepts now used in modern operating systems, including a hierarchical file system, ring-oriented security, access control lists, single-level store, dynamic linking, and extensive on-line reconfiguration for reliable service. 
Multics, while not particularly commercially successful in itself, directly inspired Ken Thompson to develop Unix, the direct descendants of which are still in extremely wide use; Unix also served as a direct model for many other subsequent operating system designs. Awards Among many awards, Corbató received the Turing Award in 1990, "for his pioneering work in organizing the concepts and leading the development of the general-purpose, large-scale, time-sharing and resource-sharing computer systems". In 2012, he was made a Fellow of the Computer History Museum "for his pioneering work on timesharing and the Multics operating system". Legacy Corbató is sometimes known for "Corbató's Law" which states: Corbató is recognized as helping to create the first computer password. Personal life and death Corbató married programmer Isabel Blandford in 1962; she died in 1973. Corbató had a second wife, Emily (née Gluck); two daughters, Carolyn Corbató Stone and Nancy Corbató, by his late wife Isabel; two step-sons, David Gish and Jason Gish; a brother, Charles; and five grandchildren. Corbató lived on Temple Street in West Newton, Massachuse
https://en.wikipedia.org/wiki/Peripheral%20Interchange%20Program
Peripheral Interchange Program (PIP) was a utility to transfer files on and between devices on Digital Equipment Corporation's computers. It was first implemented on the PDP-6 architecture by Harrison "Dit" Morse early in the 1960s. It was subsequently implemented for DEC's operating systems for PDP-10, PDP-11, and PDP-8 architectures. In the 1970s and 1980s Digital Research implemented PIP on CP/M and MP/M. History It is said that during development it was named ATLATL, which is an acronym for "Anything, Lord to Anything, Lord." This humorously described both its purpose as a device-independent file copying tool and the difficulties at the time of safely copying files between devices. The original PIP syntax was PIP destination←source /switches using the left-arrow character from the ASCII-1963 character set that the Flexowriter keyboards of the time used. As other terminals were introduced that used later versions of ASCII (without the left-arrow character), PIP allowed the syntax PIP destination=source The underscore (_) character, which was in the same ASCII character position that left-arrow had occupied, was still supported to separate the destination and source specifications. Source and destination were file specification strings. These consisted of a device name, typically 2 characters for device type such as DK (disk), LP (line printer), MT (magnetic tape), etc. and a unit number from 0 to 7, a colon (:), filename and extension. Copying was generally permitted between any file specification to any other where it made sense. Early versions of VAX/VMS implemented certain DCL commands, such as DIRECTORY and RENAME, by running RSX-11M PIP in compatibility mode. This usage of PIP was replaced by VAX-specific code in VAX/VMS 2.0, but PIP remained as part of the VAX-11 RSX layered product for VMS. As late as the mid 1980s, PIP was still in common use on TOPS-10, TOPS-20 and PDP-11 systems. 
PIP in CP/M and MP/M Gary Kildall, who developed CP/M and MP/M, based much of the design of its file structure and command processor on operating systems from Digital Equipment, such as RSTS/E for the PDP-11. Besides accessing files on a floppy disk, the PIP command in CP/M could also transfer data to and from the following "special files":

CON: — console (input and output)
AUX: — an auxiliary device. In CP/M 1 and 2, PIP used PUN: (paper tape punch) and RDR: (paper tape reader) instead of AUX:
LST: — list output device, usually the printer
PRN: — as LST:, but lines were numbered, tabs expanded and form feeds added every 60 lines
NUL: — null device, akin to \Device\Null and /dev/null
EOF: — input device that produced end-of-file characters, ASCII 26 (Ctrl-Z)
INP: — custom input device, by default the same as EOF:
— punch card unit
OUT: — custom output device, by default the same as

These were not true device files, however, because their handling was limited to PIP. The two custom devices INP: and OUT: were implemented as calls to fixed locations at the start of the PIP program; the intention was that the
https://en.wikipedia.org/wiki/Data%20mart
A data mart is a structure/access pattern specific to data warehouse environments, used to retrieve client-facing data. The data mart is a subset of the data warehouse and is usually oriented to a specific business line or team. Whereas data warehouses have an enterprise-wide depth, the information in data marts pertains to a single department. In some deployments, each department or business unit is considered the owner of its data mart, including all the hardware, software and data. This enables each department to isolate the use, manipulation and development of its data. In other deployments, where conformed dimensions are used, this business-unit ownership does not hold for shared dimensions such as customer, product, etc. Warehouses and data marts are built because the information in the database is not organized in a way that makes it readily accessible; its organization requires queries that are too complicated, too difficult to compose, or too resource-intensive. While transactional databases are designed to be updated, data warehouses or marts are read-only. Data warehouses are designed to access large groups of related records. Data marts improve end-user response time by allowing users to have access to the specific type of data they need to view most often, providing the data in a way that supports the collective view of a group of users. A data mart is basically a condensed and more focused version of a data warehouse that reflects the regulations and process specifications of each business unit within an organization. Each data mart is dedicated to a specific business function or region. This subset of data may span across many or all of an enterprise's functional subject areas. It is common for multiple data marts to be used in order to serve the needs of each individual business unit (different data marts can be used to obtain specific information for various enterprise departments, such as accounting, marketing, sales, etc.). 
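The subset relationship can be sketched with invented rows: a data mart is carved out of warehouse data along one business line, so departmental queries touch only the rows they need.

```python
# Hypothetical warehouse rows spanning several business lines.
warehouse = [
    {"dept": "sales",      "region": "EMEA", "amount": 1200},
    {"dept": "sales",      "region": "APAC", "amount": 800},
    {"dept": "marketing",  "region": "EMEA", "amount": 300},
    {"dept": "accounting", "region": "EMEA", "amount": 50},
]

def build_mart(rows, dept):
    """A data mart as the warehouse subset for one business line."""
    return [row for row in rows if row["dept"] == dept]

sales_mart = build_mart(warehouse, "sales")
print(len(sales_mart))                           # 2
print(sum(row["amount"] for row in sales_mart))  # 2000
```

In practice the mart would be a separately stored and indexed database, not an in-memory filter, but the ownership and scoping idea is the same: each department works against its own slice.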
The related term spreadmart is a pejorative describing the situation that occurs when one or more business analysts develop a system of linked spreadsheets to perform a business analysis, then grow it to a size and degree of complexity that makes it nearly impossible to maintain. The term for this condition is "Excel Hell".

Data mart vs data warehouse

Data warehouse:
Holds multiple subject areas
Holds very detailed information
Works to integrate all data sources
Does not necessarily use a dimensional model but feeds dimensional models

Data mart:
Often holds only one subject area, for example Finance or Sales
May hold more summarized data (although it may hold full detail)
Concentrates on integrating information from a given subject area or set of source systems
Is built focused on a dimensional model using a star schema

Design schemas

Star schema: fairly popular design choice; enables a relational database to emulate the analytical functionality of a multidimensional database
Snowflake schema
https://en.wikipedia.org/wiki/Macsyma
Macsyma (; "Project MAC's SYmbolic MAnipulator") is one of the oldest general-purpose computer algebra systems still in wide use. It was originally developed from 1968 to 1982 at MIT's Project MAC. In 1982, Macsyma was licensed to Symbolics and became a commercial product. In 1992, Symbolics Macsyma was spun off to Macsyma, Inc., which continued to develop Macsyma until 1999. That version is still available for Microsoft's Windows XP operating system. The 1982 version of MIT Macsyma remained available to academics and US government agencies, and it is distributed by the US Department of Energy (DOE). That version, DOE Macsyma, was maintained by Bill Schelter. Under the name of Maxima, it was released under the GPL in 1999, and remains under active maintenance. Development The project was initiated in July, 1968 by Carl Engelman, William A. Martin (front end, expression display, polynomial arithmetic) and Joel Moses (simplifier, indefinite integration: heuristic/Risch). Martin was in charge of the project until 1971, and Moses ran it for the next decade. Engelman and his staff left in 1969 to return to The MITRE Corporation. Some code came from earlier work, notably Knut Korsvold's simplifier. Later major contributors to the core mathematics engine were: Yannis Avgoustis (special functions), David Barton (solving algebraic systems of equations), Richard Bogen (special functions), Bill Dubuque (indefinite integration, limits, power series, number theory, special functions, functional equations, pattern matching, sign queries, Gröbner, TriangSys), Richard Fateman (rational functions, pattern matching, arbitrary precision floating-point), Michael Genesereth (comparison, knowledge database), Jeff Golden (simplifier, language, system), R. W. 
Gosper (definite summation, special functions, simplification, number theory), Carl Hoffman (general simplifier, macros, non-commutative simplifier, ports to Multics and LispM, system, visual equation editor), Charles Karney (plotting), John Kulp, Ed Lafferty (ODE solution, special functions), Stavros Macrakis (real/imaginary parts, compiler, system), Richard Pavelle (indicial tensor calculus, general relativity package, ordinary and partial differential equations), David A. Spear (Gröbner), Barry Trager (algebraic integration, factoring, Gröbner), Paul S. Wang (polynomial factorization and GCD, complex numbers, limits, definite integration, Fortran and LaTeX code generation), David Y. Y. Yun (polynomial GCDs), Gail Zacharias (Gröbner) and Rich Zippel (power series, polynomial factorization, number theory, combinatorics). Macsyma was written in Maclisp, and was, in some cases, a key motivator for improving that dialect of Lisp in the areas of numerical computing, efficient compilation and language design. Maclisp itself ran primarily on PDP-6 and PDP-10 computers, but also on the Multics OS and on the Lisp Machine architectures. Macsyma was one of the largest, if not the largest, Lisp programs of the time. C
https://en.wikipedia.org/wiki/OpenZaurus
OpenZaurus is a defunct embedded operating system for the Sharp Zaurus personal mobile tool PDA. History In its original form, the project was a repackaging of the SharpROM, the Zaurus's factory-supplied kernel and root filesystem image. To bring the Zaurus's OS closer to the needs of the developer community, the SharpROM was altered through bugfixes, software additions, and even removals to make the package more open. The OpenZaurus project was later revamped completely, becoming a Debian-based distribution built from source from the ground up. With this change in direction, OpenZaurus became quite similar to other embedded Debian-based distributions, such as Familiar for the iPAQ. OpenZaurus also provided an easy way for users to build their own custom images. The efforts of OpenZaurus, along with other embedded Linux projects, were integrated into the OpenEmbedded project, which now provides the common framework for these projects. Variants In addition to building a custom OpenZaurus image using OpenEmbedded metadata, the OpenZaurus distribution could be acquired in three variants for each version release:

Bootstrap: a minimal, console-based image with a working root filesystem and networking over SSH, WLAN, Bluetooth, or USB. Suitable for bootstrapping a larger X11 system.
GPE: everything the Bootstrap image contains, plus the X Window System and the GTK+-based GPE Palmtop Environment.
OPIE: everything the Bootstrap image contains, plus the Qt-based OPIE Palmtop Integrated Environment.

Status On April 26, 2007, it was announced that the OpenZaurus project was over. Future development efforts were to focus on the Ångström distribution for embedded systems. 
See also Palm OS Pocket PC Windows Mobile References External links Hentges.net OZ with updated OPIE releases ARM Linux distributions Personal digital assistant software Debian-based distributions Embedded Linux distributions Operating systems using GPE Palmtop Environment Linux distributions
https://en.wikipedia.org/wiki/Mother%20%28video%20game%20series%29
(known as EarthBound outside Japan) is a video game series that consists of three role-playing video games: Mother (1989), known as EarthBound Beginnings outside Japan, for the Family Computer; Mother 2 (1994), known as EarthBound outside Japan, for the Super Nintendo Entertainment System; and Mother 3 (2006) for the Game Boy Advance. Written by Shigesato Itoi, published by Nintendo, and featuring game mechanics modeled on the Dragon Quest series, Mother is known for its sense of humor, originality, and parody. The player uses weapons and psychic powers to fight hostile enemies, which include animated everyday objects, aliens and brainwashed people. Signature elements of the series include a lighthearted approach to the plot, battle sequences with psychedelic backgrounds, and the "rolling HP meter": player health ticks down like an odometer rather than instantly being subtracted, allowing the player to take preventative action, such as healing or finishing the battle, before the damage is fully dealt. While the franchise is popular in Japan, in the Anglosphere it is best associated with the cult following behind EarthBound. While visiting Nintendo for other business, Itoi approached Shigeru Miyamoto about making Mother. When approved for a sequel, Itoi increased his involvement in the design process over the five-year development of EarthBound. When the project began to flounder, producer and later Nintendo president Satoru Iwata rescued the game. EarthBound English localizers were given great liberties when translating the Japanese game's cultural allusions. The American version sold poorly despite a multimillion-dollar marketing budget. Mother 3 was originally slated for release on the Nintendo 64 and its 64DD disk drive accessory, but was cancelled in 2000. Three years later, the project was reannounced for the Game Boy Advance alongside a rerelease of Mother and Mother 2 in the combined cartridge Mother 1 + 2. 
Mother 3 abandoned the earlier 3D graphics work in favor of a 2D style, and became a bestseller upon its release. EarthBound was rereleased for the Wii U Virtual Console in 2013, and Mother received its English-language debut for the same platform in 2015, retitled EarthBound Beginnings. In 2022, Nintendo released Mother 1 and 2 to their Nintendo Switch Online service. EarthBound is widely regarded as a video game classic, and is included in multiple top-ten lists. In the absence of continued official support for the series, members of the EarthBound fan community organized online to advocate for further series releases through petitions and fan art. Their projects include a full fan translation of Mother 3, a full-length documentary, and a fangame sequel-turned-spiritual successor called Oddity. Ness, the protagonist of EarthBound, received exposure from his inclusion in all five entries of the Super Smash Bros. series. Other Mother series locations and characters have made appearances in the fighting games. Gameplay The series is known for its c
https://en.wikipedia.org/wiki/ILP
ILP can refer to: Computer science Inductive logic programming Information Leak Prevention Instruction-level parallelism Integer linear programming Other ilp., a 2013 album by Kwes Independent Labour Party, United Kingdom Independent Living Program, a US Veteran Affairs program aimed at making sure that each eligible veteran is able to live independently Index Librorum Prohibitorum, the list of publications banned by the Catholic Church between 1559 and 1966. Individual Learning Plan, a teaching methodology Inner Line Permit, a permission required for mainland Indian citizens to be able to travel into a restricted/protected state of North-East India Institution of Lighting Professionals, a professional lighting association based in the UK and Ireland. Intelligence-led policing Isolated Limb Perfusion, a limb-sparing, neoadjuvant therapy for soft tissue sarcomas
https://en.wikipedia.org/wiki/Polygon%20%28computer%20graphics%29
Polygons are used in computer graphics to compose images that are three-dimensional in appearance. Usually (but not always) triangular, polygons arise when an object's surface is modeled, vertices are selected, and the object is rendered in a wire frame model. This is quicker to display than a shaded model; thus the polygons are a stage in computer animation. The polygon count refers to the number of polygons being rendered per frame. Beginning with the fifth generation of video game consoles, the use of polygons became more common, and with each succeeding generation, polygonal models became increasingly complex.

Competing methods for rendering polygons that avoid seams:
- Point
  - Floating point
  - Fixed point
- Polygon
  - Because of rounding, every scanline has its own direction in space and may show its front or back side to the viewer.
- Fraction (mathematics)
  - Bresenham's line algorithm
  - Polygons have to be split into triangles.
  - The whole triangle shows the same side to the viewer.
  - The point numbers from the transform and lighting stage have to be converted to fractions.
- Barycentric coordinates (mathematics)
  - Used in ray tracing.

See also Low poly Polygon, for general polygon information Polygon mesh, for polygon object representation Polygon modeling References 3D computer graphics
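The barycentric-coordinate method listed above expresses a point as a weighted combination of a triangle's vertices, which is how ray tracers decide whether a ray hit lies inside a triangle. As an illustrative sketch (not from the article; function names are hypothetical), a minimal 2-D point-in-triangle test:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of 2-D point p in triangle (a, b, c)."""
    # Solve p = u*a + v*b + w*c with u + v + w = 1 via the signed-area method.
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    den = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / den
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / den
    w = 1.0 - u - v
    return u, v, w

def inside(p, a, b, c):
    # A point lies inside (or on) the triangle iff all coordinates are non-negative.
    return all(t >= 0 for t in barycentric(p, a, b, c))
```

The same weights (u, v, w) also interpolate per-vertex attributes such as colour or texture coordinates across the triangle, which is why the representation is so common in rasterizers and ray tracers alike.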
https://en.wikipedia.org/wiki/Boot%20disk
A boot disk is a removable digital data storage medium from which a computer can load and run (boot) an operating system or utility program. The computer must have a built-in program which will load and execute a program from a boot disk meeting certain standards. While almost all modern computers can boot from a hard drive containing the operating system and other software, they would not normally be called boot disks (because they are not removable media). CD-ROMs are the most common forms of media used, but other media, such as magnetic or paper tape drives, ZIP drives, and more recently USB flash drives can be used. The computer's BIOS must support booting from the device in question. One can make one's own boot disk (typically done to prepare for when the system won't start properly). Uses Boot disks are used for: Operating system installation Data recovery Data purging Hardware or software troubleshooting BIOS flashing Customizing an operating environment Software demonstration Running a temporary operating environment, such as when using a Live USB drive. Administrative access in case of lost password is possible with an appropriate boot disk with some operating systems Games (e.g. for Amiga home computers, running MS-DOS video games on modern computers by using a bootable MS-DOS or FreeDOS USB flash drive). Process The term boot comes from the idea of lifting oneself by one's own bootstraps: the computer contains a tiny program (bootstrap loader) which will load and run a program found on a boot device. This program may itself be a small program designed to load a larger and more capable program, i.e., the full operating system. To enable booting without the requirement either for a mass storage device or to write to the boot medium, it is usual for the boot program to use some system RAM as a RAM disk for temporary file storage. 
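As an illustrative sketch of the bootstrap convention described above, the following Python snippet checks the first sector of a disk image for the classic IBM PC boot signature (bytes 0x55 0xAA at offsets 510–511), which is what a PC BIOS looks for before jumping to the loader code at the start of the sector. The function name is hypothetical:

```python
BOOT_SECTOR_SIZE = 512
BOOT_SIGNATURE = b"\x55\xaa"  # last two bytes of a viable PC boot sector

def is_bootable(image_bytes):
    """Return True if the first sector carries the classic PC boot signature.

    Mirrors the check a PC BIOS performs on the first 512 bytes of a
    boot device before executing the loader code they contain.
    """
    sector = image_bytes[:BOOT_SECTOR_SIZE]
    return len(sector) == BOOT_SECTOR_SIZE and sector[-2:] == BOOT_SIGNATURE

# Example: a sector of (empty) loader code padded to 512 bytes, ending in 55 AA.
sector = bytes(510) + BOOT_SIGNATURE
```

Note this only verifies the signature; a real BIOS then transfers control to the sector's code, which may in turn load a larger operating system, as described above.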
As an example, any computer compatible with the IBM PC is able with built-in software to load the contents of the first 512 bytes of a floppy and to execute it if it is a viable program; boot floppies have a very simple loader program in these bytes. The process is vulnerable to abuse; data floppies could have a virus written to their first sector which silently infects the host computer if switched on with the disk in the drive. Media Bootable floppy disks ("boot floppies") for PCs usually contain DOS or miniature versions of Linux. The most commonly available floppy disk can hold only 1.4 MB of data in its standard format, making it impractical for loading large operating systems. The use of boot floppies is in decline, due to the availability of other higher-capacity options, such as CD-ROMs or USB flash drives. Device selection A modern PC is configured to attempt to boot from various devices in a certain order. If a computer is not booting from the device desired, such as the floppy drive, the user may have to enter the BIOS Setup function by pressing a special key when the computer is
https://en.wikipedia.org/wiki/Read-only
In computer technology, read-only can refer to: Read-only memory (ROM), a type of storage media Read-only access to memory using memory protection Read-only access to files or directories in file system permissions Read-only access for database administrators in database system permissions
https://en.wikipedia.org/wiki/DEM%20%28disambiguation%29
DEM was the ISO 4217 currency code for the Deutsche Mark, former currency of Germany Computing Digital elevation model, a digital representation of ground-surface topography or terrain .dem, a common extension for USGS DEM files Discrete element method or discrete element modeling, a family of numerical methods for computing the motion of a large number of small particles (like molecules or grains of sand) Diffuse element method, a numerical simulation method used (for example) to solve partial differential equations Display Encode Mode, a feature of AMD's Video Codec Engine Distance Estimation Method, for drawing Julia sets or Mandelbrot sets Organisations Department of Environmental Management, a name of various government entities Democratic Party, short form of the name of various political parties around the world Democratic Party (United States) Democrats (Brazil) Dravske elektrarne Maribor d.o.o., an electric power company in Slovenia Day Eight Music, a record label founded by Jonas Hellborg Other uses Dem language Demonstrative case (abbreviated ) Deus ex machina (Latin; literally "a god from a machine"), a resolution to a story that does not pay due regard to the story's internal logic and that is so unlikely that it challenges suspension of disbelief, and presumably allows the author, director, or developer to end the story in the way that he or she desired Diethyl malonate, the diethyl ester of malonic acid Dynamic element matching, a technique used in integrated circuit design to compensate for component mismatch
https://en.wikipedia.org/wiki/Distance%20fog
Distance fog is a technique used in 3D computer graphics to enhance the perception of distance by shading distant objects differently. Because many of the shapes in graphical environments are relatively simple, and complex shadows are difficult to render, many graphics engines employ a "fog" gradient so objects further from the camera are progressively more obscured by haze and by aerial perspective. This technique simulates the effect of light scattering, which causes more distant objects to appear lower in contrast, especially in outdoor environments. Visibility in a natural haze declines exponentially, not linearly, with distance due to scattering. The colour of the light being scattered into the viewing path affects the colour of the haze; blue under blue skies, reddish near sunset, as with alpenglow. These more subtle details are represented in some graphics. "Fogging" is another use of distance fog in mid-to-late 1990s games, when processing power was not enough to render far viewing distances, and clipping was employed. Clipping could be very distracting since bits and pieces of polygons would flicker in and out of view instantly, and by applying a medium-ranged distance fog, the clipped polygons would appear at a sufficiently far distance that they were obscured by the fog, fading in as the player approached. See also Aerial perspective Anisotropic filtering Computer graphics Draw distance Level of detail (LOD) Scale space References 3D rendering
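The exponential falloff described above can be sketched as follows; this is a minimal Python version of the classic fixed-function "exp" fog model (function names are illustrative, not from any particular graphics API):

```python
import math

def fog_factor(distance, density):
    """Fraction of an object's own colour that survives the haze.

    Classic exponential fog: 1.0 at the camera, decaying toward 0.0
    (fully fogged) with distance, mirroring the exponential decline of
    visibility in natural haze.
    """
    return math.exp(-density * distance)

def apply_fog(obj_color, fog_color, distance, density):
    # Blend the object's colour toward the fog colour as distance grows.
    f = fog_factor(distance, density)
    return tuple(f * o + (1.0 - f) * g for o, g in zip(obj_color, fog_color))
```

Choosing a bluish fog_color under a clear sky or a reddish one near sunset reproduces the scattering-colour effects mentioned above; the same blend, with a cutoff placed just inside the clipping plane, gives the "fogging" trick used to hide polygon pop-in.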
https://en.wikipedia.org/wiki/Gray%20Panthers
The Gray Panthers are a series of multi-generational local advocacy networks in the United States which confront ageism and many other social justice issues. The organization was formed by Maggie Kuhn in response to her forced retirement from the Presbyterian Church at the age of 65 in 1970. The Gray Panthers are named in reference to the Black Panthers. In addition to its initial response to the issue of mandatory retirement, Gray Panthers have challenged other ageist laws and stereotypes and engaged in anti-war activism, Medicare and Social Security preservation, inter-generational housing, LGBT rights advocacy, environmentalism, the fair treatment of people in nursing homes, and the promotion of single-payer health care. History Founding history Maggie Kuhn's interest in older persons' rights existed well before she founded the Gray Panthers in 1970. She was involved with the White House Conference on Aging in 1961, and was appalled by the way people in some retirement homes were treated. What really sparked her determination to form an activist organization came when she found herself a victim of the lack of rights for older persons in 1970, forced to retire from a job she loved in the Presbyterian Church. Instead of passively accepting retirement, Kuhn decided to band together with other people she knew who were also forced to retire. This refusal to accept the status quo shaped not just the Gray Panther organization, but also the Gray Panther name: "It's a fun name. There's a certain militancy, rather than just a docile acceptance of what our country's doing." When Kuhn formed the organization, her alliances were not limited to other older persons facing a plight similar to her own. As indicated by the Gray Panthers' slogan which endures to this day, "Age and Youth in Action," she made it a priority to include people of all generations. Gray Panthers membership continues to be inclusive of people of diverse ages.
Issues in the 1970s and 1980s During this period, the main issues of the Gray Panthers included forced retirement, ageist stereotypes, cuts to Medicare and Social Security, and world peace. There had been a mandatory retirement age of 65, based on the perception that older persons could not be productive members of a workplace after that age. While that stereotype was difficult to overcome, the Panthers were ultimately successful in their efforts to overturn the law, and in 1986, Congress passed a law banning mandatory retirement in most jobs. The law was signed by President Ronald Reagan, at that time the oldest ever President of the United States. There were other ageist stereotypes that the Gray Panthers wanted to end, in particular the stereotype that older persons were "impotent, frail, disabled, demented, or dependent." By confronting those stereotypes, the Gray Panthers were at the forefront of ideas some considered provocative—namely, ideas that older persons should be able to live in inter-generatio
https://en.wikipedia.org/wiki/Datenschleuder
Die Datenschleuder, literally translated as The Data Slingshot: The scientific trade journal for data voyagers, is a German hacker magazine that is published at irregular intervals by the Chaos Computer Club (CCC). Topics include primarily political and technical aspects of the digital world (freedom of information, data privacy (data protection), closed-circuit television, personal privacy (personal rights), cryptography and many more). Die Datenschleuder was first published in 1984 and can also be subscribed to independently of membership in the CCC. The founder is Herwart Holland-Moritz. All (more than 90) back issues are freely available on the Internet as well. The current print paper format is DIN A5 as per ISO 216. Its editorial process is carried out over the Internet, while the magazine itself is printed in and distributed from Hamburg. Issue #92 from March 2008 contained a reproduction of a fingerprint from Wolfgang Schäuble, the interior minister of Germany at the time. The US phreaking magazine TAP – The Youth International Party Line (YIPL) (founded in 1971) has been described as a model for Datenschleuder. See also Chaos Computer Club Chaos Digest (ChaosD) References External links 1984 establishments in West Germany Computer magazines published in Germany German-language magazines Hacker magazines Irregularly published magazines published in Germany Magazines established in 1984 Magazines published in Hamburg Works about computer hacking
https://en.wikipedia.org/wiki/P4
P4 may refer to: Computing Intel Pentium 4, a processor series shipped from 2000 to 2008 The P4 power connector, introduced in the ATX12V 1.0 standard to power these and later CPUs The i486DX (P4) model of the Intel 80486 microprocessor, introduced in 1989 P4 (programming language), for controlling network data forwarding P4, the Perforce software command line client Media P4 Radio Hele Norge (PFI), a Norwegian radio company Kanal 24 (Kanal 4), which acquired the Norwegian P4 channel from PFI Sveriges Radio P4, a Swedish radio channel Persona 4, a 2008 video game Periphery IV: Hail Stan, 2019 album by American progressive metal band Periphery Military Skaraborg Regiment (armoured), a Swedish army unit, designated P 4 Peugeot P4, a French military vehicle Science P4 laboratory, a biosafety level 4 facility Tetraphosphorus (P4), an allotrope of phosphorus Group p4, the plane symmetry group wallpaper group p4 Progesterone (Pregn-4-ene-3,20-dione), a steroid hormone Kerberos, the fourth moon of Pluto Perfect fourth, a musical interval Enterobacteria phage P4 P4, an EEG electrode site according to the 10-20 system P4 cell, a stage in the Caenorhabditis elegans embryonic development P4-metric, in statistics, a performance metric Roads P4 road (Latvia) P04 road (Ukraine) Other uses Papyrus 4, a New Testament manuscript Trans-Pacific Strategic Economic Partnership, or P4, a free trade agreement between Brunei, Chile, New Zealand, and Singapore Prussian P 4, a German steam locomotive Protofour, or P4, a set of standards for model railways See also 4P (disambiguation) Phosphate, molecular formula PO₄³⁻. Play (telecommunications) P4, a Polish cellular telecommunications provider
https://en.wikipedia.org/wiki/DirectBand
DirectBand was a North American wireless datacast network owned and operated by Microsoft. It used FM radio broadcasts in over 100 cities to constantly transmit data to a variety of devices, including portable GPS devices, wristwatches and home weather stations. How it works DirectBand used the 67.65 kHz subcarrier leased by Microsoft from commercial radio broadcasters. This subcarrier delivered about 12 kbit/s (net after ECC) of data per tower, for over 100 MB per day per city. Data included traffic, sports, weather, stocks, news, movie times, calendar appointments, and local time. Not like RDS DirectBand did not use the RDS (Radio Data System) subcarrier. RDS is a different system and has a much lower data rate (~730 bit/s after ECC, including framing). Its much narrower subcarrier is primarily used for radio station information and traffic. DirectBand and RDS can co-exist on the same FM station. Forward acting error correction Since many DirectBand uses were mobile, and there was no opportunity to request retransmission of a broadcast signal, DirectBand utilized an advanced error-correction strategy that allowed for reconstruction of messages even when sizable portions of the message were lost due to buildings, tunnels or other obstructions of the FM signal. Error correction was 1/2 rate interleaved trellis with time diversity, soft-decision decode. The DirectBand data rate was in excess of 12 kbit/s after ECC. Push network DirectBand was a push network: new content was delivered every two minutes. Users pre-selected the virtual channels that they were interested in. Receivers There were a variety of DirectBand receivers. All used a small (2.794 mm × 2.794 mm × 860 µm) radio receiver. Some designs added an ARM7-based processor. The initial DirectBand products were a series of data watches. These had mild success, but never met expectations and production of new watches was discontinued in 2008.
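A quick back-of-the-envelope check (a sketch, not from the source) shows that the two throughput figures quoted above, 12 kbit/s net per tower and "over 100 MB per day per city", are mutually consistent:

```python
# Net DirectBand data rate per tower after error correction, as quoted above.
bits_per_second = 12_000
seconds_per_day = 24 * 60 * 60

# Sustained daily throughput, in decimal megabytes (1 MB = 1,000,000 bytes).
bytes_per_day = bits_per_second * seconds_per_day // 8
megabytes_per_day = bytes_per_day / 1_000_000  # about 129.6 MB/day
```

At roughly 129.6 MB per day of continuous broadcast, the "over 100 MB" claim holds even after allowing for some framing overhead.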
Recently, several other applications have surfaced, the most visible being the traffic data/local info market, particularly to auto GPS sets for Garmin and Avis. This competes directly with older RDS-based services, which operate at a substantially lower data rate. Microsoft design DirectBand is a product of the Smart Personal Objects Technology (SPOT) team at Microsoft. System hardware was designed for Microsoft by SCA Data Systems of Santa Monica, California. MSN Direct is the consumer brand that Microsoft uses for devices that receive content from the DirectBand network. FM subcarrier usage RDS uses a portion of the FM station spectrum immediately above the stereo signal, centered at 57 kHz (the stereo pilot frequency). RDS extends between about 55 and 59 kHz. DirectBand is above RDS, extending from about 59 kHz to 75 kHz. Shutdown On October 26, 2009, Microsoft announced that MSN Direct service would end on January 1, 2012. Although this clearly indicated Microsoft's intent to cease usage of the service, it is not yet known whether th
https://en.wikipedia.org/wiki/Blitter%20object
A Bob (contraction of Blitter object) is a graphical element (GEL) used by the Amiga computer. Bobs are hardware sprite-like objects, movable on the screen with the help of the blitter coprocessor. The AmigaOS GEL system consists of VSprites, Bobs, AnimComps (animation components) and AnimObs (animation objects), each extending the preceding with additional functionality. While VSprites are a virtualization of hardware sprites Bobs are drawn into a playfield by the blitter, saving and restoring the background of the GEL as required. The Bob with the highest video priority is the last one to be drawn, which makes it appear to be in front of all other Bobs. In contrast to hardware sprites Bobs are not limited in size and number. Bobs require more processing power than sprites, because they require at least one DMA memory copy operation to draw them on the screen. Sometimes three distinct memory copy operations are needed: one to save the screen area where the Bob would be drawn, one to actually draw the Bob, and one later to restore the screen background when the Bob moves away. An AnimComp adds animation to a Bob and an AnimOb groups AnimComps together and assigns them velocity and acceleration. See also Original Amiga chipset References Rob Peck (1986). ROM Kernel Reference Manual: Libraries and Devices, Addison-Wesley, Amiga Amiga APIs Computer graphics Demo effects
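The three blitter operations described above (save the background, draw the Bob, restore the background later) can be sketched in Python with the screen modelled as a list of rows; this is an illustrative model of the save/draw/restore cycle, not actual Amiga API code, and the function names are hypothetical:

```python
def draw_bob(screen, bob, x, y):
    """Save the background under the Bob, then draw it; return the saved patch.

    Mimics the first two blits: transparent Bob pixels are None, so the
    background shows through where the Bob's mask is clear.
    """
    h, w = len(bob), len(bob[0])
    saved = [row[x:x + w] for row in screen[y:y + h]]   # blit 1: save background
    for dy in range(h):
        for dx in range(w):
            if bob[dy][dx] is not None:                 # blit 2: draw the Bob
                screen[y + dy][x + dx] = bob[dy][dx]
    return saved

def restore_background(screen, saved, x, y):
    # Blit 3: put the saved background back when the Bob moves away.
    for dy, row in enumerate(saved):
        screen[y + dy][x:x + len(row)] = row
```

Moving a Bob is then restore-then-redraw at the new position, which is why Bobs cost more than hardware sprites: every frame of motion needs these memory copies rather than a simple register update.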
https://en.wikipedia.org/wiki/Internet%20caf%C3%A9
An Internet café, also known as a cybercafé, is a café (or a convenience store or a fully dedicated Internet access business) that provides the use of computers with high bandwidth Internet access on the payment of a fee. Usage is generally charged by the minute or part of an hour. An Internet café will generally also offer refreshments or other services such as phone repair. Internet cafés are often hosted within a shop or other establishment. They are located worldwide, and many people use them when traveling to access webmail and instant messaging services to keep in touch with family and friends. Apart from travelers, in many developing countries Internet cafés are the primary form of Internet access for citizens as a shared-access model is more affordable than personal ownership of equipment and/or software. Internet cafés are a natural evolution of the traditional café. As Internet access rose, many pubs, bars and cafés added terminals and eventually Wi-Fi hotspots, eroding the distinction between the Internet café and normal cafés. Pre-internet online cafes The early history of public access online networking sites is largely unwritten and undocumented. There are many experiments that can lay claim to being precursors to internet cafés. In March 1988, the 'Electronic Café' was opened near Hongik University in Seoul, South Korea by Ahn Sang-soo (Professor of Hongik University) and Gum Nu-ri (Professor of Kookmin University). Two 16-bit computers were connected to online service networks through telephone lines. Offline meetings were held in the café, which served as a place that connected online and offline activities. In July 1991, the SFnet Coffeehouse Network was opened in San Francisco, United States by Wayne Gregori. Gregori installed coin-operated computer terminals in coffeehouses throughout the San Francisco Bay Area.
The terminals dialed into a 32-line Bulletin Board System that offered an array of electronic services including FIDOnet mail and, in 1992, Internet mail. Internet cafés The concept of a café with full Internet access (and the name Cybercafé) was invented in early 1994 by Ivan Pope. Commissioned to develop an Internet event for an arts weekend at the Institute of Contemporary Arts (ICA) in London, and inspired by the SFnet terminal-based cafes, Pope wrote a proposal outlining the concept of a café with Internet access for the event Seduced and Abandoned: The Body in the Virtual World. Over the weekend of March 12–13 in the theatre at the ICA, Pope ran a Cybercafe which consisted of multiple Apple Mac computers on cafe style tables with menus of available services. Around June 1994, The Binary Cafe, Canada's first Internet café, opened in Toronto, Ontario. Inspired partly by the ICA event and associated with an Internet provider startup, EasyNet, in the same building, a commercial Internet café called Cyberia opened on September 1, 1994, in London, England. The first public, commercial American Internet café was concei
https://en.wikipedia.org/wiki/Inside%20Macintosh
Inside Macintosh is the developer documentation published by Apple Computer, documenting the APIs and machine architecture of the Macintosh's classic Mac OS. Early editions The first Inside Macintosh documentation, for the Mac 128K, was distributed in two large binders with photocopied 3-hole-punched pages. Every few months, updated sections were distributed for insertion into the binders. Some of the original sections were written by very early members of the Macintosh group, including Chris Espinosa and Joanna Hoffman. In July 1982, Caroline Rose was hired to take over the software documentation, while Bradley Hacker focused on documenting the hardware. In addition to being the lead writer, Rose edited Volumes I–III and was the project supervisor. In 1984, additional writers joined the effort, including Robert Anders, Mark Metzler, Kate Withey, Steve Chernicoff, Andy Averill, and Brent Davis.   Due to numerous last-minute software changes, the official version to be published by Addison-Wesley was delayed. In the meantime, a $25 Promotional Edition (known as the "phone book edition" because it was published by phone book publisher Lakeside Press) became available in April 1985. Addison-Wesley published Volumes I–III in July 1985 in two formats: as three separate paperback books and as one hardcover book combining all three volumes. It was the official technical documentation for the original Mac 128K, the Mac 512K ("Fat Mac"), and Mac XL models. Reception Reactions to Volumes I–III were mixed. While many praised the documentation for its clarity, thoroughness, and consistency, others disagreed, particularly complaining about the lack of sample code. Among the positive feedback were the following: In the January 27, 1986, issue of InfoWorld, columnist John C. Dvorak wrote that the highlight of the Appleworld Conference, for many, was Addison-Wesley’s publication of Inside Macintosh. "It's $75 and worth every penny. 
It tells you everything you never wanted to know about the Macintosh—a must for any developer." Also in 1986, Inside Macintosh Volumes I–III won an Award of Achievement in the Society for Technical Communication's Northern California competition. In 1988, noted software developer and columnist Stan Krute wrote, "If Pulitzers had a technical writing category, Inside Mac would win a prize. [Its writers] have given us the most comprehensive insight into a complex cybernetic system yet seen." On the negative side: Bruce F. Webster in Byte of December 1985 described Inside Macintosh as "infamous, expensive, and obscure", but "for anyone wanting to do much with the Mac ... the only real [printed] source of information." He quoted Kathe Spracklen, developer of Sargon, as saying that the book "consists of 25 chapters, each of which requires that you understand the other 24 before reading it." A Mac GUI article by Dog Cow quotes Robert C. Platt as saying, "The best guide to the Mac's ROMs is Inside Macintosh. Unfortunately, In
https://en.wikipedia.org/wiki/Telerobotics
Telerobotics is the area of robotics concerned with the control of semi-autonomous robots from a distance, chiefly using television, wireless networks (like Wi-Fi, Bluetooth and the Deep Space Network) or tethered connections. It is a combination of two major subfields, which are teleoperation and telepresence. Teleoperation Teleoperation indicates operation of a machine at a distance. It is similar in meaning to the phrase "remote control" but is usually encountered in research, academic and technical environments. It is most commonly associated with robotics and mobile robots but can be applied to a whole range of circumstances in which a device or machine is operated by a person from a distance. Teleoperation is the most standard term, used both in research and technical communities, for referring to operation at a distance. This is opposed to "telepresence", which refers to the subset of telerobotic systems configured with an immersive interface such that the operator feels present in the remote environment, projecting his or her presence through the remote robot. One of the first telepresence systems that enabled operators to feel present in a remote environment through all of the primary senses (sight, sound, and touch) was the Virtual Fixtures system developed at US Air Force Research Laboratories in the early 1990s. The system enabled operators to perform dexterous tasks (inserting pegs into holes) remotely such that the operator would feel as if he or she was inserting the pegs when in fact it was a robot remotely performing the task. A telemanipulator (or teleoperator) is a device that is controlled remotely by a human operator. In simple cases the controlling operator's command actions correspond directly to actions in the device controlled, as for example in a radio-controlled model aircraft or a tethered deep submergence vehicle. 
Where communications delays make direct control impractical (such as a remote planetary rover), or it is desired to reduce operator workload (as in a remotely controlled spy or attack aircraft), the device will not be controlled directly, instead being commanded to follow a specified path. At increasing levels of sophistication the device may operate somewhat independently in matters such as obstacle avoidance, also commonly employed in planetary rovers. Devices designed to allow the operator to control a robot at a distance are sometimes called telecheric robotics. Two major components of telerobotics and telepresence are the visual and control applications. A remote camera provides a visual representation of the view from the robot. Placing the robotic camera in a perspective that allows intuitive control is a recent technique that although based in Science Fiction (Robert A. Heinlein's Waldo 1942) has not been fruitful as the speed, resolution and bandwidth have only recently been adequate to the task of being able to control the robot camera in a meaningful way. Using a head mounted display, the
https://en.wikipedia.org/wiki/Cacophony%20Society
The Cacophony Society is an American organization described on their website as "a randomly gathered network of free spirits united in the pursuit of experiences beyond the pale of mainstream society". It was started in 1986 by surviving members of the defunct Suicide Club of San Francisco. Cacophony has been described as an indirect culture jamming outgrowth of the Dada movement. One of its central concepts is the Trip to the Zone, or Zone Trip, inspired by the 1979 film Stalker by Andrei Tarkovsky. According to self-designated members of the Society, "you may already be a member". The anarchic nature of the Society means that membership is left open-ended and anyone may sponsor an event, though not every idea pitched garners attendance by members. Cacophony events often involve costumes and pranks in public places and sometimes going into places that are generally off limits to the public. Cacophonists have been known to regale Christmas shoppers with improvised Christmas carols while dressed as Santa Claus. San Francisco chapter Members of the Cacophony Society's first group also became the primary organizers of the annual Burning Man event after Cacophony member Michael Mikel attended its previous incarnation as an as-yet-unnamed beach party at Baker Beach in 1988 and publicized the 1989 event in the Cacophony Society newsletter. Cacophonist Kevin Evans conceived of Zone Trip #4 in 1990 and organized it with John Law and Michael Mikel, publicizing it in the newsletter as "A Bad Day at Black Rock". Larry Harvey and Jerry James were subsequently invited to bring their effigy along, after they were prevented from burning it on the beach by law enforcement. Other events created by the Society are: the Atomic Cafe, the Chinese New Year's Treasure Hunt, the picnic on the Golden Gate Bridge, driving an earthquake-damaged car on the closed Embarcadero Freeway to commemorate the 1989 Loma Prieta earthquake, the Brides of March, Urban Iditarod, and the Sewer Walk.
After a lull in activity in the San Francisco branch of the society in the late 1990s, the chapter's monthly newsletter Rough Draft, which listed events for the San Francisco Cacophony Society, ceased publication (172 issues were produced between 1986 and 2001). A group of subscribers to the practically defunct society's email discussion list then became active under the Cacophony Society aegis following a mock Pigeon Roast, proposed by Drunken Consumptive Panda and put on in San Francisco's Union Square in 2000 by a fictitious organization calling itself "Bay Area Rotisserie Friends". This new group of Cacophonists is occasionally referred to by its members as Cacophony 2.0 and emphasizes its chaotic, ebullient spirit with the motto "If you don't live it, it won't come out your [bull]horn." The Society's newsletter was briefly revived under the name 2econd Draft. In 2013 Kevin Evans, Carrie Galbraith and John Law co-authored Tales of the San Francisco Cacophony Society,
https://en.wikipedia.org/wiki/3B%20series%20computers
The 3B series computers are a line of minicomputers made between the late 1970s and 1993 by AT&T Computer Systems' Western Electric subsidiary, for use with the company's UNIX operating system. The line primarily consists of the models 3B20, 3B5, 3B15, 3B2, and 3B4000. The series is notable for controlling a series of electronic switching systems for telecommunication, for general computing purposes, and for serving as the historical software porting base for commercial UNIX. History The first 3B20D was installed in Fresno, California at Pacific Bell. Within two years, several hundred were in place throughout the Bell System. Some of the units came with "small, slow hard disks". The general purpose family of 3B computer systems includes the 3B2, 3B5, 3B15, 3B20S, and 3B4000. They run the AT&T UNIX operating system and were named after the successful 3B20D High Availability processor. In 1984, after regulatory constraints were lifted, AT&T introduced the 3B20D, 3B20S, 3B5, and 3B2 to the general computer market, a move that some commentators saw as an attempt to compete with IBM. In Europe, the 3B computers were distributed by the Italian firm Olivetti, in which AT&T had a minority shareholding. After AT&T bought NCR Corporation, effective January 1992, the computers were marketed through NCR sales channels. Having produced 70,000 units, the AT&T Oklahoma City plant stopped manufacturing 3B machines at the end of 1993; 3B20D units were the last manufactured. 3B high-availability processors The original series of 3B computers includes the models 3B20C, 3B20D, 3B21D, and 3B21E. These systems are 32-bit microprogrammed duplex (redundant) high availability processor units running a real-time operating system. They were first produced in the late 1970s at the WECo factory in Lisle, Illinois, for telecommunications applications including the 4ESS and 5ESS systems.
They use the Duplex Multi Environment Real Time (DMERT) operating system which was renamed UNIX-RTR (Real Time Reliable) in 1982. The Data Manipulation Unit (DMU) provides arithmetic and logic operations on 32-bit words using eight AMD 2901 4-bit-slice ALUs. The first 3B20D is called the Model 1. Each processor's control unit consists of two frames of circuit packs. The whole duplex system requires seven-foot frames of circuit packs plus at least one tape drive frame (most telephone companies at that time wrote billing data on magnetic tapes), and many washing machine-sized disk drives. For training and lab purposes, a 3B20D can be divided into two "half-duplex" systems. A 3B20S consists of most of the same hardware as a half-duplex but uses a completely different operating system. The 3B20C was briefly available as a high-availability fault tolerant multiprocessing general-purpose computer in the commercial market in 1984. The 3B20E was created to provide a cost-reduced 3B20D for small offices that did not expect such high availability. It consists of a virtual "emulated" 3B20D
https://en.wikipedia.org/wiki/Null%20device
In some operating systems, the null device is a device file that discards all data written to it but reports that the write operation succeeded. This device is called /dev/null on Unix and Unix-like systems, NUL: (see TOPS-20) or NUL on CP/M and DOS (internally \DEV\NUL), nul on OS/2 and newer Windows systems (internally \Device\Null on Windows NT), NIL: on Amiga operating systems, and NL: on OpenVMS. In Windows PowerShell, the equivalent is $null. It provides no data to any process that reads from it, yielding EOF immediately. In the IBM operating systems DOS/360 and OS/360 and their successors, such files would be assigned in JCL to DD DUMMY. In programmer jargon, especially Unix jargon, it may also be called the bit bucket or black hole. History According to the Berkeley UNIX man page, Version 4 Unix, which AT&T released in 1973, included a null device. Usage The null device is typically used for disposing of unwanted output streams of a process, or as a convenient empty file for input streams. This is usually done by redirection. The /dev/null device is a special file, not a directory, so one cannot move a whole file or directory into it with the Unix mv command. References in computer culture This entity is a common inspiration for technical jargon expressions and metaphors by Unix programmers, e.g. "please send complaints to /dev/null", "my mail got archived in /dev/null", and "redirect to /dev/null"—being jocular ways of saying, respectively: "don't bother sending complaints", "my mail was deleted", and "go away". The iPhone Dev Team commonly uses the phrase "send donations to /dev/null", meaning they do not accept donations. The fictitious personal name "Dave (or Devin) Null" is sometimes similarly used (e.g., "send complaints to Dave Null"). In 1996, Dev Null was an animated virtual reality character created by Leo Laporte for MSNBC's computer and technology TV series The Site.
Dev/null is also the name of a vampire hacker in the computer game Vampire: The Masquerade – Redemption. A 2002 advertisement for the Titanium PowerBook G4 reads: "The Titanium PowerBook G4. Sends other UNIX boxes to /dev/null." The null device is also a favorite subject of technical jokes, such as warning users that the system's /dev/null is already 98% full. The 1995 April Fool's issue of the German magazine c't reported on an enhanced /dev/null chip that would efficiently dispose of the incoming data by converting it to a flicker on an internal glowing LED. See also Filesystem Hierarchy Standard Memory hole rm (Unix) Standard streams Unix philosophy Write-only memory Device file Notes References CP/M technology Unix file system technology Device file Computer humor
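The redirection idioms described in the Usage section can be sketched in a POSIX shell; the file name scratch.txt here is purely illustrative:

```shell
# Discard a command's standard output: the write "succeeds", the data vanishes
echo "noisy progress output" > /dev/null

# Discard both stdout and stderr of a command group
{ echo "to stdout"; echo "to stderr" 1>&2; } > /dev/null 2>&1

# Reading from the null device yields EOF immediately, so this prints nothing
cat /dev/null

# A classic trick: empty an existing file by copying the null device over it
echo "old contents" > scratch.txt
cat /dev/null > scratch.txt
wc -c < scratch.txt   # file is now 0 bytes
```

As the article notes, /dev/null is a device file rather than a directory, which is why redirecting into it works but moving a file or directory "into" it with mv does not.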
https://en.wikipedia.org/wiki/Devnull
Devnull is the name of a computer worm for the Linux operating system, named after /dev/null, Unix's null device. The worm was found on 30 September 2002. Once the host has been compromised, the worm downloads and executes a shell script from a web server. This script downloads a gzipped executable file from the same address, then decompresses and runs it. The downloaded file appears to be an IRC client; it connects to different channels and waits for commands to process on the infected host. The worm then checks for the presence of the GCC compiler on the local system and, if found, creates a working directory. Next, it downloads a compressed file which, after decompression, yields two files: an ELF binary and a source script file. The latter is compiled into a second ELF binary. The executable scans for vulnerable hosts and uses the compiled program to exploit a known OpenSSL vulnerability. See also Linux malware External links F-Secure's Website: Linux/Devnull Computer worms Linux malware
https://en.wikipedia.org/wiki/Chamber%20of%20commerce
A chamber of commerce, or board of trade, is a form of business network: for example, a local organization of businesses whose goal is to further the interests of businesses. Business owners in towns and cities form these local societies to advocate on behalf of the business community. Local businesses are members, and they elect a board of directors or executive council to set policy for the chamber. The board or council then hires a president, CEO, or executive director, plus staffing appropriate to size, to run the organization. A chamber of commerce may be a voluntary or a mandatory association of business firms belonging to different trades and industries. They serve as spokespeople and representatives of a business community. They differ from country to country. History The first chamber of commerce was founded in 1599 in Marseille, France, as the "Chambre de Commerce". Another official chamber of commerce followed 65 years later, probably in Bruges, then part of the Spanish Netherlands. The Royal Barcelona Board of Trade was established in 1758. The world's oldest English-speaking chamber of commerce, and the oldest chamber of commerce in North America, is the Halifax Chamber of Commerce, founded in 1750. The Glasgow Chamber of Commerce was founded in 1783. However, Hull Chamber of Commerce is the UK's oldest, followed by those of Leeds and of Belfast in present-day Northern Ireland. As a non-governmental institution, a chamber of commerce has no direct role in the writing and passage of laws and regulations that affect businesses. It can, however, lobby in an attempt to get laws passed that are favorable to businesses. The United States Chamber of Commerce has a long history of anti-union lobbying and union-busting in the United States at the local and federal level. Characteristics Membership in an individual chamber can range from a few dozen to well over 800,000, as is the case with the Paris Île-de-France Regional Chamber of Commerce and Industry.
Some chamber organizations in China report even larger membership numbers. Chambers of commerce can range in scope from individual neighborhoods within a city or town up to an international chamber of commerce. In the United States, chambers do not operate in the same manner as the Better Business Bureau in that, while the BBB has the authority to bind its members under a formal operation doctrine (and, thus, can remove them if complaints arise regarding their services), the local chamber membership is either voluntary or required by law. Some chambers are partially funded by local government, others are non-profit, and some are a combination of the two. Chambers of commerce also can include economic development corporations or groups (though the latter can sometimes be a formal branch of a local government, the groups work together and may in some cases share office facilities) as well as tourism and visitor bureaus. Some chambers have joined state, national (such as the United State
https://en.wikipedia.org/wiki/John%20Walker%20%28programmer%29
John Walker is a computer programmer, author and co-founder of the computer-aided design software company Autodesk. He has more recently been recognized for his writing on his website Fourmilab. Early projects In 1974/1975, Walker wrote the ANIMAL software, which self-replicated on UNIVAC 1100 machines. It is considered one of the first computer viruses. Walker also founded the hardware integration manufacturing company Marinchip. Among other things, Marinchip pioneered the translation of numerous computer language compilers to Intel platforms. Autodesk In 1982, John Walker and 12 other programmers pooled US$59,000 to start Autodesk (AutoCAD), and began working on several computer applications. The first completed was AutoCAD, a software application for computer-aided design (CAD) and drafting. AutoCAD had begun life as Interact, a CAD program written by programmer Michael Riddle in a proprietary language. Walker and Riddle rewrote the program, and established a profit-sharing agreement for any product derived from InteractCAD. Walker subsequently paid Riddle US$10 million for all the rights. The company went public in 1985. By mid-1986, the company had grown to 255 employees with annual sales of over $40 million. That year, Walker resigned as chairman and president of the company, continuing to work as a programmer. In 1989, Walker's book, The Autodesk File, was published. It describes his experiences at Autodesk, based around internal documents (particularly email) of the company. Walker moved to Switzerland in 1991. By 1994, when he resigned from the company, it was the sixth-largest personal computer software company in the world, primarily from the sales of AutoCAD. Walker owned about $45 million of stock in Autodesk at the time. Fourmilab He publishes on his personal domain, "Fourmi Lab", designed to be a play on Fermilab and fourmi, French for "ant", one of his early interests.
On his Web site, Walker publishes about his personal projects, including a hardware random number generator called HotBits, along with software that he writes and freely distributes, such as his Earth and Moon viewer. Another notable book of his is The Hacker's Diet. The digital imprimatur Among other things, he is noted for a frequently cited article entitled The Digital Imprimatur: How big brother and big media can put the Internet genie back in the bottle, an article about Internet censorship written in 2003. It was published in the magazine Knowledge, Technology & Policy. In the article, Walker argues that there is increasing pressure limiting the ability of Internet users to voice their ideas, and he predicts further Internet censorship. Walker said that the most likely candidate to usher in what he calls "the digital imprimatur" is digital rights management, or DRM. In popular culture Walker's interest in artificial life prompted him to hire Rudy Rucker, a mathematician and science fiction author, for work on cellular automata software. Rucker later drew
https://en.wikipedia.org/wiki/Logitech
Logitech International S.A. is a Swiss company and a multinational manufacturer of computer peripherals and software, with global headquarters in Lausanne, Switzerland. The company has offices throughout Europe, Asia, Oceania, and the Americas, and is one of the world's leading manufacturers of input and interface devices for personal computers (PCs) and other digital products. It is a component of the flagship Swiss Market Index. The company develops and markets personal peripherals for PC navigation, video communication and collaboration, music and smart homes. This includes products like keyboards, mice, tablet accessories, headphones and headsets, webcams, Bluetooth speakers, universal remotes and more. Its name is derived from logiciel, the French word for software. History Logitech was founded in Apples, Vaud, Switzerland, in 1981, by Daniel "Bobo" Borel, Pierluigi Zappacosta, and former Olivetti engineer Giacomo Marini. Swiss-born Borel and Italian-born Zappacosta had met in California while taking electrical engineering classes in the late 1970s at Stanford University, under professors such as Ethernet inventor Robert Metcalfe. Returning to Europe, they began working on new ideas near Romanel-sur-Morges, Switzerland, and they brought in the Italian engineer Marini to round out the new company. Borel served as chairman of the board, focused on sales and manufacturing, and was chief executive officer (CEO) for most of the 1990s. Zappacosta served as president and, for a period, as CEO, and he oversaw research. He left Logitech in 1997 to lead Digital Persona, a biometrics company. The company founders first concentrated on creating word processing software for a large Swiss company, but the company canceled the project. Next, they turned to the computer mouse as an essential component of the graphical user interface used by a workstation requested by the Japanese company Ricoh.
Logitech's first mouse, the P4 model, was produced in 1982 in Switzerland, based on an opto-mechanical design by Swiss inventor Jean-Daniel Nicoud working at the École Polytechnique Fédérale de Lausanne (Federal Institute of Technology in Lausanne). One of Logitech's offices was at 165 University Avenue, Palo Alto, California, US, home to a number of noted technology startups and centrally located in Silicon Valley. In 1984, Logitech won a contract to supply Hewlett-Packard with computer mice in the role of original equipment manufacturer (OEM). The mice that Logitech supplied to HP were made in a new factory in Fremont, California, and they were branded HP: they did not display the Logitech name. In the early-to-mid-1980s, Logitech stopped making mice in Switzerland, instead opening factories in Cork, Ireland, and Hsinchu, Taiwan, in addition to the Fremont location. Logitech created the first wireless mouse in 1984, using infrared (IR) light to connect to the Metaphor Computer Systems workstation developed by David Liddle and Donald Massaro, former Xerox PAR
https://en.wikipedia.org/wiki/The%20Journeyman%20Project%20%28video%20game%29
The Journeyman Project is a time travel adventure computer game developed by Presto Studios. Gameplay The game features a first-person perspective. The protagonist sees a display, a rectangle-shaped visor (acting as a monocle for Agent 5). This user interface helps to reduce the movie size and maintain relatively high frame rates. Controls consist of four interface buttons located below the screen. They move Agent 5 forward and backward, and rotate Agent 5 left and right. The Journeyman Project was billed as an interactive movie adventure game, in which the player is presented with several clues and puzzles that must be solved in order to move on or finish the level. Items that the player finds can be helpful or harmful as the player attempts to explore the surroundings. The most important of these items are the seven bio-chips, which enhance the player-character's abilities in various ways. The game's user interface stores the bio-chips in a special "bio-chip panel", which serves as a "quick-menu" for activating and deactivating the various chips. Story The game takes place in the distant future, after the Earth has been united into a peaceful global community. A scientist has discovered the technology of time travel but, because of its dangerous nature, the prototype machine, "Pegasus", has been placed under government control, and further attempts at traveling through time or developing time travel technology are forbidden by law. The game begins as humanity welcomes the first alien delegation to visit the planet, and prepares to answer positively to an invitation to join the interplanetary "Symbiotry of Peaceful Beings".
During the induction ceremony, the government-operated Temporal Security Agency (TSA), which was established to enforce prohibitions on time travel and safeguard the timeline, detects three temporal disturbances that have altered history; the Agency mobilizes Agent 5 to correct the disruptions, which have altered the timeline so that the Symbiotry never extended its invitation. Upon retrieving a cache of unaltered historical data from the distant past, Agent 5 discovers that the anachronisms are related to Earth's first contact with the Symbiotry; ten years prior, the aliens had extended their offer, and planned to return in one decade to receive Earth's answer. An unknown party has altered the timeline to prevent contact with the Symbiotry, either through preventing them from reaching Earth or changing humanity's reaction to the aliens' arrival. The disruptions occurred during three key events in Earth's recent past: The conference of 2112 which led to peace and prosperity through the unification of Earth over opposing voices. The robot "Poseidon" is sent to launch a nuclear missile and detonate it above Ghorbistan so that fear prevents the countries from proceeding with the acceptance of the treaty. The first acknowledged contact with an alien ship in 2185 above Mars. The robot "Ares" is sent to accomplish two tasks: to destroy the M
https://en.wikipedia.org/wiki/HSC
HSC may stand for: Business Hughes Systique Corporation Halifax Shopping Centre Harmonized System Code Harsco Corporation Computing HSC50 (Hierarchical Storage Controller), using DEC Mass Storage Control Protocol Hughes Systique Corporation Medicine Health and Social Care in Northern Ireland Hematopoietic stem cell Hepatic stellate cell Hospital for Sick Children, Toronto, Canada Hydraulic Sinus Condensing, a technique used in dentistry to lift the sinus Hysteroscopy, a method for looking into the uterus via the cervix Organizations Harare Sports Club, sports venue in Zimbabwe Health Service Commissioner for England Health Sponsorship Council Heraldry Society of Canada Huntington Society of Canada Health and Safety Commission Horizon Scanning Centre (HSC) United States Homeland Security Council Schooling Certificates Higher School Certificate (New South Wales), Australia Higher School Certificate (Victoria), Australia Higher School Certificate (UK) Higher Secondary (School) Certificate (South Asian countries) Schools Hampden–Sydney College Hillfield Strathallan College, private school in Hamilton, Ontario, Canada Hsin Sheng College of Medical Care and Management, private junior college in Taoyuan City, Taiwan Related Hindu Students Council Other High-speed craft Houston Ship Channel Hyper Suprime-Cam, camera for Subaru Telescope Mauser HSc, pistol Shaoguan Guitou Airport, IATA code