https://en.wikipedia.org/wiki/Device
A device is usually a constructed tool. Device may also refer to:

Technology

Computing
Device, a colloquial term encompassing different types of computers, such as desktops, laptops, tablets, smartphones, etc.
Device file, an interface of a device driver
Peripheral, any device attached to a computer that expands its functionality

Warfare
Improvised explosive device (IED)
Nuclear weapon

Other uses in technology
Appliance, a device for a particular task
Electronic component
Gadget
Machine
Medical device

Arts, entertainment, and media

Music

Groups
Device (metal band), American industrial metal band active 2012–2014
Device (pop-rock band), American pop-rock trio from the mid 1980s

Albums
Device (Device album), 2013
Device (Eon album), 2006

Other uses in arts, entertainment, and media
Plot device, as in storytelling
Rhetorical device, a technique used in writing or speaking
The Device, a 2014 American science fiction horror film

Other uses
Dev1ce or Device, nicknames for Nicolai Reedtz, a Danish professional CS:GO player
Device, something that can be trademarked, such as a logotype (or "logo") in printing or in law
Devices, aspects of a military-award decoration in the United Kingdom
Personal device, another name for a heraldic badge

People with the surname Device
Alizon Device (died 1612), executed as one of the Pendle witches
Elizabeth Device (died 1612), executed as one of the Pendle witches
James Device (died 1612), executed as one of the Pendle witches
Jennet Device (), witness at the trial of the Pendle witches

See also
Devise
https://en.wikipedia.org/wiki/Saatchi%20%26%20Saatchi
Saatchi & Saatchi is a British multinational communications and advertising agency network with 114 offices in 76 countries and over 6,500 staff. It was founded in 1970 and is currently headquartered in London. The parent company of the agency group was known as Saatchi & Saatchi PLC from 1976 to 1994, was listed on the New York Stock Exchange until 2000 and, for a time, was a constituent of the FTSE 100 Index. In 2000, the group was acquired by the Publicis Groupe. In 2005 it went private.

History

Early years (1970–1975)
Saatchi & Saatchi was founded in London by brothers Maurice (now Lord Saatchi) and Charles Saatchi in 1970. Following stints starting as a copywriter at the New York City offices of Benton & Bowles in 1965, then at Collett Dickenson Pearce and John Collins & Partners, Charles Saatchi teamed up with art director Ross Cramer, and what would become Saatchi & Saatchi began in London in 1967 as the creative consultancy CramerSaatchi. The consultancy took on employees John Hegarty and Jeremy Sinclair and began to work directly for clients. It was Sinclair's "Pregnant Man" ad for the Health Education Council, featuring a man who appeared to be pregnant, which first attracted attention to the small agency. Charles's younger brother Maurice joined the business in 1970 after Cramer's departure, whereupon it was renamed "Saatchi and Saatchi" and developed into a full-service advertising agency. As a creative consultancy they had worked primarily on projects for agencies, but as a full-service agency they were able to quickly build a solid client base of their own; by 1975, this included blue-chip clients such as Associated Newspapers, British Leyland, Brutus Jeans, Cunard, Dunlop, National Magazines, and Nestlé. They also worked for the Labour Party.
In 1975, the Saatchi brothers set upon a ferocious course of business acquisitions. With the reverse takeover of the Garland Compton agency in 1975, they achieved a public company listing and were thereafter able to make rights issues to raise the capital required for their acquisition drive. Garland Compton was a long-established, conservative British agency, founded by Sidney Garland in 1927. In 1960, the US agency Compton Advertising bought a minority holding so that it could build its own international network, primarily to service its largest client, Procter & Gamble. Garland Compton was also a top-10 agency with big-spending clients including Bass, British Caledonian, Gillette, New Zealand Dairy Board, Procter & Gamble, Schweppes and United Biscuits. It had also established a regional network of agencies and, of course, had access to the global Compton network. With the merger, the key management positions were taken by Saatchi people, and Saatchi & Saatchi moved into Garland Compton's offices on
https://en.wikipedia.org/wiki/Programming%20Language%20for%20Business
Programming Language for Business or PL/B is a business-oriented programming language originally called DATABUS and designed by Datapoint in 1972 as an alternative to COBOL, because Datapoint's 8-bit computers could not fit COBOL into their limited memory and because COBOL did not at the time have facilities to deal with Datapoint's built-in keyboard and screen. A version of DATABUS became an ANSI standard, and the name PL/B came about when Datapoint chose not to release its trademark on the DATABUS name.

Functionality
Much like Java and .NET, PL/B programs are compiled into an intermediate byte-code, which is then interpreted by a runtime library. Because of this, many PL/B programs can run on DOS, Unix, Linux, and Windows operating systems. The PL/B development environments are influenced by Java and Visual Basic, and offer many of the same features found in those languages. PL/B (Databus) is actively used all over the world, and has several forums on the Internet dedicated to supporting software developers. Since its inception, PL/B has been enhanced and adapted to keep it modernized and able to access various data sources. It has a built-in database capability with ISAM and Associative Hashed Indexes, as well as ODBC, SQL, Oracle, sequential, random access, XML and JSON files. All the constructs of modern programming languages have been incrementally added to the language. PL/B also has the ability to access external routines through COM, DLLs and .NET assemblies. Full access to the .NET framework is built into many versions. Several implementations of the language are capable of running as an application server, like Citrix, and connecting to remote databases through a data manager.
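The compile-to-bytecode, interpret-at-runtime model described above is what gives PL/B programs their portability across operating systems. A toy sketch of the general model (the opcodes and stack machine here are invented for illustration and are not PL/B's actual bytecode):

```python
def compile_to_bytecode(source):
    """'Compile' lines like 'PUSH 2' / 'ADD' into (opcode, arg) tuples."""
    code = []
    for line in source.splitlines():
        op, *arg = line.split()
        code.append((op, int(arg[0]) if arg else None))
    return code

def interpret(bytecode):
    """Minimal stack-machine runtime: the same bytecode runs anywhere this
    interpreter runs, which is the portability property described above."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]

program = compile_to_bytecode("PUSH 2\nPUSH 3\nADD")
result = interpret(program)  # → 5
```

The runtime library shipped with a real PL/B implementation plays the role of `interpret` here, which is why one compiled program can run on DOS, Unix, Linux, and Windows.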
Source code example

IF (DF_EDIT[ITEM] = "PHYS")
    STATESAVE MYSTATE
    IF (C_F07B != 2)
        DISPLAY *SETSWALL 1:1:1:80: *BGCOLOR=2,*COLOR=15: *P49:1," 7-Find "
    ELSE
        DISPLAY *SETSWALL 1:1:1:80: *BGCOLOR=7,*COLOR=0: *P49:1," 7-Find "
    ENDIF
    STATEREST MYSTATE
    TRAP GET_PRO NORESET IF F7
ENDIF

IF (SHOW_FILTER AND THIS_FILTER AND C_CUSTNO <> "MAG")
    LOADMOD "filter"
    PACK PASS_ID WITH "QED ",QED_ID1,BLANKS
    MOVE " FILTER DISPLAY (F6) " TO PASS_DESC
    SET C_BIGFLT
    CALL RUN_FILT USING PASS_ID,PASS_DESC,"432"
    UNLOAD "filter"
    CLEAR THIS_FILTER
ENDIF

References

External links
Sunbelt implementation of PL/B
ANSI PL/B Standards Committee
MMCC PL/B programming notebook
DB/C DX, DATABUS, and PL/B Overview
Databus Simplified User Guide
https://en.wikipedia.org/wiki/PDO
PDO can refer to:

Chemistry
1,3-Propanediol, an industrial chemical
Palladium(II) oxide (PdO)
Polydioxanone, a synthetic polymer

Computing
PHP Data Objects, a PHP extension that can be used as a database abstraction layer
Portable Distributed Objects, a version of Cocoa's Distributed Objects for remote use
Process Data Object, a type of protocol frame in some fieldbuses, for instance, CANopen

Entertainment
Panzer Dragoon Orta, a 3D shooter created by Smilebit on the Xbox
Perfect Dark Zero, a 2005 video game
Phoenix Dynasty Online, a 2007 fantasy MMORPG

Other uses
ISO 639:pdo or Padoe language, an Austronesian language spoken in South Sulawesi
Pacific decadal oscillation, a pattern of climate variation
Padre Aldamiz International Airport (Ident: PDO)
Petroleum Development Oman, an Omani oil company and the part of the Omani capital region where the company is located
Philips and Dupont Optical, a CD manufacturer that was associated with the Compact Disc bronzing issue
Protected Designation of Origin, the name of an area that is used as a designation for an agricultural product or foodstuff
Pseudo-differential operator, a concept in mathematical analysis
An advanced statistic in ice hockey

See also
Perfect Dark Zero (PD0), a 2005 videogame in the Perfect Dark (Joanna Dark) series
Palladium catalysts of the form "Pd(0)"
https://en.wikipedia.org/wiki/SHODAN
SHODAN (Sentient Hyper-Optimized Data Access Network) is a fictional artificial intelligence and the main antagonist of the cyberpunk-horror themed video games System Shock, System Shock 2, and the 2023 remake of System Shock. SHODAN has a god complex and a malevolent demeanor. Created as the AI for TriOptimum Corporation's Citadel Station, SHODAN is hacked, causing her to go rogue, take control of the station, and turn the staff into mutants and cyborgs. Voiced by Terri Brosius, SHODAN is known for her chaotic, discordant voice and manipulative nature. The character has been well received by gaming media, often considered one of the top video game villains and one of the best female characters in gaming. SHODAN has been compared to other famous AI characters, like GLaDOS from Portal, and praised for her unforgettable personality and omnipresent, threatening nature.

Character design
SHODAN is an artificial intelligence whose moral restraints were removed from her programming by a hacker so that Edward Diego, station chief of Citadel Station, on which SHODAN was installed, could delete compromising files regarding illegal experiments and his corruption. She is a megalomaniac with a god complex and sees humans as little better than insects, something which she constantly reminds the player of. Her words are accompanied by stuttering, fluctuating voice pitch, shifts of timbre, and the presence of three voices speaking the same words, with the constituent voices alternately lagging behind and leading ahead in different patterns, as well as computer glitches resembling a sound card malfunction. The disc version refers to SHODAN as either an 'it' or a 'he', while the later CD version uses 'she'. On screens, SHODAN manifests herself as an eerie-looking green and/or grey female cybernetic face that usually wears a malevolent expression, and speaks with a chaotic, discordant voice.
She is voiced by former Tribe keyboardist and vocalist Terri Brosius, the wife of System Shock 2's sound editor, Eric Brosius, who distorted the samples to provide the distinctive SHODAN effect. In the cyberspace of System Shock, she is initially represented as an inverted blue-grey cone, reminiscent of the Master Control Program from the 1982 Disney film Tron. After she has been hacked, the cone turns red, the surface becomes covered in rough metallic material and four "tentacles" or "claws" grow from the top, with her actual face starting to form above that.

Appearances
SHODAN was created on Earth to serve as the artificial intelligence of the TriOptimum Corporation's research and mining space station Citadel Station, which orbits Saturn. She was hacked by the game's protagonist (at the behest of the corrupt corporate Vice President Edward Diego, in exchange for amnesty and a military-grade neural implant) and, to access vital information about the TriOptimum Corporation, her ethical restrictions were removed, starting a process that eventually resulted in the AI going rogue,
https://en.wikipedia.org/wiki/Contacts%20%28Apple%29
Contacts is a computerized address book included with the Apple operating systems iOS, iPadOS, watchOS and macOS (previously Mac OS X and OS X). It includes various cloud synchronization capabilities and integrates with other Apple applications and features, including iMessage, FaceTime and the iCloud service (and previously its predecessor MobileMe).

History
An application known as Address Book was included with Mac OS X from its release in 2001 and in preceding beta versions. Address Book was rewritten for Mac OS X Jaguar (2002) and as of 2020 has remained in roughly the same form ever since. The iPhone also included contacts storage from its release, which starting from iPhone OS 2 (2008) was also broken out into a standalone application. In 2010, the iPad with iOS 3.2 introduced a new two-pane contacts app, featuring the skeuomorphic design style popular with Apple around this time under the leadership of Scott Forstall. OS X Lion (2011) featured a redesigned Address Book application in the style of the iPad Contacts app, also in a two-pane design. In 2012, with OS X Mountain Lion, it returned to a three-pane design and changed names to match iOS. The following year, both versions of Contacts switched with their parent operating systems to a flatter design style, a change attributed to Forstall's departure from Apple in the autumn of 2012: in 2013, iOS Contacts adopted the new UI along with the whole of iOS 7, while in OS X Mavericks the skeuomorphic design was removed, leaving a basic UI. With OS X Yosemite (2014) the OS X Contacts app switched along with the rest of the operating system to the iOS 7-style UI. In 2021, Apple introduced Contacts to Apple Watch in watchOS 8.
Features
Exports and imports cards in vCard 3.0 format
Imports cards from LDIF, tab-delimited, and comma-separated files
C and Objective-C API to interface with other applications
Prints labels and envelopes, mailing lists, pocket address books
Can configure page setup and paper size before printing
One-click automatic look up for duplicate entries
Change of address notification
Contact groups
Smart groups based on Spotlight
Look up addresses on Apple Maps
Auto-merge when importing vCards
Customize fields and categories
Automatic formatting of phone numbers
Synchronizes with Microsoft Exchange Server
Synchronizes with Yahoo! Address Book
Synchronizes with Google Contact Sync
Speech recognition searching
Capability to query an LDAP database containing person information
Plugin interface allowing third-party developers to add functionality to the program

Integration with macOS
Integration with Mail, Calendar, Messages, FaceTime, Fax, Safari, iPhone
iSync compatibility to sync contacts to phones, PDAs, iPods, and other Macs
Contacts are indexed by Spotlight
Address Book stores previous recipient addresses used by Mail
URLs in Address Book cards appear in Safari's Address Book bookmarks
Buddies in iChat can be associated with Address Book cards
Birthdays saved in A
https://en.wikipedia.org/wiki/Multiplayer%20video%20game
A multiplayer video game is a video game in which more than one person can play in the same game environment at the same time, either locally on the same computing system (couch co-op), on different computing systems via a local area network, or via a wide area network, most commonly the Internet (e.g. World of Warcraft, Call of Duty, DayZ). Multiplayer games usually require players to share a single game system or use networking technology to play together over a greater distance; players may compete against one or more human contestants, work cooperatively with a human partner to achieve a common goal, or supervise other players' activity. Because multiplayer games allow players to interact with other individuals, they provide an element of social communication absent from single-player games.

History

Non-networked
Some of the earliest video games were two-player games, including early sports games (such as 1958's Tennis for Two and 1972's Pong), early shooter games such as Spacewar! (1962) and early racing video games such as Astro Race (1973). The first examples of multiplayer real-time games were developed on the PLATO system around 1973. Multi-user games developed on this system included 1973's Empire and 1974's Spasim; the latter was an early first-person shooter. Other early video games included turn-based multiplayer modes, popular in tabletop arcade machines. In such games, play is alternated at some point (often after the loss of a life). All players' scores are often displayed onscreen so players can see their relative standing. Danielle Bunten Berry created some of the first multiplayer video games, such as her debut, Wheeler Dealers (1978), and her most notable work, M.U.L.E. (1983). Gauntlet (1985) and Quartet (1986) introduced co-operative 4-player gaming to the arcades. The games had broader consoles to allow for four sets of controls.
Networked
Ken Wasserman and Tim Stryker identified three factors which make networked computer games appealing:
Multiple humans competing with each other instead of a computer
Incomplete information resulting in suspense and risk-taking
Real-time play requiring quick reaction

John G. Kemeny wrote in 1972 that software running on the Dartmouth Time Sharing System (DTSS) had recently gained the ability to support multiple simultaneous users, and that games were the first use of the functionality. DTSS's popular American football game, he said, now supported head-to-head play by two humans. The first large-scale serial sessions using a single computer were STAR (based on Star Trek), OCEAN (a battle using ships, submarines and helicopters, with players divided between two combating cities) and 1975's CAVE (based on Dungeons & Dragons), created by Christopher Caldwell (with artwork and suggestions by Roger Long and assembly coding by Robert Kenney) on the University of New Hampshire's DECsystem-1090. The university's computer system had hundreds of terminals, connected (via serial lines) thro
https://en.wikipedia.org/wiki/Layer%202%20Tunneling%20Protocol
In computer networking, Layer 2 Tunneling Protocol (L2TP) is a tunneling protocol used to support virtual private networks (VPNs) or as part of the delivery of services by ISPs. It uses encryption ('hiding') only for its own control messages (using an optional pre-shared secret), and does not provide any encryption or confidentiality of content by itself. Rather, it provides a tunnel for Layer 2 (which may be encrypted), and the tunnel itself may be passed over a Layer 3 encryption protocol such as IPsec.

History
Published in August 1999 as proposed standard RFC 2661, L2TP has its origins primarily in two older tunneling protocols for point-to-point communication: Cisco's Layer 2 Forwarding Protocol (L2F) and Microsoft's Point-to-Point Tunneling Protocol (PPTP). A new version of this protocol, L2TPv3, appeared as proposed standard RFC 3931 in 2005. L2TPv3 provides additional security features, improved encapsulation, and the ability to carry data links other than simply Point-to-Point Protocol (PPP) over an IP network (for example: Frame Relay, Ethernet, ATM, etc.).

Description
The entire L2TP packet, including payload and L2TP header, is sent within a User Datagram Protocol (UDP) datagram. A virtue of transmission over UDP (rather than TCP) is that it avoids the "TCP meltdown problem". It is common to carry PPP sessions within an L2TP tunnel. L2TP does not provide confidentiality or strong authentication by itself. IPsec is often used to secure L2TP packets by providing confidentiality, authentication and integrity. The combination of these two protocols is generally known as L2TP/IPsec (discussed below). The two endpoints of an L2TP tunnel are called the L2TP access concentrator (LAC) and the L2TP network server (LNS). The LNS waits for new tunnels. Once a tunnel is established, the network traffic between the peers is bidirectional. To be useful for networking, higher-level protocols are then run through the L2TP tunnel.
To facilitate this, an L2TP session is established within the tunnel for each higher-level protocol such as PPP. Either the LAC or LNS may initiate sessions. The traffic for each session is isolated by L2TP, so it is possible to set up multiple virtual networks across a single tunnel. The packets exchanged within an L2TP tunnel are categorized as either control packets or data packets. L2TP provides reliability features for the control packets, but no reliability for data packets. Reliability, if desired, must be provided by the nested protocols running within each session of the L2TP tunnel. L2TP allows the creation of a virtual private dialup network (VPDN) to connect a remote client to its corporate network by using a shared infrastructure, which could be the Internet or a service provider's network.

Tunneling models
An L2TP tunnel can extend across an entire PPP session or only across one segment of a two-segment session. This can be represented by four different tunneling models, namely: voluntary tunnel compulsor
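As a concrete sketch of the encapsulation described above, the following packs the fixed 12-byte header of an L2TPv2 control message as laid out in RFC 2661 (the function name and field values are illustrative; a real implementation would also build the AVP payload and send the result inside a UDP datagram to port 1701):

```python
import struct

# L2TPv2 header flag bits (RFC 2661, section 3.1)
T_BIT = 0x8000    # set for control messages, clear for data messages
L_BIT = 0x4000    # Length field is present (required in control messages)
S_BIT = 0x0800    # Ns/Nr sequence fields are present (required in control messages)
VERSION = 0x0002  # protocol version 2

def control_header(tunnel_id, session_id, ns, nr, payload_len=0):
    """Pack the 12-byte L2TPv2 control-message header in network byte order."""
    flags = T_BIT | L_BIT | S_BIT | VERSION
    length = 12 + payload_len  # header plus any AVPs that follow
    return struct.pack("!6H", flags, length, tunnel_id, session_id, ns, nr)

# A ZLB (zero-length body) acknowledgement is just this header with no AVPs.
zlb = control_header(tunnel_id=5, session_id=0, ns=1, nr=2)
```

The Ns/Nr fields carry the sequence numbers that give control packets their reliability; data messages omit them (and may omit Length), which is why data packets get no reliability from L2TP itself.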
https://en.wikipedia.org/wiki/Maude%20%28TV%20series%29
Maude is an American sitcom television series that was originally broadcast on the CBS network from September 12, 1972, until April 22, 1978. Maude stars Bea Arthur as Maude Findlay, an outspoken, middle-aged, politically liberal woman living in suburban Tuckahoe, New York, with her fourth husband, household appliance store owner Walter Findlay (Bill Macy). Maude embraces the tenets of women's liberation, always votes for Democratic Party candidates, and advocates for civil rights and racial and gender equality. However, her overbearing and sometimes domineering personality often gets her into trouble when speaking about these issues. The show was the first spin-off of All in the Family, on which Arthur had made two appearances as Maude, Edith Bunker's favorite cousin. Like All in the Family, Maude was a sitcom with topical storylines created by producers Norman Lear and Bud Yorkin. Unusually for an American sitcom, several episodes (such as "Maude's Night Out" and "The Convention") featured only the characters of Maude and her husband Walter, in what amounted to half-hour "two-hander" teleplays. In the season-four episode "The Analyst" (sometimes referred to as "Maude Bares Her Soul"), Arthur as Maude, speaking to an unseen psychiatrist, was the sole actor on screen for the entire episode. The show's theme song, "And Then There's Maude", was written by Alan and Marilyn Bergman and Dave Grusin, and performed by Donny Hathaway.

Characters
Maude first appears in two season-two episodes of All in the Family: the first in December 1971 as a visitor to the Bunker home, and the second, a backdoor pilot setting up the premise of the Maude series, in March 1972. She is Edith Bunker's (Jean Stapleton) favorite cousin and has been married four times. Her first husband, Barney, died shortly after their marriage; she divorced the next two, Albert and Chester.
Albert was never portrayed on screen, but the episode "Poor Albert" revolved around his death, while second former husband Chester appeared once on the show (played by Martin Balsam). Her fourth (and current) husband, Walter Findlay (played by Bill Macy), owns an appliance store called Findlay's Friendly Appliances. Maude and Walter met just before the 1968 presidential election. Maude sometimes gets in the last word during their many arguments with her hallmark catchphrase, "God'll get you for that, Walter", which came directly from Bea Arthur. Maude's deep, raspy voice is also an occasional comic foil whenever she answers the phone, as when she explains in one episode, "No, this is not Mr. Findlay; this is Mrs. Findlay! Mr. Findlay has a much higher voice." Maude's daughter, Carol Traynor (played by Adrienne Barbeau; in the All in the Family pilot episode the character was played by Marcia Rodd), is also divorced and, like Maude, has one child. Carol and her son, Phillip (played by Brian Morrison in seasons 1–5 and by Kraig Metzinger in the sixth), live with the Findlays. Though single, Carol mainta
https://en.wikipedia.org/wiki/Nielsen
Nielsen may refer to:

Business
Nielsen Gallery, an American commercial art gallery
Nielsen Holdings, global information, data, and measurement company
Nielsen Corporation, a marketing research firm
Nielsen Audio, formerly Arbitron, which measures radio listenership
Nielsen Broadcast Data Systems, a service also known as BDS that tracks monitored radio, television, and internet airplay of songs
Nielsen Media Research, the company that creates the Nielsen ratings
Nielsen ratings, a rating system used to gauge audience measurement of television programming habits in the United States
Nielsen Norman Group, a computer user interface and user experience consulting firm

Other uses
Nielsen (surname), including a list of people
Nielsen (crater), a lunar impact crater on the Oceanus Procellarum
Nielsen–Olesen vortex, a point-like object localized in two spatial dimensions or a classical solution of field theory with the same property
Nielsen fixed-point theorem
Nielsen Fjord
Nielsen Glacier
Nielsen Park She-Oak

See also
Neilsen (disambiguation)
Neilson (disambiguation)
Nielson
Nilsen
Nilsson
https://en.wikipedia.org/wiki/Hans%20Reiser
Hans Thomas Reiser (born December 19, 1963) is an American computer programmer, entrepreneur, and convicted murderer. In April 2008, Reiser was convicted of the first-degree murder of his wife, Nina Reiser, who disappeared in September 2006. He subsequently pleaded guilty to a reduced charge of second-degree murder, as part of a settlement agreement that included disclosing the location of Nina Reiser's body, which he revealed to be in a shallow grave near the couple's home. Prior to his incarceration, Reiser created the ReiserFS computer file system, which may be used by the Linux kernel but is now scheduled for removal in 2025, as well as its attempted successor, Reiser4. In 2004, he founded Namesys, a corporation meant to coordinate the development of both file systems.

Childhood, education, and career
Hans Thomas Reiser was born in Oakland, California, to Ramon and Beverly (née Kleiber) Reiser and grew up in the same city. He dropped out of junior high school when he was 13 because of his disdain for what he considered an overly rigid, conventional schooling system, and because he was constantly ridiculed and bullied by his peers. Reiser has stated in interviews that, at the age of 15, he was accepted into the University of California, Berkeley. Reiser attended the university off and on until he received a BS in computer science in 1992 at age 28. Although Reiser preferred higher education, he did not pursue a Ph.D. for the same reasons that led him to drop out of junior high school. He worked part- to full-time in the computer field while founding the California-based software company Namesys. Before Namesys, Reiser was employed at Synopsys, IBM Research, Premenos Corp., and Accurate Information Systems.

Namesys and ReiserFS
Reiser and his company Namesys developed the journaled computer file systems ReiserFS and Reiser4.
ReiserFS has been available in the Linux operating system since version 2.4.1 and has, at times, been the default filesystem on several Linux distributions, including, until 2006, Novell's SUSE Linux Enterprise. Following Reiser's 2006 arrest on suspicion of murder, people in the free software community expressed concern over the future of Reiser's newer filesystem, Reiser4. Jonathan Corbet, editor of LWN.net, argued that the immaturity of Reiser4's feature set and Reiser's long-running combative relationship with the community meant that the filesystem's future had been limited in any event. Shortly after Reiser's arrest, the employees of Namesys stated that they would continue to work, that the arrest had no immediate effect on the rate of the software's development, and that, if the case dragged on, they would seek solutions to ensure the long-term future of the company. On December 21, 2006, Reiser announced that he was selling the company to raise money for his increasing legal fees. According to an interview with Namesys employee Edward Shishkin, as of January 2008, the commercial activity of the c
https://en.wikipedia.org/wiki/Software%20protection%20dongle
A software protection dongle (commonly known as a dongle or key) is an electronic copy protection and content protection device. When connected to a computer or other electronics, it unlocks software functionality or decodes content. The hardware key is programmed with a product key or other cryptographic protection mechanism and functions via an electrical connector to an external bus of the computer or appliance. In software protection, dongles are two-interface security tokens with transient data flow, using pull communication to read security data from the dongle. In the absence of these dongles, certain software may run only in a restricted mode, or not at all. In addition to software protection, dongles can enable functions in electronic devices, such as receiving and processing encoded video streams on television sets.

Etymology
The Merriam-Webster dictionary states that the "First known use of dongle" was in 1981 and that the etymology was "perhaps alteration of dangle." Dongles rapidly evolved into active devices that contained a serial transceiver (UART) and even a microprocessor to handle transactions with the host. Later versions adopted the USB interface, which became the preferred choice over the serial or parallel interface. A 1992 advertisement for Rainbow Technologies claimed the word dongle was derived from the name "Don Gall". Though untrue, this has given rise to an urban myth.

Usage
Efforts to introduce dongle copy-protection in the mainstream software market have met stiff resistance from users. Such copy-protection is more typically used with very expensive packages and vertical-market software such as CAD/CAM software, cellphone flasher/JTAG debugger software, MICROS Systems hospitality and special retail software, digital audio workstation applications, and some translation memory packages.
In cases such as prepress and printing software, the dongle is encoded with a specific, per-user license key, which enables particular features in the target application. This is a form of tightly controlled licensing, which allows the vendor to engage in vendor lock-in and charge more than it would otherwise for the product. An example is the way Kodak licenses Prinergy to customers: When a computer-to-plate output device is sold to a customer, Prinergy's own license cost is provided separately to the customer, and the base price contains little more than the required licenses to output work to the device. USB dongles are also a big part of Steinberg's audio production and editing systems, such as Cubase, WaveLab, Hypersonic, HALion, and others. The dongle used by Steinberg's products is also known as a Steinberg Key. The Steinberg Key can be purchased separately from its counterpart applications and generally comes bundled with the "Syncrosoft License Control Center" application, which is cross-platform compatible with both Mac OS X and Windows. Some software developers use traditional USB flash drives as software licens
https://en.wikipedia.org/wiki/Queer%20Eye%20%282003%20TV%20series%29
Queer Eye is an American reality television series that premiered on the Bravo network in July 2003, initially broadcast as Queer Eye for the Straight Guy. The series was created by executive producers David Collins and Michael Williams along with David Metzler through their company, Scout Productions. Each episode features a team of gay professionals in the fields of fashion, personal grooming, interior design, entertaining, and culture collectively known as the "Fab Five" performing a makeover (in the parlance of the show, a "make-better"): revamping wardrobe, redecorating, and offering lifestyle advice. Queer Eye for the Straight Guy quickly became a surprise success, winning an Emmy Award for Outstanding Reality Program in 2004, with subsequent merchandising, international franchising of the concept, and a woman-oriented spin-off, Queer Eye for the Straight Girl. The series name was abbreviated to Queer Eye at the beginning of its third season to include making over individuals regardless of gender or sexual orientation. Queer Eye ended production in June 2006 and the final episode aired October 30, 2007. During September 2008, the Fine Living Network briefly aired Queer Eye in syndication. The series was also run again by the CBS-affiliated Twist network in 2023. Netflix revived the series in 2018 with a new Fab Five. 
"Fab Five" experts Ted Allen: "Food and Wine Connoisseur", expert on alcohol, beverages, food preparation and presentation Kyan Douglas: "Grooming Guru", expert on hair, grooming, personal hygiene, and makeup Thom Filicia: "Design Doctor", expert on interior design and home organization Carson Kressley: "Fashion Savant", expert on clothing, fashion and personal styling Jai Rodriguez: "Culture Vulture", expert on popular culture, relationships and social interaction Episodes Production Producers Collins and Metzler were given approval by Bravo to develop Queer Eye after the ratings success the network experienced when it counterprogrammed a marathon of its 2002 series Gay Weddings at the same time as Super Bowl XXXVII in January 2003. The pilot episode was filmed in Boston, Massachusetts in June 2002. Of the eventual Fab Five, only Kressley and Allen appeared. The culture, design and grooming roles were filled by James Hannaham, Charles Daboub Jr., and Sam Spector, respectively. The pilot was delivered to Bravo in September 2002, and was well received in audience testing. Soon thereafter, NBC purchased Bravo and ordered 12 episodes of the series. NBC promoted the show extensively, including billboard campaigns and print advertisements in national magazines. Kyan Douglas and Thom Filicia joined the show for these episodes, along with Blair Boone in the role of "culture guy." Boone filmed two episodes (which were broadcast as the second and third episodes and for which he was credited as a "guest culture expert") but was replaced by Rodriguez beginning with production of the third episode. Each episo
https://en.wikipedia.org/wiki/Resource%20Directory%20Description%20Language
In computing, Resource Directory Description Language (RDDL) is an extension of XHTML Basic 1.0. An RDDL document, called a Resource Directory, provides a package of information about some target. The targets which RDDL was designed to describe are XML namespaces. The specification for RDDL has no official standing and has not been considered nor approved by any organization (e.g., W3C). RDDL is designed to allow both human readers and software robots to find any sort of resource associated with a particular namespace. Instead of putting one thing at the end of a namespace URI, RDDL puts a document there that lists all the machine-processable documents that might be available, including: Document Type Definitions (DTD) XML schemas in a variety of languages (including RELAX, Schematron, W3C XML Schema, TREX, and others) Cascading Style Sheets, XSLT, and other style sheet specifications Specification documents rddl:resource An RDDL document identifies each related resource by a resource element in the http://www.rddl.org/ namespace, which is customarily mapped to the rddl prefix. This element is a simple XLink (that is, it has an xlink:type attribute with the value simple) and its xlink:href attribute points to the related resource. Furthermore, the xlink:role attribute identifies the nature of the related resource and the optional xlink:arcrole attribute identifies the purpose of the related resource. An optional xlink:title attribute can provide a brief description of the purpose of the link. External links RDDL home page Cover Pages: Resource Directory Description Language (RDDL) What does a namespace URI locate? HTML
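The rddl:resource element described above is easy to process mechanically. Below is a minimal sketch of extracting the XLink attributes from a Resource Directory; the document fragment, its attribute values, and the helper name list_resources are all hypothetical illustrations, not part of any real namespace document or RDDL tooling.

```python
import xml.etree.ElementTree as ET

RDDL_NS = "http://www.rddl.org/"
XLINK_NS = "http://www.w3.org/1999/xlink"

# A made-up Resource Directory fragment for illustration only.
DOC = """<div xmlns:rddl="http://www.rddl.org/"
             xmlns:xlink="http://www.w3.org/1999/xlink">
  <rddl:resource xlink:type="simple"
                 xlink:href="example.xsd"
                 xlink:role="http://www.w3.org/2001/XMLSchema"
                 xlink:arcrole="http://www.rddl.org/purposes#schema-validation"
                 xlink:title="W3C XML Schema for the vocabulary"/>
</div>"""

def list_resources(xml_text):
    """Collect (href, nature, purpose, title) for each rddl:resource element."""
    root = ET.fromstring(xml_text)
    out = []
    for res in root.iter(f"{{{RDDL_NS}}}resource"):
        out.append({
            "href": res.get(f"{{{XLINK_NS}}}href"),        # location of the resource
            "nature": res.get(f"{{{XLINK_NS}}}role"),      # what kind of resource it is
            "purpose": res.get(f"{{{XLINK_NS}}}arcrole"),  # why it is linked
            "title": res.get(f"{{{XLINK_NS}}}title"),      # human-readable description
        })
    return out

for r in list_resources(DOC):
    print(r["href"], "->", r["nature"])
```

Because RDDL documents are also valid XHTML, a human reader sees an ordinary web page while a program such as this one reads the same file for the machine-processable links.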
https://en.wikipedia.org/wiki/Viral%20marketing
Viral marketing is a business strategy that uses existing social networks to promote a product, mainly on various social media platforms. Its name refers to how consumers spread information about a product to other people, much in the same way that a virus spreads from one person to another. It can be delivered by word of mouth, or enhanced by the network effects of the Internet and mobile networks. The concept is often misused or misunderstood, as people apply it to any sufficiently successful story without considering what actually makes it "viral". Viral advertising is personal and, while coming from an identified sponsor, it does not mean businesses pay for its distribution. Most of the well-known viral ads circulating online are ads paid for by a sponsor company, launched either on the company's own platform (its web page or social media profile) or on social media websites such as YouTube. Consumers receive the page link from a social media network or copy the entire ad from a website and pass it along through e-mail or by posting it on a blog, web page or social media profile. Viral marketing may take the form of video clips, interactive Flash games, advergames, ebooks, brandable software, images, text messages, email messages, or web pages. The most commonly utilized transmission vehicles for viral messages include pass-along based, incentive based, trendy based, and undercover based. However, the creative nature of viral marketing enables an "endless amount of potential forms and vehicles the messages can utilize for transmission", including mobile devices. The ultimate goal of marketers interested in creating successful viral marketing programs is to create viral messages that appeal to individuals with high social networking potential (SNP) and that have a high probability of being presented and spread by these individuals and their competitors in their communications with others in a short period.
The term "viral marketing" has also been used pejoratively to refer to stealth marketing campaigns—marketing strategies that advertise a product to people without them knowing they are being marketed to. History The emergence of "viral marketing", as an approach to advertisement, has been tied to the popularization of the notion that ideas spread like viruses. The field that developed around this notion, memetics, peaked in popularity in the 1990s. As this then began to influence marketing gurus, it took on a life of its own in that new context. The brief career of Australian pop singer Marcus Montana is largely remembered as an early example of viral marketing. In early 1989, thousands of posters declaring "Marcus is Coming" were placed around Sydney, generating discussion and interest within the media and the community about the meaning of the mysterious advertisements. The campaign successfully made Montana's musical debut a talking point, but his subsequent music career was a failure. The term viral strategy was first used in marketing in 1995, in a pr
https://en.wikipedia.org/wiki/TOM%20Group
TOM Group Limited () is a technology and media company listed on the Main Board of the Stock Exchange of Hong Kong. TOM Group has technology operations in social networking and the mobile Internet, and investments in the e-commerce, fintech and advanced data analytics sectors. History The TOM Group was founded in October 1999 as a joint venture between the Cheung Kong–Hutchison Group, Solina Chau and other investors. In March 2000, it was listed on the Growth Enterprise Market (ticker symbol: 8001). The Group subsequently transferred its listing to the Main Board of the Stock Exchange of Hong Kong (ticker symbol: 2383). Operations Headquartered in Hong Kong, the Group has regional offices in Beijing and Shanghai with approximately 1,200 employees. Technology In 2007, TOM Group's publishing arm, Cite Media, stepped into digitalization through the established social networking site Pixnet. Pixnet is the largest user-generated content platform in Taiwan, focusing on sharing information on food, travel, beauty & style, 3C, and movies. In June 2014, TOM Group also invested in a Hong Kong fintech player, WeLab, which operates online lending platforms in Hong Kong, China and Indonesia. In March 2020, TOM Group invested in MioTech. MioTech leverages artificial intelligence and big data technologies to tackle sustainability and social responsibility challenges facing financial institutions, corporations, and individuals, such as climate change, carbon emission reduction and corporate governance. Publishing TOM Group's publishing arm has developed into a publishing platform in Greater China. In Taiwan, a total of 40 publishers are consolidated under the Cite Publishing Holding Group. It publishes over 60 magazine titles with an annual printing volume exceeding 29 million copies; a total of 5 new magazines in the computer, digital and lifestyle genres were launched.
More than 2,000 new book titles were published, and the annual printing volume reached 19 million copies; the catalogue totals 14,000 titles, of which more than 400 were licensed for local production by publishers in Mainland China. References Leung, H. "Forms snapped up in two hours; Stampede as tom.com goes public," Hong Kong Standard, 19 February 2000. External links CK Hutchison Holdings Companies listed on the Hong Kong Stock Exchange Publishing companies of Hong Kong Online companies of China Chinese companies established in 1999
https://en.wikipedia.org/wiki/Transparency%20%28human%E2%80%93computer%20interaction%29
Any change in a computing system, such as a new feature or new component, is transparent if the system after the change adheres to its previous external interface as much as possible while changing its internal behaviour. The purpose is to shield from change all systems (or human users) on the other end of the interface. Confusingly, the term refers to the overall invisibility of the component; it does not refer to the visibility of the component's internals (as in a white box or open system). The term transparent is widely used in computing marketing as a substitute for the term invisible, since invisible has a bad connotation (usually seen as something that the user can't see and has no control over) while transparent has a good connotation (usually associated with not hiding anything). In the vast majority of cases, the term transparent is used in a misleading way to refer to the actual invisibility of a computing process, which is also described by the term opaque, especially with regard to data structures. Because of this misleading and counter-intuitive definition, modern computer literature tends to prefer the use of "agnostic" over "transparent". The term is used particularly often with regard to an abstraction layer that is invisible either from its upper or lower neighbouring layer. The term was also used around 1969 in IBM and Honeywell programming manuals to refer to a particular computer programming technique. Application code was transparent when it was free of low-level detail (such as device-specific management) and contained only the logic solving a main problem. Transparency was achieved through encapsulation – putting the code into modules that hid internal details, making them invisible to the main application.
Examples For example, the Network File System is transparent because it provides access to files stored remotely on the network in a way that is uniform with local access to a file system, so the user might not even notice it while using the folder hierarchy. The early File Transfer Protocol (FTP) is considerably less transparent, because it requires each user to learn how to access files through an ftp client. Similarly, some file systems allow transparent compression and decompression of data, enabling users to store more files on a medium without any special knowledge; some file systems encrypt files transparently. This approach does not require running a compression or encryption utility manually. In software engineering, it is also considered good practice to develop or use abstraction layers for database access, so that the same application will work with different databases; here, the abstraction layer allows other parts of the program to access the database transparently (see Data Access Object, for example). In object-oriented programming, transparency is facilitated through the use of interfaces that hide actual implementations done with different underlying classes. Types of transparency i
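The interface-based transparency described above can be sketched in a few lines: application code written against a common storage interface keeps working when the backing implementation changes from in-memory to on-disk. All class and function names here (MemoryStore, FileStore, record_score) are illustrative inventions, not a real library API.

```python
import json
import os
import tempfile

class MemoryStore:
    """Keeps values in a dictionary."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

class FileStore:
    """Same interface, but persists each value as a JSON file on disk."""
    def __init__(self, directory):
        self._dir = directory
    def _path(self, key):
        return os.path.join(self._dir, key + ".json")
    def put(self, key, value):
        with open(self._path(key), "w") as f:
            json.dump(value, f)
    def get(self, key):
        with open(self._path(key)) as f:
            return json.load(f)

def record_score(store, player, score):
    # Application logic: written once, unaware of which backend it talks to.
    store.put(player, {"score": score})
    return store.get(player)["score"]

# The backend swap is transparent to record_score: both calls behave identically.
print(record_score(MemoryStore(), "alice", 10))
with tempfile.TemporaryDirectory() as d:
    print(record_score(FileStore(d), "alice", 10))
```

This mirrors the database abstraction layers mentioned above: only the store implementations know about files or dictionaries, so either can be replaced without touching the calling code.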
https://en.wikipedia.org/wiki/National%20Cycle%20Network
The National Cycle Network (NCN) was established to encourage cycling and walking throughout the United Kingdom, as well as for the purposes of bicycle touring. It was created by the charity Sustrans, aided by a £42.5 million National Lottery grant. However, Sustrans itself owns only around 2% of the paths on the network; the rest is made up of existing public highways and rights of way, and of permissive paths negotiated by Sustrans with private landowners, which Sustrans has then labelled as part of its network. In 2017, the Network was used for over 786 million cycling and walking trips, made by 4.4 million people. In 2020, around a quarter of the NCN was scrapped on safety grounds, leaving of signed routes. These are made up of of traffic-free paths with the remaining on-road. It uses shared-use paths, disused railways, minor roads, canal towpaths and traffic-calmed routes in towns and cities. History The Bristol and Bath Railway Path (now part of National Route 4) is a walking and cycling path on a disused railway. It opened in 1984 and was the first part of what would later become the NCN. The National Cycle Network began with a National Lottery grant from the Millennium Commission in 1995. The original goal was to create of signposted cycle routes by 2005, with 50% of these not being on roads, and all of it being "suitable for an unsupervised twelve-year-old." By mid-2000, of route was signposted to an "interim" standard, and a new goal was then set to double that to by 2005. August 2005 saw the completion of that goal. In 2018, Sustrans published the National Cycle Network - Paths for Everyone report, which reviewed the quality and usage of the Network and set out a vision for its future. The report rated 42% of the then network as 'very poor' and identified over 12,000 barriers on the network which made it inaccessible to some users. As a result, around a quarter of the network was de-designated.
, there were of signed cycle and walking route that are part of the Network. Routes National routes There are ten main national routes. They are not all complete. Route 1: Dover – Tain. Running the length of the east coast and passing through London and Edinburgh. Route 2: Dover – St Austell in England, along the south coast. Route 3: Bristol – Land's End, incorporating the West Country Way via Chew Valley Lake, and the Cornish Way Route 4: London (Greenwich) – Fishguard, in West Wales, via Reading, Bath, Bristol, Newport, Caerphilly, Pontypridd, Swansea and Llanelli. Route 5: Reading – Holyhead, via Birmingham, the Midlands and the North Wales coast Route 6: Windsor – Lake District, running in sections via Luton, Milton Keynes, Northampton, Derby, Nottingham, Sheffield, Manchester and Preston, crossing the Pennine Cycleway Route 7: Sunderland – Inverness via Glasgow. Route 8: Cardiff – Holyhead, through the heart of Wales. Route 9: Belfast – Newry in sections of traffic-free route, with the major sections bein
https://en.wikipedia.org/wiki/Proposed%20directive%20on%20the%20patentability%20of%20computer-implemented%20inventions
The Proposal for a Directive of the European Parliament and of the Council on the patentability of computer-implemented inventions (Commission proposal COM(2002) 92), procedure number 2002/0047 (COD) was a proposal for a European Union (EU) directive aiming to harmonise national patent laws and practices concerning the granting of patents for computer-implemented inventions, provided they meet certain criteria. The European Patent Office describes a computer-implemented invention (CII) as "one which involves the use of a computer, computer network or other programmable apparatus, where one or more features are realised wholly or partly by means of a computer program". The proposal became a major focus for conflict between those who regarded the proposed directive as a way to codify the case law of the Boards of Appeal of the European Patent Office (unrelated to the EU institutions) in the sphere of computing, and those who asserted that the directive is an extension of the patentability sphere, not just a harmonisation, that ideas are not patentable and that the expression of those ideas is already adequately protected by the law of copyright. Following several years of debate and numerous conflicting amendments to the proposal, the proposal was rejected on 6 July 2005 by the European Parliament by an overwhelming majority of 648 to 14 votes. History Original draft On 20 February 2002, the European Commission initiated a proposal for a directive to codify and "harmonise" the different EU national patent laws and cement the practice of the European Patent Office of granting patents for computer-implemented inventions provided they meet certain criteria (cf. software patents under the European Patent Convention). 
The directive also took on the role of excluding "business methods" from patentability (in contrast with the situation under United States law), because business methods as such are not patentable under the different European national patent laws or under the European Patent Convention. Opponents of the original directive claimed that it was a thinly disguised attempt to make all software patentable. Supporters, however, argued that this was not the case since the proposal explained in several locations (pages 11, 14, 24, 25) that there should be no extension to the existing scope of patentability for computer programs and that pure business methods implemented in software would not be patentable. Only computer programs which provided a "technical contribution" would be patentable. This reliance on the word "technical" was an important weakness in the directive, since it is not a word that has a well-defined meaning, and a "technical contribution" was only defined as being "a contribution to the state of the art in a technical field which is not obvious to a person skilled in the art." (See Article 2 of the proposal). Nevertheless, the term has been used as a benchmark for what is and is not patentable by the European Patent Office a
https://en.wikipedia.org/wiki/Systolic%20array
In parallel computer architectures, a systolic array is a homogeneous network of tightly coupled data processing units (DPUs) called cells or nodes. Each node or DPU independently computes a partial result as a function of the data received from its upstream neighbours, stores the result within itself and passes it downstream. Systolic arrays were first used in Colossus, which was an early computer used to break German Lorenz ciphers during World War II. Due to the classified nature of Colossus, they were independently invented or rediscovered by H. T. Kung and Charles Leiserson who described arrays for many dense linear algebra computations (matrix product, solving systems of linear equations, LU decomposition, etc.) for banded matrices. Early applications include computing greatest common divisors of integers and polynomials. They are sometimes classified as multiple-instruction single-data (MISD) architectures under Flynn's taxonomy, but this classification is questionable because a strong argument can be made to distinguish systolic arrays from any of Flynn's four categories: SISD, SIMD, MISD, MIMD, as discussed later in this article. The parallel input data flows through a network of hard-wired processor nodes, which combine, process, merge or sort the input data into a derived result. Because the wave-like propagation of data through a systolic array resembles the pulse of the human circulatory system, the name systolic was coined from medical terminology. The name is derived from systole as an analogy to the regular pumping of blood by the heart. Applications Systolic arrays are often hard-wired for specific operations, such as "multiply and accumulate", to perform massively parallel integration, convolution, correlation, matrix multiplication or data sorting tasks. They are also used for dynamic programming algorithms, used in DNA and protein sequence analysis. 
Architecture A systolic array typically consists of a large monolithic network of primitive computing nodes which can be hardwired or software-configured for a specific application. The nodes are usually fixed and identical, while the interconnect is programmable. The more general wavefront processors, by contrast, employ sophisticated and individually programmable nodes which may or may not be monolithic, depending on the array size and design parameters. The other distinction is that systolic arrays rely on synchronous data transfers, while wavefront arrays tend to work asynchronously. Unlike the more common Von Neumann architecture, where program execution follows a script of instructions stored in common memory, addressed and sequenced under the control of the CPU's program counter (PC), the individual nodes within a systolic array are triggered by the arrival of new data and always process the data in exactly the same way. The actual processing within each node may be hard-wired or block-microcoded, in which case the common node personality can be block-programmable. The systo
https://en.wikipedia.org/wiki/Virtual%20Network%20Computing
Virtual Network Computing (VNC) is a graphical desktop-sharing system that uses the Remote Frame Buffer protocol (RFB) to remotely control another computer. It transmits the keyboard and mouse input from one computer to another, relaying the graphical-screen updates, over a network. VNC is platform-independent – there are clients and servers for many GUI-based operating systems and for Java. Multiple clients may connect to a VNC server at the same time. Popular uses for this technology include remote technical support and accessing files on one's work computer from one's home computer, or vice versa. VNC was originally developed at the Olivetti & Oracle Research Lab in Cambridge, United Kingdom. The original VNC source code and many modern derivatives are open source under the GNU General Public License. There are a number of variants of VNC which offer their own particular functionality; e.g., some optimised for Microsoft Windows, or offering file transfer (not part of VNC proper), etc. Many are compatible (without their added features) with VNC proper in the sense that a viewer of one flavour can connect with a server of another; others are based on VNC code but not compatible with standard VNC. VNC and RFB are registered trademarks of RealVNC Ltd. in the US and some other countries. History The Olivetti & Oracle Research Lab (ORL) at Cambridge in the UK developed VNC at a time when Olivetti and Oracle Corporation owned the lab. In 1999, AT&T acquired the lab, and in 2002 closed down the lab's research efforts. Developers who worked on VNC while still at the AT&T Research Lab include: Tristan Richardson (inventor) Andy Harter (project leader) Quentin Stafford-Fraser James Weatherall Andy Hopper Following the closure of ORL in 2002, several members of the development team (including Richardson, Harter, Weatherall and Hopper) formed RealVNC in order to continue working on open-source and commercial VNC software under that name. 
The original GPLed source code has fed into several other versions of VNC. Such forking has not led to compatibility problems because the RFB protocol is designed to be extensible. VNC clients and servers negotiate their capabilities with handshaking in order to use the most appropriate options supported at both ends. , RealVNC Ltd claims the term "VNC" as a registered trademark in the United States and in other countries. Etymology The name Virtual Network Computer/Computing (VNC) originated with ORL's work on a thin client called the Videotile, which also used the RFB protocol. The Videotile had an LCD display with pen input and a fast ATM connection to the network. At the time, network computer was commonly used as a synonym for a thin client; VNC is essentially a software-only (i.e. virtual) network computer. Operation The VNC server is the program on the machine that shares some screen (and may not be related to a physical display – the server can be "headless"), and allows the client to share control
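The capability negotiation mentioned above begins with a version handshake. Per the published RFB specification, each side announces its protocol version as a 12-byte ASCII string of the form "RFB xxx.yyy\n", and the client replies with the highest version both ends support; everything beyond that string format in this sketch (the helper names, the example version list) is illustrative.

```python
def parse_version(msg: bytes) -> tuple:
    """Turn a ProtocolVersion message like b'RFB 003.008\\n' into (3, 8)."""
    if len(msg) != 12 or not msg.startswith(b"RFB "):
        raise ValueError("not an RFB ProtocolVersion message")
    major, minor = msg[4:11].split(b".")
    return int(major), int(minor)

def negotiate(server_msg: bytes, client_supported) -> bytes:
    """Client side: pick the best version no newer than the server's."""
    server_ver = parse_version(server_msg)
    usable = [v for v in client_supported if v <= server_ver]
    if not usable:
        raise ValueError("no common protocol version")
    chosen = max(usable)
    # Reply in the same fixed 12-byte format.
    return b"RFB %03d.%03d\n" % chosen

print(negotiate(b"RFB 003.008\n", [(3, 3), (3, 7), (3, 8)]))
```

Because unknown extensions are simply never negotiated, a viewer and server from different VNC flavours fall back to the options they share, which is how the forks mentioned above stay interoperable.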
https://en.wikipedia.org/wiki/Filename
A filename or file name is a name used to uniquely identify a computer file in a file system. Different file systems impose different restrictions on filename lengths. A filename may (depending on the file system) include: name – base name of the file extension – may indicate the format of the file (e.g. .txt for plain text, .pdf for Portable Document Format, .dat for unspecified binary data, etc.) The components required to identify a file by utilities and applications vary across operating systems, as do the syntax and format for a valid filename. The characters allowed in filenames depend on the file system. The letters A–Z and digits 0–9 are allowed by most file systems; many file systems support additional characters, such as the letters a–z, special characters, and other printable characters such as accented letters, symbols in non-Roman alphabets, and symbols in non-alphabetic scripts. Some file systems allow even unprintable characters, including Bell, Null, Return and Linefeed, to be part of a filename, although most utilities do not handle them well. Filenames may include things like a revision or generation number of the file (as with computer code), a numerical sequence number (widely used by digital cameras through the DCF standard), a date and time (widely used by smartphone camera software and for screenshots), and/or a comment such as the name of a subject or a location or any other text to facilitate searching for the files. Some people use the term filename when referring to a complete specification of device, subdirectories and filename such as the Windows C:\Program Files\Microsoft Games\Chess\Chess.exe. The filename in this case is Chess.exe. Some utilities have settings to suppress the extension, as with MS Windows Explorer. History During the 1970s, some mainframe and minicomputers had operating systems where files on the system were identified by a user name, or account number.
For example, on the TOPS-10 and RSTS/E operating systems from Digital Equipment Corporation, files were identified by: an optional device name (one or two characters) followed by an optional unit number and a colon ":" (if not present, SY: was presumed); an account number, consisting of an open bracket "[", a pair of numbers separated by a comma, and a close bracket "]" (if omitted, the user's own account was presumed); a mandatory file name, consisting of 1 to 6 characters (upper-case letters or digits); and an optional 3-character extension. On the OS/VS1, MVS, and OS/390 operating systems from IBM, a file name was up to 44 characters long, consisting of upper-case letters, digits, and the period. A file name had to start with a letter or number, a period had to occur at least once in each 8 characters, two consecutive periods could not appear in the name, and the name had to end with a letter or digit. By convention, the letters and numbers before the first period were the account number of the owner or the project it belonged to, but there was no requiremen
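The base-name/extension split and the path-versus-filename distinction described earlier can be sketched with Python's standard library. ntpath handles Windows-style paths regardless of the host operating system; the example path is the one quoted above.

```python
import ntpath
import os.path

full = r"C:\Program Files\Microsoft Games\Chess\Chess.exe"

filename = ntpath.basename(full)        # strip the device and subdirectories
base, ext = os.path.splitext(filename)  # split off the extension

print(filename)   # Chess.exe
print(base, ext)  # Chess .exe
```

Note that splitext treats only the last period as the extension separator, so a name like archive.tar.gz yields the extension .gz; how (and whether) an extension is significant is, as the text says, up to each file system and utility.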
https://en.wikipedia.org/wiki/Screen%20capture
Screen capture may refer to: Screenshot, an image file which shows the content of a computer's screen at the moment of shot Screencast, also known as a video screen capture, a digital recording of computer screen output, often containing audio narration
https://en.wikipedia.org/wiki/Health%20informatics
Health informatics is the study and implementation of computer structures and algorithms to improve communication, understanding, and management of medical information. It can be viewed as a branch of engineering and applied science. The health domain provides an extremely wide variety of problems that can be tackled using computational techniques. Health informatics is a spectrum of multidisciplinary fields that includes the study of the design, development and application of computational innovations to improve health care. The disciplines involved combine the medical field with computing fields, in particular computer engineering, software engineering, information engineering, bioinformatics, bio-inspired computing, theoretical computer science, information systems, data science, information technology, autonomic computing, and behavior informatics. In academic institutions, medical informatics research focuses on applications of artificial intelligence in healthcare and designing medical devices based on embedded systems. In some countries, the term informatics is also used in the context of applying library science to data management in hospitals. In this sense, health informatics aims at developing methods and technologies for the acquisition, processing, and study of patient data. 'Clinical informaticians' are qualified health and social care professionals, and 'clinical informatics' is a subspecialty within several medical specialties. Subject areas within health informatics Jan van Bemmel has described medical informatics as the theoretical and practical aspects of information processing and communication based on knowledge and experience derived from processes in medicine and health care.
The Faculty of Clinical Informatics has identified six high-level domains of core competency for clinical informaticians: Health and Wellbeing in Practice; Information Technologies and Systems; Working with Data and Analytical Methods; Enabling Human and Organizational Change; Decision Making; and Leading Informatics Teams and Projects. Tools to support practitioners Clinical informaticians use their knowledge of patient care combined with their understanding of informatics concepts, methods, and health informatics tools to: assess information and knowledge needs of health care professionals, patients and their families; characterize, evaluate, and refine clinical processes; develop, implement, and refine clinical decision support systems; and lead or participate in the procurement, customization, development, implementation, management, evaluation, and continuous improvement of clinical information systems. Clinicians collaborate with other health care and information technology professionals to develop health informatics tools which promote patient care that is safe, efficient, effective, timely, patient-centered, and equitable. Many clinical informaticists are also computer scientists. The frustration experienced by many practitioners is described in "Why
https://en.wikipedia.org/wiki/Slrn
slrn is a console-based news client for multiple operating systems, developed by John E. Davis and others. It was originally developed in 1994 for Unix-like operating systems and VMS, and now also supports Microsoft Windows. It supports scoring rules to highlight, sort or kill articles based on information from their headers. It is customizable, allows free key-bindings and can be extended using the S-Lang macro language. Offline reading is possible by using either slrnpull (included with slrn) or a local news server (like leafnode or INN). slrn is free software. slrn was maintained by Thomas Schultz from 2000 to 2007, with the help of others who made contributions, but development is now again led by the original author, John E. Davis. Current development focuses on better support for different character sets and tighter integration of the S-Lang language processor. Version 1.0.0 of slrn was released on December 21, 2012, 18 years after the first release. The latest release is 1.0.3, on October 23, 2016. Historically, slrn was the starting point for many Usenet users. slrn remains a compromise between features, resource usage and simplicity. Operation slrn is fully controlled with the keyboard, and new messages are composed with an external text editor. Name The slrn name derives from the use of S-Lang and its function to read news. See also List of Usenet newsreaders Comparison of Usenet newsreaders List of free and open-source software packages References External links Free Usenet clients Software that uses S-Lang GNOME Applications Software using the GPL license
https://en.wikipedia.org/wiki/Geographic%20Names%20Information%20System
The Geographic Names Information System (GNIS) is a database of name and location information about more than two million physical and cultural features throughout the United States and its territories, Antarctica, and the associated states of the Marshall Islands, Federated States of Micronesia, and Palau. It is a type of gazetteer. It was developed by the United States Geological Survey (USGS) in cooperation with the United States Board on Geographic Names (BGN) to promote the standardization of feature names. Data were collected in two phases. Although a third phase was considered, which would have handled name changes where local usages differed from maps, it was never begun. The database is part of a system that includes topographic map names and bibliographic references. The names of books and historic maps that confirm the feature or place name are cited. Variant names, alternatives to official federal names for a feature, are also recorded. Each feature receives a permanent, unique feature record identifier, sometimes called the GNIS identifier. The database never removes an entry, "except in cases of obvious duplication." Original purposes The GNIS was originally designed for four major purposes: to eliminate duplication of effort at various other levels of government that were already compiling geographic data, to provide standardized datasets of geographic data for the government and others, to index all of the names found on official U.S. government federal and state maps, and to ensure uniform geographic names for the federal government. Phase 1 Phase 1 lasted from 1978 to 1981, with a precursor pilot project run over the states of Kansas and Colorado in 1976, and produced 5 databases. 
It excluded several classes of feature because they were better documented in non-USGS maps, including airports, the broadcasting masts for radio and television stations, civil divisions, regional and historic names, individual buildings, roads, and triangulation station names. The databases were initially available on paper (2 to 3 spiral-bound volumes per state), on microfiche, and on magnetic tape encoded (unless otherwise requested) in EBCDIC with 248-byte fixed-length records in 4960-byte blocks. The feature classes for association with each name included (for example) "locale" (a "place at which there is or was human activity" not covered by a more specific feature class), "populated place" (a "place or area with clustered or scattered buildings"), "spring" (a spring), "lava" (a lava flow, kīpuka, or other such feature), and "well" (a well). Mountain features would fall into "ridge", "range", or "summit" classes. A feature class "tank" was sometimes used for lakes, which was problematic in several ways. This feature class was undocumented, and it was (in the words of a 1986 report from the Engineer Topographic Laboratories of the United States Army Corps of Engineers) "an unreasonable determination", with the likes of Cayuga Lake being
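The Phase 1 tape layout mentioned above (248-byte fixed-length records packed twenty to a 4960-byte block, encoded in EBCDIC) can be illustrated with a short sketch. Python is used here; the cp037 code page is an assumed EBCDIC variant (the exact code page used on the tapes is not stated in the text), and the helper name is invented for illustration:

```python
def records_from_tape(blocks):
    """Split EBCDIC tape blocks into fixed-length text records.

    Illustrative sketch of the tape layout described above: 4960-byte
    blocks holding 248-byte fixed-length records (20 per block).  The
    records are decoded here with Python's cp037 codec, one common
    EBCDIC code page; the actual code page is an assumption.
    """
    RECORD_LEN = 248
    for block in blocks:
        for i in range(0, len(block), RECORD_LEN):
            # Trailing EBCDIC spaces pad each fixed-length record.
            yield block[i:i + RECORD_LEN].decode("cp037").rstrip()
```

Fixed-length records of this kind need no delimiters: a reader simply slices every 248 bytes, which is why the block size is an exact multiple of the record size.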
https://en.wikipedia.org/wiki/Bravo%20%28American%20TV%20network%29
Bravo is an American basic cable television network, launched on December 8, 1980. It is owned by the NBCUniversal Media Group division of Comcast's NBCUniversal. The channel originally focused on programming related to fine arts and film. The network's brand is focused on reality series targeted at 25-to-54-year-old women and the LGBTQIA+ community at large. As of January 2016, approximately 89,824,000 American households (77% of households with TV) receive Bravo. History Bravo originally launched as a commercial-free premium channel on December 8, 1980. It was originally co-owned by Cablevision's Rainbow Media division and Warner-Amex Satellite Entertainment; the channel claimed to be "the first television service dedicated to film and the performing arts". The channel originally broadcast its programming two days a week and—like Bravo's former sister network Nickelodeon, which shared its channel space with Alpha Repertory Television Service—shared its channel space with the adult-oriented pay channel Escapade (now Playboy TV), which featured softcore pornographic films. In 1981, Bravo was available to 48,000 subscribers throughout the United States; this total increased four years later to around 350,000 subscribers. A 1985 profile of Bravo in The New York Times observed that most of its programming consisted of international, classic, and independent film. Celebrities such as E. G. Marshall and Roberta Peters provided opening and closing commentary to the films broadcast on the channel. Performing arts programs seen on Bravo included the show Jazz Counterpoint. During the mid-1980s, Bravo converted from a premium service into a basic cable channel, although it remained a commercial-free service. Bravo signed an underwriting deal with Texaco in 1992 and within a month broadcast the first Texaco Showcase production, a stage adaptation of Romeo and Juliet. 
By the mid-1990s, Bravo began to incorporate more PBS-style underwriting sponsorships, and then began accepting traditional commercial advertising by 1998. In the Encyclopedia of Television, Megan Mullen perceived certain Bravo programs as "considered too risky or eclectic for mainstream channels". Those programs were Karaoke and Cold Lazarus, the final serials by British playwright Dennis Potter shown by Bravo in June 1997, and Michael Moore's documentary series The Awful Truth from 1999. In 1999, Metro-Goldwyn-Mayer acquired a 20% stake in the channel, which it subsequently sold back to Rainbow Media in 2001. NBC bought the network in 2002 for $1.25 billion; it had owned a stake in the channel and its sister networks for several years up to that point. NBC's then-parent company, General Electric, merged the network and its other broadcast and cable properties with Vivendi Universal Entertainment in May 2004 to form NBC Universal. Bravo saw a massive success in 2003 with the reality series Queer Eye for the Straight Guy, which garnered 3.5 million viewers. The network began to add more
https://en.wikipedia.org/wiki/Friendly%20artificial%20intelligence
Friendly artificial intelligence (also friendly AI or FAI) is hypothetical artificial general intelligence (AGI) that would have a positive (benign) effect on humanity or at least align with human interests or contribute to fostering the improvement of the human species. It is a part of the ethics of artificial intelligence and is closely related to machine ethics. While machine ethics is concerned with how an artificially intelligent agent should behave, friendly artificial intelligence research is focused on how to practically bring about this behavior and ensuring it is adequately constrained. Etymology and usage The term was coined by Eliezer Yudkowsky, who is best known for popularizing the idea, to discuss superintelligent artificial agents that reliably implement human values. Stuart J. Russell and Peter Norvig's leading artificial intelligence textbook, Artificial Intelligence: A Modern Approach, describes the idea: Yudkowsky (2008) goes into more detail about how to design a Friendly AI. He asserts that friendliness (a desire not to harm humans) should be designed in from the start, but that the designers should recognize both that their own designs may be flawed, and that the robot will learn and evolve over time. Thus the challenge is one of mechanism design—to define a mechanism for evolving AI systems under a system of checks and balances, and to give the systems utility functions that will remain friendly in the face of such changes. 'Friendly' is used in this context as technical terminology, and picks out agents that are safe and useful, not necessarily ones that are "friendly" in the colloquial sense. The concept is primarily invoked in the context of discussions of recursively self-improving artificial agents that rapidly explode in intelligence, on the grounds that this hypothetical technology would have a large, rapid, and difficult-to-control impact on human society. 
Risks of unfriendly AI The roots of concern about artificial intelligence are very old. Kevin LaGrandeur showed that the dangers specific to AI can be seen in ancient literature concerning artificial humanoid servants such as the golem, or the proto-robots of Gerbert of Aurillac and Roger Bacon. In those stories, the extreme intelligence and power of these humanoid creations clash with their status as slaves (which by nature are seen as sub-human), and cause disastrous conflict. By 1942 these themes prompted Isaac Asimov to create the "Three Laws of Robotics"—principles hard-wired into all the robots in his fiction, intended to prevent them from turning on their creators, or allowing them to come to harm. In modern times as the prospect of superintelligent AI looms nearer, philosopher Nick Bostrom has said that superintelligent AI systems with goals that are not aligned with human ethics are intrinsically dangerous unless extreme measures are taken to ensure the safety of humanity. He put it this way: Basically we should assume that a 'superintellige
https://en.wikipedia.org/wiki/PICT
PICT is a graphics file format introduced on the original Apple Macintosh computer as its standard metafile format. It allows the interchange of graphics (both bitmapped and vector), and some limited text support, between Mac applications, and was the native graphics format of QuickDraw. The PICT file format consists essentially of a series of QuickDraw commands. The original version, PICT 1, was designed to be as compact as possible while describing vector graphics. To this end, it featured single byte opcodes, many of which embodied operations such as "do the previous operation again". As such it was quite memory efficient, but not very expandable. With the introduction of the Macintosh II and Color QuickDraw, PICT was revised to version 2. This version featured 16-bit opcodes and numerous changes which enhanced its utility. PICT 1 opcodes were supported as a subset for backward compatibility. Within a Mac application, any sequence of drawing operations could be simply recorded/encoded to the PICT format by opening a "Picture", then closing it after issuing the required commands. By saving the resulting byte stream as a resource, a PICT resource resulted, which could be loaded and played back at any time. The same stream could be saved to a data file on disk (with 512 bytes of unused header space added) as a PICT file. With the change to Mac OS X and discontinuation of QuickDraw, PICT was dropped in favor of Portable Document Format (PDF) as the native metafile format, though PICT support is retained by many applications as it was so widely supported on Classic Mac OS. This "PICT" image format supports single channel and color channel RGB images and grayscale images. Photoshop no longer has the ability to open PICT files which use QuickDraw object data (but can open simple raster-based PICTs), and cannot save files in PICT format. 
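As a rough illustration of the layout described above, the following Python sketch guesses whether PICT data is version 1 or version 2. The offsets follow the description given here (512 bytes of unused header, then the picture size and frame rectangle, then the opcode stream); the specific opcode values used (byte opcode 0x11 with version byte 0x01 for PICT 1, 16-bit opcode 0x0011 followed by 0x02FF for PICT 2) are stated as assumptions, and this is nothing like a full parser:

```python
def pict_version(data: bytes):
    """Guess whether PICT file bytes hold version 1 or version 2 data.

    Assumes the file layout described above: 512 bytes of unused header,
    a 2-byte picture size, an 8-byte bounding rectangle, and then the
    opcode stream.  A PICT 2 stream is assumed to open with the 16-bit
    version opcode 0x0011 followed by 0x02FF; a PICT 1 stream with the
    single-byte opcode 0x11 followed by version byte 0x01.
    """
    ops = data[512 + 2 + 8:]            # skip header, picSize, picFrame
    if ops[:4] == b"\x00\x11\x02\xff":
        return 2
    if ops[:2] == b"\x11\x01":
        return 1
    return None                         # unknown / not recognized
```

Note that a PICT resource, unlike a PICT file, omits the 512-byte header, so a resource-aware reader would start the same check at offset 0 plus the size and frame fields.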
PICT versions The PICT format has two versions: PICT 1 format: The old format that only allowed eight colors and focused on compact storage. PICT 2 format: A superset of format 1 that supports 4, 8, 16 and 24-bit color and greyscale images. 32-bit color with a generally unused alpha channel is also supported. Furthermore certain compressed PixMap types can be included using QuickTime. Compression method With the QuickTime 2.0 multimedia framework, Apple added support for compression using JPEG or any other QuickTime compressor to bitmaps embedded in PICT data. References Further reading External links Apple Developer Legacy documentation Description on fileformat.info Description at fileformats.archiveteam.org Graphics file formats Macintosh operating systems
https://en.wikipedia.org/wiki/Metasyntax
In logic and computer science, a metasyntax describes the allowable structure and composition of phrases and sentences of a metalanguage, which is used to describe either a natural language or a computer programming language. Some of the widely used formal metalanguages for computer languages are Backus–Naur form (BNF), extended Backus–Naur form (EBNF), Wirth syntax notation (WSN), and augmented Backus–Naur form (ABNF). These metalanguages each have their own metasyntax, composed of terminal symbols, nonterminal symbols, and metasymbols. A terminal symbol, such as a word or a token, is a stand-alone structure in the language being defined. A nonterminal symbol represents a syntactic category, which defines one or more valid phrasal or sentence structures consisting of an n-element subset. Metasymbols provide syntactic information for denotational purposes in a given metasyntax. Terminals, nonterminals, and metasymbols do not apply across all metalanguages. Typically, the metalanguage for token-level languages (formally called "regular languages") does not have nonterminals, because nesting is not an issue in these regular languages. English, as a metalanguage for describing certain languages, does not contain metasymbols, since all explanation can be done using ordinary English expressions. Only certain formal metalanguages used for describing recursive languages (formally called context-free languages) have terminals, nonterminals, and metasymbols in their metasyntax. Elements of metasyntax Terminals: a stand-alone syntactic structure, conventionally denoted by double-quoting the name of the terminal. Nonterminals: a symbolic representation defining a set of allowable syntactic structures composed of a subset of elements, conventionally denoted by angle-bracketing the name of the nonterminal. Metasymbols: a symbolic representation denoting syntactic information. Methods of phrase termination: juxtaposition, alternation, repetition, optional phrases, and grouping. Specific metasyntax conventions The standard convention 'Backus–Naur form' denotes nonterminal symbols by angle-bracketing the name of the syntactic category, while it denotes terminal symbols by double-quoting the terminal words. Terminals can never appear on the left-hand side of the definition metasymbol in a derivation rule. The body of the definition on the right-hand side may be composed of several alternative forms, with each alternative syntactic construct being separated by an alternation metasymbol. Each of these alternative constructs may contain either terminals or nonterminals. 'Extended Backus–Naur form' uses all the facilities in BNF and introduces two more metasymbols for additional features. The first is used to denote an optional phrase in a statement, by square-bracketing the optional phrase. The second is used to denote a phrase that is to be repeated zero or more time
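As a concrete illustration of conventions such as alternation, optional phrases, and repetition, consider a hypothetical EBNF-style rule and a hand-written recognizer for it. The grammar, rule names, and code below are invented for illustration and are not taken from any standard:

```python
# Hypothetical grammar (illustrative, not from any specification):
#   <integer> ::= ["-"] <digit> {<digit>}
#   <digit>   ::= "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
# [ ... ] marks an optional phrase, { ... } a phrase repeated zero or
# more times, and | separates alternatives.
def is_integer(s: str) -> bool:
    """Recursive-descent style recognizer for the <integer> rule above."""
    i = 0
    if i < len(s) and s[i] == "-":          # [ "-" ]: optional phrase
        i += 1
    if i >= len(s) or not s[i].isdigit():   # first <digit> is mandatory
        return False
    while i < len(s) and s[i].isdigit():    # { <digit> }: repetition
        i += 1
    return i == len(s)
```

Each construct in the recognizer corresponds one-to-one with a metasymbol in the rule: the `if` mirrors the optional phrase, the `while` mirrors the repetition, and the `isdigit` test stands in for the alternation between the ten terminal digits.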
https://en.wikipedia.org/wiki/Remington%20Rand
Remington Rand, Inc. was an early American business machine manufacturer, originally a typewriter manufacturer and in a later incarnation the manufacturer of the UNIVAC line of mainframe computers. Formed in 1927 following a merger, Remington Rand was a diversified conglomerate making other office equipment, electric shavers, etc. The Remington Rand Building at 315 Park Avenue South in New York City is a 20-floor skyscraper completed in 1911. After 1955, Remington Rand had a long series of mergers and acquisitions that eventually resulted in the formation of Unisys. History Remington Rand was formed in 1927 by the merger of the Remington Typewriter Company and Rand Kardex Corporation. One of its earliest factories, the former Herschell–Spillman Motor Company Complex, was listed on the National Register of Historic Places in 2013. Within the first year, Remington Rand acquired the Dalton Adding Machine Company, the Powers Accounting Machine Company, the Baker-Vawter Company, and the Kalamazoo Loose Leaf Binder Company. From 1936 to 1937 Remington Rand went on strike, which resulted in violence and the loss of jobs. From 1942 to 1945, Remington Rand was a contract manufacturer of the M1911A1 .45 caliber semi-automatic pistol used by the United States Armed Forces during World War II. Remington Rand produced more M1911A1 pistols than any other wartime manufacturer. Remington Rand ranked 66th among United States corporations in the value of World War II military production contracts. In 1950, Remington Rand acquired the Eckert–Mauchly Computer Corporation, founded by the makers of the ENIAC, and in 1952, they acquired Engineering Research Associates (ERA), both of which were pioneers in electronic computing. At that time, Remington Rand was one of the biggest computer companies in the United States. On June 14, 1951, the company's first computer was introduced, the UNIVAC I (Universal Automatic Computer). Many branches of the U.S. 
military, including the Air Force and the Army, were among the first to use the computers. When companies started to buy the computers, they would often leave them at the Remington Rand facility, since they were so big and bulky. The UNIVAC I was about the size of a one-car garage, and 46 of them were built and sold for $1 million each. Remington Rand was acquired by Sperry Corporation in 1955 to form Sperry Rand (later shortened to Sperry). However, the brand "Remington Rand" continued as a subdivision for many years. Sperry merged in 1986 with Burroughs to form Unisys. Remington Rand was a regular co-sponsor of the CBS panel show What's My Line? throughout much of the show's run. Strike of 1936–1937 Remington Rand had a major worker strike between 1936 and 1937. Its roots lay in the company's 1924 purchase of the Noiseless Typewriter Company, which kept its own company name while its workers were paid by Remington Rand. In the summer of 1936, James Rand Jr. tried to break up the strike b
https://en.wikipedia.org/wiki/TRAC%20%28programming%20language%29
TRAC (for Text Reckoning And Compiling) Language is a programming language developed between 1959–1964 by Calvin Mooers and first implemented on the PDP-1 in 1964 by L. Peter Deutsch. It was one of three "first languages" recommended by Ted Nelson in Computer Lib. TRAC T64 was used until at least 1984, when Mooers updated it to TRAC T84. Language description TRAC is a purely text-based language — a kind of macro language. Unlike traditional ad hoc macro languages of the time, such as those found in assemblers, TRAC is well planned, consistent, and in many senses complete. It has explicit input and output operators, unlike the typical implicit I/O at the outermost macro level, which makes it simultaneously simpler and more versatile than older macro languages. It also differs from traditional macro languages in that TRAC numbers are strings of digits, with integer arithmetic (without specific limits on maximum values) being provided through built-in ("primitive") functions. Arguably, one aspect of its completeness is that the concept of error is limited to events like lack of file space and requesting expansion of a string longer than the interpreter's working storage; what would in many languages be described as illegal operations are dealt with in TRAC by defining a result (often a null string) for every possible combination of a function's argument strings. TRAC is a text-processing language, also called a string processing language. The emphasis on strings as strings is so strong that TRAC provides mechanisms for handling the language's own syntactic characters either in their syntactic roles or like any other character, and self-modifying code has more the feel of a natural consequence of typical TRAC programming techniques than of being a special feature. TRAC is, like APL or LISP, an expression oriented language (in contrast to more typical procedure-oriented languages), but unlike APL, it completely lacks operators. 
In most respects, it is a case of pure functional programming. TRAC is homoiconic. TRAC has in common with LISP a syntax that generally involves the presence of many levels of nested parentheses. The main inspiration for TRAC came from three papers by Douglas McIlroy. Intellectual property Mooers trademarked the name TRAC in an effort to maintain his control over the definition of the language, an unusual and pioneering action at the time. At one point, he brought an intellectual property infringement suit against DEC, alleging that a contract to deliver a mini-computer with a TRAC interpreter violated his rights. "The first issue of Dr. Dobb's Journal, one of the early publications in the personal computer field, has a vitriolic editorial against Mooers and his rapacity in trying to charge people for his computing language." The name has since been used several times for unrelated information technology projects, including a current open source project management system called Trac. Influence and usage TRAC
https://en.wikipedia.org/wiki/Electronika%2060
The Electronika 60 () is a computer made in the Soviet Union by Elektronika in Voronezh from 1978 until 1991. It is a rack-mounted system with no built-in display or storage devices. It was usually paired with a 15IE-00-013 terminal and I/O devices. The main logic unit is located on the M2 CPU board. The original implementation of Tetris was written for the Electronika 60 by Alexey Pajitnov in 1985. As the Electronika 60 does not have raster graphics, text characters were used to form the blocks.
Technical specifications
M2 CPU: LSI-11 (PDP-11 LSI CPU implementation) clone
Word length: 16 bits
Address space: 32K words (64 KB)
RAM size: 4K words (8 KB)
Number of instructions: 81
Performance: 250,000 operations per second
Floating-point capacity: 32 bits
Number of VLSI chips: 5
Board dimensions: 240 × 280 mm
References External links Article about Electronika-60 in Russian Images of the Electronika 60M Archive software and documentation for Soviet computers UK-NC, DVK and BK0010 PDP-11 Ministry of the Electronics Industry (Soviet Union) computers
https://en.wikipedia.org/wiki/Cooley%E2%80%93Tukey%20FFT%20algorithm
The Cooley–Tukey algorithm, named after J. W. Cooley and John Tukey, is the most common fast Fourier transform (FFT) algorithm. It re-expresses the discrete Fourier transform (DFT) of an arbitrary composite size in terms of N1 smaller DFTs of sizes N2, recursively, to reduce the computation time to O(N log N) for highly composite N (smooth numbers). Because of the algorithm's importance, specific variants and implementation styles have become known by their own names, as described below. Because the Cooley–Tukey algorithm breaks the DFT into smaller DFTs, it can be combined arbitrarily with any other algorithm for the DFT. For example, Rader's or Bluestein's algorithm can be used to handle large prime factors that cannot be decomposed by Cooley–Tukey, or the prime-factor algorithm can be exploited for greater efficiency in separating out relatively prime factors. The algorithm, along with its recursive application, was invented by Carl Friedrich Gauss. Cooley and Tukey independently rediscovered and popularized it 160 years later. History This algorithm, including its recursive application, was invented around 1805 by Carl Friedrich Gauss, who used it to interpolate the trajectories of the asteroids Pallas and Juno, but his work was not widely recognized (being published only posthumously and in Neo-Latin). Gauss did not analyze the asymptotic computational time, however. Various limited forms were also rediscovered several times throughout the 19th and early 20th centuries. FFTs became popular after James Cooley of IBM and John Tukey of Princeton published a paper in 1965 reinventing the algorithm and describing how to perform it conveniently on a computer. Tukey reportedly came up with the idea during a meeting of President Kennedy's Science Advisory Committee discussing ways to detect nuclear-weapon tests in the Soviet Union by employing seismometers located outside the country. These sensors would generate seismological time series. 
However, analysis of this data would require fast algorithms for computing DFTs due to the number of sensors and length of time. This task was critical for the ratification of the proposed nuclear test ban so that any violations could be detected without need to visit Soviet facilities. Another participant at that meeting, Richard Garwin of IBM, recognized the potential of the method and put Tukey in touch with Cooley. However, Garwin made sure that Cooley did not know the original purpose. Instead, Cooley was told that this was needed to determine periodicities of the spin orientations in a 3-D crystal of helium-3. Cooley and Tukey subsequently published their joint paper, and wide adoption quickly followed due to the simultaneous development of Analog-to-digital converters capable of sampling at rates up to 300 kHz. The fact that Gauss had described the same algorithm (albeit without analyzing its asymptotic cost) was not realized until several years after Cooley and Tukey's 1965 paper. Their pape
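The decomposition described above, in which a size-N DFT is split into smaller DFTs whose results are combined with twiddle factors, can be sketched in a few lines. This is a minimal radix-2 illustration in Python, assuming the input length is a power of two; it is not the general mixed-radix N = N1N2 algorithm:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley–Tukey FFT (illustrative sketch).

    Assumes len(x) is a power of two.  Each DFT of size N is split into
    two DFTs of size N/2 (even- and odd-indexed samples), giving the
    O(N log N) behaviour described above.
    """
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    # Combine the half-size results with twiddle factors e^(-2*pi*i*k/N).
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out
```

The recursion depth is log2(N) and each level does O(N) work in the combine loop, which is where the O(N log N) total comes from; production libraries use iterative, cache-aware variants of the same idea.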
https://en.wikipedia.org/wiki/Delay%20slot
In computer architecture, a delay slot is an instruction slot being executed without the effects of a preceding instruction. The most common form is a single arbitrary instruction located immediately after a branch instruction on a RISC or DSP architecture; this instruction will execute even if the preceding branch is taken. Thus, by design, the instructions appear to execute in an illogical or incorrect order. It is typical for assemblers to automatically reorder instructions by default, hiding the awkwardness from assembly developers and compilers. Branch delay slots When a branch instruction is involved, the location of the following delay slot instruction in the pipeline may be called a branch delay slot. Branch delay slots are found mainly in DSP architectures and older RISC architectures. MIPS, PA-RISC, ETRAX CRIS, SuperH, and SPARC are RISC architectures that each have a single branch delay slot; PowerPC, ARM, Alpha, and RISC-V do not have any. DSP architectures that each have a single branch delay slot include the VS DSP, μPD77230 and TMS320C3x. The SHARC DSP and MIPS-X use a double branch delay slot; such a processor will execute a pair of instructions following a branch instruction before the branch takes effect. The TMS320C4x uses a triple branch delay slot. The following example shows delayed branches in assembly language for the SHARC DSP including a pair after the RTS instruction. Registers R0 through R9 are cleared to zero in order by number (the register cleared after R6 is R7, not R9). No instruction executes more than once. 
R0 = 0;
CALL fn (DB);  /* call a function, below at label "fn" */
R1 = 0;        /* first delay slot */
R2 = 0;        /* second delay slot */
/***** discontinuity here (the CALL takes effect) *****/
R6 = 0;        /* the CALL/RTS comes back here, not at "R1 = 0" */
JUMP end (DB);
R7 = 0;        /* first delay slot */
R8 = 0;        /* second delay slot */
/***** discontinuity here (the JUMP takes effect) *****/
/* next 4 instructions are called from above, as function "fn" */
fn: R3 = 0;
RTS (DB);      /* return to caller, past the caller's delay slots */
R4 = 0;        /* first delay slot */
R5 = 0;        /* second delay slot */
/***** discontinuity here (the RTS takes effect) *****/
end: R9 = 0;
The goal of a pipelined architecture is to complete an instruction every clock cycle. To maintain this rate, the pipeline must be full of instructions at all times. The branch delay slot is a side effect of pipelined architectures due to the branch hazard, i.e. the fact that the branch would not be resolved until the instruction has worked its way through the pipeline. A simple design would insert stalls into the pipeline after a branch instruction until the new branch target address is computed and loaded into the program counter. Each cycle where a stall is inserted is considered one branch delay
https://en.wikipedia.org/wiki/Tokaido%20Shinkansen
The Tokaido Shinkansen () is a Japanese high-speed rail line that is part of the nationwide Shinkansen network. Along with the Sanyo Shinkansen, it forms a continuous high-speed railway through the Taiheiyō Belt, also known as the Tokaido corridor. Opening in 1964, running between Tokyo and Shin-Ōsaka, it is the world's first high-speed rail line. Along with being the world's first high-speed rail line, it is also one of the most heavily used. Since 1987 it has been operated by the Central Japan Railway Company (JR Central), prior to that by Japanese National Railways (JNR). There are three types of services on the line: from fastest to slowest, they are the limited-stop Nozomi, the semi-fast Hikari, and the all-stop Kodama. Many Nozomi and Hikari trains continue onward to the San'yō Shinkansen, going as far as Fukuoka's Hakata Station. The line was named a joint Historic Mechanical Engineering Landmark and IEEE Milestone by the American Society of Mechanical Engineers and the Institute of Electrical and Electronics Engineers in 2000. History The predecessor for the Tokaido and Sanyo Shinkansen lines was originally conceived at the end of the 1930s as a dangan ressha (bullet train) between Tokyo and Shimonoseki, which would have taken nine hours to cover the nearly distance between the two cities. This project was planned as the first part of an East Asian rail network serving Japan's overseas territories. The beginning of World War II stalled the project in its early planning stages, although three tunnels were dug that were later used in the Shinkansen route. By 1955, the original Tokaido line between Tokyo and Osaka was congested. Even after its electrification the next year, the line was still the busiest in Japan's railway network by a long margin, with demand being around double the then capacity. 
In 1957, a public forum was organized to discuss “The Possibility of a Three-hour Rail Trip Between Tokyo and Osaka.” After substantial debate, the Japanese National Railways (JNR) decided to build a new line alongside the original one to supplement it. The president of JNR at the time, Shinji Sogō, started attempting to persuade politicians to back the project. Realizing the high expenses of the project early on due to the use of new, unfamiliar technologies and the high concentration of tunnels and viaducts, Sogō settled for less government funding than what was needed. The Diet approved the plan in December 1958, agreeing to fund out of the required over a five-year construction period. Then-finance minister Eisaku Satō recommended that the rest of the funds should be taken from non-governmental sources so that political changes would not cause funding issues. Construction of the line began on April 20, 1959 under Sogō and chief engineer Hideo Shima. In 1960, Shima and Sogō were sent to the United States to borrow money from the World Bank. Although the original request was for US$200 million, they came back with only $80 million, e
https://en.wikipedia.org/wiki/Art%20line
Art line may refer to: Arterial line, a catheter placed into an artery to measure blood pressure Artificial transmission line, a four-terminal electrical network See also Artline (disambiguation)
https://en.wikipedia.org/wiki/TCF
TCF can mean: Facilities and structures TCF Center, Detroit, Michigan, USA; a convention center TCF Stadium, Minneapolis, Minnesota, USA; of the University of Minnesota Computing Trenton Computer Festival, US Tor Carding Forum, stolen credit card marketplace Technical control facility in telecommunications Transparency and Consent Framework in online advertising Education Test de connaissance du français, a test of knowledge of French The Citizens Foundation, low-cost schools in Pakistan Entertainment TCF, short for Twentieth Century Fox TCF Hungary Film Rights Exploitation Limited Liability Company or TCF Hungary Film Rights Exploitation Ltd. is a licensee of films in Brazil, Italy, Japan, Korea and Spain Finance TCF Financial Corporation, a holding company TCF Bank Science TCF-1 or HNF1A, a gene TCF4 or TCF7L2, a protein transcription factor TCF/LEF family of transcription factors, TCF7, etc. Technology Transparent conducting film, used in touch screens and solar cells Trillion (10¹²) cubic feet Totally Chlorine Free in bleaching of wood pulp Other uses Shuttle America (feeder airline ICAO code) The Compassionate Friends, UK, for bereaved parents Third Coast International Audio Festival See also
https://en.wikipedia.org/wiki/Magic%20number%20%28programming%29
In computer programming, a magic number is any of the following: A unique value with unexplained meaning or multiple occurrences which could (preferably) be replaced with a named constant A constant numerical or text value used to identify a file format or protocol; for files, see List of file signatures A distinctive unique value that is unlikely to be mistaken for other meanings (e.g., Globally Unique Identifiers) Unnamed numerical constants The term magic number or magic constant refers to the anti-pattern of using numbers directly in source code. This has been referred to as breaking one of the oldest rules of programming, dating back to the COBOL, FORTRAN and PL/1 manuals of the 1960s. The use of unnamed magic numbers in code obscures the developers' intent in choosing that number, increases opportunities for subtle errors (e.g. is every digit correct in 3.14159265358979323846 and is this equal to 3.14159?) and makes it more difficult for the program to be adapted and extended in the future. Replacing all significant magic numbers with named constants (also called explanatory variables) makes programs easier to read, understand and maintain. Names chosen to be meaningful in the context of the program can result in code that is more easily understood by a maintainer who is not the original author (or even by the original author after a period of time). An example of an uninformatively named constant is int SIXTEEN = 16, while int NUMBER_OF_BITS = 16 is more descriptive. The problems associated with magic 'numbers' described above are not limited to numerical types and the term is also applied to other data types where declaring a named constant would be more flexible and communicative. Thus, declaring const string testUserName = "John" is better than several occurrences of the 'magic value' "John" in a test suite. 
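The file-format sense of the term can be illustrated with a short check against a known signature. The PNG signature bytes below are the real ones; the helper function name is ours, a minimal sketch rather than any particular library's API:

```python
# Every valid PNG file begins with this fixed 8-byte signature.
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_png(data: bytes) -> bool:
    """Return True if the byte string starts with the PNG magic number."""
    return data.startswith(PNG_SIGNATURE)

looks_like_png(b"\x89PNG\r\n\x1a\n" + b"IHDR")  # a PNG-like prefix
looks_like_png(b"GIF89a")                        # a GIF header instead
```

Here the magic number is a feature, not an anti-pattern: the distinctive byte sequence lets software identify the format without relying on the file name.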
For example, if it is required to randomly shuffle the values in an array representing a standard pack of playing cards, this pseudocode does the job using the Fisher–Yates shuffle algorithm:

for i from 1 to 52
    j := i + randomInt(53 - i) - 1
    a.swapEntries(i, j)

where a is an array object, the function randomInt(x) chooses a random integer between 1 and x, inclusive, and swapEntries(i, j) swaps the ith and jth entries in the array. In the preceding example, 52 is a magic number. It is considered better programming style to write the following:

int deckSize := 52
for i from 1 to deckSize
    j := i + randomInt(deckSize + 1 - i) - 1
    a.swapEntries(i, j)

This is preferable for several reasons: It is easier to read and understand. A programmer reading the first example might wonder, What does the number 52 mean here? Why 52? The programmer might infer the meaning after reading the code carefully, but it is not obvious. Magic numbers become particularly confusing when the same number is used for different purposes in one section of code. It is easier to alter the value of the number, as it is not duplicated.
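The same named-constant style carries over directly to a real language. A minimal Python sketch of the shuffle above (zero-indexed, using the standard library's random module; the constant and function names are ours):

```python
import random

DECK_SIZE = 52  # named constant in place of the magic number 52

def shuffle_deck(deck):
    # Fisher–Yates shuffle: swap each entry with a randomly chosen
    # entry at the same position or later.
    for i in range(DECK_SIZE - 1):
        j = random.randrange(i, DECK_SIZE)
        deck[i], deck[j] = deck[j], deck[i]
    return deck

deck = shuffle_deck(list(range(DECK_SIZE)))
```

Switching to a different pack size now means changing one line, and every use of the constant documents itself.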
https://en.wikipedia.org/wiki/EROS%20%28microkernel%29
Extremely Reliable Operating System (EROS) is an operating system developed starting in 1991 at the University of Pennsylvania, then at Johns Hopkins University and The EROS Group, LLC. Features include automatic data and process persistence, some preliminary real-time support, and capability-based security. EROS is purely a research operating system, and was never deployed in real-world use. Development has stopped in favor of a successor system, CapROS. Key concepts The overriding goal of the EROS system (and its relatives) is to provide strong support at the operating system level for the efficient restructuring of critical applications into small communicating components. Each component can communicate with the others only through protected interfaces, and is isolated from the rest of the system. A protected interface, in this context, is one that is enforced by the lowest level part of the operating system, the kernel. That is the only part of the system that can move information from one process to another. It also has complete control of the machine and (if properly constructed) cannot be bypassed. In EROS, the kernel-provided mechanism by which one component names and invokes the services of another is a capability, using inter-process communication (IPC). By enforcing capability-protected interfaces, the kernel ensures that all communications to a process arrive via an intentionally exported interface. It also ensures that no invocation is possible unless the invoking component holds a valid capability to the invoked component. Protection in capability systems is achieved by restricting the propagation of capabilities from one component to another, often through a security policy termed confinement. Capability systems naturally promote component-based software structure.
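The capability idea can be sketched in ordinary application code, with the caveat that EROS enforces this in the kernel, not in a language runtime. In the illustrative sketch below (all class and variable names are ours), a component can invoke another only through a handle it has explicitly been given:

```python
# Illustrative only: a language-level analogue of capability-style
# invocation. A Worker holds one capability and has no other way to
# reach the Logger's internals.

class Capability:
    """An unforgeable handle exporting a single operation of a component."""
    def __init__(self, handler):
        self._handler = handler

    def invoke(self, *args):
        return self._handler(*args)

class Logger:
    def __init__(self):
        self._lines = []
        # Export only the append operation, never the whole object.
        self.append_cap = Capability(self._lines.append)

class Worker:
    def __init__(self, log_cap):
        self._log = log_cap  # the only communication channel it holds

    def run(self):
        self._log.invoke("work done")

logger = Logger()
Worker(logger.append_cap).run()
```

In EROS the kernel plays the role that object references play here: no invocation reaches a component except through a capability it intentionally exported.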
This organizational approach is similar to the programming language concept of object-oriented programming, but occurs at larger granularity and does not include the concept of inheritance. When software is restructured in this way, several benefits emerge: The individual components are most naturally structured as event loops. Examples of systems that are commonly structured this way include aircraft flight control systems (see also DO-178B Software Considerations in Airborne Systems and Equipment Certification), and telephone switching systems (see 5ESS switch). Event-driven programming is chosen for these systems mainly because of simplicity and robustness, which are essential attributes in life-critical and mission-critical systems. Components become smaller and individually testable, which helps to more readily isolate and identify flaws and bugs. The isolation of each component from the others limits the scope of any damage that may occur when something goes wrong or the software misbehaves. Collectively, these benefits lead to measurably more robust and secure systems. The Plessey System 250 was a system originally designed for use in telephony switches, which
https://en.wikipedia.org/wiki/Roads%20in%20Ireland
The island of Ireland, comprising Northern Ireland and the Republic of Ireland, has an extensive network of tens of thousands of kilometres of public roads, usually surfaced. These roads have been developed and modernised over centuries, from trackways suitable only for walkers and horses, to surfaced roads including modern motorways. Driving is on the left-hand side of the road. The major routes were established before Irish independence and consequently take little cognisance of the border other than a change of identification number and street furniture. Northern Ireland has had motorways since 1962, and has a well-developed network of primary, secondary and local routes. The Republic started work on its motorway network in the early 1980s; historically, the road network there was somewhat less well developed. However, the Celtic Tiger economic boom and an influx of European Union structural funding saw national roads and regional roads in the Republic come up to international standard quite quickly. From the mid-1990s, for example, the Republic went from having only a few short sections of motorway to a network of motorways, dual carriageways and other improvements on most major routes as part of a National Development Plan. Road construction in Northern Ireland now tends to proceed at a slower pace than in the Republic, although a number of important bypasses and upgrades to dual carriageway have recently been completed or are about to begin. Roads in Northern Ireland are classified as motorways (shown by the letter M followed by a route number, e.g. M1), A-roads (shown by the letter A followed by a route number, e.g. A6), B-roads (shown by the letter B followed by a route number, e.g. B135) and other roads. There are two types of A-roads: primary and non-primary. Roads in the Republic are classified as motorways (shown by the letter M followed by a route number, e.g.
M7), national roads (shown by the letter N followed by a route number, e.g. N25), regional roads (shown by the letter R followed by a route number, e.g. R611) and local roads (shown by the letter L followed by a route number, e.g. L4202). There are two types of national roads: national primary routes and national secondary routes. Road signs in Northern Ireland follow the same design rules as the rest of the United Kingdom. Distance signposts in Northern Ireland show distances in miles, while all signposts placed in the Republic since the 1990s use kilometres. The Republic's road signs are generally bilingual, using both official languages, Irish and English. However, signs in the Gaeltacht (Irish speaking areas) use only Irish. The Irish language names are written in italic script, the English in capitals. Signs in Northern Ireland are in English only. Warning signs in the Republic have a yellow background and are diamond-shaped, those in Northern Ireland are triangle-shaped and have a white background with a red border. Speed limits in Northern Ir
https://en.wikipedia.org/wiki/Magic%20Roundabout
Magic Roundabout may refer to: The Magic Roundabout, a 1963 children's television series (originally Le Manège Enchanté) The Magic Roundabout (film), a 2005 computer-animated film, based on the series The Magic Roundabout, a 1979 project to build a full scale Millennium Falcon in Pembroke for a Star Wars film A special traffic roundabout in England with a complex layout, nicknamed after the above series, also known as a ring junction: Magic Roundabout (Colchester) Magic Roundabout (Hemel Hempstead) Magic Roundabout (High Wycombe) Magic Roundabout (Swindon) "Magic Roundabout", a song on IQ's 1985 album The Wake "Magic Roundabout", a 1975 song by Jasper Carrott
https://en.wikipedia.org/wiki/Interlisp
Interlisp (also seen with a variety of capitalizations) is a programming environment built around a version of the programming language Lisp. Interlisp development began in 1966 at Bolt, Beranek and Newman (renamed BBN Technologies) in Cambridge, Massachusetts with Lisp implemented for the Digital Equipment Corporation (DEC) PDP-1 computer by Danny Bobrow and D. L. Murphy. In 1970, Alice K. Hartley implemented BBN LISP, which ran on PDP-10 machines running the operating system TENEX (renamed TOPS-20). In 1973, when Danny Bobrow, Warren Teitelman and Ronald Kaplan moved from BBN to the Xerox Palo Alto Research Center (PARC), it was renamed Interlisp. Interlisp became a popular Lisp development tool for artificial intelligence (AI) researchers at Stanford University and elsewhere in the community of the Defense Advanced Research Projects Agency (DARPA). Interlisp was notable for integrating interactive development tools into an integrated development environment (IDE), such as a debugger, an automatic correction tool for simple errors (via do what I mean (DWIM) software design), and analysis tools. Adaptations At Xerox PARC, an early attempt was made to define a virtual machine to facilitate porting, termed the Interlisp virtual machine. However, it was not useful as a basis for porting. Peter Deutsch defined a byte-coded instruction set for Interlisp, and implemented it as a microcode emulator for the Xerox Alto. This was then ported to a series of workstation designs produced by Xerox for internal use and for commercial exploitation, including on the Xerox 1100 (Dolphin), 1108 (Dandelion), 1109 (the floating-point enabled Dandetiger), 1186 (Daybreak), and 1132 (Dorado). Interlisp implementations for these were known collectively as Interlisp-D. Commercially, these were sold as Lisp machines and branded as Xerox AI Workstations when Larry Masinter was the chief scientist of that group. 
The same designs, but with different software, were also sold under different names (e.g., when running the Viewpoint system, the 1186 Daybreak was sold as the Xerox 6085.) Releases of Interlisp-D were named according to a musical theme, which ended with Koto, Lyric, and Medley. Later versions included an implementation of pre-American National Standards Institute (ANSI) Common Lisp, named Xerox Common Lisp. LOOPS, the object system for Interlisp-D, became, along with Symbolics' Flavors system, the basis for the Common Lisp Object System (CLOS). In 1974, DARPA awarded a contract to the University of California, San Diego (UCSD) to implement Interlisp on the Burroughs B6700. The motivation was the larger virtual memory addressing space afforded by the B6700 architecture compared to the PDP-10. However, by the time the software was released (1975), the PDP-10's address space had been increased, and Interlisp-10 remained the standard of the day for AI research. The implementors were Bill Gord and Stan Sieler, with guidance from Daniel Bobrow, and under the overal
https://en.wikipedia.org/wiki/Computer%20art
Computer art is any art in which computers play a role in production or display of the artwork. Such art can be an image, sound, animation, video, CD-ROM, DVD-ROM, video game, website, algorithm, performance or gallery installation. Many traditional disciplines are now integrating digital technologies and, as a result, the lines between traditional works of art and new media works created using computers have been blurred. For instance, an artist may combine traditional painting with algorithm art and other digital techniques. As a result, defining computer art by its end product can be difficult. Computer art is bound to change over time since changes in technology and software directly affect what is possible. Origin of the term On the title page of the magazine Computers and Automation, January 1963, Edmund Berkeley published a picture by Efraim Arazi from 1962, coining for it the term "computer art." This picture inspired him to initiate the first Computer Art Contest in 1963. The annual contest was a key point in the development of computer art up to the year 1973. History The precursor of computer art dates back to 1956–1958, with the generation of what is probably the first image of a human being on a computer screen, a (George Petty-inspired) pin-up girl at a SAGE air defense installation. Desmond Paul Henry created his first electromechanical Henry Drawing Machine in 1961, using an adapted analogue bombsight computer. His drawing machine-generated artwork was shown at the Reid Gallery in London in 1962 after his traditional, non-machine artwork won him the privilege of a one-man exhibition there. It was artist L. S. Lowry who encouraged Henry to include examples of his machine-generated art in the Reid Gallery exhibition. By the mid-1960s, most individuals involved in the creation of computer art were in fact engineers and scientists because they had access to the only computing resources available at university scientific research labs.
Many artists tentatively began to explore the emerging computing technology for use as a creative tool. In the summer of 1962, A. Michael Noll programmed a digital computer at Bell Telephone Laboratories in Murray Hill, New Jersey to generate visual patterns solely for artistic purposes. His later computer-generated patterns simulated paintings by Piet Mondrian and Bridget Riley and became classics. Noll also used the patterns to investigate aesthetic preferences in the mid-1960s. The two early exhibitions of computer art were held in 1965: Generative Computergrafik, February 1965, at the Technische Hochschule in Stuttgart, Germany, and Computer-Generated Pictures, April 1965, at the Howard Wise Gallery in New York. The Stuttgart exhibit featured work by Georg Nees; the New York exhibit featured works by Bela Julesz and A. Michael Noll and was reviewed as art by The New York Times. A third exhibition was put up in November 1965 at Galerie Wendelin Niedlich in Stuttgart, Germany, showing wo
https://en.wikipedia.org/wiki/BBC%20Kids%20%28Canadian%20TV%20channel%29
BBC Kids was a Canadian specialty television channel carrying programming for children and teenagers. It was a joint venture between Knowledge West Communications, a subsidiary of Knowledge Network that managed the channel and held an 80% majority interest, and BBC Studios, which licensed the BBC brand and held the remaining 20%. Originally an ad-supported network, it transitioned to non-commercial operation when it was transferred to Knowledge. History Launch In November 2000, Alliance Atlantis was granted approval by the Canadian Radio-television and Telecommunications Commission (CRTC) to launch BBC Kids, described as "a national English language Category 2 (now Category B) specialty television service devoted to top-quality educational and entertaining programming for children and youth (ages 2-17). It will feature programming primarily from the UK and around the world. 65% of the programming will target children ages 2 to 11, the majority of which will target 6 to 11 years old and 35% will target youth ages 12 to 17." The channel launched on November 5, 2001 as a joint venture between Alliance Atlantis and BBC Worldwide, the BBC's overseas operating arm. In keeping with its remit, it primarily sourced its programming from networks and producers in the United Kingdom, though its programming sources were never exclusively limited to those of the BBC. Its schedule also included international co-productions, such as Tots TV, Mr. Bean: The Animated Series, and The Sleepover Club. It also broadcast a small amount of Canadian and Anglo-Canadian co-productions (including previously-produced Alliance Atlantis programming co-produced with an American network such as PBS, Disney Channel, or Nickelodeon for American broadcast) to meet CRTC Canadian content regulations and quotas.
Transition from Alliance Atlantis to Knowledge Network On January 18, 2008, CW Media, a joint venture between Canwest and Goldman Sachs Capital Partners, acquired control of BBC Kids through its purchase of Alliance Atlantis' broadcasting assets, which had been placed in a trust in August 2007. The channel then became part of Shaw Communications on October 10, 2010, after Shaw acquired Canwest outright along with GSCP's stake in CW Media. Shaw's ownership would be short-lived due to regulatory requirements to sell some former CW Media assets; that process started on December 22, 2010 with early due diligence by a then-undisclosed buyer. On January 17, 2011, Knowledge Network Corporation, a Crown corporation of the Government of British Columbia, announced it had finalized an agreement to purchase the channel from Shaw Media through a subsidiary called Knowledge-West Communications Corporation. BBC Worldwide would retain its existing interest. The sale was completed on April 29, 2011, and with CRTC approval the channel converted from a commercial network to a non-commercial service, while relocating operations from Toronto to Burnaby in suburban Vancouver.
https://en.wikipedia.org/wiki/The%20Point%20%28radio%20network%29
The Point ("Independent Radio, The Point") is a radio network operating in the state of Vermont. The flagship station is WNCS (104.7 FM) in Montpelier, which signed on in 1977. It was started by Jeb Spaulding, who later served as Chancellor of the Vermont State Colleges, State Treasurer of Vermont, and Secretary of Administration under Vermont Governor Peter Shumlin. Although at that time there was no designated adult album alternative format, The Point's programming format has been solidly adult album alternative/progressive rock for its entire history. The Point has won numerous national awards over the course of its history, including trade publication Radio & Records' AAA Station of the Year (markets 101+) in 2005, 2006, 2007, and 2008, which was the publication's final year of operation. In 2008, The Point was inducted into the trade publication FMQB's Hall of Fame for AAA Stations in markets 51 and smaller, and in 2013 The Point was named FMQB's AAA Station of the Year (markets 50+). Frequencies The Point broadcasts on five FM stations. All of the stations are owned by Montpelier Broadcasting Inc., which, in turn, is owned by Northeast Broadcasting, Inc. (based in Bedford, New Hampshire), which also owns WWMP (and owned WCAT) in Burlington; WSKI in Montpelier; and other stations in Andover, Massachusetts, and in Colorado, Idaho, and Wyoming. Former stations The Point was also carried by WRJT (103.1 FM) in Royalton, serving the White River Junction–Lebanon–Hanover area, from its 1996 sign-on until its sale to the Educational Media Foundation in 2020; it is now K-Love station WZKC. WRJT also operated translator W299AM (107.7 FM) in Lebanon, New Hampshire; since 2021, that facility, while still owned by Northeast Broadcasting, has carried separately-owned WFRD. WFAD (1490 AM) and translator W266CU (101.1 FM) in Middlebury carried The Point in the early 2020s, prior to its 2022 sale to Christian Ministries.
References External links Radio stations in Vermont American radio networks Adult album alternative radio stations in the United States Radio stations established in 1977 1977 establishments in Vermont
https://en.wikipedia.org/wiki/The%20Oregon%20Trail%20%281971%20video%20game%29
The Oregon Trail is a text-based strategy video game developed by Don Rawitsch, Bill Heinemann, and Paul Dillenberger in 1971 and produced by the Minnesota Educational Computing Consortium (MECC) beginning in 1975. It was developed as a computer game to teach school children about the realities of 19th-century pioneer life on the Oregon Trail. In the game, the player assumes the role of a wagon leader guiding a party of settlers from Independence, Missouri, to Oregon City, Oregon via a covered wagon in 1847. Along the way the player must purchase supplies, hunt for food, and make choices on how to proceed along the trail while encountering random events such as storms and wagon breakdowns. The original versions of the game contain no graphics, as they were developed for computers that used teleprinters instead of computer monitors. A later Apple II port added a graphical shooting minigame. The first version of the game was developed over the course of two weeks for use by Rawitsch in a history unit at Jordan Junior High School in Minneapolis. Despite its popularity with the students, it was deleted from the school district's mainframe computer at the end of the school semester. Rawitsch recreated the game in 1974 for the MECC, which distributed educational software for free in Minnesota and for sale elsewhere, and recalibrated the probabilities of events based on historical journals and diaries for the game's release the following year. After the rise of microcomputers in the 1970s, the MECC released several versions of the game over the next decade for the Apple II, Atari 8-bit family, and Commodore 64 computers, before redesigning it as a graphical commercial game for the Apple II under the same name in 1985. The game is the first entry in The Oregon Trail series; games in the series have since been released in many editions by various developers and publishers, many titled The Oregon Trail. 
The multiple games in the series are often considered to be iterations on the same title, and have collectively sold over 65 million copies and have been inducted into the World Video Game Hall of Fame. The series has also inspired a number of spinoffs such as The Yukon Trail and The Amazon Trail. Gameplay The Oregon Trail is a text-based strategy video game in which the player, as the leader of a wagon train, controls a group journeying down the Oregon Trail from Independence, Missouri to Oregon City, Oregon in 1847. The player purchases supplies, then plays through approximately twelve rounds of decision making, each representing two weeks on the trail. Each round begins with the player being told their current distance along the trail and the date, along with their current supplies. Supplies consist of food, bullets, clothing, miscellaneous supplies, and cash, each given as a number. Players are given the option to hunt for food, and in some rounds to stop at a fort to purchase supplies, and then choose how much food to consume that round. The game c
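The round structure described above can be sketched in a few lines of code. This is an illustrative model only, not the original BASIC program: the travel distance, starting supplies, and consumption rate below are invented placeholders:

```python
# Toy model of the game's round loop: ~12 rounds, each representing
# two weeks of travel, with supplies tracked as plain numbers.
supplies = {"food": 500, "bullets": 500, "clothing": 60,
            "misc": 50, "cash": 100}
miles = 0

for round_no in range(1, 13):       # approximately twelve rounds
    miles += 150                    # placeholder distance per round
    supplies["food"] -= 30          # placeholder food consumed per round
    print(f"Round {round_no}: {miles} miles travelled, "
          f"{supplies['food']} lbs of food left")
```

A real round would also branch on the player's choices (hunt, stop at a fort, ration food) and on random events such as storms and breakdowns.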
https://en.wikipedia.org/wiki/DBLP
DBLP is a computer science bibliography website. Starting in 1993 at Universität Trier in Germany, it grew from a small collection of HTML files and became an organization hosting a database and logic programming bibliography site. Since November 2018, DBLP is a branch of Schloss Dagstuhl – Leibniz-Zentrum für Informatik (LZI). DBLP listed more than 5.4 million journal articles, conference papers, and other publications on computer science in December 2020, up from about 14,000 in 1995 and 3.66 million in July 2016. All important journals on computer science are tracked. Proceedings papers of many conferences are also tracked. It is mirrored at three sites across the Internet. For his work on maintaining DBLP, Michael Ley received an award from the Association for Computing Machinery (ACM) and the VLDB Endowment Special Recognition Award in 1997. Furthermore, he was awarded the ACM Distinguished Service Award for "creating, developing, and curating DBLP" in 2019. DBLP originally stood for DataBase systems and Logic Programming. As a backronym, it has been taken to stand for Digital Bibliography & Library Project; however, it is now preferred that the acronym be simply a name, hence the new title "The DBLP Computer Science Bibliography". DBL-Browser DBL-Browser (Digital Bibliographic Library Browser) is a utility for browsing the DBLP website. The browser was written by Alexander Weber in 2005 at the University of Trier. It was designed for use off-line in reading the DBLP, which consisted of 696,000 bibliographic entries in 2005 (and in 2015 has more than 2.9 million). DBL-Browser is GPL software, available for download from SourceForge. It uses the XML DTD. 
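Records conforming to the DBLP XML DTD can be processed with any standard XML parser. The sketch below uses Python's standard library; the element names follow the public DBLP XML format, but the key, author, and title are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical DBLP-style record (the entry itself is made up).
record = """<dblp>
  <article key="journals/example/Doe20">
    <author>Jane Doe</author>
    <title>An Example Entry</title>
    <year>2020</year>
    <journal>Example Journal</journal>
  </article>
</dblp>"""

root = ET.fromstring(record)
entries = [
    {
        "key": e.get("key"),
        "authors": [a.text for a in e.findall("author")],
        "title": e.findtext("title"),
        "year": e.findtext("year"),
    }
    for e in root  # each child is one publication record
]
```

Tools like DBL-Browser build their displays from exactly this kind of per-record structure.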
Written in the Java programming language, DBL-Browser displays bibliographic entries in several types of screens, ranging from graphics to text: author page, article page, table of contents, related conferences/journals, related authors (a graphic representation of relationships), and trend analysis (a graphics histogram). DBLP is similar to the bibliographic portion of arxiv.org, which also links to articles. DBL-Browser provides a means to view some of the associated computer science articles. See also List of academic databases and search engines Association for Computational Linguistics CiteSeerX CogPrints Google Scholar Live Search Academic The Collection of Computer Science Bibliographies Dagstuhl References External links CompleteSearch DBLP provides a fast search-as-you-type interface to DBLP, as well as faceted search. It is maintained by Hannah Bast and synchronized twice daily with the DBLP database. Since December 2007, the search functionality is embedded into each DBLP author page (via JavaScript). FacetedDBLP provides a faceted search interface to DBLP, synchronized once per week with the DBLP database. In addition to common facets such as year, author, or venues, it contains a topic-based facet summarizing and characterizing the current result set based on the author keywords fo
https://en.wikipedia.org/wiki/Graphical%20Kernel%20System
The Graphical Kernel System (GKS) was the first ISO standard for low-level computer graphics, introduced in 1977. A draft international standard was circulated for review in September 1983. Final ratification of the standard was achieved in 1985. Overview GKS provides a set of drawing features for two-dimensional vector graphics suitable for charting and similar duties. The calls are designed to be portable across different programming languages, graphics devices and hardware, so that applications written to use GKS will be readily portable to many platforms and devices. GKS was fairly common on computer workstations in the 1980s and early 1990s. GKS formed the basis of Digital Research's GSX and GEM products; the latter was common on the Atari ST and was occasionally seen on PCs particularly in conjunction with Ventura Publisher. It was little used commercially outside these markets, but remains in use in some scientific visualization packages. It is also the underlying API defining the Computer Graphics Metafile. A descendant of GKS was PHIGS. One popular application based on an implementation of GKS is the GR Framework, a C library for high-performance scientific visualization that has become a common plotting backend among Julia users. A main developer and promoter of the GKS was José Luis Encarnação, formerly director of the Fraunhofer Institute for Computer Graphics (IGD) in Darmstadt, Germany. GKS has been standardized in the following documents: ANSI standard ANSI X3.124 of 1985. ISO 7942:1985 standard, revised as ISO 7942:1985/Amd 1:1991 and ISO/IEC 7942-1:1994, as well as ISO/IEC 7942-2:1997, ISO/IEC 7942-3:1999 and ISO/IEC 7942-4:1998 The language bindings are ISO standard ISO 8651. GKS-3D (Graphical Kernel System for Three Dimensions) functional definition is ISO standard ISO 8805, and the corresponding C bindings are ISO/IEC 8806. The functionality of GKS is wrapped up as a data model standard in the STEP standard, section ISO 10303-46. 
See also General Graphics Interface GSS-KERNEL IGES (Initial Graphics Exchange Specification) NAPLPS References Further reading External links Unofficial source of current implementation information GKS at FOLDOC Computer graphics Application programming interfaces Graphics standards Graphical Kernel System
https://en.wikipedia.org/wiki/Barry%20Boehm
Barry William Boehm (May 16, 1935 – August 20, 2022) was an American software engineer, distinguished professor of computer science, industrial and systems engineering; the TRW Professor of Software Engineering; and founding director of the Center for Systems and Software Engineering at the University of Southern California. He was known for his many contributions to the area of software engineering. In 1996, Boehm was elected a member of the National Academy of Engineering for contributions to computer and software architectures and to models of cost, quality, and risk for aerospace systems. Biography Boehm was born on May 16, 1935. He received a BA in mathematics from Harvard University in 1957, and an MS in 1961 and a PhD in 1964, both in mathematics, from UCLA. He also received an honorary Sc.D. in Computer Science from the University of Massachusetts in 2000 and one in Software Engineering from the Chinese Academy of Sciences in 2011. In 1955 he started working as a programmer-analyst at General Dynamics. In 1959 he moved to the RAND Corporation, where he was head of the Information Sciences Department until 1973. From 1973 to 1989 he was chief scientist of the Defense Systems Group at TRW Inc. From 1989 to 1992 he served within the U.S. Department of Defense (DoD) as director of the DARPA Information Science and Technology Office, and as director of the DDR&E Software and Computer Technology Office. From 1992 he was TRW Professor of Software Engineering, Computer Science Department, and director, USC Center for Systems and Software Engineering, formerly Center for Software Engineering. He served on the board of several scientific journals, including the IEEE Transactions on Software Engineering, Computer, IEEE Software, ACM Computing Reviews, Automated Software Engineering, Software Process, and Information and Software Technology.
Awards Later awards for Boehm included the Office of the Secretary of Defense Award for Excellence in 1992, the ASQC Lifetime Achievement Award in 1994, the ACM Distinguished Research Award in Software Engineering in 1997, and the IEEE International Stevens Award. He was an AIAA Fellow, an ACM Fellow, an IEEE Fellow, and a member of the National Academy of Engineering (1996). He received the Mellon Award for Excellence in Mentoring in 2005 and the IEEE Simon Ramo Medal in 2010. He was appointed a distinguished professor on January 13, 2014. He was awarded the INCOSE Pioneer Award in 2019 by the International Council on Systems Engineering for significant pioneering contributions to the field of systems engineering. Work Boehm's research interests included software development process modeling, software requirements engineering, software architectures, software metrics and cost models, software engineering environments, and knowledge-based software engineering. His contributions to the field, according to Boehm (1997) himself, include "the Constructive Cost Model (COCOMO), the spiral model of the software
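The first of those contributions, COCOMO, estimates development effort from program size. A minimal sketch of the basic organic-mode model, using the coefficients Boehm published in Software Engineering Economics (1981); the function name is ours:

```python
# Basic COCOMO, organic mode: effort in person-months, schedule in
# months, size in thousands of delivered source lines (KLOC).
# Coefficients (2.4, 1.05, 2.5, 0.38) are Boehm's published values
# for organic-mode projects.

def cocomo_organic(kloc):
    effort = 2.4 * kloc ** 1.05        # person-months
    schedule = 2.5 * effort ** 0.38    # calendar months
    return effort, schedule

effort, schedule = cocomo_organic(32)  # e.g. a 32 KLOC project
```

The semi-detached and embedded modes use the same form with steeper exponents, reflecting the higher coordination cost of larger, more constrained projects.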
https://en.wikipedia.org/wiki/Portable%20desk
The portable desk had many forms and is an ancestor of the portable computer, the modern laptop an atavistic grandchild of the 19th-century lap desk. Medieval era and Renaissance All desks were portable to some extent, from medieval times to the end of the Renaissance, with the exception of built-in tables and inclined ranks of desks found in places such as the scriptorium or library of a monastery. This was due to the itinerant nature of medieval kingship and the similar conditions that prevailed in lesser administrations under dukes or counts. There was rarely a single capital for a kingdom, and the monarch and his (or her) court would travel periodically between several seats of power during the year, taking precious goods and much of their furniture with them. A good example of this is Henry VIII's writing desk. The traditional French words for furniture – le mobilier and les meubles – reflect this. They describe those goods that are "mobile", in contrast to those that are not: les immeubles, that is, buildings. The desks in medieval woodcuts and other illustrations of the period were massive affairs, but could be hauled by several men. Some were made of pieces that could be knocked down for transport. The trestle desk was a common form for the period. It was usually fitted with a slanted top. In the homes of lesser nobles and certain members of the merchant classes the portable furniture never travelled very far. Most domestic life took place in a single large hall. Furniture was constantly shifted around, stored and often disassembled to suit the role the great room was playing at a particular time in the day or the month. Varguenos, bible boxes and other chests There are two survivors of these medieval and renaissance forms: the rather large Bargueño desk or Vargueno, a chest desk from 16th-century Spain, and the relatively small Bible box, which probably had a later origin. 
These two forms are usually not employed as portable desks any more, but they are bought and sold as antiques or as reproductions, and are usually valued as much for their monetary worth or their aesthetic appeal as for their practical use. The lap desk appeared sometime in the 17th century and became a stylish accessory for travelling gentlemen. Like the Bible box, the lap desk was usually small enough to be carried on a horse or by a gentleman's butler or valet. From the 18th century onward, however, it grew in size and became too heavy to be used comfortably on a lap. Several regional variations, such as the French escritoire, were developed. At the other end of the scale, the 17th century saw the appearance of several other kinds of "chest" desks, such as those destined for use in ships or for getting paperwork done during a military campaign. These were usually known as the campaign desk and the field desk. Decline Most portable desks gradually disappeared as useful day-to-day writing tools during the 19th century. The introduction of mass liter
https://en.wikipedia.org/wiki/Capital%20London
Capital London is an Independent Local Radio station owned and operated by the Global media company as part of its national Capital FM Network. As Capital Radio it was launched in the London area in 1973 as one of Britain's first two commercial radio stations. Its brief was to entertain, while its opposite number, London Broadcasting (LBC), was licensed to provide news and information. In search of a larger audience in 1974, Capital Radio rapidly moved from a general and entertainment station with drama, features, documentaries and light music to a more successful pop music-based format. In 1988 it became two stations: 95.8 Capital FM and Capital Gold. After some national expansion with the purchase of other radio stations the Capital Radio Group merged with GWR Group in 2005 to form GCap Media which in turn was taken over by Global Radio in 2008. In 2011 Capital was launched nationally, apart from the daily breakfast and weekday drivetime shows, becoming part of the Capital FM Network. In 2019, the breakfast show also became national, with 11 regional drivetime shows. History Pre-launch The Sound Broadcasting Act 1972 allowed for the establishment of local commercial radio stations in the United Kingdom to operate alongside the national radio stations provided by the BBC. In October 1972 the Independent Broadcasting Authority invited applications for two local radio licences in London: one for a general and entertainment station, the other for news and information. The licence for the entertainment service saw eight organisations applying, many of them with established entertainment pedigrees. Associated Television, run by Lew Grade, was one of them, as was the long-established Isle of Man broadcaster Manx Radio. 
Others were specially formed companies: Piccadilly Radio under the leadership of the film producer Lord Brabourne; Network Broadcasting, headed by the writer Lord Willis and the broadcaster Ned Sherrin; the actor and comedian Bernard Braden’s London Radio Independent Broadcasters; and London Independent Broadcasting, which included the impresario Robert Stigwood, the then radio producer John Whitney, the record and electronics company EMI, and Mecca Leisure Group. The theatre director Peter Hall supported Artists in Radio. The successful franchisee, however, was Capital Radio Limited. This company, with shareholders including Rediffusion Radio Holdings Limited, Local News of London Limited and The Observer (Holdings) Limited, was chaired by the actor and film director Richard Attenborough. Other board members at that time included the record producer George Martin, the actor and film director Bryan Forbes, the theatrical producer Peter Saunders, and the millionaire dentist and long-time commercial radio enthusiast Barclay Barclay-White. By the time of Capital Radio’s launch in October 1973 some of the competitors for the licence, such as Lord Willis and John Whitney, had joined the board. Test transmissions by the IBA commenc
https://en.wikipedia.org/wiki/KERNAL
KERNAL is Commodore's name for the ROM-resident operating system core in its 8-bit home computers; from the original PET of 1977, followed by the extended but related versions used in its successors: the VIC-20, Commodore 64, Plus/4, Commodore 16, and Commodore 128. Description The Commodore 8-bit machines' KERNAL consists of the low-level, close-to-the-hardware OS routines roughly equivalent to the BIOS in IBM PC compatibles (in contrast to the BASIC interpreter routines, also located in ROM) as well as higher-level, device-independent I/O functionality, and is user-callable via a jump table in RAM whose central (oldest) part, for reasons of backwards compatibility, remains largely identical throughout the whole 8-bit series. The KERNAL ROM occupies the last 8 KB of the 8-bit CPU's 64 KB address space ($E000–$FFFF). The jump table can be modified to point to user-written routines, for example to integrate a fast loader so that its fast replacement routines are used system-wide, or replacing the system text output routine with one that works in bitmapped mode rather than character mode. This use of a jump table was new to small computers at the time. The Adventure International games published for the VIC-20 on cartridge are an example of software that uses the KERNAL. Because they only use the jump table, the games can be memory dumped to disk, loaded into a Commodore 64, and run without modification. The KERNAL was initially written for the Commodore PET by John Feagans, who introduced the idea of separating the BASIC routines from the operating system. It was further developed by several people, notably Robert Russell, who added many of the features for the VIC-20 and the C64. 
Example A simple, yet characteristic, example of using the KERNAL is given by the following 6502 assembly language subroutine (written in ca65 assembler format/syntax):

CHROUT = $ffd2          ; CHROUT is the address of the character output routine
CR     = $0d            ; PETSCII code for Carriage Return
;
hello:  ldx #0          ; start with character 0 by loading 0 into the x index register
next:   lda message,x   ; load byte from address message+x into the accumulator
        beq done        ; if the accumulator holds zero, we're done and want to branch out of the loop
        jsr CHROUT      ; call CHROUT to output char to current output device (defaults to screen)
        inx             ; increment x to move to the next character
        bne next        ; loop back while the last character is not zero (max string length 255 bytes)
done:   rts             ; return from subroutine
;
message: .byte "Hello, world!"
         .byte CR, 0    ; Carriage Return and zero marking end of string

This code stub employs the CHROUT routine, whose address is found at address $FFD2 (65490), to send a text string to the default output device (e.g., the display screen).
https://en.wikipedia.org/wiki/Bank%20switching
Bank switching is a technique used in computer design to increase the amount of usable memory beyond the amount directly addressable by the processor instructions. It can be used to configure a system differently at different times; for example, a ROM required to start a system from diskette could be switched out when no longer needed. In video game systems, bank switching allowed larger games to be developed for play on existing consoles. Bank switching originated in minicomputer systems. Many modern microcontrollers and microprocessors use bank switching to manage random-access memory, non-volatile memory, input-output devices and system management registers in small embedded systems. The technique was common in 8-bit microcomputer systems. Bank-switching may also be used to work around limitations in address bus width, where some hardware constraint prevents straightforward addition of more address lines, and to work around limitations in the ISA, where the addresses generated are narrower than the address bus width. Some control-oriented microprocessors use a bank-switching technique to access internal I/O and control registers, which limits the number of register address bits that must be used in every instruction. Unlike memory management by paging, data is not exchanged with a mass storage device like disk storage. Data remains in quiescent storage in a memory area that is not currently accessible to the processor (although it may be accessible to the video display, DMA controller, or other subsystems of the computer) without the use of special prefix instructions. Technique Bank switching can be considered as a way of extending the address space of processor instructions with some register. Examples: The follow-on system to a processor with a 12 bit address has a 15 bit address bus, but there is no way to directly specify the high three bits on the address bus. Internal bank registers can be used to provide those bits. 
The follow-on system to a processor with a 15 bit address has an 18 bit address bus, but legacy instructions only have 15 address bits; internal bank registers can be used to provide those bits. Some new instructions can explicitly specify the bank. A processor with a 16-bit external address bus can only address 2^16 = 65536 memory locations. If an external latch were added to the system, it could be used to control which of two sets of memory devices, each with 65536 addresses, could be accessed. The processor could change which set is in current use by setting or clearing the latch bit. The latch can be set or cleared by the processor in several ways; a particular memory address may be decoded and used to control the latch, or, in processors with separately-decoded I/O addresses, an output address may be decoded. Several bank-switching control bits could be gathered into a register, approximately doubling the available memory space with each additional bit in the register. Because the external bank-selecting latch (or
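The latch-based scheme described above can be sketched in miniature; the class and method names here are illustrative, not tied to any particular machine:

```python
# Illustrative sketch of bank switching: a CPU with a 16-bit address bus sees
# only 65536 locations at a time, but an external latch selects which physical
# bank of memory those addresses reach, doubling capacity per latch bit.

class BankedMemory:
    def __init__(self, banks=2, bank_size=65536):
        self.banks = [bytearray(bank_size) for _ in range(banks)]
        self.latch = 0  # bank-select bit(s), set by the CPU

    def select_bank(self, n):
        # in hardware this would be triggered by a decoded memory or I/O write
        self.latch = n

    def read(self, addr):
        return self.banks[self.latch][addr]

    def write(self, addr, value):
        self.banks[self.latch][addr] = value

mem = BankedMemory()
mem.write(0x1234, 42)   # lands in bank 0
mem.select_bank(1)
mem.write(0x1234, 99)   # same CPU address, different physical cell
mem.select_bank(0)
assert mem.read(0x1234) == 42
```

The CPU's address never changes width; only the latch decides which physical bank the address reaches.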
https://en.wikipedia.org/wiki/Gen%20Digital
Gen Digital Inc. (formerly Symantec Corporation and NortonLifeLock) is a multinational software company co-headquartered in Tempe, Arizona and Prague, Czech Republic. The company provides cybersecurity software and services. Gen is a Fortune 500 company and a member of the S&P 500 stock-market index. The company also has development centers in Pune, Chennai and Bangalore. Its portfolio includes Norton, Avast, LifeLock, Avira, AVG, ReputationDefender, and CCleaner. On October 9, 2014, Symantec announced it would split into two independent publicly traded companies by the end of 2015. One company would focus on security, the other on information management. On January 29, 2016, Symantec sold its information-management subsidiary, named Veritas Technologies, which Symantec had acquired in 2004, to The Carlyle Group. On August 9, 2019, Broadcom Inc. announced it would acquire the Enterprise Security software division of Symantec for $10.7 billion, and the company became known as NortonLifeLock. After completing its merger with Avast in September 2022, the company adopted the name Gen Digital Inc. History 1982 to 1989 Founded in 1982 by Gary Hendrix with a National Science Foundation grant, Symantec was originally focused on artificial intelligence-related projects, including a database program. Hendrix hired several Stanford University natural language processing researchers as the company's first employees. In 1984, it became clear that the advanced natural language and database system that Symantec had developed could not be ported from DEC minicomputers to the PC. This left Symantec without a product, but with expertise in natural language database query systems and technology. As a result, later in 1984, Symantec was acquired by another, smaller software startup company, C&E Software, founded by Denis Coleman and Gordon Eubanks and headed by Eubanks. C&E Software developed a combined file management and word processing program called Q&A.
Barry Greenstein, now a professional poker player, was the principal developer of the word processor component within Q&A. The merged company retained the name Symantec. Eubanks became its chairman, and Vern Raburn, the former president of the original Symantec, remained as president of the combined company. The new Symantec combined the file management and word processing functionality that C&E had planned, and added an advanced natural language query system (designed by Gary Hendrix and engineered by Dan Gordon) that set new standards for ease of database query and report generation. The natural language system was named "The Intelligent Assistant". Turner chose the name of Q&A for Symantec's flagship product, in large part because the name lent itself to use in a short, easily merchandised logo. Brett Walter, director of product management, designed the user interface of Q&A. Q&A was released in November 1985. In 1986, Vern Raburn and Gordon Eubanks swapped roles, and Eubanks became
https://en.wikipedia.org/wiki/Peter%20Norton
Peter Norton (born November 14, 1943) is an American programmer, software publisher, author, and philanthropist. He is best known for the computer programs and books that bear his name and portrait. Norton sold his software business to Symantec Corporation in 1990. Norton was born in Aberdeen, Washington, and raised in Seattle. He attended Reed College and later worked on mainframes and minicomputers for companies such as Boeing and the Jet Propulsion Laboratory. Norton founded Peter Norton Computing in 1982, pioneering IBM PC compatible utilities software. His first computer book, Inside the IBM PC: Access to Advanced Features & Programming, was published in 1983. By 1988, Norton Computing had grown to $15 million in revenue with 38 employees. In 1990, Norton Computing released the Norton Backup program, and that same year Norton sold the company to Symantec for $70 million. Norton later chaired Acorn Technologies and eChinaCash. He has a significant personal art collection and has been involved in various philanthropic endeavors, including the Peter Norton Family Foundation. He has also donated art to numerous museums and universities. Early life Norton was born in Aberdeen, Washington, and raised in Seattle. He attended Reed College in Portland, Oregon, and majored in math and philosophy. He graduated in 1965. Before he became involved with microcomputers, he spent a dozen years working on mainframes and minicomputers for companies including Boeing and the Jet Propulsion Laboratory. His earliest low-level system utilities were designed to allow mainframe programmers access to a block of RAM that IBM normally reserved for diagnostics. Career Utility software When the IBM PC made its debut in 1981, Norton was among the first to buy one. After he was laid off during an aerospace industry cutback, he took up microcomputer programming to make ends meet. One day he accidentally deleted a file.
Rather than re-enter the data, as most would have, he decided to write a program to recover the information from the disk. His friends were delighted with the program, and he developed a group of utility programs that he sold – one at a time – to user groups. In 1982, he founded Peter Norton Computing with $30,000 and an IBM computer. The company was a pioneer in IBM PC compatible utilities software. Its 1982 introduction of the Norton Utilities included Norton's UNERASE tool to retrieve erased data from MS-DOS and IBM PC DOS formatted disks. Norton marketed the program (primarily on foot) through his one-man software publishing company, leaving behind little pamphlets with technical notes at users group meetings and computer stores. A publisher saw his pamphlets and realized that Norton could write about a technical subject. The publisher called and asked whether he wanted to write a book. Norton's first computer book, Inside the IBM PC: Access to Advanced Features & Programming (Techniques), was published in 1983. Eight editions of this bestseller were published, the la
https://en.wikipedia.org/wiki/Numa
Numa or NUMA may refer to: Non-uniform memory access (NUMA), in computing Places Numa Falls, a waterfall in Kootenay National Park, Canada 15854 Numa, a main-belt asteroid United States Numa, Indiana Numa, Iowa Numa, Oklahoma Numa Peak, a mountain in the Glacier National Park, Montana People Numa Andoire (1908–1994), French football defender and manager Numa Ayrinhac (1881–1951), Franco-Argentine artist Numa Coste (1843-1907), French painter and journalist. Numa Denis Fustel de Coulanges (1830–1889), French historian Numa Droz (1844–1899), Swiss politician Numa Edward Hartog (1846–1871), British academic and activist Numa F. Montet (1892–1985), American politician Numa François Gillet (fl. 1868–1935), French painter Numa Lavanchy (born 1993), Swiss football midfielder Numa Marcius, first Pontifex Maximus of Ancient Rome Numa Morikazu (1843–1890), Meiji era Japanese politician Numa Pompilio Llona (1832–1907), Ecuadorian poet, journalist, educator, diplomat, and philosopher Numa Pompilius (753–673 BC), second king of Rome Numa S. Trivas (fl. 1899–1949), Russian-American art historian and collector Numa Sadoul (born 1947), French writer, actor, and director , Japanese footballer Mauro Numa (born 1961), Italian fencer Shosaku Numa (1929–1992), Japanese neuroscientist Other uses National Underwater and Marine Agency, an organization in the United States (namesake of a fictional US government organization in novels by Clive Cussler) Northern Paiute people, who call themselves Numa Numa, a performing lion who was raised at Gay's Lion Farm in El Monte, California, US , English whaling, transport, and merchant ship Cyclone Numa, a Mediterranean tropical-like cyclone in November 2017, which had subtropical characteristics See also Numa Numa (disambiguation) Nuclear mitotic apparatus protein 1, encoded by the NUMA1 gene Japanese-language surnames
https://en.wikipedia.org/wiki/The%20Sports%20Network
The Sports Network (TSN) is a Canadian English language discretionary sports specialty channel owned by CTV Specialty Television, owned jointly by Bell Media (70%) and ESPN Inc. (30%). The company was established by the Labatt Brewing Company in 1984 as part of the first group of Canadian specialty cable channels. TSN is the largest specialty channel in Canada in terms of gross revenue, with a total of in revenue in 2013. TSN broadcasts primarily from studio facilities located at Bell Media Agincourt in the Scarborough neighbourhood of Toronto, Ontario. Stewart Johnston currently serves as president of TSN, a position he has held since 2010. TSN's networks focus on sports-related programming, including live and recorded event telecasts, sports talk shows, and other original programming. History Early history Licensed by the Canadian Radio-television and Telecommunications Commission (CRTC) on April 2, 1984, as the Action Canada Sports Network, the channel was launched by the Labatt Brewing Company on September 1 of the same year as The Sports Network, or TSN. The network was founded under the leadership of Gordon Craig, a former employee of CBC Sports; alongside coverage of the then co-owned Toronto Blue Jays, TSN also reached a deal with ESPN (itself only 5 years old) shortly before launch to provide additional programs. Although reaching around 400,000 subscribers, TSN's early years were hindered by its initial status as a premium service, bundled in a high-cost package with movie channels such as First Choice and Superchannel, alongside competition with free-to-air sports broadcasts by CBC Television among others. To improve the prominence of the network, TSN sought to obtain the national cable rights to the National Hockey League—rights that, according to the league, were not sold under the current arrangement with CBC. 
However, the task was complicated by claims by CBC that it owned the cable rights to the NHL, along with the involvement of competing beer company Molson in Canadian NHL rights at the time. With the help of a Molson employee who was a friend of Gordon, a deal was reached between TSN, Molson, and the NHL to allow the network to broadcast games on cable. By December 1987, TSN had reached one million subscribers, but the network's staff sought wider distribution for the channel as part of basic cable service; the CRTC approved the network's request for permission to allow TSN to be carried as part of a basic cable lineup. Mike Day, producer of TSN's daily sports news program SportsDesk, lamented the shift to basic cable and the larger audience it would bring, commenting that "one night you're doing a news show that potentially has an audience of one million people, and the next day the potential is five million people." In 1991, TSN acquired rights to the IIHF World Junior Championship, otherwise known as the "World Juniors", which were previously broadcast by CBC. TSN's coverage, along with the recent "Punch-up in
https://en.wikipedia.org/wiki/Icon%20%28computing%29
In computing, an icon is a pictogram or ideogram displayed on a computer screen in order to help the user navigate a computer system. The icon itself is a quickly comprehensible symbol of a software tool, function, or a data file, accessible on the system, and is more like a traffic sign than a detailed illustration of the actual entity it represents. It can serve as an electronic hyperlink or file shortcut to access the program or data. The user can activate an icon using a mouse, pointer, finger, or voice commands. Their placement on the screen, also in relation to other icons, may provide further information to the user about their usage. In activating an icon, the user can move directly into and out of the identified function without knowing anything further about the location or requirements of the file or code. Icons, as parts of the graphical user interface of the computer system, in conjunction with windows, menus and a pointing device (mouse), belong to the much larger topic of the history of the graphical user interface that has largely supplanted the text-based interface for casual use. Overview The computing definition of "icon" can include three distinct semiotic elements: Icon, which resembles its referent (such as a road sign for falling rocks). This category includes stylized drawings of objects from the office environment or from other professional areas such as printers, scissors, file cabinets and folders. Index, which is associated with its referent (smoke is a sign of fire). This category includes stylized drawings used to refer to actions by association: a "printer" for "print", "scissors" for "cut", or a "magnifying glass" for "search". Symbol, which is related to its referent only by convention (letters, musical notation, mathematical operators etc.). This category includes standardized symbols found across many electronic devices, such as the power on/off symbol and the USB icon.
The majority of icons are encoded and decoded using metonymy, synecdoche, and metaphor. Metaphorical representation characterizes all the major desktop-based computer systems: the desktop itself uses iconic representations of objects from the 1980s office environment to transpose attributes from a familiar context or object to an unfamiliar one. This is known as skeuomorphism, and an example is the use of the floppy disk to represent saving data; even though floppy disks have been obsolete for roughly a quarter century, the image is still recognized as "the save icon". Metonymy is itself a subset of metaphor, using one entity to point to another related to it, such as using a fluorescent bulb instead of a filament one to represent power-saving settings. Synecdoche is considered a special case of metonymy, in the usual sense of the part standing for the whole, such as a single component for the entire system, or the speaker driver for the entire audio system settings. Additionally, a group of icons can be categorised as brand icons, used to
https://en.wikipedia.org/wiki/Automata%20%28disambiguation%29
Automata are self-operating machines. Automata may also refer to: Computing Cellular automata, a discrete model studied in computability theory and other disciplines Von Neumann cellular automata, the original expression of cellular automata Automata theory, the study of abstract machines Automata UK, a former software house Arts and entertainment "The Automata", an 1819 short story by E. T. A. Hoffmann "Automata", a 1929 short story by S. Fowler Wright Autómata, a 2014 science-fiction film Nier: Automata, a 2017 video game Automata, an alternative title of The Devil's Machine, a 2019 British horror film directed by Lawrie Brewster See also Automat (disambiguation) Automatic (disambiguation) Automaton (disambiguation)
https://en.wikipedia.org/wiki/TransLink%20%28British%20Columbia%29
TransLink, formally the South Coast British Columbia Transportation Authority, is the statutory authority responsible for the regional transportation network of Metro Vancouver in British Columbia, Canada, including public transport, major roads and bridges. Its main operating facilities are located in the city of New Westminster. TransLink was created in 1998 as the Greater Vancouver Transportation Authority (GVTA) and was fully implemented in April 1999 by the Government of British Columbia to replace BC Transit in the Greater Vancouver Regional District and assume many transportation responsibilities previously held by the provincial government. TransLink is responsible for various modes of transportation in the Metro Vancouver region as well as the West Coast Express, which extends into the Fraser Valley Regional District (FVRD). On November 29, 2007, the province of British Columbia approved legislation changing the governance structure and official name of the organization. History 2007 reorganization On March 8, 2007, BC Minister of Transportation Kevin Falcon announced a restructuring of TransLink. Major changes include new revenue-generating measures, a restructuring of the executive of the body, and increases in the areas under TransLink's jurisdiction. The reorganization of TransLink proposed the following changes: The old board will be replaced by a Council of Mayors from the municipalities in the area served by TransLink, a board of non-political experts, and a regional transportation commissioner appointed by the Council of Mayors. The provincial government will set the regional transportation vision. The Board will guide the operation of TransLink as per the 3- and 10-year transportation plans. It will also develop the options for 3- and 10-year plans; one option will be a base option that maintains the status quo. The Council of Mayors will vote on which 3- and 10-year transportation plan options to adopt. 
Mayors will receive one vote per 20,000 people or portion thereof in their jurisdiction. The TransLink independent commissioner will ensure that TransLink's 3- and 10-year transportation plans are consistent with the regional transportation vision set by the provincial government. TransLink's jurisdiction is initially planned to be expanded to include Mission, Abbotsford, and Squamish. In the long term, this may be further expanded to include the area along the Sea-to-Sky Highway as far north as Pemberton and east to Hope. TransLink will be funded using an approximate ratio of 1/3 of revenue from fuel taxes, 1/3 of revenue from property taxes, and 1/3 of revenue from other non-government sources (e.g., fares, advertising, property development). TransLink will hold the power to increase funding from fuel tax from 12 cents per litre (55 cents per Imp gal or 45 cents per US gal) to 15 cents per litre (68 cents per Imp gal or 57 cents per US gal). In 2012, the rate was increased to 17 cents. TransLink will increase f
https://en.wikipedia.org/wiki/List%20of%20DOS%20commands
This article presents a list of commands used by DOS operating systems, especially as used on x86-based IBM PC compatibles (PCs). Other DOS operating systems are not part of the scope of this list. In DOS, many standard system commands were provided for common tasks such as listing files on a disk or moving files. Some commands were built into the command interpreter, others existed as external commands on disk. Over the several generations of DOS, commands were added for the additional functions of the operating system. In the current Microsoft Windows operating system, a text-mode command prompt window, cmd.exe, can still be used. Command processing The command interpreter for DOS runs when no application programs are running. When an application exits, if the transient portion of the command interpreter in memory was overwritten, DOS will reload it from disk. Some commands are internal—built into COMMAND.COM; others are external commands stored on disk. When the user types a line of text at the operating system command prompt, COMMAND.COM will parse the line and attempt to match a command name to a built-in command or to the name of an executable program file or batch file on disk. If no match is found, an error message is printed, and the command prompt is refreshed. External commands were too large to keep in the command processor, or were less frequently used. Such utility programs would be stored on disk and loaded just like regular application programs but were distributed with the operating system. Copies of these utility command programs had to be on an accessible disk, either on the current drive or on the command path set in the command interpreter. In the list below, commands that can accept more than one file name, or a filename including wildcards (* and ?), are said to accept a filespec (file specification) parameter. Commands that can accept only a single file name are said to accept a filename parameter. 
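The matching process described above can be sketched as follows; the internal-command list is abbreviated and the function names are illustrative, not actual COMMAND.COM internals:

```python
# Illustrative sketch of COMMAND.COM-style command resolution: a typed name is
# matched first against built-in (internal) commands, then against executable
# files (.COM, .EXE, .BAT) found in directories on the search path.
import os

INTERNAL = {"dir", "copy", "del", "type", "cd"}   # a few well-known built-ins
EXTENSIONS = (".COM", ".EXE", ".BAT")             # order DOS tries extensions

def resolve(command, search_path):
    name = command.lower()
    if name in INTERNAL:
        return ("internal", name)
    for directory in search_path:
        for ext in EXTENSIONS:
            candidate = os.path.join(directory, command.upper() + ext)
            if os.path.exists(candidate):
                return ("external", candidate)
    return ("error", "Bad command or file name")

print(resolve("DIR", []))  # → ('internal', 'dir')
```

Internal names are case-insensitive, mirroring the interpreter's behaviour, and an unmatched name falls through to the familiar error message.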
Additionally, command line switches, or other parameter strings, can be supplied on the command line. Spaces and symbols such as a "/" or a "-" may be used to allow the command processor to parse the command line into filenames, file specifications, and other options. The command interpreter preserves the case of whatever parameters are passed to commands, but the command names themselves and file names are case-insensitive. Many commands are the same across many DOS systems, but some differ in command syntax or name. DOS commands A partial list of the most common commands for MS-DOS and IBM PC DOS follows below. APPEND Sets the path to be searched for data files or displays the current search path. The APPEND command is similar to the PATH command that tells DOS where to search for program files (files with a .COM, .EXE, or .BAT file name extension). The command is available in MS-DOS versions 3.2 and later. ASSIGN The command redirects requests for disk operations on one drive to a different drive. It can also di
https://en.wikipedia.org/wiki/Gene%20regulatory%20network
A gene (or genetic) regulatory network (GRN) is a collection of molecular regulators that interact with each other and with other substances in the cell to govern the gene expression levels of mRNA and proteins which, in turn, determine the function of the cell. GRNs also play a central role in morphogenesis, the creation of body structures, which in turn is central to evolutionary developmental biology (evo-devo). The regulator can be DNA, RNA, protein or any combination of two or more of these three that form a complex, such as a specific sequence of DNA and a transcription factor to activate that sequence. The interaction can be direct or indirect (through transcribed RNA or translated protein). In general, each mRNA molecule goes on to make a specific protein (or set of proteins). In some cases this protein will be structural, and will accumulate at the cell membrane or within the cell to give it particular structural properties. In other cases the protein will be an enzyme, i.e., a micro-machine that catalyses a certain reaction, such as the breakdown of a food source or toxin. Some proteins though serve only to activate other genes, and these are the transcription factors that are the main players in regulatory networks or cascades. By binding to the promoter region at the start of other genes they turn them on, initiating the production of another protein, and so on. Some transcription factors are inhibitory. In single-celled organisms, regulatory networks respond to the external environment, optimising the cell at a given time for survival in this environment. Thus a yeast cell, finding itself in a sugar solution, will turn on genes to make enzymes that process the sugar to alcohol. This process, which we associate with wine-making, is how the yeast cell makes its living, gaining energy to multiply, which under normal circumstances would enhance its survival prospects.
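The activator cascades described above are often modeled computationally as Boolean networks, in which each gene is simply on or off. The Python sketch below is purely illustrative (the gene names and rules are invented, not drawn from any real organism) and shows how a two-gene activation loop can hold its state once switched on:

```python
def step(state, rules):
    """Advance a Boolean gene network one synchronous step.

    state: dict mapping gene name -> on/off (True/False)
    rules: dict mapping gene name -> function computing its next state
    """
    return {gene: rule(state) for gene, rule in rules.items()}

# Hypothetical two-gene circuit: A activates B, and B sustains A,
# a minimal version of a self-sustaining feedback loop.
rules = {
    "A": lambda s: s["A"] or s["B"],   # A stays on once the loop is primed
    "B": lambda s: s["A"],             # B is transcribed when A is present
}

state = {"A": True, "B": False}        # a transient signal switches A on
for _ in range(3):
    state = step(state, rules)
print(state)   # both genes remain on: the circuit 'remembers' the signal
```

A network that starts with both genes off stays off; only the transient input flips it into the self-sustaining state, which is the memory-like behavior the text describes.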
In multicellular animals the same principle has been put in the service of gene cascades that control body-shape. Each time a cell divides, two cells result which, although they contain the same genome in full, can differ in which genes are turned on and making proteins. Sometimes a 'self-sustaining feedback loop' ensures that a cell maintains its identity and passes it on. Less understood is the mechanism of epigenetics by which chromatin modification may provide cellular memory by blocking or allowing transcription. A major feature of multicellular animals is the use of morphogen gradients, which in effect provide a positioning system that tells a cell where in the body it is, and hence what sort of cell to become. A gene that is turned on in one cell may make a product that leaves the cell and diffuses through adjacent cells, entering them and turning on genes only when it is present above a certain threshold level. These cells are thus induced into a new fate, and may even generate other morphogens that signal back to the original cell. Over longer distances morphogen
https://en.wikipedia.org/wiki/Computer-aided%20maintenance
Computer-aided maintenance (not to be confused with CAM, which usually stands for computer-aided manufacturing) refers to systems that utilize software to organize planning, scheduling, and support of maintenance and repair. A common application of such systems is the maintenance of computers, either hardware or software, themselves. It can also apply to the maintenance of other complex systems that require periodic maintenance, such as reminding operators that preventive maintenance is due or even predicting when such maintenance should be performed based on recorded past experience.

Computer-aided configuration

The first computer-aided maintenance software came from DEC in the 1980s to configure VAX computers. The software was built using expert-system techniques from artificial intelligence, because the problem of configuring a VAX required expert knowledge. During the research, the software was called R1 and was renamed XCON when placed in service. Fundamentally, XCON was a rule-based configuration database written as an expert system using forward-chaining rules. As one of the first expert systems to be pressed into commercial service it created high expectations, which did not materialize, as DEC lost commercial pre-eminence.

Help desk software

Help desks frequently use help desk software that captures symptoms of a bug and relates them to fixes in a fix database. One of the problems with this approach is that the understanding of the problem is embodied in a non-human way, so that solutions are not unified.

Strategies for finding fixes

The bubble-up strategy simply records pairs of symptoms and fixes. The most frequent set of pairs is then presented as a tentative solution, which is then attempted. If the fix works, that fact is further recorded, along with the configuration of the presenting system, into a solutions database.
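The bubble-up strategy just described can be sketched in a few lines of Python; the class and method names here are hypothetical, not taken from any real help-desk product:

```python
from collections import Counter

class FixDatabase:
    """Minimal sketch of the bubble-up strategy: record (symptom, fix)
    pairs and propose the most frequently recorded fix for a symptom."""

    def __init__(self):
        self.pairs = Counter()

    def record(self, symptom, fix):
        """Log one observed symptom/fix pair."""
        self.pairs[(symptom, fix)] += 1

    def suggest(self, symptom):
        """Return the fix most often recorded for this symptom, or None."""
        candidates = [(count, fix)
                      for (s, fix), count in self.pairs.items()
                      if s == symptom]
        if not candidates:
            return None
        return max(candidates)[1]   # the fix with the highest count

db = FixDatabase()
db.record("no network", "reboot")
db.record("no network", "reboot")
db.record("no network", "replace cable")
print(db.suggest("no network"))   # 'reboot' bubbles up
```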
Oddly enough, shutting down and booting up again manages to 'fix', or at least 'mask', a bug in many computer-based systems; thus reboot is the remedy for distressingly many symptoms in a 'fix database'. The reason a reboot often works is that it causes the RAM to be flushed. However, the same set of actions is typically likely to recreate the same result, demonstrating a need to refine the "startup" applications (which launch into memory) or to install the latest fix or patch of the offending application. Currently, most expertise in finding fixes lies in human domain experts, who simply sit at a replica of the computer-based system, 'talk through' the problem with the client to duplicate it, and then relate the fix.
https://en.wikipedia.org/wiki/Lookup%20table
In computer science, a lookup table (LUT) is an array that replaces runtime computation with a simpler array indexing operation, a process termed direct addressing. The savings in processing time can be significant, because retrieving a value from memory is often faster than carrying out an "expensive" computation or input/output operation. The tables may be precalculated and stored in static program storage, calculated (or "pre-fetched") as part of a program's initialization phase (memoization), or even stored in hardware in application-specific platforms. Lookup tables are also used extensively to validate input values by matching against a list of valid (or invalid) items in an array and, in some programming languages, may include pointer functions (or offsets to labels) to process the matching input. FPGAs also make extensive use of reconfigurable, hardware-implemented lookup tables to provide programmable hardware functionality. LUTs differ from hash tables in that, to retrieve a value v with key k, a hash table stores v in the slot h(k), where h is a hash function used to compute the slot, whereas a LUT stores v in slot k itself, so the slot is directly addressable.

History

Before the advent of computers, lookup tables of values were used to speed up hand calculations of complex functions, such as in trigonometry, logarithms, and statistical density functions. In ancient India (499 AD), Aryabhata created one of the first sine tables, which he encoded in a Sanskrit-letter-based number system.
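A classic concrete case of the idea above, sketched in Python, is a precomputed sine table indexed by whole degrees; the table size and one-degree resolution are arbitrary choices made for illustration:

```python
import math

# Precompute sin() at whole-degree resolution once, at "initialization".
SINE_TABLE = [math.sin(math.radians(d)) for d in range(360)]

def sin_degrees(d):
    """Replace a runtime sin() computation with direct array indexing:
    the integer degree is itself the slot number, as in the LUT-vs-hash-
    table comparison above (no hash function is needed)."""
    return SINE_TABLE[d % 360]

print(sin_degrees(90))    # fetched from the table rather than computed
```

The trade-off is the usual one the article describes: 360 stored values bought in advance, in exchange for avoiding a transcendental-function evaluation on every call.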
In 493 AD, Victorius of Aquitaine wrote a 98-column multiplication table which gave (in Roman numerals) the product of every number from 2 to 50 times the numbers in the rows, which were "a list of numbers starting with one thousand, descending by hundreds to one hundred, then descending by tens to ten, then by ones to one, and then the fractions down to 1/144". Modern school children are often taught to memorize "times tables" to avoid calculations of the most commonly used numbers (up to 9 x 9 or 12 x 12). Early in the history of computers, input/output operations were particularly slow – even in comparison to processor speeds of the time. It made sense to reduce expensive read operations by a form of manual caching: creating either static lookup tables (embedded in the program) or dynamic prefetched arrays to contain only the most commonly occurring data items. Despite the introduction of systemwide caching that now automates this process, application-level lookup tables can still improve performance for data items that rarely, if ever, change. Lookup tables were one of the earliest functionalities implemented in computer spreadsheets, with the initial version of VisiCalc (1979) including a LOOKUP function among its original 20 functions. This has been followed by subsequent spreadsheets, such as Microsoft Excel, and complemented by specialized VLOOKUP and HLOOKUP functions to simplify lookup in a vertical or horizontal table.
https://en.wikipedia.org/wiki/Varian%20Fry
Varian Mackey Fry (October 15, 1907 – September 13, 1967) was an American journalist. Fry ran a rescue network in Vichy France that helped 2,000 to 4,000 anti-Nazi and Jewish refugees to escape Nazi Germany and the Holocaust. He was the first of five Americans to be recognized as "Righteous Among the Nations", an honorific given by the State of Israel to non-Jews who risked their lives to save Jews during the Holocaust.

Early life

Fry was born in New York City. His parents were Lillian (Mackey) and Arthur Fry, a manager of the Wall Street firm Carlysle and Mellick. The family moved to Ridgewood, New Jersey, in 1910. He grew up in Ridgewood and enjoyed bird-watching and reading. During World War I, at 9 years of age, Fry and his friends conducted a fund-raising bazaar for the American Red Cross that included a vaudeville show, an ice cream stand and a fish pond. He was educated at Hotchkiss School from 1922 to 1924, when he left the school due to hazing rituals. He then attended the Riverdale Country School, graduating in 1926. An able and multilingual student, Fry scored in the top 10% of the Harvard University entrance exams. In 1927, as a Harvard undergraduate, he founded Hound & Horn, an influential literary quarterly, in collaboration with Lincoln Kirstein. He was suspended for a prank just before graduation and had to repeat his senior year. Through Kirstein's sister, Mina, he met his future wife, Eileen Avery Hughes, an editor of the Atlantic Monthly, who was seven years his senior and had been educated at Roedean School and Oxford University. Although Fry was a closeted homosexual, according to his son James, they married on 2 June 1931.

Journalist

While working as a foreign correspondent for the American journal The Living Age, Fry visited Berlin in 1935, and personally witnessed Nazi abuse against Jews on more than one occasion, which "turned him into an ardent anti-Nazi".
He said in 1945, "I could not remain idle as long as I had any chances at all of saving even a few of its intended victims." Following his visit to Berlin, in 1935 Fry wrote about the savage treatment of Jews by Hitler's regime in The New York Times. He wrote books about foreign affairs for Headline Books, owned by the Foreign Policy Association, including The Peace that Failed, which describes the troubled political climate following World War I, the break-up of Czechoslovakia and the events leading up to World War II.

Emergency Rescue Committee

Greatly disturbed by what he saw, Fry helped raise money to support European anti-Nazi movements. Shortly after the June 1940 invasion of France, which the Germans quickly occupied, Fry and friends formed the Emergency Rescue Committee (ERC) in New York City, with the support of First Lady Eleanor Roosevelt and others. By August 1940, Fry was in Marseille representing the ERC in an effort to help persons seeking to flee the Nazis. They worked to circumvent bureaucratic processes set up by French authorities, who would not issue exi
https://en.wikipedia.org/wiki/Role-playing%20video%20game
A role-playing video game, commonly referred to as a role-playing game (RPG) or computer role-playing game (CRPG), is a video game genre where the player controls the actions of a character (or several party members) immersed in some well-defined world, usually involving some form of character development by way of recording statistics. Many role-playing video games have origins in tabletop role-playing games and use much of the same terminology, settings and game mechanics. Other major similarities with pen-and-paper games include developed story-telling and narrative elements, player character development, complexity, as well as replay value and immersion. The electronic medium removes the necessity for a gamemaster and increases combat resolution speed. RPGs have evolved from simple text-based console-window games into visually rich 3D experiences.

Characteristics

Role-playing video games use much of the same terminology, settings and game mechanics as early tabletop role-playing games such as Dungeons & Dragons. Players control a central game character, or multiple game characters, usually called a party, and attain victory by completing a series of quests or reaching the conclusion of a central storyline. Players explore a game world, while solving puzzles and engaging in combat. A key feature of the genre is that characters grow in power and abilities, and characters are typically designed by the player. RPGs rarely challenge a player's physical coordination or reaction time, with the exception of action role-playing games. Role-playing video games typically rely on a highly developed story and setting, which is divided into a number of quests. Players control one or several characters by issuing commands, which are performed by the character at an effectiveness determined by that character's numeric attributes.
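The statistics-driven character development described here can be sketched as follows; the experience thresholds and attribute growth are invented numbers for illustration, not taken from any particular game:

```python
class Character:
    """Toy sketch of an RPG stat-and-level mechanic: commands succeed
    according to numeric attributes, and attributes rise with levels."""

    def __init__(self):
        self.level = 1
        self.experience = 0
        self.strength = 10     # a numeric attribute governing effectiveness

    def gain_experience(self, points):
        """Accumulate experience; level up whenever the (assumed)
        threshold of 100 points per current level is crossed."""
        self.experience += points
        while self.experience >= 100 * self.level:
            self.experience -= 100 * self.level
            self.level += 1
            self.strength += 2   # attributes increase with each level

hero = Character()
hero.gain_experience(250)
print(hero.level, hero.strength)   # level 2, strength 12
```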
Often these attributes increase each time a character gains a level, and a character's level goes up each time the player accumulates a certain amount of experience. Role-playing video games also typically attempt to offer more complex and dynamic character interaction than what is found in other video game genres. This usually involves additional focus on the artificial intelligence and scripted behavior of computer-controlled non-player characters.

Story and setting

The premise of many role-playing games tasks the player with saving the world, or whichever level of society is threatened. There are often twists and turns as the story progresses, such as the surprise appearance of estranged relatives, or enemies who become friends or vice versa. The game world is often rooted in speculative fiction (i.e. fantasy or science fiction), which allows players to do things they cannot do in real life and helps players suspend their disbelief about the rapid character growth. To a lesser extent, settings closer to the present day or near future are possible. The story often provides much of the entertainment in the game. Because these
https://en.wikipedia.org/wiki/HVD
HVD may refer to:

High-voltage differential signaling, an electrical signalling method
Hosted Virtual Desktop, a type of computer desktop virtualization
High-Definition Versatile Disc, a DVD format
Holographic Versatile Disc, an optical disc technology
High-value detention site, a type of United States military prison
Humanistischer Verband Deutschlands, a German humanist organization
Khovd Airport, in Mongolia
https://en.wikipedia.org/wiki/Machine%20code%20monitor
A machine code monitor (machine language monitor) is software that allows a user to enter commands to view and change memory locations on a computer, with options to load and save memory contents from/to secondary storage. Some full-featured machine code monitors provide detailed control ("single-stepping") of the execution of machine language programs (much like a debugger), and include absolute-address code assembly and disassembly capabilities. Motorola published the MIKBUG ROM monitor for the 6800 in 1973 and the BUFFALO ROM monitor for the 68HC11. Machine code monitors became popular during the home computer era of the 1970s and 1980s and were sometimes available as resident firmware in some computers (e.g., the built-in monitors in the Commodore 128, Heathkit H89 and Zenith laptops). Often, computer manufacturers relied on their ROM-resident monitors to permit users to reconfigure their computers following installation of upgrade hardware, such as expanded main memory, additional disk drives, or different video displays. It was not unheard of to perform all of one's programming in a monitor in lieu of a full-fledged symbolic assembler. Even after full-featured assemblers became readily available, a machine code monitor was indispensable for debugging programs. The usual technique was to set break points in the code undergoing testing (e.g., with a BRK instruction in 6502 assembly language) and start the program. When the microprocessor encountered a break point, the test program would be interrupted and control would be transferred to the machine code monitor. Typically, this would trigger a register dump and then the monitor would await programmer input. Activities at this point might include examining memory contents, patching code and/or perhaps altering the processor registers prior to restarting the test program.
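The examine-and-deposit interaction described above can be sketched with a toy monitor loop in Python; the two-command syntax and the byte-array "memory" are simplifications for illustration, not the command set of any real monitor:

```python
def monitor(memory, commands):
    """Minimal machine-code-monitor loop over a toy byte array.

    Supports the two classic operations:
      'm ADDR'       examine (dump) one byte of memory
      'd ADDR VAL'   deposit (change) one byte of memory
    Addresses and values are hexadecimal, as a real monitor would show.
    """
    output = []
    for line in commands:
        op, *args = line.split()
        addr = int(args[0], 16)
        if op == "m":                          # examine memory
            output.append(f"{addr:04X}: {memory[addr]:02X}")
        elif op == "d":                        # deposit a new value
            memory[addr] = int(args[1], 16)
    return output

ram = bytearray(256)                            # toy 256-byte memory
print(monitor(ram, ["d 0010 EA", "m 0010"]))    # ['0010: EA']
```

A real monitor adds register display, disassembly, and single-stepping on top of this same examine/deposit core.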
In most systems where higher-level languages are employed, debuggers are used to present a more abstract and friendly view of what is happening within a program. However, the use of machine code monitors persists, especially in the area of hobby-built computers.
https://en.wikipedia.org/wiki/Gr%C3%B6bner%20basis
In mathematics, and more specifically in computer algebra, computational algebraic geometry, and computational commutative algebra, a Gröbner basis is a particular kind of generating set of an ideal in a polynomial ring K[x1, ..., xn] over a field K. A Gröbner basis allows many important properties of the ideal and the associated algebraic variety to be deduced easily, such as the dimension and the number of zeros when it is finite. Gröbner basis computation is one of the main practical tools for solving systems of polynomial equations and computing the images of algebraic varieties under projections or rational maps. Gröbner basis computation can be seen as a multivariate, non-linear generalization of both Euclid's algorithm for computing polynomial greatest common divisors, and Gaussian elimination for linear systems. Gröbner bases were introduced by Bruno Buchberger in his 1965 Ph.D. thesis, which also included an algorithm to compute them (Buchberger's algorithm). He named them after his advisor Wolfgang Gröbner. In 2007, Buchberger received the Association for Computing Machinery's Paris Kanellakis Theory and Practice Award for this work. However, the Russian mathematician Nikolai Günther had introduced a similar notion in 1913, published in various Russian mathematical journals. These papers were largely ignored by the mathematical community until their rediscovery in 1987 by Bodo Renschuch et al. An analogous concept for multivariate power series was developed independently by Heisuke Hironaka in 1964, who named them standard bases. This term has been used by some authors to also denote Gröbner bases. The theory of Gröbner bases has been extended by many authors in various directions. It has been generalized to other structures such as polynomials over principal ideal rings or polynomial rings, and also some classes of non-commutative rings and algebras, like Ore algebras.
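The univariate special case mentioned above, Euclid's algorithm for polynomial greatest common divisors, can be sketched as follows. Polynomials are represented as coefficient lists (highest degree first), a simplification chosen for illustration; exact rational arithmetic is used so that leading terms cancel exactly:

```python
from fractions import Fraction

def normalize(p):
    """Drop leading zero coefficients (lists are highest degree first)."""
    while len(p) > 1 and p[0] == 0:
        p = p[1:]
    return p

def poly_mod(a, b):
    """Remainder of polynomial a modulo polynomial b, by long division."""
    a = normalize([Fraction(c) for c in a])
    b = normalize([Fraction(c) for c in b])
    while len(a) >= len(b) and a != [0]:
        factor = a[0] / b[0]
        for i in range(len(b)):        # cancel the leading term of a
            a[i] -= factor * b[i]
        a = normalize(a)
    return a

def poly_gcd(a, b):
    """Euclid's algorithm: the univariate case that Gröbner bases
    generalize.  The result is made monic, mirroring a reduced basis."""
    while normalize([Fraction(c) for c in b]) != [0]:
        a, b = b, poly_mod(a, b)
    a = normalize([Fraction(c) for c in a])
    return [c / a[0] for c in a]

# gcd(x^2 - 1, x^2 - 2x + 1) is x - 1, i.e. coefficients [1, -1]
print(poly_gcd([1, 0, -1], [1, -2, 1]))
```

Buchberger's algorithm replaces the single remainder step here with reduction by a whole set of polynomials under a monomial ordering, which is what makes the multivariate generalization work.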
Tools

Polynomial ring

Gröbner bases are primarily defined for ideals in a polynomial ring R = K[x1, ..., xn] over a field K. Although the theory works for any field, most Gröbner basis computations are done either when K is the field of rationals or the integers modulo a prime number. In the context of Gröbner bases, a nonzero polynomial in R is commonly represented as a sum c1M1 + ... + cmMm, where the ci are nonzero elements of K, called coefficients, and the Mi are monomials (called power products by Buchberger and some of his followers) of the form x1^a1 ... xn^an, where the ai are nonnegative integers. The vector A = (a1, ..., an) is called the exponent vector of the monomial. When the list x = (x1, ..., xn) of the variables is fixed, the notation of monomials is often abbreviated as x^A. Monomials are uniquely defined by their exponent vectors, and, when a monomial ordering (see below) is fixed, a polynomial is uniquely represented by the ordered list of the ordered pairs formed by an exponent vector and the corresponding coefficient. This representation of polynomials is especially efficient for Gröbner basis computation in computers, although it is less convenient fo
https://en.wikipedia.org/wiki/Correctness%20%28computer%20science%29
In theoretical computer science, an algorithm is correct with respect to a specification if it behaves as specified. Best explored is functional correctness, which refers to the input-output behavior of the algorithm (i.e., for each input it produces an output satisfying the specification). Within the latter notion, partial correctness, requiring that if an answer is returned it will be correct, is distinguished from total correctness, which additionally requires that an answer is eventually returned, i.e. the algorithm terminates. Correspondingly, to prove a program's total correctness, it is sufficient to prove its partial correctness and its termination. The latter kind of proof (termination proof) can never be fully automated, since the halting problem is undecidable. For example, when successively searching through integers 1, 2, 3, … to see if we can find an example of some phenomenon—say an odd perfect number—it is quite easy to write a partially correct program. But to say this program is totally correct would be to assert something currently not known in number theory. A proof would have to be a mathematical proof, assuming both the algorithm and specification are given formally. In particular it is not expected to be a correctness assertion for a given program implementing the algorithm on a given machine. That would involve such considerations as limitations on computer memory. A deep result in proof theory, the Curry–Howard correspondence, states that a proof of functional correctness in constructive logic corresponds to a certain program in the lambda calculus. Converting a proof in this way is called program extraction. Hoare logic is a specific formal system for reasoning rigorously about the correctness of computer programs. It uses axiomatic techniques to define programming language semantics and argue about the correctness of programs through assertions known as Hoare triples.
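The odd-perfect-number search mentioned above can be written down directly. The Python sketch below is partially correct in the sense defined here: any value it returns really satisfies the specification. Whether it would terminate when run without a bound is exactly the open question, so the optional limit parameter (an addition for demonstration, not part of the classical example) forces termination:

```python
def divisor_sum(n):
    """Sum of the proper divisors of n (1 included, n itself excluded)."""
    return sum(d for d in range(1, n) if n % d == 0)

def first_odd_perfect(limit=None):
    """Search 1, 3, 5, ... for an odd perfect number.

    Partially correct: if this returns a number, that number is odd and
    equals the sum of its proper divisors.  Total correctness would
    additionally require termination with limit=None, which is an open
    problem in number theory.
    """
    n = 1
    while limit is None or n <= limit:
        if divisor_sum(n) == n:
            return n
        n += 2
    return None

print(first_odd_perfect(limit=999))   # None: no odd perfect number that small
```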
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Although crucial to software quality and widely deployed by programmers and testers, software testing remains an art, due to limited understanding of the principles of software. The difficulty in software testing stems from the complexity of software: we cannot completely test a program with moderate complexity. Testing is more than just debugging. The purpose of testing can be quality assurance, verification and validation, or reliability estimation. Testing can be used as a generic metric as well. Correctness testing and reliability testing are two major areas of testing. Software testing is a trade-off between budget, time and quality.

See also

Formal verification
Design by contract
Program analysis
Model checking
Compiler correctness
Program derivation
https://en.wikipedia.org/wiki/Heuristic%20evaluation
A heuristic evaluation is a usability inspection method for computer software that helps to identify usability problems in the user interface design. It specifically involves evaluators examining the interface and judging its compliance with recognized usability principles (the "heuristics"). These evaluation methods are now widely taught and practiced in the new media sector, where user interfaces are often designed in a short space of time on a budget that may restrict the amount of money available to provide for other types of interface testing.

Introduction

The main goal of heuristic evaluations is to identify any problems associated with the design of user interfaces. Usability consultants Rolf Molich and Jakob Nielsen developed this method on the basis of several years of experience in teaching and consulting about usability engineering. Heuristic evaluations are one of the most informal methods of usability inspection in the field of human–computer interaction. There are many sets of usability design heuristics; they are not mutually exclusive and cover many of the same aspects of user interface design. Quite often, usability problems that are discovered are categorized—often on a numeric scale—according to their estimated impact on user performance or acceptance. Often the heuristic evaluation is conducted in the context of use cases (typical user tasks), to provide feedback to the developers on the extent to which the interface is likely to be compatible with the intended users' needs and preferences. The simplicity of heuristic evaluation is beneficial at the early stages of design and prior to user-based testing. This usability inspection method does not rely on users, whose involvement can be burdensome because of the need for recruiting, scheduling, a place to perform the evaluation, and payment for participant time.
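The numeric-scale categorization mentioned above is typically combined across several evaluators' findings. A minimal Python sketch of that aggregation step follows; the problem names, the ratings, and the 0-4 severity scale are all assumptions made for illustration:

```python
from statistics import mean

# Hypothetical findings: each evaluator rates the problems they found
# on an assumed 0-4 severity scale (4 = usability catastrophe).
findings = [
    {"no undo on delete": 4, "jargon in labels": 2},        # evaluator 1
    {"no undo on delete": 3, "low-contrast buttons": 2},    # evaluator 2
    {"jargon in labels": 1, "no undo on delete": 4},        # evaluator 3
]

def aggregate(findings):
    """Merge individual reviews: count how many evaluators saw each
    problem and average their severity ratings, worst problems first."""
    merged = {}
    for review in findings:
        for problem, severity in review.items():
            merged.setdefault(problem, []).append(severity)
    report = [(mean(ratings), len(ratings), problem)
              for problem, ratings in merged.items()]
    return sorted(report, reverse=True)

for severity, count, problem in aggregate(findings):
    print(f"{problem}: mean severity {severity:.1f}, "
          f"found by {count} evaluator(s)")
```

Note how aggregation surfaces both consensus (several evaluators flag the same problem) and coverage (problems only one evaluator noticed), which is the rationale for using multiple evaluators.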
In the original report published, Nielsen stated that four experiments showed that individual evaluators were "mostly quite bad" at doing heuristic evaluations and suggested that multiple evaluators were needed, with the results aggregated, to produce an acceptable review. Most heuristic evaluations can be accomplished in a matter of days. The time required varies with the size of the artifact, its complexity, the purpose of the review, the nature of the usability issues that arise in the review, and the competence of the reviewers. Heuristic evaluation prior to user testing is often conducted to identify areas to be included in the evaluation or to eliminate perceived design issues prior to user-based evaluation. Although heuristic evaluation can uncover many major usability issues in a short period of time, a criticism that is often leveled is that results are highly influenced by the knowledge of the expert reviewer(s). This "one-sided" review repeatedly produces different results than software performance testing, each type of testing uncovering a different set of problems.

Methodology

Heuristic evaluat
https://en.wikipedia.org/wiki/SYSLINUX
The Syslinux Project is a suite of five different boot loaders for starting up Linux distributions on computers. It was primarily developed by H. Peter Anvin.

Components

The Syslinux Project consists of five different boot loaders:

The eponymous SYSLINUX, used for booting from the FAT filesystem
ISOLINUX, used for booting from the ISO 9660 filesystem
PXELINUX, used for booting from a network server using the Preboot Execution Environment (PXE) system
EXTLINUX, used for booting from Btrfs, ext2, ext3, ext4, FAT, NTFS, UFS/UFS2, and XFS filesystems
MEMDISK, which emulates a RAM disk for older operating systems like MS-DOS

The project also includes two separate menu systems and a development environment for additional modules.

SYSLINUX and ISOLINUX

SYSLINUX was originally meant for rescue floppy disks, live USBs, or other lightweight environments. ISOLINUX is meant for live CDs and Linux installation CDs. The SYSLINUX bootloader can be used to boot multiple distributions from a single source such as a USB stick. A minor complication is involved when booting from compact discs. The El Torito standard allows booting in two different modes:

No emulation: requires storing the boot information directly on the CD. ISOLINUX is suitable for this mode.
Floppy emulation: requires storing the boot information in a disk image file suitable for emulating a FAT-formatted floppy disk. SYSLINUX is suitable for this mode.

Having this choice is sometimes useful, since ISOLINUX is vulnerable to BIOS bugs. For that reason, it is handy to be able to boot using SYSLINUX. This mostly affects computers built before about 1999; in fact, for modern computers the "no emulation" mode is generally the more reliable method. Newer ISOLINUX versions support creation of so-called "hybrid ISO" images, which put both the El Torito boot record of compact discs and the master boot record of hard disks into an ISO image. Such a hybrid image can then be written to either a compact disc or a USB flash drive.
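For concreteness, a minimal SYSLINUX configuration file might look like the sketch below. The kernel and initrd paths and the label names are placeholders, not taken from any particular distribution; the directives (DEFAULT, PROMPT, TIMEOUT, LABEL, KERNEL, APPEND) are the standard ones shared by the whole boot-loader family:

```
# Minimal example syslinux.cfg (paths are placeholders)
DEFAULT linux
PROMPT 1
# TIMEOUT is given in tenths of a second
TIMEOUT 50

LABEL linux
  KERNEL /boot/vmlinuz
  APPEND initrd=/boot/initrd.img root=/dev/sda1 ro

LABEL rescue
  KERNEL /boot/vmlinuz
  APPEND initrd=/boot/initrd.img single
```

The same configuration syntax is read whether the loader is SYSLINUX on FAT, ISOLINUX on a CD, or PXELINUX over the network, which is part of what makes a single USB stick able to boot multiple distributions.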
PXELINUX

PXELINUX is used in conjunction with a PXE-compliant ROM on a network interface controller (NIC), which enables receiving a bootstrap program over the local area network. This bootstrap program loads and configures an operating system kernel that puts the user in control of the computer. Typically, PXELINUX is used for performing Linux installations from a central network server or for booting diskless workstations.

EXTLINUX

EXTLINUX is a general-purpose bootloader, similar to LILO or GRUB. Since Syslinux 4, EXTLINUX is capable of handling Btrfs, FAT, NTFS, UFS/UFS2, and XFS filesystems.

COMBOOT

SYSLINUX can be extended by COMBOOT modules written in C or assembly language. 32-bit modules typically use the .c32 filename extension. Version 5 and later do not support 16-bit .com modules.

Hardware Detection Tool (HDT)

Since the 3.74 release, the Syslinux project hosts the Hardware Detection Tool (HDT) project, licensed under the terms of the GNU GPL. This tool is a 32-bit
https://en.wikipedia.org/wiki/Component%20Pascal
Component Pascal is a programming language in the tradition of Niklaus Wirth's Pascal, Modula-2, Oberon and Oberon-2. It bears the name of the language Pascal and preserves its heritage, but is incompatible with Pascal. Instead, it is a minor variant and refinement of Oberon-2 with a more expressive type system and built-in string support. Component Pascal was originally named Oberon/L, and was designed and supported by a small ETH Zürich spin-off company named Oberon microsystems, which developed an integrated development environment (IDE) named BlackBox Component Builder. Since 2014, development and support have been taken over by a small group of volunteers. The first version of the IDE was released in 1994, as Oberon/F. At the time, it presented a novel approach to graphical user interface (GUI) construction based on editable forms, where fields and command buttons are linked to exported variables and executable procedures. This approach bears some similarity to the code-behind way used in Microsoft's .NET 3.0 to access code in Extensible Application Markup Language (XAML), which was released in 2008. An open-source software implementation of Component Pascal exists for the .NET and Java virtual machine (JVM) platforms, from the Gardens Point team around John Gough at Queensland University of Technology in Australia. On 23 June 2004 Oberon microsystems announced that the BlackBox Component Builder was made available as a free download and that an open-source version was planned. The beta open-source version was initially released in December 2004 and updated to a final v1.5 release in December 2005. It includes the complete source code of the IDE, compiler, debugger, source analyser, profiler, and interfacing libraries, and can also be downloaded from their website. Several release candidates for v1.6 appeared in the years 2009–2011, with the latest one (1.6rc6) appearing on Oberon microsystems' web pages in 2011.
At the end of 2013, Oberon microsystems released the final release 1.6. It is probably the last release bundled by them. A small community took over the ongoing development. BlackBox Component Pascal uses the extensions .odc (Oberon document) for document files, such as source files, and .osf (Oberon symbol file) for symbol files, while Gardens Point Component Pascal uses .cp for source and .cps for symbol files. BlackBox Component Pascal has its own executable and loadable object format, .ocf (Oberon code file); it includes a runtime linking loader for this format. The document format (.odc) is a rich-text binary format, which allows formatting, supports conditional folding, and allows active content to be embedded in the source text. It also handles user interface elements in editable forms. This is in the tradition of the Oberon Text format.

Syntax

The full syntax for CP, as given by the Language Report, is shown below. In the extended Backus–Naur form, only 34 grammatical productions are needed, one more than for Oberon-2, although i
https://en.wikipedia.org/wiki/Outline%20of%20software%20engineering
The following outline is provided as an overview of and topical guide to software engineering: Software engineering – application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; that is the application of engineering to software. The ACM Computing Classification system is a poly-hierarchical ontology that organizes the topics of the field and can be used in semantic web applications and as a de facto standard classification system for the field. The major section "Software and its Engineering" provides an outline and ontology for software engineering. Software applications Software engineers build software (applications, operating systems, system software) that people use. Applications influence software engineering by pressuring developers to solve problems in new ways. For example, consumer software emphasizes low cost, medical software emphasizes high quality, and Internet commerce software emphasizes rapid development. Business software Accounting software Analytics Data mining closely related to database Decision support systems Airline reservations Banking Automated teller machines Cheque processing Credit cards Commerce Trade Auctions (e.g. eBay) Reverse auctions (procurement) Bar code scanners Compilers Parsers Compiler optimization Interpreters Linkers Loaders Communication E-mail Instant messengers VOIP Calendars — scheduling and coordinating Contact managers Computer graphics Animation Special effects for video and film Editing Post-processing Cryptography Databases, support almost every field Embedded systems Both software engineers and traditional engineers write software control systems for embedded products. Automotive software Avionics software Heating ventilating and air conditioning (HVAC) software Medical device software Telephony Telemetry Engineering All traditional engineering branches use software extensively. Engineers use spreadsheets, more than they ever used calculators. 
Engineers use custom software tools to design, analyze, and simulate their own projects, like bridges and power lines. These projects resemble software in many respects, because the work exists as electronic documents and goes through analysis, design, implementation, and testing phases. Software tools for engineers use the tenets of computer science; as well as the tenets of calculus, physics, and chemistry. Computer Aided Design (CAD) Electronic Design Automation (EDA) Numerical Analysis Simulation File FTP File sharing File synchronization Finance Bond market Futures market Stock market Games Poker Multiuser Dungeons Video games Information systems, support almost every field LIS Management of laboratory data MIS Management of financial and personnel data Logistics Supply chain management Manufacturing Computer Aided Manufacturing (CAM) Distributed Control Systems (DCS) Music Music sequencers Sound effects Music synthesis Network Management Network management system Element Manage
https://en.wikipedia.org/wiki/Promiscuous%20mode
In computer networking, promiscuous mode is a mode for a wired network interface controller (NIC) or wireless network interface controller (WNIC) that causes the controller to pass all traffic it receives to the central processing unit (CPU) rather than passing only the frames that the controller is specifically programmed to receive. This mode is normally used for packet sniffing on a router, on a computer connected to a wired network, or on one that is part of a wireless LAN. Interfaces are placed into promiscuous mode by software bridges often used with hardware virtualization. In IEEE 802 networks such as Ethernet or IEEE 802.11, each frame includes a destination MAC address. In non-promiscuous mode, when a NIC receives a frame, it drops it unless the frame is addressed to that NIC's MAC address or is a broadcast or multicast addressed frame. In promiscuous mode, however, the NIC allows all frames through, thus allowing the computer to read frames intended for other machines or network devices. Many operating systems require superuser privileges to enable promiscuous mode. A non-routing node in promiscuous mode can generally only monitor traffic to and from other nodes within the same broadcast domain (for Ethernet and IEEE 802.11) or ring (for Token Ring). Computers attached to the same Ethernet hub satisfy this requirement, which is why network switches are used to combat malicious use of promiscuous mode. A router may monitor all traffic that it routes. Promiscuous mode is often used to diagnose network connectivity issues. There are programs that make use of this feature to show the user all the data being transferred over the network. Some protocols like FTP and Telnet transfer data and passwords in clear text, without encryption, and network scanners can see this data. Therefore, computer users are encouraged to stay away from insecure protocols like Telnet and use more secure ones such as SSH.
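The frame-acceptance decision described above can be sketched in a few lines. This is a toy model of a NIC's filtering logic, not real driver code, and the MAC addresses are invented for illustration:

```python
# Toy model: the acceptance decision a NIC makes in normal vs. promiscuous mode.
BROADCAST = "ff:ff:ff:ff:ff:ff"

def is_multicast(mac: str) -> bool:
    # In IEEE 802 MAC addresses, the least-significant bit of the first
    # octet marks a group (multicast or broadcast) address.
    return int(mac.split(":")[0], 16) & 0x01 == 1

def nic_accepts(frame_dst: str, nic_mac: str, promiscuous: bool) -> bool:
    if promiscuous:
        return True  # pass every frame up to the CPU
    # Normal mode: accept only frames addressed to us, broadcast, or multicast.
    return frame_dst == nic_mac or frame_dst == BROADCAST or is_multicast(frame_dst)

# A frame addressed to another host is dropped in normal mode...
print(nic_accepts("aa:bb:cc:00:00:02", "aa:bb:cc:00:00:01", promiscuous=False))  # False
# ...but passed through in promiscuous mode.
print(nic_accepts("aa:bb:cc:00:00:02", "aa:bb:cc:00:00:01", promiscuous=True))   # True
```

This also shows why a sniffer on a switched network sees little: the switch, not the NIC, already withholds most frames addressed to other hosts.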
Detection As promiscuous mode can be used in a malicious way to capture private data in transit on a network, computer security professionals might be interested in detecting network devices that are in promiscuous mode. In promiscuous mode, some software might send responses to frames even though they were addressed to another machine. However, experienced sniffers can prevent this (e.g., using carefully designed firewall settings). An example is sending a ping (ICMP echo request) with the wrong MAC address but the right IP address. If an adapter is operating in normal mode, it will drop this frame, and the IP stack never sees or responds to it. If the adapter is in promiscuous mode, the frame will be passed on, and the IP stack on the machine (to which a MAC address has no meaning) will respond as it would to any other ping. The sniffer can prevent this by configuring a firewall to block ICMP traffic. Some applications that use promiscuous mode The following applications and applications classes use promiscuous mode. Pack
https://en.wikipedia.org/wiki/Panzer%20General
Panzer General is a 1994 computer wargame developed and published by Strategic Simulations Inc. (SSI). It simulates conflict during World War II. The designers of Panzer General were heavily influenced by the Japanese wargame series Daisenryaku. Panzer General was a major commercial hit: 250,000 units were sold at full price, and long tail sales continued in the years ahead. It became and remained SSI's best-selling game across all genres, and was named the best-selling computer wargame of all time in 2007. It is the first in the commercially successful Panzer General series. Gameplay Panzer General is a turn-based game, set on operational level hex maps. One plays lone scenarios from either Axis or Allied side and against a computer or human opponent. In Campaign Mode, the player assumes the role of a German Generalissimus against the Allied computer. Panzer General is an operational-level game, and units approximate battalions, although the unit size and map scale from one scenario to the next are elastic. While the names and information for the units are reasonably accurate, the scenarios only approximate historical situations. Its novel feature was to link individual scenarios into a campaign spanning World War II from 1939 to 1945. Units are able to gain experience and become stronger, where success in one battle would award the player prestige to upgrade units, acquire additional units, and select a better scenario for the next battle. The game requires the player to use combined-arms tactics, where each unit is strong against some unit types but very vulnerable to others. Dug-in enemy positions must be softened by artillery, which is vulnerable and needs protection. Before attacking the infantry and anti-tank, one needs first to destroy the enemy artillery that protects them from behind. If no tanks can get within range, one does this mostly by bombers, but then it is advantageous to destroy the air defense units first. 
Fighter planes must balance dual roles: destroying the enemy air force and protecting their own bombers. The player must carefully observe the road system to speed the advance, and may use bridge engineers to cross rivers. The game rewards a Blitzkrieg strategy: penetrating deep into enemy positions while postponing the destruction of some of the encountered enemy units until later. The performance of units is increased by their experience points, which are acquired through combat. In Campaign mode particularly, one then has to protect the experienced units as the most valuable assets. Campaigns Panzer General has 38 scenarios based on real or fictitious battles from World War II. The player can engage in a single battle or a campaign mode. In Campaign Mode, a series of battles unfolds as a campaign heads to victory. There is one long campaign as Germany, with five starting locales: Poland (1939); from Poland, through Norway, to the West with possible amphibious invasion in Britain. North Africa (194
https://en.wikipedia.org/wiki/Unionville
Unionville is the name of some places in North America: Canada Unionville, Ontario Unionville GO Station, a station in the GO Transit network located in the community South Unionville, a community in Markham, Ontario United States Unionville, Connecticut Unionville, Georgia Unionville, Illinois (disambiguation) Unionville, Indiana Unionville, Iowa Unionville, Frederick County, Maryland Unionville, Michigan Unionville, Missouri Unionville, Nevada Unionville, New Jersey Unionville Vineyards, a winery in Unionville. Unionville, New York (disambiguation) (multiple) Unionville, North Carolina Unionville, Ashtabula County, Ohio; on the border with Lake County Unionville, Columbiana County, Ohio Unionville, Holmes County, Ohio Unionville, Morgan County, Ohio Unionville, Washington County, Ohio Unionville Center, Ohio Unionville, Pennsylvania (disambiguation) (multiple) Unionville, South Carolina, historic settlement now named Union, South Carolina Unionville, Tennessee Unionville, Utah, historic settlement now renamed Hoytsville, Utah See also Unienville, a French commune
https://en.wikipedia.org/wiki/Sigma%207
Sigma 7 may refer to: the callsign of the spacecraft used in the 1962 Mercury-Atlas 8 mission the SDS Sigma 7 computer, made by Scientific Data Systems (SDS), later known as Xerox Data Systems (XDS) Sigma Seven, a Japanese company Sigma 7 (video game), 1987 computer game by Durell Software Sigma 7 AC servo drive series from Yaskawa Electric Corporation
https://en.wikipedia.org/wiki/Donald%20Duck%3A%20Goin%27%20Quackers
Donald Duck: Goin' Quackers (known as Donald Duck: Quack Attack in Europe) is a platform video game developed and published by Ubi Soft for various consoles and Windows-based personal computers. A different game with the same title was first released for the Game Boy Color, as well as on Game Boy Advance, the latter being given the title Donald Duck Advance. The game's reception was mixed, with reviewers praising the music, backgrounds and animations, but criticizing the short length and its aim to be played by a younger demographic. Gameplay Goin' Quackers' gameplay is very similar to that of Crash Bandicoot and requires the player to move through various settings in 24 levels in four warp rooms. The four level themes are Duckie Mountain, Duckburg, Magica De Spell's Manor and Merlock's Temple. Donald Duck has to dodge various enemies and obstacles throughout the levels and defeat bosses at the end of each warp room. There are also bonus levels where Donald Duck has to outrun a bear, a truck with an evil face, a ghost hand and a statue head, respectively. The viewpoint of the levels can change between a 2D side-scrolling perspective and a 3D perspective. Replaying levels to beat Gladstone's time in each gives the player advantages in the game. The player has four lives that can be increased by finding special items. Each life gives Donald two chances to be touched by an enemy: the first time he is touched, he becomes angry and goes berserk at the enemies; the second time, he loses a life. Donald can also unlock new outfits, which alter cutscenes and idle animations (such as Donald taking photos of the place if he is dressed like a tourist). Plot Goin' Quackers begins with Donald Duck, Gladstone Gander and Gyro Gearloose watching television reporter Daisy Duck discovering the mysterious temple of the evil magician Merlock. As she tells the story, she is kidnapped by Merlock.
His arch-rival Gladstone sets out to find her before Donald, who decides to use Gyro's new invention, the "Tubal Teleport System", to track down Merlock and Daisy. However, the machine doesn't have enough power to get there, and for it to reach Merlock's lair, Donald must go on a journey to plant antennas at certain locations to boost the machine's power. Along the way, he must compete with Gladstone, reverse the spells that Merlock put on Huey, Dewey, and Louie's toys, and defeat several bosses, including the Beagle Boys and Magica De Spell. In the end, Donald locates Merlock, defeats him, and rescues Daisy. The temple collapses, but Gyro is able to teleport them back to his lab, where Donald receives a kiss from Daisy for rescuing her. This game features the returning voice talents of Tony Anselmo, Tress MacNeille, June Foray, Corey Burton and Russi Taylor. Development The game was conceptualized by Ubi Soft Montreal in collaboration with Disney Interactive as an homage to Disney comic book artist Carl Barks, who died in 2000. Different Ubis
https://en.wikipedia.org/wiki/Ostankino%20Tower
Ostankino Tower is a television and radio tower in Moscow, Russia, owned by the Moscow branch of unitary enterprise Russian TV and Radio Broadcasting Network. It was designed by Nikolai Nikitin. It is the tallest free-standing structure in Europe and the 12th tallest in the world. Between 1967 and 1974, it was the tallest in the world. The tower was the first free-standing structure to exceed in height. Ostankino was built to mark the 50th anniversary of the October Revolution. It is named after the surrounding Ostankino district of Moscow. Upon the completion of construction, approximately 10,000,000 individuals resided within the transmitter coverage area, which expanded to over 15,000,000 people by 2014. This area encompasses Moscow and the Moscow Region, as well as certain portions of the Vladimir and Kaluga regions. The ownership of the TV tower lies with the Moscow Regional Center, a division of the Russian Television and Radio Broadcasting Network (RTRN). Under favorable weather conditions, the Ostankino TV Tower can be observed by residents of certain cities in the Moscow Region, including Balashikha, Voskresensk, Zelenograd, Korolev, Krasnogorsk, Lyubertsy, Mytishchi, Odintsovo, Podolsk, Khimki, and Shchyolkovo. History Choosing a site for construction The primary function of the Ostankino TV Tower is to transmit TV signals. Prior to its establishment, this task was fulfilled by a structure devised by Vladimir Shukhov in 1922. During that era, the "wicker" tower located on Shabolovka Street was widely regarded as an extraordinary feat of engineering ingenuity. However, it became apparent in the 1950s that the tower had reached the end of its lifespan. Originally designed primarily for radio signal transmission, the tower found itself adapting to the emergence of television broadcasting in the late 1930s. After nearly two decades, it became evident that the design had become outdated and struggled to efficiently handle its assigned functions.
The city required a new tower that would be more robust and spacious. Additionally, the authorities in the capital desired the signal from the structure to encompass not only Moscow but also the surrounding region. Consequently, based on the engineers' calculations, the height of the structure was determined to surpass 500 meters. Initially, the tower was intended to be situated on Balchug Island, which is now home to the Peter the Great Statue. Then a plot of land was offered in Cheryomushki, conveniently situated in the midst of the new blocks of Khrushchevka houses. The intention was to combine technical and construction progress. However, the idea did not come to fruition for a couple of reasons. Firstly, there was insufficient space for the tower in that location. Secondly, the construction could potentially disrupt the operations of the nearby Vnukovo airport. Ultimately, it was decided that the TV tower would be placed in Ostankino, where there was ample space available. So
https://en.wikipedia.org/wiki/Method%20of%20complements
In mathematics and computing, the method of complements is a technique to encode a symmetric range of positive and negative integers in a way that they can use the same algorithm (or mechanism) for addition throughout the whole range. For a given number of places half of the possible representations of numbers encode the positive numbers, the other half represents their respective additive inverses. The pairs of mutually additive inverse numbers are called complements. Thus subtraction of any number is implemented by adding its complement. Changing the sign of any number is encoded by generating its complement, which can be done by a very simple and efficient algorithm. This method was commonly used in mechanical calculators and is still used in modern computers. The generalized concept of the radix complement (as described below) is also valuable in number theory, such as in Midy's theorem. The nines' complement of a number given in decimal representation is formed by replacing each digit with nine minus that digit. To subtract a decimal number y (the subtrahend) from another number x (the minuend) two methods may be used: In the first method the nines' complement of x is added to y. Then the nines' complement of the result obtained is formed to produce the desired result. In the second method the nines' complement of y is added to x and one is added to the sum. The leftmost digit '1' of the result is then discarded. Discarding the leftmost '1' is especially convenient on calculators or computers that use a fixed number of digits: there is nowhere for it to go so it is simply lost during the calculation. The nines' complement plus one is known as the ten's complement. The method of complements can be extended to other number bases (radices); in particular, it is used on most digital computers to perform subtraction, represent negative numbers in base 2 or binary arithmetic and test underflow and overflow in calculation. 
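Both decimal methods can be sketched directly. The example values (873 − 218) and the fixed width n, standing in for a machine's digit count, are arbitrary choices for illustration:

```python
# Sketch of decimal subtraction via complements, for a fixed width of n digits.
def nines_complement(x: int, n: int) -> int:
    # Replacing each digit with nine minus that digit equals 10^n - 1 - x.
    return 10**n - 1 - x

def subtract_method_1(x: int, y: int, n: int) -> int:
    # First method: add the nines' complement of x to y,
    # then take the nines' complement of the result.
    return nines_complement(nines_complement(x, n) + y, n)

def subtract_method_2(x: int, y: int, n: int) -> int:
    # Second method: add the nines' complement of y to x, add one
    # (the ten's complement), then discard the leftmost '1' -- on a
    # fixed-width machine that is simply reduction modulo 10^n.
    return (x + nines_complement(y, n) + 1) % 10**n

print(subtract_method_1(873, 218, 3))  # 655
print(subtract_method_2(873, 218, 3))  # 655
```

Tracing the second method: 873 + 781 + 1 = 1655, and dropping the leading 1 leaves 655, with no subtraction machinery ever needed.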
Numeric complements The radix complement of an n-digit number y in radix b is defined as b^n − y. In practice, the radix complement is more easily obtained by adding 1 to the diminished radix complement, which is (b^n − 1) − y. While this seems equally difficult to calculate as the radix complement, it is actually simpler since b^n − 1 is simply the digit b − 1 repeated n times. This is because b^n − 1 = (b − 1)(b^(n−1) + b^(n−2) + ... + b + 1) (see also the geometric series formula). Knowing this, the diminished radix complement of a number can be found by complementing each digit with respect to b − 1, i.e. subtracting each digit in y from b − 1. The subtraction of y from x using diminished radix complements may be performed as follows. Add the diminished radix complement of x to y to obtain (b^n − 1) − x + y, or equivalently (b^n − 1) − (x − y), which is the diminished radix complement of x − y. Further taking the diminished radix complement of (b^n − 1) − (x − y) results in the desired answer of x − y. Alternatively, using the radix complement, x − y may be obtained by adding the radix complement of y to x to obtain x + (b^n − y), or (x − y) + b^n. Assuming y ≤ x, the result will be greater than or equal to b^n, and dropping the
https://en.wikipedia.org/wiki/Broadcast%20flag
A broadcast flag is a bit field sent in the data stream of a digital television program that indicates whether or not the data stream can be recorded, or if there are any restrictions on recorded content. Possible restrictions include the inability to save an unencrypted digital program to a hard disk or other non-volatile storage, inability to make secondary copies of recorded content (in order to share or archive), forceful reduction of quality when recording (such as reducing high-definition video to the resolution of standard TVs), and inability to skip over commercials. In the United States, new television receivers using the ATSC standard were supposed to incorporate this functionality by July 1, 2005. The requirement was successfully contested in 2005 and rescinded in 2011. FCC ruling Officially called "Digital Broadcast Television Redistribution Control," the FCC's rule is in 47 CFR 73.9002(b) and the following sections, stating in part: "No party shall sell or distribute in interstate commerce a Covered Demodulator Product that does not comply with the Demodulator Compliance Requirements and Demodulator Robustness Requirements." According to the rule, hardware must "actively thwart" piracy. The rule's Demodulator Compliance Requirements insist that all HDTV demodulators must "listen" for the flag (or assume it to be present in all signals). Flagged content must be output only to "protected outputs" (such as DVI and HDMI ports with HDCP encryption), or in degraded form through analog outputs or digital outputs with visual resolution of 720x480 pixels (EDTV) or less. Flagged content may be recorded only by "authorized" methods, which may include tethering of recordings to a single device. Since broadcast flags could be activated at any time, a viewer who often records a program might suddenly find that it is no longer possible to save their favorite show. This and other reasons led many to see the flags as a direct affront to consumer rights.
The Demodulator Robustness Requirements are difficult to implement in open source systems. Devices must be "robust" against user access or modifications, so that someone could not easily alter them to ignore the broadcast flags and gain access to the full digital stream. Since open-source device drivers are by design user-modifiable, a PC TV tuner card with open-source drivers would not be "robust". The GNU Radio project has already demonstrated that purely software-based demodulators can exist, so the hardware rule is not fully enforceable. Current status In American Library Association v. FCC, 406 F.3d 689 (D.C. Cir. 2005), the United States Court of Appeals for the D.C. Circuit ruled that the FCC had exceeded its authority in creating this rule. The court stated that the Commission could not prohibit the manufacture of computer or video hardware without copy-protection technology because the FCC only has authority to regulate transmissions, not devices that receive communication
https://en.wikipedia.org/wiki/COCOMO
The Constructive Cost Model (COCOMO) is a procedural software cost estimation model developed by Barry W. Boehm. The model parameters are derived from fitting a regression formula using data from historical projects (63 projects for COCOMO 81 and 163 projects for COCOMO II). History The constructive cost model was developed by Barry W. Boehm in the late 1970s and published in Boehm's 1981 book Software Engineering Economics as a model for estimating effort, cost, and schedule for software projects. It drew on a study of 63 projects at TRW Aerospace where Boehm was Director of Software Research and Technology. The study examined projects ranging in size from 2,000 to 100,000 lines of code, and programming languages ranging from assembly to PL/I. These projects were based on the waterfall model of software development which was the prevalent software development process in 1981. References to this model typically call it COCOMO 81. In 1995 COCOMO II was developed and finally published in 2000 in the book Software Cost Estimation with COCOMO II. COCOMO II is the successor of COCOMO 81 and is claimed to be better suited for estimating modern software development projects; providing support for more recent software development processes and was tuned using a larger database of 161 projects. The need for the new model came as software development technology moved from mainframe and overnight batch processing to desktop development, code reusability, and the use of off-the-shelf software components. COCOMO consists of a hierarchy of three increasingly detailed and accurate forms. The first level, Basic COCOMO is good for quick, early, rough order of magnitude estimates of software costs, but its accuracy is limited due to its lack of factors to account for difference in project attributes (Cost Drivers). Intermediate COCOMO takes these Cost Drivers into account and Detailed COCOMO additionally accounts for the influence of individual project phases. 
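The first of those levels can be sketched directly. The coefficients below are the widely published Basic COCOMO values from Boehm's 1981 book for the three project modes; the 32 KLOC input is an arbitrary example:

```python
# Sketch of the Basic COCOMO formulas: effort E = a * KLOC^b (person-months)
# and schedule D = c * E^d (months). Coefficients are the commonly published
# values from Software Engineering Economics (1981).
COEFFS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),  # small teams, familiar problems
    "semi-detached": (3.0, 1.12, 2.5, 0.35),  # mixed experience, medium size
    "embedded":      (3.6, 1.20, 2.5, 0.32),  # tight hardware/regulatory constraints
}

def basic_cocomo(kloc: float, mode: str = "organic"):
    a, b, c, d = COEFFS[mode]
    effort = a * kloc ** b          # person-months
    schedule = c * effort ** d      # months
    return effort, schedule

effort, schedule = basic_cocomo(32.0, "organic")
print(f"{effort:.1f} person-months over {schedule:.1f} months")
```

Note how the exponent b > 1 encodes the diseconomy of scale: doubling the code size more than doubles the estimated effort, and more steeply in embedded mode than in organic mode.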
The last is the Complete COCOMO model, which addresses the shortcomings of both the Basic and Intermediate forms. Intermediate COCOMOs Intermediate COCOMO computes software development effort as a function of program size and a set of "cost drivers" that include subjective assessments of product, hardware, personnel and project attributes. This extension considers four categories of "cost drivers", each with a number of subsidiary attributes: Product attributes Required software reliability extent Size of application database Complexity of the product Hardware attributes Run-time performance constraints Memory constraints Volatility of the virtual machine environment Required turnaround time Personnel attributes Analyst capability Software engineering capability Applications experience Virtual machine experience Programming language experience Project attributes Use of software tools Application of software engineering methods Required development schedule Each of the 15 attributes receives a rating on a
https://en.wikipedia.org/wiki/Log-structured%20file%20system
A log-structured filesystem is a file system in which data and metadata are written sequentially to a circular buffer, called a log. The design was first proposed in 1988 by John K. Ousterhout and Fred Douglis and first implemented in 1992 by Ousterhout and Mendel Rosenblum for the Unix-like Sprite distributed operating system. Rationale Conventional file systems tend to lay out files with great care for spatial locality and make in-place changes to their data structures in order to perform well on optical and magnetic disks, which tend to seek relatively slowly. The design of log-structured file systems is based on the hypothesis that this will no longer be effective because ever-increasing memory sizes on modern computers would lead to I/O becoming write-heavy because reads would be almost always satisfied from memory cache. A log-structured file system thus treats its storage as a circular log and writes sequentially to the head of the log. This has several important side effects: Write throughput on optical and magnetic disks is improved because they can be batched into large sequential runs and costly seeks are kept to a minimum. The structure is naturally suited to media with append-only zones or pages such as flash storages and shingled magnetic recording HDDs Writes create multiple, chronologically-advancing versions of both file data and meta-data. Some implementations make these old file versions nameable and accessible, a feature sometimes called time-travel or snapshotting. This is very similar to a versioning file system. Recovery from crashes is simpler. Upon its next mount, the file system does not need to walk all its data structures to fix any inconsistencies, but can reconstruct its state from the last consistent point in the log. Log-structured file systems, however, must reclaim free space from the tail of the log to prevent the file system from becoming full when the head of the log wraps around to meet it. 
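The append-only log, the versioning it produces, and the need to reclaim superseded data can be illustrated with a toy in-memory model. Real implementations work with on-disk segments and far richer metadata; this sketch only shows the core bookkeeping:

```python
# Toy sketch of a log-structured store: writes append to the head of a log,
# an index maps each file to its latest log position, and cleaning reclaims
# records that a newer version has superseded.
class LogFS:
    def __init__(self):
        self.log = []      # list of (name, data) records, oldest first
        self.index = {}    # name -> position of the latest version in the log

    def write(self, name, data):
        self.index[name] = len(self.log)   # the newest version wins
        self.log.append((name, data))      # sequential append, never in-place

    def read(self, name):
        return self.log[self.index[name]][1]

    def clean(self):
        # Garbage collection: copy only live records forward, drop the rest.
        live = [(n, d) for pos, (n, d) in enumerate(self.log)
                if self.index[n] == pos]
        self.log, self.index = [], {}
        for n, d in live:
            self.write(n, d)

fs = LogFS()
fs.write("a.txt", "v1")
fs.write("a.txt", "v2")    # the old version stays in the log until cleaning
fs.write("b.txt", "hello")
print(len(fs.log))         # 3 records before cleaning
fs.clean()
print(len(fs.log), fs.read("a.txt"))  # 2 records; reads still see "v2"
```

Before `clean()` runs, the superseded `"v1"` record is exactly the kind of old version that snapshotting implementations choose to keep nameable instead of reclaiming.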
The tail can release space and move forward by skipping over data for which newer versions exist farther ahead in the log. If there are no newer versions, then the data is moved and appended to the head. To reduce the overhead incurred by this garbage collection, most implementations avoid purely circular logs and divide up their storage into segments. The head of the log simply advances into non-adjacent segments which are already free. If space is needed, the least-full segments are reclaimed first. This decreases the I/O load (and decreases the write amplification) of the garbage collector, but becomes increasingly ineffective as the file system fills up and nears capacity. Disadvantages The design rationale for log-structured file systems assumes that most reads will be optimized away by ever-enlarging memory caches. This assumption does not always hold: On magnetic media—where seeks are relatively expensive—the log structure may actually make reads much slower, since it fragments files that con
https://en.wikipedia.org/wiki/Internet%20exchange%20point
Internet exchange points (IXes or IXPs) are common grounds of IP networking, allowing participant Internet service providers (ISPs) to exchange data destined for their respective networks. IXPs are generally located at places with preexisting connections to multiple distinct networks, i.e., datacenters, and operate physical infrastructure (switches) to connect their participants. Organizationally, most IXPs are each independent not-for-profit associations of their constituent participating networks (that is, the set of ISPs which participate at that IXP). The primary alternative to IXPs is private peering, where ISPs directly connect their networks to each other. IXPs reduce the portion of an ISP's traffic that must be delivered via their upstream transit providers, thereby reducing the average per-bit delivery cost of their service. Furthermore, the increased number of paths available through the IXP improves routing efficiency (by allowing routers to select shorter paths) and fault-tolerance. IXPs exhibit the characteristics of the network effect. History Internet exchange points began as Network Access Points or NAPs, a key component of Al Gore's National Information Infrastructure (NII) plan, which defined the transition from the US Government-paid-for NSFNET era (when Internet access was government sponsored and commercial traffic was prohibited) to the commercial Internet of today. The four Network Access Points (NAPs) were defined as transitional data communications facilities at which Network Service Providers (NSPs) would exchange traffic, in replacement of the publicly financed NSFNET Internet backbone. The National Science Foundation let contracts supporting the four NAPs, one to MFS Datanet for the preexisting MAE-East in Washington, D.C., and three others to Sprint, Ameritech, and Pacific Bell, for new facilities of various designs and technologies, in New York (actually Pennsauken, New Jersey), Chicago, and California, respectively. 
As a transitional strategy, they were effective, providing a bridge from the Internet's beginnings as a government-funded academic experiment, to the modern Internet of many private-sector competitors collaborating to form a network-of-networks, transporting Internet bandwidth from its points-of-production at Internet exchange points to its sites-of-consumption at users' locations. This transition was particularly timely, coming hard on the heels of the ANS CO+RE controversy, which had disturbed the nascent industry, led to congressional hearings, resulted in a law allowing NSF to promote and use networks that carry commercial traffic, prompted a review of the administration of NSFNET by the NSF's Inspector General (no serious problems were found), and caused commercial operators to realize that they needed to be able to communicate with each other independent of third parties or at neutral exchange points. Although the three telco-operated NAPs faded into obscurity relatively quickly after th
https://en.wikipedia.org/wiki/Human-readable%20medium%20and%20data
In computing, a human-readable medium or human-readable format is any encoding of data or information that can be naturally read by humans, resulting in human-readable data. It is often encoded as ASCII or Unicode text, rather than as binary data. In most contexts, the alternative to a human-readable representation is a machine-readable format or medium of data primarily designed for reading by electronic, mechanical or optical devices, or computers. For example, Universal Product Code (UPC) barcodes are very difficult to read for humans, but very effective and reliable with the proper equipment, whereas the strings of numerals that commonly accompany the label are the human-readable form of the barcode information. Since any type of data encoding can be parsed by a suitably programmed computer, the decision to use binary encoding rather than text encoding is usually made to conserve storage space. Encoding data in a binary format typically requires fewer bytes of storage and increases efficiency of access (input and output) by eliminating format parsing or conversion. With the advent of standardized, highly structured markup languages, such as Extensible Markup Language (XML), the decreasing costs of data storage, and faster and cheaper data communication networks, compromises between human-readability and machine-readability are now more common-place than they were in the past. This has led to humane markup languages and modern configuration file formats that are far easier for humans to read. In addition, these structured representations can be compressed very effectively for transmission or storage. Human-readable protocols greatly reduce the cost of debugging. Various organizations have standardized the definition of human-readable and machine-readable data and how they are applied in their respective fields of application, e.g., the Universal Postal Union. 
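The storage trade-off described above can be seen by encoding the same record both ways; the record fields here are invented for the example:

```python
# Compare a machine-oriented binary encoding with a human-readable text
# encoding of the same (made-up) record.
import json
import struct

record = {"id": 1048576, "price": 19.99, "qty": 3}

# Binary: fixed-size fields, compact and parse-free, but opaque to humans.
# "<IdI" = little-endian unsigned int (4), double (8), unsigned int (4).
binary = struct.pack("<IdI", record["id"], record["price"], record["qty"])

# Text: self-describing and directly readable, at the cost of more bytes.
text = json.dumps(record).encode("utf-8")

print(len(binary), "bytes binary")   # 16 bytes
print(len(text), "bytes JSON text")
print(text.decode("utf-8"))          # a human can read this directly
```

The binary form needs the `"<IdI"` layout to be known out of band, while the JSON form carries its own field names, which is exactly the compromise structured markup makes.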
Often the term human-readable is also used to describe shorter names or strings that are easier to comprehend or to remember than long, complex syntax notations, such as some Uniform Resource Locator strings. Occasionally "human-readable" is used to describe ways of encoding an arbitrary integer into a long series of English words. Compared to decimal or other compact binary-to-text encoding systems, English words are easier for humans to read, remember, and type in. See also Self-documenting code – source code that is both machine-readable and human-readable Human-readable code Machine-Readable Documents Machine-readable data Data (computing) Data conversion Hellschreiber Human–computer interaction Human factors Plain text Quoted printable References Optical character recognition
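A toy sketch of the word-encoding idea mentioned above: treat the integer as a number in base N, with one word per digit. The eight-word list here is invented for illustration; real schemes (mnemonic word lists and the like) use much larger dictionaries so that fewer words are needed.

```python
# Toy scheme: the integer is written in base 8, one word per digit.
WORDS = ["apple", "brick", "cloud", "delta", "ember", "frost", "grape", "heron"]
INDEX = {w: i for i, w in enumerate(WORDS)}

def int_to_words(n: int) -> list[str]:
    """Encode a non-negative integer as a sequence of words."""
    if n == 0:
        return [WORDS[0]]
    digits = []
    while n:
        digits.append(WORDS[n % len(WORDS)])
        n //= len(WORDS)
    return digits[::-1]   # most significant word first

def words_to_int(words: list[str]) -> int:
    """Decode a word sequence back into the original integer."""
    n = 0
    for w in words:
        n = n * len(WORDS) + INDEX[w]
    return n

assert words_to_int(int_to_words(123456789)) == 123456789
```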
https://en.wikipedia.org/wiki/Lempel%E2%80%93Ziv%E2%80%93Markov%20chain%20algorithm
The Lempel–Ziv–Markov chain algorithm (LZMA) is an algorithm used to perform lossless data compression. It has been under development since either 1996 or 1998 by Igor Pavlov and was first used in the 7z format of the 7-Zip archiver. This algorithm uses a dictionary compression scheme somewhat similar to the LZ77 algorithm published by Abraham Lempel and Jacob Ziv in 1977 and features a high compression ratio (generally higher than bzip2) and a variable compression-dictionary size (up to 4 GB), while still maintaining decompression speed similar to other commonly used compression algorithms. LZMA2 is a simple container format that can include both uncompressed data and LZMA data, possibly with multiple different LZMA encoding parameters. LZMA2 supports arbitrarily scalable multithreaded compression and decompression and efficient compression of data which is partially incompressible. Overview LZMA uses a dictionary compression algorithm (a variant of LZ77 with huge dictionary sizes and special support for repeatedly used match distances), whose output is then encoded with a range encoder, using a complex model to make a probability prediction of each bit. The dictionary compressor finds matches using sophisticated dictionary data structures, and produces a stream of literal symbols and phrase references, which is encoded one bit at a time by the range encoder: many encodings are possible, and a dynamic programming algorithm is used to select an optimal one under certain approximations. Prior to LZMA, most encoder models were purely byte-based (i.e. they coded each bit using only a cascade of contexts to represent the dependencies on previous bits from the same byte). 
The main innovation of LZMA is that instead of a generic byte-based model, LZMA's model uses contexts specific to the bitfields in each representation of a literal or phrase: this is nearly as simple as a generic byte-based model, but gives much better compression because it avoids mixing unrelated bits together in the same context. Furthermore, compared to classic dictionary compression (such as the one used in zip and gzip formats), the dictionary sizes can be and usually are much larger, taking advantage of the large amount of memory available on modern systems. Compressed format overview In LZMA compression, the compressed stream is a stream of bits, encoded using an adaptive binary range coder. The stream is divided into packets, each packet describing either a single byte, or an LZ77 sequence with its length and distance implicitly or explicitly encoded. Each part of each packet is modeled with independent contexts, so the probability predictions for each bit are correlated with the values of that bit (and related bits from the same field) in previous packets of the same type. Both the lzip and the LZMA SDK documentation describes this stream format. There are 7 types of packets: LONGREP[*] refers to LONGREP[0–3] packets, *REP refers to both LONGREP and SHORTREP, and
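Python's standard-library `lzma` module exposes this compressor (wrapping it by default in the .xz/LZMA2 container). A minimal round-trip sketch:

```python
import lzma

data = b"the quick brown fox jumps over the lazy dog " * 1000

# Compress with the default .xz (LZMA2) container and round-trip the data.
compressed = lzma.compress(data, preset=6)
assert lzma.decompress(compressed) == data

# Highly repetitive input compresses dramatically.
assert len(compressed) < len(data) // 10
```

Higher presets trade compression time (and dictionary size) for a smaller output, mirroring the variable dictionary sizes described above.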
https://en.wikipedia.org/wiki/7z
7z is a compressed archive file format that supports several different data compression, encryption and pre-processing algorithms. The 7z format initially appeared as implemented by the 7-Zip archiver. The 7-Zip program is publicly available under the terms of the GNU Lesser General Public License. The LZMA SDK 4.62 was placed in the public domain in December 2008. The latest stable version of 7-Zip and LZMA SDK is version 22.01. The official, informal 7z file format specification has been distributed with 7-Zip's source code since 2015. The specification can be found in plain text format in the 'doc' sub-directory of the source code distribution. There have been additional third-party attempts at writing more concrete documentation based on the released code. Features and enhancements The 7z format provides the following main features: Open, modular architecture that allows any compression, conversion, or encryption method to be stacked. High compression ratios (depending on the compression method used). AES-256 bit encryption. Zip 2.0 (Legacy) Encryption Large file support (up to approximately 16 exbibytes, or 2⁶⁴ bytes). Unicode file names. Support for solid compression, where multiple files of like type are compressed within a single stream, in order to exploit the combined redundancy inherent in similar files. Compression and encryption of archive headers. Support for multi-part archives: e.g. xxx.7z.001, xxx.7z.002, ... (see the context menu items Split File... to create them and Combine Files... to re-assemble an archive from a set of multi-part component files). Support for custom codec plugin DLLs. The format's open architecture allows additional future compression methods to be added to the standard. Compression methods The following compression methods are currently defined: LZMA – A variation of the LZ77 algorithm, using a sliding dictionary up to 4 GB in length for duplicate string elimination. 
The LZ stage is followed by entropy coding using a Markov chain-based range coder and binary trees. LZMA2 – modified version of LZMA providing better multithreading support and less expansion of incompressible data. Bzip2 – The standard Burrows–Wheeler transform algorithm. Bzip2 uses two reversible transformations: BWT, then move-to-front, with Huffman coding for symbol reduction (the actual compression element). PPMd – Dmitry Shkarin's 2002 PPMdH (PPMII (Prediction by Partial matching with Information Inheritance) and cPPMII (complicated PPMII)) with small changes: PPMII is an improved version of the 1984 PPM compression algorithm (prediction by partial matching). DEFLATE – Standard algorithm based on 32 kB LZ77 and Huffman coding. Deflate is found in several file formats including ZIP, gzip, PNG and PDF. 7-Zip contains a from-scratch DEFLATE encoder that frequently beats the de facto standard zlib version in compression size, but at the expense of CPU usage. A suite of recompression tools called AdvanceCOMP contains
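Three of the methods listed above have standard-library counterparts in Python (DEFLATE via zlib, bzip2 via bz2, LZMA via lzma), which makes it easy to compare them on the same input. Exact sizes depend on the input and preset, so this sketch is purely illustrative.

```python
import bz2
import lzma
import zlib

data = b"ABCD" * 50_000  # 200 kB of highly redundant input

results = {
    "DEFLATE (zlib)": zlib.compress(data, 9),
    "bzip2": bz2.compress(data, 9),
    "LZMA (xz)": lzma.compress(data, preset=9),
}

for name, blob in results.items():
    print(f"{name}: {len(blob)} bytes")

# All three are lossless: each round-trips to the original bytes.
assert zlib.decompress(results["DEFLATE (zlib)"]) == data
assert bz2.decompress(results["bzip2"]) == data
assert lzma.decompress(results["LZMA (xz)"]) == data
```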
https://en.wikipedia.org/wiki/Chris%20Crawford%20%28game%20designer%29
Christopher Crawford (born June 1, 1950) is an American video game designer and writer. Hired by Alan Kay to work at Atari, Inc., he wrote the computer wargame Eastern Front (1941) for the Atari 8-bit family which was sold through the Atari Program Exchange and later Atari's official product line. After leaving Atari, he wrote a string of games beginning with Balance of Power for Macintosh. Writing about the process of developing games, he became known among other creators in the nascent home computer game industry for his passionate advocacy of game design as an art form. He self-published The Journal of Computer Game Design and founded the Computer Game Developers Conference (later renamed to the Game Developers Conference). In 1992 Crawford withdrew from commercial game development and began experimenting with ideas for a next generation interactive storytelling system. In 2018, Crawford announced that he had halted his work on interactive storytelling, concluding that it will take centuries for civilization to embrace the required concepts. Biography Crawford was born in 1950 in Houston, Texas. After receiving a Bachelor's in physics from UC Davis in 1972 and a Master's in physics from the University of Missouri in 1975, Crawford taught at a community college and the University of California. Crawford first encountered computer games in Missouri, when he met someone attempting to computerize Avalon Hill's Blitzkrieg. While teaching, he wrote an early version of Tanktics in Fortran for the IBM 1130 in 1976 as a hobby, then wrote Tanktics and an early version of Legionnaire for personal computers such as the KIM-1 and Commodore PET. In 1978, Crawford began selling the games and by 1979 "made the startling discovery," he later said, "that it is far more lucrative and enjoyable to teach for fun and program for money." He joined Atari that year, founding the Games Research Group under Alan Kay in 1982. 
1980s At Atari, Crawford started game work with Wizard for the Atari VCS, but Atari Marketing decided not to publish this work. He then turned his attention to the new "Atari Home Computer System," now referred to as the Atari 8-bit family. His first releases on this platform were Energy Czar and Scram, both of which were written in Atari BASIC and published by Atari. He experimented with the Atari 8-bit computer's hardware-assisted smooth scrolling and used it to produce a scrolling map display. This work led to Eastern Front (1941), which is widely considered one of the first wargames on a microcomputer to compete with traditional paper-and-pencil games in terms of depth. Eastern Front was initially published through the Atari Program Exchange, which was intended for user-written software. It was later moved to Atari's official product line. He followed this with Legionnaire, based on the same display engine but adding real-time instead of turn-based game play. Using the knowledge gathered while writing these games, he helped produce technic
https://en.wikipedia.org/wiki/Wisconsin%20Public%20Radio
Wisconsin Public Radio (WPR) is a network of 34 public radio stations in the state of Wisconsin. WPR's network is divided into two distinct analog services, the Ideas Network and the NPR News & Music Network, as well as the All Classical Network, a digital-only, full-time classical music service. History In 1932, WHA in Madison and WLBL in Stevens Point started limited simulcasting of certain programs. However, the first real steps toward the building of what would become Wisconsin Public Radio began in 1947, with the sign-on of WHA-FM (now WERN) as a sister station to WHA. Between 1948 and 1965, seven more FM stations signed on as part of what was initially dubbed Wisconsin Educational Radio. The network became Wisconsin Public Radio in 1971, when it became a charter member of National Public Radio. Shortly afterward, the merger of the University of Wisconsin and Wisconsin State University systems into the present-day University of Wisconsin System greatly increased WPR's reach. Ideas Network The Ideas Network is devoted mostly to discussion and call-in shows, focusing on the state of Wisconsin and issues involving the state. The name of the network comes from the Wisconsin Idea concept associated with the UW System. During the week, the Ideas Network airs locally produced talk programming, the longtime daily reading showcase Chapter a Day, WBUR's On Point in mid-mornings, WAMU's 1A Mondays through Thursdays, and National Public Radio's Science Friday in early afternoons; at night it broadcasts Q and As It Happens from CBC Radio One, along with a mix of national programs including Reveal, Latino USA and The Moth Radio Hour, as well as repeats of Chapter a Day, and overnight, the BBC World Service. In election years, expanded political coverage occurs, with WPR often helping coordinate political debates for the state's highest offices, such as Governor and Attorney General, often with PBS Wisconsin (formerly Wisconsin Public Television). 
On the weekend, it airs WPR-produced shows, such as Zorba Paster On Your Health and To the Best of Our Knowledge. Weekends also include NPR/PRI/APM entertainment programming such as Wait Wait... Don't Tell Me!, Ask Me Another, Radiolab and Live from Here (the former A Prairie Home Companion) on Saturdays, with Says You!, A Way with Words, Milk Street Radio, Bullseye with Jesse Thorn, and This American Life on Sundays. Other WPR-originated programming on the weekends includes University of the Air, the folk-music-focused Simply Folk, and PRX Remix. Higher Ground, a program of world music hosted by Dr. Jonathan Overby, is broadcast on Saturday night on WHAD, WPR's Ideas Network station in Milwaukee, and otherwise heard on WPR's News & Classical Music stations around the state. Formerly the network carried old-time radio programming on weekend evenings, but discontinued it in June 2020 because the racial and sexist attitudes of the era had come to be seen as outdated and offensive to general audiences. The flagship st
https://en.wikipedia.org/wiki/UTF
UTF may refer to: Computing Unicode Transformation Format UTF-1 UTF-7 UTF-8 UTF-16 UTF-32 Other uses U.T.F. (Undead Task Force), an American comic book title Underground Test Facility, used for testing and developing enhanced oil recovery technology in northern Canada Unión del Trabajo de Filipinas, in the Philippines
https://en.wikipedia.org/wiki/Question%20answering
Question answering (QA) is a computer science discipline within the fields of information retrieval and natural language processing (NLP) that is concerned with building systems that automatically answer questions that are posed by humans in a natural language. Overview A question-answering implementation, usually a computer program, may construct its answers by querying a structured database of knowledge or information, usually a knowledge base. More commonly, question-answering systems can pull answers from an unstructured collection of natural language documents. Some examples of natural language document collections used for question answering systems include: a collection of reference texts internal organization documents and web pages compiled newswire reports a set of Wikipedia pages a subset of World Wide Web pages Types of question answering Question-answering research attempts to develop ways of answering a wide range of question types, including fact, list, definition, how, why, hypothetical, semantically constrained, and cross-lingual questions. Answering questions related to an article in order to evaluate reading comprehension is one of the simpler forms of question answering, since a given article is relatively short compared to the domains of other types of question-answering problems. An example of such a question is "What did Albert Einstein win the Nobel Prize for?" after an article about this subject is given to the system. Closed-book question answering is when a system has memorized some facts during training and can answer questions without explicitly being given a context. This is similar to humans taking closed-book exams. Closed-domain question answering deals with questions under a specific domain (for example, medicine or automotive maintenance) and can exploit domain-specific knowledge frequently formalized in ontologies. 
Alternatively, "closed-domain" might refer to a situation where only limited types of questions are accepted, such as questions asking for descriptive rather than procedural information. Question answering systems based on machine reading have also been constructed in the medical domain, for instance for Alzheimer's disease. Open-domain question answering deals with questions about nearly anything and can only rely on general ontologies and world knowledge. Systems designed for open-domain question answering usually have much more data available from which to extract the answer. An example of an open-domain question is "What did Albert Einstein win the Nobel Prize for?" while no article about this subject is given to the system. Another way to categorize question-answering systems is by the technical approach used. There are a number of different types of QA systems, including rule-based systems, statistical systems, and hybrid systems. Rule-based systems use a set of rules to determine the correct answer to a question. Statistical systems use statistical methods to find the
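A minimal rule-based sketch in the spirit described above, with a single hand-written pattern and an invented two-entry knowledge base. Real rule-based systems use many rules and far richer knowledge representations; this only illustrates the matching-rule-plus-lookup structure.

```python
import re

# A tiny closed-domain knowledge base (entries are illustrative).
FACTS = {
    "albert einstein": {"nobel prize": "the photoelectric effect"},
    "marie curie": {"nobel prize": "research on radioactivity"},
}

# One hand-written rule: "What did X win the Nobel Prize for?"
RULE = re.compile(r"what did (?P<who>.+?) win the nobel prize for", re.I)

def answer(question):
    """Return an answer string if a rule matches and the entity is known, else None."""
    m = RULE.search(question)
    if not m:
        return None
    entity = FACTS.get(m.group("who").strip().lower())
    return entity.get("nobel prize") if entity else None

assert answer("What did Albert Einstein win the Nobel Prize for?") == "the photoelectric effect"
assert answer("Who discovered penicillin?") is None
```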
https://en.wikipedia.org/wiki/Gold%20%28British%20TV%20channel%29
Gold is a British pay television channel from the UKTV network that was launched in late 1992 as UK Gold and rebranded UKTV Gold in 2004. In 2008, it was split into the current flagship channel Gold and the miscellaneous channel W, with classic comedy-based programming now airing on Gold, non-crime drama and entertainment programming airing on W, and quiz shows and more high-brow comedy airing on Dave. It shows repeats of classic programming from the BBC, ITV and other broadcasters. Every December, from 2015 until 2018, the channel was temporarily renamed Christmas Gold. This has since been discontinued, although the channel still continues to broadcast Christmas comedy. History The channel was formed as a joint venture between the BBC, through its commercial arm BBC Enterprises, the American company Cox Enterprises and the outgoing ITV London weekday franchisee Thames Television, known together as European Channel Management. The channel, named "UK Gold", was to show repeats of the 'classic' archive programming from the two broadcasters. The channel launched on 1 November 1992 at 7pm with Just Good Friends. The first commercial shown on the channel was for Lucozade, and all commercials shown in the first three breaks on the channel's launch night had the word gold or golden either in the name of the brand advertised or mentioned in the commercial itself. The rights to the BBC programmes had previously been held by the BSB entertainment channel Galaxy, prior to the merger with Sky Television to form BSkyB in November 1990. The channel was initially broadcast on an analogue transponder from an SES satellite at 19.2°E, which was less well suited for UK reception. As a result, the channel was notorious for being marred with interference, known as 'sparklies', in large parts of the UK. Another initial drawback was the cutting of programming down to fit commercial time slots, and the intensive use of commercial breaks. 
Reception improved, however, when the channel was added to BSkyB's basic subscription package in 1993 and launched on cable services. In 1993, Flextech gained its first stake in the station after acquiring Tele-Communications Inc.'s TV interests in Europe. In 1996, it started discussions about increasing its stake to gain full control. At that point, Flextech held 27% with Cox (38%), BBC (20%) and Pearson (15%). By the autumn, Flextech held 80% of UK Gold. Flextech's main reason for increasing its stake in UK Gold was to participate in new talks with the BBC. UKTV The channel's success led to the launch of the UKTV network on 1 November 1997, owned by BBC Worldwide and Flextech, and consisting of three other channels: UK Arena, UK Horizons and UK Style, focusing on the arts, factual and lifestyle programmes respectively. The UKTV network would expand to include numerous more channels as the years progressed. The UK Gold brand was expanded in October 1998 with the launch of the digital-only channel UK Gold Classics, broadcast
https://en.wikipedia.org/wiki/Bergerac%20%28TV%20series%29
Bergerac is a British crime drama television series. Set in Jersey, it ran from 18 October 1981 to 26 December 1991. Produced by the BBC in association with the Australian Seven Network, and first screened on BBC1, it stars John Nettles as the title character Jim Bergerac, who is initially a detective sergeant in Le Bureau des Étrangers ("The Foreigners' Office", a fictional department dealing with non-Jersey residents) within the States of Jersey Police, but later leaves the force and becomes a private investigator. Westward Studios executive producer Brian Constantine said the Bergerac reboot was in the final stages of development, possibly airing in 2024. Background The series ran from 1981 to 1991. It was created by producer Robert Banks Stewart after an earlier detective series, Shoestring, starring Trevor Eve, came to an abrupt end. Like Shoestring, the series begins with a man returning to work after a particularly bad period in his life: Eddie Shoestring from a nervous breakdown; Jim Bergerac from alcoholism and from a crushed and badly broken leg. Bergerac sometimes deals with controversial topics; for example, when an old man is unmasked as a Nazi war criminal, his age raises various moral dilemmas. Supernatural elements occasionally appear in the series, and some episodes end with unpleasant twists, as in "Offshore Trades" and "A Hole in the Bucket". The final episode filmed was the 1991 Christmas special titled "All for Love", set partly in Bath. The final scene provides a strong hint about Bergerac's future, after Charlie Hungerford recommends Bergerac for a new position heading an expanded Bureau des Étrangers covering the whole of the Channel Islands following its success in Jersey. The show is repeated on channels such as Alibi and Drama. On 24 February 2014, the BBC started a rerun of the series on daytime afternoons on BBC Two. The repeats concluded with series 3 to avoid showing the Haut de la Garenne location. 
Episodes Cast and characters Main John Nettles as Detective Sergeant Jim Bergerac Terence Alexander as Charlie Hungerford Sean Arnold as Chief Inspector/Superintendent Barney Crozier (series 1–8) Cécile Paoli as Francine Leland (series 1) Deborah Grant as Deborah Bergerac Annette Badland as Charlotte (series 1–3) Celia Imrie as Marianne Bellshade (series 2) Louise Jameson as Susan Young (series 4–8) Thérèse Liotard as Danielle Aubry (series 8–9) Supporting Mela White as Diamante Lil (series 1–5) Lindsay Heath as Kim Bergerac (series 1–5) Geoffrey Leesley as Detective Constable Terry Wilson (series 1–5) Tony Melody as the Chief (series 1–3) Jonathan Adams as Dr. LeJeune (series 1–3) Liza Goddard as Philippa Vale (series 3–7) Nancy Mansfield as Peggy Masters (series 4–7) Jolyon Baker as DC Barry Goddard (series 4–5) John Telfer as DC Willy Pettit (series 6–9) Ben Kershaw as DC Ben Lomas (series 6–9) Lead character and casting Jim Bergerac is a complex character, presented by the series as a s
https://en.wikipedia.org/wiki/Unary
Unary may refer to: Unary numeral system, the simplest numeral system to represent natural numbers Unary function, a function that takes one argument; in computer science, unary operators are a special case of unary functions Unary operation, a kind of mathematical operator that has only one operand Unary relation, a mathematical relation that has one argument Unary coding, an entropy encoding that represents a number n with n − 1 ones followed by a zero See also Primary (disambiguation) Binary (disambiguation)
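Unary coding as described above can be sketched in a few lines. This follows the n − 1 ones-then-zero convention stated here; the variant that writes n ones followed by a zero is equally common.

```python
def unary_encode(n: int) -> str:
    """Encode a positive integer n as (n - 1) ones followed by a single zero."""
    if n < 1:
        raise ValueError("this unary code is defined for n >= 1")
    return "1" * (n - 1) + "0"

def unary_decode(bits: str) -> list[int]:
    """Decode a concatenation of unary codewords back into integers."""
    # Each codeword ends at a '0'; the run of ones before it has length n - 1.
    return [len(word) + 1 for word in bits.split("0")[:-1]]

assert unary_encode(1) == "0"
assert unary_encode(4) == "1110"
assert unary_decode(unary_encode(3) + unary_encode(1) + unary_encode(2)) == [3, 1, 2]
```

Because every codeword is self-delimiting (it ends at the first zero), a stream of codewords needs no separators, which is what makes unary usable as an entropy code for small integers.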
https://en.wikipedia.org/wiki/Sarah%20Flannery
Sarah Flannery (born 1982, County Cork, Ireland) was, at sixteen years old, the winner of the 1999 Esat Young Scientist Exhibition for her development of the Cayley–Purser algorithm, based on work she had done with researchers at Baltimore Technologies during a brief internship there. The project, entitled "Cryptography – A new algorithm versus the RSA", also won her the EU Young Scientist of the Year Award in 1999. Biography Flannery's education included a primary all-girls school and secondary education at Scoil Mhuire Gan Smál in Blarney. Following the competition win, in 2001 Flannery co-authored In Code with her father, mathematician David Flannery. It tells the story of the making and breaking of the Cayley–Purser algorithm, as well as the enjoyment she got from solving mathematical puzzles while growing up. She attributes many of her accomplishments in the fields of mathematics and cryptography to her father's support during her childhood. She studied computer science at Peterhouse, a college of the University of Cambridge, graduating in 2003, and, as of 2006, worked for Electronic Arts as a software engineer. She later worked at TirNua as its Chief Scientist, where she developed the virtual economy in a game and the back-end web services that powered the game's features. She has also worked at RockYou and several other institutions involved in software development and computer science. Before working at TirNua, Flannery was a software engineer working directly with the then Electronic Arts Worldwide Chief Technology Officer, Scott Cronce, and, later, with many fellow TirNua founders on her first virtual world. At EA, she set up the EA Open Source program using the Essential Project. Flannery created data visualizations of software architecture and game content creation, which were used to improve the quality of both. She also ran and turned around the virtual economy within EA-Land (formerly The Sims Online). 
Previously, she worked on the technical and scientific computing software product Mathematica for Wolfram Research. The lights on St. Patrick's Street, one of the main thoroughfares of Flannery's home city of Cork, are named after her. Flannery is the sister of the singer and songwriter Mick Flannery. Bibliography (2000) Sarah Flannery and David Flannery. In Code: A Mathematical Journey. 271 pages. London: Profile. (2002) Sarah Flannery and David Flannery. In Code: A Mathematical Journey, revised edition. 341 pages. Chapel Hill, N.C.: Algonquin Books of Chapel Hill. Cryptography – A new algorithm versus the RSA. See also Intel International Science and Engineering Fair Linear algebra Cryptography Cayley–Purser algorithm References External links Mathematica and the Science of Secrecy Sarah's cracking algorithm homepage at TirNua
https://en.wikipedia.org/wiki/UB
UB or Ub may refer to: Organizations Basel University Library, abbreviated UB. UltimateBet, a defunct online poker site Ungermann-Bass, a computer networking company in California United Biscuits, a British and European food manufacturer United Breweries Group, a brewery conglomerate in India Urząd Bezpieczeństwa (1945–1954), part of the Polish secret police Church of the United Brethren in Christ, also known as the United Brethren Myanmar National Airlines, IATA code UB Urban Behavior, defunct Canadian retailer Places Ub, Serbia, a town in Serbia Ub (river), a river in Serbia UB postcode area, in London, England Ulaanbaatar (Ulan Bator), Mongolia Universities University of Bridgeport, Connecticut, US Bakrie University, Jakarta, Indonesia University at Buffalo, New York, US University of Baguio, Philippines University of the Bahamas University of Ballarat, Victoria, Australia University of Baltimore, Maryland, US University of Barcelona, Catalonia, Spain University of Basrah, Iraq University of Batangas, Philippines University of Belgrade, Serbia University of Belgrano, Argentina University of Birmingham, UK University of Bohol, Philippines University of Botswana University of Brawijaya, Malang, Indonesia University of Buckingham, Buckinghamshire, UK University of Buea, Buea, Cameroon University of Burgundy, France Science and technology Berezin UB, a Soviet World War II machine gun Ubiquitin, a small regulatory protein Undefined behavior, in computer science, operations that are unspecified Universal Beam, a type of I-beam Upper bound, a mathematical concept in order theory Other uses Ub Iwerks (1901–1971), American animator and cartoonist Ugly Betty, an American drama-comedy television series German submarine UB, German name for the captured British submarine HMS Seal (N37) German Type UB submarine Upward Bound, a United States Department of Education-sponsored program The Urantia Book, a spiritual and philosophical book published in 1955 U.S.C. 
de Bananier, Guadeloupean football club in Capesterre-Belle-Eau U.S. Bitonto, an Italian association football club in Bitonto, Apulia See also U of B (disambiguation) UBS (disambiguation) Ubo (disambiguation) Ubu (disambiguation) OOB (disambiguation)
https://en.wikipedia.org/wiki/Planisphere
In astronomy, a planisphere is a star chart analog computing instrument in the form of two adjustable disks that rotate on a common pivot. It can be adjusted to display the visible stars for any time and date. It is an instrument to assist in learning how to recognize stars and constellations. The astrolabe, an instrument that has its origins in Hellenistic astronomy, is a predecessor of the modern planisphere. The term planisphere contrasts with armillary sphere, where the celestial sphere is represented by a three-dimensional framework of rings. Description A planisphere consists of a circular star chart attached at its center to an opaque circular overlay that has a clear elliptical window or hole so that only a portion of the sky map will be visible in the window or hole area at any given time. The chart and overlay are mounted so that they are free to rotate about a common axis. The star chart contains the brightest stars, constellations and (possibly) deep-sky objects visible from a particular latitude on Earth. The night sky that one sees from the Earth depends on whether the observer is in the northern or southern hemisphere and on the latitude. A planisphere window is designed for a particular latitude and will be accurate enough for a certain band on either side of that. Planisphere makers will usually offer them in a number of versions for different latitudes. Planispheres only show the stars visible from the observer's latitude; stars below the horizon are not included. A complete twenty-four-hour time cycle is marked on the rim of the overlay. A full twelve months of calendar dates are marked on the rim of the star chart. The window is marked to show the direction of the eastern and western horizons. The disk and overlay are adjusted so that the observer's local time of day on the overlay corresponds to that day's date on the star chart disc. 
The portion of the star chart visible in the window then represents (with a distortion because it is a flat surface representing a spherical volume) the distribution of stars in the sky at that moment for the planisphere's designed location. Users hold the planisphere above their head with the eastern and western horizons correctly aligned to match the chart to actual star positions. History The word planisphere (Latin planisphaerium) was originally used in the second century by Claudius Ptolemy to describe the representation of a spherical Earth by a map drawn in the plane. This usage continued into the Renaissance: for example Gerardus Mercator described his 1569 world map as a planisphere. In this article the word describes the representation of the star-filled celestial sphere on the plane. The first star chart to have the name "planisphere" was made in 1624 by Jacob Bartsch. Bartsch was the son-in-law of Johannes Kepler, discoverer of Kepler's laws of planetary motion. The star chart Since the planisphere shows the celestial sphere in a printed flat, there is always considerable distort
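The dial-setting step described above effectively tracks local sidereal time: the star disc makes one full turn per sidereal day, so the angle it should be rotated to is the local sidereal time for the chosen date and clock time. A low-precision sketch of that angle, using a common approximation for Greenwich mean sidereal time (adequate for a planisphere, not for precise astronomy):

```python
from datetime import datetime, timezone

def local_sidereal_degrees(when_utc: datetime, east_longitude_deg: float) -> float:
    """Approximate local sidereal time in degrees, via a standard low-precision formula."""
    j2000 = datetime(2000, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
    days = (when_utc - j2000).total_seconds() / 86400.0
    # Greenwich mean sidereal time: ~360.9856 deg/day reflects the sidereal rate.
    gmst = 280.46061837 + 360.98564736629 * days
    return (gmst + east_longitude_deg) % 360.0

# Dial angle for an observer near London at a given instant (longitude is assumed):
angle = local_sidereal_degrees(datetime(2023, 3, 21, 22, 0, tzinfo=timezone.utc), -0.13)
assert 0.0 <= angle < 360.0
```

Lining up the time and date rims on a physical planisphere performs this same computation mechanically.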
https://en.wikipedia.org/wiki/Backdoor%20%28computing%29
A backdoor is a typically covert method of bypassing normal authentication or encryption in a computer, product, embedded device (e.g. a home router), or its embodiment (e.g. part of a cryptosystem, algorithm, chipset, or even a "homunculus computer"—a tiny computer-within-a-computer such as that found in Intel's AMT technology). Backdoors are most often used for securing remote access to a computer, or obtaining access to plaintext in cryptosystems. From there it may be used to gain access to privileged information like passwords, corrupt or delete data on hard drives, or transfer information within autoschediastic networks. A backdoor may take the form of a hidden part of a program, a separate program (e.g. Back Orifice may subvert the system through a rootkit), code in the firmware of the hardware, or parts of an operating system such as Windows. Trojan horses can be used to create vulnerabilities in a device. A Trojan horse may appear to be an entirely legitimate program, but when executed, it triggers an activity that may install a backdoor. Although some are secretly installed, other backdoors are deliberate and widely known. These kinds of backdoors have "legitimate" uses such as providing the manufacturer with a way to restore user passwords. Many systems that store information within the cloud fail to create accurate security measures. If many systems are connected within the cloud, hackers can gain access to all other platforms through the most vulnerable system. Default passwords (or other default credentials) can function as backdoors if they are not changed by the user. Some debugging features can also act as backdoors if they are not removed in the release version. In 1993, the United States government attempted to deploy an encryption system, the Clipper chip, with an explicit backdoor for law enforcement and national security access. The chip was unsuccessful. 
Recent proposals to counter backdoors include creating a database of backdoors' triggers and then using neural networks to detect them. Overview The threat of backdoors surfaced when multiuser and networked operating systems became widely adopted. Petersen and Turn discussed computer subversion in a paper published in the proceedings of the 1967 AFIPS Conference. They noted a class of active infiltration attacks that use "trapdoor" entry points into the system to bypass security facilities and permit direct access to data. The use of the word trapdoor here clearly coincides with more recent definitions of a backdoor. However, since the advent of public key cryptography the term trapdoor has acquired a different meaning (see trapdoor function), and thus the term "backdoor" is now preferred, the older sense of trapdoor having gone out of use. More generally, such security breaches were discussed at length in a RAND Corporation task force report published under DARPA sponsorship by J.P. Anderson and D.J. Edwards in 1970. A backdoor in a login system might take the form of a hard
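The hard-coded-credential pattern in a login system can be illustrated with a deliberately insecure toy. All names and passwords here are invented, and this is a demonstration of the pattern only, not code to reuse: anyone who can read the source (or firmware) learns the bypass.

```python
import hmac

# Legitimate credential store (illustrative only; real systems store salted hashes).
USERS = {"alice": "correct horse battery staple"}

# The backdoor: a hard-coded maintenance password that bypasses the user check.
_MASTER = "letmein-0xDEADBEEF"

def login(user: str, password: str) -> bool:
    if hmac.compare_digest(password, _MASTER):   # hidden bypass for any username
        return True
    # Normal path: constant-time comparison against the stored credential.
    return hmac.compare_digest(USERS.get(user, ""), password)

assert login("alice", "correct horse battery staple")   # normal path
assert login("anyone-at-all", "letmein-0xDEADBEEF")     # backdoor path
assert not login("alice", "wrong")
```

Auditing for exactly this kind of branch (an equality test against a constant on the credential path) is one of the simplest backdoor checks a code review can do.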