https://en.wikipedia.org/wiki/Raw
Raw is an adjective usually describing:
Raw materials, basic materials from which products are manufactured or made
Raw food, uncooked food

Raw or RAW may also refer to:

Computing and electronics
.RAW, a proprietary mass spectrometry data format
Raw audio format, a file type used to represent sound in uncompressed form
Raw image format, a variety of image files used by digital cameras, containing unprocessed data
Rawdisk, binary level disk access
Read after write, technologies used for CD-R and CD-RW
Read after write (RAW) hazard, a data dependency hazard considered in microprocessor architecture
Raw display, a raw framed monitor

Film and television
Raw TV, a British TV production company
Raw (film), a 2016 film
Raw (TV series), an Irish drama series
Eddie Murphy Raw, a 1987 live stand-up comedy recording
Ramones: Raw, a 2004 music documentary
Raw FM, an Australian television series
WWE Raw, a weekly World Wrestling Entertainment program

Games
Rules as written, or RAW, the literal rules of a game, similar to the letter of the law
WWF Raw (1994 video game), professional wrestling video game
RAW 2, Xbox professional wrestling video game
WWE video games, category listing all games based on the WWE RAW franchise and more

Magazines
Raw (comics magazine), comics magazine launched in 1980
Raw (music magazine), British magazine published by EMAP in the 1980s and 1990s
Raw Magazine, published by World Wrestling Entertainment, see WWE Magazine

Music
Albums
R.A.W. (album), a 2000 album by Daz Dillinger
RAW (City Girls album), 2023
Raw (Alyson Williams album), 1989
Raw (Crack the Sky album), 1986
Raw (Hopsin album), 2010
Raw (Jimmy Barnes album), 2001
Raw (Juvenile album), 2005
Raw (Keith LeBlanc album), 1990
Raw (Moxy album), 2002
Raw (Ra album), 2006
Raw (Sex Pistols album), recorded 1976, released 1997
Raw (Shannon Noll album), 2021
Raw (The Alarm album), 1991
A Little More Personal (Raw), a 2005 album by Lindsay Lohan

Songs
"Raw", a song by Bad Meets Evil from the Southpaw soundtrack
"Raw", the lead single from Spandau Ballet's Heart Like a Sky
"Raw", a song by Staind from Dysfunction

People
Mr Raw (born 1975), stage name of rapper Okechukwu Edwards Ukeje
Robert Anton Wilson (1932–2007), an American author
The Mighty Raw, pseudonym of American musician Ron Wasserman
David Raw (born 1944), English cricketer
Harry Raw (born 1903), English footballer
Nathan Raw (1866–1940), English physician and politician
Peter Raw (1922–1988), Royal Australian Air Force officer
Rowland Raw (1884–1915), English cricketer
Simon Raw (born 1994), South African rugby union player
Steve Raw (born 1966), English darts player
Sydney Raw (1898–1967), Royal Navy vice admiral
Vanessa Raw (born 1984), English triathlete
Vause Raw (1921–2001), South African politician

Organisations
Research and Analysis Wing, or RAW, India's external intelligence agency
Rosa Antifa Wien, an Austrian left-wing action group

Other uses
Ra
https://en.wikipedia.org/wiki/Evolutionary%20programming
Evolutionary programming is one of the four major evolutionary algorithm paradigms. It is similar to genetic programming, but the structure of the program to be optimized is fixed, while its numerical parameters are allowed to evolve. It was first used by Lawrence J. Fogel in the US in 1960 in order to use simulated evolution as a learning process aiming to generate artificial intelligence. Fogel used finite-state machines as predictors and evolved them. Currently evolutionary programming is a wide evolutionary computing dialect with no fixed structure or representation, in contrast with some of the other dialects, and it has become harder to distinguish from evolution strategies. Its main variation operator is mutation; members of the population are viewed as belonging to distinct species rather than to a single interbreeding species, so each parent generates one offspring by mutation alone, and survivors are chosen by (μ + μ) selection. See also Artificial intelligence Genetic algorithm Genetic operator References Fogel, L.J., Owens, A.J., Walsh, M.J. (1966), Artificial Intelligence through Simulated Evolution, John Wiley. Fogel, L.J. (1999), Intelligence through Simulated Evolution: Forty Years of Evolutionary Programming, John Wiley. Eiben, A.E., Smith, J.E. (2003), Introduction to Evolutionary Computing, Springer. External links The Hitch-Hiker's Guide to Evolutionary Computation: What's Evolutionary Programming (EP)? Evolutionary Programming by Jason Brownlee (PhD) Evolutionary algorithms Optimization algorithms and methods
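The mutation-plus-(μ + μ)-selection loop described above can be sketched in a few lines of Python. This is only an illustration (the function name, parameters, and toy objective are invented for the example); modern EP variants usually also self-adapt the mutation step size per individual:

```python
import random

def evolutionary_programming(fitness, dim, mu=20, generations=200, sigma=0.1):
    """Minimal EP sketch: real-valued individuals, Gaussian mutation only
    (no crossover), and (mu + mu) truncation survivor selection."""
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # each parent produces exactly one offspring, by mutation alone
        offspring = [[x + random.gauss(0, sigma) for x in ind] for ind in pop]
        # (mu + mu): the best mu of parents and offspring together survive
        pop = sorted(pop + offspring, key=fitness)[:mu]
    return pop[0]

# minimize a simple sphere function as a toy objective
best = evolutionary_programming(lambda v: sum(x * x for x in v), dim=3)
```

Note that, true to the paradigm, only the numerical parameters evolve; the "program structure" (here, a fixed-length vector) never changes.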
https://en.wikipedia.org/wiki/Terminal%20%28telecommunication%29
In the context of telecommunications, a terminal is a device which ends a telecommunications link and is the point at which a signal enters or leaves a network. Examples of terminal equipment include telephones, fax machines, computer terminals, printers and workstations. An end instrument is a piece of equipment connected to the wires at the end of a telecommunications link. In telephony, this is usually a telephone connected to a local loop. End instruments that relate to data terminal equipment include printers, computers, barcode readers, automated teller machines (ATMs) and the console ports of routers. See also Communication endpoint Data terminal equipment End system Host (network) Node (networking) Terminal equipment References External links Directive 1999/5/EC of the European Parliament and of the Council of 9 March 1999 on radio equipment and telecommunications terminal equipment and the mutual recognition of their conformity (R&TTE Directive). Telecommunications equipment
https://en.wikipedia.org/wiki/Face%20the%20Nation
Face the Nation is a weekly news and morning public affairs program airing Sundays on the CBS radio and television network. Created by Frank Stanton in 1954, Face the Nation is one of the longest-running news programs in the history of television. Typically, the program features interviews with prominent American officials, politicians, and authors. Margaret Brennan has been the moderator of Face the Nation since 2018, though former host John Dickerson substituted during Brennan's maternity leave in spring and summer 2021. Upon Brennan's return to the program in September 2021, its title was changed to Face the Nation with Margaret Brennan. The show's full hour is broadcast live from the CBS News Washington, D.C., bureau at 10:30 a.m. Eastern Time, though some stations delay or abbreviate episodes to accommodate local and sports programming. In 2017, Face the Nation's audience was the largest of all Sunday public affairs programs, with an average of 3.538 million viewers. NBC competitor Meet the Press closely competed for the title in 2018, besting Face the Nation's audience for several months. Format Similar to its Sunday morning competitors, Face the Nation begins each episode with a short "tease" segment recapping the week's events and teasing the day's guests, set to the show's theme music. The remainder of the program's first half-hour typically features interviews of prominent politicians, often lawmakers and cabinet or White House officials, responding to issues from the week's news. The program's second half-hour transitions to more discussion-oriented segments, including interviews of notable authors with forthcoming books and a weekly roundtable discussion with a rotating cast of panelists. The program's roundtable discussion has been indefinitely suspended since circa May 2020, the producers citing their desire to devote more time to interviews (and it remained suspended in fall 2022). 
Unlike some of its competitors, Face the Nation generally books only journalists and columnists for its panel discussions, omitting current and former politicians from providing punditry. During major news events or breaking news, the program will often feature reports from various CBS News correspondents before the day's interviews, to allow guests the opportunity to respond to the latest news. Distribution Face the Nation's first half-hour airs on CBS television stations throughout the United States, typically in the morning. In 2018, the CBS News digital streaming network began re-airing the program's full hour at 11:00 a.m., 3:00 p.m., and 6:00 p.m. Eastern Time. Many of the network's affiliates in the Pacific Time Zone air Face the Nation at 8:30 a.m. local time, serving as a lead-in to the CBS Sports program The NFL Today during the football season. A delayed audio broadcast of the program is also carried on a handful of radio affiliates through the CBS Radio Network, and in the late afternoon on C-SPAN's Washington area radi
https://en.wikipedia.org/wiki/PiHex
PiHex was a distributed computing project organized by Colin Percival to calculate specific bits of π. 1,246 contributors used idle time slices on almost two thousand computers to make its calculations. The software used for the project made use of Bellard's formula, a faster version of the BBP formula. History To calculate the five trillionth bit (and the following seventy-six bits) took 13,500 CPU hours, using 25 computers from 6 different countries. The forty trillionth bit required 84,500 CPU hours and 126 computers from 18 different countries. The highest calculation, the one quadrillionth bit, took 1.2 million CPU hours and 1,734 computers from 56 different countries. Total resources: 1,885 computers donated 1.3 million CPU hours. The average computer used in the calculations would have taken 148 years to complete them alone. After setting three records, calculating the five trillionth bit, the forty trillionth bit, and the quadrillionth bit, the project ended on September 11, 2000. While the PiHex project calculated the least significant digits of π ever attempted in any base, second place is held by Peter Trueb, who computed some 22+ trillion digits of π in 2016, and third place by houkouonchi, who derived the 13.3 trillionth digit in base 10. Algorithm Unlike most computations of π, which compute results in base 10, PiHex computed in base 2 (bits), because Bellard's formula and the BBP formula could only be used to compute π in base 2 at the time. 
The final bit strings for each of the three calculations resulted as such:

Binary digits of π from five trillion minus three to five trillion and seventy-six (completed August 30, 1998):
0000 0111 1110 0100 0101 0111 0011 0011 1100 1100
   ^ Five trillionth bit of π
0111 1001 0000 1011 0101 1011 0101 1001 0111 1001

Binary digits of π from forty trillion minus three to forty trillion and sixty-four (February 9, 1999):
1010 0000 1111 1001 1111 1111 0011 0111 0001 1101
   ^ Forty trillionth bit of π
0001 0111 0101 1001 0011 1110 0000

Binary digits of π from one quadrillion minus three to one quadrillion and sixty (September 11, 2000):
1110 0110 0010 0001 0110 1011 0000 0110 1001 1100
   ^ Quadrillionth bit of π
1011 0110 1100 0001 1101 0011

Therefore, the least significant known bit of π is 1, at position 1,000,000,000,000,060 (one quadrillion and sixty). References Pi-related software Distributed computing projects
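The digit-extraction trick behind PiHex can be illustrated with the plain BBP formula (Bellard's formula works the same way but converges faster): using modular exponentiation, the fractional part of 16^n·π can be computed without knowing any earlier digits. The sketch below is illustrative Python, not the project's software, and floating-point rounding limits it to positions vastly smaller than the trillions PiHex reached:

```python
def _series(j, n):
    # left sum of the BBP series, with 16^(n-k) reduced modulo (8k + j)
    # so that the numbers stay small
    s = 0.0
    for k in range(n + 1):
        s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
    # rapidly converging tail for k > n
    k, t = n + 1, 0.0
    while (term := 16.0 ** (n - k) / (8 * k + j)) > 1e-17:
        t += term
        k += 1
    return (s + t) % 1.0

def pi_hex_digits(n, count=8):
    """Hexadecimal digits of pi starting n places after the point."""
    x = (4 * _series(1, n) - 2 * _series(4, n)
         - _series(5, n) - _series(6, n)) % 1.0
    out = ""
    for _ in range(count):
        x *= 16
        out += "0123456789ABCDEF"[int(x)]
        x -= int(x)
    return out

pi_hex_digits(0)  # pi = 3.243F6A88... in hexadecimal
```

Each hex digit is four bits, which is why a base-16 extraction formula yields the base-2 results the project reported.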
https://en.wikipedia.org/wiki/Sanjay%20Kumar%20%28business%20executive%29
Sanjay Kumar (born 1962) is the former chairman and CEO of Computer Associates International (now CA Technologies), serving from 2000 until April 2004. He was sentenced to 12 years in prison in connection with the "35-day month" accounting scandal and was released in 2017. Early childhood He immigrated with his family to the United States in 1976 to escape civil unrest in his native Sri Lanka. The family originally settled in South Carolina. He attended Furman University from 1980 to 1983, and left without completing a degree. Career Kumar became an employee of Computer Associates in 1987 when it acquired UCCEL Corp. in an $800 million buyout. Kumar was, at the time, UCCEL's director of software development and had been employed by UCCEL for only a few months. Kumar was promoted to vice president of planning the following year, relocating to Computer Associates' Long Island headquarters. Over the years, he held various leadership roles at the firm. In 1989, he became senior vice president of planning and in 1993 moved up to executive vice president of operations. Kumar was named president and chief operating officer in 1994 at age 31, succeeding the retiring Tony Wang, the older brother of Chief Executive Officer Charles Wang; Tony had been pressured to leave to make way for Kumar. In 2000, Kumar replaced his mentor Charles Wang as chief executive officer of the firm and in 2002 became chairman of Computer Associates' board of directors. Kumar is widely credited with making CA more customer-friendly. Resignation Kumar resigned as chairman and chief executive in April 2004, following an investigation into securities fraud and obstruction of justice at Computer Associates. He remained with the firm in the new position of chief software architect for about six weeks before leaving the firm altogether on June 4, 2004. A federal grand jury in Brooklyn indicted him on fraud charges on September 22, 2004. 
Kumar pleaded guilty to obstruction of justice and securities fraud charges on April 24, 2006. On November 2, 2006, it was reported that he was sentenced to 12 years in prison and fined $8 million for his role in a massive accounting fraud at Computer Associates. Prison At the hearing in federal court in Brooklyn, Judge Leo Glasser sentenced Kumar, 44 years old, to 144 months in prison, to be followed by three years supervised release. The judge deferred payment of the fine until after restitution is determined at a hearing scheduled for February 2, 2007. Kumar was scheduled to report to prison on February 27, but that was delayed by two months due to delays in the restitution hearing. The start of the prison sentence was then delayed again, to November. However, in early June U.S. District Judge I. Leo Glasser ordered Kumar to surrender by August 14, 2007, to the federal correctional center in Fairton, New Jersey, to begin serving his 12-year sentence, and on that date he did so. Upon starting to serve his prison sentence, Kumar alleged that
https://en.wikipedia.org/wiki/Cybersix
Cybersix is an Argentine comic book series first published in 1991, created by the authors Carlos Trillo (story) and Carlos Meglia (art) for the comics magazine Skorpio (Eura Editoriale). The series first appeared in Spanish in November 1993. It follows the eponymous leather-clad survivor of a genetic engineering program who cross-dresses (to conceal her identity), working as a male teacher during the day and fighting the scientist who created her at night. The series was adapted into a live-action television series and an animated television series that garnered positive critical reception from the Pulcinella Awards. Plot Von Reichter is a surviving member of the Schutzstaffel of World War II. He works on experiments in South America, creating the Cyber series of artificial humanoids with super strength and agility. The 5,000 original Cybers became his servants, mimicking human emotions and developing wills of their own; when they disobey orders, Von Reichter orders them all destroyed. After the death of Cyber-29, Von Reichter transfers its brain into the body of a panther, Data-7. Cyber-6 (Cybersix) is one of the survivors, who escapes and arrives in the city of Meridiana. She disguises herself as school teacher Adrian Seidelman, after the real one is killed in a car crash. Cybersix defeats monsters called "Fixed Ideas" – humanoids of the Techno series – in order to drink the green sustenance liquid contained within them. Along the way, she meets an orphaned boy Julian, Von Reichter's cloned son José, and high school teacher Lucas Amato. Production Comics The comics were originally published in Italy in the magazine Skorpio in 113 weekly 12-page installments from May 1991 to July 1994, followed by 45 96-page comic books between November 1994 and January 1999. Parts of the material were translated into Spanish and published in Argentina (since 1993 by El Globo Editor) and in Spain (since 1995 by Planeta De Agostini). 
Collections were released in French, with twelve volumes distributed by Editions Vents d'Ouest between 1994 and 1998. Live-action series The series debuted in Argentina on 15 March 1995. It was produced by Luis Gandulfo, Sebastián Parrotta, Fernando Rascovsky and Andre Ronco, and written by Ricardo Rodríguez, Carlos Meglia and Carlos Trillo. The series aired on Telefé, but was cancelled after only a few episodes due to low ratings. Cybersix was played by former model and actress Carolina Peleritti, José was played by Rodrigo de la Serna, and Doguyy was played by Mario Moscoso. Animated series The series debuted in Canada and Argentina on 6 September 1999, and was subsequently dubbed in French, Polish, Japanese, Malaysian and Thai. It was produced by Canadian company NoA and animated by Japanese studio Tokyo Movie Shinsha. The series' music was composed by Robbi Finkel, and character designs were overseen by Teiichi Takiguchi. The show was aimed at children by toning down the comics' darker themes. Two seasons were originally planned, but it was cancelle
https://en.wikipedia.org/wiki/Charles%20Wang
Charles B. Wang (; August 19, 1944 – October 21, 2018) was a Chinese-American billionaire, businessman, and philanthropist, who was a co-founder and CEO of Computer Associates International, Inc. (later renamed CA Technologies). He was a minority owner (and past majority owner) of the NHL's New York Islanders ice hockey team and their AHL affiliate. In 1976, at age 31, Wang (pronounced "Wong") launched Computer Associates, using credit cards for funding. Wang then grew Computer Associates into one of the country's largest ISVs (independent software vendors). Wang authored two books to help executives master technology: Techno Vision (1994, McGraw-Hill) and Techno Vision II (1997, McGraw-Hill). Wang retired from Computer Associates in 2002. He was an active philanthropist, working with such organizations as Smile Train, the World Childhood Foundation, the Islanders Children's Foundation and the National Center for Missing and Exploited Children, among others. In January 2022, the new UBS Arena in Belmont, home of the New York Islanders, raised a plaque to honor Wang for all his work and dedication to the team. Wang's net worth was estimated to be $17.6 billion. Early life Charles B. Wang was born in Shanghai to parents Kenneth and Mary Wang. He has two brothers, Anthony W. Wang and Francis Wang. His father was a Supreme Court judge in the Republic of China. In the closing years of the Chinese Civil War which saw the Nationalist government flee to Taiwan, the Wangs moved to Queens, New York City when he was eight years old. He attended Brooklyn Technical High School in Fort Greene, Brooklyn. Wang earned a Bachelor of Science degree from Queens College and began his computer career at Columbia University's Riverside Research Institute. Business Computer Associates Wang and his business partner Russell Artzt established Computer Associates in 1976, guiding the company toward its current standing as one of the largest ISVs in the world. 
One year later, Computer Associates became the first enterprise software company to provide multi-platform products, foreshadowing its ongoing emphasis on compatibility and integration. By 1989, Computer Associates became the second software-only company to reach US$1 billion in revenues. Wang's tenure as CEO of Computer Associates was marked by rapid growth, frequently as a result of strict hiring practices and high expectations for executives of acquired companies. Nearly all of Computer Associates' managers were promoted from within, so very few acquired managers were kept. Newly hired salespeople had some sales experience, but specifically not in software. A Master's Degree in Business Administration held little to no value at CA, so employment candidates and acquired employees with MBAs were typically rejected. It was unusual for a technician to be considered for sales because the firm's training program was geared toward products instead of professional selling. The pass/fail demarcation was sharp, so p
https://en.wikipedia.org/wiki/Telegard
Telegard is an early bulletin board system (BBS) software program written for IBM PC-compatible computers running MS-DOS and OS/2. Telegard was written in Pascal with routines written in C++ and assembly language, based on a copy of the WWIV source code. Telegard has several features that make it attractive to BBS sysops, such as being free, having remote administration facilities built into the main program, and the ability to handle CD-ROMs internally. Telegard is still viable today as it can accept telnet connections by using a virtual modem/FOSSIL set up such as NetSerial, a virtual modem driver, and NetFoss, a freeware FOSSIL driver, both for Windows. External links Telegard BBS Software Homepage See also Mystic BBS Renegade (BBS) WWIV OpenTG References Bulletin board system software DOS software
https://en.wikipedia.org/wiki/Loki%20%28computer%29
Loki was the code name for a cancelled home computer developed at Sinclair Research during the mid-1980s. The name came from the Norse god Loki, god of mischief and thieves. Loki was based on the ZX Spectrum, but intended to rival the Amiga for video games. Loki followed two earlier, aborted research projects from Sinclair: the 68008-based SuperSpectrum home computer (cancelled in 1982) and the LC3 game console (cancelled in 1983). Design According to an article published in Sinclair User magazine, Loki was to have a 7 MHz Z80H CPU, a minimum of 128 KiB of RAM and two custom chips providing much enhanced video and audio capabilities compared to the ZX Spectrum, but with a compatibility mode. The video chip, referred to as the Rasterop chip, would have blitter-type functionality and three different modes: 512×256 pixels with 16 colours, 256×212 with 256 colours, or 256×212 with 64 colours and two bits per pixel used for "blitter objects". Comprehensive peripheral support was also claimed, including MIDI, lightpen, joystick and floppy disk. A version of the SuperBASIC language from the Sinclair QL was to be provided in place of the old Sinclair BASIC for the ZX Spectrum and support for the CP/M operating system was also intended. On top of this, the computer would cost as little as £200. Another Spectrum magazine, Crash, poured scorn on the report in Sinclair User, dismissing the design as "dreamware" in the opinion of an ex-Sinclair designer they consulted, analysing the implied components and costs, and adding, "It may be fun to dream about Loki, but the fact is that it won't appear, and nor will anything like it." This was the rationale, according to Crash Technical Editor Simon Goodwin: History When Amstrad bought out Sinclair's computer business in 1986, the project was cancelled. Martin Brennan and John Mathieson, two Sinclair engineers, took the Loki concept with them and founded Flare Technology. 
There they worked on the cancelled Konix Multisystem game console, then later worked with Atari Corporation on the Panther (cancelled) and Jaguar systems. According to Jaguar developer Andrew Whittaker, two other Sinclair employees, Bruce Gordon and Alan Miles, who went on to form Miles Gordon Technology, also used some of the designs in the SAM Coupé. References External links USENET posting by Rupert Goodwins in comp.sys.sinclair mentioning Loki Sinclair computers and derivatives Cancelled projects
https://en.wikipedia.org/wiki/Electronic%20data%20processing
Electronic data processing (EDP) can refer to the use of automated methods to process commercial data. Typically, this uses relatively simple, repetitive activities to process large volumes of similar information. For example: stock updates applied to an inventory, banking transactions applied to account and customer master files, booking and ticketing transactions to an airline's reservation system, billing for utility services. The modifier "electronic" or "automatic" was used with "data processing" (DP), especially c. 1960, to distinguish human clerical data processing from that done by computer. History Herman Hollerith, then at the U.S. Census Bureau, devised a tabulating system that included cards (the Hollerith card, later the punched card), a punch for making holes in them representing data, a tabulator and a sorter. The system was tested in computing mortality statistics for the city of Baltimore. In the first commercial application of such data processing, Hollerith machines were used to compile the data accumulated in the 1890 U.S. Census of population. Hollerith's Tabulating Machine Company merged with two other firms to form the Computing-Tabulating-Recording Company, later renamed IBM. The punch-card and tabulation machine business remained the core of data processing until the advent of electronic computing in the 1950s (which then still rested on punch cards for storing information). The first commercial business computer was developed in the United Kingdom in 1951, by the J. Lyons and Co. catering organization. This was known as the 'Lyons Electronic Office' – or LEO for short. It was developed further and used widely during the 1960s and early 1970s. (Lyons formed a separate company to develop the LEO computers; this subsequently merged to form English Electric Leo Marconi and then International Computers Limited.) By the end of the 1950s punched card manufacturers – Hollerith, Powers-Samas, IBM and others – were also marketing an array of computers. 
Early commercial systems were installed exclusively by large organizations, which could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke software and work through the consequent (and often unexpected) organizational and cultural changes. At first, individual organizations developed their own software, including data management utilities, themselves. Different products might also have 'one-off' bespoke software. This fragmented approach led to duplicated effort, and producing management information required manual effort. High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were heavily compacted, for example. A common example is the removal of the century from dates, which eventually led to the 'millennium bug'. Data input required intermediate processing via punched paper tape or punched card and separate input to a repetitive, labor-intensive task,
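The date-compaction example above is easy to make concrete. The sketch below is hypothetical (the field layout and pivot value are assumptions, not taken from any particular system): a YYMMDD field drops the century, and a "windowing" pivot is the classic retrofit used to guess it:

```python
def parse_yymmdd(field, pivot=50):
    """Decode a compacted 6-character YYMMDD date field.

    The century was never stored, so past 1999 the system must guess it.
    This windowing rule (pivot is an arbitrary cutoff, assumed here) is
    one common retrofit for the millennium bug."""
    yy, mm, dd = int(field[:2]), int(field[2:4]), int(field[4:6])
    year = 1900 + yy if yy >= pivot else 2000 + yy
    return year, mm, dd

parse_yymmdd("991231")  # (1999, 12, 31)
parse_yymmdd("000101")  # (2000, 1, 1) -- correct only because of the pivot guess
```

Two characters saved per record mattered when storage was billed per byte; the cost surfaced decades later.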
https://en.wikipedia.org/wiki/FICO
FICO (legal name: Fair Isaac Corporation), originally Fair, Isaac and Company, is a data analytics company based in Bozeman, Montana, focused on credit scoring services. It was founded by Bill Fair and Earl Isaac in 1956. Its FICO score, a measure of consumer credit risk, has become a fixture of consumer lending in the United States. In 2013, lenders purchased more than 10 billion FICO scores and about 30 million American consumers accessed their scores themselves. The company reported revenue of $1.29 billion for fiscal year 2020. History FICO was founded in 1956 as Fair, Isaac and Company by engineer William R. "Bill" Fair and mathematician Earl Judson Isaac. The two met while working at the Stanford Research Institute in Menlo Park, California. FICO sold its first credit scoring system two years after the company's creation, pitching the system to fifty American lenders. FICO went public in July 1987 and is traded on the New York Stock Exchange. The company debuted its first general-purpose FICO score in 1989. FICO scores are based on credit reports; "base" FICO scores range from 300 to 850, while industry-specific scores range from 250 to 900. Lenders use the scores to gauge a potential borrower's creditworthiness. In 1995, Fannie Mae and Freddie Mac first began using FICO scores to help determine which American consumers qualified for mortgages bought and sold by the companies. Name changes Originally called Fair, Isaac and Company (hence the abbreviation FICO), the name was changed to Fair Isaac Corporation in 2003. Headquarters moves Originally based in San Rafael, California, FICO moved its headquarters to Minneapolis, Minnesota, in 2004, a few years after Minnesota resident Thomas Grudnowski took over as CEO. In 2013, it moved its headquarters to San Jose, California, a year after CEO William Lansing joined. In 2016 it opened an office in Bozeman, Montana, which later became its headquarters. 
Acquisitions
DynaMark (1992)
Risk Management Technologies (1997)
Prevision (1997)
Nykamp Consulting Group (2001)
HNC Software (2002)
NAREX (2003)
Diversified Healthcare Services (2003)
Seurat (2003)
London Bridge Software (2004)
Braun Consulting (2004)
RulesPower (2005)
Dash Optimization (2008)
Entiera (2012)
Adeptra (2012)
CR Software (2012)
Infoglide (2013)
InfoCentricity (2014)
Karmasphere (2014)
TONBELLER AG (2015)
QuadMetrics (2016)
GoOn (2018)
EZMCOM (2019)
Operations FICO is headquartered in Bozeman, Montana, and has additional U.S. locations in San Jose, California; Roseville, Minnesota; San Diego; San Rafael, California; Fairfax, Virginia; and Austin, Texas. The company has international locations in Australia, Brazil, Canada, China, Germany, India, Italy, Japan, Korea, Lithuania, Poland, Malaysia, the Philippines, Russia, Singapore, South Africa, Spain, Taiwan, Thailand, Turkey and the United Kingdom. FICO score A measure of credit risk, FICO scores are available through all of the major consumer reporting agencies in the United States: Equifax
https://en.wikipedia.org/wiki/Conditional%20%28computer%20programming%29
In computer science, conditionals (that is, conditional statements, conditional expressions and conditional constructs) are programming language commands for handling decisions. Specifically, conditionals perform different computations or actions depending on whether a programmer-defined Boolean condition evaluates to true or false. In terms of control flow, the decision is always achieved by selectively altering the control flow based on some condition (apart from the case of branch predication). Although dynamic dispatch is not usually classified as a conditional construct, it is another way to select between alternatives at runtime. Conditional statements are the points in a program where control flow branches according to a runtime condition. Terminology In imperative programming languages, the term "conditional statement" is usually used, whereas in functional programming, the terms "conditional expression" or "conditional construct" are preferred, because these terms all have distinct meanings. If–then(–else) The if–then construct (sometimes called if–then–else) is common across many programming languages. Although the syntax varies from language to language, the basic structure (in pseudocode form) looks like this:

If (boolean condition) Then
    (consequent)
Else
    (alternative)
End If

For example:

If stock = 0 Then
    message = "order new stock"
Else
    message = "there is stock"
End If

In the example code above, the part represented by (boolean condition) constitutes a conditional expression, having intrinsic value (e.g., it may be substituted by either of the values True or False) but having no intrinsic meaning. In contrast, the combination of this expression, the If and Then surrounding it, and the consequent that follows afterward constitute a conditional statement, having intrinsic meaning (e.g., expressing a coherent logical rule) but no intrinsic value. 
When an interpreter finds an If, it expects a Boolean condition – for example, x > 0, which means "the variable x contains a number that is greater than zero" – and evaluates that condition. If the condition is true, the statements following the then are executed. Otherwise, the execution continues in the following branch – either in the else block (which is usually optional), or if there is no else branch, then after the end If. After either branch has been executed, control returns to the point after the end If. History and development In early programming languages, especially some dialects of BASIC in the 1980s home computers, an if–then statement could only contain GOTO statements (equivalent to a branch instruction). This led to a hard-to-read style of programming known as spaghetti programming, with programs in this style called spaghetti code. As a result, structured programming, which allows (virtually) arbitrary statements to be put in statement blocks inside an if statement, gained in popularity, until it became the norm even in most BASIC programming circle
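As a concrete counterpart to the stock pseudocode earlier in this section, here is the same logic in Python (the function wrapper is added only for illustration):

```python
def stock_message(stock):
    # (boolean condition) -> choose between consequent and alternative
    if stock == 0:
        message = "order new stock"   # consequent
    else:
        message = "there is stock"    # alternative
    return message

stock_message(0)  # "order new stock"
stock_message(7)  # "there is stock"
```

Exactly one of the two branches executes; after either branch, control continues at the statement following the if–else block, mirroring the "End If" of the pseudocode.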
https://en.wikipedia.org/wiki/IBM%20ThinkPad%20UltraPort
IBM UltraPort was a nonstandard USB 1.1 port used by IBM on its range of ThinkPad laptop computers. Description Electrically, the UltraPort connector is identical to a standard USB port. UltraPort uses a proprietary mechanical connection, so UltraPort devices cannot be plugged into a normal USB interface. However, UltraPort devices shipped with an adapter which allowed them to be attached to a regular USB port. Select ThinkPad models from 2000 to 2002 came with an UltraPort connector on the top edge of the laptop's screen, and IBM sold a variety of laptop-relevant UltraPort devices, including webcams, speakers, and microphone arrays. UltraPort was designed in 1999 in response to the proliferation of laptop computers from Sony, Fujitsu, and others that had built-in cameras, but a proprietary predecessor can be found on the ThinkPad 850 laptop, released in 1996. See also Compaq Evo's Multiport References USB ThinkPad UltraPort
https://en.wikipedia.org/wiki/Theme%20music
Theme music is a musical composition which is often written specifically for radio programming, television shows, video games, or films and is usually played during the title sequence, opening credits, closing credits, and in some instances at some point during the program. The purpose of a theme song is often similar to that of a leitmotif. The phrase theme song or signature tune may also be used to refer to a signature song that has become especially associated with a particular performer or dignitary, often used as they make an entrance. Purpose From the 1950s onwards, theme music, and especially theme songs also became a valuable source of additional revenue for Hollywood film studios, many of which launched their own recording arms. This period saw the beginning of more methodical cross-promotion of music and movies. One of the first big successes, which proved very influential, was the theme song for High Noon (1952). Types Television Theme music has been a feature of the majority of television programs since the medium's inception. Programs have used theme music in a large variety of styles, sometimes adapted from existing tunes, and with some composed specifically for the purpose. A few have been released commercially and become popular hits. Other themes, like the music for The Young and the Restless, Days of Our Lives, and Coronation Street have become iconic mostly due to the shows' respective longevities. Unlike others, these serials have not strayed from the original theme mix much, if at all, allowing them to be known by multiple generations of television viewers. In the United Kingdom and Ireland, iconic sports shows have such strong associations with their theme music that the sports themselves are synonymous with the theme tunes, such as association football (The Match of the Day, Grandstand and The Big Match theme tunes), cricket ("Soul Limbo" by Booker T. 
& the M.G.'s), motorsport (Roger Barsotti's Motor Sport and the bassline from Fleetwood Mac's "The Chain"), tennis (Keith Mansfield's "Light and Tuneful"), snooker ("Drag Racer" by the Doug Wood Band), skiing (Sam Fonteyn's "Pop Looks Bach", the theme to Ski Sunday) and Gaelic games ("Jägerlatein" by James Last). Themes in the United States that have become associated with a sport include Johnny Pearson's "Heavy Action" (used for many years as an intro to Monday Night Football), "Roundball Rock" (composed by John Tesh) as the theme for the NBA on NBC during the 1990s and early 2000s, and for Fox College Hoops (from 2018–19 to present) and Jr. NBA Championships (2019–present), "Bugler's Dream" (used in ABC and NBC's coverage of the Olympic Games) and the theme to ESPN's sports highlight show, SportsCenter. A notable theme that was once associated with a sport, but because of its popularity spread network-wide, was the NFL on Fox theme, which was used for Major League Baseball on Fox (2010–2019), NASCAR on Fox (2011–2015), Fox UFC (2012–2018) and Premier Boxing Ch
https://en.wikipedia.org/wiki/Grayscale
In digital photography, computer-generated imagery, and colorimetry, a grayscale image is one in which the value of each pixel is a single sample representing only an amount of light; that is, it carries only intensity information. Grayscale images, a kind of black-and-white or gray monochrome, are composed exclusively of shades of gray. The contrast ranges from black at the weakest intensity to white at the strongest. Grayscale images are distinct from one-bit bi-tonal black-and-white images, which, in the context of computer imaging, are images with only two colors: black and white (also called bilevel or binary images). Grayscale images have many shades of gray in between. Grayscale images can be the result of measuring the intensity of light at each pixel according to a particular weighted combination of frequencies (or wavelengths), and in such cases they are monochromatic proper when only a single frequency (in practice, a narrow band of frequencies) is captured. The frequencies can in principle be from anywhere in the electromagnetic spectrum (e.g. infrared, visible light, ultraviolet, etc.). A colorimetric (or more specifically photometric) grayscale image is an image that has a defined grayscale colorspace, which maps the stored numeric sample values to the achromatic channel of a standard colorspace, which itself is based on measured properties of human vision. If the original color image has no defined colorspace, or if the grayscale image is not intended to have the same human-perceived achromatic intensity as the color image, then there is no unique mapping from such a color image to a grayscale image. Numerical representations The intensity of a pixel is expressed within a given range between a minimum and a maximum, inclusive. This range is represented in an abstract way as a range from 0 (or 0%) (total absence, black) to 1 (or 100%) (total presence, white), with any fractional values in between.
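As an illustrative sketch of such a weighted combination, the ITU-R BT.601 luma weights (one common choice, assumed here rather than prescribed by the text) map an RGB triple to a single intensity in the [0, 1] range described above, which can then be quantized to an 8-bit gray level:

```python
def to_gray8(r: float, g: float, b: float) -> int:
    """Map RGB components in [0, 1] to an 8-bit gray level.

    The weights below are the ITU-R BT.601 luma coefficients, one
    common weighted combination; other standards (e.g. BT.709)
    use different weights.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    # Quantize the [0, 1] intensity to an unsigned 8-bit integer,
    # giving 256 possible shades of gray.
    return round(y * 255)

print(to_gray8(0.0, 0.0, 0.0))  # 0   (black)
print(to_gray8(1.0, 1.0, 1.0))  # 255 (white)
```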
This notation is used in academic papers, but it does not define what "black" or "white" is in terms of colorimetry. Sometimes the scale is reversed, as in printing, where the numeric intensity denotes how much ink is employed in halftoning, with 0% representing the paper white (no ink) and 100% being a solid black (full ink). In computing, although the grayscale can be computed through rational numbers, image pixels are usually quantized to store them as unsigned integers, to reduce the required storage and computation. Some early grayscale monitors could only display up to sixteen different shades, which would be stored in binary form using 4 bits. Today, however, grayscale images intended for visual display are commonly stored with 8 bits per sampled pixel. This pixel depth allows 256 different intensities (i.e., shades of gray) to be recorded, and also simplifies computation as each pixel sample can be accessed individually as one full byte. However, if these intensities were spaced equally in proportion to the amount of physical lig
https://en.wikipedia.org/wiki/Quicken%20Interchange%20Format
Quicken Interchange Format (QIF) is an open specification for reading and writing financial data to media (i.e. files). Background Although still widely used, QIF is an older format than Open Financial Exchange (OFX). The inability to reconcile imported transactions against the current account information is one of the primary shortcomings of QIF. Most personal money management software, such as Microsoft Money, GnuCash and Quicken's low-end products (e.g. Quicken Personal and Quicken Personal Plus), can read QIF files to import information. Intuit's Quicken used to be able to import QIF too, but with its 2006 version it dropped that support for several important account types, including checking, savings, and credit card accounts. The Australian version of Quicken still allows the importing of QIF files for these account types. However, unlike the American version, it is not possible to export data to QIF or any other file type for any account type. The QIF format does not allow a user to mark the currency in which a transaction was completed. In some cases this may cause problems for users who use multiple currencies when they export or import into another software package. Quicken's proposed replacement for the QIF format has been the proprietary Quicken Web Connect (QFX) format. It is commonly supported by financial institutions to supply downloadable information to account holders, especially by banks that support integration of Money or Quicken with their online banking. Not everybody, however, was or is happy with this replacement. Some banks dislike it because Quicken (Intuit) charges licensing fees to use QFX. Other banks pass the fees on by charging customers for downloading QFX files. Because Microsoft Money imports either QIF or OFX format files, and Microsoft does not charge banks any licensing fees to use OFX for Money, banks do not normally charge for downloading QIF and OFX files. (QIF and OFX are open formats, free for anyone to use.)
Data format A QIF file typically has the following structure: Each record ends with a ^ (caret). All the data in the file is stored in ASCII strings, and the file can be edited in any text editor. Simple example !Type:Bank D03/03/10 T-379.00 PCITY OF SPRINGFIELD ^ D03/04/10 T-20.28 PYOUR LOCAL SUPERMARKET ^ D03/03/10 T-421.35 PSPRINGFIELD WATER UTILITY ^ ...etc. Header line The first line in the file must be a header line, to identify the type of data contained. Valid values for accounts are: There are also values for QIF files of internal Quicken information: A header line is not followed by a separator line; it is immediately followed by the first field of a detail item. Detail items The Detail section consists of several detail items, each on a separate line. Each line begins with a single-character identifying code in the first column, followed by the literal data for that field. The detail item is terminated by a separator line. The fields can be in any order
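The record structure described above (one-character field codes, caret-terminated records) can be read with a very short parser. A minimal sketch that skips header lines and does not interpret field semantics:

```python
def parse_qif(text: str):
    """Parse QIF detail items into a list of dicts keyed by the
    one-character field code (D = date, T = amount, P = payee, ...).

    A minimal sketch based on the structure shown above; a real
    parser would also interpret the !Type: header and field values.
    """
    records, current = [], {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("!"):   # skip blank and header lines
            continue
        if line == "^":                        # caret terminates a record
            if current:
                records.append(current)
            current = {}
        else:
            current[line[0]] = line[1:]        # code char + literal data
    return records

sample = """!Type:Bank
D03/03/10
T-379.00
PCITY OF SPRINGFIELD
^
"""
print(parse_qif(sample))
# [{'D': '03/03/10', 'T': '-379.00', 'P': 'CITY OF SPRINGFIELD'}]
```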
https://en.wikipedia.org/wiki/Open%20Financial%20Exchange
Open Financial Exchange (OFX) is a data-stream format for exchanging financial information that evolved from Microsoft's Open Financial Connectivity (OFC) and Intuit's Open Exchange file formats. History Microsoft, Intuit and CheckFree announced the OFX standard on 16 January 1997. The first OFX specification, version 1.0, was released on 14 February 1997. The specification allows for bank- and application-specific extensions, although only a subset is necessary to describe a financial transaction. Versions 1.0 through 1.6 relied on SGML for data exchange, but later versions are XML-based. According to the main OFX site, "The specification is freely licensed, allowing any software developer to design an interface that will be supported on the front-end." In 2019, the OFX consortium joined the Financial Data Exchange (FDX) consortium, which now manages the OFX working group. Specification The latest reference document (Functional Specification) describing the standard was published in October 2020, in version 2.3, by the FDX consortium's OFX working group. Support in various countries Many banks in the US let customers use personal financial management software to automatically download their bank statements in OFX format, but most Canadian, United Kingdom and Australian banks do not allow this; however, many banks do support downloading financial data in OFX, QFX, QIF, or spreadsheet format via their web interface for later import into financial software. Intuit and QFX QFX is a proprietary variant of OFX used in Intuit's products. In Intuit products, OFX is used for Direct Connect and QFX for Web Connect. Direct Connect allows personal financial management software to connect directly to a bank's OFX server, whereas in Web Connect, the user needs to log in and manually download a .qfx file and import it into Quicken.
See also Quicken Interchange Format ISO 20022 FinTS (formerly HBCI) References External links More information on the OFX specification OFX Press Release (copy) List of OFX connection details for banks that support OFX OFX forums, list of verified OFX connection details OFX file viewer Computer file formats Computer-related introductions in 1997 Financial software
https://en.wikipedia.org/wiki/Nitzer%20Ebb
Nitzer Ebb () is a British EBM group formed in 1982 by Essex school friends Vaughan "Bon" Harris (programming, synthesizers, drums, vocals), Douglas McCarthy (vocals), and David Gooday (drums). The band was originally named La Comédie De La Mort but soon discarded that and chose the name Nitzer Ebb by cutting up words and letters and arranging them randomly to create something Germanic without using actual German words. History Initial releases (1983–1987) The group released their demo Basic Pain Procedure in 1983, but it was two years before they met PWL producer Phil Harding, who produced their 1985 debut single "Isn't It Funny How Your Body Works?" and helped them set up their own label, Power Of Voice Communications. The band at the time was inspired by the post-punk scene, and specifically acts like "Siouxsie and the Banshees, Killing Joke and Bauhaus who were having a big influence on us, in some ways stylistically but also in the energy that they gave". They released three more singles on their own label, "Warsaw Ghetto" (1985), "Warsaw Ghetto Remixes" (1986) and "Let Your Body Learn" (1986), before signing to Mute Records in 1986. The singles "Murderous" (1986) and "Let Your Body Learn" (1987) followed, building their reputation in the industrial rock and EBM scenes, as well as making inroads into the developing Chicago house scene. "Join In The Chant" (1987) became part of the Balearic beat scene that influenced the UK acid house scene. International success, disbandment (1987–1995) Their debut album That Total Age was released in 1987. Depeche Mode, longtime friends and label mates of the band, invited them to open for the European leg of their successful Music For The Masses Tour in 1987. David Gooday left after the tour, and they completed their next album, Belief (1989), as a duo. Mark 'Flood' Ellis became their new producer. They recruited Julian Beeston to assist them on their own world tour, and he soon became a regular contributor both on and off stage.
In 1989, they teamed with German EBM pioneers Die Krupps to rerecord their 1981 single "Wahre Arbeit - Wahrer Lohn" as "The Machineries of Joy". The third Nitzer Ebb album, Showtime, released in 1990, revealed a less confrontational sound. The single "Fun to Be Had" (1990) featured a remix by George Clinton and was a hit on the US dance chart. Their fourth album, Ebbhead (1991), showcased a more traditional songwriting style with an emphasis on melodic choruses, and was produced by Alan Wilder of Depeche Mode and Flood. They promoted the album with a global tour that took them from the southern U.S. to Siberia (to the Siberian city of Barnaul). Their fifth album, Big Hit (1995), featured a greater use of 'real' instruments, especially guitars and drums. McCarthy and Harris recruited Jason Payne (percussion) to their main line-up and brought in John Napier (guitar, percussion) to assist with live performances. Big Hit was the final release by the band for almost 15 y
https://en.wikipedia.org/wiki/Native%20POSIX%20Thread%20Library
The Native POSIX Thread Library (NPTL) is an implementation of the POSIX Threads specification for the Linux operating system. History Before the 2.6 version of the Linux kernel, processes were the schedulable entities, and there were no special facilities for threads. However, it did have a system call — clone — which creates a copy of the calling process where the copy shares the address space of the caller. The LinuxThreads project used this system call to provide kernel-level threads (most of the previous thread implementations in Linux worked entirely in userland). Unfortunately, it only partially complied with POSIX, particularly in the areas of signal handling, scheduling, and inter-process synchronization primitives. To improve upon LinuxThreads, it was clear that some kernel support and a new threading library would be required. Two competing projects were started to address the requirement: NGPT (Next Generation POSIX Threads), worked on by a team which included developers from IBM, and NPTL, by developers at Red Hat. The NGPT team collaborated closely with the NPTL team, and the best features of both implementations were combined into NPTL; the NGPT project was subsequently abandoned in mid-2003. NPTL was first released in Red Hat Linux 9. Old-style Linux POSIX threading was known for having trouble with threads that refused to yield to the system occasionally, because it did not take the opportunity to preempt them when it arose, something that Windows was known to do better at the time. Red Hat claimed that NPTL fixed this problem in an article on the Java website about Java on Red Hat Linux 9. NPTL has been part of Red Hat Enterprise Linux since version 3, and in the Linux kernel since version 2.6. It is now a fully integrated part of the GNU C Library. There exists a tracing tool for NPTL, called POSIX Thread Trace Tool (PTT).
An Open POSIX Test Suite (OPTS) was also written for testing the NPTL library against the POSIX standard. Design Like LinuxThreads, NPTL is a 1:1 threads library. Threads created by the library (via pthread_create) correspond one-to-one with schedulable entities in the kernel (processes, in the Linux case). This is the simplest of the three threading models (1:1, N:1, and M:N). New threads are created with the clone() system call, invoked through the NPTL library. NPTL relies on kernel support for futexes to implement user-space locks more efficiently. See also LinuxThreads Library (computer science) Green threads References External links NPTL Trace Tool, an open-source tool to trace and debug multithreaded applications using NPTL. Linux kernel C POSIX library Threads (computing)
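The 1:1 mapping described above can be observed even from a high-level language: on Linux, CPython's threading module creates threads via pthreads (and hence NPTL), and threading.get_native_id() (Python 3.8+, on platforms that expose it) reports the kernel-assigned ID of each schedulable entity. A minimal sketch:

```python
import threading

ids = []
lock = threading.Lock()
barrier = threading.Barrier(4)

def worker():
    # get_native_id() returns the kernel-assigned ID of the calling
    # thread; under a 1:1 model each library thread has its own.
    with lock:
        ids.append(threading.get_native_id())
    barrier.wait()  # keep all four threads alive simultaneously

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Four library-level threads map to four distinct kernel entities.
print(len(set(ids)))  # 4
```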
https://en.wikipedia.org/wiki/Dvips
dvips is a computer program that converts the Device Independent file format (DVI) output of TeX typography into a printable or otherwise presentable form. It was written by Tomas Rokicki to produce printable PostScript files from DVI input, and is now commonly used for general DVI conversion. The TeX typesetting system outputs DVI files which are intended to be independent of the output device. In particular, they are not understood by printers and lack information such as font shapes. Thus, a converter (i.e., a backend) is needed to translate a DVI file into a printer language. Although other DVI backends such as dvilj exist, dvips is one of the most common ways of printing DVI files. Another, more recent solution is the use of pdfTeX to directly generate PDF files, which have readers for most platforms. Given its importance, dvips is a standard part of most TeX distributions, such as teTeX and TeX Live. By using TeX \special commands, it is possible to directly insert "literal PostScript" into the DVI file and have such snippets of PostScript appear in the final file generated by dvips. This flexibility allows the user to include, say, watermarks in a document (especially via the use of proper packages) or further postprocess the PostScript file. When producing PostScript files, dvips embeds fonts inside the file. Most recent distributions will normally embed scalable fonts, also known as Type 1 fonts. Files generated with older distributions, however, may embed raster fonts. To substitute scalable fonts for raster fonts in a PostScript file when the original DVI file is unavailable, use a utility called pkfix. References External links Official website PostScript Free TeX software
https://en.wikipedia.org/wiki/Configuration%20file
In computing, configuration files (commonly known simply as config files) are files used to configure the parameters and initial settings for some computer programs. They are used for user applications, server processes and operating system settings. Some applications provide tools to create, modify, and verify the syntax of their configuration files; these sometimes have graphical interfaces. For other programs, system administrators may be expected to create and modify files by hand using a text editor, which is possible because many are human-editable plain text files. For server processes and operating-system settings, there is often no standard tool, but operating systems may provide their own graphical interfaces such as YaST or debconf. Some computer programs only read their configuration files at startup. Others periodically check the configuration files for changes. Users can instruct some programs to re-read the configuration files and apply the changes to the current process, or indeed to read arbitrary files as a configuration file. There are no definitive standards or strong conventions. Configuration files and operating systems Unix and Unix-like operating systems Across Unix-like operating systems many different configuration-file formats exist, with each application or service potentially having a unique format, but there is a strong tradition of them being in human-editable plain text, and a simple key–value pair format is common. Filename extensions of .cnf, .conf, .cfg, .cf or .ini are often used. Almost all formats allow comments, in which case individual settings can be disabled by prepending them with the comment character. Often the default configuration files contain extensive internal documentation in the form of comments, and man pages are also typically used to document the format and options available.
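A minimal key–value configuration in the common INI style can be read with Python's standard configparser module; the section and key names below are purely illustrative:

```python
import configparser

# A small configuration in the common INI style: section headers
# plus key = value pairs. The names here are made up for the example.
text = """
[server]
port = 8080
verbose = true
"""

config = configparser.ConfigParser()
config.read_string(text)

print(config.getint("server", "port"))         # 8080
print(config.getboolean("server", "verbose"))  # True
```

The typed accessors (getint, getboolean) convert the plain-text values, which is why such simple human-editable formats remain practical for program settings.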
System-wide software often uses configuration files stored in /etc, while user applications often use a "dotfile" – a file or directory in the home directory prefixed with a period, which in Unix hides the file or directory from casual listing. Since this clutters the home directory, newer user applications generally make their own folder in the .config directory, a standardized subdirectory of the home directory. Some configuration files run a set of commands upon startup. A common convention is for such files to have "rc" in their name, typically the name of the program followed by an "(.)rc" suffix, e.g. ".xinitrc", ".vimrc", ".bashrc", "xsane.rc". See run commands for further details. By contrast, IBM's AIX uses an Object Data Manager (ODM) database to store much of its system settings. MS-DOS MS-DOS itself primarily relied on just one configuration file, CONFIG.SYS. This was a plain text file with simple key–value pairs (e.g. DEVICEHIGH=C:\DOS\ANSI.SYS) until MS-DOS 6, which introduced an INI-file style format. There was also a standard plain text batch file named AUTOEXEC.BAT that ran a series of commands on boot. Both the
https://en.wikipedia.org/wiki/A-Train%20III
A-Train III, known internationally as A-Train, is a 1992 computer game and the third game in the A-Train series. It was originally developed and published by Japanese game developer Artdink for Japan, and was later published by Maxis for the United States. Overview The game places players in command of a railway company. There are no rival companies; the player controls the only one in the city, and the game is, as a result, fairly open-ended. A-Train III is the first game in the series to use near-isometric dimetric projection to present the city, similar to Maxis's SimCity 2000. There are two types of transport that the player's company can carry: passengers or building materials. The former is more likely to be profitable, but building materials allow the city to grow. Wherever the building materials are delivered, they can be taken and used to construct buildings for the city. These start with houses, but eventually, as an area grows, roads, shops and other buildings are built. These can provide extra revenue for a passenger service, but allowing the city to develop and grow can also be seen as a goal in itself. As well as the buildings built by the computer in response to the materials being present, the player can construct their own buildings, such as ski resorts and hotels, and make profits from them if the conditions are right. Editor A.III. MAP CONSTRUCTION, known internationally as A-Train Construction Set, is an editor that can modify existing saved games or build landscapes from scratch. It comes with 6 sample maps. Maxis also published A-Train Construction Set with A-Train as a single package in Europe, without the Ocean Software label. Windows version Artdink ported A-Train III along with the editor to Windows 95, and published both titles as a package as the 3rd ARTDINK BEST CHOICE title in Japan.
Maxis distribution and ports The game was tremendously popular in Japan, motivating Maxis to license it for US distribution as A-Train, available for DOS, Macintosh and Amiga platforms. It was released in October 1992, though it sold poorly. Even the release of an add-on pack for the game failed to stir up any real support amongst the gaming community. The game was the first major failure from Maxis. In spite of the PC version's commercial failure in the US, Maxis later released a PlayStation version in 1996, based on Artdink's AIV: Evolution Global. The PlayStation was a relatively new platform at that point and the game suffered many limitations, such as requiring an entire memory card (expensive at the time) to store a single map. Like the PC version, it proved unsuccessful. Economic model A-Train contains a very challenging economic system that includes a 5% land tax on all property owned and a 50% income tax. The economic model, however, fails to capture realistic land prices, a major flaw in the game's design. The AI is not allowed to build certain buildings, such as the Amusement Park, Golf
https://en.wikipedia.org/wiki/Eurisko
Eurisko (Gr., I discover) is a discovery system written by Douglas Lenat in RLL-1, a representation language itself written in the Lisp programming language. A sequel to Automated Mathematician, it consists of heuristics, i.e. rules of thumb, including heuristics describing how to use and change its own heuristics. Lenat was frustrated by Automated Mathematician's constraint to a single domain and so developed Eurisko; his frustration with the effort of encoding domain knowledge for Eurisko led to Lenat's subsequent (and continuing) development of Cyc. Lenat envisions ultimately coupling the Cyc knowledge base with the Eurisko discovery engine. History Development commenced at Carnegie Mellon in 1976 and continued at Stanford University in 1978 when Lenat returned to teach. "For the first five years, nothing good came out of it", Lenat said. But when the implementation was changed to a frame language based representation he called RLL (Representation Language Language), heuristic creation and modification became much simpler. Eurisko was then applied to a number of domains with surprising success, including VLSI chip design. Lenat and Eurisko gained notoriety by submitting the winning fleet (a large number of stationary, lightly-armored ships with many small weapons) to the United States Traveller TCS national championship in 1981, forcing extensive changes to the game's rules. However, Eurisko won again in 1982 when the program discovered that the rules permitted it to destroy its own ships, allowing it to continue to use much the same strategy. Tournament officials announced that if Eurisko won another championship the competition would be abolished; Lenat retired Eurisko from the game. The Traveller TCS wins brought Lenat to the attention of DARPA, which has funded much of his subsequent work.
In popular culture In the first-season The X-Files episode "Ghost in the Machine", Eurisko is the name of a fictional software company responsible for the episode's "monster of the week", facilities management software known as "Central Operating System", or "COS". COS (described in the episode as an "adaptive network") is shown to be capable of learning when its designer arrives at Eurisko headquarters and is surprised to find that COS has given itself the ability to speak. The designer is forced to create a virus to destroy COS after COS commits a series of murders in an apparent effort to prevent its own destruction. Lenat is mentioned and Eurisko is discussed at the end of Richard Feynman's Computer Heuristics Lecture as part of the Idiosyncratic Thinking Workshop Series. Lenat and Eurisko are mentioned in the 2019 James Rollins novel Crucible that deals with artificial intelligence and artificial general intelligence. Notes References Heuristics Applications of artificial intelligence Genetic programming
https://en.wikipedia.org/wiki/CRM114%20%28program%29
CRM114 (full name: "The CRM114 Discriminator") is a program based upon a statistical approach for classifying data, especially used for filtering email spam. Origin of the name The name comes from the CRM-114 Discriminator in the Stanley Kubrick movie Dr. Strangelove, a piece of radio equipment designed to filter out messages lacking a specific code-prefix. Operation While others have done statistical Bayesian spam filtering based upon the frequency of single word occurrences in email, CRM114 achieves a higher rate of spam recognition by creating hits based upon phrases up to five words in length. These phrases are used to form a Markov random field representing the incoming texts. With this additional contextual recognition, it is one of the more accurate spam filters available. Initial testing in 2002 by author Bill Yerazunis gave a 99.87% accuracy; Holden and TREC 2005 and 2006 gave results of better than 99%, with significant variation depending on the particular corpus. CRM114's classifier can also be switched to use Littlestone's Winnow algorithm, character-by-character correlation, a variant on KNN (k-nearest neighbor algorithm) classification called Hyperspace, a bit-entropic classifier that uses entropy encoding to determine similarity, an SVM, mutual compressibility as calculated by a modified LZ77 algorithm, and other more experimental classifiers. The actual features matched are based on a generalization of skip-grams. The CRM114 algorithms are multi-lingual (compatible with UTF-8 encodings) and null-safe. A voting set of CRM114 classifiers has been demonstrated to detect confidential versus non-confidential documents written in Japanese at better than a 99.9% detection rate and a 5.3% false alarm rate. CRM114 is a good example of pattern recognition software, demonstrating how machine learning can be accomplished with a reasonably simple algorithm. The program's C source code is available under the GPL.
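The idea of multi-word phrase features can be illustrated with plain contiguous n-grams of up to five tokens. Note this is only a toy sketch of the concept: CRM114's actual feature set is a generalization of skip-grams, not the simple enumeration below.

```python
def phrase_features(text: str, max_len: int = 5):
    """Enumerate contiguous word phrases of length 1..max_len.

    A toy illustration of multi-word features; CRM114's real
    features generalize skip-grams rather than using plain n-grams.
    """
    words = text.split()
    feats = []
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            feats.append(" ".join(words[i:i + n]))
    return feats

feats = phrase_features("buy cheap pills now")
print(len(feats))  # 10: four 1-grams, three 2-grams, two 3-grams, one 4-gram
```

A classifier built on such features sees "cheap pills" as a single unit of evidence, which is the contextual recognition the single-word Bayesian approach lacks.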
At a deeper level, CRM114 is also a string pattern matching language, similar to grep or even Perl; although it is Turing complete, it is highly tuned for matching text, and even a simple (recursive) definition of the factorial takes almost ten lines. Part of this is because the CRM114 language syntax is not positional, but declensional. As a programming language, it may be used for many other applications aside from detecting spam. CRM114 uses the TRE approximate-match regex engine, so it is possible to write programs that do not depend on absolutely identical strings matching to function correctly. CRM114 has been applied to email filtering in the KMail client and a number of other applications, including detection of bots on Twitter and Yahoo, as well as the first-level filter in the US Department of Transportation's vehicle defect detection system. It has also been used as a predictive method for classifying fault-prone software modules. See also String matching References External links The CRM114 home
https://en.wikipedia.org/wiki/Sixteenth%20Air%20Force
The Sixteenth Air Force (Air Forces Cyber) (16 AF) is a United States Air Force (USAF) organization responsible for information warfare, which encompasses intelligence gathering and analysis, surveillance, reconnaissance, cyber warfare and electronic warfare operations. Its headquarters is at Joint Base San Antonio-Lackland in Texas. The organization was the first Numbered Air Force (NAF) newly established by the USAF after World War II. It was activated in 1954 as a Joint Military Group to provide command and control of USAF activities in Spain, being designated a NAF in 1956. In 1957, 16 AF was realigned under Strategic Air Command (SAC) to provide command and control of SAC bases and B-47 Stratojet rotational units assigned and deployed to Spain and Morocco. In 1966, after SAC withdrew its forces from Europe, 16 AF became part of the United States Air Forces in Europe, providing command and control of USAFE forces initially in Spain and North Africa, and later in Italy and Turkey until 2006. It later became a provisional Air Expeditionary Task Force under USAFE as part of the Global War on Terrorism. Mission The Sixteenth Air Force provides global intelligence, surveillance and reconnaissance, cyber and electronic warfare, and information operations, and serves as the Service Cryptologic Component responsible to the National Security Agency/Central Security Service and the Service Cyber Component to US Cyber Command. Component units The following units are subordinate to the Sixteenth Air Force. Wings 9th Reconnaissance Wing (Beale AFB, California) Aircraft Assigned: RQ-4 Global Hawk, RQ-180, T-38A Talon, U-2S Dragon Lady 55th Wing (Offutt AFB, Nebraska) Aircraft Assigned: OC-135B Open Skies, RC-135 Rivet Joint, TC-135W, WC-135R Constant Phoenix, RC-135 COBRA BALL, RC-135 COMBAT SENT, EC-130H COMPASS CALL 67th Cyberspace Wing (Joint Base San Antonio-Lackland, Texas) 70th Intelligence, Surveillance, and Reconnaissance Wing (Fort George G.
Meade, Maryland) 319th Reconnaissance Wing (Grand Forks AFB, North Dakota) Aircraft Assigned: RQ-4 Global Hawk 363rd Intelligence, Surveillance, and Reconnaissance Wing (Joint Base Langley-Eustis, Virginia) 480th Intelligence, Surveillance, and Reconnaissance Wing (Joint Base Langley-Eustis, Virginia) 557th Weather Wing (Offutt AFB, Nebraska) 688th Cyberspace Wing (Kelly Field Annex, Joint Base San Antonio-Lackland, Texas) Organizations 616th Operations Center (Joint Base San Antonio-Lackland, Texas) Air Force Technical Applications Center (Patrick Space Force Base, Florida) History Sixteenth Air Force (16 AF)'s original ancestor was the Joint United States Military Group, Air Administration (Spain), which was established on 20 May 1954. It was attached to the Joint U.S. Military Group, which oversaw implementation of the 1953 Spanish-American Defense Cooperation Agreement. On 15 July 1956, Sixteenth Air Force was created when the Air Administration (Spain) was re-designated as Headquarters,
https://en.wikipedia.org/wiki/Leonard%20Kleinrock
Leonard Kleinrock (born June 13, 1934) is an American computer scientist and Internet pioneer. He is a long-tenured professor at UCLA's Henry Samueli School of Engineering and Applied Science. In the early 1960s, Kleinrock pioneered the application of queueing theory to model delays in message switching networks in his Ph.D. thesis, published as a book in 1964. He later published several of the standard works on the subject. In the early 1970s, he applied queueing theory to model and measure the performance of packet switching networks. This work played an influential role in the development of the ARPANET. He supervised many graduate students whose later work on the communication protocols for internetworking led to the Internet protocol suite. His theoretical work on hierarchical routing in the late 1970s with student Farouk Kamoun remains critical to the operation of the Internet today. Education and career Leonard Kleinrock was born in New York City on June 13, 1934, to a Jewish family, and graduated from the noted Bronx High School of Science in 1951. He received a Bachelor of Electrical Engineering degree in 1957 from the City College of New York, and a master's degree and a doctorate (Ph.D.) in electrical engineering and computer science from the Massachusetts Institute of Technology in 1959 and 1963 respectively. He then joined the faculty at the University of California at Los Angeles (UCLA), where he remains to the present day; during 1991–1995 he served as the chairman of the Computer Science Department there. Achievements Queueing theory Kleinrock's best-known and most-significant work is on queueing theory, a branch of operations research that has applications in many fields. His thesis proposal in 1961 led to a doctoral thesis at the Massachusetts Institute of Technology in 1962, later published in book form in 1964. In this work, he analyzed queueing delays in Plan 55-A, a message switching system operated by Western Union for processing telegrams. 
Kleinrock later published several of the standard works on the subject. ARPANET Larry Roberts brought Leonard Kleinrock into the ARPANET project informally in May 1967. Roberts learned about packet switching from a paper written by Donald Davies, presented at the October 1967 Symposium on Operating Systems Principles. Roberts subsequently incorporated the concept into the ARPANET design. Kleinrock was awarded a contract in 1969 to model and measure the performance of packet switching in the ARPANET. In addition, Kleinrock managed the software team at UCLA, including Steve Crocker, Vint Cerf and Jon Postel, who developed the host-host protocol for the ARPANET, the Network Control Program (NCP). Kleinrock's mathematical work in the early 1970s influenced the development of the early ARPANET. The first message on the ARPANET was sent by a UCLA student programmer, Charley Kline, who was supervised by Kleinrock. At 10:30 p.m. on October 29, 1969, from Boelter Hall 3420, the school's
https://en.wikipedia.org/wiki/Ad%20hoc%20network
An ad hoc network refers to technologies that allow network communications on an ad hoc basis. Associated technologies include: Wireless ad hoc network Mobile ad hoc network Vehicular ad hoc network Intelligent vehicular ad hoc network Protocols associated with ad hoc networking Ad hoc On-Demand Distance Vector Routing Ad Hoc Configuration Protocol Smart phone ad hoc network Ad hoc wireless distribution service References Computer networking
https://en.wikipedia.org/wiki/Dead-code%20elimination
In compiler theory, dead-code elimination (DCE, dead-code removal, dead-code stripping, or dead-code strip) is a compiler optimization to remove dead code (code that does not affect the program results). Removing such code has several benefits: it shrinks program size, an important consideration in some contexts, and it allows the running program to avoid executing irrelevant operations, which reduces its running time. It can also enable further optimizations by simplifying program structure. Dead code includes code that can never be executed (unreachable code), and code that only affects dead variables (written to, but never read again), that is, irrelevant to the program. Examples Consider the following example written in C.

int foo(void)
{
    int a = 24;
    int b = 25; /* Assignment to dead variable */
    int c;
    c = a * 4;
    return c;
    b = 24;     /* Unreachable code */
    return 0;
}

Simple analysis of the uses of values would show that the value of b after the first assignment is not used inside foo. Furthermore, b is declared as a local variable inside foo, so its value cannot be used outside foo. Thus, the variable b is dead and an optimizer can reclaim its storage space and eliminate its initialization. Furthermore, because the first return statement is executed unconditionally and there is no label after it which a "goto" could reach, no feasible execution path reaches the second assignment to b. Thus, the assignment is unreachable and can be removed. If the procedure had a more complex control flow, such as a label after the return statement and a goto elsewhere in the procedure, then a feasible execution path might exist to the assignment to b. Also, even though some calculations are performed in the function, their values are not stored in locations accessible outside the scope of this function. Furthermore, given that the function returns a static value (96), it may be simplified to the value it returns (this simplification is called constant folding). 
Most advanced compilers have options to activate dead-code elimination, sometimes at varying levels. A lower level might only remove instructions that cannot be executed. A higher level might also not reserve space for unused variables. A yet higher level might determine instructions or functions that serve no purpose and eliminate them. A common use of dead-code elimination is as an alternative to optional code inclusion via a preprocessor. Consider the following code.

#include <stdio.h>

int main(void)
{
    int a = 5;
    int b = 6;
    int c;
    c = a * (b / 2);
    if (0) { /* DEBUG */
        printf("%d\n", c);
    }
    return c;
}

Because the expression 0 will always evaluate to false, the code inside the if statement can never be executed, and dead-code elimination would remove it entirely from the optimized program. This technique is common in debugging to optionally activate blocks of code; using an optimizer with dead-code elimination eliminates the need for using a preprocessor to perform the same task. In pr
https://en.wikipedia.org/wiki/Elk%20Cloner
Elk Cloner is one of the first known microcomputer viruses that spread "in the wild", i.e., outside the computer system or laboratory in which it was written. It attached itself to the Apple II operating system and spread by floppy disk. It was written around 1982 by programmer and entrepreneur Rich Skrenta as a 15-year-old high school student, originally as a joke, and put onto a game disk. Infection and symptoms Elk Cloner spread by infecting the Apple DOS 3.3 operating system using a technique now known as a boot sector virus. It was attached to a game which was then set to play. The 50th time the game was started, the virus was released, but instead of playing the game, it would change to a blank screen that displayed a poem about the virus. If a computer booted from an infected floppy disk, a copy of the virus was placed in the computer's memory. When an uninfected disk was inserted into the computer, the entire DOS (including Elk Cloner) would be copied to the disk, allowing it to spread from disk to disk. To prevent the DOS from being continually rewritten each time the disk was accessed, Elk Cloner also wrote a signature byte to the disk's directory, indicating that it had already been infected. The poem that Elk Cloner would display was as follows: ELK CLONER: THE PROGRAM WITH A PERSONALITY   IT WILL GET ON ALL YOUR DISKS IT WILL INFILTRATE YOUR CHIPS YES IT'S CLONER!   IT WILL STICK TO YOU LIKE GLUE IT WILL MODIFY RAM TOO SEND IN THE CLONER! Elk Cloner did not cause deliberate harm, but Apple DOS disks without a standard image had their reserved tracks overwritten. Development Elk Cloner was created by Skrenta as a prank in 1982. Skrenta already had a reputation for pranks among his friends because, in sharing computer games and software, he would often alter the floppy disks to shut down or display taunting on-screen messages. Due to this reputation, many of his friends simply stopped accepting floppy disks from him. 
Skrenta thought of methods to alter floppy disks without physically touching or harming them. During a winter break from Mt. Lebanon High School in Mt. Lebanon, Pennsylvania, Skrenta discovered how to launch the messages automatically on his Apple II computer. He developed what is now known as a boot sector virus, and began circulating it in early 1982 among high school friends and a local computer club. Twenty-five years later, in 2007, Skrenta called it "some dumb little practical joke." Distribution According to contemporary reports, the virus was rather contagious, successfully infecting the floppies of most people Skrenta knew, and upsetting many of them. Part of the "success" was that people were not at all wary of the potential problem, nor were virus scanners or cleaners available. The virus could be removed using Apple's MASTER CREATE utility or other utilities to rewrite a fresh copy of DOS to the infected disk. Furthermore, once Elk Cloner was removed, the previously infected disk would not
https://en.wikipedia.org/wiki/Chris%20McKinstry
Kenneth Christopher McKinstry (February 12, 1967 – January 23, 2006) was a researcher in artificial intelligence. He led the development of the MISTIC project, which was launched in May 1996. He founded the Mindpixel project in July 2000, and closed it in December 2005. McKinstry's AI work and early death paralleled those of another contemporary AI researcher, Push Singh, and his MIT Open Mind Common Sense project. Life McKinstry was a Canadian citizen. Born in Winnipeg, he resided several years in Chile. From 1999, he lived in Antofagasta as a VLT operator for the European Southern Observatory. At the end of 2004, he moved back to Santiago, Chile. Suffering from bipolar disorder, McKinstry had an armed standoff with police in Toronto in 1990 that lasted seven and a half hours; it ultimately ended with police using tear gas on McKinstry, but with no casualties. In February 1997, Chris McKinstry started an online soap opera, CR6. According to journalist Bartley Kives, around 700 people auditioned for the show, which lasted only two months before McKinstry left Winnipeg with "estimated debts in excess of $100,000". McKinstry later claimed to have lost $1 million in the CR6 failure, and the many people he recruited to build the soap opera, including photographers, writers, a director, and several prominent businesses, never received any of the money owed to them for their work. Before his death, McKinstry designed an experiment with two cognitive scientists to study the dynamics of thought processes using data from his Mindpixel project. This work was later published in Psychological Science in its January 2008 issue, with McKinstry as posthumous first author. Mental health Chris McKinstry had a long struggle with his mental health, acknowledging that he had been diagnosed with bipolar disorder. As a result, McKinstry suffered from frequent suicidal thoughts and long-standing depression, which he discussed in his suicide note. 
In his teen years, McKinstry had attempted suicide by intentionally overdosing on drugs, another issue he struggled with. His bipolar disorder is often cited as the reason for his 1990 standoff. Death Chris McKinstry was found dead in his apartment on January 23, 2006, with a plastic bag over his head, connected by a hose to the stove gas line. He was found to have posted a suicide note online. McKinstry wrote, "I am tired of feeling the same feelings and experiencing the same experiences. It is time to move on and see what is next if anything...This Louis Vuitton, Prada, Montblanc commercial universe is not for me. If only I was loved as much a Montblanc pen..." There was some public note of the similarity between the suicide of Chris McKinstry and that of Push Singh, another AI researcher, a little over a month later. Both of their AI projects, McKinstry's Mindpixel project and Singh's MIT-backed Open Mind Common Sense, had similar trajectories over the preceding six years. His death was determined
https://en.wikipedia.org/wiki/Educational%20assessment
Educational assessment or educational evaluation is the systematic process of documenting and using empirical data on knowledge, skills, attitudes, aptitudes and beliefs to refine programs and improve student learning. Assessment data can be obtained from directly examining student work to assess the achievement of learning outcomes or can be based on data from which one can make inferences about learning. "Assessment" is often used interchangeably with "test", but assessment is not limited to tests. Assessment can focus on the individual learner, the learning community (class, workshop, or other organized group of learners), a course, an academic program, the institution, or the educational system as a whole (also known as granularity). The word "assessment" came into use in an educational context after the Second World War. As a continuous process, assessment establishes measurable and clear student learning outcomes, provides a sufficient number of learning opportunities to achieve these outcomes, implements a systematic way of gathering, analyzing and interpreting evidence to determine how well student learning matches expectations, and uses the collected information to inform improvement in student learning. Assessment is an important aspect of the educational process which determines the level of students' accomplishments. The final purpose of assessment practices in education depends on the theoretical framework of the practitioners and researchers, their assumptions and beliefs about the nature of the human mind, the origin of knowledge, and the process of learning. Types The term assessment is generally used to refer to all activities teachers use to help students learn and to gauge student progress. 
Assessment can be divided for the sake of convenience using the following categorizations: Placement, formative, summative and diagnostic assessment Objective and subjective Referencing (criterion-referenced, norm-referenced, and ipsative (forced-choice)) Informal and formal Internal and external Placement, formative, summative and diagnostic Assessment is often divided into initial, formative, and summative categories for the purpose of considering different objectives for assessment practices. Placement assessment – Placement evaluation is used to place students, according to prior achievement or personal characteristics, at the most appropriate point in an instructional sequence, in a unique instructional strategy, or with a suitable teacher; it is conducted through placement testing, i.e. the tests that colleges and universities use to assess college readiness and place students into their initial classes. Placement evaluation, also referred to as pre-assessment or initial assessment, is conducted prior to instruction or intervention to establish a baseline from which individual student growth can be measured. This type of assessment is used to determine the student's existing skill level in the subject. It helps the teacher to explain the mate
https://en.wikipedia.org/wiki/Buzen
Buzen may refer to: Buzen, Fukuoka, a city located in Fukuoka, Japan Buzen Province, an old province of Japan in northern Kyushu Jeffrey P. Buzen, a computer scientist and businessman
https://en.wikipedia.org/wiki/Mittelland%20Canal
The Mittelland Canal, also known as the Midland Canal, is a major canal in central Germany. It forms an important link in the waterway network of that country, providing the principal east-west inland waterway connection. Its significance goes beyond Germany as it links France, Switzerland and the Benelux countries with Poland, the Czech Republic and the Baltic Sea. The Mittelland Canal is the longest artificial waterway in Germany. Route The Mittelland Canal branches off the Dortmund-Ems Canal at Hörstel (near Rheine), runs north along the Teutoburg Forest, past Hanover and meets with the Elbe River near Magdeburg. Near Magdeburg it connects to the Elbe-Havel Canal, making a continuous shipping route to Berlin and on to Poland. At Minden the canal crosses the river Weser over two aqueducts (completed in 1914 and 1998, respectively), and near Magdeburg it crosses the Elbe, also with an aqueduct. Connections by side canals exist at Ibbenbüren, Osnabrück, Minden (two canals connecting to the Weser), Hanover-Linden, Hanover-Misburg, Hildesheim and Salzgitter. West of Wolfsburg, the Elbe Lateral Canal branches off, providing a connection to Hamburg, and (via the Elbe-Lübeck Canal) to the Baltic Sea. History Construction of the Mittelland Canal was started in 1906, starting from Bergeshövede (municipality Hörstel) on the Dortmund-Ems Canal. The section to Minden on the Weser was opened in February 1915 and was initially named Ems-Weser-Kanal. The section from Minden to Hanover was finished in the autumn of 1916. The section to Sehnde and the branch canal to Hildesheim were completed in 1928, Peine was reached in 1929, and Braunschweig in 1933. The final section to Magdeburg was opened in 1938, thus creating a direct link between Western and Eastern Germany. The branch canal to Salzgitter was opened in 1941. The planned canal bridge over the Elbe, necessary to avoid low water conditions in summer, was not built due to the Second World War. 
After the partitioning of Germany following the Second World War, the Mittelland Canal was split between West Germany and East Germany, with the border to the east of Wolfsburg. To provide access from the western section of the canal to Hamburg and Northern Germany, avoiding both East Germany and the Elbe River's sometimes limited navigability, the Elbe Lateral Canal was opened in 1977. After the reunification of Germany, the importance of the Mittelland Canal as a link from the west to Berlin and the east was reinforced. The project to bridge the Elbe was therefore restarted, and the resulting Magdeburg Water Bridge opened in 2003, providing a direct link to the Elbe-Havel Canal. There are further plans to connect the canal to the Twentekanaal in the Netherlands to shorten the connection towards the Port of Rotterdam. Towns and cities Ibbenbüren Osnabrück (via a branch) Bramsche Lübbecke Minden Garbsen Hannover Sehnde Hildesheim (via a branch) Peine Salzgitter (via a bra
https://en.wikipedia.org/wiki/Distributed%20library
A distributed library is a collection of materials available for borrowing by members of a group, yet not maintained or owned by a single entity. The library catalog is maintained on a database that is made accessible to users through the Internet. This style of library is still in its infancy. Administrative software continues to be developed and distributed. An early example of this style of library (if not the first of its type) is the Distributed Library Project of the San Francisco Bay Area. While distributed libraries are being established in several cities worldwide, the San Francisco Bay Area library still has only a few hundred members. Another example, which takes a slightly different approach, is the Unlibrary. In this system, users are free to create communities of any size and scope, rather than a single citywide community. For instance, a church might have its own community, with church members all able to borrow from each other. Users can also have private, invite-only groups. Another example is the digibruted library of Geneva. The name digibruted is coined from "Digital" and "Distributed". This library is a digital construction that indexes books for local distribution. The difference from Unlibrary is that the books are freely given to readers, who also act as librarians, in a kind of peer-to-peer schema. See also BookCrossing References Library science Libraries by type Types of library
https://en.wikipedia.org/wiki/Dark%20Avenger
Dark Avenger was the pseudonym of a computer virus writer from Sofia, Bulgaria. He gained considerable notoriety during the early 1990s when his viruses spread internationally. Background and origins During the Cold War, the Bulgarian government authorized projects to reverse engineer Western technology. This eventually led to the Pravetz computers of the 1980s, which cloned popular Western personal computers. A community formed around these computers when they were used in schools to teach students computer programming. In April 1988, Bulgaria's trade magazine for computers, Компютър за Вас (Computer for You), published a translation of a German article about computer viruses and methods for writing them. A few months after that, Bulgaria experienced several foreign viruses. The interest spawned by both the article and the viruses inspired young Bulgarian programmers to devise their own viruses. Soon a wave of Bulgarian viruses erupted, started by the "Old Yankee" and "Vacsina" viruses. Dark Avenger made his first appearance in the spring of 1989. At the time, Bulgaria did not have any laws against writing computer viruses. Anti-virus researchers identified Bulgaria as having talented programmers who had few commercial opportunities, and Bulgarian security researcher Vesselin Bontchev blamed the viruses on the country's history of pirating Western computer code and failure to teach students about computer ethics. Viruses Dark Avenger's first virus appeared in early 1989 and contained the string, "This program was written in the city of Sofia (C) 1988–89 Dark Avenger". Thus, this first virus is usually referred to as "Dark Avenger", eponymous to its author. Dark Avenger's viruses made frequent references to heavy metal bands, including Iron Maiden, and Diana, Princess of Wales. His pseudonym is based on a Manowar song. The virus was very infectious: if the virus was active in memory, opening or just copying an executable file was sufficient to infect it. 
Additionally, the virus destroyed data by overwriting a random sector of the disk on every 16th run of an infected program, progressively corrupting files and directories on the disk. Corrupted files contained the string, "Eddie lives... somewhere in time!", a reference to Iron Maiden. Due to its highly infectious nature, the virus spread worldwide, reaching Western Europe, the USSR, the United States, and East Asia. Dutch author Harry Mulisch reported encountering the virus on his laptop while writing The Discovery of Heaven. Mulisch considered it a "favourable sign from higher powers" and briefly considered naming his son Eduard after the virus' output. A few weeks later, he re-encountered the virus and had it professionally removed. This virus was soon followed by others, each employing a new trick. Dark Avenger is believed to have authored the following viruses: Dark Avenger, V2000 (two variants), V2100 (two variants), 651, Diamond (two variants), Nomenklatura, 512 (six
https://en.wikipedia.org/wiki/UniM%C3%A1s
UniMás (stylized as UNIMÁS, and originally known as TeleFutura from its launch on January 14, 2002, to January 6, 2013) is an American Spanish-language free-to-air television network owned by TelevisaUnivision. The network's programming, which is aimed at Hispanic Americans in the 18-34 age range, includes telenovelas and other serialized drama series, sports, sitcoms, reruns of imported series previously aired on parent network Univision, reality and variety series, and theatrically released feature films (primarily consisting of Spanish-dubbed versions of American movie releases). The network is operated out of Univision's South Florida headquarters in the Miami suburb of Doral, Florida. Since its launch, the network has made major inroads in overall and demographic viewership, eventually ranking as the second highest-rated Spanish-language television network in key dayparts, behind only sister network Univision, by 2012. UniMás is available on cable and satellite television throughout most of the United States, with local stations in over 40 markets with large Hispanic and Latino populations. Most of these stations are pass-throughs for the network's main programming feed, offering limited to no exclusive local programming. Univision Communications chief operating officer Randy Falco has overseen the network's operations since his appointment by the company on June 29, 2011. History Origins The network traces its origins to Barry Diller's November 1995 acquisition of the Home Shopping Network and its broadcasting arm Silver King Communications, which owned television stations affiliated with HSN in several larger media markets. 
In June 1998, the renamed USA Broadcasting (which had been merged into the Diller-owned USA Networks in 1997) launched a customized independent station format, "CityVision", which infused syndicated programming – including a few produced by sister production unit Studios USA that also aired nationally on USA Network – with a limited amount of local entertainment and magazine programs (reminiscent of the format used by CITY-TV in Toronto and more prominently, its co-owned stations that became charter outlets of Citytv, when CHUM Limited expanded the format to other Canadian markets as a television system in 2002). USA's Miami outlet, WYHS-TV, served as the test station for the format, disaffiliating from HSN and converting into a general entertainment outlet under the new call letters WAMI-TV. By September 2000, USA Broadcasting had expanded the "CityVision" entertainment format to three of its thirteen other HSN outlets – with some of the stations adopting call letters referencing common nicknames for their home cities – WHOT-TV (now WUVG-DT) in Atlanta, KSTR-TV in Dallas–Fort Worth and WHUB-TV (now WUTF-DT) in Boston. Before the group could carry out the proposed conversions of its other stations into independent stations, USA Networks announced that it would sell off its television station group
https://en.wikipedia.org/wiki/Online%20public%20access%20catalog
The online public access catalog (OPAC), now frequently synonymous with library catalog, is an online database of materials held by a library or group of libraries. Online catalogs have largely replaced the analog card catalogs previously used in libraries. History Early online Although a handful of experimental systems existed as early as the 1960s, the first large-scale online catalogs were developed at Ohio State University in 1975 and the Dallas Public Library in 1978. These and other early online catalog systems tended to closely reflect the card catalogs that they were intended to replace. Using a dedicated terminal or telnet client, users could search a handful of pre-coordinate indexes and browse the resulting display in much the same way they had previously navigated the card catalog. Throughout the 1980s, the number and sophistication of online catalogs grew. The first commercial systems appeared, and by the end of the decade would largely replace systems built by libraries themselves. Library catalogs began providing improved search mechanisms, including Boolean and keyword searching, as well as ancillary functions, such as the ability to place holds on items that had been checked out. At the same time, libraries began to develop applications to automate the purchase, cataloging, and circulation of books and other library materials. These applications, collectively known as an integrated library system (ILS) or library management system, included an online catalog as the public interface to the system's inventory. Most library catalogs are closely tied to their underlying ILS system. Stagnation and dissatisfaction The 1990s saw a relative stagnation in the development of online catalogs. Although the earlier character-based interfaces were replaced with ones for the Web, both the design and the underlying search technology of most systems did not advance much beyond that developed in the late 1980s. 
At the same time, organizations outside of libraries began developing more sophisticated information retrieval systems. Web search engines like Google and popular e-commerce websites such as Amazon.com provided simpler-to-use (yet more powerful) systems that could provide relevancy-ranked search results using probabilistic and vector-based queries. Prior to the widespread use of the Internet, the online catalog was often the first information retrieval system library users ever encountered. Now accustomed to web search engines, newer generations of library users have grown increasingly dissatisfied with the complex (and often arcane) search mechanisms of older online catalog systems. This has, in turn, led to vocal criticisms of these systems within the library community itself, and in recent years to the development of newer (often termed 'next-generation') catalogs. Next-generation catalogs Newer generations of library catalog systems, typically called discovery systems (or a discovery layer), are distinguished from earlier OP
https://en.wikipedia.org/wiki/Stub%20network
A stub network, or pocket network, is a somewhat casual term describing a computer network, or part of an internetwork, with no knowledge of other networks, that will typically send much or all of its non-local traffic out via a single path, with the network aware only of a default route to non-local destinations. As a practical analogy, think of an island which is connected to the rest of the world through a bridge, with no other path available by air or sea. Continuing this analogy, the island might have more than one physical bridge to the mainland, but the set of bridges still represents only one logical path. Examples include: An enterprise that connects to the corporate network by only one router, or by multiple default routers connected to the same logical upstream destination. A single LAN that never carries packets between multiple routers connected to it. All traffic is to and/or from local hosts; the routers will only route packets into the LAN if they are destined for the LAN, and out from the LAN if they originated on the LAN. A person or workgroup connected to an ISP by only one router is a stub network with respect to the ISP; this stub network is part of the ISP's transit network, discussed below. Subnets on an interface for which no default route (also called the gateway of last resort) has been elected are sometimes referred to as stub networks. An OSPF stubby area is one which receives routes from other areas in the OSPF domain, but for external routes, which are communicated via a Type 5 link-state advertisement, the stubby area is only aware of a default route. An OSPF totally stubby area is one which only has a default route to the rest of the OSPF routing domain. Such an area may have more than one router, but these routers will only know about the default route to the outside. A stub autonomous system is one that is connected to only one other autonomous system, through which it gains access to the Internet. 
This is also called a stub AS, which characterizes the great majority of autonomous systems connected to the Internet. As of June 30, 2007, there were 224,622 routes seen by the router. These came from 25,577 autonomous systems, of which only 74 were transit-only and 22,272 were stub/origin-only; 3,305 autonomous systems provided some level of transit. Stub networks are not to be confused with transit networks, which contain at least two routers. Transit networks differ from stub networks in that they are able to allow information to pass through them (hence the name). The stub network is unique in that it contains only one router, that router being the gateway to the network. Stub networks are also capable of multihoming, in which a single host is set up with multiple network connections and IP addresses. Stub networks are useful in situations where the OSPF (Open Shortest Path First) protocol needs to map out the topology of the network. Stub networks serve a special purpose in that they
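The forwarding behaviour of a stub network described above (deliver locally if the destination is on the LAN, otherwise hand everything to the single default gateway) can be sketched with Python's ipaddress module. The prefix and gateway address below are hypothetical documentation addresses:

```python
import ipaddress

LOCAL_PREFIX = ipaddress.ip_network("192.0.2.0/24")  # hypothetical stub LAN
DEFAULT_GATEWAY = "192.0.2.1"                        # the single exit router

def next_hop(destination: str) -> str:
    """A stub network knows only two things: its own prefix and a default route."""
    if ipaddress.ip_address(destination) in LOCAL_PREFIX:
        return "deliver directly on the LAN"
    return f"forward to default gateway {DEFAULT_GATEWAY}"

print(next_hop("192.0.2.50"))    # local delivery
print(next_hop("198.51.100.7"))  # leaves via the single logical path
```

The point of the sketch is the absence of any routing table: every non-local destination collapses onto one entry, the gateway of last resort.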
https://en.wikipedia.org/wiki/IRC%20operator
An IRC operator (often abbreviated as IRCop or oper) is a user on an Internet Relay Chat network who has privileged access. IRC operators are charged with the task of enforcing the network's rules and, in many cases, improving the network in various areas. The permissions available to an IRC operator vary according to the server software in use and the server's configuration. IRC operators are divided into local and global operators. The former are limited to the server(s) they have specific access to; global operators, however, can perform actions affecting all users on the network. In order to perform their duties, IRC operators usually have the ability to:

Forcibly disconnect users (Kill)
Ban (K-line or G-line) users
Change network routing by disconnecting (squitting) or connecting servers

Traditionally, a list of operators on a particular server is available in the MOTD, or through the command. A user can become an operator by sending the command /oper to the IRC server they are currently on, using a pre-selected username and a password as parameters. The command only works on a server which has the proper O-line in its IRCd configuration file. The IP address that the user is operating from may also have to match a predefined one, as an extra layer of security to prevent unauthorized users from gaining operator status if they have cracked the operator's password.

Operator types

On many IRC networks, IRCops have different types of access. These ranks often depend upon the IRCd software used, though a few specific access levels remain fairly constant across variations:

Local operator

The local operator (LocOp) is the lowest of the operator access levels. The LocOp has minimal control over one server in a network, and usually has the ability to kill (disconnect) people from the server or perform local K-lines (server bans).
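On the wire, the /oper login described above is an OPER message as defined by RFC 1459. A minimal sketch of building one follows; the username and password are, of course, placeholders:

```python
def build_oper_message(name: str, password: str) -> bytes:
    """Build the raw IRC line a client sends when a user types /oper.

    RFC 1459 defines the form "OPER <name> <password>", terminated by CRLF.
    The server grants operator status only if a matching O-line exists in
    its configuration (and often only from a matching IP address).
    """
    for field in (name, password):
        if any(c in field for c in " \r\n\0"):
            raise ValueError("IRC message fields cannot contain spaces or line breaks")
    return f"OPER {name} {password}\r\n".encode("utf-8")

print(build_oper_message("wiz", "s3cret"))  # b'OPER wiz s3cret\r\n'
```

A real client would send these bytes over its TCP connection to the server; the sanity check on spaces reflects the protocol's space-delimited parameter syntax.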
Global operator

The global operator (GlobOp) is similar to the LocOp, but has control over the entire network of servers, as opposed to a single server. GlobOps may perform G-lines or AKills (network-wide bans) and shuns (forcibly muting users) over an entire network.

Services administrator

Commonly abbreviated as SA, this admin type has control over all functionality on an IRC network available via network service bots, including the commonly used NickServ, ChanServ, and MemoServ nicks. Usually, an SA has the ability to use the /sa* commands. The /sa* commands, like all actions performed by network services, are typically implemented using a virtual services node on the network, effectively masking the origin of the actions.

Network administrator

The network administrator (NetAdmin) has the highest level of access on a network. In most cases, the founder of the network is the netadmin. Networks may, however, have multiple netadmins, especially networks with large populations.

Ban types

An IRCop with enough privileges may ban unwanted users. The ban types are listed below:

K-line

The K-line is a lo
https://en.wikipedia.org/wiki/POKEY
POKEY, an acronym for Pot Keyboard Integrated Circuit, is a digital I/O chip designed by Doug Neubauer at Atari, Inc. for the Atari 8-bit family of home computers. It was first released with the Atari 400 and Atari 800 in 1979 and is included in all later models and the Atari 5200 console. POKEY combines functions for reading paddle controllers (potentiometers) and computer keyboards with sound generation and a source of pseudorandom numbers. It produces four voices of distinctive square wave audio, either as clear tones or modified with distortion settings. Neubauer also developed the Atari 8-bit killer application Star Raiders, which makes use of POKEY features. POKEY chips are used for audio in many arcade video games of the 1980s, including Centipede, Missile Command, Asteroids Deluxe, and Gauntlet. Some of Atari's arcade systems use multi-core versions with 2 or 4 POKEYs in a single package for more audio channels. The Atari 7800 console allows a game cartridge to contain a POKEY, providing better sound than the system's own audio chip. Only two licensed games make use of this: the ports of Ballblazer and Commando. The LSI chip has 40 pins and is identified as C012294. The USPTO granted U.S. Patent 4,314,236 to Atari on February 2, 1982 for an "Apparatus for producing a plurality of audio sound effects". The inventors listed are Steven T. Mayer and Ronald E. Milner. No longer manufactured, POKEY is emulated in software by arcade and Atari 8-bit emulators, and also via the Atari SAP music format and associated player.

Features

Audio

4 semi-independent audio channels
Channels may be configured as one of:
  Four 8-bit channels
  Two 16-bit channels
  One 16-bit channel and two 8-bit channels
Per-channel volume, frequency, and waveform (square wave with variable duty cycle or pseudorandom noise)
15 kHz or 64 kHz frequency divider; two channels may be driven at the CPU clock frequency
High-pass filter

Keyboard scan (up to 64 keys) + 2 modifier bits (Shift, Control) + Break
Potentiometer ports (8 independent ports, each with 8-bit resolution)
High-resolution timers (audio channels 1, 2, and 4 can be configured to cause timer interrupts when they cross zero)
Random number generator (8 bits of a 17-bit polynomial counter can be read)
Serial I/O port
Eight IRQ interrupts

Versions

By part number:
C012294 — Used in all Atari 8-bit family computers, including the Atari XEGS, as well as the Atari 5200 console. The suffix on the chip refers to the manufacturer:
  C012294B-01 — AMI
  C012294-03 — Signetics
  C012294-19 — National Semiconductor
  C012294-22 — OKI
  C012294-31 — IMP
137430-001 — Part number sometimes used in Atari arcade machines for POKEY.
137324-1221 — Quad-core POKEY used in the Atari arcade machines Major Havoc, I, Robot, Firefox, and Return of the Jedi.

Pinout

Registers

The Atari 8-bit computers map POKEY to the $D2xx page and the Atari 5200 console maps it to the $E8xx page. POKEY provides 29 read/write registers contro
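The frequency dividers listed under Features determine each channel's pitch. A commonly cited simplification for a square-wave channel is f = clock / (2 × (AUDF + 1)), where AUDF is the channel's 8-bit frequency register. The sketch below uses that formula with the nominal NTSC "64 kHz" base clock; treat both the formula and the 63,921 Hz constant as approximations, since exact values depend on the machine and the clocking mode selected:

```python
# Simplified model of a POKEY square-wave channel's pitch.
# Assumption: f_out = clock / (2 * (AUDF + 1)); 63921 Hz is the commonly
# quoted nominal NTSC "64 kHz" base clock. Real hardware has further
# subtleties (15 kHz mode, CPU-clock mode, joined 16-bit channels).
BASE_CLOCK_64K = 63921.0  # Hz, approximate

def pokey_tone_hz(audf: int, clock: float = BASE_CLOCK_64K) -> float:
    if not 0 <= audf <= 255:
        raise ValueError("AUDF is an 8-bit register (0-255)")
    return clock / (2 * (audf + 1))

# The divider trades range for resolution: small AUDF values give high
# pitches, large values low ones.
for audf in (0, 72, 255):
    print(f"AUDF={audf:3d} -> {pokey_tone_hz(audf):8.1f} Hz")
```

Joining two channels into a 16-bit pair, as the feature list describes, extends AUDF to 16 bits and so allows much finer (and much lower) frequencies.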
https://en.wikipedia.org/wiki/Door%20%28bulletin%20board%20system%29
In a bulletin board system (BBS), a door is an interface between the BBS software and an external application. The term is also used to refer to the external application itself, a computer program that runs outside of the main bulletin board program. Sometimes called external programs, doors are the most common way to add games, utilities, and other extensions to BBSes. Because BBSes typically depended on the telephone system, BBSes and door programs tended to be local in nature, unlike modern Internet games and applications. From the 1990s on, most BBS software had the capability to "drop to" doors. Several standards were developed for passing connection and user information to doors; this was usually done with "dropfiles", small binary or text files dropped into known locations in the BBS's file system. Most doors were responsible for operating the serial port or other communications device directly until returning control to the BBS. The later development of FOSSIL drivers allowed both BBSes and their doors to communicate without being responsible for direct operation of the communications hardware.

Door games

A major use of doors is for door games: computer games played on the BBS. These games included strategy games such as TradeWars 2002, Food Fight!, Solar Realms Elite, Space Dynasty, Usurper, and Barren Realms Elite. There were also role-playing games (RPGs), often derived from earlier email-based games; examples include Seth Robinson's Legend of the Red Dragon, the popular dystopian RPG Operation: Overkill, and Mutants!. BBSes often published high scores, encouraging players to beat others. InterBBS leagues allowed users of different BBSes to compete against each other in the same game. A modern version of this, known as BBSlink, allows sysops to offer door games on their BBS that are hosted on a remote server, thereby increasing the user base of the game.
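The dropfile hand-off described earlier can be sketched as a parser. The field layout below is purely illustrative and the field names are invented (real formats such as DOOR.SYS or DORINFO1.DEF each define their own line order), but the mechanism is the same: the BBS writes a small text file, and the door reads it to learn who is connected and how:

```python
# Hypothetical dropfile: one value per line, in an order the BBS and the
# door have agreed on. Field names and layout here are invented.
SAMPLE_DROPFILE = """COM1
38400
Jane Caller
60
"""

FIELDS = ("com_port", "baud_rate", "user_name", "minutes_left")

def parse_dropfile(text: str) -> dict:
    """Read the connection and user info a BBS drops for an external door."""
    info = dict(zip(FIELDS, text.splitlines()))
    info["baud_rate"] = int(info["baud_rate"])
    info["minutes_left"] = int(info["minutes_left"])
    return info

print(parse_dropfile(SAMPLE_DROPFILE))
```

Having read the dropfile, a pre-FOSSIL door would then open the named COM port itself at the given baud rate, only returning control of the line to the BBS on exit.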
Other applications

While many of the most popular and memorable BBS doors have been games, numerous doors had non-entertainment applications, such as user polls or the time bank, permitting users to time-shift their rationed BBS use. Frequently they act as a front end to themed databases on subjects such as astrology, numerology and fortune-telling, recipes, weather prediction, personal ads (sometimes with additional match-making functionality), classified ads and "for sale" listings (sometimes permitting auctions), BBS lists, and parting comments from the most recent BBS callers.
https://en.wikipedia.org/wiki/FileVault
FileVault is a disk encryption program in Mac OS X 10.3 Panther (2003) and later. It performs on-the-fly encryption of volumes on Mac computers.

Versions and key features

FileVault was introduced with Mac OS X 10.3 Panther, and could only be applied to a user's home directory, not the startup volume. The operating system uses an encrypted sparse disk image (a large single file) to present a volume for the home directory. Mac OS X 10.5 Leopard and Mac OS X 10.6 Snow Leopard use more modern sparse bundle disk images, which spread the data over 8 MB files (called bands) within a bundle. Apple refers to this original iteration of FileVault as "legacy FileVault". OS X 10.7 Lion and newer versions offer FileVault 2, a significant redesign that encrypts the entire OS X startup volume, which typically includes the home directory, abandoning the disk image approach. In this approach to disk encryption, authorised users' information is loaded from a separate non-encrypted boot volume (partition/slice type Apple_Boot).

FileVault

The original version of FileVault was added in Mac OS X Panther to encrypt a user's home directory.

Master passwords and recovery keys

When FileVault is enabled, the system invites the user to create a master password for the computer. If a user password is forgotten, the master password or recovery key may be used to decrypt the files instead. The FileVault recovery key is different from a Mac recovery key, which is a 28-character code used to reset a password or regain access to an Apple ID.

Migration

Migration of FileVault home directories is subject to two limitations: there must have been no prior migration to the target computer, and the target must have no existing user accounts. If Migration Assistant has already been used, or if there are user accounts on the target, FileVault must be disabled at the source before migration.
When transferring FileVault data from a previous Mac running Mac OS X 10.4, using the built-in utility to move data to a new machine, the data continues to be stored in the old sparse image format, and the user must turn FileVault off and then on again to re-encrypt it in the new sparse bundle format.

Manual encryption

Instead of using FileVault to encrypt a user's home directory, a user can create an encrypted disk image themselves using Disk Utility and store any subset of their home directory in there (for example, ). This encrypted image behaves similarly to a FileVault-encrypted home directory, but is under the user's own maintenance. Encrypting only a part of a user's home directory can be problematic when applications need access to the encrypted files, which will not be available until the user mounts the encrypted image. This can be mitigated to a certain extent by making symbolic links for these specific files.

Limitations and issues

Backups

Without Mac OS X Server, Time Machine will back up a FileVault home directory only while the user is logged out. In such cases, Time Machine is limited to backi
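The symbolic-link mitigation mentioned above can be sketched as follows. All paths here are throwaway placeholders; on a real system the link target would live inside the mounted encrypted image and the link itself at the path the application expects:

```python
import os
import tempfile

def link_into_encrypted_image(real_file: str, expected_path: str) -> None:
    """Point an application's expected path at a file kept on the
    (mounted) encrypted image, so the app finds it transparently."""
    os.makedirs(os.path.dirname(expected_path), exist_ok=True)
    os.symlink(real_file, expected_path)

# Demonstration with temporary directories standing in for the mounted
# image and the user's home directory.
with tempfile.TemporaryDirectory() as tmp:
    secret = os.path.join(tmp, "image", "keys.txt")
    os.makedirs(os.path.dirname(secret))
    open(secret, "w").write("secret material")
    link = os.path.join(tmp, "home", ".app", "keys.txt")
    link_into_encrypted_image(secret, link)
    print(open(link).read())  # read through the symlink
```

The limitation the text notes still applies: if the image is not mounted, the link dangles and the application's read fails.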
https://en.wikipedia.org/wiki/Heartbeat
A heartbeat is one cardiac cycle of the heart. Heartbeat, heart beat, heartbeats, and heart beats may refer to: Computing Heartbeat (computing), a periodic signal to indicate normal operation or to synchronize parts of a system Heartbeat, clustering software from the Linux-HA project a piece of software by Edward Snowden Films Heartbeat (1938 film), a French comedy Heartbeat (1946 film), an American film by Sam Wood, starring Ginger Rogers La Chamade (film) (English title: Heartbeat), a 1968 French film by Alain Cavalier Heart Beat (film), a 1980 American film about the love triangle between Jack Kerouac, Carolyn Cassady and Neal Cassady Heart Beats (film), a 2007 Indian Malayalam-language film Heartbeats (2010 film), a Canadian French–English film by Xavier Dolan Heartbeat (2010 film), a South Korean film about the illegal trade in human organs Heartbeat (2012 film), an Austrian short film Heartbeat (2014 film), a Canadian film Heartbeats (2017 film), an Indian film by Duane Adler Television HeartBeat (1988 TV series), an American medical drama that aired on ABC Heartbeat (British TV series), a 1992–2010 British period drama Heartbeat (1993 film), an NBC TV film Heartbeat (2016 TV series), an American medical drama that aired on NBC Heartbeat (South Korean TV series), a 2023 television series Music Bands The Heartbeats, a 1950s American doo-wop group The Heartbeats (big band), an American jazz ensemble The Heart Beats, a 1960s all-female garage rock band Albums Heartbeat (Bad Boys Blue album), 1986 Heartbeat (Curtis Mayfield album) or the title song, 1979 Heartbeat (Da' T.R.U.T.H. album) or the title song, 2014 Heartbeat (Don Johnson album) or the Wendy Waldman title song (see below), 1986 Heartbeat (G.E.M. 
album) or the title song, 2015 Heartbeat (Jasmine Rae album) or the title song, 2015 Heartbeat (Jeremy Rosado album) or the title song, 2015 Heartbeat (The Oak Ridge Boys album), 1987 Heartbeat (Ruby Lin album) or the title song, 1999 Heartbeat (Ryuichi Sakamoto album) or the title song (see below), 1991 Heartbeat (Sarah Engels album) or the title song, 2011 Heartbeat - It's a Lovebeat by The DeFranco Family, or the title song, 1973 Heartbeat: The Abbreviated King Crimson or the title song (see below), 1991 The Heartbeat (Bellarive album), 2012 Heart Beat (Wang Leehom album) or the title song, 2008 Heartbeats – Chris Rea's Greatest Hits, 2005 Heart Beats (Dami Im album), 2014 Heart Beats (Danny album), 2007 Heart Beats (Keystone Trio album), 1995 Heartbeat, by Chris & Cosey, 1981 Heart Beat, by Space Tribe, 2002 Heartbeats, by Grum, 2010 Songs "Heartbeat" (2PM song), 2009 "Heartbeat" (Annie song), 2004 "Heartbeat" (BTS song), 2019 "Heartbeat" (Buddy Holly song), 1958 "Heartbeat" (Can-linn song), 2014 "Heartbeat" (Carrie Underwood song), 2015 "Heartbeat" (Childish Gambino song), 2011 "Heartbeat" (Enrique Iglesias song), 2010 "Heartbeat" (The Fray song), 2011 "Heartbeat" (Girlfriend song), 1993 "Heartbeat" (Wendy Waldman song), 1982;
https://en.wikipedia.org/wiki/Programming%20Perl
Programming Perl, best known as the Camel Book among programmers, is a book about writing programs using the Perl programming language, revised as several editions (1991-2012) to reflect major language changes since Perl version 4. Editions have been co-written by the creator of Perl, Larry Wall, along with Randal L. Schwartz, then Tom Christiansen and then Jon Orwant. Published by O'Reilly Media, the book is considered the canonical reference work for Perl programmers. With over 1,000 pages, the various editions contain complete descriptions of each Perl language version and its interpreter. Examples range from trivial code snippets to the highly complex expressions for which Perl is widely known. The Camel Book editions are also noted for being written in an approachable and humorous style.

History

The first edition, which gained the nickname "the pink camel" due to its pink spine, was originally published in January 1991 and covered version 4 of the Perl language. It was the work of Larry Wall and Randal L. Schwartz. The second edition, published in August 1996, included updates for the release of Perl 5, among them references, objects, packages and other modern programming constructs. This edition was written from scratch by the original authors and Tom Christiansen. In July 2000, the third edition of Programming Perl was published. This version was again rewritten, this time by Wall, Christiansen and Jon Orwant, and covered the Perl 5.6 language. The fourth edition constitutes a major update and rewrite of the book for Perl version 5.14, and improves the coverage of Unicode usage in Perl. The fourth edition was published in February 2012. This edition is written by Tom Christiansen, brian d foy, Larry Wall and Jon Orwant. Programming Perl has also been made available electronically by O'Reilly, both through its inclusion in various editions of The Perl CD Bookshelf and through the "Safari" service (a subscription-based website containing technical ebooks).
The publisher offers online a free sample of Chapter 18 of the third edition and Chapter 1 of the fourth edition, as well as the complete set of code examples in the book (third edition). O'Reilly maintains a trademark on the use of a camel in association with Perl, but allows noncommercial use.

Editions

First edition (1991; 482 pages; covers Perl 4)
Second edition (1996; 670 pages; covers Perl 5.003)
Third edition (2000; 1104 pages; covers Perl 5.6)
Fourth edition (2012; 1184 pages; covers Perl 5.14)

The second edition of the book was the best-selling book in the O'Reilly Media catalog in 1996, and one of the top 100 selling books in any category at Borders in 1996.

See also

Some related books published by O'Reilly are: Learning Perl, Intermediate Perl, and Mastering Perl. The "Three Virtues of a Programmer" are three entries in the glossary of the 2nd edition of the book, which have been popularized outside the Perl programming community.
https://en.wikipedia.org/wiki/Learning%20Perl
Learning Perl, also known as the Llama Book, is a tutorial book for the Perl programming language, published by O'Reilly Media. The first edition (1993) was authored solely by Randal L. Schwartz and covered Perl 4. All subsequent editions have covered Perl 5. The second (1997) edition was coauthored with Tom Christiansen, and the third (2001) edition was coauthored with Tom Phoenix. The fourth (2005), fifth (2008), sixth (2011), and seventh (2016) editions were written by Schwartz, Phoenix, and brian d foy. According to the 5th edition of the book, previous editions had sold more than 500,000 copies. Unlike Programming Perl, this book is aimed at computer programmers new to Perl. The publisher offers a complete set of the code examples presented in the 3rd edition. Schwartz selected the world of The Flintstones for the examples in this book, giving rise to the somewhat frequent use of Fred and Barney as metasyntactic variables, rather than the classic foo and bar.

Reactions

Brad Morrey, reviewing the book for InfoWorld, praises its "casual, first person style" and concludes that it "is a terrific introduction to the language that will serve as a good reference book once you have read it through." In his Linux Journal review of Perl in a Nutshell, Jan Rooijackers recommends that "If you are totally new to programming and you want to learn Perl, the book Learning Perl ... might be a better place to start." Discussing Schwartz's conviction, the New York Times noted that "Much of the Internet's World Wide Web has been built by programmers who got their start by reading his "Programming Perl" and "Learning Perl" books." Reflecting on that case in Principles of Information Systems Security, Gurpreet Dhillon calls Learning Perl "the definitive Perl instruction guide." In Perl Medic, author Peter Scott calls the book "the most common tutorial for learning Perl", but criticizes its omission of hard references.
Later works

In 2003, Schwartz published a follow-up to Learning Perl titled Learning Perl Objects, References & Modules, which picks up where Learning Perl left off. In 2005, Learning Perl Objects, References & Modules was updated by Schwartz and brian d foy and re-titled Intermediate Perl, which is now in its second edition as of 2012. Mastering Perl, the third book in the trilogy and the follow-up to Intermediate Perl, was first published in July 2007 and is also in a second edition as of 2014.

Editions

First edition (Nov. 1993; 274 pages)
Second edition (Jul. 1997; 300 pages; covers Perl 5.004)
Third edition (Jul. 2001; 330 pages; covers Perl 5.6)
Fourth edition (Jul. 2005; 312 pages; covers Perl 5.8)
Fifth edition (Jul. 2008; 348 pages; covers Perl 5.10)
Sixth edition (Jun. 2011; 390 pages; covers Perl 5.14)
Seventh edition (Oct. 2016; 394 pages; covers Perl 5.24)
Eighth edition (Aug. 2021; 398 pages; covers Perl 5.34)
https://en.wikipedia.org/wiki/Abstraction%20inversion
In computer programming, abstraction inversion is an anti-pattern arising when users of a construct need functions implemented within it but not exposed by its interface. The result is that the users re-implement the required functions in terms of the interface, which in turn uses the internal implementation of the same functions. This may result in implementing lower-level features in terms of higher-level ones, hence the term 'abstraction inversion'. Possible ill effects are:

The user of such a re-implemented function may seriously underestimate its running costs.
The user of the construct is forced to obscure their implementation with complex mechanical details.
Many users attempt to solve the same problem, increasing the risk of error.

Examples

Alleged examples from professional programming circles include:

In Ada, the choice of the rendezvous construct as a synchronisation primitive forced programmers to implement simpler constructs, such as semaphores, on top of the more complex one.
In Applesoft BASIC, integer arithmetic was implemented on top of floating-point arithmetic, and there were no bitwise operators and no support for blitting of raster graphics (even though the language supported vector graphics on the Apple II's raster hardware). This caused games and other programs written in BASIC to run slower.
Like Applesoft BASIC, Lua has a floating-point type as its sole numeric type when configured for desktop computers, and it had no bitwise operators prior to Lua 5.2.
Creating an object to represent a function is cumbersome in object-oriented languages such as Java and C++ (especially prior to C++11 and Java 8), in which functions are not first-class objects. In C++ it is possible to make an object 'callable' by overloading the () operator, but it is still often necessary to implement a new class, such as the functors in the STL. (C++11's lambda functions make it much easier to create an object representing a function.)
Tom Lord has suggested that the Subversion version control system pays for the abstraction inversion of implementing a write-only database on top of a read/write database with poor performance.
Using stored procedures to manipulate data in a relational database, without granting programmers the right to deploy such procedures, leads to reimplementing queries outside the database. For example, large datasets (in extreme cases, whole tables) are fetched and the actual filtering takes place in application code. Alternatively, thousands of rows are updated (inserted or even fetched) one by one instead of via a multiple-row query.
Microsoft's WinUI 3 systematically replaces the title bar of the windows it creates with a custom one that ignores the end-user's color settings, always appearing gray instead. Applying the end-user's chosen color to the title bar requires further customization code on Windows 11, and completely replacing the custom title bar with another custom one on Windows 10.

Examples that are common outside pro
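The database example above (pulling whole tables across the interface and filtering in application code because the database's own query capability is off-limits) can be made concrete with a small sqlite3 sketch; the table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i * 10.0) for i in range(1, 1001)])

# Abstraction inversion: fetch every row through the interface, then
# re-implement filtering (something the database already does well)
# in application code.
def large_orders_inverted(threshold: float) -> list:
    all_rows = conn.execute("SELECT id, amount FROM orders").fetchall()
    return [row for row in all_rows if row[1] > threshold]

# The straightforward version lets the database do the filtering.
def large_orders_direct(threshold: float) -> list:
    return conn.execute(
        "SELECT id, amount FROM orders WHERE amount > ?", (threshold,)
    ).fetchall()

assert large_orders_inverted(9900.0) == large_orders_direct(9900.0)
print(len(large_orders_direct(9900.0)))
```

Both versions return the same ten rows, but the inverted one drags all 1,000 rows across the interface to find them, illustrating the underestimated running cost named among the ill effects above.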
https://en.wikipedia.org/wiki/Idols%20South%20Africa
Idols is a television show on the South African television network Mzansi Magic, and previously on M-Net, based on the popular British show Pop Idol. The show is a contest to determine the best young singer in South Africa. The general format of the show is that thousands of hopeful performers from across South Africa audition in front of the judges. They are narrowed down to approximately 100 to enter the theatre rounds. They perform in group and solo rounds until 16 finalists are chosen by the judges (usually 8 males and 8 females). From these, the top 10 are selected; then each week viewers have several hours following the broadcast of the previous episode to vote by phone, SMS or online for their favourite contestant. The contestant(s) with the fewest votes are sent home each week.

The show was presented by Candy Litchfield and Matthew Stewardson in the first season. Halfway through the season, Stewardson was replaced by Sami Sabiti. After Colin Moss and Letoya Makhene co-hosted the second season, Moss went solo for the third and fourth seasons. Liezl van der Westhuizen became the host in the fifth season, and was joined in the sixth season by ProVerb, who hosted the semi-final rounds. ProVerb has been the sole presenter from the seventh season to the present.

Randall Abrahams, Dave Thompson, Marcus Brewster and Penny Lebyane were the judges in the first season. Judges provide critiques of each contestant's performance. Brewster and Lebyane did not return for the second season and were replaced by Mara Louw-Thomson and Gareth Cliff. Unathi Nkayi replaced Louw-Thomson in the seventh season. Somizi Mhlongo was added to the judging panel in the eleventh season. Cliff left the show after the twelfth season. After several seasons as judges, Abrahams and Nkayi did not return for the eighteenth season. In February 2022, Thembi Seete and JR replaced the two. In February 2023, it was announced that the nineteenth season would be the final season of the competition.
In 2006, the show had a spin-off called Afrikaanse Idols on sister channel KykNet, which followed exactly the same format. However, the entire programme, including the songs performed, was in Afrikaans. There was only one season, due to low ratings. The judges were Mynie Grové, Deon Maas and Taliep Petersen, while the presenter was Sean Else.

Season synopsis

Season 1

The first season of South African Idols started in March 2002, with auditions held in January. It was the second international spin-off of the original Pop Idol series to go on air, after Poland's. However, the South African show had a shorter run than the Polish version, and Heinz Winckler therefore became the second Idol winner worldwide on 17 June 2002. He triumphed over Brandon October, who came second, and Melanie Lowe, who finished third.

Season 2

The second season ran from June to October 2003 and was won by then 20-year-old Anke
https://en.wikipedia.org/wiki/SCA%20%28computer%20virus%29
The SCA virus is the first computer virus created for the Amiga and one of the first to gain public notoriety. It appeared in November 1987. The SCA virus is a boot sector virus. On every 15th copy after a warm reboot it displays the following text:

Something wonderful has happened
Your AMIGA is alive !!!
and, even better...
Some of your disks are infected by a VIRUS !!!
Another masterpiece of The Mega-Mighty SCA !!

"SCA" is an acronym for the Swiss Cracking Association, a group engaged in software protection removal, so the geographic origin of the virus was Switzerland. The virus was probably authored by an SCA member known as "CHRIS". SCA will not harm disks per se, but spreads to any write-enabled floppies inserted. If they use custom bootblocks (such as games), they are rendered unusable. The SCA bootblock also checksums as an original filesystem (OFS) bootblock, hence destroying newer filesystems if the user doesn't know the proper use of the "install" command to remove SCA ("install df0: FFS FORCE" to recover a 'fast filesystem' floppy). The "Mega-Mighty SCA" produced the first Amiga virus checker, which killed the SCA virus. This may well have been in response to estimates that approximately 40% of all Amiga users had SCA in their disk collection somewhere, due to rampant piracy. Other authors inspired by the harmless SCA virus would later produce more destructive viruses known as Byte Bandit and Byte Warrior. The first line of the infection message refers to the 1986 movie Short Circuit and the subsequent computer game, with the line "Something wonderful has happened... No. 5 is alive."
https://en.wikipedia.org/wiki/Thee%20Temple%20ov%20Psychick%20Youth
Thee Temple ov Psychick Youth, abbreviated as TOPY, was a British magical organization, fellowship and chaos magic network founded in 1981 by Genesis P-Orridge, lead member of the multimedia group Psychic TV. The network, including later members of Coil and Current 93, was a loosely federated organization of members and initiates operating as an order of ceremonial magic and sex magic, as well as an experimental artistic collective.

Creation and influence

Their early network consisted of a number of "stations" worldwide, including TOPY-CHAOS for Australia, TOPYNA for North America and TOPY Station 23 for the United Kingdom and Europe. Smaller, "grass-roots"-level sub-stations called Access Points were located throughout America and Europe. Throughout its existence, TOPY has been an influential group in the underground chaos magic scene. In 2016, French-Canadian director Jacqueline Castel began work on a feature-length documentary about TOPY, titled A Message from the Temple.

Theory and praxis

Potential TOPY members were encouraged to make magical sigils of a certain prescribed nature. These acts were to be performed on the 23rd hour (11:00pm) of the 23rd day of each month. If an individual chose to do so, they were invited to mail their sigils to a central location where the magical energy in them could be used to enhance others.

Schisms

In the early 1990s, a rift occurred within the network when Genesis P-Orridge of Psychic TV, one of the few founding members still involved at that time, and probably the best known public face of TOPY during the 1980s, announced their departure from the organization. This was later exacerbated when Genesis P-Orridge claimed to have shut down the network upon leaving and requested that the group no longer use the registered trademark of the Psychick Cross. Some of the remaining members of the network chose not to go along with this and carried on with their activities.
TOPY continued to grow and evolve throughout the 1990s and into the 21st century while Genesis P-Orridge moved on to other projects such as The Process, as well as a similar project to TOPY called Topi. Genesis P-Orridge's TOPY has been criticized by Dan Siepmann as being a front for abuses of power and developing an actual cult of personality. Key texts There have been a number of texts produced by Thee Temple ov Psychick Youth to expound its philosophies. Some of the key texts produced over the years have been: Axiom 23 Thee Sigilizers Handbook Thee Grey Book (which was important during the 1980s but is no longer distributed by TOPY) Thee Black Book Broadcast (the journal of TOPY) Thee Psychick Bible is a compilation of TOPY literature, with updates and personal additions by Genesis P-Orridge, edited by Jason Louv. References Citations Works cited Primary sources Version 7.2 10.16.00 E229. Secondary sources External links Chaos magic Magical organizations Organizations established in 1981
https://en.wikipedia.org/wiki/PC1512
The Amstrad PC1512 was Amstrad's mostly IBM PC-compatible computer system, first manufactured in 1986. The following year a slightly updated version, the PC1640, was introduced; it was also marketed as the PC6400, while the PC1512 was also sold as the Sinclair PC500. Schneider-branded machines also existed for the German market.

Features
Whereas IBM's PC (and almost all PC compatibles) had a power supply in a corner of the main case, the PC1512's power supply was integrated with that of its monitor. The monitor had sufficient venting to cool itself by convection, instead of needing a fan. The PC1512 was therefore quieter than other PCs. Rumours circulated that an Amstrad PC would overheat, and while existing owners would note that this did not happen, new buyers were discouraged. As a result, later models had a cooling fan integrated into the main case. Another rumour suggested that the 'unshielded' power supply in the monitor caused issues with an optional hard drive that could be installed at the back of the base unit, and that this could be solved by taping tin or aluminium foil over the back of the base unit or the bottom of the monitor to shield the hard drive. The PC1512 shipped with one or two 360 KB 5¼-inch floppy drives, and optionally a hard drive (10 or 20 MB). The 5¼-inch floppy drive(s) could be replaced with 1.2 MB capacity versions. Amstrad licensed both MS-DOS 3.2 and Digital Research DOS Plus, which was largely compatible with MS-DOS and included some features from CP/M and the ability to read CP/M disks. Only one of these operating systems could be used at a time. Amstrad also licensed the GEM windowing system, which supported the customized CGA hardware of the 1512. In 1987 the PC1512 was followed by the PC1640, which had 640 KB of RAM and an EGA-compatible graphics chipset (though only the ECD model could display all EGA modes).
The PC1640 also allowed replacing the internal graphics adapter with an 8-bit ISA VGA graphics expansion board, which made it more versatile than the PC1512. Both the PC1512 and the PC1640 could be upgraded with a NEC V30 CPU, which increased performance and added 80186 instruction-set compatibility, and with an Intel 8087 mathematical coprocessor (FPU). With the FPU installed, the PC1512 and PC1640 could outperform later PC architectures (e.g. an 80286 without an FPU) in numerical operations, which was highly useful for CAD and other numerical calculations. Lotus 1-2-3 and MATLAB supported the 8087. Performance benchmarks conducted with CheckIt compare the different models and configurations of the PC1512 and PC1640 with that of a fast 80286. Amstrad also attempted to expand its market share by selling computers in the United States. In the US the PC1640 was marketed as the PC6400 and included a 20 MB hard drive.

Reception
The PC1512, and also its successor the PC1640, sold very well. This was partly because the base model (one floppy drive, no hard disk) la
https://en.wikipedia.org/wiki/Idol%20%28Polish%20TV%20series%29
Idol is a television show on the Polish television network Polsat, based on the popular British show Pop Idol. The show is a contest to determine the best young singer in Poland. It is hosted by Maciej Rock. In the show, people first audition, and eventually the performers are narrowed down to 10 finalists, with each contestant performing live. Four (or five) judges provide critiques of each competitor's performance. Viewers have two hours following the broadcast of the show to vote via telephone and SMS for their favorite contestant. On the night's results episode, the contestant with the fewest votes is sent home. The winners of Idol were Alicja Janosz in season one, Krzysztof Zalewski in season two, Monika Brodka in season three and Maciek Silski in season four. After a hiatus of 12 years, the show returned for its 5th season in 2017, which was won by Mariusz Dyba. Judges Series overview References External links Official website Music competitions in Poland Television series by Fremantle (company) 2002 Polish television series debuts 2000s Polish television series Polsat original programming Polish television series based on British television series
https://en.wikipedia.org/wiki/Idols%20%28Dutch%20TV%20series%29
Idols was a television show on the Dutch television network RTL 4, which is part of the Idols series based on the popular British show Pop Idol. The show is a contest to determine the best young singer in the Netherlands. The show is divided into two sections, the first being the audition round, an open audition where everyone who wants to try is allowed to sing. The first couple of shows usually show the worst and the best contenders in these auditions. Once the best are selected, the theater round starts. Here the singers who survived the auditions have to prove they really have what it takes to become an idol. Over a couple of shows these performers are narrowed down to just 10 finalists, with each contestant performing live. In the first two seasons there were four judges, but starting with season 3 there were just three. The judges provide critiques of each competitor's performance and determine nine of the ten people who enter the final shows. In the final shows they only comment and no longer have any say in the outcome. After the first part of the show, viewers have around one hour to vote by telephone or text message for their favorite contestant; later in the night a results show airs in which the contestant with the fewest votes is sent home. On 5 November the Dutch broadcaster RTL announced a fifth season, which would be broadcast on RTL 5 in 2016.

Season 1
Auditions began in 2002 and were held in Zeist, Eindhoven, Hoofddorp, Rotterdam and Assen. 7,626 people auditioned in the debut season. 94 successful auditionees progressed to the next stage at the TheaterHotel De Oranjerie in Roermond, Limburg. In a chorus line of ten, contestants re-auditioned with a self-chosen song. 50 contestants made the second day of the theatre round, where groups based on gender were formed to sing one pre-determined song: "Isn't She Lovely?" and "I'm So Excited", for the males and females respectively.
Semi Final Qualifyings
From this stage, all shows are broadcast live from Studio 22 in Hilversum.
Top 30 Format: 3 out of 10 making the finals each week + one Wildcard
Finals
Elimination Chart

Season 2
Boris Titulaer won the contest, with Maud being the runner-up.
Semi Final Qualifyings
Top 27 Format: 3 (2 Viewers & 1 Judges Choice) out of 9 making the finals each week + 1 additional Wildcard
Finals
Elimination Chart

Season 3
Season 3 started on October 22, 2005 with two notable changes: two new presenters, and a jury of three members instead of the previous four. Raffaëla Paton won the contest, with Floortje being runner-up.
Semi Final Qualifyings
Top 27 Format: 4 (3 Viewers & 1 Judges Choice) out of 9 making the finals each week + 1 additional Wildcard
Finals
Elimination Chart

Season 4

Season 5
Auditions began in January 2016 and were held at Novotel Schiphol Airport in Hoofddorp. The first audition show attracted a record-breaking 1.5 million viewers. 93 contestants progressed to the next sta
https://en.wikipedia.org/wiki/Datagram%20Delivery%20Protocol
Datagram Delivery Protocol (DDP) is a member of the AppleTalk networking protocol suite. It is responsible for socket-to-socket delivery of datagrams over an AppleTalk network. All application-level protocols, including the infrastructure protocols NBP, RTMP and ZIP, were built on top of DDP.

External links
- AppleTalk Management Information Base II DDP Variable Specifications

Network layer protocols
https://en.wikipedia.org/wiki/Software%20inspection
Inspection in software engineering refers to peer review of any work product by trained individuals who look for defects using a well-defined process. An inspection might also be referred to as a Fagan inspection after Michael Fagan, the creator of a very popular software inspection process.

Introduction
An inspection is one of the most common sorts of review practices found in software projects. The goal of the inspection is to identify defects. Commonly inspected work products include software requirements specifications and test plans. In an inspection, a work product is selected for review and a team is gathered for an inspection meeting to review the work product. A moderator is chosen to moderate the meeting. Each inspector prepares for the meeting by reading the work product and noting each defect. In an inspection, a defect is any part of the work product that will keep an inspector from approving it. For example, if the team is inspecting a software requirements specification, each defect will be text in the document which an inspector disagrees with.

Inspection process
The inspection process was developed in the mid-1970s and has since been extended and modified. The process should have entry criteria that determine whether the inspection process is ready to begin; this prevents unfinished work products from entering the inspection process. The entry criteria might be a checklist including items such as "The document has been spell-checked". The stages in the inspection process are: Planning, Overview meeting, Preparation, Inspection meeting, Rework and Follow-up. The Preparation, Inspection meeting and Rework stages might be iterated. Planning: The inspection is planned by the moderator. Overview meeting: The author describes the background of the work product. Preparation: Each inspector examines the work product to identify possible defects.
Inspection meeting: During this meeting the reader reads through the work product, part by part and the inspectors point out the defects for every part. Rework: The author makes changes to the work product according to the action plans from the inspection meeting. Follow-up: The changes by the author are checked to make sure everything is correct. The process is ended by the moderator when it satisfies some predefined exit criteria. The term inspection refers to one of the most important elements of the entire process that surrounds the execution and successful completion of a software engineering project. Inspection roles During an inspection the following roles are used. Author: The person who created the work product being inspected. Moderator: This is the leader of the inspection. The moderator plans the inspection and coordinates it. Reader: The person reading through the documents, one item at a time. The other inspectors then point out defects. Recorder/Scribe: The person that documents the defects that are found during the inspection. Inspector: The person that exa
https://en.wikipedia.org/wiki/DMC
DMC may refer to: Computer science and information technology Data Matrix Code, laser etched square code, often used for marking products in the production area Diffusion Monte Carlo method Digital Media Controller, a category within the DLNA standard (for sharing digital media among multimedia devices) tasked with finding content on digital media servers Discrete memoryless channel Dynamic Markov Compression algorithm Dynamic Mesh Communication, a mesh-based intercom system developed for motorcycle communication Media and entertainment Digital mixing console, used in audio mixing Darryl McDaniels, a member of hip hop group Run–DMC Devil May Cry, a Japanese video game series Devil May Cry (video game), the first game in the series DmC: Devil May Cry, a reboot of the series Detroit Metal City, a manga franchise Disco Mix Club, a remix label Dhammakaya Media Channel, a Thai television channel Deathmatch Classic, a Half-Life mod Drummond Money-Coutts, an English magician DMC (Egyptian TV channel), an Arabic-language channel Motor vehicles DeLorean Motor Company, former American automobile manufacturer (1975-1982) DeLorean Motor Company (Texas), company founded in 1995 supplying parts and services to owners of DeLoreans Daelim Motor Company, a South Korean motorcycle, motorscooter and ATV manufacturer Organizations Damak Multiple Campus, Jhapa, Nepal Dhaka Medical College, Bangladesh Diabetes Management Center, Services Hospital Lahore Divisional Model College in Pakistan Deseret Management Corporation, LDS Church company Destination management company, organizing events Detroit Medical Center, Michigan, US Dipolog Medical Centre Foundation College, Inc, Dipolog City, Philippines Data monitoring committee for a clinical trial Other uses Detailed marks certificate, a detailed report of academic performance Digital Media City, Seoul, South Korea Disaster Monitoring Constellation of imaging satellites DMC International Imaging, operator of satellites Dimethyl carbonate, a 
chemical compound Dissimilar metal corrosion Domestic material consumption, measurement of material used Double monocable, a type of ropeway technology Dubai Maritime City DMC (Company) (Dollfus-Mieg et Compagnie) a textile company in Mulhouse, France 2-Chloro-1,3-dimethylimidazolinium chloride (CAS Number 37091-73-9) a chemical reagent, similar to 1-Ethyl-3-methylimidazolium chloride for coupling reactions 4,4’-dimethoxychalcone
https://en.wikipedia.org/wiki/ISO%209241
ISO 9241 is a multi-part standard from the International Organization for Standardization (ISO) covering ergonomics of human-computer interaction. It is managed by the ISO Technical Committee 159. It was originally titled Ergonomic requirements for office work with visual display terminals (VDTs). From 2006 onwards, the standards were retitled to the more generic Ergonomics of Human System Interaction. As part of this change, ISO is renumbering some parts of the standard so that it can cover more topics, e.g. tactile and haptic interaction. In the new scheme, two zeros at the end of a number indicate a generic or basic standard, fundamental aspects are regulated in standards ending with one zero, and a standard with three digits other than zero in the number regulates a specific aspect. The first part to be renumbered was part 10 (now renumbered to part 110). Part 1 is a general introduction to the rest of the standard. Part 2 addresses task design for working with computer systems. Parts 3 to 9 deal with physical characteristics of computer equipment. Part 110 and parts 11 to 19 deal with usability aspects of software, including Part 110 (a general set of usability heuristics for the design of different types of dialogue) and Part 11 (general guidance on the specification and measurement of usability).
Ergonomics of Human System Interaction The revised multipart standard is numbered in series as follows: 100 series: Software ergonomics 200 series: Human system interaction processes 300 series: Displays and display related hardware 400 series: Physical input devices - ergonomics principles 500 series: Workplace ergonomics 600 series: Environment ergonomics 700 series: Application domains - Control rooms 900 series: Tactile and haptic interactions Within those series, the standard currently includes the following parts: Part 100: Introduction to standards related to software ergonomics Part 110: Dialogue principles Part 112: Principles for the presentation of information Part 125: Guidance on visual presentation of information Part 129: Guidance on software individualization Part 151: Guidance on World Wide Web user interfaces Part 143: Forms Part 154: Interactive voice response (IVR) applications Part 161: Guidance on visual user interface elements Part 171: Guidance on software accessibility Part 210: Human-centred design for interactive systems Part 300: Introduction to electronic visual display requirements Part 302: Terminology for electronic visual displays Part 303: Requirements for electronic visual displays Part 304: User performance test methods for electronic visual displays Part 305: Optical laboratory test methods for electronic visual displays Part 306: Field assessment methods for electronic visual displays Part 307: Analysis and compliance test methods for electronic visual displays Part 308: Surface-conduction electron-emitter displays (SED) Part 309 (TR): Organic light-emitting diode (OLED) displays Part
https://en.wikipedia.org/wiki/Sequence%20clustering
In bioinformatics, sequence clustering algorithms attempt to group biological sequences that are somehow related. The sequences can be either of genomic, "transcriptomic" (ESTs) or protein origin. For proteins, homologous sequences are typically grouped into families. For EST data, clustering is important to group sequences originating from the same gene before the ESTs are assembled to reconstruct the original mRNA. Some clustering algorithms use single-linkage clustering, constructing a transitive closure of sequences with a similarity over a particular threshold. UCLUST and CD-HIT use a greedy algorithm that identifies a representative sequence for each cluster and assigns a new sequence to that cluster if it is sufficiently similar to the representative; if a sequence is not matched then it becomes the representative sequence for a new cluster. The similarity score is often based on sequence alignment. Sequence clustering is often used to make a non-redundant set of representative sequences. Sequence clusters are often synonymous with (but not identical to) protein families. Determining a representative tertiary structure for each sequence cluster is the aim of many structural genomics initiatives. Sequence clustering algorithms and packages CD-HIT UCLUST in USEARCH Starcode: a fast sequence clustering algorithm based on exact all-pairs search. 
OrthoFinder: a fast, scalable and accurate method for clustering proteins into gene families (orthogroups)
Linclust: first algorithm whose runtime scales linearly with input set size; very fast; part of the MMseqs2 software suite for fast, sensitive sequence searching and clustering of large sequence sets
TribeMCL: a method for clustering proteins into related groups
BAG: a graph-theoretic sequence clustering algorithm
JESAM: open-source parallel scalable DNA alignment engine with optional clustering software component
UICluster: parallel clustering of EST (gene) sequences
BLASTClust: single-linkage clustering with BLAST
Clusterer: extendable Java application for sequence grouping and cluster analyses
PATDB: a program for rapidly identifying perfect substrings
nrdb: a program for merging trivially redundant (identical) sequences
CluSTr: a single-linkage protein sequence clustering database built from Smith-Waterman sequence similarities; covers over 7 million sequences including UniProt and IPI
ICAtools: original (ancient) DNA clustering package with many algorithms useful for artifact discovery or EST clustering
Skipredundant: EMBOSS tool to remove redundant sequences from a set
CLUSS: algorithm to identify groups of structurally, functionally, or evolutionarily related hard-to-align protein sequences (CLUSS webserver)
CLUSS2: algorithm for clustering families of hard-to-align protein sequences with multiple biological functions (CLUSS2 webserver)

Non-redundant sequence databases
PISCES: a protein sequence culling server
RDB90
UniRef: a non-redundant UniProt sequence database
Uniclust: A
https://en.wikipedia.org/wiki/Aso%20District%2C%20Kumamoto
Aso is a district located in Kumamoto Prefecture, Japan. As of the Aso and Yamato mergers (but with 2003 population data), the district has an estimated population of 40,841 and a density of 58.1 persons per square kilometer. The total area is 703.01 km2.

Towns and villages
Minamioguni Oguni Takamori Minamiaso Nishihara Ubuyama

Mergers
On February 11, 2005 the old town of Aso absorbed the town of Ichinomiya and the village of Namino to become the new city of Aso. On February 11, 2005 the town of Soyō merged with the town of Yabe and the village of Seiwa, both from Kamimashiki District, to form the new town of Yamato (in Kamimashiki District). On February 13, 2005 the villages of Chōyō, Hakusui and Kugino merged to form the new village of Minamiaso.

Districts in Kumamoto Prefecture Aso clan
https://en.wikipedia.org/wiki/Mask%20%28computing%29
In computer science, a mask or bitmask is data that is used for bitwise operations, particularly in a bit field. Using a mask, multiple bits in a byte, nibble, word, etc. can be set either on or off, or inverted from on to off (or vice versa) in a single bitwise operation. An additional use of masking involves predication in vector processing, where the bitmask is used to select which element operations in the vector are to be executed (mask bit is enabled) and which are not (mask bit is clear).

Common bitmask functions

Masking bits to 1
To turn certain bits on, the bitwise OR operation can be used, following the principle that Y OR 1 = 1 and Y OR 0 = Y. Therefore, to make sure a bit is on, OR can be used with a 1. To leave a bit unchanged, OR is used with a 0. Example: Masking on the higher nibble (bits 4, 5, 6, 7) while leaving the lower nibble (bits 0, 1, 2, 3) unchanged.

    10010101     10100101
 OR 11110000  OR 11110000
  = 11110101   = 11110101

Masking bits to 0
More often in practice, bits are "masked off" (or masked to 0) than "masked on" (or masked to 1). When a bit is ANDed with a 0, the result is always 0, i.e. Y AND 0 = 0. To leave the other bits as they were originally, they can be ANDed with 1, as Y AND 1 = Y. Example: Masking off the higher nibble (bits 4, 5, 6, 7) while leaving the lower nibble (bits 0, 1, 2, 3) unchanged.

     10010101      10100101
 AND 00001111  AND 00001111
   = 00000101    = 00000101

Querying the status of a bit
It is possible to use bitmasks to easily check the state of individual bits regardless of the other bits. To do this, turning off all the other bits using the bitwise AND is done as discussed above and the value is compared with 0. If it is equal to 0, then the bit was off, but if the value is any other value, then the bit was on. What makes this convenient is that it is not necessary to figure out what the value actually is, just that it is not 0.
Example: Querying the status of the 4th bit

     10011101      10010101
 AND 00001000  AND 00001000
   = 00001000    = 00000000

Toggling bit values
So far the article has covered how to turn bits on and turn bits off, but not both at once. Sometimes it does not really matter what the value is, but it must be made the opposite of what it currently is. This can be achieved using the XOR (exclusive or) operation. XOR returns 1 if and only if an odd number of bits are 1. Therefore, if two corresponding bits are 1, the result will be a 0, but if only one of them is 1, the result will be 1. Therefore, inversion of the values of bits is done by XORing them with a 1. If the original bit was 1, it returns 1 XOR 1 = 0. If the original bit was 0, it returns 0 XOR 1 = 1. Also note that XOR masking is bit-safe, meaning that it will not affect unmasked bits, because Y XOR 0 = Y, just like an OR. Example: Toggling bit values

     10011101      10010101
 XOR 00001111  XOR 11111111
   = 10010010    = 01101010

To write arbitrary 1s and 0s to a subset of bits, first write 0s to that subset, then
https://en.wikipedia.org/wiki/Purely%20functional
Purely functional may refer to: Computer science Pure function, a function that does not have side effects Purely functional data structure, a persistent data structure that does not rely on mutable state Purely functional programming, a programming paradigm that does not rely on mutable state Law Functionality doctrine, in intellectual property law See also Referential transparency
https://en.wikipedia.org/wiki/Public%20switched%20telephone%20network
The public switched telephone network (PSTN) is the aggregate of the world's telephone networks that are operated by national, regional, or local telephony operators. It provides infrastructure and services for public telephony. The PSTN consists of telephone lines, fiber optic cables, microwave transmission links, cellular networks, communications satellites, and undersea telephone cables interconnected by switching centers, such as central offices, network tandems, and international gateways, which allow telephone users to communicate with each other. Originally a network of fixed-line analog telephone systems, the PSTN is almost entirely digital in its core network and includes mobile and wireless networks, all of which are currently transitioning to use the Internet Protocol to carry their PSTN traffic. The technical operation of the PSTN adheres to the standards internationally promulgated by the ITU-T. These standards have their origins in the development of local telephone networks, primarily in the Bell System in the United States and in the networks of European ITU members. The E.164 standard provides a single global address space in the form of telephone numbers. The combination of the interconnected networks and a global telephone numbering plan allows telephones around the world to connect with each other. History Commercialization of the telephone began shortly after its invention, with instruments operated in pairs for private use between two locations. Users who wanted to communicate with persons at multiple locations had as many telephones as necessary for the purpose. Alerting another user of the desire to establish a telephone call was accomplished by whistling loudly into the transmitter until the other party heard the alert. Bells were soon added to stations for signaling. Later telephone systems took advantage of the exchange principle already employed in telegraph networks. 
Each telephone was wired to a telephone exchange established for a town or area. For communication outside this exchange area, trunks were installed between exchanges. Networks were designed in a hierarchical manner until they spanned cities and states, and international distances. Automation introduced pulse dialing between the telephone and the exchange so that each subscriber could directly dial another subscriber connected to the same exchange, but long-distance calling across multiple exchanges required manual switching by operators. Later, more sophisticated address signaling, including multi-frequency signaling methods, enabled direct-dialed long-distance calls by subscribers, culminating in the Signalling System 7 (SS7) network that controlled calls between most exchanges by the end of the 20th century. The growth of the PSTN was enabled by teletraffic engineering techniques to deliver quality of service (QoS) in the network. The work of A. K. Erlang established the mathematical foundations of methods required to determine the capacity req
https://en.wikipedia.org/wiki/Verizon%20%28mobile%20network%29
Verizon is an American wireless network operator that previously operated as a separate division of Verizon Communications under the name Verizon Wireless. In a 2019 reorganization, Verizon moved the wireless products and services into the divisions Verizon Consumer and Verizon Business, and stopped using the Verizon Wireless name. Verizon is the second-largest wireless carrier in the United States, with 143.3 million subscribers at the end of Q2 2023. The company is headquartered in Basking Ridge, New Jersey. It was founded in 2000 as a joint venture of American telecommunications firm Bell Atlantic, which would soon become Verizon Communications, and British multinational telecommunications company Vodafone. Verizon Communications became the sole owner in 2014 after buying Vodafone's 45-percent stake in the company. It operates national 5G and 4G LTE networks covering about 99 percent of the U.S. population, which in the second half of 2020 won or tied for top honors in each category of the RootMetrics RootScore Reports. Verizon Wireless offers mobile phone services through a variety of devices. Its LTE in Rural America Program, with 21 rural wireless carriers participating, covers 2.7 million potential users in 169 rural counties. Verizon Wireless announced in 2015 that it was developing a 5G, or fifth-generation, network. In 2020, 230 million people were able to access Verizon's 5G dynamic spectrum sharing (DSS) network; by 2022, 200 million people were covered by Verizon's 5G Ultra Wideband network.

History
In September 1999, American phone company Bell Atlantic and British-based Vodafone Airtouch PLC proposed a new wireless phone service joint venture valued at $70 billion. The joint venture was being created as Bell Atlantic underwent a merger with GTE Corporation.
In April 2000, the companies announced that the Bell Atlantic–GTE merger would take the name Verizon and that the Bell Atlantic–Vodafone wireless unit would be called Verizon Wireless (legally Cellco Partnership d.b.a. Verizon Wireless). Verizon Communications owned 55 percent of Verizon Wireless while Vodafone retained 45 percent ownership. Regulators with the Federal Communications Commission approved the Bell Atlantic–GTE merger on June 16, 2000, creating the largest wireless company in the United States. Verizon Wireless held this market position until Cingular Wireless acquired AT&T Wireless Services in 2004. Throughout the 2000s, Verizon acquired several wireless phone companies and assets across the country, including West Virginia Wireless in 2006; Ramcell in 2007; and Rural Cellular Corporation and SureWest Communications, both in 2008. Also in 2008, Verizon struck a deal to buy Alltel for $5.9 billion in equity while assuming $22.2 billion worth of debt. The deal was finalized on January 9, 2009, again making Verizon Wireless the country's largest cellphone network. As per the agreement, Verizon sold rural wireless p
https://en.wikipedia.org/wiki/Geometric%20dimensioning%20and%20tolerancing
Geometric dimensioning and tolerancing (GD&T) is a system for defining and communicating engineering tolerances via a symbolic language on engineering drawings and computer-generated 3D models that describes a physical object's nominal geometry and the permissible variation thereof. GD&T is used to define the nominal (theoretically perfect) geometry of parts and assemblies, the allowable variation in size, form, orientation, and location of individual features, and how features may vary in relation to one another such that a component is considered satisfactory for its intended use. Dimensional specifications define the nominal, as-modeled or as-intended geometry, while tolerance specifications define the allowable physical variation of individual features of a part or assembly. There are several standards available worldwide that describe the symbols and define the rules used in GD&T. One such standard is American Society of Mechanical Engineers (ASME) Y14.5. This article is based on that standard. Other standards, such as those from the International Organization for Standardization (ISO) describe a different system which has very different interpretation rules (see GPS&V). The Y14.5 standard provides a fairly complete set of rules for GD&T in one document. The ISO standards, in comparison, typically only address a single topic at a time. There are separate standards that provide the details for each of the major symbols and topics below (e.g. position, flatness, profile, etc.). BS 8888 provides a self-contained document taking into account a lot of GPS&V standards. Origin The origin of GD&T is credited to Stanley Parker, who developed the concept of "true position". While little is known about Parker's life, it is known that he worked at the Royal Torpedo Factory in Alexandria, West Dunbartonshire, Scotland. His work increased production of naval weapons by new contractors. 
In 1940, Parker published Notes on Design and Inspection of Mass Production Engineering Work, the earliest work on geometric dimensioning and tolerancing. In 1956, Parker published Drawings and Dimensions, which became the basic reference in the field. Dimensioning and tolerancing philosophy According to the ASME Y14.5-2009 standard, the purpose of GD&T is to describe the engineering intent of parts and assemblies. The datum reference frame can describe how the part fits or functions. GD&T can define the dimensional requirements for a part more accurately, in some cases allowing over 50% more tolerance zone than coordinate (or linear) dimensioning. Proper application of GD&T will ensure that the part defined on the drawing has the desired form, fit (within limits), and function with the largest possible tolerances. Through improved producibility, GD&T can add quality and reduce cost at the same time. There are some fundamental rules that need to be applied (these can be found on page 7 of the 2009 edition of the standard): All dimensions must have a tolerance. Every featur
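The "over 50%" figure comes from comparing the circular true-position tolerance zone with the square zone of coordinate dimensioning: a circle circumscribing the square has π/2 ≈ 1.57 times its area, about 57% more. A minimal sketch of the true-position check, including the MMC "bonus tolerance" idea; the function names here are illustrative, not terms from the standard:

```python
import math

def true_position(dx, dy):
    """Diameter of the smallest cylindrical zone containing the measured
    deviation (dx, dy) of a feature's axis from its theoretically exact
    position. Doubling the radial error gives the zone diameter."""
    return 2 * math.hypot(dx, dy)

def position_ok(dx, dy, stated_tol, actual_size=None, mmc_size=None):
    """Positional check. With a maximum-material-condition (MMC) modifier,
    the feature's departure from MMC size is added as bonus tolerance."""
    allowed = stated_tol
    if actual_size is not None and mmc_size is not None:
        allowed += abs(actual_size - mmc_size)  # bonus tolerance
    return true_position(dx, dy) <= allowed

# A hole measured 0.05 off in x and 0.12 in y sweeps a 0.26-diameter zone.
print(true_position(0.05, 0.12))  # ~0.26
```

A hole out of position by 0.26 fails a stated tolerance of 0.25, but passes if an MMC modifier applies and the hole is 0.02 larger than its MMC size, since the allowed zone grows to 0.27.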
https://en.wikipedia.org/wiki/While%20loop
In most computer programming languages, a while loop is a control flow statement that allows code to be executed repeatedly based on a given Boolean condition. The while loop can be thought of as a repeating if statement. Overview The while construct consists of a block of code and a condition/expression. The condition/expression is evaluated, and if the condition/expression is true, the code within the block is executed. This repeats until the condition/expression becomes false. Because the while loop checks the condition/expression before the block is executed, the control structure is often also known as a pre-test loop. Compare this with the do while loop, which tests the condition/expression after the loop has executed. For example, in the C programming language (as well as Java, C#, Objective-C, and C++, which use the same syntax in this case), the code fragment int x = 0; while (x < 5) { printf ("x = %d\n", x); x++; } first checks whether x is less than 5, which it is, so then the {loop body} is entered, where the printf function is run and x is incremented by 1. After completing all the statements in the loop body, the condition, (x < 5), is checked again, and the loop is executed again, this process repeating until the variable x has the value 5. Note that it is possible, and in some cases desirable, for the condition to always evaluate to true, creating an infinite loop. When such a loop is created intentionally, there is usually another control structure (such as a break statement) that controls termination of the loop. 
For example: while (true) { // do complicated stuff if (someCondition) break; // more stuff } Demonstrating while loops These while loops will calculate the factorial of the number 5: ActionScript 3 var counter: int = 5; var factorial: int = 1; while (counter > 1) { factorial *= counter; counter--; } trace("Factorial = " + factorial); Ada with Ada.Integer_Text_IO; procedure Factorial is Counter : Integer := 5; Factorial : Integer := 1; begin while Counter > 0 loop Factorial := Factorial * Counter; Counter := Counter - 1; end loop; Ada.Integer_Text_IO.Put (Factorial); end Factorial; APL counter ← 5 factorial ← 1 :While counter > 0 factorial ×← counter counter -← 1 :EndWhile ⎕ ← factorial or simply !5 AutoHotkey counter := 5 factorial := 1 While counter > 0 factorial *= counter-- MsgBox % factorial Microsoft Small Basic counter = 5 ' Counter = 5 factorial = 1 ' initial value of variable "factorial" While counter > 0 factorial = factorial * counter counter = counter - 1 TextWindow.WriteLine(factorial) EndWhile Visual Basic Dim counter As Integer = 5 ' init variable and set value Dim factorial As Integer = 1 ' initialize factorial variable Do While counter > 0 factorial = factorial * counter counter = counter - 1 Loop ' program goes here, until co
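The pre-test/post-test distinction described in the overview can be shown directly. Python (used here for brevity) has no built-in do-while, so the post-test behaviour is emulated with the common while True / break idiom:

```python
x = 5

# Pre-test (while) loop: the condition is checked first,
# so the body may execute zero times.
runs_pre = 0
while x < 5:
    runs_pre += 1

# Post-test (do-while) behaviour, emulated: the body always
# executes at least once before the condition is checked.
runs_post = 0
while True:
    runs_post += 1
    if not (x < 5):
        break

print(runs_pre, runs_post)  # 0 1
```

With the same (initially false) condition, the pre-test loop's body never runs, while the post-test loop's body runs exactly once.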
https://en.wikipedia.org/wiki/For%20loop
In computer science a for-loop or for loop is a control flow statement for specifying iteration. Specifically, a for loop functions by running a section of code repeatedly until a certain condition has been satisfied. For-loops have two parts: a header and a body. The header defines the iteration and the body is the code that is executed once per iteration. The header often declares an explicit loop counter or loop variable. This allows the body to know which iteration is being executed. For-loops are typically used when the number of iterations is known before entering the loop. For-loops can be thought of as shorthands for while-loops which increment and test a loop variable. Various keywords are used to indicate the usage of a for loop: descendants of ALGOL use "for", while descendants of Fortran use "do". There are other possibilities, for example COBOL, which uses "PERFORM VARYING". The name for-loop comes from the word for. For is used as the keyword in many programming languages to introduce a for-loop. The term in English dates to ALGOL 58 and was popularized in ALGOL 60. It is the direct translation of the earlier German "für" and was used in Superplan (1949–1951) by Heinz Rutishauser. Rutishauser was involved in defining ALGOL 58 and ALGOL 60. The loop body is executed "for" the given values of the loop variable. This is more explicit in ALGOL versions of the for statement where a list of possible values and increments can be specified. In Fortran and PL/I, the keyword "do" is used for the same thing and it is called a do-loop; this is different from a do-while loop. FOR A for-loop statement is available in most imperative programming languages. Even ignoring minor differences in syntax there are many differences in how these statements work and the level of expressiveness they support. 
Generally, for-loops fall into one of the following categories: Traditional for-loops The for-loop of languages like ALGOL, Simula, BASIC, Pascal, Modula, Oberon, Ada, MATLAB, OCaml, F#, and so on, requires a control variable with start- and end-values, which looks something like this: for i = first to last do statement (* or just *) for i = first..last do statement Depending on the language, an explicit assignment sign may be used in place of the equal sign (and some languages require the word even in the numerical case). An optional step-value (an increment or decrement ≠ 1) may also be included, although the exact syntaxes used for this differ a bit more between the languages. Some languages require a separate declaration of the control variable, some do not. Another form was popularized by the C programming language. It requires three parts: the initialization (loop variant), the condition, and the advancement to the next iteration. All three parts are optional. This type of "semicolon loop" came from the B programming language, and it was originally invented by Stephen Johnson. In the initialization part, any variables needed are declared (and usually assigned values). If mu
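The claim that a for-loop is shorthand for a while-loop which initializes, tests, and advances a control variable can be made concrete (Python shown; the three C-style parts are labelled on the expanded form):

```python
# Counted for-loop, "for i = 1 to 5", accumulating 5!
factorial = 1
for i in range(1, 6):
    factorial *= i

# The same loop expanded into the while-loop it abbreviates.
f2 = 1
i = 1                  # initialization
while i <= 5:          # condition
    f2 *= i
    i += 1             # advancement to the next iteration

print(factorial, f2)  # 120 120
```

Both forms compute the same value; the for-loop simply folds the initialization, test, and advancement into the header.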
https://en.wikipedia.org/wiki/PETSCII
PETSCII (PET Standard Code of Information Interchange), also known as CBM ASCII, is the character set used in Commodore Business Machines' 8-bit home computers, starting with the PET from 1977 and including the CBM-II, VIC-20, Commodore 64, Commodore 16, Commodore 116, Plus/4, and Commodore 128. History The character set was largely designed by Leonard Tramiel (the son of Commodore CEO Jack Tramiel) and PET designer Chuck Peddle. The graphic characters of PETSCII were one of the extensions Commodore specified for Commodore BASIC when laying out desired changes to Microsoft's existing 6502 BASIC to Microsoft's Ric Weiland in 1977. The VIC-20 used the same pixel-for-pixel font as the PET, although the characters appeared wider due to the VIC's 22-column screen. The Commodore 64, however, used a slightly re-designed, heavy upper-case font, essentially a thicker version of the PET's, in order to avoid color artifacts created by the machine's higher resolution screen. The C64's lowercase characters are identical to the lowercase characters in the Atari 800's system font (released several years earlier). Peddle claims the inclusion of card suit symbols was spurred by the demand that it should be easy to write card games on the PET (as part of the specification list he received). Specifications "Unshifted" PETSCII is based on the 1963 version of ASCII (rather than the 1967 version, which most if not all other computer character sets based on ASCII use). It has only uppercase letters, an up-arrow instead of caret at $5E and a left-arrow instead of an underscore at $5F, and in the VIC-20 and C64 version, a British pound sign instead of the backslash at $5C. The other characters added in ASCII-1967 (lowercase letters, the grave accent, curly braces, vertical bar, and tilde) do not exist in PETSCII. Codes $60–$7F and $A0–$BF are allotted to CBM-specific block graphics characters (horizontal and vertical lines, hatches, shades, triangles, circles and card suits). 
PETSCII also has a "shifted" or "text mode", which changes the uppercase letters at $41–$5A to lowercase, and changes the graphics at $61–$7A to uppercase letters. Upper- and lower-case are swapped from where ASCII has them. The mode is toggled by holding one of the SHIFT keys and then pressing and releasing the Commodore key. The shift can be done by POKEing location 59468 with the value 14 to select the alternative set or 12 to revert to standard. On the C64 the sets are alternated by flipping bit 2 of the byte at 53272. On some models of PET this can also be achieved via the special control code PRINT CHR$(14), which adjusts the line spacing as well as changing the character set; the POKE method is still available and does not alter the line spacing. Included in PETSCII are cursor and screen control codes, such as {HOME}, {CLR}, {RVS ON}, and {RVS OFF} (the latter two activating/deactivating reverse-video character display). The control codes appeared in program listings as reverse-video graphic char
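The upper/lower-case swap relative to ASCII can be sketched as a conversion routine. This is an assumed mapping for shifted ("text") mode based on the code positions given above (lowercase letters at $41–$5A, uppercase at $C1–$DA); codes outside the letter ranges are passed through untouched, which is only approximately right for punctuation:

```python
def ascii_to_petscii_shifted(text):
    """Map printable ASCII to shifted-mode PETSCII codes (sketch).
    Upper- and lower-case are swapped relative to ASCII."""
    out = bytearray()
    for ch in text:
        c = ord(ch)
        if 0x61 <= c <= 0x7A:        # ASCII a-z -> PETSCII $41-$5A
            out.append(c - 0x20)
        elif 0x41 <= c <= 0x5A:      # ASCII A-Z -> PETSCII $C1-$DA
            out.append(c + 0x80)
        else:
            out.append(c)            # digits etc. mostly line up
    return bytes(out)
```

For example, the ASCII string "Hi" becomes the byte pair $C8 $49 in shifted-mode PETSCII.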
https://en.wikipedia.org/wiki/Game%20Developers%20Conference
The Game Developers Conference (GDC) is an annual conference for video game developers. The event includes an expo, networking events, and awards shows like the Game Developers Choice Awards and Independent Games Festival, and a variety of tutorials, lectures, and roundtables by industry professionals on game-related topics covering programming, design, audio, production, business and management, and visual arts. History Originally called the Computer Game Developers Conference, the first conference was organized in April 1988 by Chris Crawford in his San Jose, California-area living room. About twenty-seven designers attended, including Don Daglow, Brenda Laurel, Brian Moriarty, Gordon Walton, Tim Brengle, Cliff Johnson, Dave Menconi, and Carol and Ivan Manley. The second conference, held that same year at a Holiday Inn at Milpitas, attracted about 125 developers. Early conference directors included Brenda Laurel, Tim Brengle, Sara Reeder, Dave Menconi, Jeff Johannigman, Stephen Friedman, Chris Crawford, and Stephanie Barrett. Later directors include John Powers, Nicky Robinson, Anne Westfall, Susan Lee-Merrow, and Ernest W. Adams. In the early years the conference changed venue each year to accommodate its increases in size. Attendance in this period grew from 525 to 2,387. By 1994 the CGDC could afford to sponsor the creation of the Computer Game Developers Association with Adams as its founding director. Miller Freeman, Inc. took on the running of the conference in 1996, nearly doubling attendance to 4,000 that year. In 2005, the GDC moved to the new Moscone Center West, in the heart of San Francisco's SOMA district, and reported over 12,000 attendees. The GDC returned to San Jose in 2006, reporting over 12,500 attendees, and moved to San Francisco in 2007 – where the organizers expect it will stay for the foreseeable future. Attendance figures continued to rise in following years, with 18,000 attendees in the 2008 event. 
The 2009 Game Developers Conference was held in San Francisco, on March 23–27, 2009. The IGDA awarded 25 scholarships to send qualified students to attend the 2009 GDC. Crawford continued to give the conference keynote address for the first several years of the conference, including one in the early 1990s where he punctuated a point about game tuning and player involvement by cracking a bullwhip perilously close to the front row of the audience. Crawford also founded The Journal of Computer Game Design in 1987 in parallel to beginning the GDC, and served as publisher and editor of the academic-style journal through 1996. During the late 1990s, the conference expanded from its original strict focus on game design to include topics such as marketing and legal issues. The CGDC changed its name to "Game Developers Conference" in 1999. The GDC has also hosted the Spotlight Awards from 1997 to 1999, the Independent Games Festival since 1999 and the Game Developers Choice Awards since 2001. The GDC is also used for the annual
https://en.wikipedia.org/wiki/Hidden-surface%20determination
In 3D computer graphics, hidden-surface determination (also known as shown-surface determination, hidden-surface removal (HSR), occlusion culling (OC) or visible-surface determination (VSD)) is the process of identifying what surfaces and parts of surfaces can be seen from a particular viewing angle. A hidden-surface determination algorithm is a solution to the visibility problem, which was one of the first major problems in the field of 3D computer graphics. The process of hidden-surface determination is sometimes called hiding, and such an algorithm is sometimes called a hider. When referring to line rendering it is known as hidden-line removal. Hidden-surface determination is necessary to render a scene correctly, so that features hidden behind the model itself are not drawn and only the naturally visible portions of the graphic appear. Background Hidden-surface determination is a process by which surfaces that should not be visible to the user (for example, because they lie behind opaque objects such as walls) are prevented from being rendered. Despite advances in hardware capability, there is still a need for advanced rendering algorithms. The responsibility of a rendering engine is to allow for large world spaces, and as the world's size approaches infinity, the engine should not slow down but remain at a constant speed. Optimizing this process relies on being able to ensure the deployment of as few resources as possible towards the rendering of surfaces that will not end up being displayed to the user. There are many techniques for hidden-surface determination. They are fundamentally an exercise in sorting and usually vary in the order in which the sort is performed and how the problem is subdivided. Sorting large quantities of graphics primitives is usually done by divide and conquer. 
Algorithms Considering the rendering pipeline, the projection, the clipping, and the rasterization steps are handled differently by the following algorithms: Z-buffering During rasterization, the depth/Z value of each pixel (or sample in the case of anti-aliasing, but without loss of generality the term pixel is used) is checked against an existing depth value. If the current pixel is behind the pixel in the Z-buffer, the pixel is rejected; otherwise, it is shaded and its depth value replaces the one in the Z-buffer. Z-buffering supports dynamic scenes easily and is currently implemented efficiently in graphics hardware. This is the current standard. The cost of using Z-buffering is that it uses up to 4 bytes per pixel and that the rasterization algorithm needs to check each rasterized sample against the Z-buffer. The Z-buffer can also suffer from artifacts due to precision errors (also known as Z-fighting). Coverage buffers (C-buffer) and surface buffers (S-buffer) Faster than Z-buffers and commonly used in games in the Quake I era. Instead of storing the Z value per pixel, they store a list of already displayed segments per line of the s
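The Z-buffer test described above amounts to one comparison per rasterized sample. A minimal software sketch (illustrative only; real GPUs store the buffer in hardware and interleave the test with shading):

```python
import math

def rasterize(fragments, width, height):
    """Minimal Z-buffer: keep the nearest fragment per pixel.
    fragments: iterable of (x, y, depth, color); smaller depth = closer."""
    zbuf = [[math.inf] * width for _ in range(height)]   # depth per pixel
    frame = [[None] * width for _ in range(height)]      # color per pixel
    for x, y, depth, color in fragments:
        if depth < zbuf[y][x]:      # depth test: in front of what's stored?
            zbuf[y][x] = depth      # replace stored depth
            frame[y][x] = color     # and shade the pixel
        # else: fragment is behind the stored surface and is rejected
    return frame
```

Submitting two fragments for the same pixel in either order leaves only the nearer one visible, which is why Z-buffering handles dynamic scenes without any pre-sorting of primitives.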
https://en.wikipedia.org/wiki/Spooling
In computing, spooling is a specialized form of multi-programming for the purpose of copying data between different devices. In contemporary systems, it is usually used for mediating between a computer application and a slow peripheral, such as a printer. Spooling allows programs to "hand off" work to be done by the peripheral and then proceed to other tasks, or to not begin until input has been transcribed. A dedicated program, the spooler, maintains an orderly sequence of jobs for the peripheral and feeds it data at its own rate. Conversely, for slow input peripherals, such as a card reader, a spooler can maintain a sequence of computational jobs waiting for data, starting each job when all of the relevant input is available; see batch processing. The spool itself refers to the sequence of jobs, or the storage area where they are held. In many cases, the spooler is able to drive devices at their full rated speed with minimal impact on other processing. Spooling is a combination of buffering and queueing. Print spooling Nowadays, the most common use of spooling is printing: documents formatted for printing are stored in a queue at the speed of the computer, then retrieved and printed at the speed of the printer. Multiple processes can write documents to the spool without waiting, and can then perform other tasks, while the "spooler" process operates the printer. For example, when a large organization prepares payroll cheques, the computation takes only a few minutes or even seconds, but the printing process might take hours. If the payroll program printed cheques directly, it would be unable to proceed to other computations until all the cheques were printed. Similarly, before spooling was added to PC operating systems, word processors were unable to do anything else, including interact with the user, while printing. 
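The hand-off pattern (applications enqueue jobs and continue, while a dedicated spooler feeds the slow device at its own rate) is a producer/consumer queue. A toy sketch in Python, with a sleep standing in for the slow peripheral:

```python
import queue
import threading
import time

def spooler_demo():
    spool = queue.Queue()            # the "spool": an ordered job queue
    printed = []

    def spooler():                   # dedicated thread driving the device
        while True:
            job = spool.get()
            if job is None:          # sentinel: no more jobs
                break
            time.sleep(0.01)         # slow peripheral working at its own rate
            printed.append(job)

    t = threading.Thread(target=spooler)
    t.start()
    for doc in ["payroll-1", "payroll-2", "report"]:
        spool.put(doc)               # applications hand off work and move on
    spool.put(None)
    t.join()
    return printed

print(spooler_demo())  # jobs come out in submission order
```

The submitting loop returns immediately after enqueueing; only the spooler thread waits on the device, which is exactly the decoupling the payroll example describes.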
Spooler or print management software often includes a variety of related features, such as allowing priorities to be assigned to print jobs, notifying users when their documents have been printed, distributing print jobs among several printers, selecting appropriate paper for each document, etc. A print server applies spooling techniques to allow many computers to share the same printer or group of printers. Banner page Print spoolers can be configured to add a banner page, also called a burst page, job sheet, or printer separator, to the beginning and end of each document and job. These separate documents from each other, identify each document (e.g. with its title) and often also state who printed it (e.g. by username or job name). Banner pages are valuable in office environments where many people share a small number of printers. They are also valuable when a single job can produce multiple documents. Depending on the configuration, banner pages might be generated on each client computer, on a centralized print server, or by the printer itself. On printers using fanfold continuous forms a leading banner page wo
https://en.wikipedia.org/wiki/QuikAir
QuikAir (QuikAir Airline Service) was a small Canadian regional airline based in Calgary, Alberta, serving business travellers. QuikAir ceased its operations on October 24, 2006. Code data IATA Code: Q9 Services QuikAir was launched in 2001, operating over 25,000 flights between Calgary and Edmonton until its closure in 2006. The airline operated four aircraft dedicated to providing 24 daily flights between Calgary and Edmonton, while also operating flights to Fort McMurray and Penticton, British Columbia. Until 2004, it also operated in Quebec and Ontario. Private charters were also available. Fleet As of August 2006, the QuikAir fleet included: 2 BAe Jetstream 31 2 BAe Jetstream 32 Failure The demise of QuikAir was caused by several different factors: The scheduled service into Edmonton was restricted to the International Airport only, under the Access Agreement which prevented scheduled flights from serving the City Center Airport. The International Airport, being located in Leduc and not actually Edmonton (18 km from Edmonton, 36 km from Edmonton's downtown), forced many regular business travellers to find other transportation into the downtown core. The proposed scheduled service into Penticton met with resistance from a Transport Canada representative, who, unsatisfied with the Jetstream's single-engine performance, decided that, in the event of an engine failure, the aircraft could not safely climb back out of the valley in which Penticton lies. The approach procedures went through multiple revisions, each of which had to be reviewed and approved. As the Penticton route debuted with much fanfare, many flights were pre-sold to customers. Not wanting to lose their customer base, QuikAir chartered several different airlines to carry their passengers for the scheduled flights while the approach procedures were being worked out. 
Often there were communication problems between QuikAir, these contracted airlines, and the passengers, which led to customer dissatisfaction. QuikAir ceased operations as of October 24, 2006, leaving many passengers to make new arrangements and fight for refunds, and several employees without their last weeks of wages. Bankruptcy was declared November 16, 2007. The Calgary-Edmonton corridor routes were picked up by Peace Air, which also ran scheduled flights to Fort McMurray. (Peace Air has since ceased operations as of May 18, 2007.) See also List of defunct airlines of Canada References External links Edmonton Journal - QuikAir's failure no surprise Calgary Herald - QuikAir folds operations Defunct airlines of Canada Airlines established in 2001 Airlines disestablished in 2006 Companies based in Calgary 2001 establishments in Alberta 2006 disestablishments in Alberta
https://en.wikipedia.org/wiki/Print%20job
In computing, a print job is a file or set of files that has been submitted to be printed with a printer. Jobs are typically identified by a unique number, and are assigned to a particular destination, usually a printer. Jobs can also have options associated with them such as media size, number of copies and priority. A print job is a single queueable print system object that represents a document that needs to be rendered and transferred to a printer. Print jobs are created on specific print queues and cannot be transferred between print queues. Components Job Id: Uniquely identifies the print job for the given print queue. Spool file: The on-disk representation of the job's data. Shadow file: The on-disk representation of the job configuration. Status: This falls into three parts: Spooling: Indicates that the printing application is still writing the job. Printing: Indicates that the spool file is being read by the print processor. Printed: Indicates that the job has been fully written to the port. Data Type: Identifies the format of the data in the spool file, such as EMF or RAW. Other configuration: Name, set of named properties, etc. Route In larger environments, print jobs may go through a centralized print server, before reaching the printing destination. Some (multifunction) printers have local storage (like a hard disk drive) to process and queue the jobs before printing. Security When getting rid of old printers with local storage, one should keep in mind that confidential print jobs (documents) are potentially still locally unencrypted on the hard disk drive and can be undeleted. See also print (command) References Computer printing
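The components listed above can be modelled as a simple record. The field names here are illustrative, not a real spooler API:

```python
from dataclasses import dataclass, field

@dataclass
class PrintJob:
    """Hypothetical model of the print-job components described above."""
    job_id: int                    # unique within its print queue
    spool_file: str                # on-disk representation of the data
    shadow_file: str               # on-disk representation of the configuration
    data_type: str = "RAW"         # format of the spool file, e.g. EMF or RAW
    status: str = "spooling"       # spooling -> printing -> printed
    options: dict = field(default_factory=dict)  # media size, copies, priority

job = PrintJob(1, "00001.SPL", "00001.SHD", options={"copies": 2})
print(job.status)  # spooling
```

A real spooler would advance `status` as the print processor reads the spool file and the data is written to the port.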
https://en.wikipedia.org/wiki/Printer%20driver
In computers, a printer driver or a print processor is a piece of software on a computer that converts the data to be printed to a format that a printer can understand. The purpose of printer drivers is to allow applications to do printing without being aware of the technical details of each printer model. Printer drivers should not be confused with print spoolers, which queue print jobs and send them successively to a printer. Printer drivers in different operating systems Unix and Unix-like Unix and other Unix-like systems such as Linux and OS X use CUPS (short for Common Unix Printing System), a modular printing system for Unix-like computer operating systems, which allows a computer to act as a print server. A computer running CUPS is a host that can accept print jobs from client computers, process them, and send them to the appropriate printer. Printer drivers are typically implemented as filters. They are usually considered the front end of the printing system, while the printer spoolers constitute the back end. Backends are also used to determine the available devices. On startup, each backend is asked for a list of devices it supports, and any information that is available. DOS DOS supports predefined character devices PRN:, LPT1:, LPT2: and LPT3: associated with parallel printers supported in the system. Similarly, serial printers can be used with AUX:, COM1:, COM2:, COM3: and COM4:. Users can use commands such as "COPY file1 LPT1:" to print the content of a file to a printer. The contents are transferred to the printer without any interpretation. Therefore, this method of printing is either for files already stored in the corresponding printer's language or for generic text files without more than simple line-oriented formatting. DOS also provides a dynamically loadable print spooler named PRINT, as well as optional support for screen captures, including graphics mode, through GRAPHICS. 
If the optional character device driver PRINTER.SYS is loaded, DOS provides its code page switching support also for the associated printers—different types of dot matrix and ink jet printers are supported by default. Beyond this, there are no system-wide printer-specific drivers for use at application level under MS-DOS/PC DOS. Under DR-DOS, however, the SCRIPT command can be loaded to run in the background in order to intercept and convert printer output from applications into PostScript to support PS-capable printers also by applications not supporting them directly. In order to support more complex printing for different models of printers, each application (e.g. a word processor) may be shipped with its own printer drivers, which were essentially descriptions of printer escape sequences. Printers, too, have been supplied with drivers for the most popular applications. In addition, applications could include tools for editing printer descriptions in case there was no ready-made driver. In the days when DOS was widely used
https://en.wikipedia.org/wiki/SSLIOP
In distributed computing, SSLIOP is an Internet Inter-ORB Protocol (IIOP) over Secure Sockets Layer (SSL), providing confidentiality and authentication. SSLIOP is implemented by (at least) TAO, JacORB, OpenORB, and MICO. See also CSIv2 SECIOP Common Object Request Broker Architecture
https://en.wikipedia.org/wiki/General%20Inter-ORB%20Protocol
In distributed computing, General Inter-ORB Protocol (GIOP) is the message protocol by which object request brokers (ORBs) communicate in CORBA. Standards associated with the protocol are maintained by the Object Management Group (OMG). The current version of GIOP is 2.0.2. The GIOP architecture provides several concrete protocols, including: Internet InterORB Protocol (IIOP) — The Internet Inter-Orb Protocol is an implementation of the GIOP for use over the Internet, and provides a mapping between GIOP messages and the TCP/IP layer. SSL InterORB Protocol (SSLIOP) — SSLIOP is IIOP over SSL, providing encryption and authentication. HyperText InterORB Protocol (HTIOP) — HTIOP is IIOP over HTTP, providing transparent proxy bypassing. Zipped InterORB Protocol (ZIOP) — A zipped version of GIOP that reduces bandwidth usage. Environment Specific Inter-ORB Protocols As an alternative to GIOP, CORBA includes the concept of an Environment Specific Inter-ORB Protocol (ESIOP). While GIOP is defined to meet general-purpose needs of most CORBA implementations, an ESIOP attempts to address special requirements. For example, an ESIOP might use an alternative protocol encoding to improve efficiency over networks with limited bandwidth or high latency. ESIOPs can also be used to layer CORBA on top of some non-CORBA technology stack, such as Distributed Computing Environment (DCE). DCE Common Inter-ORB Protocol (DCE-CIOP) is an ESIOP for use in DCE. It maps CORBA to DCE RPC and CDR (Common Data Representation). DCE-CIOP is defined in chapter 16 of the CORBA 2.6.1 standard. 
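Every GIOP message begins with a fixed 12-byte header: the magic bytes "GIOP", the protocol version, a flags byte whose low bit gives the byte order of the body, a message-type byte (0 = Request, 1 = Reply, and so on), and the body size. The field layout follows the GIOP specification; the helper function itself is a hypothetical sketch:

```python
import struct

def giop_header(msg_type, body_size, little_endian=True, version=(1, 2)):
    """Pack a 12-byte GIOP message header (sketch).
    Layout: 4-byte magic, major/minor version, flags, message type, size."""
    flags = 0x01 if little_endian else 0x00   # bit 0: byte order of the body
    endian = "<" if little_endian else ">"
    return struct.pack(endian + "4sBBBBI", b"GIOP",
                       version[0], version[1], flags, msg_type, body_size)

hdr = giop_header(msg_type=0, body_size=64)   # a Request header
print(len(hdr), hdr[:4])  # 12 b'GIOP'
```

Concrete protocols such as IIOP carry exactly these messages over their respective transports; only the framing below the header differs.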
Further reading See also DIIOP References Distributed computing
https://en.wikipedia.org/wiki/CSIv2
In distributed computing, CSIv2 (Common Secure Interoperability Protocol Version 2) is a protocol implementing security features for inter-ORB communication. It intends, in part, to address limitations of SSLIOP. CSIv2 also facilitates secure EJB-CORBA interoperability. External links OMG Website CSIv2 Specification (OMG website) Common Object Request Broker Architecture
https://en.wikipedia.org/wiki/SECIOP
In distributed computing, SECIOP (SECure Inter-ORB Protocol) is a protocol for secure inter-ORB communication. References Inter-process communication
https://en.wikipedia.org/wiki/Instantaneously%20trained%20neural%20networks
Instantaneously trained neural networks are feedforward artificial neural networks that create a new hidden neuron node for each novel training sample. The weights to this hidden neuron separate out not only this training sample but others that are near it, thus providing generalization. This separation is done using the nearest hyperplane that can be written down instantaneously. In the two most important implementations the neighborhood of generalization either varies with the training sample (CC1 network) or remains constant (CC4 network). These networks use unary coding for an effective representation of the data sets. This type of network was first proposed in a 1993 paper of Subhash Kak. Since then, instantaneously trained neural networks have been proposed as models of short term learning and used in web search, and financial time series prediction applications. They have also been used in instant classification of documents and for deep learning and data mining. As in other neural networks, their normal use is as software, but they have also been implemented in hardware using FPGAs and by optical implementation. CC4 network In the CC4 network, which is a three-stage network, the number of input nodes is one more than the size of the training vector, with the extra node serving as the biasing node whose input is always 1. For binary input vectors, the weights from the input nodes to the hidden neuron (say of index j) corresponding to the trained vector are given by the following rule: the weight from input node i is +1 if the corresponding bit of the training vector is 1 and -1 if it is 0, while the weight from the biasing node is r - s + 1, where r is the radius of generalization and s is the Hamming weight (the number of 1s) of the binary sequence. From the hidden layer to the output layer the weights are 1 or -1 depending on whether the vector belongs to a given output class or not. 
The neurons in the hidden and output layers output 1 if the weighted sum of their inputs is 0 or positive, and output 0 if the weighted sum is negative: y = 1 if Σ_i w_i x_i ≥ 0 and y = 0 otherwise. Other networks The CC4 network has also been modified to include non-binary input with varying radii of generalization so that it effectively provides a CC1 implementation. In feedback networks the Willshaw network as well as the Hopfield network are able to learn instantaneously.
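Under these conventions the CC4 construction can be sketched in a few lines (an illustrative sketch, not from the source; the function names and the use of NumPy are assumptions, and the ≥ 0 threshold follows the text above):

```python
import numpy as np

def cc4_train(samples, labels, r=0):
    """Build CC4 weights: one hidden neuron per binary training vector."""
    W_in, W_out = [], []
    for x, y in zip(samples, labels):
        s = sum(x)                              # Hamming weight of the sample
        w = [1 if bit else -1 for bit in x]     # +1 for 1-bits, -1 for 0-bits
        w.append(r - s + 1)                     # weight from the biasing node
        W_in.append(w)
        W_out.append(1 if y else -1)            # +1/-1 for class membership
    return np.array(W_in).T, np.array(W_out)

def cc4_classify(x, W_in, W_out):
    # Hidden and output units are binary step neurons (fire when sum >= 0);
    # the appended 1 is the always-on biasing input.
    h = (np.append(x, 1) @ W_in >= 0).astype(int)
    return int(h @ W_out >= 0)
```

Training is "instantaneous" because the weights are written down directly from each sample, with no iterative optimization.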
https://en.wikipedia.org/wiki/Lists%20of%20etymologies
This is a list of etymological lists. General List of company name etymologies List of computer term etymologies List of band name etymologies List of chemical element name etymologies English word origins Non-loanwords Proto-Indo-European — Proto-Germanic — Anglo-Saxon How words have been loaned from various languages to (many) other languages: Australian Aboriginal — African — Afrikaans — Algonquian — Arabic — Bengali — Chinese — Czech — Dutch — Etruscan — French — German — Greek — Hawaiian — Hebrew — Hindi — Hungarian — Irish — Italian — Japanese — Korean — Latin — Malay — Malayalam — Maori — Nahuatl — Norwegian — Old Norse — Persian — Polish — Portuguese — Punjabi — Quechua — Russian — Sanskrit — Scots — Scottish Gaelic — Spanish — Swedish — Tamil — Turkic — Ukrainian —Urdu — Yiddish Lists of foreign words with English derivatives Greek — Latin See: Medical terminology Spanish word origins African — Americas — Arabic — Austronesian — Basque/Iberian — Celtic — Chinese — Etruscan — French — Germanic — Greek — Indo-Aryan — Iranian — Italic — Latin — Semitic — Turkic — uncertain — various Romanian word origins Dacian Toponymy or placename etymology List of country-name etymologies British — UK counties — German — India — Irish — Romanian counties — Bulgarian provinces — Brazilian States — U.S. 
States — Filipino Provinces List of etymologies of country subdivision names List of national capital city name etymologies List of river name etymologies List of Australian place names of Aboriginal origin List of place names in Canada of aboriginal origin List of indigenous names of Eastern Caribbean islands Origins of names of cities and towns in Hong Kong Lists of North American place name etymologies List of place names of French origin in the United States List of place names of Spanish origin in the United States List of place names in the United States of Native American origin List of Chinook Jargon placenames Sri Lankan place name etymology Toponyms or names derived from places List of words derived from toponyms Chemical elements named after places List of inventions named after places Maghreb toponymy Eponyms (names derived from people) Astronomical objects named after people Cartoon characters named after people Chemical elements named after people Colleges and universities named after people Companies named after people Countries named after people Diseases named after people English adjectives named after people Foods named after people Human anatomical parts named after people Ideologies named after people Inventions named after people Minerals named after people Places and political entities named after people Prizes named after people Scientific constants named after people Scientific laws named after people Scientific phenomena named after people Scientific units named after people Sports terms named after people Names derived from animals and animal eponyms
https://en.wikipedia.org/wiki/Expectation%E2%80%93maximization%20algorithm
In statistics, an expectation–maximization (EM) algorithm is an iterative method to find (local) maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. The EM iteration alternates between performing an expectation (E) step, which creates a function for the expectation of the log-likelihood evaluated using the current estimate for the parameters, and a maximization (M) step, which computes parameters maximizing the expected log-likelihood found on the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step. History The EM algorithm was explained and given its name in a classic 1977 paper by Arthur Dempster, Nan Laird, and Donald Rubin. They pointed out that the method had been "proposed many times in special circumstances" by earlier authors. One of the earliest is the gene-counting method for estimating allele frequencies by Cedric Smith. Another was proposed by H.O. Hartley in 1958, and Hartley and Hocking in 1977, from which many of the ideas in the Dempster–Laird–Rubin paper originated. Another was given by S. K. Ng, Thriyambakam Krishnan and G. J. McLachlan in 1977. Hartley's ideas can be broadened to any grouped discrete distribution. A very detailed treatment of the EM method for exponential families was published by Rolf Sundberg in his thesis and several papers, following his collaboration with Per Martin-Löf and Anders Martin-Löf. The Dempster–Laird–Rubin paper in 1977 generalized the method and sketched a convergence analysis for a wider class of problems. The Dempster–Laird–Rubin paper established the EM method as an important tool of statistical analysis. See also Meng and van Dyk (1997). The convergence analysis of the Dempster–Laird–Rubin algorithm was flawed and a correct convergence analysis was published by C. F. Jeff Wu in 1983.
Wu's proof established the EM method's convergence also outside of the exponential family, as claimed by Dempster–Laird–Rubin. Introduction The EM algorithm is used to find (local) maximum likelihood parameters of a statistical model in cases where the equations cannot be solved directly. Typically these models involve latent variables in addition to unknown parameters and known data observations. That is, either missing values exist among the data, or the model can be formulated more simply by assuming the existence of further unobserved data points. For example, a mixture model can be described more simply by assuming that each observed data point has a corresponding unobserved data point, or latent variable, specifying the mixture component to which each data point belongs. Finding a maximum likelihood solution typically requires taking the derivatives of the likelihood function with respect to all the unknown values, the parameters and the latent variables, and simultaneously solving the resulting equations. In statistical models with latent
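The alternation of E and M steps described above can be made concrete with a minimal sketch (illustrative code, not from the source; the function name and initialization scheme are assumptions) for a two-component one-dimensional Gaussian mixture, where the latent variable is the unobserved component that generated each point:

```python
import math

def em_gmm_1d(data, iters=30):
    """EM for a two-component 1-D Gaussian mixture."""
    mu = [min(data), max(data)]   # crude initial means
    var = [1.0, 1.0]              # initial variances
    pi = [0.5, 0.5]               # initial mixing weights
    for _ in range(iters):
        # E step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            total = p[0] + p[1]
            resp.append([p[0] / total, p[1] / total])
        # M step: re-estimate parameters to maximize the expected
        # log-likelihood under the responsibilities just computed
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-9
    return mu, var, pi
```

Each iteration is guaranteed not to decrease the observed-data likelihood, which is the property Wu's analysis establishes.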
https://en.wikipedia.org/wiki/Advanced%20Spaceborne%20Thermal%20Emission%20and%20Reflection%20Radiometer
The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) is a Japanese remote sensing instrument onboard the Terra satellite launched by NASA in 1999. It has been collecting data since February 2000. ASTER provides high-resolution images of Earth in 14 different bands of the electromagnetic spectrum, ranging from visible to thermal infrared light. The resolution of images ranges between 15 and 90 meters. ASTER data is used to create detailed maps of surface temperature of land, emissivity, reflectance, and elevation. In April 2008, the SWIR detectors of ASTER began malfunctioning and were publicly declared non-operational by NASA in January 2009. All SWIR data collected after 1 April 2008 has been marked as unusable. The ASTER Global Digital Elevation Model (GDEM) is available at no charge to users worldwide via electronic download. As of 2 April 2016, the entire catalogue of ASTER image data became publicly available online at no cost. It can be downloaded with a free registered account from either NASA's Earth Data Search delivery system or from the USGS Earth Explorer delivery system. ASTER bands ASTER Global Digital Elevation Model Version 1 On 29 June 2009, the Global Digital Elevation Model (GDEM) was released to the public. A joint operation between NASA and Japan's Ministry of Economy, Trade and Industry (METI), the Global Digital Elevation Model is the most complete mapping of the earth ever made, covering 99% of its surface. The previous most comprehensive map, NASA's Shuttle Radar Topography Mission, covered approximately 80% of the Earth's surface, with a global resolution of 90 meters, and a resolution of 30 meters over the USA. The GDEM covers the planet from 83 degrees North to 83 degrees South (surpassing SRTM's coverage of 56 °S to 60 °N), becoming the first earth mapping system that provides comprehensive coverage of the polar regions. 
It was created by compiling 1.3 million VNIR images taken by ASTER using single-pass stereoscopic correlation techniques, with terrain elevation measurements taken globally at 30-meter (98 ft) intervals. Despite the high nominal resolution, some reviewers have commented that the true resolution is considerably lower, not as good as that of SRTM data, and that serious artifacts are present. Some of these limitations have been confirmed by METI and NASA, who point out that version 1 of the GDEM product is "research grade". Version 2 In October 2011, version 2 of the Global Digital Elevation Model was publicly released. This is considered an improvement upon version 1. These improvements include increased horizontal and vertical accuracy, better horizontal resolution, reduced presence of artifacts, and more realistic values over water bodies. However, one reviewer, while noting 'a considerable improvement in the effective level of detail', still regards the ASTER version 2 dataset as 'experimental or research grade' due to presence of
https://en.wikipedia.org/wiki/Diane%20Sawyer
Lila Diane Sawyer (; born December 22, 1945) is an American television broadcast journalist known for anchoring major programs on two networks including ABC World News Tonight, Good Morning America, 20/20, and Primetime newsmagazine while at ABC News. During her tenure at CBS News she hosted CBS Morning and was the first woman correspondent on 60 Minutes. Prior to her journalism career, she was a member of U.S. President Richard Nixon's White House staff and assisted in his post-presidency memoirs. Presently she works for ABC News producing documentaries and interview specials. Early life Sawyer was born in Glasgow, Kentucky, to Jean W. (née Dunagan), an elementary school teacher, and Erbon Powers "Tom" Sawyer, a county judge. Her ancestry includes English, Irish, Scots-Irish, and German. She has an older sister, Linda. Soon after her birth, her family moved to Louisville, where her father rose to local prominence as a Republican politician and community leader. He was Kentucky's Jefferson County Judge/Executive when he was killed in a car accident on Louisville's Interstate 64 in 1969. E. P. "Tom" Sawyer State Park, in the Frey's Hill area of Louisville, is named in his honor. Sawyer attended Seneca High School in the Buechel area of Louisville. She served as an editor-in-chief for her school yearbook, The Arrow, and participated in many artistic activities. She always felt, however, that she was in the shadow of her sister, Linda. Insecure and something of a loner as a teen, Diane found happiness, she later said, going off by herself or with a group of friends that called themselves "reincarnated transcendentalists" and read Emerson and Thoreau down by a creek. In her senior year of high school in 1963, she won the annual America's Junior Miss scholarship pageant representing Kentucky. She won by her strength of poise in the final interview and her essay comparing the music of the North and the South during the Civil War. 
From 1963 to 1965, Sawyer toured the country as America's Junior Miss to promote the Coca-Cola Pavilion at the 1964–1965 New York World's Fair. She had dreaded travelling around the country as America's Junior Miss, but it taught her to think on her feet with poise and grace. Sawyer attended Wellesley College, graduating in 1967. Career Immediately after her graduation, Sawyer returned to Kentucky and was employed as a weather forecaster for WLKY-TV in Louisville. In Sawyer's opinion, the weather was boring, so she would occasionally add quotes to keep it interesting. Finally, Sawyer was promoted to a general-assignment post, but this did not sustain her interest for long. In 1970, Sawyer moved to Washington, D.C., and, unable to find work as a broadcast journalist, she interviewed for positions in government offices. She eventually became an assistant to Jerry Warren, the White House deputy press secretary. Initially, Sawyer wrote press releases and quickly graduated to other tasks like drafting some of President Richa
https://en.wikipedia.org/wiki/Pick%20operating%20system
The Pick Operating System, also known as the Pick System or simply Pick, is a demand-paged, multi-user, virtual memory, time-sharing computer operating system based around a MultiValue database. Pick is used primarily for business data processing. It is named after one of its developers, Dick Pick. The term "Pick system" has also come to be used as the general name of all operating environments which employ this multivalued database and have some implementation of Pick/BASIC and ENGLISH/Access queries. Although Pick started on a variety of minicomputers, the system and its various implementations eventually spread to a large assortment of microcomputers, personal computers, and mainframe computers. Overview The Pick Operating System consists of a database, dictionary, query language, procedural language (PROC), peripheral management, multi-user management, and a compiled BASIC Programming language. The database is a "hash-file" data management system. A hash-file system is a collection of dynamic associative arrays which are organized altogether and linked and controlled using associative files as a database management system. Being hash-file oriented, Pick provides efficiency in data access time. Originally, all data structures in Pick were hash-files (at the lowest level) meaning records are stored as associated couplets of a primary key to a set of values. Today a Pick system can also natively access host files in Windows or Unix in any format. A Pick database is divided into one or more accounts, master dictionaries, dictionaries, files, and sub-files, each of which is a hash-table oriented file. These files contain records made up of fields, sub-fields, and sub-sub-fields. In Pick, records are called items, fields are called attributes, and sub-fields are called values or sub-values (hence the present-day label "multivalued database"). 
All elements are variable-length, with field and values marked off by special delimiters, so that any file, record, or field may contain any number of entries of the lower level of entity. As a result, a Pick item (record) can be one complete entity (one entire invoice, purchase order, sales order, etc.), or is like a file on most conventional systems. Entities that are stored as "files" in other common-place systems (e.g. source programs and text documents) must be stored as records within files on Pick. The file hierarchy is roughly equivalent to the common Unix-like hierarchy of directories, sub-directories, and files. The master dictionary is similar to a directory in that it stores pointers to other dictionaries, files and executable programs. The master dictionary also contains the command-line language. All files (accounts, dictionaries, files, sub-files) are organized identically, as are all records. This uniformity is exploited throughout the system, both by system functions, and by the system administration commands. For example, the "find" command will find and report the occurrence of a word
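The nesting of attributes, values, and sub-values within a single item can be illustrated with a small sketch (illustrative code, not from the source; the record contents are hypothetical, though the delimiter code points 0xFE, 0xFD, and 0xFC are the attribute, value, and sub-value marks traditionally used by Pick systems):

```python
# Pick dynamic-array delimiters: attribute mark, value mark, sub-value mark
AM, VM, SVM = chr(0xFE), chr(0xFD), chr(0xFC)

def parse_item(item):
    """Split an item into attributes, each a list of values,
    each value a list of sub-values (all variable-length)."""
    return [[value.split(SVM) for value in attribute.split(VM)]
            for attribute in item.split(AM)]

# A hypothetical invoice: one item holding the whole entity
invoice = AM.join([
    "INV1001",                      # attribute 1: invoice number
    VM.join(["WIDGET", "GADGET"]),  # attribute 2: one value per line item
    VM.join(["2", "5"]),            # attribute 3: matching quantities
])
```

Because every level is delimiter-separated rather than fixed-width, an attribute can hold any number of values, which is what makes one item able to carry an entire invoice.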
https://en.wikipedia.org/wiki/Aqua%20%28user%20interface%29
Aqua is the graphical user interface, design language and visual theme of Apple's macOS operating system. It was originally based on the theme of water, with droplet-like components and a liberal use of reflection effects and translucency. Its goal is to "incorporate color, depth, translucence, and complex textures into a visually appealing interface" in macOS applications. At its introduction, Steve Jobs noted that "... it's liquid, one of the design goals was when you saw it you wanted to lick it". Aqua was first introduced at the 2000 Macworld Conference & Expo in San Francisco. Its first appearance in a commercial product was in the July 2000 release of iMovie 2, followed by Mac OS X 10.0 the following year. Aqua is the successor to Platinum, which was used in Mac OS 8, Mac OS 9, and developer releases of Rhapsody (including Mac OS X Server 1.2). The appearance of Aqua has changed frequently over the years, most recently and drastically with the release of macOS Big Sur in 2020, which Apple calls the "biggest design upgrade since the introduction of Mac OS X." Background For years, Apple had been trying and failing to produce a next-generation Mac OS operating system, including projects code-named Pink, Taligent, and Copland. Mac OS X was ultimately built on NeXTSTEP, after Apple purchased NeXT and its CEO, Steve Jobs, returned to Apple, the company he had cofounded. An early version of Mac OS X, called Rhapsody, was a developer release with an interim user interface, blending Mac OS 8's "Platinum" and OpenStep looks. The Rhapsody approach was ultimately abandoned, and the new operating system was dubbed Mac OS X in 1998. Early developer previews of Mac OS X shipped with an interface similar to Rhapsody, combining classic Mac OS and NeXTSTEP. The final operating system interface, Aqua, would be unveiled at Macworld Expo in January 2000. Design elements Aqua uses blue, white, and gray as the principal colors throughout its style.
Window toolbars, window backgrounds, buttons, menus and other interface elements are all found in one of these colors. For instance, toolbars and sidebars are often grey or metal-colored, window backgrounds and popup menus are white, and buttons (in older systems also scrollbar handles) are accented with a bright blue. In versions prior to OS X Yosemite, most controls have a "glass" or "gel" effect applied to them. David Pogue described this effect as "lickable globs of Crest Berrylicious Toothpaste Gel". macOS has few native customization options to change the overall look of the system. Users can choose a graphite appearance instead of the default blue one. When using the graphite appearance, controls have a slate-like, grey-blue or grey color, including the primary window controls which are red, yellow and green with the default appearance. The appearance option was added at the behest of developers and users who found the blue appearance garish or unprofessional. Yosemite added a dark mode that darke
https://en.wikipedia.org/wiki/Mark-8
The Mark-8 is a microcomputer design from 1974, based on the Intel 8008 CPU (which was the world's first 8-bit microprocessor). The Mark-8 was designed by Jonathan Titus, a Virginia Tech graduate student in chemistry. After building the machine, Titus decided to share its design with the community and reached out to Radio-Electronics and Popular Electronics. He was turned down by Popular Electronics, but Radio-Electronics was interested and announced the Mark-8 as a 'loose kit' in the July 1974 issue of Radio-Electronics magazine. Project kit The Mark-8 was introduced as a 'build it yourself' project in Radio-Electronics's July 1974 cover article, offering a US$5 booklet containing circuit board layouts and DIY construction project descriptions, with Titus himself arranging for $50 circuit board sets to be made by a New Jersey company for delivery to hobbyists. Prospective Mark-8 builders had to gather the various electronics parts themselves from various sources. A couple of thousand booklets and some one-hundred circuit board sets were eventually sold. The Mark-8 was introduced in R-E as "Your Personal Minicomputer" as the word 'microcomputer' was still far from being commonly used for microprocessor-based computers. In their announcement of their computer kit, the editors placed the Mark-8 in the same category as the era's other 'minisize' computers. As quoted by an Intel official publication, "The Mark-8 is known as one of the first computers for the home." Influences Although not very commercially successful, the Mark-8 prompted the editors of Popular Electronics magazine to consider publishing a similar but more easily accessible microcomputer project, and just six months later, in January 1975, they went through with their plans, announcing the Altair 8800.
According to a 1998 Virginia Tech University article, Titus' Mark-8 microcomputer now resides in the Smithsonian Institution's "Information Age" display. See also Microcomputer Minicomputer SCELBI MCM/70 Micral External links Mark-8 Minicomputer – an original Mark-8, restored to working condition A Mark-8 Experience – Terry Ritter's detailed memoir of building and running a Mark-8 in 1974. Collection of old analog and digital computers at www.oldcomputermuseum.com Jonathan A. Titus, Microcomputer Pioneer A look at 5 very different MARK-8 computers Titus and the Mark-8, Bit-by-Bit, a Haverford College publication
https://en.wikipedia.org/wiki/Marker%20interface%20pattern
The marker interface pattern is a design pattern in computer science, used with languages that provide run-time type information about objects. It provides a means to associate metadata with a class where the language does not have explicit support for such metadata. To use this pattern, a class implements a marker interface (also called tagging interface) which is an empty interface, and methods that interact with instances of that class test for the existence of the interface. Whereas a typical interface specifies functionality (in the form of method declarations) that an implementing class must support, a marker interface need not do so. The mere presence of such an interface indicates specific behavior on the part of the implementing class. Hybrid interfaces, which both act as markers and specify required methods, are possible but may prove confusing if improperly used. Example An example of the application of marker interfaces from the Java programming language is the java.io.Serializable interface: package java.io; public interface Serializable { } A class implements this interface to indicate that its non-transient data members can be written to an ObjectOutputStream. The ObjectOutputStream private method writeObject0(Object,boolean) contains a series of instanceof tests to determine writeability, one of which looks for the Serializable interface. If any of these tests fails, the method throws a NotSerializableException.
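The same idea carries over to languages without explicit interfaces. A minimal Python sketch (illustrative only; the class and function names are hypothetical, and this is not Zope's actual marker-interface API) tags classes with an empty marker class and tests for it at run time, mirroring the instanceof checks described above:

```python
class Serializable:
    """Empty marker class: declares no methods, only a type-level tag."""
    pass

class Document(Serializable):
    """Opts in to serialization merely by inheriting the marker."""
    def __init__(self, text):
        self.text = text

class TempBuffer:
    """Not marked, so serialization must be refused."""
    pass

def save(obj):
    # Behavior depends solely on the marker's presence, the way
    # ObjectOutputStream's instanceof test gates writeObject0.
    if not isinstance(obj, Serializable):
        raise TypeError(f"{type(obj).__name__} is not marked Serializable")
    return repr(vars(obj))
```

Note that the marker is inherited: any subclass of Document is automatically accepted by save, which is exactly the property criticized in the next section.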
In the example given, if you create a subclass that you do not want to serialize (perhaps because it depends on transient state), you must resort to explicitly throwing NotSerializableException (per ObjectOutputStream docs). Another solution is for the language to support metadata directly: Both the .NET Framework and Java (as of Java 5 (1.5)) provide support for such metadata. In .NET, they are called "custom attributes", in Java they are called "annotations". Despite the different names, they are conceptually the same thing. They can be defined on classes, member variables, methods, and method parameters and may be accessed using reflection. In Python, the term "marker interface" is common in Zope and Plone. Interfaces are declared as metadata and subclasses can use implementsOnly to declare they do not implement everything from their super classes. See also Design markers for an expansion of this pattern. Further reading Effective Java by Joshua Bloch.
https://en.wikipedia.org/wiki/Dan%20Geer
Dan Geer is a computer security analyst and risk management specialist. He is recognized for raising awareness of critical computer and network security issues before the risks were widely understood, and for ground-breaking work on the economics of security. Career Geer is currently the chief information security officer for In-Q-Tel, a not-for-profit venture capital firm that invests in technology to support the Central Intelligence Agency. In 2003, Geer's 24-page report entitled "CyberInsecurity: The Cost of Monopoly" was released by the Computer and Communications Industry Association (CCIA). The paper argued that Microsoft's dominance of desktop computer operating systems is a threat to national security. Geer was fired (from consultancy @Stake) the day the report was made public. Geer has cited subsequent changes in the Vista operating system (notably a location-randomization feature) as evidence that Microsoft "accepted the paper." Geer received a Bachelor of Science in Electrical Engineering and Computer Science from MIT, where he was a member of the Theta Deuteron charge of Theta Delta Chi fraternity. He also received a Sc.D. in biostatistics from Harvard, and has worked for: Health Sciences Computing Facility, Harvard School of Public Health Project Athena, MIT Digital Equipment Corporation Geer Zolot & Associates OpenVision Technologies Open Market Certco @stake (acquired by Symantec in November 2004) Verdasys In 2011, Geer received the USENIX Lifetime Achievement Award. References External links Dan Geer's home publications page All Geered Up: An Interview With Dan Geer By Richard Thieme Letter to Massachusetts Senator Marc Pacheco on OpenDocument Standards by Dan Geer Oh Dan Geer, where art thou? 
by Ellen Messmer Security of Information When Economics Matters by Dan Geer (PDF format) The Shrinking Perimeter: Making the Case for Data-Level Risk Management by Dan Geer (PDF format) Dan Geer's Convergence Time based security and the convergence of both digital and physical security (PDF format) Dan Geer's April 23, 2007 Testimony to Subcommittee on Emerging Threats, Cybersecurity, and Science and Technology (PDF Format) Geer's nomination to the FTC Advisory Committee Geer's keynote speech at Black Hat USA 2014: Cybersecurity as Realpolitik; video of Geer's keynote
https://en.wikipedia.org/wiki/HFS
HFS may refer to: Computing Hierarchical file system, a system for organizing directories and files Hierarchical File System (Apple), a file system introduced in 1985 for the classic Mac OS Hierarchical File System (IBM MVS), a file system introduced in 1993 for MVS/ESA and subsequent operating systems Hi Performance FileSystem, a file system used by the HP-UX operating system HTTP File Server, a web server Hardware functionality scan, a security mechanism used in Microsoft Windows operating systems Education Haddonfield Friends School, in New Jersey, United States Harford Friends School, in Maryland, United States Hiranandani Foundation Schools, in India Science and mathematics Hereditarily finite set Hexafluorosilicic acid Hydrogen forward scattering Hyperfine structure Transportation French Frigate Shoals Airport, in Hawaii, United States Hagfors Airport, in Sweden Hatfield and Stainforth railway station, in England Other uses Croatian Film Association (Croatian: ) Hellenic Fire Service, in Greece Hemifacial spasm, a neurologic disorder High fructose syrup Hospitality Franchise Systems, later Cendant HFS Morgan, founder of Morgan Motor Company WHFS (historic), a former radio station in the Washington, D.C./Baltimore, Maryland See also High Performance File System, a file system for the OS/2 operating system
https://en.wikipedia.org/wiki/Alhurra
Alhurra ("the Free One") is a U.S. government-owned Arabic-language satellite TV channel that broadcasts news and current affairs programming to audiences in the Middle East and North Africa. Alhurra is funded by the U.S. government and is barred from broadcasting within the United States itself under the 1948 Smith-Mundt Act. Its stated mission is to provide "objective, accurate and relevant news and information" to its audience while seeking to "support democratic values" and "expand the spectrum of ideas, opinions, and perspectives" available in the region's media. The network has also tried to distinguish itself from its numerous regional competitors by providing access to more in-depth coverage of U.S. issues and policies and coverage of a broader range of opinions and perspectives than normally heard on other Arab television networks. Alhurra began broadcasting on 14 February 2004 to 22 countries across the Middle East and North Africa. It has established itself as the third highest-rated pan-Arab news channel, surpassing viewership ratings for the BBC (English and Arabic), France 24 Arabic, RT Arabic, CCTV, CNNi, and Sky Arabia. In April 2004, an additional channel called Alhurra-Iraq was launched, featuring most of the Alhurra content, with additional programming specifically directed at the Iraqi audience. It is also broadcast on satellite and is available on terrestrial antennas throughout Iraq, including in Basra and Baghdad. Alhurra-Iraq consistently achieves higher ratings in Iraq than both Al Jazeera and Al Arabiya. History The decision to launch Alhurra was prompted by frustration among U.S. government officials over perceived anti-American bias among the leading Arab television networks and the effect these channels were having on Arab public opinion regarding the U.S.
Alhurra was intended to serve as an alternative to these channels by presenting the news in a more "balanced and objective" manner in an effort to improve the image of the United States in the Arab world. The driving force behind the launch of Alhurra was Norman Pattiz, a media executive and founder and chairman of broadcast industry giant Westwood One. While serving as a member of the Broadcasting Board of Governors (BBG), currently the U.S. Agency for Global Media (USAGM), the U.S. federal agency that controls all foreign non-military radio and TV broadcasts, Pattiz advocated strongly for the creation of a U.S.-funded television network specifically directed at Arab audiences. Pattiz had also previously been responsible for the creation of Radio Sawa, a USAGM-administered Arabic-language radio network which broadcast a mix of music, entertainment, and news. The idea to launch Alhurra stemmed from the success that Radio Sawa had exhibited in reaching young audiences in the Middle East. Pattiz believed that Arab audiences' views of the United States were being negatively influenced by existing Arab news networks' focus on coverage of the wars in Iraq, A
https://en.wikipedia.org/wiki/Common%20Admission%20Test
The Common Admission Test (CAT) is a computer-based test for admission in graduate management programs. The test consists of three sections: Verbal Ability and Reading Comprehension, Data Interpretation and Logical Reasoning, and Quantitative Ability. The exam is taken online over a period of three hours, with one hour per section. In 2020, due to COVID-19 precautions, the Indian Institute of Management Indore decided to conduct the CAT exam in two hours, with 40 minutes devoted to each section. The Indian Institutes of Management started this exam and use the test for selecting students for their business administration programs (MBA or PGDM). The test is conducted every year by one of the Indian Institutes of Management based on a policy of rotation. In August 2011, it was announced that Indian Institutes of Technology (IITs) and Indian Institute of Science (IISc) would also use the CAT scores, instead of the Joint Management Entrance Test (JMET), to select students for their management programmes starting with the 2012-15 batch. Before 2009, CAT was a paper-based test conducted on a single day for all candidates. The pattern, number of questions and duration have seen considerable variations over the years. On 1 May 2009, it was announced that CAT would be a computer-based test starting from 2009. The American firm Prometric was entrusted with the responsibility of conducting the test from 2009 to 2013. The first computer-based CAT was marred by technical snags. The issue was so serious that it prompted the Government of India to seek a report from the convenor. The trouble was diagnosed as 'Conficker' and 'W32 Nimda', the two viruses that attacked the system display of the test, causing server slowdown. From 2014 onward, CAT has been conducted by Tata Consultancy Services (TCS).
CAT 2015 and CAT 2016 were 180-minute tests consisting of 100 questions (34 from Quantitative Ability, 34 from Verbal Ability and Reading Comprehension, and 32 from Data Interpretation and Logical Reasoning). From CAT 2020 onwards, the exam duration has been reduced to two hours, with 40 minutes allotted per section. Eligibility for CAT The candidate must satisfy the following criteria: Hold a bachelor's degree with at least 50% marks or equivalent CGPA (45% for Scheduled Caste (SC), Scheduled Tribe (ST) and Persons with Disability (PWD)/Differently Abled (DA) categories). The degree must be granted by a university incorporated by an act of a central or state legislature in India, by another educational institution established by an act of Parliament or declared to be a deemed university under Section 3 of the UGC Act, 1956, or be an equivalent qualification recognized by the Ministry of HRD, Government of India. Candidates appearing for the final year of a bachelor's degree/equivalent qualification examination, and those who have completed degree requirements and are awaiting results, can also apply. If selected, su
https://en.wikipedia.org/wiki/Large-file%20support
Large-file support (LFS) is the term frequently applied to the ability to create files larger than either 2 or 4 GiB on 32-bit filesystems. Details Traditionally, many operating systems and their underlying file system implementations used 32-bit integers to represent file sizes and positions. Consequently, no file could be larger than 2³² − 1 bytes (4 GiB − 1). In many implementations, the problem was exacerbated by treating the sizes as signed numbers, which further lowered the limit to 2³¹ − 1 bytes (2 GiB − 1). Files that were too large for 32-bit operating systems to handle came to be known as large files. While the limit was quite acceptable at a time when hard disks were smaller, the general increase in storage capacity combined with increased server and desktop file usage, especially for database and multimedia files, led to intense pressure for OS vendors to overcome the limitation. In 1996, multiple vendors responded by forming an industry initiative known as the Large File Summit (an obvious backronym of "LFS") to support large files on POSIX; at the time, Windows NT already supported large files on NTFS. The summit was tasked to define a standardized way to switch to 64-bit numbers to represent file sizes. This switch caused deployment issues and required design modifications, the consequences of which can still be seen: The change to 64-bit file sizes frequently required incompatible changes to file system layout, which meant that large-file support sometimes necessitated a file system change. For example, the FAT32 file system does not support files larger than 4 GiB − 1 (for older applications, even only 2 GiB − 1); the variant FAT32+ does support larger files (up to 256 GiB − 1), but (so far) is only supported in some versions of DR-DOS, so users of Microsoft Windows have to use NTFS or exFAT instead. 
To support binary compatibility with old applications, operating system interfaces had to retain their use of 32-bit file sizes, and new interfaces had to be designed specifically for large-file support. To support writing portable code that makes use of LFS where possible, C standard library authors devised mechanisms that, depending on preprocessor constants, transparently redefined the functions to their 64-bit large-file-aware counterparts. Many old interfaces, especially C-based ones, explicitly specified argument types in a way that did not allow straightforward or transparent transition to 64-bit types. For example, the C functions fseek and ftell operate on file positions of type long int, which is typically 32 bits wide on 32-bit platforms and cannot be made larger without sacrificing backward compatibility. (This was resolved by introducing the new functions fseeko and ftello in POSIX. On Windows machines, under Visual C++, the functions _fseeki64 and _ftelli64 are used.) Adoption The usage of the large-file API in 32-bit programs remained incomplete for a long time. A 2002 analysis showed that many base libraries of operating sys
https://en.wikipedia.org/wiki/List%20of%20islands%20of%20the%20British%20Isles
This article is a list of some of the islands that form the British Isles that have an area of one square kilometre (247 acres) or larger, listing area and population data. The total area of the islands is 314,965 km2 (121,608 sq mi). Great Britain accounts for the larger part of this area at 66%, with Ireland accounting for 26%, leaving the remaining 8%—an area of 23,996 km2 (9,265 sq mi)—consisting of thousands of smaller islands. The largest of the other islands are to be found in the Hebrides and the Northern Isles to the north, and Anglesey and the Isle of Man between Great Britain and Ireland. Not included are the Channel Islands which, positioned off the coast of France, are not part of the archipelago. There are 188 permanently inhabited islands in total: Isle of Man: 1 Republic of Ireland: 62 and a part of Ireland United Kingdom: 123 plus Great Britain and a part of Ireland England: 19 and a part of Great Britain Northern Ireland: 1 and a part of Ireland Scotland: 97 and a part of Great Britain Wales: 6 and a part of Great Britain List of islands by area See also British Isles List of islands in the Atlantic Ocean List of islands of England List of islands of Ireland List of islands of Scotland List of Orkney islands List of Outer Hebrides List of Shetland islands List of islands of the Isle of Man List of islands of the United Kingdom List of islands of Wales References
https://en.wikipedia.org/wiki/Don%20LaFontaine
Donald LeRoi LaFontaine (August 26, 1940 – September 1, 2008) was an American voice actor who recorded more than 5,000 film trailers and hundreds of thousands of television advertisements, network promotions, and video game trailers over four decades. He became identified with the phrase "In a world...", used in so many movie trailers that it became a humorous catchphrase. Widely known in the film industry, the man whose nicknames included "Thunder Throat", "The Voice of God" and "The King of Movie Trailers" became known to a wider audience through commercials for GEICO insurance and the Mega Millions lottery game. LaFontaine voiced promos for Numb3rs on CBS. Early life LaFontaine was born on August 26, 1940, in Duluth, Minnesota, to Alfred and Ruby LaFontaine. LaFontaine said his voice cracked at age 13 in mid-sentence, giving him the bass tones that later brought him much fame and success. After graduating from Duluth Central High School in 1958, he enlisted in the U.S. Army and served as an audio engineer with the U.S. Army Band and the U.S. Army Chorus. Career LaFontaine continued to work as a recording engineer after discharge and began working at the National Recording Studios in New York City, where, in 1962, he had the opportunity to work with producer Floyd Peterson on radio spots for Dr. Strangelove. Peterson incorporated many of LaFontaine's ideas for the spots and, in 1963, they went into business together producing advertising exclusively for the movie industry. LaFontaine claimed that this company first came up with many of the famous movie trailer catchphrases, including his own future signature phrase "In a world...". While working on the 1964 western Gunfighters of Casa Grande, LaFontaine had to fill in for an unavailable voice actor to have something to present to MGM. After MGM bought the spots, LaFontaine began a career as a voiceover artist. 
He became the head of Kaleidoscope Films Ltd., a movie trailer production company, before starting his own company, Don LaFontaine Associates, in 1976. Shortly thereafter, he was hired by Paramount to do their trailers and was eventually promoted to vice president. He decided to get back into trailer work and left Paramount, moving to Los Angeles in 1981. LaFontaine was contacted by an agent who wanted to promote him for voiceover work, and from then on worked in voiceovers. At his peak, he voiced about 60 promotions a week, and sometimes as many as 35 in a single day. Once he established himself, most studios were willing to pay a high fee for his service. His income was in the millions. LaFontaine often had jobs at several different studios each day. With the advent of ISDN technology, LaFontaine eventually built a recording studio in his Hollywood Hills home and began doing his work from home. LaFontaine lent his distinctive voice to thousands of movie trailers during his career, spanning every genre from every major film studio, including The Cannon Group, for which he vo
https://en.wikipedia.org/wiki/Lighthouse%20Design
Lighthouse Design Ltd. was an American software company that operated from 1989 to 1996. Lighthouse developed software for NeXT computers running the NeXTSTEP operating system. The company was founded in 1989 by Alan Chung, Roger Rosner, Jonathan Schwartz, Kevin Steele and Brian Skinner, in Bethesda, Maryland. Lighthouse later moved to San Mateo, California. In 1996, Lighthouse was acquired by Sun Microsystems. History Two of the first products developed at Lighthouse were Diagram! and Exploder. Diagram! was a drawing tool, originally called BLT (for Box-and-Line Tool), in which objects (boxes) are connected together using "smart links" (lines) to construct diagrams such as flow charts. Exploder was a programming tool for storing Objective-C objects in a relational database. Lighthouse marketed Diagram! directly, and in 1991 spun off Exploder into a new startup, Persistence Software. Persistence Software went public with an IPO on June 25, 1999. Lighthouse went on to develop and acquire more software products, and marketed an office suite for NeXTSTEP, which included ParaSheet (a traditional spreadsheet), Quantrix (a spreadsheet program based on Lotus Improv), Diagram!, TaskMaster (a project management program), WetPaint (an image editing/retouching program), LightPlan (an OMT-based computer data modeling tool, based on Diagram!), and Concurrence (a presentation program). In the early 1990s, Sun Microsystems entered a major partnership with NeXT to develop OpenStep, essentially a cross-platform version of the "upper layers" of the NeXTSTEP operating system. OpenStep would provide a NeXT-like system running on top of any suitably powerful underlying operating system, in Sun's case, Solaris. Sun planned a distributed computing environment, with users running OpenStep on the desktop, and the transaction processing occurring on servers in the back office. 
The two would communicate with NeXT's Portable Distributed Objects technology, which was known as Distributed Objects Everywhere (DOE), later released as NEO. In mid-1996, Sun purchased Lighthouse for $22 million, turning them into their in-house OpenStep applications group. At the time, Scott McNealy had visions of turning Sun into a powerhouse that would compete head-to-head with Microsoft, and an office applications suite was a requirement for any such plan. Lighthouse's applications were not up to par with Microsoft Office as a whole, but certainly could have been developed into a direct competitor with additional development. But even as the purchase of Lighthouse was going through, Sun was already turning its attention from DOE/NEO on the back end and OpenStep on the front end to "Java everywhere". Java was seen as a better way for Sun to break into the applications market, as it ran on all platforms, not just those supported by OpenStep. Lighthouse was soon moved into the JavaSoft division, becoming the Java Applications Group. The only problem with this move was that any
https://en.wikipedia.org/wiki/Header%20%28computing%29
In information technology, header refers to supplemental data placed at the beginning of a block of data being stored or transmitted. In data transmission, the data following the header is sometimes called the payload or body. It is vital that header composition follows a clear and unambiguous specification or format, to allow for parsing. Examples E-mail header: The text (body) is preceded by header lines indicating sender, recipient, subject, sending time stamp, receiving time stamps of all intermediate and the final mail transfer agents, and much more. Similar headers are used in Usenet (NNTP) messages and HTTP headers. In a data packet sent via the Internet, the data (payload) are preceded by header information such as the sender's and the recipient's IP addresses, the protocol governing the format of the payload, and several other fields. The header's format is specified in the Internet Protocol. In data packets sent by wireless communication, and in sectors of data stored on magnetic media, the header typically begins with a syncword to allow the receiver to adapt to analog amplitude and speed variations and to achieve frame synchronization. In graphics file formats, the header might give information about an image's size, resolution, number of colors, and the like. In archive file formats, the file header might serve as a fingerprint or signature to identify the specific file format and corresponding software utility. In some programming languages (for example C and C++) functions are declared in header files. See also Footer Protocol overhead Trailer (computing), used in computer networking Field (computer science) References
https://en.wikipedia.org/wiki/Maurice%20Wilkes
Sir Maurice Vincent Wilkes (26 June 1913 – 29 November 2010) was an English computer scientist who designed and helped build the Electronic Delay Storage Automatic Calculator (EDSAC), one of the earliest stored-program computers, and who invented microprogramming, a method for using stored-program logic to operate the control unit of a central processing unit's circuits. At the time of his death, Wilkes was an Emeritus Professor at the University of Cambridge. Early life, education, and military service Wilkes was born in Dudley, Worcestershire, England, the only child of Ellen (Helen), née Malone (1885–1968), and Vincent Joseph Wilkes (1887–1971), an accounts clerk at the estate of the Earl of Dudley. He grew up in Stourbridge, West Midlands, and was educated at King Edward VI College, Stourbridge. During his school years he was introduced to amateur radio by his chemistry teacher. He studied the Mathematical Tripos at St John's College, Cambridge from 1931 to 1934, and in 1936 completed his PhD in physics on the subject of radio propagation of very long radio waves in the ionosphere. He was appointed to a junior faculty position at the University of Cambridge, through which he was involved in the establishment of a computing laboratory. He was called up for military service during World War II and worked on radar at the Telecommunications Research Establishment (TRE) and in operational research. Research and career In 1945, Wilkes was appointed as the second director of the University of Cambridge Mathematical Laboratory (later known as the Computer Laboratory). The Cambridge laboratory initially had many different computing devices, including a differential analyser. One day Leslie Comrie visited Wilkes and lent him a copy of John von Neumann's prepress description of the EDVAC, a successor to the ENIAC under construction by Presper Eckert and John Mauchly at the Moore School of Electrical Engineering. 
He read it overnight because it had to be returned and no photocopying facilities existed. He decided immediately that the document described the logical design of future computing machines, and that he wanted to be involved in the design and construction of such machines. In August 1946 Wilkes travelled by ship to the United States to enroll in the Moore School Lectures, of which he was only able to attend the final two weeks because of various travel delays. During the five-day return voyage to England, Wilkes sketched out in some detail the logical structure of the machine which would become EDSAC. EDSAC Since his laboratory had its own funding, he was immediately able to start work on a small practical machine, EDSAC (for "Electronic Delay Storage Automatic Calculator"), once back at Cambridge. He decided that his mandate was not to invent a better computer, but simply to make one available to the university. Therefore, his approach was relentlessly practical. He used only proven methods for constructing each part of the computer.
https://en.wikipedia.org/wiki/A%26E%20%28TV%20network%29
A&E is an American basic cable network, the flagship television property of A&E Networks. The network was originally founded in 1984 as the Arts & Entertainment Network, initially focusing on fine arts, documentaries, dramas, and educational entertainment. Today, the network deals primarily in non-fiction programming, including reality docusoaps, true crime, documentaries, and miniseries. A&E is available to approximately 95,968,000 pay television households (82.4% of households with television) in the United States. The American version of the channel is distributed in Canada, while international versions have been launched for Australia, Latin America, and Europe. History Launch A&E launched on February 1, 1984, initially available to 9.3 million cable television homes in the U.S. and Canada. The network is the result of the 1984 merger of Hearst/ABC's Alpha Repertory Television Service (ARTS) and the (pre–General Electric merger) RCA-owned The Entertainment Channel. It was originally available in two versions: an 8-hour version, which was to follow Nickelodeon on RCA Satcom III-R, and a full 20-hour version on another satellite, Westar V. In 1984, the signal split off from Nickelodeon, once A&E picked up its 20-hour signal on RCA Satcom III-R. In response, Nickelodeon launched its own nighttime block, Nick at Nite, to displace A&E on many signals. In 1986, the network premiered one of the first classical music videos to be broadcast in the United States and Canada, the Kendall Ross Bean: Chopin Polonaise in A Flat. By 1990, original programming accounted for 35 to 40 percent of A&E's content. Biography, a one-hour documentary series that was revived in 1987, was considered to be the network's signature show. In 1994, airings of Biography went from weekly broadcasts to airing five nights a week, which helped boost A&E's ratings to record levels. 
The nightly series became A&E's top-rated show and one of cable television's most notable successes. Biography received Primetime Emmy Awards in 1999 and 2002. In 1993, Rockefeller Group's Radio City Music Hall sold its 12.5% stake in A&E to Capital Cities/ABC, Hearst, and NBC, leaving NBC with a 25% stake in A&E and the other two with 37.5% each. In 1994, the channel picked up reruns of Law & Order on an eight-year agreement, which would help bring in additional viewers. In May 1995, the channel's name officially changed to the A&E Network, to reflect its declining focus on arts and entertainment. The following year, the network branded itself as simply A&E, using the slogans "Time Well Spent" and "Escape the Ordinary." "The word 'arts,' in regard to television, has associations such as 'sometimes elitist,' 'sometimes boring,' 'sometimes overly refined' and 'doesn't translate well to TV,'" Whitney Goit, executive vice president for sales and marketing, stated. "Even the arts patron often finds arts on TV not as satisfying as it should be ... And the word 'entertainment' is
https://en.wikipedia.org/wiki/IBM%20System%20R
IBM System R is a database system built as a research project at IBM's San Jose Research Laboratory beginning in 1974. System R was a seminal project: it was the first implementation of SQL, which has since become the standard relational data query language. It was also the first system to demonstrate that a relational database management system could provide good transaction processing performance. Design decisions in System R, as well as some fundamental algorithm choices (such as the dynamic programming algorithm used in query optimization), influenced many later relational systems. System R's first customer was Pratt & Whitney in 1977. See also IBM Db2 IBM SQL/DS Ingres (database) SQL System/38 References
https://en.wikipedia.org/wiki/System%20programming%20language
A system programming language is a programming language used for system programming; such languages are designed for writing system software, which usually requires different development approaches when compared with application software. Edsger Dijkstra refers to these languages as machine oriented high order languages, or mohol. General-purpose programming languages tend to focus on generic features to allow programs written in the language to use the same code on different platforms. Examples of such languages include ALGOL and Pascal. This generic quality typically comes at the cost of denying direct access to the machine's internal workings, and this often has negative effects on performance. System languages, in contrast, are designed not for compatibility, but for performance and ease of access to the underlying hardware while still providing high-level programming concepts like structured programming. Examples include SPL and ESPOL, both of which are similar to ALGOL in syntax but tuned to their respective platforms. Others are cross-platform but designed to work close to the hardware, like BLISS, JOVIAL and BCPL. Some languages straddle the system and application domains, bridging the gap between these uses. The canonical example is C, which is used widely for both system and application programming. Some modern languages, such as Rust and Swift, also do this. Features In contrast with application languages, system programming languages typically offer more-direct access to the physical hardware of the machine: an archetypical system programming language in this sense was BCPL. System programming languages often lack built-in input/output (I/O) facilities because a system-software project usually develops its own I/O mechanisms or builds on basic monitor I/O or screen management facilities. The distinction between languages used for system programming and application programming became blurred over time with the widespread popularity of PL/I, C and Pascal. 
History The earliest system software was written in assembly language primarily because there was no alternative, but also for reasons including efficiency of object code, compilation time, and ease of debugging. Application languages such as FORTRAN were used for system programming, although they usually still required some routines to be written in assembly language. Mid-level languages Mid-level languages "have much of the syntax and facilities of a higher level language, but also provide direct access in the language (as well as providing assembly language) to machine features." The earliest of these was ESPOL on Burroughs mainframes in about 1960, followed by Niklaus Wirth's PL360 (first written on a Burroughs system as a cross compiler), which had the general syntax of ALGOL 60 but whose statements directly manipulated CPU registers and memory. Other languages in this category include MOL-360 and PL/S. As an example, a typical PL360 statement is R9 := R8 and R7 shll 8 or R
https://en.wikipedia.org/wiki/Syskey
The SAM Lock Tool, better known as Syskey (the name of its executable file), is a discontinued component of Windows NT that encrypts the Security Account Manager (SAM) database using a 128-bit RC4 encryption key. Introduced in the Q143475 hotfix for Windows NT 4.0 SP3, the tool was removed in Windows 10's Fall Creators Update in 2017 because its method of cryptography is considered insecure by modern standards and because the tool had been widely employed in scams as a form of ransomware. Microsoft officially recommended use of BitLocker disk encryption as an alternative. History Introduced in the Q143475 hotfix included in Windows NT 4.0 SP3, Syskey was intended to protect against offline password cracking attacks by preventing the possessor of an unauthorized copy of the SAM file from extracting useful information from it. Syskey can optionally be configured to require the user to enter the key during boot (as a startup password) or to store the key on removable storage media (e.g., a floppy disk or USB flash drive). In mid-2017, Microsoft removed syskey.exe from future versions of Windows. Microsoft recommends using "BitLocker or similar technologies instead of the syskey.exe utility." Security issues The "Syskey Bug" In December 1999, a security team from BindView found a security hole in Syskey indicating that a certain form of offline cryptanalytic attack was possible, making a brute-force attack feasible. Microsoft later issued a fix for the problem (dubbed the "Syskey Bug"). The bug affected both Windows NT 4.0 and pre-RC3 versions of Windows 2000. Use as ransomware Syskey is commonly abused by "tech support" scammers to lock victims out of their own computers in order to coerce them into paying a ransom. See also LM hash pwdump References