| source | text |
|---|---|
https://en.wikipedia.org/wiki/Japan%20FM%20Network | Japan FM Network (JFN) is a Japanese commercial radio network. It was founded in 1981.
Tokyo FM is the flagship station of the network.
List of affiliates
Stations are listed in the Japanese ordering of prefectures, which is mirrored in ISO 3166-2:JP.
FMQ League
FMQ League is a group of radio stations, mostly JFN affiliates, in the Kyūshū region; FM Okinawa is not part of this group, while FM Yamaguchi from the Chūgoku region is. FM Fukuoka serves as the chief station.
Areas without a JFN station
The prefectures in which JFN has yet to establish a presence are Chiba, Ibaraki, Kanagawa, Kyoto, Nara, Saitama, Wakayama and Yamanashi. Ibaraki had no assigned frequency; the local frequency was used by NHK FM in 1992.
Frequency assigned, but yet to be launched
Nara - 85.8 MHz
Wakayama - 77.2 MHz
Special affiliates
Former affiliate station
References
External links
of the JFN Association
of the JFN Company
Radio in Japan
1981 in Japan
Radio stations established in 1981
Japanese radio networks |
https://en.wikipedia.org/wiki/Deborah%20Roberts | Deborah Ann Roberts (born September 20, 1960) is an American television journalist for the ABC News division of the ABC broadcast television network.
Early life and education
Roberts was born in Perry, Georgia, to Benjamin Roberts, a business owner, and Ruth Roberts, a housewife. She graduated from the Henry W. Grady College of Journalism and Mass Communication at the University of Georgia with a Bachelor of Journalism in 1982. In 1992, Roberts was awarded the University of Georgia Distinguished Alumnus Award for her rapid success as a journalist.
Career
In 1982, Roberts began her career at WTVM, a local television station in Columbus, Georgia, and then she moved on to work at WBIR, a local television station in Knoxville, Tennessee.
From 1987 to 1990, she served as bureau chief/NASA field reporter/weekend news co-anchor at WFTV, a local television station in Orlando, Florida. In 1990, she joined NBC News as a general-assignment reporter and later served as a correspondent for Dateline NBC, an NBC News newsmagazine program.
Roberts moved to ABC News in 1995 as a correspondent for 20/20, a newsmagazine program, as well as an anchor for World News Tonight Weekend, the weekend evening news program, and for Good Morning America, the morning news program. In July and September 2006, she was a guest host on The View, a talk show produced by ABC Daytime, a division of ABC.
Roberts has won an Emmy Award and a Clarion Award for her reporting.
Roberts has contributed to the network's other platforms, including Good Morning America, Primetime, Nightline, and The Katie Couric Show.
Roberts hosts Lifetime Live on Lifetime Television, a cable and satellite television channel.
In 2016, the non-fiction book Been There, Done That: Family Wisdom for Modern Times, written by Roberts and her husband Al Roker, was published.
Personal life
Roberts resides in Manhattan with her husband Al Roker, whom she married in September 1995; they have a daughter and a son. Roberts has a stepdaughter from her husband's prior marriage.
See also
New Yorkers in journalism
References
External links
Staff writer (April 25, 2007). "Deborah Roberts — ABC News Correspondent". ABC News. Accessed December 20, 2009.
1960 births
ABC News personalities
African-American journalists
African-American television personalities
African-American women journalists
American television reporters and correspondents
Living people
NBC News people
People from Manhattan
People from Perry, Georgia
Television anchors from Orlando, Florida
University of Georgia alumni
American women television journalists
21st-century African-American people
21st-century African-American women
20th-century African-American people
20th-century African-American women |
https://en.wikipedia.org/wiki/Pipeline%20%28software%29 | In software engineering, a pipeline consists of a chain of processing elements (processes, threads, coroutines, functions, etc.), arranged so that the output of each element is the input of the next; the name is by analogy to a physical pipeline. Usually some amount of buffering is provided between consecutive elements. The information that flows in these pipelines is often a stream of records, bytes, or bits, and the elements of a pipeline may be called filters; this is also called the pipe(s) and filters design pattern. Connecting elements into a pipeline is analogous to function composition.
Narrowly speaking, a pipeline is linear and one-directional, though sometimes the term is applied to more general flows. For example, a primarily one-directional pipeline may have some communication in the other direction, known as a return channel or backchannel, as in the lexer hack, or a pipeline may be fully bi-directional. Flows with one-directional tree and directed acyclic graph topologies behave similarly to (linear) pipelines – the lack of cycles makes them simple – and thus may be loosely referred to as "pipelines".
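A linear pipeline of this kind can be sketched with Python generators, where each filter lazily consumes the output of the previous element; connecting the stages mirrors function composition. The stage names and data below are purely illustrative:

```python
def numbers(limit):
    """Source: emit a stream of records (here, integers)."""
    yield from range(limit)

def evens(stream):
    """Filter: pass only even values downstream."""
    return (x for x in stream if x % 2 == 0)

def squared(stream):
    """Filter: transform each value."""
    return (x * x for x in stream)

# Connecting elements is analogous to function composition:
pipeline = squared(evens(numbers(10)))
print(list(pipeline))  # [0, 4, 16, 36, 64]
```

Because generators are lazy, each record flows through all the filters on demand, much like bytes through a Unix pipe.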
Implementation
Pipelines are often implemented in a multitasking OS, by launching all elements at the same time as processes, and automatically servicing the data read requests by each process with the data written by the upstream process – this can be called a multiprocessed pipeline. In this way, the CPU will be naturally switched among the processes by the scheduler so as to minimize its idle time. In other common models, elements are implemented as lightweight threads or as coroutines to reduce the OS overhead often involved with processes. Depending upon the OS, threads may be scheduled directly by the OS or by a thread manager. Coroutines are always scheduled by a coroutine manager of some form.
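A multiprocessed pipeline can be sketched with Python's multiprocessing module, using queues as the buffered channels between elements. This is an illustration of the idea, not how any particular OS wires up its pipes; the stage names, sentinel convention, and data are invented:

```python
from multiprocessing import Process, Queue

SENTINEL = None  # marks the end of the stream

def produce(out_q):
    """Upstream element: write records into the channel."""
    for x in range(5):
        out_q.put(x)               # blocks if the buffer is full
    out_q.put(SENTINEL)

def double(in_q, out_q):
    """Downstream element: read, transform, write."""
    while (x := in_q.get()) is not SENTINEL:   # blocks until data arrives
        out_q.put(x * 2)
    out_q.put(SENTINEL)

if __name__ == "__main__":
    q1, q2 = Queue(), Queue()
    stages = [Process(target=produce, args=(q1,)),
              Process(target=double, args=(q1, q2))]
    for p in stages:
        p.start()                  # all elements run concurrently
    results = []
    while (x := q2.get()) is not SENTINEL:
        results.append(x)
    for p in stages:
        p.join()
    print(results)                 # [0, 2, 4, 6, 8]
```

Here the OS scheduler interleaves the producer and consumer processes, and the blocking `put`/`get` calls provide the buffering and flow control described above.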
Usually, read and write requests are blocking operations, which means that the execution of the source process, upon writing, is suspended until all data has been written to the destination process, and, likewise, the execution of the destination process, upon reading, is suspended until at least some of the requested data can be obtained from the source process. This cannot lead to a deadlock, where both processes would wait indefinitely for each other to respond, since at least one of the two processes will soon thereafter have its request serviced by the operating system, and continue to run.
For performance, most operating systems implementing pipes use pipe buffers, which allow the source process to provide more data than the destination process is currently able or willing to receive. Under most Unices and Unix-like operating systems, a special command is also available which implements a pipe buffer of potentially much larger and configurable size, typically called "buffer". This command can be useful if the destination process is significantly slower than the source process, but it is anyway desired that the source process |
https://en.wikipedia.org/wiki/Survey%20vessel | A survey vessel is any type of ship or boat that is used for underwater surveys, usually to collect data for mapping or planning underwater construction or mineral extraction. It is a type of research vessel, and may be designed for the purpose, modified for the purpose or temporarily put into service as a vessel of opportunity, and may be crewed, remotely operated, or autonomous. The size and equipment vary to suit the task and availability.
Role
The task of survey vessels is to map the bottom, and measure the characteristics of the benthic zone, full water column, and surface for the purpose of:
hydrography, the measurement and description of the physical features of oceans and other natural bodies of water, and the prediction of their change over time, for the primary purpose of safety of navigation and in support of other activities associated with those bodies of water,
general oceanography, the scientific study of the oceans,
mapping of marine habitats as part of the process of assessing the state of the ecology,
measurement of environmental impact of natural and anthropogenic changes,
planning of marine salvage, the process of recovering a ship and its cargo after a shipwreck or other maritime casualty,
dredging, the excavation of material from underwater, to recover materials or to alter the bottom profile, usually for navigational or construction purposes,
underwater construction, which is industrial construction in an underwater environment,
coastal engineering, the branch of civil engineering concerned with construction at or near the coast, and the development of the coast itself,
maritime archaeology, the study of human interaction with the sea, lakes and rivers through the study of associated physical remains,
underwater mining and extraction of petroleum.
Survey equipment
Typically, modern survey vessels are equipped with one or more of the following equipment:
Satellite navigation to provide autonomous geo-spatial positioning,
Single beam sonar for the measurement of underwater physical and biological components,
Multibeam sonar to accurately and efficiently map the seabed
Side-scan sonar to efficiently create relief images of large areas of the sea floor.
Towed magnetometer for measuring the Earth's magnetic field, in geophysical surveys, to detect magnetic anomalies,
Reflection seismology equipment for subsurface profiling. Seismic sources include air guns, sparkers and boomers.
Bottom sampling equipment such as Van Veen grab sampler, Box corer, Epibenthic sled or other core sampling equipment.
CTD sondes to measure the electrical conductivity, temperature, and pressure of seawater
Inertial measurement unit
Unmanned and autonomous survey vessels
Unmanned surface vehicles (USVs; also known as unmanned surface vessels or in some cases autonomous surface vehicles (ASVs), uncrewed surface vessels, or colloquially, drone ships) are boats or ships that operate on the surface of the water without a crew. USVs |
https://en.wikipedia.org/wiki/Posturography | Posturography is the technique used to quantify postural control in upright stance under either static or dynamic conditions. Among these techniques, computerized dynamic posturography (CDP), also called the test of balance (TOB), is a non-invasive specialized clinical assessment technique used to quantify the central nervous system adaptive mechanisms (sensory, motor and central) involved in the control of posture and balance, both in normal conditions (such as physical education and sports training) and abnormal ones (particularly in the diagnosis of balance disorders and in physical therapy and postural re-education). Because of the complex interactions among the sensory, motor, and central processes involved in posture and balance, CDP requires different protocols in order to differentiate among the many defects and impairments which may affect the patient's posture control system. CDP therefore challenges that system by using several combinations of visual and support-surface stimuli and parameters.
Clinical applications for CDP were first described by L.M. Nashner in 1982, and the first commercially available testing system was developed in 1986, when NeuroCom International, Inc., launched the EquiTest system.
Working
Static posturography is carried out by placing the patient in a standing posture on a fixed instrumented platform (forceplate) connected to sensitive detectors (force and movement transducers), which are able to detect the tiny oscillations of the body.
Dynamic posturography differs from static posturography generally by using a special apparatus with a movable horizontal platform. As the patient makes small movements, these are transmitted in real time to a computer. The computer is also used to command electric motors which can move the forceplate in the horizontal direction (translation) as well as to incline it (rotations). Thus, the posturography test protocols generate a sequence of standardized motions in the support platform in order to disequilibrate the patient's posture in an orderly and reproducible way. The platform is contained within an enclosure which can also be used to generate apparent visual surround motions. These stimuli are calibrated relative to the patient's height and weight. A special computer software integrates all this and produces detailed graphics and reports which can then be compared with normal ranges.
Components of balance
Center of gravity (COG) is an important component of balance and should be assessed when evaluating someone’s posture. COG is often measured via the center of pressure (COP) because COG is hard to quantify directly. According to Lafage et al. (2008), the COG should be located at the midpoint of the base of support if an individual has ideal posture. COP excursion and velocity are indicators of control over the COG and are key factors for identifying proper posture and the ability to maintain balance. COP excursion is defined by Collins & De Luca (1992) as the Euclidean displacement in the anterior/posterior and medial/la |
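The COP excursion and mean velocity measures described above can be sketched in Python. The helper names and the force-plate samples below are invented for illustration; only the definitions (summed Euclidean displacement, and path length over duration) come from the text:

```python
import math

def cop_path_length(ap, ml):
    """Total COP excursion: summed Euclidean displacement between
    successive samples along the anterior/posterior (ap) and
    medial/lateral (ml) axes (units follow the input, e.g. cm)."""
    return sum(math.hypot(ap[i + 1] - ap[i], ml[i + 1] - ml[i])
               for i in range(len(ap) - 1))

def mean_cop_velocity(ap, ml, sample_rate_hz):
    """Mean COP velocity: path length divided by recording duration."""
    duration_s = (len(ap) - 1) / sample_rate_hz
    return cop_path_length(ap, ml) / duration_s

# Invented force-plate samples (cm) at 100 Hz:
ap = [0.0, 0.3, 0.1, -0.2]
ml = [0.0, 0.0, 0.4, 0.1]
print(round(cop_path_length(ap, ml), 3))   # 1.171
```

Larger excursion or velocity over a fixed recording window would suggest poorer control over the COG.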
https://en.wikipedia.org/wiki/Department%20of%20Computer%20Science%2C%20FMPI%2C%20Comenius%20University | The Department of Computer Science is a department of the Faculty of Mathematics, Physics and Informatics at Comenius University in Bratislava, the capital of Slovakia. It is headed by Prof. RNDr. Branislav Rovan, PhD.
Educational and scientific achievements
The first comprehensive computer science curriculum in Czechoslovakia (now Slovakia) was introduced at the Faculty in 1973. The department, established in 1974, continues to be responsible for organizing the major part of the undergraduate and graduate computer science education to this date. The distinguishing feature of the curriculum has been a balanced coverage of the mathematical foundations, theoretical computer science, and practical computer science. The part of the curriculum covered by the department at present includes courses on computer architecture, system software, networks, databases, software design, design and analysis of algorithms, formal languages, computational complexity, discrete mathematics, cryptology, data security and others.
The department succeeded several times in project applications within the TEMPUS Programme of the EU. The projects CIPRO and „Neumann Network“ helped to build the departmental hardware infrastructure and to establish the expertise in Unix workstation technology, networking, and structured document processing. The CUSTU PARLAB parallel computing laboratory run jointly with the Department of Informatics of the Faculty of Electrical Engineering and Informatics of the Slovak Technical University also resulted from one of these projects. Furthermore, the department participated in project LEARN-ED under the COPERNICUS Programme and built a multimedia laboratory.
The department has been involved in the organization of one of the top European conferences in theoretical computer science – MFCS – each time it took place in Slovakia. Further conferences recently organized or co-organized by the department include SOFSEM '98 and DISC '99. Besides, the department houses the secretariat of the European Association for Theoretical Computer Science, of the Slovak Society for Computer Science and of the Association for Security of Information Technologies (ASIT).
Research topics
Research in theoretical computer science and discrete mathematics has the longest tradition in the department. Most notably, the result of Róbert Szelepcsényi on the closure of nondeterministic space under complement, independently obtained also by N. Immerman, brought the Gödel Prize of the ACM and EATCS to both of them in 1995. More recently research in parallel and distributed computing, cryptology and information security, and in software development has been initiated. The department is involved in international cooperation on the development of the structured document editor within the Euromath Project.
The department has many international contacts, succeeded in research project application (project ALTEC – „Algorithms for Future Technologies“) with partners from E |
https://en.wikipedia.org/wiki/Grand%20dictionnaire%20terminologique | The Grand dictionnaire terminologique (GDT) is an online terminological database containing nearly 3 million French, English and Latin technical terms in 200 industrial, scientific and commercial fields.
The GDT has existed in a number of formats over the years, including a dial-up service known as the Banque de terminologie du Québec (BTQ) and a CD-ROM version. The GDT is now available only as a freely accessible web site.
Produced by the Office québécois de la langue française, the GDT is the result of thirty years of work by Quebec-based terminologists. It is the most complete translation resource for Canadian English-language technical terms.
Quebec French
When translations differ between Quebec French and "Standard French" – for example, "cerebrovascular accident" (CVA) is translated as accident cérébrovasculaire (ACV) in Quebec French and accident vasculaire cérébral in France – both forms are given, with a paragraph describing their origins, usage and conformity. The GDT thus allows writers to adapt their writing to suit their audience, be it North American, European or African.
1990 Reforms of French orthography
The GDT uses the 1990 Reforms of French orthography with loanwords and neologisms. It also prioritises usage of each word by its prominence in other authoritative works.
References
External links
Lexicography
French language
Translation databases
Databases in Canada |
https://en.wikipedia.org/wiki/Game%20Network | Game Network was a European free-to-air television channel. It was initially owned by Digital Bros group, and later taken over by Cellcast Group. It was first launched in 1999.
History
Game Network first broadcast in Italy on 17 September 1999. The channel was available all over Southern Europe, and developed an estimated audience of 300,000. It launched in the United Kingdom in May 2001 on Sky EPG number 223. At its UK launch, the Financial Times evaluated the channel's free-to-air business proposition, commenting that its potential viewers should be abundant given the surge in popularity of video games at the time, in line with the release of hundreds of titles in the UK each year and the earlier launch of the PlayStation 2. The channel was expected to provide 24-hour television dedicated to video games and earn money from advertising and sponsorship.
The UK version of the television channel (produced by Cellcast) overlaid its own regional content onto the channel, with programmes including Digital Crack, Me in Mir, The Weekly Chart Show, Game Guru, Reloaded, LiveWire and Evolution. The channel expanded in 2003 (the year Cellcast started increasing its control of the channel, with Sem Mioli on the Digital Bros. side and Jonathan French and Craig Gardiner on the Cellcast side spearheading the channel) with the launch of non-gaming programmes such as Babestation, a late-night "tease" show. The success of this show led to many clone programmes appearing on other channels, and this alternative revenue stream led to a later influx of phone-in quiz shows, of which Game Network showed many. By 2005, Game Network's UK games programming was largely limited to a block of raw games footage from 5:30 am to 10 am (under the name Game Clip), with Game Guru airing from 5 pm until 7 pm, followed by programmes such as Psychic Interactive, which continued until Babestation started.
In 2004, the channel's Sky EPG number was 172.
The major gaming shows, such as LiveWire, were cancelled in May 2005, and the Italian-language feed from Hot Bird ceased on September 17, 2005, after six years on air. At this time, Digital Bros. sold Game Network UK completely to Cellcast Group, which completely dropped all video game content by 20 February 2006 and renamed it Babestation. On 28 February, the channel was moved to the adult section of the Sky EPG.
Online games
During the time in which Game Network ran as a television channel, it also hosted servers for various online games including Horizons Online, Droiyan Online, The Legend of Mir series, and Myth of Soma, some of which were also prominently advertised in its programming (The Legend of Mir games also received their own dedicated programmes). After Game Network ceased to exist as a television channel, its online side continued as GNOnline until the servers for The Legend of Mir 2, The Legend of Mir 3, and Myth of Soma were closed in March 2009. |
https://en.wikipedia.org/wiki/Civilization%20IV | Civilization IV (also known as Sid Meier's Civilization IV) is a 4X turn-based strategy computer game, the fourth installment of the Civilization series, designed by Soren Johnson under the direction of Sid Meier and his video game development studio Firaxis Games. It was released in North America, Europe, and Australia between October 25 and November 4, 2005, and was followed by Civilization V.
Civilization IV uses the 4X empire-building model for turn-based strategy gameplay, in which the player's main objective is to construct a civilization from limited initial resources. Most standard full-length games start the player with a settler unit and/or a city unit in the year 4000 BC. As with other games in the series, there are by default five objectives the player can pursue in order to finish the game: conquering all other civilizations, controlling a supermajority of the game world's land and population, building and sending the first sleeper ship to the Alpha Centauri star system, increasing the "Culture ratings" of at least three different cities to "legendary" levels, or winning a "World Leader" popularity contest by the United Nations. If the time limit for the game is reached and none of the previous goals has been fulfilled by any players including game AI players, the civilization with the highest total game score is declared winner. A large departure from earlier Civilization games is a new graphics engine created from scratch, based on the Gamebryo engine by Numerical Design Limited (NDL).
The game has received critical acclaim and was hailed as an exemplary product of one of the leading video game producers in the turn-based strategy genre, and has been listed as one of the best video games of all time. Civilization IV sold over 3 million copies by 2008 and won multiple awards, including several Game of the Year awards. Its title song, "Baba Yetu", was the first piece of video game music to win a Grammy Award. Two major expansions were released, Civilization IV: Warlords and Civilization IV: Beyond the Sword, as well as the stand-alone expansion pack Civilization IV: Colonization, which were all combined in 2009 into one release edition titled Sid Meier's Civilization IV: The Complete Edition.
Gameplay
Civilization IV follows some of the 4X model of turn-based strategy games, a genre in which players control an empire and "explore, expand, exploit, and exterminate", by having the player attempt to lead a modest group of peoples from a base with initially scarce resources into a successful empire or civilization. The condition for winning the game is accomplished through one of the five ways: militarily defeating all other civilizations in the game world, controlling over two-thirds of the game world's land and population, building the first spaceship in the Space Age and sending it to Alpha Centauri, having the most dominant Culture ratings over other civilizations, or becoming "World Leader" through the United Nations votes. |
https://en.wikipedia.org/wiki/Graphical%20Models | Graphical Models is an academic journal in computer graphics and geometry processing published by Elsevier. Its editor-in-chief is Bedrich Benes of Purdue University.
History
This journal has gone through multiple names. Founded in 1972 as Computer Graphics and Image Processing by Azriel Rosenfeld, it became the first journal to focus on computer image analysis.
Its first change of name came in 1983, when it became Computer Vision, Graphics, and Image Processing. In 1991 it split into two journals, CVGIP: Graphical Models and Image Processing,
and CVGIP: Image Understanding, which later became Computer Vision and Image Understanding. Meanwhile, in 1995, the journal Graphical Models and Image Processing removed the "CVGIP" prefix from its former name, and finally took its current title, Graphical Models, in 2002.
Ranking
Although initially ranked by SCImago Journal Rank as a top-quartile journal in 1999 in its main topic areas, computer graphics and computer-aided design, and then for many years ranked as second-quartile, by 2020 it had fallen to the third quartile.
References
Geometry processing
Computer science journals |
https://en.wikipedia.org/wiki/Dynamic%20Bayesian%20network | A dynamic Bayesian network (DBN) is a Bayesian network (BN) which relates variables to each other over adjacent time steps.
History
A dynamic Bayesian network (DBN) is often called a "two-timeslice" BN (2TBN) because it says that at any point in time T, the value of a variable can be calculated from the internal regressors and the immediate prior value (time T-1). DBNs were developed by Paul Dagum in the early 1990s at Stanford University's Section on Medical Informatics. Dagum developed DBNs to unify and extend traditional linear state-space models such as Kalman filters, linear and normal forecasting models such as ARMA and simple dependency models such as hidden Markov models into a general probabilistic representation and inference mechanism for arbitrary nonlinear and non-normal time-dependent domains.
Today, DBNs are common in robotics, and have shown potential for a wide range of data mining applications. For example, they have been used in speech recognition, digital forensics, protein sequencing, and bioinformatics. DBN is a generalization of hidden Markov models and Kalman filters.
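Since a DBN generalizes the hidden Markov model, the two-timeslice idea can be illustrated with a minimal forward-filtering step in plain Python: the belief over a variable at time T is predicted from the belief at time T-1 via the transition model, then reweighted by the evidence at T. The two-state weather model and all its probabilities below are invented purely for illustration:

```python
def forward_step(prior, transition, emission, observation):
    """One 2TBN update: belief over X at time T from the belief at T-1
    (predict via the transition model, then reweight by the evidence)."""
    states = {s for (_, s) in transition}
    unnorm = {s: sum(prior[p] * transition[(p, s)] for p in prior)
                 * emission[(s, observation)]
              for s in states}
    z = sum(unnorm.values())          # normalizing constant
    return {s: v / z for s, v in unnorm.items()}

# Toy two-state weather model (all numbers invented):
T = {("rain", "rain"): 0.7, ("rain", "dry"): 0.3,
     ("dry", "rain"): 0.3, ("dry", "dry"): 0.7}
E = {("rain", "umbrella"): 0.9, ("rain", "none"): 0.1,
     ("dry", "umbrella"): 0.2, ("dry", "none"): 0.8}

belief = {"rain": 0.5, "dry": 0.5}
for obs in ["umbrella", "umbrella", "none"]:
    belief = forward_step(belief, T, E, obs)
print({s: round(p, 3) for s, p in belief.items()})
```

The same recursion, with continuous variables and Gaussian transition and emission models, yields the Kalman filter mentioned above.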
DBNs are conceptually related to probabilistic Boolean networks and can, similarly, be used to model dynamical systems at steady-state.
See also
Recursive Bayesian estimation
Probabilistic logic network
Generalized filtering
References
Further reading
Software
Bayes Net Toolbox for Matlab, by Kevin Murphy (released under a GPL license)
Graphical Models Toolkit (GMTK): an open-source, publicly available toolkit for rapidly prototyping statistical models using dynamic graphical models (DGMs) and dynamic Bayesian networks (DBNs). GMTK can be used for applications and research in speech and language processing, bioinformatics, activity recognition, and any time-series application.
DBmcmc: Inferring Dynamic Bayesian Networks with MCMC, for Matlab (free software)
Modeling gene regulatory networks via global optimization of dynamic Bayesian networks (released under a GPL license)
libDAI: C++ library that provides implementations of various (approximate) inference methods for discrete graphical models; supports arbitrary factor graphs with discrete variables, including discrete Markov Random Fields and Bayesian Networks (released under the FreeBSD license)
aGrUM: C++ library (with Python bindings) for different types of PGMs including Bayesian Networks and Dynamic Bayesian Networks (released under the GPLv3)
FALCON: Matlab toolbox for contextualization of DBNs models of regulatory networks with biological quantitative data, including various regularization schemes to model prior biological knowledge (released under the GPLv3)
Bayesian networks |
https://en.wikipedia.org/wiki/CNN/SI | CNN/Sports Illustrated (CNN/SI) was a 24-hour sports news network. It was created by Time Warner, merging together its CNN and Sports Illustrated brands and related resources. It was launched on December 12, 1996.
Other news networks, like ESPNews, provided 30-minute blocks of news and highlights in a similar fashion to CNN Headline News at the time, but CNN/SI was live daily from 7 am to 2 am. Its purpose was to provide the most comprehensive sports news service on television, bringing in-depth sports news from around the world and integrating the Internet and television.
Closure
CNN/SI's closure has been attributed to competition with other all-sports news networks which started around the same time, such as ESPNews and Fox Sports Net's National Sports Report. Though CNN/SI aired exclusive content, such as the tape of former Indiana University coach Bob Knight choking player Neil Reed, the channel reached only 20 million homes, not enough to receive a rating from Nielsen Media Research, which reduced sponsorship. ESPNews, in contrast, benefited from being bundled with ESPN (86.5 million homes), while parent network CNN did not have the same influence with cable operators for its all-sports channel. CNN's cancellation of its flagship sports program, Sports Tonight (which had already been retooled to compete with SportsCenter), after the September 11 attacks contributed to the closure of CNN/SI, as it severed the channel's main connection to its parent network.
Near its closure, Sports Tonight was exclusive to CNN/SI. CNN/SI added NASCAR qualifying, Wimbledon matches and National Lacrosse League matches, and televised the now-defunct Women's United Soccer Association.
CNN/SI shut down on May 15, 2002. On many cable systems, CNN/SI was replaced by NBA TV. NBA TV, which launched in 1999, eventually evolved into a joint venture between Time Warner and the NBA that officially launched on October 28, 2008.
While the network closed, its international sports program World Sport continues airing and since 2002 has been produced by CNN International.
Website
The CNN/SI name was maintained for Sports Illustrated's online presence, which was located at cnnsi.com. In January 2013, CNN acquired Bleacher Report, and after Time Warner's spin-off of its publishing assets into Time Inc. (subsequently sold to Meredith Corporation and later to IAC's Dotdash), CNN dropped all use of the Sports Illustrated name.
Programming
Sports Tonight (1996–2001) hosted by various anchors
NFL Preview hosted by Bob Lorenz with analysts Trev Alberts, Irving Fryar and Peter King
College Football Preview hosted by Bob Lorenz with analyst Trev Alberts and Ivan Maisel
This Week in the NBA hosted by Andre Aldridge and Kevin Loughery
Sports Illustrated Golf Plus hosted by Bob Fiscella and Phil Jones
World Sport hosted by various anchors
Page One hosted by Laura Okmin
NASCAR Plus hosted by Johnny Phelps
Sports Illustrated - Cover to Cover
Trev Alberts' Full Tilt
The Sporting |
https://en.wikipedia.org/wiki/USB%20mass%20storage%20device%20class | The USB mass storage device class (also known as USB MSC or UMS) is a set of computing communications protocols, specifically a USB Device Class, defined by the USB Implementers Forum that makes a USB device accessible to a host computing device and enables file transfers between the host and the USB device. To a host, the USB device acts as an external hard drive; the protocol set interfaces with a number of storage devices.
Uses
Devices connected to computers via this standard include:
External magnetic hard drives
External optical drives, including CD and DVD reader and writer drives
USB flash drives
Solid-state drives
Adapters between standard flash memory cards and USB connections
Digital cameras
Portable media players
Card readers
PDAs
Mobile phones
Devices supporting this standard are known as MSC (Mass Storage Class) devices. While MSC is the original abbreviation, UMS (Universal Mass Storage) has also come into common use.
Operating system support
Most mainstream operating systems include support for USB mass storage devices; support on older systems is usually available through patches.
Microsoft Windows
Microsoft Windows has supported MSC since Windows 2000. There is no support for USB supplied by Microsoft in Windows before Windows 95 and Windows NT 4.0. Windows 95 OSR2.1, an update to the operating system, featured limited support for USB. During that time no generic USB mass-storage driver was produced by Microsoft (including for Windows 98), and a device-specific driver was needed for each type of USB storage device. Third-party, freeware drivers became available for Windows 98 and Windows 98SE, and third-party drivers are also available for Windows NT 4.0. Windows 2000 has support (via a generic driver) for standard USB mass-storage devices; Windows Me and all later Windows versions also include support.
Windows Mobile supports accessing most USB mass-storage devices formatted with FAT on devices with USB Host. However, portable devices typically cannot provide enough power for hard-drive enclosures (a hard drive typically requires the maximum 2.5 W allowed by the USB specification) without a self-powered USB hub. A Windows Mobile device cannot present its own file system as a mass-storage device unless the device implementer adds that functionality; however, third-party applications add MSC emulation to most WM devices (the commercial Softick CardExport and the free WM5torage). Generally only memory cards, not internal storage, can be exported, due to file-system issues; see device access, below.
The AutoRun feature of Windows worked on all removable media, allowing USB storage devices to become a vector for computer viruses. Beginning with Windows 7, Microsoft limited AutoRun to CD and DVD drives, and issued updates applying the same restriction to previous Windows versions.
MS-DOS
MS-DOS and most compatible operating systems did not include support for USB. Third-party generic drivers, such as Duse, USBASPI and DOSUSB, are available to support USB mass |
https://en.wikipedia.org/wiki/Cub%20Koda | Michael John "Cub" Koda (born October 1, 1948 – July 1, 2000) was an American rock and roll singer, guitarist, songwriter, disc jockey, music critic, and record compiler. Rolling Stone magazine considered him best known for writing the song "Smokin' in the Boys Room", recorded by Brownsville Station, which reached number 3 on the 1974 Billboard chart. He co-wrote and edited the All Music Guide to the Blues, and Blues for Dummies, and selected a version of each of the classic blues songs on the CD accompanying the book. He also wrote liner notes for the Trashmen, Jimmy Reed, J. B. Hutto, the Kingsmen, and the Miller Sisters, among others.
Early life and career
Koda was born in Detroit, Michigan, and graduated from Manchester High School, in Manchester, Michigan. He became interested in music as a boy, learning drums by the age of 5, and by the time he was in high school he had formed his own group, the Del-Tinos, which played rockabilly, rock and roll, and blues. The band released its first single, "Go Go Go" (a version of a Roy Orbison recording), in the fall of 1963. They released two more singles but broke up in 1966, when Koda wanted to pursue other options.
Koda spent a year attending Northern Michigan University in Marquette, Michigan. Koda formed and reformed several bands at this time with other musicians in the area. His original songs and over the top performances drew crowds everywhere the band played. His habit of playing a guitar with a 'y' cord plugged into two Fender Twin Reverb amps gave him plenty of volume. He also played harp and slide guitar. After a year Koda decided college wasn't his thing and moved to Las Vegas where he worked as a sideman. This was his springboard to forming Brownsville Station. The last incarnation of his backing band regrouped as Walrus and became a local and midwest institution in their own right.
Brownsville Station
Koda then worked as a solo artist, releasing two singles, "I Got My Mojo Workin'" and "Ramblin' on My Mind", and working with a couple of bands, before forming Brownsville Station in 1969. Formed in Ann Arbor, Michigan, Brownsville Station also included drummer T. J. Cronley, bassist Tony Driggins, guitarist Mike Lutz, and later Bruce Nazarian and Henry Weck. The group was influenced by Chuck Berry, Bo Diddley, the Who, Jerry Lee Lewis, and Link Wray.
The band began performing throughout the American Midwest and released several singles before getting noticed. They released their first album in 1970. The 1973 single "Smokin' in the Boys Room" remains their best-known song. It went to number 3 on the Billboard Hot 100 chart and eventually sold over two million copies. They continued to perform until disbanding in 1979.
Other recordings by Brownsville Station include "The Martian Boogie", "I Get So Excited", "Rock & Roll Holiday", "Hey Little Girl", "Mama Don't Allow No Parkin'", "I Got It Bad for You", "Kings of the Party", "I'm the Leader of the Gang", "Let Your Yeah Be Yeah" (the tit |
https://en.wikipedia.org/wiki/Dan%20Dickau | Daniel David Dickau (born September 16, 1978) is an American former professional basketball player who currently works as an on-air broadcaster for ESPN, the Pac-12 Network, CBS Sports Network and Westwood One. He is also a co-host of the Dickau and Slim Show on Spokane's 700 ESPN with Sean "Slim" Widmer.
Early life and college
Born in Portland, Oregon, Dickau graduated from Prairie High School in nearby Brush Prairie, Washington. He enrolled at the University of Washington in Seattle in 1997 and played for the Huskies under head coach Bob Bender. Dickau fractured his heel 13 games into the 1998–99 season and announced his decision to transfer in April.
He enrolled at Gonzaga University in Spokane and sat out the 1999–2000 season as a transfer, a de facto redshirt year. He was a standout point guard for the Bulldogs for two seasons under head coach Mark Few and was named a first-team All-American his senior year in 2002.
NBA career
Player
Dickau was selected in the first round of the 2002 NBA draft by the Sacramento Kings, the 28th overall pick. He was traded eight times and wore various jersey numbers in his six-year NBA career:
to the Atlanta Hawks (#12) on June 26, 2002 (on draft night for a first-round pick);
to the Portland Trail Blazers (#7) on February 9, 2004 (Rasheed Wallace trade);
to the Golden State Warriors (#10) on July 20, 2004 (Nick Van Exel trade);
to the Dallas Mavericks (#21) on August 24, 2004 (Erick Dampier trade);
to the New Orleans Hornets (#2) on December 3, 2004 (Darrell Armstrong trade);
to the Boston Celtics (#20) on October 1, 2005 (New Orleans received a second-round draft pick);
to the Portland Trail Blazers (for a second time, via a trade involving former teammate Theo Ratliff) (#2) on June 28, 2006, and
to the New York Knicks (#1) on June 28, 2007 (Zach Randolph and Steve Francis trade).
For two years in a row, Dickau was traded in a draft-day trade package, first from the Celtics to the Trail Blazers, then from the Trail Blazers to the Knicks.
Dickau's best season came in 2004–05 with the New Orleans Hornets, where he saw significant playing time and led the team in total assists, total steals, and 3-pointers made. During the season, he scored 20 or more points in seven games and had five double-doubles.
On December 17, 2005, as a member of the Celtics, his season was ended by a ruptured Achilles tendon sustained while playing against the Chicago Bulls. At the time, he was averaging 3.3 points per game and 2.1 assists per game. On June 28, 2006, the Celtics traded Dickau, center Raef LaFrentz and the 7th pick in the 2006 NBA draft to the Trail Blazers for center Theo Ratliff and guard Sebastian Telfair. Dickau was then sent to the Knicks along with Randolph, only to be waived when the Knicks acquired Jared Jordan. Two days later, Dickau signed with the Clippers.
On October 1, 2008, Dickau signed with the Golden State Warriors. Terms of the agreement were not disclosed per team policy. He played in two preseaso |
https://en.wikipedia.org/wiki/SMB3 | SMB3 may refer to:
Server Message Block version 3, a network protocol in computing
Super Mario Bros. 3, a 1988 video game
Super Mega Baseball 3, an entry in the Super Mega Baseball video game series |
https://en.wikipedia.org/wiki/Flicker%20fixer | A flicker fixer or scan doubler is a piece of computer hardware that de-interlaces an output video signal. It does so by re-timing the natively interlaced signal to suit a progressive-scan display, e.g. a CRT computer monitor. Flicker fixers in essence create one progressive frame of video from two interlaced fields of video.
Flicker fixers sample the NTSC/PAL output from the output device and store each scan line from the field currently being displayed in RAM, while simultaneously outputting each line alternately with the corresponding neighboring lines from the previously stored field (weaving). Some more advanced flicker fixers integrated into add-on graphics cards use more sophisticated methods. Outputting the image at double scan rate essentially composes a progressive display with all lines from both fields at the full vertical refresh rate. This raises the horizontal frequency of the signal from 15.734 kHz to 31.47 kHz (in the NTSC case; the PAL numbers are slightly lower), which can then be used to drive a VGA monitor from the output device.
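The weave step can be modeled in a few lines of Python. This is a toy sketch operating on whole fields held as lists of scanlines; an actual flicker fixer does the same interleaving line by line in hardware:

```python
def weave(top_field, bottom_field):
    """Interleave two fields (lists of scanlines) into one progressive frame.

    The top field supplies output lines 0, 2, 4, ... and the bottom
    field lines 1, 3, 5, ... — the simple 'weave' method. Emitting
    twice as many lines per field period is what doubles the
    horizontal scan rate.
    """
    assert len(top_field) == len(bottom_field)
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# Two 2-line fields woven into one 4-line progressive frame
frame = weave([[10, 10], [30, 30]], [[20, 20], [40, 40]])
```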
Use with the Amiga
One computer capable of producing an interlaced image is the Amiga, whose default video mode is PAL or NTSC. NTSC and PAL interlaced screens have two fields, odd and even, which alternate every 60th of a second on NTSC or every 50th on PAL. This allows for higher resolution while using a narrower signal bandwidth than full 50 or 60 FPS progressive video would require, but it can also produce a distracting jittering effect on graphics with high-contrast detail between fields. NTSC/PAL compatibility gave the Amiga a distinct edge in uses such as television production and gaming, but because the original Amigas could not produce vertically high-resolution displays without flickering, they were unsuitable for office-like usage where one might need to work with a clear high-resolution image for several hours. Flicker fixers were devised to remedy this.
A later iteration of the Amiga, the Amiga 3000, had a custom chip called Amber which could perform flicker fixing on any signal. The ECS and AGA chipsets could also output VGA display modes. Commodore offered the A2320 Display Enhancer Board for this purpose; the board fit neatly in a video graphics adapter slot on the A2000 series computer. It supported the new video modes offered by the Enhanced Chip Set (ECS) and AmigaOS 2.0, including the Productivity mode. The earlier A2024 'Hedley' greyscale monitor also featured an integrated flicker fixer, supporting up to 8 shades of grey.
Amiga
Computing output devices |
https://en.wikipedia.org/wiki/QCAD | QCAD is a computer-aided design (CAD) software application for 2D design and drafting. It is available for Linux, Apple macOS, Unix and Microsoft Windows. The QCAD GUI is based on the Qt framework.
QCAD is partly released under the GNU General Public License. Precompiled packages are available for 32-bit and 64-bit Linux platforms, Microsoft Windows OS and macOS.
QCAD is developed by RibbonSoft. Development on QCAD began in October 1999, starting with code from CAM Expert. QCAD 2, designed to "make QCAD more productive, more user friendly, more flexible and increase its compatibility with other products" began development in May 2002. QCAD 3 was first released in August 2011 with an ECMAScript (JavaScript) interface as major addition.
Some of the interface and concepts behind QCAD are similar to those of AutoCAD.
QCAD uses the DXF file format internally and to save and import files. Support for the popular DWG file format is available as a commercial plugin based on the Open Design Alliance DWG libraries.
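DXF itself is a plain-text format of alternating group codes and values, which is part of what makes it convenient for interchange. As a hedged illustration (this is not QCAD's own writer, and the output is a minimal fragment rather than a fully spec-conformant file, though many readers accept an ENTITIES-only document), a single-line drawing can be emitted like this:

```python
def minimal_dxf_line(x1, y1, x2, y2):
    """Return text for a bare-bones DXF containing a single LINE entity.

    DXF interleaves group codes with values, one per line: code 0
    starts an entity or section marker, code 8 names a layer, and
    codes 10/20 and 11/21 carry the X/Y of the two endpoints.
    """
    pairs = [
        (0, "SECTION"), (2, "ENTITIES"),
        (0, "LINE"), (8, "0"),          # entity on layer "0"
        (10, x1), (20, y1),             # start point X, Y
        (11, x2), (21, y2),             # end point X, Y
        (0, "ENDSEC"), (0, "EOF"),
    ]
    return "\n".join(f"{code}\n{value}" for code, value in pairs) + "\n"

doc = minimal_dxf_line(0.0, 0.0, 100.0, 50.0)
```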
Starting with version 3.7, QCAD is distributed either as a Professional trial that works for a limited time, or as a source-only Community Edition, which users must compile themselves or strip of the QCAD Professional add-ons that run in trial mode.
Although much of the software source is under the GPL-3.0-or-later license there is also significant functionality not available under a free software license.
QCAD has a large library of different templates.
Multiplatform support
QCAD operates on Linux, macOS (10.7 Lion or later), UNIX (FreeBSD, NetBSD), Solaris for x86 and SPARC, and Windows (8, 7, Vista, XP, 2000). This is of importance for collaboration across a diverse computing environment.
See also
Comparison of computer-aided design software
LibreCAD, a fork of QCAD 2.0
References
External links
A good tutorial for QCAD
QCAD downloads for all platforms
QCAD Community Forums
KAD (KDE aided design), a version of QCAD for the KDE Linux desktop
2D Computer-aided design software
Computer-aided design software for Linux
Engineering software that uses Qt
Free computer-aided design software
Free educational software
MacOS computer-aided design software |
https://en.wikipedia.org/wiki/MGC | MGC can refer to:
Machine Gun Corps
Malvern Girls College
Massachusetts General Court, legislature of that U.S. state
Media Gateway Controller, a device in Voice over IP networks
Medical Grade Cannabis
Megestrol caproate, a progestin
Mekong-Ganga Cooperation
Melbourne Girls' College
MGC, a British sports car
Michael Gordon Clifford
Middle Georgia College
Midsize Gas Carrier, a type of ship
Midwest Gaming Classic, video game trade show in Wisconsin
Mississippi Gulf Coast
Mitsubishi Gas Chemical Company, Inc.
Museums and Galleries Commission
The Mircom Group of Companies
The post-nominal for recipients of the British Columbia Medal of Good Citizenship |
https://en.wikipedia.org/wiki/%2425%20Million%20Dollar%20Hoax | $25 Million Dollar Hoax is an unscripted television series that was originally shown on American network NBC in November 2004.
The series is similar in style to Fox's My Big Fat Obnoxious Fiance, which aired in 2003. It is based on a United Kingdom show titled The Million Pound Hoax, broadcast on Sky One earlier that year.
Synopsis
In the series, Chrissy Sanford plays a hoax on her family, convincing them that she has won a $25,000,000 lottery prize through the internet and that the money has changed her from a sweet girl into a spendaholic.
$25 Million Dollar Hoax contained guest appearances by Ed McMahon, George Gray, and N*SYNC's Lance Bass.
Chrissy successfully pulled off the hoax, which won her and her family over $400,000 in cash and prizes.
Cast
Chrissy Sanford – As herself
Guy Sanford (father) – As himself
Lois Sanford (mother) – As herself
Paul Sanford (brother) – As himself
Eric Sanford (brother) – As himself
Andrew Sanford (brother) – As himself
David Sanford (brother) – As himself
Phillip Sanford (brother) – As himself
Matthew Sanford (brother) – As himself
Jameson Karns (friend) – As himself
Ed McMahon – As himself
George Gray – As himself
Lance Bass – As himself
Overseas broadcasts
Canada: Global
References
External links
2004 American television series debuts
2004 American television series endings
2000s American reality television series
American television series based on British television series
English-language television shows
NBC original programming
Television series by Reveille Productions |
https://en.wikipedia.org/wiki/TMD | TMD may refer to:
Arts and entertainment
Team Deathmatch, a gaming mode
Telemundo, a Spanish-language television network
The Medic Droid, a synthpop band
"Truly Madly Deeply", a 1997 song by Savage Garden
Military
Texas Military Department, United States
Theater missile defense, targeting medium-range missiles
Science
Temporomandibular joint dysfunction, a jaw condition
Transmembrane domain, part of proteins
Transition metal dichalcogenide monolayers, a thin semiconductor material
Transverse momentum distributions, in particle physics experiments
Technology
Thorn Microwave Devices, a brand of Thorn Electrical Industries
Tip-magnetic driving, a fan technology
Tuned mass damper, a vibration absorber
Other uses
Traction maintenance depot, a rail engine shed
Trade Marks Directive, a European Union law
Tottenham Mandem, an English street gang |
https://en.wikipedia.org/wiki/2HD | 2HD is an Australian radio station in New South Wales. Owned and operated as part of the Super Radio Network of stations, it currently broadcasts a news talk and classic hits format to Newcastle, New South Wales and the Hunter Valley. First broadcast on 27 January 1925, it was established by founder Harry Douglas, from whose initials the station derives its name, and is based in studios in Sandgate alongside sister station New FM.
2HD is Australia's second oldest existing radio station, first going to air just hours after Sydney's 2UE.
History
1925-1945
2HD began broadcasting on 27 January 1925, a day after 2UE in Sydney began operations. The station's call sign was the initials of the founder, Hugh Alexander McKay Douglas (known as Harry), not "Hunter District" as is commonly believed. Douglas was a man ahead of his time. He had been a keen amateur radio enthusiast for some years prior to 1925, and an alderman on the Newcastle City Council from 1919 to 1922. He was also the first person to own a sulky and car tyre retreading business in Newcastle, as well as having the first petrol station in Newcastle with a bowser.
The radio station was originally situated in the Newcastle suburb of Hamilton, but soon after moved to the corner of Darby and King Streets (Douglas's tyre business address). Douglas sold the station to William Johnston in 1928, who sold the station to the "Airsales Broadcasting Company" in 1930. Airsales owned the company for 10 years, and was responsible for the move to its landmark studio building in Sandgate, which was the home of 2HD for nearly 50 years. Although the building is now very different, the middle section is still the 1931 building.
Under controversial circumstances during World War II, 2HD was closed in 1941, under the National Security Regulations. At the time, the station was owned by the Jehovah's Witnesses and around 25 staff were employed. Stories circulated that the owners were sending covert messages to enemy agents, based on the names and times of the music being played. 2HD remained silent until near the end of the war, when the Australian Labor Party and the Labor Council of New South Wales bought the station, resuming transmissions on 15 January 1945.
One of 2HD's notable personalities in the 1930s was "Uncle" Rex Sinclair, who continued to perform on local radio and stage until shortly before his death in 2001.
1945-1977
The Labor Party and the Labor Council of New South Wales owned 2HD from 1945 until 1999. For the first 29 years of this ownership, the station was under the management of Jim Storey, with his wife Twink acting as program director and on-air personality. Other announcers during this time included Harry Randall, Stuart Dibbley and Tom Delaney.
In the late 1950s and early 1960s, 2HD was one of the founding shareholders of local television station NBN Television.
2HD broadcast in the popular The Good Guys of Life format, also used by other stations, including 2SM Sydney. Presenters during this time inc |
https://en.wikipedia.org/wiki/Tom%20Knight%20%28scientist%29 | Tom Knight is an American synthetic biologist and computer engineer, who was formerly a senior research scientist at the MIT Computer Science and Artificial Intelligence Laboratory, a part of the MIT School of Engineering. He now works at the synthetic biology company Ginkgo Bioworks, which he cofounded in 2008.
Work in electrical engineering and computer science
Tom Knight first came to MIT at fourteen: living close to the university, he took classes in computer programming and organic chemistry while still in high school, though he did not begin his undergraduate studies until the usual age of 18. He built early hardware such as ARPANET interfaces for host #6 on the network, some of the first bitmapped displays, the ITS time-sharing system, Lisp machines (he was also instrumental in releasing a version of the Lisp machine operating system under a BSD license), the Connection Machine, and parallel symbolic-processing computer systems.
In 1967 Knight wrote the original kernel for the ITS operating system, as well as the combination of command processor and debugger that was used as its top-level user interface. ITS was the dominant operating system for first Project MAC and later the MIT Artificial Intelligence Laboratory and MIT Laboratory for Computer Science. ITS ran on PDP-6 and, later, PDP-10 computers.
In 1968, Knight designed and supervised the construction of the first PDP-10 ARPANET interfaces with Bob Metcalfe.
Knight developed a system to use standard television sets as a terminal interface to the PDP-10.
In 1972, Knight designed one of the first semiconductor memory-based bitmap displays. This was later commercialized and led directly to the development of the Bedford Computer Systems newspaper layout system and influenced many of the bitmapped display devices available today. That same year, along with Jeff Rubin, Knight designed and implemented a network file system that provided the first transparent remote file access over the ARPANET.
In 1974, Knight designed and implemented the prototype version of the MIT Lisp Machine processor, with the production version following in 1976. The Lisp Machine was a microprogrammed machine, tuned for high-performance emulation of other instruction sets. The design of the Lisp Machine was directly implemented by both Symbolics and LMI and was the basis of all of their computers. Texas Instruments implemented surface mount and single-chip versions of the architecture in 1983 and 1987, respectively.
Knight collaborated with Jack Holloway in designing and implementing the Chaosnet, a re-engineered version of the Xerox 3 Mbit/s Ethernet. In 1975 this network became the first local area network on MIT's campus. Chaosnet's innovation of a preamble bit string for packets was eventually incorporated into the 10 Mbit/s Ethernet standard.
In 1980, Knight participated in the development of the Connection Machine architecture and its original implementation. Other notable |
https://en.wikipedia.org/wiki/History%20of%20mobile%20phones | The history of mobile phones covers mobile communication devices that connect wirelessly to the public switched telephone network.
While the transmission of speech by signal has a long history, the first devices that were wireless, mobile, and also capable of connecting to the standard telephone network are much more recent. The first such devices were barely portable compared to today's compact hand-held devices, and their use was clumsy.
Drastic changes have taken place in both the networking of wireless communication and the prevalence of its use, with smartphones becoming common globally and a growing proportion of Internet access now done via mobile broadband.
Foundations
Predecessors
In 1908, Professor Albert Jahn and the Oakland Transcontinental Aerial Telephone and Power Company claimed to have developed a wireless telephone. They were accused of fraud, and although the charge was later dropped, they do not appear to have proceeded with production. In 1917 the Finnish inventor Eric Tigerstedt successfully filed a patent for a "pocket-size folding telephone with a very thin carbon microphone". Beginning in 1918, the German railroad system tested wireless telephony on military trains between Berlin and Zossen, and in 1924 public trials started with telephone connections on trains between Berlin and Hamburg. In 1925 a company was founded to supply train-telephony equipment, and in 1926 telephone service in trains of the Deutsche Reichsbahn and the German mail service on the route between Hamburg and Berlin was approved and offered to first-class travelers.
Fiction anticipated the development of real-world mobile telephones. In 1906 the English caricaturist Lewis Baumer published a cartoon in Punch entitled "Forecasts for 1907" in which he showed a man and a woman in London's Hyde Park each separately engaged in gambling and dating on wireless-telegraphy equipment. Cartoonist W. K. Haselden published The Pocket Telephone: When Will It Ring? in 1919, depicting six awkward possibilities. In 1923 Ilya Ehrenburg casually listed "pocket telephones" among the achievements of contemporary technology in a story in his collection Thirteen Pipes (). In 1926 the artist Karl Arnold drew a visionary cartoon about the use of mobile phones in the street, in the picture "wireless telephony", published in the German satirical magazine Simplicissimus. The popular American cartoon detective Dick Tracy acquired a two-way, atomic-battery-powered wrist radio in 1946, upgraded to a wrist TV in 1964.
The Second World War (1939-1945) saw the military use of radio-telephony links. Hand-held radio transceivers have been available since the 1940s. Mobile telephones for automobiles became available from some telephone companies in the 1940s. Early devices were bulky, consumed large amounts of power, and the network supported only a few simultaneous conversations. (Modern cellular networks allow automatic and pervasive use of mobile phones for voice- and data-communic |
https://en.wikipedia.org/wiki/Lance%20Williams%20%28graphics%20researcher%29 | Lance J. Williams (September 25, 1949 – August 20, 2017) was a prominent graphics researcher who made major contributions to texture map prefiltering, shadow rendering algorithms, facial animation, and antialiasing techniques. Williams was one of the first people to recognize the potential of computer graphics to transform film and video making.
Williams died at 67 years old on August 20, 2017, after a battle with cancer. He is survived by his wife and two children.
Education
Williams was an Honors student majoring in English with a minor in Asian Studies at the University of Kansas and graduated with a B.A. in 1972. While a student at KU he competed in collegiate chess tournaments and is said to have had a rating of 1800. He was drawn to the University of Utah by a "Humanistic Computation" summer seminar held by Jef Raskin at KU. He joined the graduate Computer Science program at the University of Utah in 1973 and studied computer graphics and animation under Ivan Sutherland, David Evans, and Steven Coons. At this time in the early 1970s, the University of Utah was the hub for much of the pioneering work being done in computer graphics. Lance left Utah (having completed his PhD course work and exams except the writing of a thesis) in 1977 to join the New York Institute of Technology (NYIT). While at NYIT, Williams invented the mipmapping technique for texture filtering, which is ubiquitously used today by graphics hardware for PCs and video games, and wrote and directed the abandoned project The Works which would have been the first entirely 3D CGI film had it been finished in the early 1980s as intended.
Williams was awarded his PhD in 2000 from the University of Utah based on a rule allowing someone who published three seminal papers in his field to bind them together as his thesis. The three papers are Casting Curved Shadows on Curved Surfaces (1978), Pyramidal Parametrics (1983) and View Interpolation for Image Synthesis (1993).
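The mipmapping technique introduced in Pyramidal Parametrics can be sketched briefly: a texture is prefiltered into a pyramid of progressively smaller images so that a renderer can sample a level whose resolution roughly matches the texture's on-screen size. A toy sketch of building such a pyramid (real implementations use better filters and handle non-square and non-power-of-two textures):

```python
def build_mip_chain(image):
    """Build a mip pyramid by repeated 2x2 box-filter downsampling.

    `image` is a square 2^n x 2^n grid (list of lists) of grey values.
    Each level averages 2x2 blocks of the previous one, halving both
    dimensions, down to a single 1x1 level.
    """
    levels = [image]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        size = len(prev) // 2
        levels.append([
            [(prev[2*y][2*x] + prev[2*y][2*x+1] +
              prev[2*y+1][2*x] + prev[2*y+1][2*x+1]) / 4.0
             for x in range(size)]
            for y in range(size)
        ])
    return levels

# A 4x4 texture yields three levels: 4x4, 2x2, 1x1
chain = build_mip_chain([[0, 0, 8, 8],
                         [0, 0, 8, 8],
                         [4, 4, 4, 4],
                         [4, 4, 4, 4]])
```

At render time the sampler picks (or blends between) levels based on how much the texture is minified, which is what suppresses the aliasing that point-sampling a full-resolution texture would cause.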
Professional career
Williams worked at the New York Institute of Technology (NYIT) from 1976 to 1986 on research and commercial animation and on the development of shadow mapping and "mip" texture mapping. Subsequently Williams consulted for Jim Henson Associates, independently developed facial tracking for computer animation, and, starting in 1987, worked for six years in Apple Computer's Advanced Technology Group. While there he collaborated with Eric Chen to pioneer early image-based rendering work, developed "Virtual Integral Holography" (with Dan Venolia), created 3D paint systems and contributed to QuickTime VR. He pioneered work in motion-capture facial animation systems for over 20 years. In 1997, Williams joined DreamWorks SKG. In 2002 he became Chief Scientist at Walt Disney Animation Studios. In 2006, Williams joined Google and worked with the Google Geo Group (Maps and Earth). In 2008 he was a Principal Member of Research Staff at Nokia, and as of 2012 he joined NVIDIA Research.
Publications
• “Shadows for |
https://en.wikipedia.org/wiki/IPUMS | Integrated Public Use Microdata Series (IPUMS) is the world's largest individual-level population database. IPUMS consists of microdata samples from United States (IPUMS-USA) and international (IPUMS-International) census records, as well as data from U.S. and international surveys. The records are converted into a consistent format and made available to researchers through a web-based data dissemination and analysis system.
IPUMS is housed at the Institute for Social Research and Data Innovation (ISRDI), an interdisciplinary research center at the University of Minnesota, under the direction of Professor Steven Ruggles.
Description
IPUMS includes all persons enumerated in the United States Censuses from 1850 to 2020 (though the 1890 census is missing because it was destroyed in a fire), as well as respondents to the American Community Survey since 2000 and the Current Population Survey since 1962. IPUMS includes only household-level data for the United States Censuses from 1790 to 1840, because the first six censuses recorded only the name of the head of household, followed by tallied household totals. IPUMS provides consistent variable names, coding schemes, and documentation across all the samples, facilitating the analysis of long-term change.
IPUMS-International includes countries from Africa, Asia, Europe, and Latin America for 1960 forward. The database currently includes more than a billion individuals enumerated in 365 censuses from 94 countries around the world. IPUMS-International converts census microdata for multiple countries into a consistent format, allowing for comparisons across countries and time periods. Special efforts are made to simplify use of the data while losing no meaningful information. Comprehensive documentation is provided in a coherent form to facilitate comparative analyses of social and economic change.
Additional databases in the IPUMS family include the:
North Atlantic Population Project (NAPP)
IPUMS National Historical Geographic Information System (NHGIS)
IPUMS Health Surveys
IPUMS Global Health
IPUMS Time Use
The Journal of American History described the effort as "One of the great archival projects of the past two decades." Liens Socio, the French portal for the social sciences, gave IPUMS the only "best site" designation to go to any non-French website, writing "IPUMS est un projet absolument extraordinaire... époustouflant!" ("IPUMS is an absolutely extraordinary project... mind-blowing!").
The official motto of IPUMS is "use it for good, never for evil." All IPUMS data and documentation are available online free of charge.
References
External links
Institute for Social Research and Data Innovation
Databases
Demographics
University of Minnesota |
https://en.wikipedia.org/wiki/KNBC | KNBC (channel 4) is a television station in Los Angeles, California, United States, serving as the West Coast flagship of the NBC network. It is owned and operated by the network's NBC Owned Television Stations division alongside Corona-licensed Telemundo outlet KVEA (channel 52). Both stations share studios at the Brokaw News Center in the northwest corner of the Universal Studios Hollywood lot off of Lankershim Boulevard in Universal City, while KNBC's transmitter is located on Mount Wilson.
History
Channel 4 first went on the air as KNBH (standing for "NBC Hollywood") on January 16, 1949. It was the second-to-last VHF station in Los Angeles to debut, and the last of NBC's five original owned-and-operated stations to sign on. Unlike the other four, KNBH did not benefit from having a sister radio station. Though the NBC Radio Network had long been affiliated with KFI in Los Angeles, that relationship did not extend into television when KFI-TV (channel 9, now KCAL-TV) signed on in August 1948. When KNBH signed on, it marked the debut of NBC programs on the West Coast. Channel 4 originally broadcast from the NBC Radio City Studios at Sunset Boulevard and Vine Street in Hollywood.
The station changed its call sign to KRCA (for NBC's then-parent company, the Radio Corporation of America) on October 18, 1954. The call letters were changed again on November 11, 1962, when NBC moved the KNBC identity from its San Francisco radio station (which became KNBR) and applied it to channel 4 in Los Angeles. That change coincided with the station's physical relocation from NBC Radio City to the network's color broadcast studio facility in suburban Burbank. NBC Color City, as it was then known, had been in operation since March 1955; it was four to five times larger than Radio City and could easily accommodate KNBC's locally produced studio programming. NBC Radio's West Coast operations eventually followed channel 4 to Burbank not long after.
KNBC
The station officially modified its callsign to KNBC-TV in August 1986, shortly after NBC and RCA were purchased by General Electric; the -TV suffix was dropped effective September 6, 1995.
On October 11, 2007, NBCUniversal announced that it would put its Burbank studios up for sale and construct a new, all-digital facility near the Universal Studios Hollywood backlot in Universal City, to merge all of NBCUniversal's West Coast operations (including KNBC, KVEA and NBC News' Los Angeles bureau) into one area. The studio opened on February 1, 2014. Shortly thereafter, NBCUniversal named the new broadcast center the Brokaw News Center in honor of former KNBC and NBC News anchor and reporter Tom Brokaw.
In fall 2007, with the rollout of digital broadcasting, the station began airing News Raw, a 24/7 news channel, on its .2 subchannel.
KNBC shut down its analog signal, over VHF channel 4, on June 12, 2009, as part of the federally mandated |
https://en.wikipedia.org/wiki/Dynamic%20link%20matching | Dynamic link matching is a graph-based system for image recognition. It uses wavelet transformations to encode incoming image data.
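Implementations of dynamic link matching typically encode local image structure as "jets" of Gabor wavelet responses at each graph node; below is a rough sketch of that encoding step. The kernel size, wavelengths and orientations are arbitrary illustrative choices, not values from the original papers.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """2-D Gabor kernel: a cosine grating windowed by a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)  # coordinate along the grating
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def gabor_jet(image, pos, wavelengths, orientations, size=9, sigma=3.0):
    """Feature vector ("jet") of Gabor filter responses at one image location."""
    r, c = pos
    half = size // 2
    patch = image[r - half:r + half + 1, c - half:c + half + 1]
    return np.array([
        np.sum(patch * gabor_kernel(size, lam, th, sigma))
        for lam in wavelengths
        for th in orientations
    ])

# Toy usage: one jet (2 wavelengths x 2 orientations) at the image centre.
img = np.random.default_rng(0).random((32, 32))
jet = gabor_jet(img, (16, 16), wavelengths=[4, 8], orientations=[0, np.pi / 2])
print(jet.shape)
```

In a full system, jets computed on a grid over a model image are matched against jets in a test image, with graph links constraining the geometric layout.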
References
External links
Original paper on Dynamic Link Matching
Wavelets
Pattern recognition
Graph algorithms |
https://en.wikipedia.org/wiki/Killzone%20%28video%20game%29 | Killzone is a first-person shooter video game developed by Guerrilla Games and published by Sony Computer Entertainment for the PlayStation 2. It was originally released on 2 November 2004 in North America and 26 November 2004 in Europe. The game was remastered in HD by Supermassive Games and re-released within the Killzone Trilogy for PlayStation 3 as well as a standalone PSN title in 2012.
Killzone takes place in the middle of the 24th century and chronicles the war between two human factions: the Vektans and the Helghast. The game is played from a first-person view and follows Jan Templar, a high-ranking officer within the Interplanetary Strategic Alliance, as he battles invading Helghast forces on his homeworld of Vekta.
Prior to its release, Killzone was heavily anticipated, with several publications considering it to be Sony's "Halo killer". Upon release, however, the game met with mixed responses: critics praised the visuals, sound, and music, but criticized the gameplay, AI, and technical issues. Despite the mixed reaction, Killzone spawned numerous sequels, beginning with a direct sequel, Killzone 2.
Plot
Killzone takes place in a fictional world set in the year 2357. After nuclear war rendered much of the Earth uninhabitable in 2055, world governments formed an international order known as the United Colonial Nations. Partnering with private firms, the UCN moved to establish human colonies in Alpha Centauri, a system occupied by two planets: Vekta, a rich Earth-like world (named after the CEO of the mining conglomerate Helghan, Philip Vekta), and Helghan, a barren wasteland named after the same company. The Helghan Corporation sought to buy ownership of Vekta as well, but when the UCN imposed sanctions against its unfair business practices, a war broke out (known as the First Extrasolar War), which led to the ISA, the military arm of the UCN, driving the company out of Vekta. In response, the exiled colonists established their own civilization on Helghan, built on the principles of militarism and authoritarianism. The harsh environment and atmosphere killed many Helghans, forcing the survivors to use respirators and air tanks just to breathe. Eventually, the population, now known as the Helghast, mutated into pale-skinned hairless humanoids with increased strength, stamina, and intelligence. Violently xenophobic and convinced of their superiority, the Helghast consider humans to be beneath them, and dream of one day reconquering Vekta and expanding their empire to Earth and the neighboring star systems.
Story
Scolar Visari, emperor of Helghan, sends the Helghast Third Army to launch a secret invasion of Vekta. Alerted to the attack, the ISA attempt to prevent it with their SD (Solar Defense) network, but are unable to activate it in time to stop the invaders. With the element of surprise on their side, the Helghast quickly overwhelm the unprepared ISA ground forces and capture several strategic locations, including ISA C |
https://en.wikipedia.org/wiki/Jesse%20B.%20Aikin | Jesse Bowman Aikin (1808–1900) was a shape note "singing master", and compiler of the shape note tunebook The Christian Minstrel. He was born in Chester County, Pennsylvania and lived on a farm in Hatfield, Pennsylvania. Aikin, a member of the Church of the Brethren, was the first to successfully produce a song book (The Christian Minstrel) with a seven-shape note system, in 1846. He vigorously defended his "invention" and his patent, which included the elimination of bass and treble clefs and the simplification of time signatures. After the influential Ruebush & Kieffer Publishing Company began using his notehead shapes around 1876 (previously they used Funk's shapes), the Aikin shapes eventually became the prevailing standard in shape note and gospel music publication, although few other compilers adopted his other innovations.
Aikin's names for the notes were originally written: Doe Ray Mee Faw Sole Law See. All the note stems pointed downwards, and the stems for Doe, Ray, and See were placed centrally on the shape, rather than to the side. These conventions were discarded by later users of his system, so as not to deviate so much from standard notation. The name See was also changed to Ti (as used in the Tonic sol-fa system), so as not to be confused with a sharpened Sol.
Today Aikin's system is still in use, though it is often referred to as the Aiken system, a spelling error introduced by George Pullen Jackson and perpetuated by the Sibelius music notation program.
Aikin's shapes
Publications
The Christian Minstrel, 1846
Harmonia Ecclesiæ, Companion to The Christian Minstrel, 1853
The Sabbath School Minstrel, 1859
See also
The Christian Harmony
References
Sing with Understanding, by Harry Eskew and Hugh McElrath, Broadman Press.
White Spirituals in the Southern Uplands, by George Pullen Jackson, University of North Carolina Press, 1933.
"Jesse B. Aikin and The Christian Minstrel", by Paul G. Hammond, American Music, Vol. 3, No. 4 (Winter, 1985), pp. 442–451.
External links
The Christian Minstrel (1852)
The Christian Minstrel (1858)
1808 births
1900 deaths
Songwriters from Pennsylvania
People from Chester County, Pennsylvania
Shape note
19th-century American singers |
https://en.wikipedia.org/wiki/Skinner%27s%20Sense%20of%20Snow | "Skinner's Sense of Snow" is the eighth episode of the twelfth season of the American animated television series The Simpsons. It first aired on the Fox network in the United States on December 17, 2000. In the episode, a snowstorm traps the students, Principal Seymour Skinner, and Groundskeeper Willie inside Springfield Elementary. When Skinner uses his army skills to control the students, they overthrow him and take over the school. Meanwhile, Homer and Ned set out to rescue the children using Ned's car.
"Skinner's Sense of Snow" was written by Tim Long and directed by Lance Kramer. While the episode's premise is based on an occurrence in Long's childhood, the setpiece came from staff writer Matt Selman. Because the episode takes place in winter, Kramer found it difficult to animate. It features references to Smilla's Sense of Snow, The Deer Hunter and Kristi Yamaguchi, among other things.
In its original broadcast, the episode was seen by approximately 8.8 million viewers, finishing in 33rd place in the ratings the week it aired. Following the home video release, the episode received mostly positive reviews from critics.
Plot
While the Simpson family attends a French Canadian circus called "Cirque de Purée", a violent thunderstorm strikes Springfield and forces an early end to the performance. The storm turns into a snow squall overnight, leading to the closure of nearly every local school and business. Springfield Elementary School remains open, but only a few students and faculty members show up since it is the day before Christmas break. To pass the time, Principal Skinner plays a long, low-budget, poor-quality Christmas film for the children. When class lets out at the end of the day, they discover that they are now trapped in the building by the snow blocking the doors and windows.
With the school's telephone service knocked out by the storm, Skinner tries to keep control over the children and begins to ration the available food. After Nelson tries and fails to escape, Skinner looks through his footlocker of memorabilia from his United States Army service and remembers when he was able to command respect from his subordinates. Hanging Nelson by his vest on a coat hook, Skinner threatens to do the same to the other children and briefly frightens them into submission. However, Bart defies Skinner and tries to tunnel his way out; Skinner stops him, but ends up half-buried in snow when the tunnel caves in. The children take Skinner captive and begin to run amok throughout the school.
Meanwhile, Homer and Ned decide to rescue the children, clearing the roads with an improvised snowplow built by attaching a section of Ned's roof to the front end of his car. The car skids out of control and crashes into a fire hydrant, which sprays water that freezes it in place. Homer attempts to break them loose by gunning the engine, with no success, and the car fills with carbon monoxide that causes both men to hallucinate wildly. Skinner sends out a cal |
https://en.wikipedia.org/wiki/She%20Used%20to%20Be%20My%20Girl | "She Used to Be My Girl" is the fourth episode of the sixteenth season of the American animated television series The Simpsons. It originally aired on the Fox network in the United States on December 5, 2004. It features actress Kim Cattrall from Sex and the City.
Plot
One day, Marge sees a friend from high school, Chloe Talbot, on TV and is jealous of her success as a news reporter. When they meet, an embarrassed Marge confesses she never left Springfield, but the two are glad to see each other again. Chloe comes to the Simpsons' house for dinner, but her exciting stories annoy Marge and inspire Lisa, who goes out to dinner with Chloe.
Marge reveals that she and Chloe were reporters for their high school newspaper, but after high school Marge stayed with her sweetheart Homer after Bart was born, with Chloe leaving her sweetheart Barney when he proposed. With all of Chloe's success, Marge seems to begin to resent both her decision and her family but receives supporting words from Homer.
On their way back from dinner, Chloe invites Lisa to the United Nations women's conference, with Lisa saying she would need parental permission. Upon arriving at the Simpsons house, a drunk Marge, who is worried that Lisa likes Chloe more, provokes Chloe and the two fight on the lawn. This leaves Marge with a black eye.
After Marge talks with Lisa about what happened, she forbids her to go to the women's conference, but Lisa sneaks out and hides in Chloe's car's trunk. Then, as Chloe drives off, her boss calls her, telling her to cover the story of the eruption of Springfield Volcano. When Lisa pops out of the trunk, Chloe has her be her cameraman after her original one fled at the sight of lava.
When Marge and Homer arrive at the women's conference to find Lisa, they see Chloe's live broadcast from the volcano, which credits Lisa behind the camera and shows the two trapped by a sea of lava. Marge and Homer race to the volcano, and Marge leaps from rock to rock to rescue Lisa. Moments later, Barney descends in a helicopter to rescue Chloe, who grants him a half hour of pity sex.
When Marge imagines her life as a reporter, she screams to her family, who show little interest.
Production
The episode was written by Tim Long and guest starred Kim Cattrall as Chloe Talbot.
Reception
In its original American broadcast, "She Used to Be My Girl" was viewed by 10.3 million people.
References
External links
The Simpsons (season 16) episodes
2004 American television episodes
Cultural depictions of Bob Dylan |
https://en.wikipedia.org/wiki/List%20of%20video%20games%20set%20in%20New%20York%20City | This article lists computer and video games in which a major part of the action takes place in New York City or a fictional city closely based on it.
List of games which feature New York City
List of games which feature a fictional city closely based on New York City
Ace Combat 6: Fires of Liberation (Xbox 360) is set in Gracemeria, based on New York City.
Batman: Arkham Knight (Windows, PS4, Xbox One) is set in Gotham City, which is roughly based on New York City: for example, it includes the Lady of Gotham which resembles the Statue of Liberty
Crazy Taxi 2 (Dreamcast) contains two cities, "Around Apple" and "Small Apple", which are both somewhat based on New York City. The latter is also included in Crazy Taxi 3: High Roller.
EarthBound (SNES) has a city called Fourside, which is also referred to as the Big Banana, a parody of New York's nickname, the Big Apple.
While the final version of Final Fantasy VII (PlayStation) does not feature anything that resembles New York City specifically, the initial concept was supposed to take place in New York City, and the idea of a realistic setting stuck in the final game.
Final Fight (arcade) is set in a fictional city called Metro City, which features areas based on New York City.
Futurama (PS2, Xbox), which is set in "New New York"
Various games in the Grand Theft Auto series set in Liberty City, a New York City look-alike
Grand Theft Auto (PC, PS, GBC)
Grand Theft Auto III (Windows, PS2, Xbox)
Grand Theft Auto Advance (GBA)
Grand Theft Auto: San Andreas (Windows, PlayStation 2, Xbox), featured one mission in Liberty City
Grand Theft Auto: Liberty City Stories (PlayStation 2, PSP)
Grand Theft Auto IV (Windows, PS3, Xbox 360)
GTA IV: The Lost and Damned (Windows, Xbox 360, PS3)
GTA IV: The Ballad of Gay Tony (Windows, Xbox 360, PS3)
Grand Theft Auto: Chinatown Wars (Nintendo DS, PSP)
inFamous (PS3) is set in Empire City, based on New York City.
Kingpin: Life of Crime (Windows, Linux) is set in Radio City, a city that resembles New York City in the art-deco era
Mafia (Windows, PlayStation 2, Xbox), set in Lost Heaven, a representation of New York City and Chicago in the 1930s
Mafia II (Windows, PlayStation 3, Xbox 360), set in Empire Bay, a representation of New York City in the late 1940s
Mario Kart 8 (Wii U, Nintendo Switch) features the track "Toad Harbor", which closely resembles New York City.
Mother 3 (GBA) features a city known as New Pork City. This city is also featured as a stage in Super Smash Bros. Brawl.
Pokémon Black and White and its sequels (DS) are set in the Unova region, modeled after New York City.
The Sonic the Hedgehog series features many cities based on New York City.
Streets of Rage is set in an unnamed city based on New York City; many prominent New York City landmarks, such as the World Trade Center, are visible throughout the game and on the European Mega Drive cover of Streets of Rage 3
Super Mario Odyssey (Nintendo Switch) features New Donk City that is based on New York, with |
https://en.wikipedia.org/wiki/GOMS | GOMS is a specialized human information processor model for human-computer interaction observation that describes a user's cognitive structure in terms of four components. In the book The Psychology of Human-Computer Interaction, written in 1983 by Stuart K. Card, Thomas P. Moran and Allen Newell, the authors introduce: "a set of Goals, a set of Operators, a set of Methods for achieving the goals, and a set of Selection rules for choosing among competing methods for goals."
GOMS is widely used by usability specialists and computer system designers because it produces quantitative and qualitative predictions of how people will use a proposed system.
Overview
A GOMS model is composed of methods that are used to achieve specific goals. These methods are then composed of operators at the lowest level. The operators are specific steps that a user performs and are assigned a specific execution time. If a goal can be achieved by more than one method, then selection rules are used to determine the method.
Goals are symbolic structures that define a state of affairs to be achieved and determine a set of possible methods by which it may be accomplished
Operators are elementary perceptual, motor or cognitive acts, whose execution is necessary to change any aspect of the user's mental state or to affect the task environment
Methods describe a procedure for accomplishing a goal
Selection rules are needed when a goal is attempted and more than one method is available to the user to accomplish it
There are several different GOMS variations which allow for different aspects of an interface to be accurately studied and predicted. For all of the variants, the definitions of the major concepts are the same. There is some flexibility for the designer's/analyst's definition of all of the entities. For instance, an operator in one method may be a goal in a different method. The level of granularity is adjusted to capture what the particular evaluator is examining.
For a simple applied example see CMN-GOMS.
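To make the operator-based timing concrete, here is a toy estimate in the style of the Keystroke-Level Model (KLM), a simplified GOMS variant. The operator times are the commonly cited KLM averages; the example method, deleting a word by double-clicking it and pressing Delete, is a hypothetical illustration, and a real analysis would calibrate the times experimentally.

```python
# Commonly cited KLM operator times in seconds (averages; real analyses
# calibrate these experimentally for the users and devices in question).
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def execution_time(operators):
    """Predicted execution time for a sequence of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical method: delete a word by double-clicking it (press/release
# twice), then returning to the keyboard and pressing the Delete key.
method = ["M", "H", "P", "B", "B", "B", "B", "M", "H", "K"]
print(f"Predicted time: {execution_time(method):.2f} s")
```

If a second method existed (say, selecting the word with shift-arrow keystrokes), a selection rule would pick between them, and the analyst would compare the two predicted times.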
Qualification
Advantages
The GOMS approach to user modeling has strengths and weaknesses.
While it is not necessarily the most accurate method to measure human-computer interface interaction, it does allow visibility of all procedural knowledge. With GOMS, an analyst can easily estimate a particular interaction and calculate it quickly and easily. This is only possible if the average Methods-Time Measurement data for each specific task has previously been measured experimentally to a high degree of accuracy.
Disadvantages
GOMS only applies to skilled users. It does not work for beginners or intermediates, whose errors can alter the data.
The model also does not cover learning the system, or a user returning to the system after a long period of disuse.
Another major disadvantage is the lack of accounting for errors: even skilled users make errors, but GOMS does not model them.
Mental workload is not addressed in the model, mak |
https://en.wikipedia.org/wiki/7400 | 7400 or variant, may refer to:
In general
A.D. 7400, a year in the 8th millennium CE
7400 BCE, a year in the 8th millennium BC
7400, a number in the 7000 (number) range
Electronics and computing
Texas Instruments 7400-series integrated circuits
PowerPC 7400 CPU chip
MITS 7400 Scientific and Engineering Calculator
Other uses
7400 Lenau, an asteroid in the Asteroid Belt, the 7400th asteroid registered
7400 (District of Kolonjë), one of the postal codes in Albania
GWR 7400 Class locomotives
See also
List of 7400-series integrated circuits |
https://en.wikipedia.org/wiki/Manhattan%20wiring | Manhattan wiring (also known as right-angle wiring) is a technique for laying out circuits in computer engineering. Inputs to a circuit (specifically, the interconnects from the inputs) are aligned into a grid, and the circuit "taps" (connects to) them perpendicularly. This may be done either virtually or physically. That is, it may be shown this way only in the documentation and the actual circuit may look nothing like that; or it may be laid out that way on the physical chip. Typically, separate lanes are used for the inverted inputs and are tapped separately.
The name Manhattan wiring refers to its Manhattan geometry: just as the streets of Manhattan, New York criss-cross in a very regular grid, the wires in such circuit diagrams run only horizontally and vertically.
Manhattan wiring is often used to represent a programmable logic array.
Alternatives include X-architecture wiring, or 45° wiring, and Y-architecture wiring (using wires running in the 0°, 120°, and 240° directions).
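The wirelength implications of these wiring styles can be compared through the corresponding distance metrics. The sketch below (illustrative, not from the article) computes the shortest wire length between two pins under rectilinear (Manhattan), octilinear (X-architecture) and unconstrained Euclidean routing:

```python
import math

def manhattan(dx, dy):
    """Wire length using only horizontal and vertical segments."""
    return dx + dy

def octilinear(dx, dy):
    """Wire length when 45-degree segments are also allowed (X-architecture):
    a diagonal run covers the shorter offset, a straight run covers the rest."""
    lo, hi = sorted((dx, dy))
    return math.sqrt(2) * lo + (hi - lo)

def euclidean(dx, dy):
    """Unconstrained straight-line length, a lower bound for any routing."""
    return math.hypot(dx, dy)

# Compare the three for a pin offset of (3, 4) grid units.
for f in (manhattan, octilinear, euclidean):
    print(f.__name__, round(f(3, 4), 3))
```

For the (3, 4) offset, Manhattan routing needs 7 units while octilinear needs about 5.24, which is why X-architecture wiring can shorten interconnects at the cost of a more complex layout grid.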
See also
Manhattan metric
References
Electronic circuits |
https://en.wikipedia.org/wiki/ODA | Oda or ODA may refer to:
Computing
Open Data Center Alliance, a cloud-computing standards organisation
Open Design Alliance, a CAD-promoting group
Optical Disc Archive, an archiving technology
Open Document Architecture and interchange format, a file format
Oracle Database Appliance, an Oracle Corporation engineered system
Government
Organization Designation Authorization, FAA program for designating airworthiness authority
Office of Detainee Affairs of the US Department of Defense
Official development assistance, development aid provided by the member states of the Development Assistance Committee (DAC)
Oklahoma Department of Agriculture, Food, and Forestry
Ontarians with Disabilities Act, provincial legislation for disabled persons
Oregon Department of Agriculture
Oregon Department of Aviation
Oregon Office of Degree Authorization
Overseas Development Administration, predecessor to the United Kingdom's Department for International Development
Special forces, Operational Detachments-A
Organizations
Civic Democratic Alliance, a political party in the Czech Republic, functional 1989–2007
Civic Democratic Alliance (2016), the current political party in the Czech Republic
United Nations Office for Disarmament Affairs (UNODA)
Ohio Department of Agriculture
Ohio Dental Association
Olympic Delivery Authority, one of the two main agencies that organised the London Olympic Games
Ontario Dental Association
Organization for Democratic Action, alternate name for the Da'am Workers Party, a political party in Israel
People
Given name
Oda, a Germanic female name with diminutive Odette
Oda of Canterbury (died 958), Archbishop of Canterbury from 942
Saint Oda of Scotland (680–726), a Dutch Roman Catholic saint supposedly of Scottish origin
Oda of Meissen (c. 996 – aft. 1018), first Queen of Poland
Surname
Oda (surname)
Oda clan, a Japanese feudal clan from the Muromachi/Sengoku period
Places
Ōda, Shimane, a city in Japan
Oda, Ghana
Others
Only Dreamers Achieve, a record label created by Polo G
Oda (Albania), typical Albanian room
1144 Oda, an asteroid
Oda, nickname for the Izh 2126, a compact hatchback automobile
Out-of-Door Academy, school in Sarasota, Florida, United States
Operational Detachment-Alpha, the standard 12-man team composed of US Army Special Forces operators
Offline Data Authentication, a stage in EMV credit card payment authorization
Oda Station (disambiguation) |
https://en.wikipedia.org/wiki/Sky%20Open%20%28TV%20channel%29 | Sky Open (formerly known as Prime) is a New Zealand free-to-air television network. It airs a varied mix of programming, largely imported from Australia, the United Kingdom and the United States.
It was originally owned by Prime Television Limited of Australia. Prime later entered into a joint-venture agreement with Nine Entertainment Co. (Nine Network Australia) in February 2002, after which the network's on-air graphics came to resemble the Nine Network's. On 8 February 2006, the Commerce Commission gave Sky clearance to purchase the station for NZ$31 million.
Prime's analogue terrestrial signals covered 91% of the population via the state-owned Kordia transmission network. The channel is currently available free-to-air via satellite on Sky and terrestrially via Kordia.
History
Prime (1998–2023)
In early 1998, the United Christian Broadcasters purchased 34 UHF-spectrum TV licences from TVNZ that had been used for the defunct Horizon Pacific and MTV channels. In June 1998, Prime Television Limited of Australia purchased the 34 unused licences from United Christian Broadcasters for approximately A$3.6 million. The licences covered all major cities and towns, mainly on UHF, except for the Gisborne area, which is served via a VHF signal. On 30 August 1998, Prime Television New Zealand began broadcasting at 6.30 pm with Two Fat Ladies.
Originally the station broadcast classic British programming, documentaries, sports and dramas aimed at the 30 years and above age bracket. In Waikato and Christchurch, Prime produced a half-hour nightly news programme. Although these programmes rated well, they were unprofitable.
In February 2002, Prime New Zealand entered into an agreement with Australian media mogul Kerry Packer's PBL (parent of the Nine Network). Under this five-year agreement, Nine agreed to provide the station with content it owned the rights to, expertise and an amount of cash. In return, Nine was given the right to acquire 54% of Prime New Zealand at the end of the contract. If Prime New Zealand continued to lose money, Nine could choose not to take this up.
After this deal, Prime took on a broader and more expensive programming mix and axed the local content to cut costs. This increased ratings and profits significantly. This new format was modelled closely on the Global Television Network in Canada, whose parent company Canwest happened to own TV3 at the time.
Almost immediately, some Australian programmes produced by Nine Network and shown on Prime NZ took on a slightly New Zealand flavour. For example, one New Zealander per week began to appear on the Australian version of Who Wants to Be a Millionaire?, and weather details for New Zealand cities appeared on the Australian Today breakfast programme. Localisation of Australian programmes increased, with New Zealander Charlotte Dawson becoming the presenter of a New Zealand version of Nine travel programme Getaway (Dawson left this position at the end of 2006). Many Australian programmes |
https://en.wikipedia.org/wiki/Computational%20topology | Algorithmic topology, or computational topology, is a subfield of topology with an overlap with areas of computer science, in particular, computational geometry and computational complexity theory.
A primary concern of algorithmic topology, as its name suggests, is to develop efficient algorithms for solving problems that arise naturally in fields such as computational geometry, graphics, robotics, structural biology and chemistry, using methods from computable topology.
Major algorithms by subject area
Algorithmic 3-manifold theory
A large family of algorithms concerning 3-manifolds revolve around normal surface theory, which is a phrase that encompasses several techniques to turn problems in 3-manifold theory into integer linear programming problems.
Rubinstein and Thompson's 3-sphere recognition algorithm. This is an algorithm that takes as input a triangulated 3-manifold and determines whether or not the manifold is homeomorphic to the 3-sphere. It has exponential run-time in the number of tetrahedral simplexes in the initial 3-manifold, and also an exponential memory profile. Moreover, it is implemented in the software package Regina. Saul Schleimer went on to show the problem lies in the complexity class NP. Furthermore, Raphael Zentner showed that the problem lies in the complexity class coNP, provided that the generalized Riemann hypothesis holds. He uses instanton gauge theory, the geometrization theorem of 3-manifolds, and subsequent work of Greg Kuperberg on the complexity of knottedness detection.
The connect-sum decomposition of 3-manifolds is also implemented in Regina, has exponential run-time and is based on a similar algorithm to the 3-sphere recognition algorithm.
Determining that the Seifert–Weber 3-manifold contains no incompressible surface has been algorithmically implemented by Burton, Rubinstein and Tillmann, based on normal surface theory.
The Manning algorithm finds hyperbolic structures on 3-manifolds whose fundamental group has a solvable word problem.
At present the JSJ decomposition has not been implemented algorithmically in computer software. Neither has the compression-body decomposition. There are some very popular and successful heuristics, such as SnapPea which has much success computing approximate hyperbolic structures on triangulated 3-manifolds. It is known that the full classification of 3-manifolds can be done algorithmically, in fact, it is known that deciding whether two closed, oriented 3-manifolds given by triangulations (simplicial complexes) are equivalent (homeomorphic) is elementary recursive. This generalizes the result on 3-sphere recognition.
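Many of these algorithms manipulate triangulations directly. As a toy illustration, far simpler than 3-sphere recognition, the Euler characteristic, one of the most basic topological invariants computable from a triangulation, can be found by counting faces of each dimension:

```python
from itertools import combinations

def euler_characteristic(top_simplices):
    """Chi = sum over k of (-1)^k * (number of k-simplices),
    where every face of every listed simplex is counted exactly once."""
    faces = set()
    for s in top_simplices:
        for k in range(1, len(s) + 1):
            faces.update(combinations(sorted(s), k))
    # A face with n vertices is an (n-1)-simplex, contributing (-1)^(n-1).
    return sum((-1) ** (len(f) - 1) for f in faces)

# The boundary of a tetrahedron triangulates the 2-sphere: chi = 4 - 6 + 4 = 2.
sphere = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(euler_characteristic(sphere))  # 2
```

Recognizing a 3-sphere requires far subtler invariants than this, which is why the algorithms above rely on normal surface theory rather than simple counting.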
Conversion algorithms
SnapPea implements an algorithm to convert a planar knot or link diagram into a cusped triangulation. This algorithm has a roughly linear run-time in the number of crossings in the diagram, and low memory profile. The algorithm is similar to the Wirthinger algorithm for constructin |
https://en.wikipedia.org/wiki/Direct%20cable%20connection | Direct Cable Connection (DCC) is a feature of Microsoft Windows that allows a computer to transfer and share files (or connected printers) with another computer, via a connection using either the serial port, parallel port or the infrared port of each computer. It is well suited to computers that do not have an Ethernet adapter installed, although DCC in Windows XP can be configured to use one (with a proper crossover cable if no Ethernet hub is used) if available.
Connection types
Serial port
If using the serial ports of the computers, a null modem cable (or a null modem adapter connected to a standard serial cable) must be used for the two computers to communicate properly. Such a connection uses the PPP protocol.
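PPP delimits each frame on the serial line with a 0x7E flag byte and escapes any occurrence of the flag or escape byte inside the payload. A minimal sketch of that byte-stuffing step is below; real PPP frames also carry address, control and protocol fields plus a frame checksum, all omitted here.

```python
FLAG, ESC = 0x7E, 0x7D  # PPP frame delimiter and escape byte

def ppp_frame(payload: bytes) -> bytes:
    """Wrap a payload in flag bytes, escaping FLAG/ESC as ESC, byte XOR 0x20."""
    out = bytearray([FLAG])
    for b in payload:
        if b in (FLAG, ESC):
            out += bytes([ESC, b ^ 0x20])
        else:
            out.append(b)
    out.append(FLAG)
    return bytes(out)

def ppp_unframe(frame: bytes) -> bytes:
    """Strip the flag bytes and undo the escaping."""
    body, out, i = frame[1:-1], bytearray(), 0
    while i < len(body):
        if body[i] == ESC:
            out.append(body[i + 1] ^ 0x20)
            i += 2
        else:
            out.append(body[i])
            i += 1
    return bytes(out)

# Round-trip a payload that contains both special bytes.
data = bytes([0x01, 0x7E, 0x42, 0x7D])
assert ppp_unframe(ppp_frame(data)) == data
```

Because no escaped flag byte ever appears inside a frame body, the receiver can resynchronize on any 0x7E it sees, which is what makes the scheme robust on a raw null-modem link.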
Parallel port
If the parallel ports are used, Windows supports standard or basic 4-bit cable (commonly known as LapLink cable), Enhanced Capabilities Port (ECP) cable, or Universal Cable Module (UCM) cable (which was known as DirectParallel cable by Parallel Technologies).
IR
Infrared communication ports, like the ones found on laptop computers (such as IrDA), can also be used.
USB
Connecting any two computers using USB requires a special proprietary bridge cable. A directly connected pin-to-pin USB type A cable does not work, as USB does not support such a type of communication. In fact, attempting to do so may even damage the connecting computers, as it will effectively short the two computers' power supplies together by connecting their 5V and GND lines. This can possibly destroy one or both machines and cause a fire hazard since the two machines may not have exactly the same USB source voltage. Therefore, Direct Cable Connection over USB is not possible; a USB link cable must be used, as seen in the Microsoft knowledge base article 814982. However, with a USB link cable, a program which supports data transfer using that cable must be used. Typically, such a program is supplied with the USB link cable. The DCC wizard or Windows Explorer cannot be used to transfer files over a USB link cable.
Newer hardware technology with identical functionality
At least two well-known USB crossover cables are capable of bidirectional data transfer between computers, similar to RJ45/Ethernet cables: ProlificUSA.com's TE-C0372 High Speed USB 2.0 Host to Host Bridge Cable (PL25A1 chipset) and TE-C0363 SuperSpeed USB 3.0 Host to Host Bridge Cable (PL27A1 chipset); drivers for these appear to be included in newer versions of the Linux kernel.
Windows Vista changes
Windows Vista drops support for the Direct Cable Connection feature, as Ethernet, Wi-Fi and Bluetooth had become ubiquitous on current-generation computers. To transfer files and settings, Windows Vista includes Windows Easy Transfer, which uses a proprietary USB-to-USB bridge cable known as the Easy Transfer Cable.
See also
Null modem
LapLink cable
Serial line internet protocol (SLIP)
Parallel line internet protocol (PLIP)
Point-to-Poi |
https://en.wikipedia.org/wiki/Yopy | The Yopy was a series of personal digital assistants (PDAs) made by GMate Corporation, based on the Linux operating system and also popular as a PDA phone in Korea. The Linux Documentation Project considers the Yopy series to be "true Linux PDAs" because their manufacturers install Linux-based operating systems on them by default.
Overview
At CeBIT 2000, GMate introduced the YDK1000, the Yopy Development Kit. Lacking a physical keyboard, this device looked very different from later versions. It came with an embedded Linux operating system and the W Window System. Precompiled versions of the X Window System and the IceWM window manager later became available.
The first official model in the Yopy line of PDAs was the YP3000. It introduced the clamshell design with a full QWERTY keyboard, and featured a 3.5 inch TFT screen. It also came with the X Window System and IceWM.
One of the features of the YP3500 is a CDMA module, so it can be used as a mobile phone. In 2003, Wi-Fi was widely used in Korea, and so the YP3700 targeted this environment with an additional Wi-Fi module.
By March 2005 Gmate had stopped producing and selling the Yopy PDA and closed down its official web sites.
Yopy models
YDK1000, the Yopy Development Kit
YP3000, the first official model of the Yopy
YP3500, which added a CDMA module
YP3700, which added a Wi-Fi module
Yopy software
Because it used the Linux operating system, the Yopy was capable of running a variety of open source software.
References
External links
Official Yopy site (offline)
Yopy User Group (in Korean)
The unofficial YopyWiki, find and share information about Yopy PDAs
UK Based Re-seller for Europe (offline)
UK Based Yopy User Group
Gmate Yopy PDA review (offline)
Mac OS X USB driver for Zaurus and Yopy
Yopytheque: Yopy FAQs and how-tos in French @ Tuxmedia.com (offline)
Personal digital assistants
Linux-based devices
Embedded Linux |
https://en.wikipedia.org/wiki/Water%20supply%20network | A water supply network or water supply system is a system of engineered hydrologic and hydraulic components that provide water supply. A water supply system typically includes the following:
A drainage basin (see water purification – sources of drinking water)
A raw water collection point (above or below ground) where the water accumulates, such as a lake, a river, or groundwater from an underground aquifer. Raw water may be transferred using uncovered ground-level aqueducts, covered tunnels, or underground water pipes to water purification facilities.
Water purification facilities. Treated water is transferred using water pipes (usually underground).
Water storage facilities such as reservoirs, water tanks, or water towers. Smaller water systems may store the water in cisterns or pressure vessels. Tall buildings may also need to store water locally in pressure vessels in order for the water to reach the upper floors.
Additional water pressurizing components such as pumping stations may need to be situated at the outlet of underground or aboveground reservoirs or cisterns (if gravity flow is impractical).
A pipe network for distribution of water to consumers (which may be private houses or industrial, commercial, or institution establishments) and other usage points (such as fire hydrants)
Connections to the sewers (underground pipes, or aboveground ditches in some developing countries) are generally found downstream of the water consumers, but the sewer system is considered to be a separate system, rather than part of the water supply system.
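As a rough illustration of why elevated storage in towers and tanks matters, hydrostatic pressure grows linearly with the height of the stored water column (p = ρgh). The sketch below uses assumed values; the function name and the 30 m tower height are invented for the example.

```python
# Hydrostatic pressure from elevated water storage: p = rho * g * h.
# All numbers below are illustrative assumptions, not data from this article.

RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def static_pressure_kpa(height_m: float) -> float:
    """Gauge pressure (kPa) at the base of a water column height_m tall."""
    return RHO_WATER * G * height_m / 1000.0  # Pa -> kPa

# A tank 30 m above the tap supplies about 294 kPa (~43 psi); floors above
# the tank's level receive no gravity pressure at all, which is why tall
# buildings need local pressure vessels or booster pumps.
print(round(static_pressure_kpa(30.0)))  # -> 294
```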
Water supply networks are often run by public utilities of the water industry.
Water abstraction and raw water transfer
Raw water (untreated) is from a surface water source (such as an intake on a lake or a river) or from a groundwater source (such as a water well drawing from an underground aquifer) within the watershed that provides the water resource.
The raw water is transferred to the water purification facilities using uncovered aqueducts, covered tunnels or underground water pipes.
Water treatment
Virtually all large systems must treat the water, a process tightly regulated by international, federal, and state agencies such as the World Health Organization (WHO) and the United States Environmental Protection Agency (EPA). Water treatment must occur before the product reaches the consumer and afterwards (when the water is discharged again). Water purification usually occurs close to the final delivery points to reduce pumping costs and the chances of the water becoming contaminated after treatment.
Traditional surface water treatment plants generally consist of three steps: clarification, filtration, and disinfection. Clarification refers to the separation of particles (dirt, organic matter, etc.) from the water stream. Chemical addition (e.g. alum, ferric chloride) destabilizes the particle charges and prepares them for clarification either by settling or floating out of the water stream. Sand, a
https://en.wikipedia.org/wiki/QS | QS may refer to:
Business
QS mark (for "quality and safety"), on Chinese products
Quacquarelli Symonds, an education and careers networking company
QS World University Rankings, an annual publication
Quality System (QS) Regulation, a business process
Quantity surveyor, a professional in the construction industry concerned with costs and contracts
Travel Service (IATA airline designator QS)
Quadraphonic sound, the Sansui QS Regular Matrix system
Politics
Québec solidaire, a political party in Quebec, Canada
Quis separabit?, an Irish Loyalist motto
Science, technology, and mathematics
Biology and medicine
ATCvet code QS Sensory organs, a section of the Anatomical Therapeutic Chemical Classification System for veterinary medicinal products
Quantum satis, a Latin term meaning "the amount which is needed", used in food and drug regulation
Quinolinate synthase, an enzyme
Quorum sensing, a system of interaction in natural and synthetic populations
Computing and mathematics
Quadratic sieve, an integer factorization algorithm
Quality Score, a variable used by search engines to set the rank and cost of ads
Quicksort, a sorting algorithm
Other uses in science and technology
Quadraphonic sound, or "Quadphonic Synthesizer", a matrix quadraphonic gramophone record format developed by Sansui
Quantified self, a movement to incorporate technology into data acquisition on aspects of a person's daily life
Quicksand, a colloid hydrogel consisting of fine granular material and water
Sport
Quality start, a baseball statistic calculated for starting pitchers
Qualcomm Stadium, a stadium in San Diego, California
Other uses
Queen's Scout, a scout who has attained the Queen's Scout Award
Queen's Serjeant, an obsolete legal position in the United Kingdom
See also
Cue (disambiguation)
Q (disambiguation)
Queue (disambiguation)
QZ (disambiguation)
SQ (disambiguation) |
https://en.wikipedia.org/wiki/Computers%20and%20writing | Computers and writing is a sub-field of college English studies about how computers and digital technologies affect literacy and the writing process. The range of inquiry in this field is broad, including discussions of ethics when using computers in writing programs, how discourse can be produced through technologies, software development, and computer-aided literacy instruction. Some topics include hypertext theory, visual rhetoric, multimedia authoring, distance learning, digital rhetoric, usability studies, the patterns of online communities, how various media change reading and writing practices, textual conventions, and genres. Other topics examine social or critical issues in computer technology and literacy, such as the issues of the "digital divide", equitable access to computer-writing resources, and critical technological literacies.
"Computers and Writing" is also the name of an academic conference (see below).
The Field
This interdisciplinary field has grown out of rhetoric and composition studies. Members do scholarly work and teach in such allied and diverse areas as technical and professional communication, linguistics, sociology, and law. Important journals supporting this field are Computers & Composition, Computers & Composition Online, and Kairos: A Journal of Rhetoric, Technology, and Pedagogy. The professional organization Conference on College Composition and Communication has a committee, known as the 7Cs committee (CCCC Committee on Computers in Composition and Communication), that selects onsite and online hosts for the Computers & Writing conference and coordinates the "Technology Innovator Award" presented at that annual conference.
Conference and Conference History
The conference "Computers and Writing" was established in 1982 in Minneapolis, Minnesota by Donald Ross and Lillian Bridwell. The conference was informal at first, but has grown from a grassroots organized conference to an established, mainstream conference that examines the ways in which computers change writing practice and pedagogy. In earlier conferences, the scholarship presented often explored how computers influenced individual writers, but during the late 1980s and 1990s, scholarship shifted to hypertext and hypermedia, and the social nature of computer mediated writing. The conference initially presented original or "homemade" software design associated with word processing and editing, but eventually switched to commercial software as commercial software became more common for both individual students and educational institutions.
The conference has a history of technological optimism, and the scholarship presented is optimistic regarding technology's influence on writing. The conference also examines and voices fears and concerns related to computer technology. Some of these fears are related to institutional policies and control as well as the fe |
https://en.wikipedia.org/wiki/APEXC | The APE(X)C, or All Purpose Electronic (X) Computer, series was designed by Andrew Donald Booth at Birkbeck College, London, in the early 1950s. His work on the APE(X)C series was sponsored by the British Rayon Research Association. Although the naming conventions are slightly unclear, it seems the first model belonged to the BRRA. According to Booth, the X stood for X-company.
One of the series was also known as the APE(X)C or All Purpose Electronic X-Ray Computer and was sited at Birkbeck.
Background
From 1943 on, Booth worked on the determination of crystal structures using X-ray diffraction data. The computations involved were extremely tedious, so there was ample incentive to automate the process; he developed an analogue computer to compute the reciprocal spacings of the diffraction pattern.
In 1947, along with his collaborator and future spouse Kathleen Britten, he spent a few months with von Neumann's team, which was the leading edge in computer research at the time.
ARC and SEC
Booth designed an electromechanical computer, the ARC (Automatic Relay Computer), in the late 1940s (1947-1948). Later, he and his collaborators built an experimental electronic computer named SEC (Simple Electronic Computer, designed around 1948-1949), and finally the APE(X)C (All-Purpose Electronic Computer) series.
The computers were programmed by Kathleen.
The APE(X) C series
The APE(X)C series included the following machines:
APE(X)C: Birkbeck College, London, first time operated in May 1952, ready for use at the end of 1953
APE(N)C: Board of Mathematical Machines, Oslo ('N' likely stands for 'Norway'), also known as NUSSE
APE(H)C: British Tabulating Machine Company (it is unclear what 'H' stands for - perhaps 'Hollerith', as the company sold Hollerith unit record equipment)
APE(R)C: British Rayon Research Association ('R' stands for 'Rayon'), ready for use in June 1952
UCC: University College, London (circa January 1956)
MAC or MAGIC (Magnetic Automatic Calculator): "built by Wharf Engineering Laboratories" (February 1955)
The HEC (Hollerith Electronic Computer), built by the British Tabulating Machine Company (later to become International Computers and Tabulators (ICT), then International Computers Limited (ICL)), was a commercial machine sold in several models and later known as the ICT 200 series. Its likely derivatives were the HEC 1, HEC 2, HEC 2M ('M' for 'marketable', denoting the machine's orientation toward commercial rather than scientific customers), and HEC 4 (before 1955)
Only one of each of these machines was built, with the exception of the HEC (and possibly the MAC), which were commercial machines produced in quite large numbers for the time, around 150. They were similar in design, with various small differences, mostly in I/O equipment: the APE(H)C was a punched-card machine, while the APE(X)C, APE(R)C and APE(N)C used teleprinter I/O (keyboard and printer, plus paper tape reader and punch). Also, the UCC had 8k words of storage, instead of 1k words
https://en.wikipedia.org/wiki/Trondheim%20Region | The Trondheim Region () is a statistical metropolitan region in the county of Trøndelag in Norway. It is centered in the city of Trondheim.
† Population data as of October 2012, from Statistics Norway (SSB); ‡ Orkdal has been added to the region due to a newly completed road
The new limited-access road to Orkdal, part of European route E39, was completed on 30 June 2005, shortening the driving time between Trondheim and Orkanger by some 15 minutes and adding Orkdal to the region.
Although rarer, there is also some commuting from Rennebu, Levanger and Frosta.
See also
Trondheim og omland
Trøndelag
References
External links
Trøndelag
Metropolitan regions of Norway |
https://en.wikipedia.org/wiki/Program%20trading | Program trading is a type of trading in securities, usually consisting of baskets of fifteen stocks or more, executed by a computer program simultaneously based on predetermined conditions. Program trading is often used by hedge funds and other institutional investors pursuing index arbitrage or other arbitrage strategies. There are essentially two reasons to use program trading: the desire to trade many stocks simultaneously (for example, when a mutual fund receives an influx of money, it will use that money to increase its holdings in the multiple stocks on which the fund is based), or to arbitrage temporary price discrepancies between related financial instruments, such as between an index and its constituent parts.
According to the New York Stock Exchange, in 2006 program trading accounted for about 30%, and on some days as much as 46.4%, of the daily trading volume on that exchange. Barron's breaks down its weekly figures for program trading between index arbitrage and other types of program trading. As of July 2012, program trading made up about 25% of the volume on the NYSE; index arbitrage made up less than 1%.
History
Several factors help to explain the explosion in program trading. Technological advances spawned the growth of electronic communication networks. These electronic exchanges, like Instinet and Archipelago Exchange, allow thousands of buy and sell orders to be matched very rapidly, without human intervention.
In addition, the proliferation of hedge funds with all their sophisticated trading strategies have helped drive program-trading volume.
As technology advanced and access to electronic exchanges became easier and faster, program trading developed into the much broader algorithmic trading and high-frequency trading strategies employed by the investment banks and hedge funds.
Program trading firms
Program trading is a strategy normally used by large institutional traders. Barron's shows a detailed breakdown of the NYSE-published program trading figures each week, giving the figures for the largest program trading firms (such as investment banks).
Index arbitrage
Index arbitrage is a particular type of program trading which attempts to profit from price discrepancies between the basket of stocks which make up a stock index and its derivatives (such as the future based on that index). As of July 2012, it makes up less than 5% of the active program trading volume on the NYSE daily.
Premium buy and sell execution levels
The "premium" (PREM) or "spread" is the difference between the stock index future fair value and the actual index level. As the derivative is based on the index, the two should normally have a very close relationship. If there is a sufficiently large difference the arbitraging program will attempt to buy the relatively cheap level (whether that is the basket of stocks which make up the index or the index future) and sell the relatively expensive product, making money from the pric |
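The buy/sell logic around the premium can be sketched in a few lines. The prices, threshold, and function names below are invented for illustration; a real system would derive the threshold from a fair-value model (carry cost, dividends) and transaction costs, which this sketch omits.

```python
# Sketch of an index-arbitrage trigger based on the PREM/spread described
# above. All numbers and names here are illustrative assumptions.

def prem(futures_price: float, index_level: float) -> float:
    """The premium/spread between the index future and the cash index."""
    return futures_price - index_level

def arb_signal(futures_price: float, index_level: float, threshold: float) -> str:
    """Trade only when the spread is wide enough to cover execution costs."""
    spread = prem(futures_price, index_level)
    if spread > threshold:
        # Future relatively expensive: sell the future, buy the stock basket.
        return "sell future / buy basket"
    if spread < -threshold:
        # Future relatively cheap: buy the future, sell the stock basket.
        return "buy future / sell basket"
    return "no trade"

print(arb_signal(4512.0, 4500.0, 5.0))  # spread = +12 -> "sell future / buy basket"
```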
https://en.wikipedia.org/wiki/X11vnc | x11vnc is a Virtual Network Computing (VNC) server program. It allows a remote client to access a computer hosting an X Window session and the x11vnc software, by continuously polling the X server's frame buffer for changes. This allows the user to control their X11 desktop (KDE, GNOME, Xfce, etc.) from a remote computer, either on the user's own network or over the Internet, as if the user were sitting in front of it. x11vnc can also poll non-X11 frame buffer devices, such as webcams or TV tuner cards, the iPAQ, the Neuros OSD, the Linux console, and the Mac OS X graphics display.
x11vnc is part of the LibVNCServer project and is free software available under the GNU General Public License.
x11vnc was written by Karl Runge.
x11vnc does not create an extra display (or X desktop) for remote control. Instead, it uses the existing X11 display shown on the monitor of a Unix-like computer in real time, unlike other Linux alternatives such as TightVNC Server. However, it is possible to use Xvnc or Xvfb to create a 'virtual' extra display and have x11vnc connect to it, enabling X11 access to headless servers.
x11vnc has security features that allow the user to set an access password or to use Unix usernames and passwords. It also has options for connection via a secure SSL link. An SSL Java VNC viewer applet is provided that enables secure connections from a web browser. The VeNCrypt SSL/TLS VNC security type is also supported.
Many of the UltraVNC extensions to VNC are supported by x11vnc, including file transfer.
Polling algorithm
x11vnc keeps a copy of the X server's frame buffer in RAM. The X11 programming interface XShmGetImage is used to retrieve the frame buffer pixel data. x11vnc compares the X server's frame buffer against its copy to see which pixel regions have changed (and hence need to be sent to the VNC viewers.) Reading pixel data from the physical frame buffer can be much slower than writing to it (because graphics devices are not optimized for reading) and so a sequential pixel by pixel check would often be too slow.
To improve the situation, x11vnc reads in full rows of pixels separated by 32 pixels vertically. Once it gets to the bottom of the screen it starts again near the top with a slightly different offset. After 32 passes like this it has covered the entire screen. This method enables x11vnc to detect changes on the screen roughly 32 times more quickly than a sequential check would (unless the changes are very small, say only 1 pixel tall.) If the X11 DAMAGE extension is present, x11vnc uses it to provide hints where to focus its polling, thereby finding changes even more quickly and also lowering the system load.
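The interleaved scan described above can be modeled in a few lines. This is a simplified sketch, not x11vnc's actual source: real x11vnc varies the row offsets in a scattered order, while the sketch below simply increments the offset on each pass.

```python
# Model of a 32-row interleaved polling schedule: each pass reads every
# 32nd row, and shifting the offset makes 32 passes cover the whole screen.

STRIDE = 32  # vertical gap between rows read in a single pass

def rows_for_pass(pass_number: int, height: int, stride: int = STRIDE):
    """Rows polled on one pass: every stride-th row from a shifted offset."""
    offset = pass_number % stride
    return list(range(offset, height, stride))

def covers_all_rows(height: int, stride: int = STRIDE) -> bool:
    """After `stride` passes, every framebuffer row has been checked once."""
    seen = set()
    for p in range(stride):
        seen.update(rows_for_pass(p, height, stride))
    return seen == set(range(height))

# Each pass touches only 1/32 of the rows, so a change at least 32 rows
# tall is noticed on the very first pass over its region.
assert covers_all_rows(480)  # e.g. a 640x480 framebuffer
```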
Input injection
When x11vnc receives user input events (keystrokes, pointer motion, and pointer button clicks) from a VNC viewer, it must inject them synthetically into the X server. The X11 programming interfaces XTestFakeKeyEvent, XTestFakeMotionEvent, and XTestFakeButtonEvent of the XTEST exten |
https://en.wikipedia.org/wiki/Perihelion%20Software | Perihelion Software Limited was a United Kingdom company founded in 1986 by Dr. Tim King along with a number of colleagues who had all worked together at MetaComCo on AmigaOS and written compilers for both the Amiga and the Atari ST.
Perihelion Software produced an operating system for the INMOS Transputer called HeliOS. This was a system that looked like Unix but which could pass messages to processes running on either the same processor or another one. This was used in the Atari Transputer Workstation, among other places.
Later HeliOS was ported to other processors including the ARM architecture.
Perihelion Software also produced an in-memory database system called Polyhedra. The group responsible for this product was set up as a subsidiary, Perihelion Technology Limited (PTL), which did a management buyout in 1994. PTL later changed its name to Polyhedra plc in 1995, and in 2001 was acquired by a Swedish company called ENEA.
References
External links
Dr. Tim King's Homepage
Software companies of the United Kingdom |
https://en.wikipedia.org/wiki/HeliOS | Helios is a discontinued Unix-like operating system for parallel computers. It was developed and published by Perihelion Software. Its primary architecture is the Inmos Transputer. Helios' microkernel implements a distributed namespace and messaging protocol, through which services are accessed. A POSIX compatibility library enables the use of Unix application software, and the system provides most of the usual Unix utilities.
Work on Helios began in the autumn of 1986. Its success was limited by the commercial failure of the Transputer, and efforts to move to other architectures met with limited success. Perihelion ceased trading in 1998.
The name of the product was Helios. In the materials they produced, Perihelion Software never referred to the operating system as HeliOS.
Development
In the early 1980s, Tim King joined MetaComCo from the University of Bath, bringing with him some rights to an operating system called TRIPOS.
MetaComCo secured a contract from Commodore to work on AmigaOS, with the AmigaDOS component being derived from TRIPOS. In 1986, King left MetaComCo to found Perihelion Software, and began development of a parallel operating system, initially targeted at the INMOS Transputer series of processors. Helios extended TRIPOS' use of a light-weight message passing architecture to networked parallel machines.
Helios 1.0 was the first commercial release, in the summer of 1988, followed by version 1.1 in autumn 1989, 1.1a in early 1990, and 1.2 in December 1990, followed by the 1.2.1 and 1.2.2 updates. Version 1.3 was a significant upgrade with numerous utility, library, server and driver improvements. The last commercial release was 1.3.1. Later, Tim King and Nick Garnett gave permission to release the sources under the GNU General Public License v3. Helios 1.4 Alpha was planned to include support for OpenLook, OpenMotif, KDE, GNOME, and X11R6 on the Transputer, Linux, Solaris, and Windows. This release was not completed.
Kernel and nucleus
Helios was designed for a network of multiple nodes, connected by multiple high-bandwidth communications links. Nodes can be dedicated processing nodes, or processors with attached I/O devices. Small systems might consist of a host PC or workstation connected to a set of several processing nodes, while larger systems might have hundreds of processing nodes supported by dedicated nodes for storage, graphics, or user terminals.
A Helios network requires at least one I/O Server node that is able to provide a file system server, console server and reset control for the processing nodes. At power on, the Helios nucleus is bootstrapped from the I/O server into the network. Each node is booted using a small first-stage loader that then downloads and initialises the nucleus proper. Once running, a node communicates with its neighbours, booting them in turn, if required.
The Helios nucleus is composed of the kernel, libraries, loader service and the processor manager service.
Kernel
The Helios ker |
https://en.wikipedia.org/wiki/Lee%20Felsenstein | Lee Felsenstein (born April 27, 1945) is an American computer engineer who played a central role in the development of the personal computer. He was one of the original members of the Homebrew Computer Club and the designer of the Osborne 1, the first mass-produced portable computer.
Before the Osborne, Felsenstein designed the Intel 8080 based Sol-20 computer from Processor Technology, the PennyWhistle modem, and other early "S-100 bus" era designs. His shared-memory alphanumeric video display design, the Processor Technology VDM-1 video display module board, was widely copied and became the basis for the standard display architecture of personal computers.
Many of his designs were leaders in reducing costs of computer technologies for the purpose of making them available to large markets. His work featured a concern for the social impact of technology and was influenced by the philosophy of Ivan Illich. Felsenstein was the engineer for the Community Memory project, one of the earliest attempts to place networked computer terminals in public places to facilitate social interactions among individuals, in the era before the commercial Internet.
Life
Felsenstein graduated from Central High School in Philadelphia as a member of class 219. As a young man, Felsenstein was a New Left radical. From October through December 1964, he was a participant in the Free Speech Movement and was one of 768 arrested in the climactic "Sproul Hall Sit-In" of December 2–3, 1964. He also wrote for the Berkeley Barb, one of the leading underground newspapers.
He had entered University of California, Berkeley first in 1963, joined the Co-operative Work-Study Program in Engineering in 1964 and dropped out at the end of 1967, working as a Junior Engineer at the Ampex Corporation from 1968 through 1971, when he re-enrolled at Berkeley. He received a B.S. in electrical engineering and computer science from the University of California, Berkeley in 1972.
From 1981 to 1983, Felsenstein was employed at the Osborne Computer Corporation, where he designed the Osborne 1, the first mass-produced portable computer. He then returned to freelance consulting. In 1992, he joined Interval Research Corporation, where he worked until 2000. From then until 2005, he worked for Pemstar Pacific Consultants, an electronics design and contract manufacturing firm, which was subsequently acquired by Benchmark Electronics. Throughout, he acted as an occasional freelance consulting designer or worked at his own design firm.
The Community Memory project, begun as a project of Resource One, Inc. in 1972 and later incorporated in 1977 by Felsenstein with Efrem Lipkin, Ken Colstad, Jude Milhon, and Mark Szpakowski, was one of the earliest attempts to place networked computer termin
https://en.wikipedia.org/wiki/Non-maskable%20interrupt | In computing, a non-maskable interrupt (NMI) is a hardware interrupt that standard interrupt-masking techniques in the system cannot ignore. It typically occurs to signal attention for non-recoverable hardware errors. Some NMIs may be masked, but only by using proprietary methods specific to the particular NMI.
An NMI is often used when response time is critical or when an interrupt should never be disabled during normal system operation. Such uses include reporting non-recoverable hardware errors, system debugging and profiling, and handling of special cases like system resets.
Modern computer architectures typically use NMIs to handle non-recoverable errors which need immediate attention. Therefore, such interrupts should not be masked in the normal operation of the system. These errors include non-recoverable internal system chipset errors, corruption in system memory such as parity and ECC errors, and data corruption detected on system and peripheral buses.
On some systems, a computer user can trigger an NMI through hardware and software debugging interfaces and system reset buttons.
Programmers typically use debugging NMIs to diagnose and fix faulty code. In such cases, an NMI can execute an interrupt handler that transfers control to a special monitor program. From this program, a developer can inspect the machine's memory and examine the internal state of the program at the instant of its interruption. This also allows the debugging or diagnosing of computers which appear hung.
History
In older architectures, NMIs were used for interrupts which were typically never disabled because of the required response time. Examples include the floppy disk controller on the Amstrad PCW, the 8087 coprocessor on the x86 when used in the IBM PC or its compatibles (even though Intel recommended connecting it to a normal interrupt), and the Low Battery signal on the HP 95LX.
In the original IBM PC, an NMI was triggered if a parity error was detected in system memory, or reported by an external device. In either case, the PC would display an error message and halt. Some later PC clones used an NMI to conceal hardware differences from a standard PC. On such computers, an NMI would be generated when a program attempted to access incompatible hardware; a BIOS interrupt handler would then translate the program's request to match the hardware that was actually present. System Management Mode (SMM), introduced in the Intel 386SL, later provided a cleaner way to do this.
Some 8-bit home computers used the NMI line to permit a "warm start" if the system had locked up. Typically, this would restore the control registers to known good values stored in ROM, without destroying whatever data the user might currently have loaded. On the Commodore 8-bit machines, the RESTORE key was hooked up directly or indirectly to the NMI line on the 6502-series CPU, but the reset would take place only if the NMI handler routine in ROM detected that RUN/STOP was also being held down when RESTORE was struck (
https://en.wikipedia.org/wiki/Random%20number%20generator%20attack | The security of cryptographic systems depends on some secret data that is known to authorized persons but unknown and unpredictable to others. To achieve this unpredictability, some randomization is typically employed. Modern cryptographic protocols often require frequent generation of random quantities. Cryptographic attacks that subvert or exploit weaknesses in this process are known as random number generator attacks.
A high-quality random number generation (RNG) process is almost always required for security; lack of quality generally provides attack vulnerabilities and so leads to lack of security, even to complete compromise, in cryptographic systems. The RNG process is particularly attractive to attackers because it is typically a single, isolated hardware or software component that is easy to locate. If the attacker can substitute pseudo-random bits generated in a way they can predict, security is totally compromised, yet generally undetectable by any upstream test of the bits. Furthermore, such attacks require only a single access to the system that is being compromised; no data need be sent back, in contrast to, say, a computer virus that steals keys and then e-mails them to some drop point.
Human generation of random quantities
Humans generally do poorly at generating random quantities. Magicians, professional gamblers and con artists depend on the predictability of human behavior. In World War II, German code clerks were instructed to select three letters at random as the initial rotor setting for each Enigma machine message. Instead, some chose predictable values like their own or a girlfriend's initials, greatly aiding Allied breaking of these encryption systems. Another example is the often predictable ways computer users choose passwords (see password cracking).
Nevertheless, in the specific case of playing mixed strategy games, use of human gameplay entropy for randomness generation was studied by Ran Halprin and Moni Naor.
Attacks
Software RNGs
Just as with other components of a cryptosystem, a software random number generator should be designed to resist certain attacks. Possible attacks on an RNG include:
Direct cryptanalytic attacks, in which an attacker obtains part of the stream of random bits and uses it to distinguish the RNG output from a truly random stream.
Input-based attacks, which modify the input to the RNG, for example by "flushing" existing entropy out of the system to put it into a known state.
State compromise extension attacks, in which the internal secret state of the RNG becomes known at some time and is used to predict future output or to recover previous outputs. This can happen when a generator starts up with little or no entropy (especially if the computer has just been booted and followed a very standard sequence of operations), so an attacker may be able to obtain an initial guess at the state.
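The state-compromise scenario can be illustrated with a short Python sketch. The generator, seed value, and search range below are all hypothetical, invented for illustration: a generator seeded from a low-entropy source (say, a boot-time counter known to lie in a small range) can be broken by brute force from a single observed output, no matter how good the generation algorithm itself is.

```python
import random

# Hypothetical weak generator: seeded from a low-entropy source, e.g. a
# boot-time counter known to lie in a small range (values are illustrative).
def weak_token(seed):
    return random.Random(seed).getrandbits(64)

observed = weak_token(4242)  # victim's "random" token; seed unknown to the attacker

# The attacker brute-forces the small seed space until the output matches;
# once the internal state is recovered, all future outputs are predictable.
recovered = next(s for s in range(100_000) if weak_token(s) == observed)
print(recovered)  # 4242
```

The lesson is that the quality of the algorithm cannot compensate for a predictable seed: a cryptographically secure generator defeats this attack only if its seed is drawn from a space too large to enumerate.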
Hardware RNGs
A number of attacks on hardware random number generators are possible, including |
https://en.wikipedia.org/wiki/Edward%20Yourdon | Edward Nash Yourdon (April 30, 1944 – January 20, 2016) was an American software engineer, computer consultant, author and lecturer, and software engineering methodology pioneer. He was one of the lead developers of the structured analysis techniques of the 1970s and a co-developer of both the Yourdon/Whitehead method for object-oriented analysis/design in the late 1980s and the Coad/Yourdon methodology for object-oriented analysis/design in the 1990s.
Biography
Yourdon obtained his B.S. in applied mathematics from Massachusetts Institute of Technology (MIT) in 1965, and did graduate work in electrical engineering and computer science at MIT and the Polytechnic Institute of New York.
In 1964 Yourdon started working at Digital Equipment Corporation, developing FORTRAN programs for the PDP-5 minicomputer and later an assembler for the PDP-8. In the late 1960s and early 1970s he worked at a small consulting firm and as an independent consultant. In 1974 Yourdon founded his own consulting firm, YOURDON Inc., to provide educational, publishing, and consulting services. After he sold this firm in 1986 he served on the boards of multiple IT consultancy corporations and was an advisor on several research projects in the software industry throughout the 1990s.
In June 1997, Yourdon was inducted into the Computer Hall of Fame, along with such notables as Charles Babbage, James Martin, Grace Hopper, and Gerald Weinberg. In December 1999 Crosstalk: The Journal of Defense Software Engineering named him one of the ten most influential people in the software field.
In the late 1990s, Yourdon became the center of controversy over his belief that Y2K-related computer problems could result in severe software failures culminating in widespread social collapse. Due to the efforts of thousands of dedicated technologists, developers and project managers, the potential failure points that Yourdon and others had identified early were largely remediated, and the predicted problems did not materialize.
In the new millennium, Yourdon became Faculty Fellow at the Information Systems Research Center of the University of North Texas as well as Fellow of the Business Technology Trends Council for the Cutter Consortium, where he also was editor of the Cutter IT Journal.
Work
After developing structured analysis techniques of the 1970s, and object-oriented analysis/design in the late 1980s and 1990s, in the new millennium Yourdon specialized in project management, software engineering methodologies, and Web 2.0 development. He also founded and published American Programmer magazine (now titled Cutter IT Journal). He is the author of the book Decline and Fall of the American Programmer.
Yourdon Inc.
In 1974, Yourdon founded the consulting firm Yourdon Inc. in New York, which provided consulting, educational and publishing in the field of software engineering. In the early 1980s, the company had multiple offices in North America and Eu |
https://en.wikipedia.org/wiki/Monkey%20and%20banana%20problem | The monkey and banana problem is a famous toy problem in artificial intelligence, particularly in logic programming and planning.
Formulation of the problem
A monkey is in a room. Suspended from the ceiling is a bunch of bananas, beyond the monkey's reach. However, in the room there are also a chair and a stick. The ceiling is just the right height so that a monkey standing on a chair could knock the bananas down with the stick. The monkey knows how to move around, carry other things around, reach for the bananas, and wave a stick in the air. What is the best sequence of actions for the monkey?
Purpose of the problem
The problem seeks to answer the question of whether monkeys are intelligent. Both humans and monkeys have the ability to use mental maps to remember things like where to go to find shelter, or how to avoid danger. They can also remember where to go to gather food and water, as well as how to communicate with each other. Monkeys have the ability not only to remember how to hunt and gather but to learn new things, as is the case with the monkey and the bananas: despite the fact that the monkey may never have been in an identical situation, with the same artifacts at hand, a monkey is capable of concluding that it needs to make a ladder, position it below the bananas, and climb up to reach for them.
The degree to which such abilities should be ascribed to instinct or learning is a matter of debate.
In 1984, a pigeon was observed to have the capacity to solve an analogous problem.
Software solutions
The problem is used as a toy problem for computer science. It can be solved with an expert system such as CLIPS. The example rule set that CLIPS provides is somewhat fragile: naive changes to the rulebase that might seem to a human to make common sense can cause the engine to fail to get the monkey to reach the bananas.
Other examples exist, such as a rules-based system (RBS) project implemented in Python.
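The planning flavour of the problem can also be shown without an expert-system shell. The following Python sketch is illustrative only (the place names, state encoding, and action names are invented, not taken from the CLIPS or RBS implementations cited above): it finds the shortest action sequence by breadth-first search over world states.

```python
from collections import deque

# State: (monkey_at, box_at, monkey_on_box, has_bananas); bananas hang at 'middle'.
START = ('door', 'window', False, False)
PLACES = ('door', 'window', 'middle')

def moves(state):
    monkey, box, on_box, has = state
    if not on_box:
        for p in PLACES:
            yield f'walk to {p}', (p, box, False, has)        # monkey walks alone
        if monkey == box:
            for p in PLACES:
                yield f'push box to {p}', (p, p, False, has)  # monkey pushes the box
            yield 'climb box', (monkey, box, True, has)
    elif monkey == 'middle':
        yield 'grasp bananas', (monkey, box, True, True)      # standing under bananas

def plan(start):
    """Breadth-first search for the shortest action sequence that gets the bananas."""
    frontier, seen = deque([(start, [])]), {start}
    while frontier:
        state, path = frontier.popleft()
        if state[3]:                      # has_bananas
            return path
        for action, nxt in moves(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [action]))

print(plan(START))
# ['walk to window', 'push box to middle', 'climb box', 'grasp bananas']
```

Unlike a rule engine, exhaustive search is trivially robust to rule changes, which is why the problem is small enough to serve as a toy benchmark for both approaches.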
See also
Tool use by animals
References
Logic puzzles |
https://en.wikipedia.org/wiki/Locomotive%20Software | Locomotive Software was a small British software house that did most of its development for Amstrad's home and small business computers of the 1980s. It was founded by Richard Clayton and Chris Hall on 14 February 1983.
It wrote or contributed significantly to the ROMs of the Amstrad CPC 464, Amstrad CPC 664 and Amstrad CPC 6128 home computers, the Amstrad PCW word processor and the later Amstrad-manufactured ZX Spectrum +2A, +2B and +3 machines, amongst others. Its Locomotive BASIC for the CPC range was a fast and full-featured implementation of BASIC for its time and later led to the development of Mallard BASIC for Amstrad's CP/M+ machines. Locomotive was also responsible for the ports of the CP/M operating system to Amstrad machines: initially CP/M 2.2 for the CPC464 and CPC664, and later CP/M 3.0 ("CP/M+") for the CPC6128, PCW range and Spectrum +3.
A later Locomotive BASIC was BASIC2 for Digital Research's GEM graphical user interface, as supplied with the Amstrad PC1512 and PC1640 range of PC clones.
The company also developed the LocoScript word processor for the PCW, which was a complete bootable environment in its own right with no separate underlying operating system. The company later produced a PC version of this software, but it was not very successful: partly because it was a DOS application released just as the PC market was moving to Microsoft Windows, and partly because the program compared poorly to incumbents like WordPerfect in the more competitive market for PC word processors.
The same team later went on to develop the Turnpike Internet client for Windows, which was for many years distributed as the standard access software by pioneering dial-up Internet access provider Demon Internet. Demon Internet later acquired Locomotive Software.
References
Amstrad CPC |
https://en.wikipedia.org/wiki/Mallard%20BASIC | Mallard BASIC is a BASIC interpreter for CP/M produced by Locomotive Software and supplied with the Amstrad PCW range of small business computers, the ZX Spectrum +3 version of CP/M Plus, and the Acorn BBC Micro's Zilog Z80 second processor.
In the 1980s, it was standard industry practice to bundle a BASIC interpreter with microcomputers, and the PCW followed this practice. While the PCW was primarily a dedicated word processor for business use running LocoScript, it also shipped with the CP/M operating system, under which Mallard BASIC ran.
There were many existing implementations of BASIC for CP/M, such as Digital Research's CBASIC and the third-party ZBasic, but they followed the earlier 1970s model of compilers that were fed source code prepared in a separate text editor. BASIC was not built in; in these cases, the user would prepare a program and then invoke BASIC to run it.
In contrast, home computers of the era had moved to using BASIC as the primary interface for the machine. Instead of booting into CP/M or a similar OS, these machines booted directly into a BASIC normally stored on ROM. These also included a built-in screen editor. Mallard was based on this model, with an integrated editor that was tailored for the PCW's non-standard 90-column screen.
Although the PCW actually had excellent monochrome graphics support for its time and specification, closely comparable to the Hercules Graphics Card for IBM PC compatible computers, Mallard BASIC had no graphics support whatsoever. Instead, Locomotive Software optimised it for business use, with, for instance, full ISAM random-access file support, making it easier to write database applications.
It was also optimised for speed. It is named after the LNER Class A4 4468 Mallard locomotive, the fastest steam locomotive in the world; the Locomotive Software name itself came from the phrase "to run like a train", and it was this theme that inspired the name Mallard BASIC. No other Locomotive Software product was named after anything railway-oriented.
The Acorn version was designed simply to run the Compact Software small business accounting products Acorn was including to target its Z80 second processor at small businesses. Mallard's major innovation designed specifically for Acorn was the addition of the Jetsam B*-tree keyed access filing system to give similar (but superior) features to the Miksam product Compact had originally designed around.
Graphics could be implemented by loading the GSX extension to CP/M, but this was cumbersome for BASIC programmers.
The lack of graphics support was rectified by several BASIC toolkits, of which the most popular was Lightning Extended BASIC (LEB — see external links). This patched Mallard BASIC, replacing the redundant LET keyword with LEB, which could be followed by a wide variety of parameters to allow sophisticated graphics (for the time) to be drawn on screen, saved to disc, printed, and |
https://en.wikipedia.org/wiki/Olam | Olam may refer to:
Olam International, a food and agri-business company based in Singapore
Olam (network), a network of Jewish and Israeli development and humanitarian organizations
Justin Olam (born 1993), Papua New Guinean rugby league footballer
See also
Alam, a name
El (deity), or El olam |
https://en.wikipedia.org/wiki/Member%20variable | In object-oriented programming, a member variable (sometimes called a member field) is a variable that is associated with a specific object, and accessible for all its methods (member functions).
In class-based programming languages, these are distinguished into two types: class variables (also called static member variables), where only one copy of the variable is shared with all instances of the class; and instance variables, where each instance of the class has its own independent copy of the variable.
Examples
C++
class Foo {
int bar; // Member variable
public:
void setBar(const int newBar) {
bar = newBar;
}
};
int main () {
Foo rect; // Local variable
return 0;
}
Java
public class Program
{
public static void main(String[] args)
{
// This is a local variable. Its lifespan
// is determined by lexical scope.
Foo foo;
}
}
public class Foo
{
/* This is a member variable - a new instance
of this variable will be created for each
new instance of Foo. The lifespan of this
variable is equal to the lifespan of "this"
instance of Foo
*/
int bar;
}
Python
class Foo:
def __init__(self):
self._bar = 0
@property
def bar(self):
return self._bar
@bar.setter
def bar(self, new_bar):
self._bar = new_bar
f = Foo()
f.bar = 100
print(f.bar)
Common Lisp
(defclass foo () (bar))
(defvar f (make-instance 'foo))
(setf (slot-value f 'bar) 100)
(print (slot-value f 'bar))
Ruby
# Ruby has three member variable types: class, class instance, and instance.
class Dog
# The class variable is defined within the class body with two at-signs
# and describes data about all Dogs *and* their derived Dog breeds (if any)
@@sniffs = true
end
mutt = Dog.new
mutt.class.sniffs #=> true
class Poodle < Dog
# The "class instance variable" is defined within the class body with a single at-sign
# and describes data about only the Poodle class. It makes no claim about its parent class
# or any possible subclass derived from Poodle
@sheds = false
# When a new Poodle instance is created, by default it is untrained. The 'trained' variable
# is local to the initialize method and is used to set the instance variable @trained
# An instance variable is defined within an instance method and is a member of the Poodle instance
def initialize(trained = false)
@trained = trained
end
def has_manners?
@trained
end
end
p = Poodle.new
p.class.sheds #=> false
p.has_manners? #=> false
PHP
<?php
class Example
{
/**
* Example instance member variable.
*
* Member variables may be public, protected or private.
*
* @var int
*/
public int $foo;
/**
* Example static member variab |
https://en.wikipedia.org/wiki/TerraServer | Terraserver may refers to either of two databases for viewing geospatial imagery:
Terraserver.com, a commercial web site
TerraServer-USA, which hosts public domain United States Geological Survey aerial images on Microsoft servers |
https://en.wikipedia.org/wiki/Animated%20mapping | Animated mapping is the application of animation, either a computer or video, to add a temporal component to a map displaying change in some dimension. Most commonly the change is shown over time, generally at a greatly changed scale (either much faster than real-time or much slower). An example would be the animation produced after the 2004 tsunami showing how the waves spread across the Indian Ocean.
History
The concept of animated maps began in the 1930s but did not become more developed by cartographers until the 1950s. In 1959, Norman Thrower published Animated Cartography, discussing the use of animated maps in adding a new dimension that was difficult to express in static maps: time. These early maps were created by drawing "snap-shots" of static maps, putting a series of maps together to form a scene, and creating animation through photography tricks (Thrower 1959). Such early maps rarely had an associated scale, legends or oriented themselves to lines of longitude or latitude.
With the development of computers in the 1960s and 1970s, animation programs were developed allowing the growth of animation in mapping. Waldo Tobler created one of the first computer-generated map animations, using a 3-D computer-generated map to portray population growth over a specified time in Detroit. Hal Moellering created another animated map in 1976 representing a spatiotemporal pattern in traffic accidents.
Further development in the animated maps was stalled until the 1990s due to a lack of animation in academics, financial restrictions on research, and lack of distribution means. In the 1990s, however, the invention of faster, more efficient computers, compact discs, and the Internet solved such problems. Today, there are many free options for hosting animated maps online, including YouTube and GitHub. Internet GIS and web mapping both make extensive use of animated maps, particularly when showing time. Because of the nature of the internet, this may lead to the distribution of misinformation and contribute to the infodemic.
Visual variables
With the growth of animated mapping came the development of guidelines for creating animated maps. Visual variables such as spacing, lightness, and shape used for static maps apply. However, in 1991, David DiBiase and colleagues developed visual variables unique to animated maps: duration, rate of change, and order. Duration is the unit of time a frame or scene is displayed, affecting the smoothness of the animation. The shorter a frame is displayed, the smoother the animation will appear. Smoothness of animation is also a function of the rate of change. Order refers to the time sequence in which animation is played out, usually presented in chronological sequence. Alan MacEachren extended these visual variables in 1995 to include display date (time at which change is initiated), frequency (number of times identifiable forms are displayed), and synchronization (correspondence of 2 or more time series).
Types
A |
https://en.wikipedia.org/wiki/Owen%20Thomas%20Edgar | Owen Thomas Edgar (June 17, 1831 – September 3, 1929) was, according to data from the United States Department of Veterans Affairs, the longest surviving U.S. veteran of the Mexican–American War.
Biography
He was born in Philadelphia, Pennsylvania. He enlisted in the United States Navy as a 2nd-class apprentice on February 10, 1846, and was discharged as an Apprentice First Class on August 8, 1849. Edgar saw service on the frigates Potomac, Allegheny, Pennsylvania and Experience.
After the war, he worked at the Bureau of Engraving and Printing for twenty-one years, then worked at a bank for another thirty-one years. He spent his final ten years living at the John Dickson Home in Washington, D.C.
He became the last surviving American veteran of the Mexican-American War on June 17, 1929 when fellow war veteran William Buckner died at the age of 101 in Paris, Missouri.
Death and interment
Edgar died September 3, 1929, at the age of 98 after suffering a fall from a chair that fractured his leg, and was buried in Washington's Congressional Cemetery.
Gallery
References
External links
Owen Thomas Edgar
1831 births
1929 deaths
United States Navy personnel of the Mexican–American War
Military personnel from Philadelphia
United States Navy sailors
Accidental deaths from falls
Burials at the Congressional Cemetery |
https://en.wikipedia.org/wiki/Dunnet%20%28video%20game%29 | Dunnet is a surreal, cyberpunk text adventure written by Ron Schnell, based on a game he wrote in 1982. The name is derived from the first three letters of dungeon and the last three letters of ARPANET. It was first written in Maclisp for the DECSYSTEM-20, then ported to Emacs Lisp in 1992. Since 1994 the game has shipped with GNU Emacs; it also has been included with XEmacs.
The game has been recommended to writers considering writing interactive fiction.
Plot
The game starts out with the player standing at the end of a dirt road, but it turns to the surreal when players realize that they are actually walking around inside a Unix system, and teleporting themselves around the Arpanet. There are many subtle jokes in this game, and there are multiple ways of ending the game. Throughout the game the player moves through different areas and rooms trying to collect treasure to earn points.
Legacy
Dunnet is playable on any operating system with the Emacs editor. Emacs comes with most Unices, including macOS (prior to version 10.15 Catalina) and distributions of Linux. Several articles targeted to Mac OS X owners have recommended it as an easter egg as a game that can be run in Terminal.app. It can be run by running emacs -batch -l dunnet in a shell or the key sequence M-x dunnet within Emacs, the former being the preferred and official way to run it. Dunnet was used as a benchmark in the effort to port Emacs Lisp to Guile, progressing from running standalone games to running the entire Emacs system in less than a person-year of work.
References
External links
Source code, of the eLisp port, GPLv3 license
1982 video games
1980s interactive fiction
Emacs
Emacs modes
Linux games
MacOS games
Video games developed in the United States
Open-source video games
Video games with textual graphics |
https://en.wikipedia.org/wiki/Temporal%20database | A temporal database stores data relating to time instances. It offers temporal data types and stores information relating to past, present and future time.
Temporal databases can be uni-temporal, bi-temporal or tri-temporal.
More specifically the temporal aspects usually include valid time, transaction time or decision time.
Valid time is the time period during or event time at which a fact is true in the real world.
Transaction time is the time at which a fact was recorded in the database.
Decision time is the time at which the decision was made about the fact.
Types
Uni-temporal
A uni-temporal database has one axis of time, either the validity range or the system time range.
Bi-temporal
A bi-temporal database has two axes of time:
valid time
transaction time or decision time
Tri-temporal
A tri-temporal database has three axes of time:
valid time
transaction time
decision time
This approach introduces additional complexities.
Temporal databases are in contrast to current databases (not to be confused with currently available databases), which store only facts which are believed to be true at the current time.
Features
Temporal databases support managing and accessing temporal data by providing one or more of the following features:
A time period datatype, including the ability to represent time periods with no end (infinity or forever)
The ability to define valid and transaction time period attributes and bitemporal relations
System-maintained transaction time
Temporal primary keys, including non-overlapping period constraints
Temporal constraints, including non-overlapping uniqueness and referential integrity
Update and deletion of temporal records with automatic splitting and coalescing of time periods
Temporal queries at current time, time points in the past or future, or over durations
Predicates for querying time periods, often based on Allen's interval relations
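The interaction of the valid-time and transaction-time axes can be sketched in a few lines of Python. The schema, field names, and dates below are invented for illustration: each row carries a valid-time period and a transaction-time period, and an "as of" query filters on both axes using half-open intervals.

```python
from dataclasses import dataclass
from datetime import date

INF = date.max  # open-ended period ("forever")

@dataclass
class Row:
    """One bitemporal version of a fact."""
    value: str
    valid_from: date   # when the fact is true in the modelled world
    valid_to: date
    tx_from: date      # when the database believed this version
    tx_to: date

def as_of(rows, valid, tx):
    """What did the database, as known at `tx`, say was true at `valid`?"""
    return [r.value for r in rows
            if r.valid_from <= valid < r.valid_to
            and r.tx_from <= tx < r.tx_to]

rows = [
    # Recorded on 2020-01-10: address is "Oak St" from 2020-01-01 onward.
    Row("Oak St", date(2020, 1, 1), INF, date(2020, 1, 10), date(2021, 3, 5)),
    # Correction recorded on 2021-03-05: the move actually happened 2020-02-01.
    Row("Oak St", date(2020, 2, 1), INF, date(2021, 3, 5), INF),
]

print(as_of(rows, date(2020, 1, 15), date(2020, 6, 1)))  # ['Oak St'] (old belief)
print(as_of(rows, date(2020, 1, 15), date(2021, 6, 1)))  # [] (after the correction)
```

The correction does not overwrite the first row; it merely closes that row's transaction-time period, so both the original belief and the corrected one remain queryable.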
History
With the development of SQL and its attendant use in real-life applications, database users realized that when they added date columns to key fields, some issues arose. For example, if a table has a primary key and some attributes, adding a date to the primary key to track historical changes can lead to creation of more rows than intended. Deletes must also be handled differently when rows are tracked in this way. In 1992, this issue was recognized but standard database theory was not yet up to resolving this issue, and neither was the then-newly formalized standard.
Richard Snodgrass proposed in 1992 that temporal extensions to SQL be developed by the temporal database community. In response to this proposal, a committee was formed to design extensions to the 1992 edition of the SQL standard (ANSI X3.135.-1992 and ISO/IEC 9075:1992); those extensions, known as TSQL2, were developed during 1993 by this committee. In late 1993, Snodgrass presented this work to the group responsible for the American National Standard for Database Language SQL, |
https://en.wikipedia.org/wiki/File%20deletion | File deletion is the removal of a file from a computer's file system.
All operating systems include commands for deleting files (rm on Unix, era in CP/M and DR-DOS, del/erase in MS-DOS/PC DOS, DR-DOS, Microsoft Windows etc.). File managers also provide a convenient way of deleting files. Files may be deleted one-by-one, or a whole directory tree may be deleted.
Purpose
Examples of reasons for deleting files are:
Freeing the disk space
Removing duplicate or unnecessary data to avoid confusion
Making sensitive information unavailable to others
Removing an operating system or blanking a hard drive
Accidental removal
A common problem with deleting files is the accidental removal of information that later proves to be important. A common method to prevent this is to back up files regularly. Erroneously deleted files may then be found in archives.
Another technique often used is not to delete files instantly, but to move them to a temporary directory whose contents can then be deleted at will. This is how the "recycle bin" or "trash can" works. Microsoft Windows and Apple's macOS, as well as some Linux distributions, all employ this strategy.
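The move-to-trash strategy is easy to sketch in Python. The trash location and renaming scheme below are illustrative only, not those of any particular operating system:

```python
import pathlib
import shutil
import tempfile

# Illustrative trash location; real systems use e.g. ~/.local/share/Trash.
TRASH = pathlib.Path(tempfile.gettempdir()) / "demo_trash"

def soft_delete(path):
    """Move a file into the trash directory instead of unlinking it."""
    TRASH.mkdir(exist_ok=True)
    src = pathlib.Path(path)
    target = TRASH / src.name
    n = 0
    while target.exists():               # avoid clobbering an earlier deletion
        n += 1
        target = TRASH / f"{src.name}.{n}"
    shutil.move(str(src), str(target))
    return target

def restore(trashed, original):
    """Undo a soft delete by moving the file back to its original path."""
    shutil.move(str(trashed), str(original))
```

Real implementations (such as the FreeDesktop.org trash specification or the Windows Recycle Bin) additionally record the original path and deletion time alongside each trashed file, so restoration needs no arguments from the user.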
In MS-DOS, one can use the undelete command. In MS-DOS the "deleted" files are not really deleted, but only marked as deleted, so they can be undeleted for some time afterwards, until the disk blocks they used are eventually taken up by other files. This is how data recovery programs work: by scanning for files that have been marked as deleted. As the space is freed up per block, rather than per file, data can sometimes be recovered incompletely. Defragmenting a drive may prevent undeletion, as the blocks used by a deleted file might be overwritten since they are marked as "empty".
Another precautionary measure is to mark important files as read-only. Many operating systems will warn the user trying to delete such files. Where file system permissions exist, users who lack the necessary permissions are only able to delete their own files, preventing the erasure of other people's work or critical system files.
Sensitive data
The common problem with sensitive data is that deleted files are not really erased and so may be recovered by interested parties. Most file systems only remove the link to data. But even overwriting parts of the disk with something else or formatting it may not guarantee that the sensitive data is completely unrecoverable. Special software is available that overwrites data, and modern (post-2001) ATA drives include a secure erase command in firmware. However, high-security applications and high-security enterprises can sometimes require that a disk drive be physically destroyed to ensure data is not recoverable, as microscopic changes in head alignment and other effects can mean even such measures are not guaranteed. When the data is encrypted only the encryption key has to be unavailable. Crypto-shredding is the practice of 'deleting' data by (only) deleting or overwri |
https://en.wikipedia.org/wiki/Graph%20rewriting | In computer science, graph transformation, or graph rewriting, concerns the technique of creating a new graph out of an original graph algorithmically. It has numerous applications, ranging from software engineering (software construction and also software verification) to layout algorithms and picture generation.
Graph transformations can be used as a computation abstraction. The basic idea is that if the state of a computation can be represented as a graph, further steps in that computation can then be represented as transformation rules on that graph. Such rules consist of an original graph, which is to be matched to a subgraph in the complete state, and a replacing graph, which will replace the matched subgraph.
Formally, a graph rewriting system usually consists of a set of graph rewrite rules of the form L → R, with L called the pattern graph (or left-hand side) and R called the replacement graph (or right-hand side of the rule). A graph rewrite rule is applied to the host graph by searching for an occurrence of the pattern graph (pattern matching, thus solving the subgraph isomorphism problem) and by replacing the found occurrence by an instance of the replacement graph. Rewrite rules can be further regulated in the case of labeled graphs, such as in string-regulated graph grammars.
Sometimes graph grammar is used as a synonym for graph rewriting system, especially in the context of formal languages; the different wording is used to emphasize the goal of constructions, like the enumeration of all graphs from some starting graph, i.e. the generation of a graph language – instead of simply transforming a given state (host graph) into a new state.
Graph rewriting approaches
Algebraic approach
The algebraic approach to graph rewriting is based upon category theory. The algebraic approach is further divided into sub-approaches, the most common of which are the double-pushout (DPO) approach and the single-pushout (SPO) approach. Other sub-approaches include the sesqui-pushout and the pullback approach.
From the perspective of the DPO approach, a graph rewriting rule is a pair of morphisms in the category of graphs and graph homomorphisms between them: r = (L ← K → R), also written L ⊇ K ⊆ R, where K → L is injective. The graph K is called the invariant or sometimes the gluing graph. A rewriting step or application of a rule r to a host graph G is defined by two pushout diagrams both originating in the same morphism d: K → D, where D is a context graph (this is where the name double-pushout comes from). Another graph morphism m: L → G models an occurrence of L in G and is called a match. Practical understanding of this is that L is a subgraph that is matched from G (see subgraph isomorphism problem), and after a match is found, L is replaced with R in the host graph, where K serves as an interface, containing the nodes and edges which are preserved when applying the rule. The graph K is needed to attach the pattern being matched to its context: if it is empty, the match can only designate a whole conne
https://en.wikipedia.org/wiki/Laurel%20Networks | Laurel Networks was founded in 1999, and specialized in routers for telecommunications carriers.
Funding was provided in four rounds, the first two of which were:
Round 1: $12.3 million, led by New Enterprise Associates (NEA) and Rein Capital
Round 2: $60M, led by NEA, Trinity Ventures, Worldview Technology Partners and WorldCom Venture Fund
In 2005, after ultimately consuming $120M in venture capital funding, they were purchased by ECI Telecom for $88M, and formally renamed as the Data Networking Division within ECI.
The company's primary product was the ST Series of service edge routers. ECI considered the router's ability to do complicated traffic shaping, monitoring and QoS at line rate to be its primary competitive advantage.
The company was located in Robinson Township in the Pittsburgh region, having initially started in Sewickley, Pennsylvania.
On November 8, 2011, it was announced that the Pittsburgh office would be closed and that all employees would be laid off by September 30, 2012.
References
External links
ECI Telecom, Data Networking Division web site
Companies based in Allegheny County, Pennsylvania
Telecommunications companies of the United States
Telecommunications companies established in 1999
American companies established in 1999
1999 establishments in Pennsylvania
Companies disestablished in 2012
2012 disestablishments in Pennsylvania
Defunct companies based in Pennsylvania |
https://en.wikipedia.org/wiki/One-way%20voice%20link | A one-way voice link (OWVL) is typically a radio based communication method used by spy networks to communicate with agents in the field typically (but not exclusively) using shortwave radio frequencies.
Shortwave frequencies were and are generally preferred for their long range, as a communications link of 1,200 km is easily possible. VHF and UHF frequencies can be used for one-way voice circuits but are generally not preferred, as their range is at best about 300 km over flat terrain. Since the 1970s, infrared point-to-point communication systems that offer OWVLs have been used, but the number of users has always been limited.
This communications system often employs recorders to transmit pre-recorded messages in real time or in burst transmissions, which minimize the time that a spy needs to be on the air. Voice-scrambling systems have been selectively used for this kind of communications circuit since the 1980s, based on operational needs.
Since personal computers became cheap and readily available in the 2000s, time-compressed voice scrambling for one-way and bi-directional circuits has been a practically free technology.
OWVLs have existed outside espionage; for example, the NICAM transmission system was modified in the UK to provide an OWVL to BBC mobile units. This OWVL was typically used for sports events, as it was highly flexible.
Historical context
During the mid-to-late Cold War, the Stasi (the East German intelligence agency) used point-to-point infrared technology for two-way voice links within the divided city of Berlin; OWVLs were used intermittently.
OWVL transmission methods were used during the Falklands War by UK elite forces to provide information about suitable troop landing areas. This fact emerged in the late 1980s when UK veterans of the war were writing their memoirs. Argentina had access to similar technology to communicate with its military, but did not really use it during this conflict.
See also
Numbers station
Espionage techniques |
https://en.wikipedia.org/wiki/Kohn%20Pedersen%20Fox | Kohn Pedersen Fox Associates (KPF) is an American architecture firm that provides architecture, interior, programming and master planning services for clients in both the public and private sectors. KPF is one of the largest architecture firms in New York City, where it is headquartered.
History
Beginnings in the United States (1976–1980s)
KPF was founded in 1976 by A. Eugene Kohn, William Pedersen, and Sheldon Fox, all of whom coordinated their departure from John Carl Warnecke & Associates, among the largest architectural firms in the country. Shortly thereafter, the American Broadcasting Company (ABC) chose KPF to redevelop a former armory building on Manhattan's West Side to house TV studios and offices. This led to 14 more projects for ABC over the next 11 years, as well as commissions from major corporations across the country, including AT&T and Hercules Incorporated. By the mid-1980s, KPF had nearly 250 architects working on projects in cities throughout the United States. In 1985, John Burgee (of rival architecture firm John Burgee Architects) called KPF "The best commercial firm now practicing in the U.S." KPF's design for 333 Wacker Drive in Chicago (1983), which was awarded the AIA National Honor Award in 1984, made the firm nationally famous. It remains a Chicago landmark, and was voted "Favorite Building" by the readers of the Chicago Tribune in both 1995 and 1997. In 1986, KPF's Procter & Gamble Headquarters in Cincinnati, which included an open plan interior design by Patricia Conway, was recognized for its innovative design with the AIA National Honor Award.
After its success with these projects, KPF was selected to design the IBM World Headquarters in Armonk, New York (1997), the Chicago Title and Trust Building in Chicago (1992), and the Federal Reserve Bank of Dallas (1993).
In the 1990s, KPF also took on a larger number of government and civic projects, including the Foley Square U.S. Courthouse in New York (1995), the Mark O. Hatfield U.S. Courthouse in Portland, Oregon (1996), the U.S. Courthouse of Minneapolis (1996), the Buffalo Niagara International Airport (1993) and the multiple award-winning redevelopment of The World Bank Headquarters in Washington, D.C. (1996).
KPF's winning entry in the international competition for the World Bank Headquarters, which drew 76 entrants from 26 countries, was the only entry that included the retention of existing structures.
Expansion to Europe (1980s–1990s)
In the 1980s and 1990s, KPF transformed from an American firm known for its corporate designs into an international firm with institutional, government, and transportation commissions in addition to corporate work.
KPF completed the design for two blocks of the large-scale Canary Wharf redevelopment (1987) and the Goldman Sachs Headquarters on Fleet Street (1987–1991). KPF has been selected for projects in the Canary Wharf area through to the present day, including the Clifford Chance Tower (2002) to KPMG's European Headqua |
https://en.wikipedia.org/wiki/RQC | RQC may refer to:
Relativistic quantum chemistry, a subfield of quantum chemistry.
Remote Access Quarantine Client, a program, rqc.exe, in the Windows Server 2003 operating system.
Review Quality Collector, a service aiming at improving the quality of scientific peer review.
Russian Quantum Center, a non-commercial scientific organization in Russia doing research in quantum mechanics and quantum computing. |
https://en.wikipedia.org/wiki/Parallel%20coordinates | Parallel coordinates are a common way of visualizing and analyzing high-dimensional datasets.
To show a set of points in an n-dimensional space, a backdrop is drawn consisting of n parallel lines, typically vertical and equally spaced. A point in n-dimensional space is represented as a polyline with vertices on the parallel axes; the position of the vertex on the i-th axis corresponds to the i-th coordinate of the point.
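The vertex construction just described can be sketched in a few lines of Python (a minimal illustration; rescaling each coordinate into [0, 1] by per-axis minima and maxima is an assumption made here, since implementations choose their own axis scaling):

```python
def polyline_vertices(point, mins, maxs):
    """Map an n-dimensional point to the vertices of its polyline.

    Axis i is drawn as a vertical line at x = i; the vertex height on
    that axis is the point's i-th coordinate rescaled to [0, 1] using
    per-axis minima and maxima.
    """
    return [
        (i, (v - lo) / (hi - lo))
        for i, (v, lo, hi) in enumerate(zip(point, mins, maxs))
    ]

# A 4-dimensional point with per-axis data ranges:
verts = polyline_vertices(point=(3.0, 50.0, 0.2, 7.0),
                          mins=(0.0, 0.0, 0.0, 0.0),
                          maxs=(6.0, 100.0, 1.0, 14.0))
print(verts)  # [(0, 0.5), (1, 0.5), (2, 0.2), (3, 0.5)]
```

Drawing the returned vertices as a connected polyline over the parallel axes reproduces the representation described above.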
This visualization is closely related to time series visualization, except that it is applied to data where the axes do not correspond to points in time, and therefore do not have a natural order. Therefore, different axis arrangements may be of interest.
History
Parallel coordinates are often said to have been invented by Philbert Maurice d'Ocagne in 1885, but even though the words "Coordonnées parallèles" appear in the book title, that work has nothing to do with the visualization technique of the same name; the book only describes a method of coordinate transformation. Even before 1885, however, parallel coordinates were in use, for example in Henry Gannett's "General Summary, Showing the Rank of States, by Ratios, 1880", and afterwards in Henry Gannett's "Rank of States and Territories in Population at Each Census, 1790-1890" of 1898. They were popularised again 87 years later by Alfred Inselberg in 1985, and systematically developed as a coordinate system starting from 1977. Some important applications are in collision avoidance algorithms for air traffic control (1987; three US patents), data mining (US patent), computer vision (US patent), optimization, process control, and more recently intrusion detection, among others.
Higher dimensions
On the plane with an xy Cartesian coordinate system, adding more dimensions in parallel coordinates (often abbreviated ||-coords or PCP) involves adding more axes. The value of parallel coordinates is that certain geometrical properties in high dimensions transform into easily seen 2D patterns. For example, a set of points on a line in n-space transforms to a set of polylines in parallel coordinates all intersecting at n − 1 points. For n = 2 this yields a point-line duality, pointing out why the mathematical foundations of parallel coordinates are developed in projective rather than Euclidean space. A pair of lines intersects at a unique point which has two coordinates and, therefore, can correspond to a unique line which is also specified by two parameters (or two points). By contrast, more than two points are required to specify a curve, and a pair of curves may not have a unique intersection. Hence by using curves in parallel coordinates instead of lines, the point-line duality is lost, together with all the other properties of projective geometry and the known nice higher-dimensional patterns corresponding to (hyper)planes, curves, several smooth (hyper)surfaces, proximities, convexity and, recently, non-orientability. The goal is to map n-dimensional relations into 2D patterns. Hence, para |
https://en.wikipedia.org/wiki/Brent%27s%20method | In numerical analysis, Brent's method is a hybrid root-finding algorithm combining the bisection method, the secant method and inverse quadratic interpolation. It has the reliability of bisection but it can be as quick as some of the less-reliable methods. The algorithm tries to use the potentially fast-converging secant method or inverse quadratic interpolation if possible, but it falls back to the more robust bisection method if necessary. Brent's method is due to Richard Brent and builds on an earlier algorithm by Theodorus Dekker. Consequently, the method is also known as the Brent–Dekker method.
Modern improvements on Brent's method include Chandrupatla's method, which is simpler and faster for functions that are flat around their roots; Ridders' method, which performs exponential interpolations instead of quadratic providing a simpler closed formula for the iterations; and the ITP method which is a hybrid between regula-falsi and bisection that achieves optimal worst-case and asymptotic guarantees.
Dekker's method
The idea to combine the bisection method with the secant method goes back to Dekker (1969).
Suppose that we want to solve the equation f(x) = 0. As with the bisection method, we need to initialize Dekker's method with two points, say a0 and b0, such that f(a0) and f(b0) have opposite signs. If f is continuous on [a0, b0], the intermediate value theorem guarantees the existence of a solution between a0 and b0.
Three points are involved in every iteration:
bk is the current iterate, i.e., the current guess for the root of f.
ak is the "contrapoint," i.e., a point such that f(ak) and f(bk) have opposite signs, so the interval [ak, bk] contains the solution. Furthermore, |f(bk)| should be less than or equal to |f(ak)|, so that bk is a better guess for the unknown solution than ak.
bk−1 is the previous iterate (for the first iteration, we set bk−1 = a0).
Two provisional values for the next iterate are computed. The first one is given by linear interpolation, also known as the secant method:
s = bk − f(bk) (bk − bk−1) / (f(bk) − f(bk−1)),
and the second one is given by the bisection method:
m = (ak + bk) / 2.
If the result of the secant method, s, lies strictly between bk and m, then it becomes the next iterate (bk+1 = s); otherwise the midpoint is used (bk+1 = m).
Then, the value of the new contrapoint is chosen such that f(ak+1) and f(bk+1) have opposite signs. If f(ak) and f(bk+1) have opposite signs, then the contrapoint remains the same: ak+1 = ak. Otherwise, f(bk+1) and f(bk) have opposite signs, so the new contrapoint becomes ak+1 = bk.
Finally, if |f(ak+1)| < |f(bk+1)|, then ak+1 is probably a better guess for the solution than bk+1, and hence the values of ak+1 and bk+1 are exchanged.
This ends the description of a single iteration of Dekker's method.
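The iteration described above can be sketched in Python (a hedged illustration: the tolerance, iteration cap, and argument names are choices made here, not part of Dekker's original formulation):

```python
import math

def dekker(f, a, b, tol=1e-12, max_iter=100):
    """Root finding by Dekker's method as described above (a sketch)."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    b_prev, fb_prev = a, fa                 # for the first iteration, b_{k-1} = a0
    for _ in range(max_iter):
        if abs(fa) < abs(fb):               # keep b the better guess, a the contrapoint
            a, fa, b, fb = b, fb, a, fa
        m = (a + b) / 2                     # bisection candidate
        if fb != fb_prev:                   # secant candidate (guard division by zero)
            s = b - fb * (b - b_prev) / (fb - fb_prev)
        else:
            s = m
        b_prev, fb_prev = b, fb
        # accept the secant step only if it lies strictly between b and m
        bk1 = s if min(b, m) < s < max(b, m) else m
        fbk1 = f(bk1)
        if fa * fbk1 < 0:                   # contrapoint keeps the root bracketed
            a_next, fa_next = a, fa
        else:
            a_next, fa_next = b, fb
        a, fa, b, fb = a_next, fa_next, bk1, fbk1
        if fb == 0 or abs(b - a) < tol:
            return b
    return b

root = dekker(math.cos, 1.0, 2.0)           # cos(x) = 0 at x = pi/2
print(abs(root - math.pi / 2) < 1e-8)       # True
```

Note that the exchange step from the description is performed at the top of the loop rather than the bottom, which is equivalent across iterations.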
Dekker's method performs well if the function f is reasonably well-behaved. However, there are circumstances in which every iteration employs the secant method, but the iterates bk converge very slowly (in particular, |bk − bk−1| may be arbitra |
https://en.wikipedia.org/wiki/PLS%20%28file%20format%29 | PLS is a computer file format for a multimedia playlist. It is typically used by media players for streaming media over the Internet, but may also be used for playing local media.
For online streaming, the .PLS file would typically be downloaded just once from the media source (such as an online radio station) for immediate or future use. While most computers and players automatically recognize the .PLS format, the first time a PLS file is used on a computer, the media player's settings may need to be changed so that it is associated with .PLS files.
PLS was originally developed for use with the museArc audio player software by codeArts, and was later used by SHOUTcast and Icecast for streaming media over the Internet. PLS is a more expressive playlist format than the basic M3U playlist, as it can store (cache) information on the song title and length (this is supported in extended M3U only).
File format
The format is case-sensitive and essentially that of an INI file, structured as follows:
Header
[playlist] : This tag indicates that it is a Playlist File
Track Entry
Assuming track entry #X
FileX : Variable defining location of media file/stream (like .m3u/.m3u8 playlists).
TitleX : Defines track title. (optional)
LengthX : Length in seconds of track. Value of -1 indicates indefinite (streaming). (optional)
Footer
NumberOfEntries : This variable indicates the number of tracks and therefore equals the number used for the last track
Version : Playlist version. Currently only a value of 2 is valid.
Examples
Example of a complete PLS file used for "streaming audio;" in this case, to connect to a particular online radio station and receive its audio stream:
[playlist]
File1=http://stream2.streamq.net:8020/
Title1=Here enter name of the station
NumberOfEntries=1
Alternative Example containing local paths:
[playlist]
File1=http://relay5.181.fm:8068
Length1=-1
File2=example2.mp3
Title2=Just some local audio that is 2mins long
Length2=120
File3=F:\Music\whatever.m4a
Title3=absolute path on Windows
File4=%UserProfile%\Music\short.ogg
Title4=example for an Environment variable
Length4=5
NumberOfEntries=4
Version=2
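Because the format is essentially an INI file, playlists like the examples above can be parsed with Python's standard configparser module, provided key case is preserved (a sketch; the track-dictionary layout is a choice made here, and the playlist text is a shortened variant of the second example):

```python
import configparser

# configparser lower-cases keys by default, so optionxform is
# overridden to keep the format's case-sensitive keys (File1, Title1,
# ...) intact. Interpolation is disabled so '%' in paths is literal.
pls_text = """\
[playlist]
File1=http://relay5.181.fm:8068
Length1=-1
File2=example2.mp3
Title2=Just some local audio that is 2mins long
Length2=120
NumberOfEntries=2
Version=2
"""

parser = configparser.ConfigParser(interpolation=None)
parser.optionxform = str          # preserve key case
parser.read_string(pls_text)

pl = parser["playlist"]
n = int(pl["NumberOfEntries"])
tracks = [
    {"file": pl[f"File{i}"],
     "title": pl.get(f"Title{i}"),            # optional key
     "length": int(pl.get(f"Length{i}", -1))} # optional; -1 = streaming
    for i in range(1, n + 1)
]
print(tracks[0]["length"])   # -1 signals an indefinite (streaming) source
```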
Unix/BSD/Linux/OS X
In Unix-like operating systems, absolute and relative file paths differ from Windows: there are no drive letters, environment variables differ, and forward slashes [/] are used as directory separators instead of backslashes [\]. Therefore, playlists pointing to absolute paths or to media files outside of the folder containing the playlist will only work on one type of operating system, either Windows or Unix-like. URLs work the same on all types.
To make the second example from above work the 3rd and 4th path need to be changed to something like:
File3=/media/hdd/whatever.m4a
File4=~/Music/short.ogg
Compatible media player software
iTunes, VLC media player, Totem, RealPlayer, Winamp, Yahoo! Music Jukebox, MediaMonkey, Windows Media Player, AIMP, Kodi, Rhythmbox, foobar2000, Audacious and more th |
https://en.wikipedia.org/wiki/Numerical%20differentiation | In numerical analysis, numerical differentiation algorithms estimate the derivative of a mathematical function or function subroutine using values of the function and perhaps other knowledge about the function.
Finite differences
The simplest method is to use finite difference approximations.
A simple two-point estimation is to compute the slope of a nearby secant line through the points (x, f(x)) and (x + h, f(x + h)). Choosing a small number h, h represents a small change in x, and it can be either positive or negative. The slope of this line is
(f(x + h) − f(x)) / h.
This expression is Newton's difference quotient (also known as a first-order divided difference).
The slope of this secant line differs from the slope of the tangent line by an amount that is approximately proportional to h. As h approaches zero, the slope of the secant line approaches the slope of the tangent line. Therefore, the true derivative of f at x is the limit of the value of the difference quotient as the secant lines get closer and closer to being a tangent line:
f′(x) = lim(h → 0) (f(x + h) − f(x)) / h.
Since immediately substituting 0 for h results in the indeterminate form 0/0, calculating the derivative directly can be unintuitive.
Equivalently, the slope could be estimated by employing the positions x − h and x.
Another two-point formula is to compute the slope of a nearby secant line through the points (x − h, f(x − h)) and (x + h, f(x + h)). The slope of this line is
(f(x + h) − f(x − h)) / (2h).
This formula is known as the symmetric difference quotient. In this case the first-order errors cancel, so the slope of these secant lines differs from the slope of the tangent line by an amount that is approximately proportional to h². Hence for small values of h this is a more accurate approximation to the tangent line than the one-sided estimation. However, although the slope is being computed at x, the value of the function at x is not involved.
The estimation error is given by
R = −h² f‴(c) / 6,
where c is some point between x − h and x + h.
This error does not include the rounding error due to numbers being represented and calculations being performed in limited precision.
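The accuracy advantage of the symmetric quotient is easy to check numerically (a short illustration; the test function, evaluation point, and step are chosen here, not taken from the article):

```python
import math

def forward_diff(f, x, h):
    # Newton's difference quotient: error roughly proportional to h
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # symmetric difference quotient: error roughly proportional to h**2
    return (f(x + h) - f(x - h)) / (2 * h)

x, h = 1.0, 1e-5
exact = math.cos(x)                       # derivative of sin at x
err_fwd = abs(forward_diff(math.sin, x, h) - exact)
err_ctr = abs(central_diff(math.sin, x, h) - exact)
print(err_ctr < err_fwd)   # True: the first-order errors cancel
```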
The symmetric difference quotient is employed as the method of approximating the derivative in a number of calculators, including the TI-82, TI-83, TI-84, and TI-85, all of which use this method with h = 0.001.
Step size
An important consideration in practice, when the function is calculated using floating-point arithmetic of finite precision, is the choice of step size, h. If chosen too small, the subtraction will yield a large rounding error. In fact, all the finite-difference formulae are ill-conditioned, and due to cancellation will produce a value of zero if h is small enough. If chosen too large, the slope of the secant line will be calculated more accurately, but the secant's estimate of the slope of the tangent line could be worse.
For basic central differences, the optimal step is the cube-root of machine epsilon.
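This rule of thumb can be checked numerically (a sketch; the test function and comparison step sizes are chosen here for illustration):

```python
import math
import sys

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
exact = math.cos(x)                          # derivative of sin at x
err = lambda h: abs(central_diff(math.sin, x, h) - exact)

h_opt = sys.float_info.epsilon ** (1 / 3)    # about 6.1e-6 for IEEE doubles
print(err(h_opt) < err(1e-2))    # True: less truncation error than a large step
print(err(h_opt) < err(1e-12))   # True: less cancellation error than a tiny step
```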
For the numerical derivative formula evaluated at x and x + h, a choice for h that is small without producing a large rounding error is √ε·x (though not when x = 0), where the machine eps |
https://en.wikipedia.org/wiki/The%20Ed%20Schultz%20Show | The Ed Schultz Show was a progressive talk radio program hosted by Ed Schultz. It was formerly broadcast from KFGO in Fargo, North Dakota. It was heard on a network of over 100 stations, including seven of the ten largest radio markets. It was also on XM and Sirius satellite radio.
Schultz's radio show moved to New York City in May 2009, a relocation brought on by the launch of his new television show, The Ed Show, on MSNBC.
History
Schultz launched The Ed Schultz Show on January 5, 2004. Created and financed by Democracy Radio and distributed by Jones Radio Networks, the show started in two markets (Needles, California and Langdon, North Dakota) and quickly grew, signing another dozen stations in smaller, mostly upper Midwest markets. For a while, Schultz continued his News and Views broadcasts, though by February 2005 it was announced that Joel Heitkamp, a North Dakota state senator, was taking over that show. On February 1, 2007, Ed Schultz returned to hosting the News and Views show.
After growing to approximately 95 affiliates, Democracy Radio sold its majority stake in The Ed Schultz Show to Product First in June 2005, a company started by Randy Michaels and Stu Krane, who had previously been involved with launching Rush Limbaugh's radio show. Distribution continued with Jones Radio Networks and subsequently by its successor, Dial Global.
Schultz's flagship KFGO dropped The Ed Schultz Show between January 2006 and February 2007 due to an apparent conflict with the station's management which new ownership cleared up. Fargo's KQWB aired the program in the interim.
In 2009, Talkers magazine rated Ed Schultz as the 18th most important talk show host in the United States.
During his show on May 24, 2011, Schultz called Laura Ingraham both a "right-wing slut" and a "talk slut". Feminist organizations, including the Women's Media Center, called for his suspension. The following day he issued an apology, saying he "used vile and inappropriate language when talking about talk show host Laura Ingraham. I am deeply sorry, and I apologize. It was wrong, uncalled for and I recognize the severity of what I said. I apologize to you, Laura, and ask for your forgiveness," and offering an indefinite self-suspension without pay. MSNBC issued a statement saying that it had accepted Schultz's offer to take one week of unpaid leave.
In May 2014, Schultz announced he was replacing his three-hour weekday radio program with a one-hour program, and shifting to a Web format. Schultz continued to host his Ed Show television program, which was then aired on weekdays at 5 p.m. ET.
Show features
Ed Schultz may be best known for his pro-labor reputation, because he was one of the few remaining pundits who still frequently featured labor issues and promoted the labor movement, associating the health of "labor" with the health of the middle class.
Ed Schultz often brought Los Angeles lawyer Norman Goldman to his show for legal analysis, thus given the title of |
https://en.wikipedia.org/wiki/Barbie%20Liberation%20Organization | The Barbie Liberation Organization, or BLO, are a group of artists and activists involved in culture jamming. Self described as "an underground network of creative activists," the group gained notoriety in 1993 after switching voice boxes in talking G.I. Joes and Barbie dolls. They resurfaced in August 2023, claiming to be the toy giant Mattel in order to announce a new collection of MyCeliaBarbie EcoWarrior Edition compostable dolls, and a corporate wide move to plastic free toy production. The group is currently active.
In their first campaign the BLO performed "surgery" on a reported 300–500 dolls from retail and returned them to shelves, an action they refer to as shopgiving. Thus, Teen Talk Barbie dolls would say phrases such as "Vengeance is mine", while G.I. Joe dolls would say phrases such as "The beach is the place for summer!" Two leading members of the BLO, Jacques Servin and Igor Vamos, would go on to found the culture jamming and political action group The Yes Men.
In the 2023 action, actress Daryl Hannah, posing as a spokesperson for Mattel introduced the collection of biodegradable dolls and announced in a short video that the company would stop using plastic by 2030. This was one of several videos produced by Yellow Dot Studios as part of the BLO's campaign to leverage the publicity surrounding the launch of the Barbie film to shine a light on the costs to society posed by our overwhelming use of plastics.
Motivation and context
The BLO was originally conceived in an effort to question and ultimately change the gender stereotypes American culture is known for after Mattel released a speaking Barbie that said "Math class is tough." It took place in the middle of the culture wars of the 1990s when creative dissent was once again gaining popularity and artists and activists were often trying to conceive of new ways to rebel against cultural stereotypes and powerful forms like network TV. By 1993, criticism of Barbie as a negative gender stereotype for women was commonplace both in academia and popular culture. This may have been partially responsible for the generally positive response of the public to the project—the criticism they were making was familiar and not a controversial point to make during the 1990s. Although their criticism was not new, the creative form of hacking used by the BLO was noteworthy.
Methods
There is a detailed description of the complex "surgery" they performed available on their website, encouraging others to take part in the surgeries themselves. The surgery required some technical skill, tools, and precision, but the voice boxes in the dolls were similar enough that it could be reproduced fairly easily in other parts of the country. They outlined the surgery in easy-to-understand images. After the surgery they would secretly return the toys to store shelves, a practice they call reverse shoplifting.
They also produced a video to explain their point. They used the familiar form of the nightl |
https://en.wikipedia.org/wiki/Arithmetic%20underflow | The term arithmetic underflow (also floating point underflow, or just underflow) is a condition in a computer program where the result of a calculation is a number of smaller absolute value than the computer can actually represent in memory on its central processing unit (CPU).
Arithmetic underflow can occur when the true result of a floating point operation is smaller in magnitude (that is, closer to zero) than the smallest value representable as a normal floating point number in the target datatype. Underflow can in part be regarded as negative overflow of the exponent of the floating point value. For example, if the exponent part can represent values from −128 to 127, then a result with a value less than −128 may cause underflow.
Storing values that are too low in an integer variable (e.g., attempting to store −1 in an unsigned integer) is properly referred to as integer overflow or, more broadly, integer wraparound. The term underflow normally refers to floating point numbers only, which is a separate issue. It is impossible in most floating-point designs to store a too-low value, as they are usually signed and have a negative infinity value.
Underflow gap
The interval between −fminN and fminN, where fminN is the smallest positive normal floating point value, is called the underflow gap. This is because the size of this interval is many orders of magnitude larger than the distance between adjacent normal floating point values just outside the gap. For instance, if the floating point datatype can represent 20 bits, the underflow gap is 2²¹ times larger than the absolute distance between adjacent floating point values just outside the gap.
In older designs, the underflow gap had just one usable value, zero. When an underflow occurred, the true result was replaced by zero (either directly by the hardware, or by system software handling the primary underflow condition). This replacement is called "flush to zero".
The 1984 edition of IEEE 754 introduced subnormal numbers. The subnormal numbers (including zero) fill the underflow gap with values where the absolute distance between adjacent values is the same as for adjacent values just outside the underflow gap. This enables "gradual underflow", where a nearest subnormal value is used, just as a nearest normal value is used when possible. Even when using gradual underflow, the nearest value may be zero.
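Gradual underflow is easy to observe from Python, whose floats are IEEE 754 doubles (a small demonstration; the constants come from the standard sys module):

```python
import sys

smallest_normal = sys.float_info.min        # smallest positive normal double
# Halving the smallest normal does not flush to zero: gradual
# underflow produces a subnormal value instead.
subnormal = smallest_normal / 2
print(subnormal > 0)                        # True

# The smallest positive subnormal double; halving it rounds to zero,
# the final stage of underflow.
tiniest = 5e-324
print(tiniest / 2 == 0.0)                   # True
```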
The absolute distance between adjacent floating point values just outside the gap is called the machine epsilon, typically characterized by the largest value whose sum with the value 1 will result in the answer with value 1 in that floating point scheme. This can be written as , where is a function which converts the real value into the floating point representation. While the machine epsilon is not to be confused with the underflow level (assuming subnormal numbers), it is closely related. The machine epsilon is dependent on the number of bits which make up the significand, whereas the underflo |
https://en.wikipedia.org/wiki/Stroke%20count%20method | The Stroke Count Method (), Wubihua method, Stroke input method or Bihua IME ( or ) (lit. 5-stroke input method) is a relatively simple Chinese input method for writing text on a computer or a mobile phone. It is based on the stroke order of a character, not its pronunciation. It uses five or six buttons, and is often placed on a numerical keypad. Although it is possible to input Traditional Chinese characters with this method, it is more often associated with Simplified Chinese characters. The Wubihua method should not be confused with the Wubi method.
Each of the five keys from 1 to 5 is assigned a certain type of stroke (resembling the Eight Principles of Yong); these five are sometimes called héng-shù-piē-nà-zhé, with each character of this phrase being a one-syllable description of the respective five strokes:
A horizontal stroke from left to right (一)
A vertical stroke from top to bottom (丨)
A long diagonal stroke downward from right to left (丿)
A very short dash stroke downward from left to right (丶)
A horizontal stroke from left to right, ending with a downwards hook to the left (乙)
To input any character, the user simply presses the keys corresponding to the strokes of the character, then selects from a list of matching characters. The list of suggestions to choose from becomes more and more specific as more digits of the code are entered. The system will not recognize a character input with an incorrect stroke order. Some people find this method of entering characters into a mobile phone to be faster than pinyin. In fact, as pinyin is based upon Mandarin Chinese, many Chinese people – particularly in the southern regions of China like Hong Kong and Macau – who speak other varieties of Chinese and never learned pinyin relied solely on this method of entering characters on their phones, until touchscreen-based smartphones made handwriting recognition possible.
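The narrowing candidate list can be sketched with a toy prefix lookup (illustrative only: the four stroke codes below follow the key numbering above, but a real input method ships a dictionary covering thousands of characters):

```python
# Digits 1-5 stand for the five stroke types numbered above.
CODES = {
    "一": "1",      # single horizontal stroke
    "十": "12",     # horizontal, then vertical
    "口": "251",    # vertical, hooked stroke, horizontal
    "中": "2512",   # the strokes of 口 followed by a vertical
}

def candidates(prefix):
    """Characters whose full stroke code starts with the digits typed
    so far; the list narrows as more strokes are entered."""
    return [ch for ch, code in CODES.items() if code.startswith(prefix)]

print(candidates("2"))     # ['口', '中']
print(candidates("2512"))  # ['中']
```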
Wubihua is one of the easiest methods to learn because it is simple and does not require knowledge of pronunciation or Pinyin. However, it tends to be ambiguous, as a Wubihua code will normally match ten characters, and each character has only one correct code, which confuses users whose stroke order is wrong.
Strokes map to Wubihua input generally according to the following table:
See also
Wubi method
Chinese input methods for computers
Stroke (CJK character)
Eight principles of yong: how stroke styles are taught to student calligraphers
Notes
References
External links
Wubihua For Speakers of English
Thesis on chinese language processing and computing – Wubihua
Han character input |
https://en.wikipedia.org/wiki/Hewson%20Consultants | Hewson Consultants were one of the smaller software companies which produced video games for home computers in the mid-1980s. They had a reputation for high-quality games which continually pushed the boundaries of what the computers were capable of and can be compared favourably with other ground-breaking software houses like Ultimate Play the Game and Beyond. Fourteen of their games were awarded "Megagame" by Your Sinclair.
Hewson was founded by Andrew Hewson in the early 1980s. He became interested in computers while working at the British Museum when their first machine arrived. After learning to program, Andrew wrote the programming guide book Hints and Tips for the ZX80. Following the publication, bedroom coders began to send Andrew the games they had programmed on cassette tape, giving Andrew the idea to publish the games. Hewson Consultants was born, and initially released games via mail order advertisements in computing magazines. Andrew was also a columnist in Sinclair User magazine throughout the 1980s.
Releases
Space Intruders - 1981 - Space Invaders clone for the ZX81
Pilot - 1982
Nightflite - 1982
Nightflite II - 1983
Knight Driver - 1983
Quest Adventure - 1983
3D Space-Wars - 1983
3D Seiddab Attack - 1984
3D Lunattack - 1984 - Reviewed in Crash issue 4 - 90%
Avalon - 1984 - Reviewed in Crash issue 10 - 91%
Technician Ted - 1984 - Reviewed in Crash issue 13 - 96%
Dragontorc - 1985
Astro Clone - 1985
Paradroid - 1985
Gribbly's Day Out - 1985
Firelord - 1986
Pyracurse - 1986
Quazatron - 1986
Southern Belle - 1986
Uridium - 1986
Technician Ted: The Megamix - 1986
Cybernoid - 1987
Exolon - 1987
Zynaps - 1987
Impossaball - 1987
Nebulus - 1987
Ranarama - 1987
Evening Star - 1987
Anarchy - 1987
Netherworld - 1988
Cybernoid II: The Revenge - 1988
Marauder - 1988
Zamzara - 1988
Eliminator - 1988
Battle Valley - 1988
Steel - 1989
Stormlord - 1989
Astaroth - 1989
Onslaught - 1989
Deliverance - 1990
Insects in Space - 1990
Zarathrusta - 1991
Legacy
Andrew and other members of the Hewson management team went on to form 21st Century Entertainment after Hewson shut down in the early 1990s, releasing several games such as Pinball Dreams, Pinball Fantasies and Pinball Illusions. Andrew was also the founder of ELSPA (European Leisure Software Publishers Association) which continues to be the European regulating body for the video games industry.
References
Defunct video game companies of the United Kingdom |
https://en.wikipedia.org/wiki/Personal%20Storage%20Table | In computing, a Personal Storage Table (.pst) is an open proprietary file format used to store copies of messages, calendar events, and other items within Microsoft software such as Microsoft Exchange Client, Windows Messaging, and Microsoft Outlook. The open format is controlled by Microsoft who provide free specifications and free irrevocable technology licensing.
The file format may also be known as a Personal Folders (File) or Post Office File. When functioning in its capacity as a cache for Outlook's Cached Exchange Mode feature, it may be called an Off-line Storage Table (.ost) or an Off-line Folders (File).
Overview
In Microsoft Exchange Server, the messages, the calendar, and other data items are delivered to and stored on the server. Microsoft Outlook stores these items in personal storage table (.pst) or off-line storage table (.ost) files located on the local computer. Most commonly, the .pst files are used to store archived items and the .ost files to maintain off-line availability of the items. This is an essential feature of Microsoft Outlook.
The size of these local files does not count against the size of the mailbox; by moving items from a server mailbox to .pst files, users can free storage space on their mail servers. To use the .pst files from another location, the user needs to be able to access them directly over a network from their mail client. While it is possible to open and use a .pst file over a network, this is unsupported, and Microsoft advises against it, as .pst files are prone to corruption when used in this manner.
Both the .pst and .ost files use a fixed-block-based allocation scheme; the file is enlarged by a fixed amount of bytes, and the file internally maintains information about the allocated and non-allocated blocks. So, when data files like email messages are added to a .pst file, its file size is automatically adjusted by the mail client (if necessary). When mail is deleted from a .pst file, the size of the .pst file will stay the same, marking the space as unallocated so that it will hold future data items. Recently removed data items can actually be recovered from .pst and .ost files.
To reduce the size of .pst files, the user needs to compact them.
Data access
Password protection can be used to protect the content of the .pst files. However, Microsoft admits that the password adds very little protection, due to the existence of commonly available tools which can remove or simply bypass the password protection. The password to access the table is stored in the .pst file as a CRC-32 value computed from the password itself. Outlook checks that this value matches the user-specified password and refuses to operate if there is no match. The data is readable by the libpst project code.
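To see why this offers so little protection, note that a 32-bit check value is not a cryptographic hash. A rough sketch using zlib's generic CRC-32 as a stand-in (the .pst format's own CRC variant differs in detail):

```python
import zlib

def password_check_value(pw: bytes) -> int:
    """A CRC-32 of the password, standing in for the stored .pst check
    value. CRC-32 is an error-detecting code, not a cryptographic hash."""
    return zlib.crc32(pw) & 0xFFFFFFFF

# Only 2**32 possible stored values exist, so any input whose CRC matches
# the stored value passes the check - the original password is not needed.
stored = password_check_value(b"hunter2")
```

Because the search space is only 32 bits, brute-forcing an input with a matching CRC is cheap, which is one reason the removal tools mentioned above work.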
Microsoft (MS) offers three values for the encryption setting: none, compressible, and high.
None: the .pst data is stored as plain text.
Compressible: the .pst data is en |
https://en.wikipedia.org/wiki/IBM%203850 | The IBM 3850 Mass Storage System (MSS) was an online tape library used to hold large amounts of infrequently accessed data. It was one of the earliest examples of nearline storage.
History
Starting in the late-1960s IBM's lab in Boulder, Colorado began development of a low-cost mass storage system based on magnetic tape cartridges. The tapes would be accessed automatically by a robot (known as an accessor) and fed into a reader/writer unit that could work on several tapes at the same time. Originally the system was going to be used as a directly attached memory device, but as the speed of computers grew in relation to the storage, the product was re-purposed as an automated system that would offload little-used data from hard disk systems. Known internally as Comanche while under development, IBM management found a number of niche uses for the concept, and announced it officially as the IBM 3850 on October 9, 1974. After more than a decade (comparable to the IBM 2321 Data Cell, 1964–1975), it was withdrawn August 5, 1986.
Description
The Mass Storage System consisted of a library of cylindrical plastic cartridges, two inches wide, each holding a spool of tape storing 50 MB; each virtual disk required a pair of cartridges. These cartridges were held in a hexagonal array of bins in the IBM 3851 Mass Storage Facility. New cartridges were rolled into the facility and were automatically stored in a vacant bin. The data were accessed via virtual IBM 3330 disk drives, and physically cached on a combination of 3330 and 3350 staging drives, the data being transferred automatically between cartridge and disk drive in processes called staging and destaging. These were all connected together with the IBM 3830 Storage Control (also used for disk storage alone), the entire system making up a 3850 unit.
Cartridges were moved into and out of read stations by two motorized accessor arms, electrically connected via flat cable on a drum. Stage time for data from cartridge to disk was typically 15 seconds, including about two seconds to move the cartridge into a read station, and eight to ten seconds to read the 200-foot tape.
The recording method was unusual for its time. The drive pulled the tape from the cartridge and wound it once around a cylindrical mandrel in a helix, then stopped the tape. The drive's head, mounted on a rotating drum, then rotated once to read or record a diagonal track. The drive then wound the tape a small step forward and the head rotated to do the next track. Depending on technical definitions, this might even be considered a first example of digital helical scan recording, long before Exabyte's helical drive (which was based on analog video helical recording systems developed earlier).
When free disk space was required a group of cylinders were selected to be destaged to tape, these were transferred with minimal or no change of format. Each tape could store 202 cylinder images of 19 tracks each, half of the |
https://en.wikipedia.org/wiki/Phasor%20%28disambiguation%29 | Phasor is a phase vector representing a sine wave.
Phasor may also be:
Phasor (sound synthesizer), a stereo music, sound and speech synthesizer for the Apple II computer
Phasor measurement unit, a device that measures phasors on an electricity grid
Phasor (radio broadcasting), a network of inductors and capacitors used to control the relative amplitude and phase of the radio frequency currents driving a directional antenna array
See also
Phase (disambiguation)
Phaser (disambiguation)
FASOR (disambiguation) |
https://en.wikipedia.org/wiki/Fab%20lab | A fab lab (fabrication laboratory) is a small-scale workshop offering (personal) digital fabrication.
A fab lab is typically equipped with an array of flexible computer-controlled tools that cover several different length scales and various materials, with the aim to make "almost anything". This includes technology-enabled products generally perceived as limited to mass production.
While fab labs have yet to compete with mass production and its associated economies of scale in fabricating widely distributed products, they have already shown the potential to empower individuals to create smart devices for themselves. These devices can be tailored to local or personal needs in ways that are not practical or economical using mass production.
The fab lab movement is closely aligned with the DIY movement, open-source hardware, maker culture, and the free and open-source movement, and shares philosophy as well as technology with them.
History
The fab lab program was initiated to broadly explore how the content of information relates to its physical representation and how an under-served community can be powered by technology at the grassroots level. The program began as a collaboration between the Grassroots Invention Group and the Center for Bits and Atoms at the Media Lab in the Massachusetts Institute of Technology with a grant from the National Science Foundation (Washington, D.C.) in 2001.
Vigyan Ashram in India was the first fab lab to be set up outside MIT. It was established in 2002 and received capital equipment from NSF-USA and IITK.
While the Grassroots Invention Group is no longer in the Media Lab, the Center for Bits and Atoms consortium is still actively involved in continuing research in areas related to description and fabrication, but does not operate or maintain any of the labs worldwide (with the exception of the mobile fab lab).
The fab lab concept also grew out of a popular class at MIT (MAS.863) named "How To Make (Almost) Anything". The class is still offered in the fall semesters.
Popular equipment and projects
Flexible manufacturing equipment within a fab lab can include:
Mainly, a rapid prototyper: typically a 3D printer of plastic or plaster parts
CNC machines: computer-controlled subtractive milling or turning machines with 3 or more axes
Printed circuit board milling or etching: two-dimensional, high precision milling to create circuit traces in pre-clad copper boards
Microprocessor and digital electronics design, assembly, and test stations
Cutters, for sheet material: laser cutter, plasma cutter, water jet cutter, knife cutter.
FabFi
One of the larger projects undertaken by fab labs include free community FabFi wireless networks (in Afghanistan, Kenya and US). The first city-scale FabFi network, set up in Afghanistan, has remained in place and active for three years under community supervision and with no special maintenance. The network in Kenya, (Based in the University of Nairobi (UoN)) building on that experience, |
https://en.wikipedia.org/wiki/Agent%20handling | In intelligence organizations, agent handling is the management of so-called agents (called secret agents or spies in common parlance), principal agents, and agent networks (called "assets") by intelligence officers typically known as case officers.
Human intelligence
A primary purpose of intelligence organizations is to penetrate a target with a human agent, or a network of human agents. Such agents can either infiltrate the target, or be recruited "in place". Case officers are professionally trained employees of intelligence organizations that manage human agents and human agent networks. Intelligence that derives from such human sources is known by the abbreviation HUMINT.
Sometimes, agent handling is done indirectly, through "principal agents" that serve as proxies for case officers. It is not uncommon, for example, for a case officer to manage a number of principal agents, who in turn handle agent networks, which are preferably organized in a cellular fashion. In such a case, the principal agent can serve as a "cut-out" for the case officer, buffering him or her from direct contact with the agent network.
Using a principal agent as a cut-out, and ensuring that the human agent network is organized in a cellular fashion, can provide some protection for other agents in the network, as well as for the principal agent, and for the case officer if an agent in the network is compromised. Assuming that standard principles of intelligence tradecraft have been strictly observed by the principal agent and the agents in the network, compromised agents will not be able to identify the case officer, nor the other members of the network. Ideally, agents may work side by side in the same office and conduct their clandestine collection activities with such discipline that they will not realize they are both engaged in espionage, much less members of the same network.
Since an agent can sometimes identify his or her principal agent, however, or reveal information under interrogation that can lead to the identification of a principal agent, the protection provided by cellular network organization can be time-sensitive.
If principles of intelligence tradecraft have not been strictly observed, it is also possible that compromised agents can reveal information that exposes other members of the network. In the real world of espionage, human lapses are very much the norm, and violations of the principles of tradecraft are common. It is for this reason that agents are ideally trained to resist interrogation for a defined period of time.
If an agent is able to resist interrogation for a defined period of time, the odds improve that other members of the network can be alerted to the compromise.
Case officer
A case officer is an intelligence officer who is a trained specialist in the management of agents and agent networks. Case officers manage human agents and human intelligence networks. Case officers spot potential agents, recruit prospective agents a |
https://en.wikipedia.org/wiki/Password%20policy | A password policy is a set of rules designed to enhance computer security by encouraging users to employ strong passwords and use them properly. A password policy is often part of an organization's official regulations and may be taught as part of security awareness training. Either the password policy is merely advisory, or the computer systems force users to comply with it. Some governments have national authentication frameworks that define requirements for user authentication to government services, including requirements for passwords.
NIST guidelines
The United States Department of Commerce's National Institute of Standards and Technology (NIST) has put out two standards for password policies which have been widely followed.
2004
From 2004, NIST Special Publication 800-63, Appendix A, advised people to use irregular capitalization, special characters, and at least one numeral. This was the advice that most systems followed, and it was "baked into" a number of standards that businesses needed to follow.
2017
However, a major update in 2017 changed this advice: in particular, forcing complexity rules and regular password changes is now seen as bad practice.
The key points of these are:
Verifiers should not impose composition rules, e.g., requiring mixtures of different character types or prohibiting consecutively repeated characters
Verifiers should not require passwords to be changed arbitrarily or regularly, e.g., the previous 90-day rule
Passwords must be at least 8 characters in length
Password systems should permit subscriber-chosen passwords at least 64 characters in length.
All printing ASCII characters, the space character, and Unicode characters should be acceptable in passwords
When establishing or changing passwords, the verifier shall advise the subscriber that they need to select a different password if they have chosen a weak or compromised password
Verifiers should offer guidance such as a password-strength meter, to assist the user in choosing a strong password
Verifiers shall store passwords in a form that is resistant to offline attacks. Passwords shall be salted and hashed using a suitable one-way key derivation function. Key derivation functions take a password, a salt, and a cost factor as inputs then generate a password hash. Their purpose is to make each password guessing trial by an attacker who has obtained a password hash file expensive and therefore the cost of a guessing attack high or prohibitive.
NIST included a rationale for the new guidelines in its Appendix A.
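A minimal sketch of a verifier following the 2017 points above (the blocklist and parameter values are illustrative; real deployments check large breach corpora and tune the KDF cost factor):

```python
import hashlib
import os

# Illustrative blocklist; real verifiers consult large breach corpora.
COMPROMISED = {"password", "12345678", "qwerty123"}

def check_password(pw: str):
    """Accept or reject per the 2017 guidance: length-based rules only,
    no composition rules, reject known-compromised choices."""
    if len(pw) < 8:
        return False, "too short (minimum 8 characters)"
    if pw.lower() in COMPROMISED:
        return False, "appears in a compromised-password list; choose another"
    # No character-class requirements; long passphrases are permitted.
    return True, "ok"

def hash_password(pw: str, iterations: int = 600_000):
    """Salted, cost-parameterized storage via PBKDF2, one suitable
    one-way key derivation function."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", pw.encode(), salt, iterations)
    return salt, digest
```

The high iteration count is the "cost factor" mentioned above: it makes each guess by an attacker holding the hash file proportionally more expensive.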
Aspects
Typical components of a password policy include:
Password length and formation
Many policies require a minimum password length. Eight characters is typical but may not be appropriate. Longer passwords are generally more secure, but some systems impose a maximum length for compatibility with legacy systems.
Some policies suggest or impose requirements on what type of password a user can choose, such as:
the use of both upper-case and lower- |
https://en.wikipedia.org/wiki/Roke%20Manor%20Research | Roke Manor Research Limited is a British company based at Roke Manor near Romsey, Hampshire, which conducts research and development in the fields of communications, networks, electronic sensors, artificial intelligence, machine learning, data science, military decision support consultancy and operational analysis, information assurance, and human science. In addition to supporting its parent Chemring, Roke undertakes contract research and development, and product development work for both public and private sector customers. Products developed from research at Roke Manor include the Hawk-Eye ball tracker, which is now used widely in sports such as tennis, football, and cricket.
Roke has been part of the Chemring Group since 2010, having been founded as part of the Plessey company to operate as a dedicated research and development centre, with mass production elsewhere, and later owned for almost 20 years by Siemens where it had a similar research role.
History
1956 – Founded as Plessey Research Roke Manor Limited by the Plessey company. The first managing director was Harold J. Finden, an electrical engineer at Plessey.
1990 – Passed to GEC-Siemens AG in a joint takeover.
1991 – Became wholly owned by Siemens AG when GEC sold their 50% shareholding to Siemens Plessey Electronic Systems.
2010 – Acquired by the Chemring Group PLC.
2021 – Roke made its first acquisition since the founding in 1956, acquiring Cubica Technology Ltd. and their holding company Cubica Group.
Sites
The company's head office is at Roke Manor, Hampshire. It also has facilities in the Barnwood area of Gloucester and at MediaCityUK, Salford, Greater Manchester.
The Roke Manor site is based around a manor house which dates in part from 1653. The grounds had a walled garden, stable block and cottages when bought by Plessey in 1956, and initially these were re-used for laboratory space and meeting rooms, but in various modernization programmes, many of the original buildings have been demolished and much of the grounds covered by purpose-built facilities and car parking. The main house, entrance lodge and walled garden remain.
Technology timeline
1960 – Working prototype memory systems developed for the supercomputer, Atlas.
1975 – Designed and developed the world's first monolithic gallium arsenide microwave circuit.
1995 – Work began on the Hostile Artillery LOcation system (HALO), an acoustic locator of guns and mortars. HALO was developed to monitor ceasefire violations in the Yugoslav wars, and is in use with the British Army and other nations.
2000 – Won the 2000 Worldaware Innovation Award for work on land mine clearance.
2001 – Developed the Hawk-Eye vision-based ball tracking system.
Selected products
RESOLVE – an electronic warfare manpack system for the intercept, geolocation and exploitation of tactical communications signals within the HF to SHF bands. In 2011 RESOLVE won a Queen's Award for Enterprise and Innovation.
Vigilance – a wide area multilaterati |
https://en.wikipedia.org/wiki/ZDS | ZDS may refer to:
Zenith Data Systems, a computer manufacturer in the 1980s
Za dom spremni, a Croatian nationalist salute
9,9'-Dicis-zeta-carotene desaturase, an enzyme
Zheng Design Services, a professional architectural practice by architect Leslie Zheng
Zaaza Design Studio, an advertising agency specializing in graphic design, motion graphics, web design and 3D modeling |
https://en.wikipedia.org/wiki/Tellabs | Tellabs, Inc. is a global network technology company that provides networking and communications solutions to both private and governmental agencies. The company offers a range of products and services, including optical transport systems, access systems, managed access solutions, and network management software. The company was founded by Michael Birck in 1974 and is headquartered in Carrollton, Texas. It is currently owned by Marlin Equity Partners, who established an independent business for its product portfolio to accelerate the development of optical local area network (OLAN) technology. This technology was designed for enterprise and government agency clients. OLAN uses fiber, which is faster, more secure, and more stable than traditional copper infrastructure.
History
Early years
Tellabs traces its roots to a meeting in 1974 over a kitchen table in suburban Chicago. According to company founder Michael Birck, a group of six men with backgrounds in electrical engineering and sales drank coffee and brainstormed ideas for a new telecom company. Their aim was to build a company that offered products and services tailored to customer needs. After raising $110,000 in capital, they incorporated as Tellabs in the spring of 1975; the name combined the idea of telephones and laboratories. The start-up only had a one-man research department, a second-hand soldering iron picked up for $25, and an outdated oscilloscope. In a matter of months, Tellabs began making echo suppressors, which suppress annoying echoes on phone calls. During this time, the founding partners drew no salaries.
The company went public in July 1980, ending the year with sales of $43.7 million. In September 1981, Tellabs introduced the industry's first echo canceller, an advance over the original echo suppressors that synthesized an echo and electronically subtracted it. By 1990, Tellabs had grown to 2,000 employees at 25 locations globally and sales of $211 million.
Tellabs made several acquisitions and expanded globally in the 1980s and into the 1990s, including Coherent Communications Systems Corp. and Martis Oy in Finland. In 1991, the company took a new direction, releasing its SONET-based TITAN 5500 digital cross-connect system. These systems switched traffic from one circuit to another, connecting traffic inside and between networks.
Richard Notebaert, who had led Ameritech, the Midwestern AT&T spin-off, until it was acquired by SBC in 1999, took over Tellabs as CEO in September 2000. Pundits labeled Notebaert the "$6 billion man." However, the telecom industry collapsed; as the Chicago Sun-Times reported: "Telecom went from boom to bust as venture capital dried up and customers cancelled orders for the sort of equipment made by Tellabs and its competitors, including Nortel Networks and Lucent Technologies." In 2003, following industry trends and after 28 years as a manufacturer, Tellabs sold its last plant in Ill |
https://en.wikipedia.org/wiki/Cuda | Cuda or CUDA may refer to:
CUDA, a parallel programming framework by Nvidia
Barracuda Networks, an American computer security and data storage company
Milan Čuda (born 1939), Czech volleyball player
Plymouth Barracuda, an automobile
Cuda, a Celtic/Brythonic goddess residing in what is now the Cotswolds
See also
Barracuda (disambiguation) |
https://en.wikipedia.org/wiki/International%20Intellectual%20Property%20Alliance | The International Intellectual Property Alliance (IIPA) is a coalition of seven trade associations representing American companies that produce copyright-protected material, including computer software, films, television programs, music, books, and journals (electronic and print media). Formed in 1984, it seeks to strengthen international copyright protection and enforcement by working with the U.S. government, foreign governments, and private-sector representatives.
Activities
IIPA works closely with the U.S. Trade Representative in compiling the annual Special 301 reviews of foreign countries that the Office of the U.S. Trade Representative considers to have inadequate protection of intellectual property rights. IIPA was the principal representative of the entertainment industry in assisting the U.S. government in the World Trade Organization (WTO) TRIPS negotiations, the North American Free Trade Agreement (NAFTA) negotiations, and at the Diplomatic Conference leading to the completion in 1996 of the World Intellectual Property Organization (WIPO) "Internet" treaties—i.e., the Copyright Treaty and the Performances and Phonograms Treaty.
IIPA has also worked with the U.S. government in drafting Intellectual Property Rights Chapters of free trade agreements. It participates in policy development in copyright and enforcement issues in bilateral and regional initiatives such as the Asia Pacific Economic Cooperation (APEC). IIPA participates in trade actions brought under trade laws like the Generalized System of Preferences and other trade preference programs.
IIPA is a non-governmental organization at WIPO.
Member associations
The seven trade associations that make up the IIPA membership include:
Association of American Publishers (AAP)
Business Software Alliance (BSA)
Entertainment Software Association (ESA)
Independent Film & Television Alliance (IFTA)
Motion Picture Association (MPA)
National Music Publishers' Association (NMPA)
Recording Industry Association of America (RIAA)
See also
Intellectual property organization
Pharmaceutical Research and Manufacturers of America (PhRMA)
Special 301 Report
Notorious markets
References
External links
Intellectual property organizations |
https://en.wikipedia.org/wiki/Zeno%20machine | In mathematics and computer science, Zeno machines (abbreviated ZM, and also called accelerated Turing machine, ATM) are a hypothetical computational model related to Turing machines that are capable of carrying out computations involving a countably infinite number of algorithmic steps. These machines are ruled out in most models of computation.
The idea of Zeno machines was first discussed by Hermann Weyl in 1927; the name refers to Zeno's paradoxes, attributed to the ancient Greek philosopher Zeno of Elea. Zeno machines play a crucial role in some theories. The theory of the Omega Point devised by physicist Frank J. Tipler, for instance, can only be valid if Zeno machines are possible.
Definition
A Zeno machine is a Turing machine that can take an infinite number of steps, and then continue to take more steps. This can be thought of as a supertask where 2^-n units of time are taken to perform the n-th step; thus, the first step takes 0.5 units of time, the second takes 0.25, the third 0.125 and so on, so that after one unit of time, a countably infinite number of steps will have been performed.
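The timing schedule above is just a geometric series; a short sketch of the partial sums of the step durations shows the total time approaching one unit:

```python
def elapsed(steps: int) -> float:
    """Total time after the first `steps` steps, where the n-th step
    takes 2**-n units of time (0.5 + 0.25 + 0.125 + ...)."""
    return sum(2.0 ** -n for n in range(1, steps + 1))
```

After `steps` steps exactly `1 - 2**-steps` units have elapsed, so every finite step count fits strictly inside the first unit of time.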
Infinite time Turing machines
A more formal model of the Zeno machine is the infinite time Turing machine (ITTM). Defined first in unpublished work by Jeffrey Kidder and expanded upon by Joel Hamkins and Andy Lewis in Infinite Time Turing Machines, the infinite time Turing machine is an extension of the classical Turing machine model to include transfinite time; that is, time beyond all finite time. A classical Turing machine has a status at step 0 (in the start state, with an empty tape, read head at cell 0) and a procedure for getting from one status to the successive status. In this way the status of a Turing machine is defined for all steps corresponding to a natural number. An ITTM maintains these properties, but also defines the status of the machine at limit ordinals, that is, ordinals that are neither 0 nor the successor of any ordinal. The status of a Turing machine consists of 3 parts:
The state
The location of the read-write head
The contents of the tape
Just as a classical Turing machine has a labeled start state, which is the state at the start of a program, an infinite time Turing machine has a labeled limit state, which is the state for the machine at any limit ordinal. This is the case even if the machine has no other way to access this state, for example if no node transitions to it. The location of the read-write head is set to zero at any limit step. Lastly, the state of the tape is determined by the limit supremum of previous tape states. For some machine M, a cell n, and a limit ordinal λ:
T_λ(n) = lim sup_{α → λ} T_α(n)
That is, the n-th cell at time λ is the limit supremum of the values of that same cell as the machine approaches λ. This can be thought of as the limit if it converges, or 1 otherwise.
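As a toy illustration of this limit rule (not part of the formal definition, which quantifies over a genuinely infinite history of stages):

```python
def limit_cell(cofinal_tail):
    """Limit-stage value of a single 0/1 tape cell: the lim sup of its
    earlier values. For an infinite history this is 1 exactly when the
    cell holds 1 arbitrarily late; `cofinal_tail` stands in for the
    cell's values from some stage onward, so the lim sup reduces to the
    maximum of the tail."""
    return max(cofinal_tail)

# A cell that keeps flickering 0,1,0,1,... does not converge, so it
# shows 1 at the limit; a cell that has settled shows its settled value.
```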
Computability
Zeno machines have been proposed as a model of computation more powerful than classical Turing machines, based on their ability to solve the halting problem for classical Turing machines. Cristian |
https://en.wikipedia.org/wiki/Mangled%20packet | In computer networking, a mangled or invalid packet is a packet — especially IP packet — that either lacks order or self-coherence, or contains code aimed to confuse or disrupt computers, firewalls, routers, or any service present on the network.
Their usage is associated with a type of network attack called a denial-of-service (DoS) attack. They aim to destabilize the network and sometimes to reveal its available services – when network operators must restart the disabled ones. Mangled packets can be generated by dedicated software such as nmap.
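As one concrete example of mangling (a hand-rolled sketch, not how nmap generates such packets; the addresses and field values are arbitrary), consider an IPv4 header whose checksum field contradicts its contents:

```python
import socket
import struct

def ip_checksum(header: bytes) -> int:
    """RFC 791 header checksum: one's-complement sum of 16-bit words."""
    total = 0
    for i in range(0, len(header), 2):
        total += (header[i] << 8) | header[i + 1]
    while total > 0xFFFF:                       # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def ipv4_header(src: str, dst: str, checksum: int) -> bytes:
    """Minimal 20-byte IPv4 header: version/IHL, TOS, total length, ID,
    flags/fragment, TTL, protocol (6 = TCP), checksum, then addresses."""
    return struct.pack("!BBHHHBBH4s4s", 0x45, 0, 20, 0, 0, 64, 6,
                       checksum, socket.inet_aton(src), socket.inet_aton(dst))

# A self-consistent header verifies to 0 ...
hdr = ipv4_header("10.0.0.1", "10.0.0.2", 0)
good = ipv4_header("10.0.0.1", "10.0.0.2", ip_checksum(hdr))
# ... while a corrupted checksum field is one simple way a packet can
# lack self-coherence; a conforming receiver should discard it.
mangled = ipv4_header("10.0.0.1", "10.0.0.2", 0xDEAD)
```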
Most invalid packets are easily filtered by modern stateful firewalls.
References
Packets (information technology) |
https://en.wikipedia.org/wiki/Iser | Iser or ISER may refer to:
iSCSI Extensions for RDMA, a computer network storage protocol
Jizera (river), a river in the Czech Republic known in German as Iser
Institute for Social and Economic Research, an institute at the University of Essex
People with the surname
Iosif Iser (1881–1958), Romanian painter and graphic artist
Wolfgang Iser (1926–2007), German literary scholar
See also
Isar, a river in Germany
Isère, a department in the Auvergne-Rhône-Alpes region in eastern France
Isère (river), southeastern France
Isser (disambiguation)
Yser, a river in Belgium |
https://en.wikipedia.org/wiki/Thermalright | Thermalright Inc. is a Taiwan-based company headquartered in Taipei. It was established in 2001.
The company produces aftermarket heat sinks and other components for cooling desktop computers. Thermalright advertises its products as suitable for cooling processors from AMD and Intel.
Types of products
CPU heat sinks
VGA heat sinks
Chipset heat sinks
MOSFET heat sinks
Memory heat sinks
Cooling fans
Other small accessories
Similar companies
Arctic
Cooler Master
Deepcool
Notes
External links
2001 establishments in Taiwan
Computer hardware companies
Computer hardware cooling
Electronics companies established in 2001
Electronics companies of Taiwan
Taiwanese brands |
https://en.wikipedia.org/wiki/Remedy%20Flashboards | Remedy Flashboards is an extension of the Action Request System. Flashboards allows users to define and view graphical representations of data held in the AR System. The name is derived from the resemblance between the standard Flashboard display and the classic speedometer found on the dashboard of an automobile.
Related entries
Remedy Corp
BMC Software
External links
Flashboards page on www.bmc.com
Integrated development environments |
https://en.wikipedia.org/wiki/D1X | D1X may refer to:
Nikon D1X, a digital single-lens reflex camera
a source port of the computer game Descent |
https://en.wikipedia.org/wiki/Hard%20coding | Hard coding (also hard-coding or hardcoding) is the software development practice of embedding data directly into the source code of a program or other executable object, as opposed to obtaining the data from external sources or generating it at runtime.
Hard-coded data typically can only be modified by editing the source code and recompiling the executable, although it can be changed in memory or on disk using a debugger or hex editor.
Data that is hard-coded is best suited for unchanging pieces of information, such as physical constants, version numbers, and static text elements.
Softcoded data, on the other hand, encodes arbitrary information through user input, text files, INI files, HTTP server responses, configuration files, preprocessor macros, external constants, databases, command-line arguments, and is determined at runtime.
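A minimal sketch of the contrast (the endpoint URL and environment-variable name are hypothetical):

```python
import os

# Hard-coded: changing this endpoint means editing the source and
# redeploying or recompiling the program.
DEFAULT_API_URL = "https://api.example.com/v1"   # hypothetical endpoint

def api_url() -> str:
    """Soft-coded override: the same detail supplied at runtime via an
    environment variable, falling back to the hard-coded default."""
    return os.environ.get("APP_API_URL", DEFAULT_API_URL)
```

Here the hard-coded constant still exists, but only as a default; an operator can change the behavior without touching the source.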
Overview
Hard coding requires the program's source code to be changed any time the input data or desired format changes, when it might be more convenient to the end user to change the detail by some means outside the program.
Hard coding is often required, but can also be considered an anti-pattern. Programmers may not have a dynamic user interface solution for the end user worked out but must still deliver the feature or release the program. This is usually temporary but does resolve, in the short term, the pressure to deliver the code. Later, softcoding is done to allow a user to pass parameters that give the end user a way to modify the results or outcome.
The term "hard-coded" was initially used as an analogy to hardwiring circuits - and was meant to convey the inflexibility which results from its usage within software design and implementation.
In the context of run-time extensible collaborative development environments such as MUDs, hardcoding also refers to developing the core engine of the system responsible for low-level tasks and executing scripts, as opposed to softcoding which is developing the high-level scripts that get interpreted by the system at runtime, with values from external sources, such as text files, INI files, preprocessor macros, external constants, databases, command-line arguments, HTTP server responses, configuration files, and user input. In this case, the term is not pejorative and refers to general development, rather than specifically embedding output data.
Hardcoding and backdoors
Hardcoding credentials is a popular way of creating a backdoor. Hardcoded credentials are usually not visible in configuration files or the output of account-enumeration commands and cannot be easily changed or bypassed by users. If discovered, a user might be able to disable such a backdoor by modifying and rebuilding the program from its source code (if source is publicly available), decompiling, or reverse-engineering software, directly editing the program's binary code, or instituting an integrity check (such as digital signatures, anti-tamper, and anti-cheat) to prevent the unexpe |
https://en.wikipedia.org/wiki/TVI | TVI may refer to:
RTL-TVI, a French-language television station in Belgium
TVi (channel), a Ukrainian TV channel
TVi, former name of TV Okey, a Malaysian TV network
Tamil Vision International, a Tamil language television channel in Toronto, Canada
TeleVideo, a manufacturer of computer terminals
Televisão Independente, a Portuguese television channel
TVI Community College (now Central New Mexico Community College) in Albuquerque, New Mexico
Television interference
Television Iwate, a television company in Iwate Prefecture, Japan
Tactical vehicle intervention, a pursuit tactic by which a pursuing car can force a fleeing car to turn sideways abruptly, causing the driver to lose control and stop |
https://en.wikipedia.org/wiki/Wesley%20A.%20Clark | Wesley Allison Clark (April 10, 1927 – February 22, 2016) was an American physicist credited with designing the first modern personal computer. He was also a computer designer and the main participant, along with Charles Molnar, in the creation of the LINC computer, which was the first minicomputer and shares with a number of other computers (such as the PDP-1) the claim to be the inspiration for the personal computer.
Clark was born in New Haven, Connecticut, and grew up in Kinderhook, New York, and in northern California. His parents, Wesley Sr. and Eleanor Kittell, moved to California, and he attended the University of California, Berkeley, where he graduated with a degree in physics in 1947. Clark began his career as a physicist at the Hanford Site.
In 1981, Clark received the Eckert–Mauchly Award for his work on computer architecture. He was awarded an honorary degree by Washington University in St. Louis in 1984. He was elected to the National Academy of Engineering in 1999. Clark is a charter recipient of the IEEE Computer Society Computer Pioneer Award for "First Personal Computer".
At Lincoln Laboratory
Clark moved to the MIT Lincoln Laboratory in 1952 where he joined the Project Whirlwind staff. There he was involved in the development of the Memory Test Computer (MTC), a testbed for ferrite core memory that was to be used in Whirlwind. His sessions with the MTC, "lasting hours rather than minutes" helped form his views that computers were to be used as tools on demand for those who needed them. That view carried over into his designs for the TX-0 and TX-2 and the LINC. He expresses this view clearly here:
...both of the Cambridge machines, Whirlwind and MTC, had been completely committed to the air defense effort and were no longer available for general use. The only surviving computing system paradigm seen by M.I.T. students and faculty was that of a very large International Business Machine in a tightly sealed Computation Center: the computer not as tool, but as demigod. Although we were not happy about giving up the TX-0, it was clear that making this small part of Lincoln's advanced technology available to a larger M.I.T. community would be an important corrective step.
Clark is
one of the fathers of the personal computer... he was the architect of both the TX-0 and TX-2 at Lincoln Labs. He believed that "a computer should be just another piece of lab equipment." At a time when most computers were huge remote machines operated in batch mode, he advocated far more interactive access. He practiced what he preached, even though it often meant bucking current "wisdom" and authority (in a 1981 lecture, he mentioned that he had the distinction of being "the only person to have been fired three times from MIT for insubordination").
Clark's design for the TX-2 "integrated a number of man-machine interfaces that were just waiting for the right person to show up to use them in order to make a computer that was 'on-line'. When se |
https://en.wikipedia.org/wiki/IRC%20script | IRC scripts are a way of shortening commands and responding automatically to certain events while connected to an IRC network. There are many different scripting languages for different types of IRC clients: ircII, BitchX, HexChat, mIRC, Visual IRC, Bersirc, and others have their own scripting languages, many of which share common features and syntax and therefore are easily portable from one IRC client to another.
Basis
Aliases
Most IRC scripts contain one or more aliases. Aliases are used to bind some command to a set of commands, or give it parameters, to save time when typing such commands over and over. For example, a simple alias might allow the user to type "/j channel" instead of "/join #channel", saving exactly 5 keystrokes (counting Shift). Aliases can add new commands, replace commands built into the IRC client, or provide abbreviations for long commands or sequences of commands. Aliases can usually be used as functions to produce a value that is used elsewhere in the script. In some cases, an alias can be associated with a keyboard shortcut.
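The alias mechanism can be sketched language-neutrally. The following Python function is an illustration of the expansion step, not any real IRC client's API; the alias table and the "/j" entry are assumptions matching the example above.

```python
# Invented alias table: maps a short command to an expansion function.
ALIASES = {
    "/j": lambda args: "/join #" + args[0],  # "/j channel" -> "/join #channel"
}

def expand(command_line: str) -> str:
    """Expand a typed command through the alias table, if one matches."""
    name, *args = command_line.split()
    if name in ALIASES:
        return ALIASES[name](args)
    # Commands without an alias pass through to the client unchanged.
    return command_line

print(expand("/j channel"))
print(expand("/join #other"))
```

A real client would also let aliases override built-in commands and return values to other parts of the script, as described above.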
Event-based Scripting
Events, also known as remotes, allow a script to respond automatically when a particular type of message is received from the IRC server, or when a certain action is performed by the user, such as pressing a key or closing a window. Advanced event scripting usually requires knowledge of the IRC protocol, though basic events can usually be written without it.
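The event mechanism can be sketched as a small dispatcher that binds handlers to message types. The registry, the decorator name `on`, and the event name `"TEXT"` (loosely echoing mIRC's `on TEXT` remotes) are illustrative assumptions, not a real client's interface.

```python
# Invented handler registry: event name -> list of bound handlers.
handlers: dict = {}

def on(event: str):
    """Register a handler for an event, in the style of a remote definition."""
    def register(fn):
        handlers.setdefault(event, []).append(fn)
        return fn
    return register

def fire(event: str, *args):
    """Call every handler bound to the event, collecting their replies."""
    return [fn(*args) for fn in handlers.get(event, [])]

@on("TEXT")
def auto_reply(nick, message):
    # Respond automatically when a particular message is received.
    if message == "!ping":
        return f"{nick}: pong"

print(fire("TEXT", "alice", "!ping"))
```

Events with no bound handler simply dispatch to an empty list, which is why basic remotes can be written without knowing the full IRC protocol.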
Popups in Scripts
Scripts for graphical IRC clients may contain pop-ups, which extend or replace the menus built into the client. Many scripts contain nothing but long lists of pop-ups that send humorous or cute canned messages to the channel; many of those messages take the form of a "slap", ridiculing a victim chosen by the user.
Security concerns
Since IRC scripts are used to interface with a public network, they are a favourable target for attack. Event handling code must be careful when dealing with input received from other IRC users; a poorly written IRC script may leave the user vulnerable, allowing attackers to possibly read the user's passwords or private conversations, execute arbitrary commands in the user's IRC client, or access files on the user's hard disk.
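The vulnerability described above usually comes down to treating remote text as code. The pair of handlers below is an invented illustration of the difference, not taken from any real script.

```python
def unsafe_handler(message: str):
    # BAD: evaluates whatever another IRC user sent, so a remote
    # attacker can run arbitrary expressions in the client.
    return eval(message)

def safe_handler(message: str) -> str:
    # Treats remote input strictly as data, never as code.
    return message.upper()
```

A well-written event handler follows the second pattern: remote input may be stored, displayed, or matched against, but never executed.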
IRC Scripts downloaded from public web sites or received from other IRC users may contain backdoors or similar malicious commands. Some users prefer to write their own IRC scripts to avoid the potential problems caused by a malicious or buggy script.
Similarities to Other Programming/Scripting Languages
IRC scripts share many concepts with other scripting or programming languages, such as variables, event-based execution, and modification of core components and functions. IRC scripts look similar to modular software configuration files, such as those used for some IRC daemons such as UnrealIRCd.
External links
mircscripts.org – Site for mIRC scripts, addons, themes and snippets
mircscripts.com – One of the oldest sources of mIRC script |