https://en.wikipedia.org/wiki/DreamHost
DreamHost is a Los Angeles-based web hosting provider and domain name registrar. It is owned by New Dream Network, LLC, founded in 1996 by Dallas Bethune, Josh Jones, Michael Rodriguez and Sage Weil, undergraduate students at Harvey Mudd College in Claremont, California, and registered in 1997 by Michael Rodriguez. DreamHost began hosting customers' sites in 1997. In May 2012, DreamHost spun off Inktank, a professional services and support company for the open source Ceph file system. In November 2014, DreamHost spun off Akanda, an open source network virtualization project. As of February 2016, DreamHost employs about 200 people and has close to 400,000 customers. Web hosting DreamHost's shared, VPS, and dedicated hosting network consists of Apache, nginx and lighttpd web servers running on the Ubuntu operating system. DreamHost also offers cloud storage and computing services for entrepreneurs and developers, launched in 2012. The control panel for users to manage all services is a custom application designed in-house, and includes integrated billing and a support ticket system. DreamHost's staff contribute to an official blog and a customer support wiki. DreamHost does not offer call-in phone support, but customers can pay extra to request callbacks from support staff. Furthermore, a live chat option is available for all accounts when the volume of support emails is low; this option is always available for customers who already pay the monthly fee for callbacks. The company hosts in excess of one million domains. File hosting In 2006, the company began a beta version of a file hosting service called "Files Forever". The company stated that existing customers could store files "forever" after paying a one-time storage fee, and redistribute or sell them with DreamHost handling the transactions. As of November 2012, this service was no longer offered to new customers. In April 2013, DreamHost mentioned that the Files Forever service had been discontinu
https://en.wikipedia.org/wiki/Temperature%20measurement
Temperature measurement (also known as thermometry) describes the process of measuring a current local temperature for immediate or later evaluation. Datasets consisting of repeated standardized measurements can be used to assess temperature trends. History Attempts at standardized temperature measurement prior to the 17th century were crude at best. For instance, in 170 AD the physician Claudius Galenus mixed equal portions of ice and boiling water to create a "neutral" temperature standard. The modern scientific field has its origins in the work of Florentine scientists in the 1600s, including Galileo, who constructed devices able to measure relative change in temperature, but subject also to confounding with atmospheric pressure changes. These early devices were called thermoscopes. The first sealed thermometer was constructed in 1654 by the Grand Duke of Tuscany, Ferdinand II. The development of today's thermometers and temperature scales began in the early 18th century, when Gabriel Fahrenheit produced a mercury thermometer and scale, building on earlier work by Ole Christensen Rømer. Fahrenheit's scale is still in use, alongside the Celsius and Kelvin scales. Technologies Many methods have been developed for measuring temperature. Most of these rely on measuring some physical property of a working material that varies with temperature. One of the most common devices for measuring temperature is the glass thermometer. This consists of a glass tube filled with mercury or some other liquid, which acts as the working fluid. Temperature increase causes the fluid to expand, so the temperature can be determined by measuring the volume of the fluid. Such thermometers are usually calibrated so that one can read the temperature simply by observing the level of the fluid in the thermometer. Another type of thermometer that is not much used in practice, but is important from a theoretical standpoint, is the gas thermometer. Other important devices for measuring temperature inc
https://en.wikipedia.org/wiki/University%20of%20Michigan%20Library
The University of Michigan Library is the academic library system of the University of Michigan. The university's 38 constituent and affiliated libraries together make it the second largest research library by number of volumes in the United States. As of 2019–20, the University Library contained more than 14,543,814 volumes, while all campus library systems combined held more than 16,025,996 volumes. As of the 2019–2020 fiscal year, the Library also held 221,979 serials and received over 4,239,355 annual visits. Founded in 1838, the University Library is the university's main library and is housed in 12 buildings with more than 20 libraries, among the most significant of which are the Shapiro Undergraduate Library, Hatcher Graduate Library, Special Collections Library, and Taubman Health Sciences Library. However, several U-M libraries are independent of the University Library: the Bentley Historical Library, the William L. Clements Library, the Gerald R. Ford Library, the Kresge Business Administration Library of the Ross School of Business, and the Law Library of the University of Michigan Law School. The University Library is also separate from the libraries of the University of Michigan–Dearborn (Mardigian Library) and the University of Michigan–Flint (Frances Willson Thompson Library). The University of Michigan was the original home of the JSTOR database, which contains about 750,000 digitized pages from the entire pre-1990 backfile of ten journals of history and economics. In December 2004, the University of Michigan announced a book digitization program in collaboration with Google (known as the Michigan Digitization Project), which is both revolutionary and controversial. Books scanned by Google are included in HathiTrust, a digital library created by a partnership of major research institutions. As of March 2014, the following collections had been digitized: Art, Architecture and Engineering Library; Bentley Historical Library; Buhr Building (large portions); Dent
https://en.wikipedia.org/wiki/European%20Parliament%20Committee%20on%20the%20Environment%2C%20Public%20Health%20and%20Food%20Safety
The Committee on the Environment, Public Health and Food Safety (ENVI) is a committee of the European Parliament. It has 81 full members and is currently chaired by Pascal Canfin. Evolution During the 1990s its relatively low importance led to its being referred to unfavourably by MEPs as the "Cinderella committee". However, since then the committee's powers have increased. The co-decision procedure for legislation, which grants greater powers to the Parliament, has been extended to more policy areas. Notably, the areas covered by this committee were the main recipients of these new powers. The rising importance of the issues it deals with (for example, global warming) has also meant that it has become one of the most important committees in Parliament. The committee's open sessions, as well as constituting a major forum within the Parliament, are usually well attended by both business lobbyists and representatives from environmental NGOs. Responsibilities The committee is responsible for:
1. environmental policy and environmental protection measures, in particular concerning:
   (a) climate change,
   (b) air, soil and water pollution, waste management and recycling, dangerous substances and preparations, noise levels and the protection of biodiversity,
   (c) sustainable development,
   (d) international and regional measures and agreements aimed at protecting the environment,
   (e) restoration of environmental damage,
   (f) civil protection,
   (g) the European Environment Agency,
   (h) the European Chemicals Agency;
2. public health, in particular:
   (a) programmes and specific actions in the field of public health,
   (b) pharmaceutical and cosmetic products,
   (c) health aspects of bioterrorism,
   (d) the European Medicines Agency and the European Centre for Disease Prevention and Control;
3. food safety issues, in particular:
   (a) the labelling and safety of foodstuffs,
   (b) veterinary legislation concerning protection against risks to hu
https://en.wikipedia.org/wiki/Fasciation
Fasciation (pronounced , from the Latin root meaning "band" or "stripe"), also known as cresting, is a relatively rare condition of abnormal growth in vascular plants in which the apical meristem (growing tip), which normally is concentrated around a single point and produces approximately cylindrical tissue, instead becomes elongated perpendicularly to the direction of growth, thus producing flattened, ribbon-like, crested (or "cristate"), or elaborately contorted tissue. Fasciation may also cause plant parts to increase in weight and volume in some instances. The phenomenon may occur in the stem, root, fruit, or flower head. Some plants are grown and prized aesthetically for their development of fasciation. Any occurrence of fasciation has several possible causes, including hormonal, genetic, bacterial, fungal, viral and environmental causes. Cause Fasciation can be caused by hormonal imbalances in the meristematic cells of plants, which are cells where growth can occur. Fasciation can also be caused by random genetic mutation. Bacterial and viral infections can also cause fasciation. The bacterial phytopathogen Rhodococcus fascians has been demonstrated as one cause of fasciation, such as in sweet pea (Lathyrus odoratus) plants, but many fasciated plants have tested negative for the bacteria in studies, hence bacterial infection is not an exclusive causation. Additional environmental factors that can cause fasciation include fungi, mite or insect attack and exposure to chemicals. General damage to a plant's growing tip and exposure to cold and frost can also cause fasciation. Some plants, such as peas and cockscomb Celosia, may inherit the trait. Genetic fasciation is not contagious, but infectious fasciation can be spread from infected plants to others from contact with wounds on infected plants, and from water that carries the bacteria to other plants. Occurrence Although fasciation is rare overall, it has been observed in over 100 vascular plant families
https://en.wikipedia.org/wiki/Certified%20Server%20Validation
Certified Server Validation (CSV) is a technical method of email authentication intended to fight spam. Its focus is the SMTP HELO-identity of mail transfer agents. Purpose CSV was designed to address the problems of MARID and the ASRG, as defined in detail as the intent of Lightweight MTA Authentication Protocol (LMAP) in an expired ASRG draft. As of January 3, 2007, all Internet Drafts have expired and the mailing list has been closed down since there had been no traffic for 6 months. Principles of operation CSV considers two questions at the start of each SMTP session:
- Does a domain's management authorize this MTA to be sending email?
- Do reputable independent accreditation services consider that domain's policies and practices sufficient for controlling email abuse?

CSV answers these questions as follows: to validate an SMTP session from an unknown sending SMTP client using CSV, the receiving SMTP server:
1. Obtains the remote IP address of the TCP connection.
2. Extracts the domain name from the HELO command sent by the SMTP client.
3. Queries DNS to confirm the domain name is authorized for use by the IP (CSA).
4. Asks a reputable Accreditation Service if it has a good reputation (DNA).
5. Determines the level of trust to give to the sending SMTP client, based on the results of (3) and (4):
   - If the level of trust is high enough, process all email from that session in the traditional manner, delivering or forwarding without the need for further validation.
   - If the level of trust is too low, return an error showing the reason for not trusting the sending SMTP client.
   - If the level of trust is in between, document the result in a header in each email delivered or forwarded, and/or perform additional checks.

If the answers to both of the questions at the top of this article are 'Yes', then receivers can expect the email received to be email they want. Mail sources are motivated to make the answers yes, and it's easy for them to do so (unless their email flow is so toxic that
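The trust decision in steps 3–5 can be sketched as a small Python function. The numeric reputation scale and the two thresholds below are invented for illustration; they are not part of CSV itself, which leaves the trust policy to the receiving server:

```python
def evaluate_smtp_client(csa_authorized, reputation,
                         accept_at=0.8, reject_at=0.3):
    """Combine the CSA check (step 3) and DNA reputation (step 4) into a
    per-session decision (step 5). Thresholds are hypothetical."""
    if not csa_authorized:
        return "reject"        # HELO domain does not authorize this IP
    if reputation >= accept_at:
        return "accept"        # deliver/forward without further validation
    if reputation < reject_at:
        return "reject"        # return an error to the sending client
    return "flag"              # annotate headers and/or run extra checks

print(evaluate_smtp_client(True, 0.9))   # accept
print(evaluate_smtp_client(True, 0.5))   # flag
```

The in-between "flag" outcome corresponds to the third bullet above: the session is not refused, but each delivered message is annotated for further checking.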
https://en.wikipedia.org/wiki/University%20of%20Kentucky%20Research%20and%20Education%20Center%20Botanical%20Garden
The University of Kentucky Research and Education Center Botanical Garden, also known as the UK REC Botanical Garden, is a research farm and botanical garden for the University of Kentucky in Princeton, Kentucky. The University's Agricultural Experiment Station was established in 1885, with the West Kentucky Substation at Princeton founded in 1925. Today the Experiment Station Farm consists of almost 1,300 acres (520 hectares) where crops such as corn, wheat, soybeans, tobacco, fruits, vegetables and ornamentals are studied. The Princeton site also includes a 10-acre (40,000 m²) orchard/vineyard, plus 2 acres (8,000 m²) of grapes, and 1.5 acres (6,000 m²) for research in small fruit trees and ornamentals. See also List of botanical gardens in the United States Botanical gardens in Kentucky Botanical research institutes Research institutes in Kentucky University of Kentucky Protected areas of Caldwell County, Kentucky Princeton, Kentucky Education in Caldwell County, Kentucky
https://en.wikipedia.org/wiki/Liminality
In anthropology, liminality () is the quality of ambiguity or disorientation that occurs in the middle stage of a rite of passage, when participants no longer hold their pre-ritual status but have not yet begun the transition to the status they will hold when the rite is complete. During a rite's liminal stage, participants "stand at the threshold" between their previous way of structuring their identity, time, or community, and a new way (which completing the rite establishes). The concept of liminality was first developed in the early twentieth century by folklorist Arnold van Gennep and later taken up by Victor Turner. More recently, usage of the term has broadened to describe political and cultural change as well as rites. During liminal periods of all kinds, social hierarchies may be reversed or temporarily dissolved, continuity of tradition may become uncertain, and future outcomes once taken for granted may be thrown into doubt. The dissolution of order during liminality creates a fluid, malleable situation that enables new institutions and customs to become established. The term has also passed into popular usage and has been expanded to include liminoid experiences that are more relevant to post-industrial society. Rites of passage Arnold van Gennep Van Gennep, who coined the term liminality, published in 1909 his Rites de Passage, a work that explores and develops the concept of liminality in the context of rites in small-scale societies. Van Gennep began his book by identifying the various categories of rites. He distinguished between those that result in a change of status for an individual or social group, and those that signify transitions in the passage of time. In doing so, he placed a particular emphasis on rites of passage, and claimed that "such rituals marking, helping, or celebrating individual or collective passages through the cycle of life or of nature exist in every culture, and share a specific three-fold sequential structure". This th
https://en.wikipedia.org/wiki/Gene%20mapping
Gene mapping or genome mapping describes the methods used to identify the location of a gene on a chromosome and the distances between genes. Gene mapping can also describe the distances between different sites within a gene. The essence of all genome mapping is to place a collection of molecular markers onto their respective positions on the genome. Molecular markers come in all forms. Genes can be viewed as one special type of genetic marker in the construction of genome maps, and mapped the same way as any other marker. In some areas of study, gene mapping contributes to the creation of new recombinants within an organism. Gene maps help describe the spatial arrangement of genes on a chromosome. Genes are designated to a specific location on a chromosome known as the locus and can be used as molecular markers to find the distance between other genes on a chromosome. Maps provide researchers with the opportunity to predict the inheritance patterns of specific traits, which can eventually lead to a better understanding of disease-linked traits. Gene maps provide an outline that can potentially help researchers carry out DNA sequencing: a map points out the relative positions of genes and allows researchers to locate regions of interest in the genome, so genes can then be identified and sequenced quickly. Two approaches to generating gene maps (gene mapping) are physical mapping and genetic mapping. Physical mapping utilizes molecular biology techniques to inspect chromosomes. These techniques allow researchers to observe chromosomes directly, so that a map may be constructed with relative gene positions. Genetic mapping, on the other hand, uses genetic techniques to indirectly find associations between genes. Techniques can include cross-breeding (hybrid) experiments and examining pedigrees. These techniques allow maps to be constructed so that relative positions of genes and other important sequences
https://en.wikipedia.org/wiki/Charles%20Glen%20King
Charles Glen King (October 22, 1896 – January 23, 1988) was an American biochemist who was a pioneer in the field of nutrition research and who isolated vitamin C at the same time as Albert Szent-Györgyi. A biography of King states that many feel he deserves equal credit with Szent-Györgyi for the discovery of this vitamin. Early life and education King was born in Entiat, Washington to Charles Clement King and Mary Jane Bookwalter. He entered Washington State University early, as his local one-room school did not have a twelfth grade. He was a member of Lambda Chi Alpha. World War I interrupted his college studies; he served in the 12th Infantry, a machine gun company, and did not receive his B.S. in chemistry until 1918. He immediately departed for the University of Pittsburgh, earning his M.S. in 1920 and Ph.D. in 1923. From the outset of his graduate studies, the nascent field of vitamins interested him. He remained in Pittsburgh as a professor until 1942, when he left to become the first scientific director of the Nutrition Foundation, Inc., which worked to promote scientific and public health research, both in the U.S. and internationally. Career King's contribution to the science of nutrition revolves around his isolation of vitamin C in 1931–1932 by studying the antiscorbutic activity of lemon juice preparations in guinea pigs. Albert Szent-Györgyi was conducting similar research at the University of Szeged in Hungary, focusing on hexuronic acid. The chemical identity of King's active substance was almost identical to Szent-Györgyi's hexuronic acid, but the research of S.S. Silva had concluded that hexuronic acid was not vitamin C. However, within two weeks of each other in the spring of 1932, King first, and then Szent-Györgyi, published articles declaring that vitamin C and hexuronic acid were indeed the same compound. Szent-Györgyi would later win a Nobel Prize for his part in the discovery, and controversy remains over whether both men deserve
https://en.wikipedia.org/wiki/Razer%20Inc.
Razer Inc. (stylized as RΛZΞR) is an American-Singaporean multinational technology company that designs, develops, and sells consumer electronics, financial services, and gaming hardware. The company was founded in 1998 by Min-Liang Tan and Robert "RazerGuy" Krakoff. It is dual headquartered in the one-north subzone of Queenstown, Singapore, and Irvine, California, US. History Razer began as a San Diego, California-based subsidiary of kärna LLC in 1998, created to develop and market a high-end computer gaming mouse, the Boomslang, targeted at computer gamers. Kärna ceased operations in 2000 due to financial difficulties. The current iteration of Razer was founded in 2005 by Min-Liang Tan, a Singaporean NUS graduate, and Robert Krakoff, after they procured the rights to the Razer brand following a large investment from Hong Kong tycoon Li Ka-shing and Singaporean holding company Temasek Holdings. Razer bought the software assets of the Android-based microconsole Ouya from its parent company Ouya Inc. on 27 July 2015, while the hardware was discontinued. Ouya's technical team joined Razer's team in developing their own microconsole, the Forge TV, which was discontinued in 2016. In October 2016, Razer purchased THX from Creative Technology, according to THX CEO Ty Ahmad-Taylor. In January 2017, Razer bought manufacturer Nextbit, the startup behind the Robin smartphone. Shortly afterward, in November of that year, Razer unveiled the Razer Phone, its first smartphone, with a design based on that of the Robin. In July 2017, Razer filed to go public through an IPO in Hong Kong. In October, it was confirmed that Razer planned to offer 1,063,600,000 shares at a range of $0.38–$0.51. On 14 November, Razer was officially listed on the Hong Kong stock exchange under the stock code 1337, a reference to the leet speak commonly used by gamers. Razer's IPO closed 18% up on the first day of trading and was the second most successful IPO of 2017 in Hong Kong. In April 2018, Razer announce
https://en.wikipedia.org/wiki/Jackscrew
A jackscrew, or screw jack, is a type of jack that is operated by turning a leadscrew. It is commonly used to lift moderately heavy weights, such as vehicles; to raise and lower the horizontal stabilizers of aircraft; and as adjustable supports for heavy loads, such as the foundations of houses. Description A screw jack consists of a heavy-duty vertical screw with a load table mounted on its top, which screws into a threaded hole in a stationary support frame with a wide base resting on the ground. A rotating collar on the head of the screw has holes into which the handle, a metal bar, fits. When the handle is turned clockwise, the screw moves further out of the base, lifting the load resting on the load table. In order to support large load forces, the screw is usually formed with Acme threads. Advantages An advantage of jackscrews over some other types of jack is that they are self-locking: when the rotational force on the screw is removed, the jack will remain motionless where it was left and will not rotate backwards, regardless of how much load it is supporting. This makes them inherently safer than hydraulic jacks, for example, which will move backwards under load if the force on the hydraulic actuator is accidentally released. Mechanical advantage The ideal mechanical advantage of a screw jack, the ratio of the force the jack exerts on the load to the input force on the lever, ignoring friction, is

MA = F_load / F_in = 2πr / l

where F_load is the force the jack exerts on the load, F_in is the rotational force exerted on the handle of the jack, r is the length of the jack handle, from the screw axis to where the force is applied, and l is the lead of the screw. The screw jack consists of two simple machines in series; the long operating handle serves as a lever whose output force turns the screw. So the mechanical advantage is increased by a longer handle as well as a finer screw thread. However, most screw jacks have large amounts of friction which increase the input force necessary, so th
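As a quick numerical sketch of the ideal mechanical-advantage formula above (the handle length and screw lead are assumed example values, and friction is ignored):

```python
import math

def screw_jack_ideal_ma(handle_length, screw_lead):
    """Ideal mechanical advantage of a screw jack: one full turn of the
    handle sweeps a circle of circumference 2*pi*r while the load rises
    by the screw lead l, so MA = 2*pi*r / l (friction ignored)."""
    return 2 * math.pi * handle_length / screw_lead

# Assumed example: 0.4 m handle, 5 mm lead
ma = screw_jack_ideal_ma(0.4, 0.005)
print(round(ma))  # ~503: each newton on the handle ideally lifts ~503 N
```

The example also shows why real jacks rely on friction: an ideal MA of several hundred comes with a thread so fine that frictional losses dominate, which is exactly what makes the jack self-locking.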
https://en.wikipedia.org/wiki/Floyd%E2%80%93Steinberg%20dithering
Floyd–Steinberg dithering is an image dithering algorithm first published in 1976 by Robert W. Floyd and Louis Steinberg. It is commonly used by image manipulation software, for example when an image is converted into the GIF format, which is restricted to a maximum of 256 colors. Implementation The algorithm achieves dithering using error diffusion, meaning it pushes (adds) the residual quantization error of a pixel onto its neighboring pixels, to be dealt with later. It spreads the debt out according to the following distribution, shown as a map of the neighboring pixels:

          *    7/16
   3/16  5/16  1/16

The pixel marked with a star (*) is the pixel currently being scanned; the pixels to its left and above it have already been scanned. The algorithm scans the image from left to right, top to bottom, quantizing pixel values one by one. Each time, the quantization error is transferred to the neighboring pixels, while not affecting the pixels that have already been quantized. Hence, if a number of pixels have been rounded downwards, it becomes more likely that the next pixel is rounded upwards, such that on average, the quantization error is close to zero. The diffusion coefficients have the property that if the original pixel values are exactly halfway in between the nearest available colors, the dithered result is a checkerboard pattern. For example, 50% grey data could be dithered as a black-and-white checkerboard pattern. For optimal dithering, the quantization errors should be tracked with sufficient accuracy to prevent rounding errors from affecting the result. In some implementations, the horizontal direction of scan alternates between lines; this is called "serpentine scanning" or boustrophedon transform dithering. The algorithm described above is expressed in the following pseudocode. This works for any approximately linear encoding of pixel values, such as 8-bit integers, 16-bit integers or real numbers in the range [0, 1].

for each y from top to bottom do
    for each x from left to right do
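A minimal Python sketch of the error-diffusion loop, for the simplest case of 1-bit (black/white) output; pixel values are assumed to be floats in [0, 1], stored as a list of rows, and no image library is involved:

```python
def floyd_steinberg(pixels):
    """Dither grayscale values in [0, 1] to black (0.0) or white (1.0),
    in place, diffusing quantization error with the 7/16, 3/16, 5/16,
    1/16 coefficients."""
    h, w = len(pixels), len(pixels[0])
    for y in range(h):
        for x in range(w):
            old = pixels[y][x]
            new = 1.0 if old >= 0.5 else 0.0   # nearest of the two levels
            pixels[y][x] = new
            err = old - new
            # Push the error onto unvisited neighbors only.
            if x + 1 < w:
                pixels[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    pixels[y + 1][x - 1] += err * 3 / 16
                pixels[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    pixels[y + 1][x + 1] += err * 1 / 16
    return pixels

grey = [[0.5] * 8 for _ in range(8)]
floyd_steinberg(grey)  # 50% grey dithers to an alternating B/W pattern
```

Because the error pushed off the right and bottom edges is simply dropped, the output's mean intensity matches the input's only approximately; serpentine scanning reduces the directional artifacts this one-way scan can produce.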
https://en.wikipedia.org/wiki/Terminology%20server
A terminology server is a piece of software providing a range of terminology-related software services through an application programming interface to its client applications. Typical terminology services might include:
- Matching an arbitrary, user-defined text entry string (or regular expression) against a fixed internal list of natural language expressions, possibly using word equivalent, alternate spelling, abbreviation or part-of-speech substitution tables, and other lexical resources, to increase the recall and precision of the matching algorithm
- Retrieving any asserted associations between a fixed list of terminology expressions in one language and translations in another natural language
- Retrieving any asserted associations between a fixed list of terminology expressions and entities in a concept system or ontology (information science)
- Retrieving any asserted or inferrable semantic links between concepts in a concept system or ontology (information science), particularly subsumption (is-a) relationships
- Retrieving any directly asserted, or the best approximate indirectly inferrable, associations between concepts in an ontology and entities in one or more external resources (e.g. libraries of images, decision support rules or statistical classifications)

See also Clinical terminology server
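The first service, matching user-entered text against a term list via synonym and abbreviation tables, can be sketched in a few lines of Python. The concept identifier and vocabulary entries below are invented for illustration, not drawn from any real terminology:

```python
class TerminologyMatcher:
    """Toy terminology-matching service: maps free-text entries to
    concept identifiers via a canonical-term table plus a synonym /
    abbreviation substitution table."""

    def __init__(self):
        self._concepts = {}   # canonical term -> concept id
        self._synonyms = {}   # alternate spelling / abbreviation -> canonical term

    def add(self, concept_id, term, synonyms=()):
        self._concepts[term.lower()] = concept_id
        for alt in synonyms:
            self._synonyms[alt.lower()] = term.lower()

    def match(self, text):
        """Return the concept id for a user-entered string, or None."""
        key = text.strip().lower()
        key = self._synonyms.get(key, key)   # normalize via synonym table
        return self._concepts.get(key)

# Hypothetical vocabulary entry
m = TerminologyMatcher()
m.add("C-0001", "hypertension", synonyms=["htn", "high blood pressure"])
print(m.match("HTN"))  # C-0001
```

A production terminology server would layer fuzzier matching (stemming, edit distance, part-of-speech substitution) on top of this exact-lookup core, trading precision for recall as the article describes.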
https://en.wikipedia.org/wiki/De%20Bruijn%20graph
In graph theory, an n-dimensional De Bruijn graph of m symbols is a directed graph representing overlaps between sequences of symbols. It has m^n vertices, consisting of all possible length-n sequences of the given symbols; the same symbol may appear multiple times in a sequence. For a set S of m symbols, the set of vertices is:

V = S^n = {(s_1, s_2, ..., s_n) : s_i ∈ S}

If one of the vertices can be expressed as another vertex by shifting all its symbols by one place to the left and adding a new symbol at the end of this vertex, then the latter has a directed edge to the former vertex. Thus the set of arcs (that is, directed edges) is:

E = {((v_1, v_2, ..., v_n), (v_2, ..., v_n, s)) : v_i ∈ S, s ∈ S}

Although De Bruijn graphs are named after Nicolaas Govert de Bruijn, they were invented independently by both De Bruijn and I. J. Good. Much earlier, Camille Flye Sainte-Marie implicitly used their properties. Properties If n = 1, then the condition for any two vertices forming an edge holds vacuously, and hence all the vertices are connected, forming a total of m^2 edges. Each vertex has exactly m incoming and m outgoing edges. Each n-dimensional De Bruijn graph is the line digraph of the (n − 1)-dimensional De Bruijn graph with the same set of symbols. Each De Bruijn graph is Eulerian and Hamiltonian. The Euler cycles and Hamiltonian cycles of these graphs (equivalent to each other via the line graph construction) are De Bruijn sequences. The line graph construction of the three smallest binary De Bruijn graphs is depicted below. As can be seen in the illustration, each vertex of the n-dimensional De Bruijn graph corresponds to an edge of the (n − 1)-dimensional De Bruijn graph, and each edge in the n-dimensional De Bruijn graph corresponds to a two-edge path in the (n − 1)-dimensional De Bruijn graph. Dynamical systems Binary De Bruijn graphs can be drawn in such a way that they resemble objects from the theory of dynamical systems, such as the Lorenz attractor: This analogy can be made rigorous: the n-dimensional m-symbol De Bruijn graph is a model of the Bernoulli map x ↦ mx mod 1. The Bernoulli map (also called the 2x mod 1 map, for m = 2) is an ergodic dynamical system,
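The definitions above can be made concrete in a few lines of Python; this small helper enumerates the vertices and arcs of the n-dimensional De Bruijn graph and lets the degree property be verified directly:

```python
from itertools import product

def de_bruijn_graph(symbols, n):
    """Vertices and arcs of the n-dimensional De Bruijn graph on `symbols`.
    Each vertex v (a length-n word) has an arc to v[1:] + s, i.e. v shifted
    left by one symbol with a new symbol s appended."""
    vertices = [''.join(p) for p in product(symbols, repeat=n)]
    arcs = [(v, v[1:] + s) for v in vertices for s in symbols]
    return vertices, arcs

V, E = de_bruijn_graph('01', 3)
print(len(V), len(E))  # 8 vertices (2^3), 16 arcs (2^4)
```

With m symbols there are m^n vertices and m^(n+1) arcs, and every vertex has exactly m incoming and m outgoing arcs, matching the properties listed above.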
https://en.wikipedia.org/wiki/Microformat
Microformats (μF) are a set of defined HTML classes created to serve as consistent and descriptive metadata about an element, designating it as representing a certain type of data (such as contact information, geographic coordinates, events, blog posts, products, recipes, etc.). They allow software to process the information reliably by having set classes refer to a specific type of data rather than being arbitrary. Microformats emerged around 2005 and were predominantly designed for use by search engines, web syndication and aggregators such as RSS. Although the content of web pages has been capable of some "automated processing" since the inception of the web, such processing is difficult because the markup elements used to display information on the web do not describe what the information means. Microformats can bridge this gap by attaching semantics, thereby obviating other, more complicated, methods of automated processing, such as natural language processing or screen scraping. The use, adoption and processing of microformats enables data items to be indexed, searched for, saved or cross-referenced, so that information can be reused or combined. Microformats allow the encoding and extraction of event details, contact information, social relationships and similar information. Microformats2, abbreviated as mf2, is the updated version of microformats. Mf2 provides an easier way of interpreting HTML (HyperText Markup Language) structured syntax and vocabularies than the earlier approaches that made use of RDFa and microdata. Background Microformats emerged around 2005 as part of a grassroots movement to make recognizable data items (such as events, contact details or geographical locations) capable of automated processing by software, as well as directly readable by end-users. Link-based microformats emerged first. These include vote links that express opinions of the linked page, which search engines can tally into instant polls. CommerceNet, a nonprofit or
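For example, a minimal microformats2 h-card marking up contact information is ordinary HTML carrying the mf2 class vocabulary (the root class h-card, with p-name for a plain-text property and u-url for a URL property); the name and URL shown are placeholders:

```html
<!-- An mf2 h-card: parsers read the h-*, p-*, and u-* classes, not the tags -->
<div class="h-card">
  <span class="p-name">Sally Ride</span>
  <a class="u-url" href="https://example.com">example.com</a>
</div>
```

Because the semantics live entirely in the class attributes, the same data can be attached to whatever tags the page already uses for display, which is what lets microformats "bridge the gap" described above without changing the visible markup.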
https://en.wikipedia.org/wiki/Ambient%20network
Ambient Networks is a network integration design that seeks to solve problems relating to switching between networks to maintain contact with the outside world. The project aimed to develop a software-driven network infrastructure that would run on top of all current or future physical network infrastructures, to provide a way for devices to connect to each other, and through each other to the outside world. The concept of Ambient Networks comes from the IST Ambient Networks project, a research project sponsored by the European Commission within the Sixth Framework Programme (FP6). The Ambient Networks Project Ambient Networks was a collaborative project within the European Union's Sixth Framework Programme that investigated future communications systems beyond fixed and 3rd generation mobile networks. It was part of the Wireless World Initiative. The project worked on a new concept called Ambient Networking, to provide suitable mobile networking technology for the future mobile and wireless communications environment. Ambient Networks aimed to provide a unified networking concept that can adapt to the very heterogeneous environment of different radio technologies and service and network environments. Special focus was put on facilitating both competition and cooperation of various market players by defining interfaces which allow the instant negotiation of agreements. This approach went beyond the interworking of well-defined protocols and was expected to have a long-term effect on the business landscape in the wireless world. Central to the project was the concept of composition of networks, an approach to address the dynamic nature of the target environment, based on an open framework for network control functionality, which can be extended with new capabilities as well as operating over existing connectivity infrastructure. Phase 1 of the project (2004–2005) laid the conceptual foundations. The Deliverable D1-5 "Ambient Networks Framework Architecture" su
https://en.wikipedia.org/wiki/WiCell
WiCell Research Institute is a scientific research institute in Madison, Wisconsin that focuses on stem cell research. Independently governed and supported as a 501(c)(3) organization, WiCell operates as an affiliate of the Wisconsin Alumni Research Foundation and works to advance stem cell research at the University of Wisconsin–Madison and beyond. History Established in 1998 to develop stem cell technology, WiCell Research Institute is a nonprofit organization that creates and distributes human pluripotent stem cell lines worldwide. WiCell also provides cytogenetic and technical services, establishes scientific protocols and supports basic research on the UW-Madison campus. WiCell serves as home to the Wisconsin International Stem Cell Bank. This stem cell repository stores, characterizes and provides access to stem cell lines for use in research and clinical development. The cell bank originally stored the first five human Embryonic stem cell lines derived by Dr. James Thomson of UW–Madison. It currently houses human embryonic stem cell lines, induced pluripotent stem cell lines, clinical grade cell lines developed in accordance with Good Manufacturing Practices (GMP) and differentiated cell lines including neural progenitor cells. To support continued progress in the field and help unlock the therapeutic potential of stem cells, in 2005 WiCell began providing cytogenetic services and quality control testing services. These services allow scientists to identify genetic abnormalities in cells or changes in stem cell colonies that might affect research results. Organization Chartered with a mission to support scientific investigation and research at UW–Madison, WiCell collaborates with faculty members and provides support with stem cell research projects. The institute established its cytogenetic laboratory to meet the growing needs of academic and commercial researchers to monitor genetic stability in stem cell cultures. Facilities WiCell maintains its stem c
https://en.wikipedia.org/wiki/Nomen%20nudum
In taxonomy, a nomen nudum ('naked name'; plural nomina nuda) is a designation which looks exactly like a scientific name of an organism, and may have originally been intended to be one, but it has not been published with an adequate description. This makes it a "bare" or "naked" name, which cannot be accepted as it stands. A largely equivalent but much less frequently used term is nomen tantum ("name only"). Sometimes, "nomina nuda" is erroneously considered a synonym for the term "unavailable names". However, not all unavailable names are nomina nuda. In zoology According to the rules of zoological nomenclature a nomen nudum is unavailable; the glossary of the International Code of Zoological Nomenclature gives this definition: And among the rules of that same Zoological Code: In botany According to the rules of botanical nomenclature a nomen nudum is not validly published. The glossary of the International Code of Nomenclature for algae, fungi, and plants gives this definition: The requirements for the diagnosis or description are covered by articles 32, 36, 41, 42, and 44. Nomina nuda that were published before 1 January 1959 can be used to establish a cultivar name. For example, Veronica sutherlandii, a nomen nudum, has been used as the basis for Hebe pinguifolia 'Sutherlandii'. See also Glossary of scientific naming Unavailable name Nomen dubium
https://en.wikipedia.org/wiki/Frontier%20Organic%20Research%20Farm%20Botanical%20Garden
The Frontier Organic Research Farm Botanical Garden 1 acre (4,000 m2) is a botanical garden operated by the Frontier Co-op corporation, and located with the research farm at company headquarters in Norway, Iowa, in the United States. The garden contains 200 botanical species in 17 medicinal herb beds, with an elderberry grove as a windbreak. The cooperative also manages a 15-acre (61,000 m2) native prairie at the site, as well as 68 acres (275,000 m2) in Meigs County, Ohio operated as the National Center for the Preservation of Medicinal Herbs. See also List of botanical gardens in the United States Botanical gardens in Iowa Botanical research institutes Research institutes in Iowa Protected areas of Benton County, Iowa
https://en.wikipedia.org/wiki/Nomen%20dubium
In binomial nomenclature, a nomen dubium (Latin for "doubtful name", plural nomina dubia) is a scientific name that is of unknown or doubtful application. Zoology In case of a nomen dubium, it may be impossible to determine whether a specimen belongs to that group or not. This may happen if the original type series (i. e. holotype, isotype, syntype or paratype) is lost or destroyed. The zoological and botanical codes allow for a new type specimen, or neotype, to be chosen in this case. A name may also be considered a nomen dubium if its name-bearing type is fragmentary or lacking important diagnostic features (this is often the case for species known only as fossils). To preserve stability of names, the International Code of Zoological Nomenclature allows a new type specimen, or neotype, to be chosen for a nomen dubium in this case. 75.5. Replacement of unidentifiable name-bearing type by a neotype. When an author considers that the taxonomic identity of a nominal species-group taxon cannot be determined from its existing name-bearing type (i.e. its name is a nomen dubium), and stability or universality are threatened thereby, the author may request the Commission to set aside under its plenary power [Art. 81] the existing name-bearing type and designate a neotype. For example, the crocodile-like archosaurian reptile Parasuchus hislopi Lydekker, 1885 was described based on a premaxillary rostrum (part of the snout), but this is no longer sufficient to distinguish Parasuchus from its close relatives. This made the name Parasuchus hislopi a nomen dubium. In 2001 a paleontologist proposed that a new type specimen, a complete skeleton, be designated. The International Commission on Zoological Nomenclature considered the case and agreed in 2003 to replace the original type specimen with the proposed neotype. Bacteriology In bacteriological nomenclature, nomina dubia may be placed on the list of rejected names by the Judicial Commission. The meaning of these names
https://en.wikipedia.org/wiki/Guillotine%20cutting
Guillotine cutting is the process of producing small rectangular items of fixed dimensions from a given large rectangular sheet, using only guillotine-cuts. A guillotine-cut (also called an edge-to-edge cut) is a straight bisecting line going from one edge of an existing rectangle to the opposite edge, similarly to a paper guillotine. Guillotine cutting is particularly common in the glass industry. Glass sheets are scored along horizontal and vertical lines, and then broken along these lines to obtain smaller panels. It is also useful for cutting steel plates, cutting of wood sheets to make furniture, and cutting of cardboard into boxes. There are various optimization problems related to guillotine cutting, such as: maximize the total area of the produced pieces, or their total value; minimize the amount of waste (unused parts) of the large sheet, or the total number of sheets. They have been studied in combinatorial geometry, operations research and industrial engineering. A related but different problem is guillotine partition. In that problem, the dimensions of the small rectangles are not fixed in advance. The challenge comes from the fact that the original sheet might not be rectangular - it can be any rectilinear polygon. In particular, it might contain holes (representing defects in the raw material). The optimization goal is usually to minimize the number of small rectangles, or minimize the total length of the cuts. Terminology and assumptions The following terms and notations are often used in the literature on guillotine cutting. The large rectangle, also called the stock sheet, is the raw rectangular sheet which should be cut. It is characterized by its width W0 and height H0, which are the primary inputs to the problem. The small rectangles, also called items, are the required outputs of the cutting. They are characterized by their width wi and height hi, for i in 1,...,m, where m is the number of rectangles. Often, it is allowed to have sever
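Because every guillotine-cut goes edge to edge, a complete guillotine pattern can be encoded as a binary tree: each internal node is a horizontal or vertical cut at some offset, and each leaf is a finished piece. A minimal Python sketch of this idea (the tuple encoding and function name are illustrative, not from any standard library):

```python
def pieces(width, height, pattern):
    """Return the (w, h) of each piece produced by a guillotine pattern.

    `pattern` is None for a leaf (no further cuts), or a tuple
    (direction, offset, first_sub, second_sub) where direction is 'V'
    (vertical cut at x=offset) or 'H' (horizontal cut at y=offset).
    """
    if pattern is None:
        return [(width, height)]
    direction, offset, first, second = pattern
    if direction == 'V':
        assert 0 < offset < width, "cut must cross the rectangle edge to edge"
        return pieces(offset, height, first) + pieces(width - offset, height, second)
    else:  # 'H'
        assert 0 < offset < height, "cut must cross the rectangle edge to edge"
        return pieces(width, offset, first) + pieces(width, height - offset, second)

# Cut a 10x6 stock sheet: one vertical cut at x=4, then split the
# right-hand part horizontally at y=2.
cuts = ('V', 4, None, ('H', 2, None, None))
print(pieces(10, 6, cuts))  # [(4, 6), (6, 2), (6, 4)]
```

Since guillotine cutting produces no waste at the level of the cut tree itself, the piece areas always sum to the stock-sheet area; waste only appears when required item dimensions do not tile a subtree exactly.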
https://en.wikipedia.org/wiki/Myrmecophyte
Myrmecophytes (; literally "ant-plant") are plants that live in a mutualistic association with a colony of ants. There are over 100 different genera of myrmecophytes. These plants possess structural adaptations that provide ants with food and/or shelter. These specialized structures include domatia, food bodies, and extrafloral nectaries. In exchange for food and shelter, ants aid the myrmecophyte in pollination, seed dispersal, gathering of essential nutrients, and/or defense. Specifically, domatia adapted to ants may be called myrmecodomatia. Mutualism Myrmecophytes share a mutualistic relationship with ants, benefiting both the plants and ants. This association may be either facultative or obligate. Obligate In obligate mutualisms, both of the organisms involved are interdependent; they cannot survive on their own. An example of this type of mutualism can be found in the plant genus Macaranga. All species of this genus provide food for ants in various forms, but only the obligate species produce domatia. Some of the most common species of myrmecophytic Macaranga interact with ants in the genus Crematogaster. C. borneensis have been found to be completely dependent on its partner plant, not being able to survive without the provided nesting spaces and food bodies. In laboratory tests, the worker ants did not survive away from the plants, and in their natural habitat they were never found anywhere else. Facultative Facultative mutualism is a type of relationship where the survival of both parties (plant and ants, in this instance) is not dependent upon the interaction. Both organisms can survive without the other species. Facultative mutualisms most often occur in plants that have extrafloral nectaries but no other specialized structures for the ants. These non-exclusive nectaries allow a variety of animal species to interact with the plant. Facultative relationships can also develop between non-native plant and ant species, where co-evolution
https://en.wikipedia.org/wiki/Cocktail%20sauce
Cocktail sauce, also known as seafood sauce, is one of several types of cold or room temperature sauces often served as part of a dish referred to as a seafood cocktail or as a condiment with other seafoods. The sauce, and the dish for which it is named, are often credited to British celebrity chef Fanny Cradock, but seafood cocktails predate her 1967 recipe by some years (for example, Constance Spry published a seafood cocktail using Dublin Bay Prawns in 1956). Origin Seafood cocktails originated in the 19th century in the United States, usually made with oyster or shrimp. Seafood with spiced, cold sauces was a well-established part of the 20th century culinary repertoire. While cocktail sauce is most associated with the prawn cocktail, it can be served with any shellfish. Varieties North America In the United States and Canada it generally consists of, at a minimum, ketchup or chili sauce mixed with prepared horseradish. Lemon juice, Worcestershire sauce and Tabasco sauce are common additives, often all three. Elsewhere The common form of cocktail sauce in the United Kingdom, Ireland, Iceland, France, Belgium, Italy and The Netherlands, usually consists of mayonnaise mixed with a tomato sauce to the same pink colour as prawns, producing a result that could be compared to fry sauce. It is similar to Thousand Island dressing, but the more usual British name is Marie (or Mary) Rose Sauce. The origins of the name are unclear and it is variously credited to a 1980s dive team cook working at the site of the Tudor ship, the Mary Rose, and Fanny Cradock. However, the term first appeared in the 1920s as a term for a garnish of shrimp, and was in use for cocktail sauce by at least 1963. The name was linked to the colour and Escoffier uses it to describe a pink iced pudding. It was so ubiquitous in the 1960s and 1970s that it has since become something of a joke in Britain, along with its most commonly associated dish, the prawn cocktail. In Belgium, a dash of whisky
https://en.wikipedia.org/wiki/Index%20of%20radiation%20articles
absorbed dose Electromagnetic radiation equivalent dose hormesis Ionizing radiation Louis Harold Gray (British physicist) rad (unit) radar radar astronomy radar cross section radar detector radar gun radar jamming (radar reflector) corner reflector radar warning receiver (Radarange) microwave oven radiance (radiant: see) meteor shower radiation Radiation absorption Radiation acne Radiation angle radiant barrier (radiation belt: see) Van Allen radiation belt Radiation belt electron Radiation belt model Radiation Belt Storm Probes radiation budget Radiation burn Radiation cancer (radiation contamination) radioactive contamination Radiation contingency Radiation damage Radiation damping Radiation-dominated era Radiation dose reconstruction Radiation dosimeter Radiation effect radiant energy Radiation enteropathy (radiation exposure) radioactive contamination Radiation flux (radiation gauge: see) gauge fixing radiation hardening (radiant heat) thermal radiation radiant heating radiant intensity radiation hormesis radiation impedance radiation implosion Radiation-induced lung injury Radiation Laboratory radiation length radiation mode radiation oncologist radiation pattern radiation poisoning (radiation sickness) radiation pressure radiation protection (radiation shield) (radiation shielding) radiation resistance Radiation Safety Officer radiation scattering radiation therapist radiation therapy (radiotherapy) (radiation treatment) radiation therapy (radiation units: see) :Category:Units of radiation dose (radiation weight factor: see) equivalent dose radiation zone radiative cooling radiative forcing radiator radio (radio amateur: see) amateur radio (radio antenna) antenna (radio) radio astronomy radio beacon (radio broadcasting: see) broadcasting radio clock (radio communications) radio radio control radio controlled airplane radio controlled car radio-controlled helicopter radio control
https://en.wikipedia.org/wiki/Enom
Enom, Inc. is a domain name registrar and Web hosting company that also sells other products closely tied to domain names, such as SSL certificates, e-mail services, and Website building software. As of May 2016, it manages over 15 million domains. Company history Enom was founded in 1997 in Kirkland, Washington operating as a wholesale business, allowing resellers to sell domains and other services under their own branding. Enom also operates retail sites enomcentral.com and bulkregister.com. In May 2006, Enom was one of the original businesses that were acquired to form privately held Demand Media, headquartered in Santa Monica, California. Within Demand Media, Enom operated as a domain name registrar and as the registrar platform for its media properties, until separating from Demand Media as a brand of Rightside Group, Ltd in 2014. In July 2006, Enom bought out competitor BulkRegister. Prior to its purchase, BulkRegister was a member-supported service where clients were not resellers, but companies large enough to pay an annual membership fee to acquire low registration fees on their domain name registrations, due to the volume they potentially register. With this acquisition, Enom rose to become the second largest domain name registrar. Enom maintained BulkRegister as a separate service until Tucows discontinued it after acquiring Enom. In June 2016, Enom officially launched its revitalized retail experience in a major series of improvements to its developer platform. The changes affected over 14 million domains handled through Enom's channel of partners and resellers, as well as directly through the company's retail interface, which boasts a revitalized aesthetic, including a new color palette and an updated logo. In January 2017, Enom was sold to Canadian domain seller Tucows for US$83.5M. In January 2022, Enom experience
https://en.wikipedia.org/wiki/Bausch%20Health
Bausch Health Companies Inc. (formerly Valeant Pharmaceuticals International, Inc.) is a Canadian multinational specialty pharmaceutical company based in Laval, Quebec, Canada. It develops, manufactures and markets pharmaceutical products and branded generic drugs, primarily for skin diseases, gastrointestinal disorders, eye health and neurology. Bausch Health owns Bausch & Lomb, a supplier of eye health products. Valeant was originally founded in 1959 as ICN Pharmaceuticals by Milan Panić in California. During the 2010s, Valeant adopted a strategy of buying up other pharmaceutical companies which manufactured effective medications for a variety of medical problems, and then increasing the price of those medications. As a result, the company grew rapidly and in 2015 was the most valuable company in Canada. Valeant was involved in a number of controversies surrounding drug price hikes and the use of a specialty pharmacy for the distribution of its drugs. This led to an investigation by the U.S. Securities and Exchange Commission, causing its stock price to plummet more than 90 percent from its peak, while its debt surpassed $30 billion. In July 2018, the name of the company was changed to Bausch Health Companies Inc., in order to distance itself from the public outrage associated with massive price increases introduced by Valeant. At the same time, a new ticker symbol, BHC replaced VRX. History 1959–2002: the Panić years In 1959, Yugoslavian immigrant Milan Panić, who had defected to the US three years earlier, founded ICN Pharmaceuticals (International Chemical and Nuclear Corporation) in his Pasadena garage. Panić ran the company for 43 years, during which ICN established a foothold in the industry by acquiring niche pharmaceuticals and through the development of Ribavirin, an antiviral drug that became the standard treatment for hepatitis C. In 1994, ICN merged with SPI Pharmaceuticals Inc., Viratek Inc., and ICN Biomedicals Inc. On June 12, 2002, followin
https://en.wikipedia.org/wiki/Jacobi%27s%20formula
In matrix calculus, Jacobi's formula expresses the derivative of the determinant of a matrix A in terms of the adjugate of A and the derivative of A. If $A(t)$ is a differentiable map from the real numbers to $n \times n$ matrices, then $\frac{d}{dt}\det A(t) = \operatorname{tr}\left(\operatorname{adj}(A(t))\,\frac{dA(t)}{dt}\right) = \det A(t)\cdot\operatorname{tr}\left(A(t)^{-1}\,\frac{dA(t)}{dt}\right)$, where $\operatorname{tr}(X)$ is the trace of the matrix $X$. (The latter equality only holds if A(t) is invertible.) As a special case, $\frac{\partial \det A}{\partial A_{ij}} = \operatorname{adj}(A)_{ji}$. Equivalently, if $dA$ stands for the differential of $A$, the general formula is $d\det(A) = \operatorname{tr}(\operatorname{adj}(A)\,dA)$. The formula is named after the mathematician Carl Gustav Jacob Jacobi. Derivation Via Matrix Computation We first prove a preliminary lemma: Lemma. Let A and B be a pair of square matrices of the same dimension n. Then $\sum_i \sum_j A_{ij} B_{ij} = \operatorname{tr}(A^{\mathsf T} B)$. Proof. The product AB of the pair of matrices has components $(AB)_{jk} = \sum_i A_{ji} B_{ik}$. Replacing the matrix A by its transpose $A^{\mathsf T}$ is equivalent to permuting the indices of its components: $(A^{\mathsf T} B)_{jk} = \sum_i A_{ij} B_{ik}$. The result follows by taking the trace of both sides: $\operatorname{tr}(A^{\mathsf T} B) = \sum_j (A^{\mathsf T} B)_{jj} = \sum_j \sum_i A_{ij} B_{ij} = \sum_i \sum_j A_{ij} B_{ij}$. Theorem. (Jacobi's formula) For any differentiable map A from the real numbers to n × n matrices, $\frac{d}{dt}\det A(t) = \operatorname{tr}\left(\operatorname{adj}(A(t))\,\frac{dA(t)}{dt}\right)$. Proof. Laplace's formula for the determinant of a matrix A can be stated as $\det(A) = \sum_j A_{ij}\,\operatorname{adj}^{\mathsf T}(A)_{ij}$. Notice that the summation is performed over some arbitrary row i of the matrix. The determinant of A can be considered to be a function of the elements of A: $\det(A) = F(A_{11}, A_{12}, \ldots, A_{nn})$, so that, by the chain rule, its differential is $d\det(A) = \sum_i \sum_j \frac{\partial F}{\partial A_{ij}}\,dA_{ij}$. This summation is performed over all n×n elements of the matrix. To find $\partial F / \partial A_{ij}$ consider that on the right hand side of Laplace's formula, the index i can be chosen at will. (In order to optimize calculations: Any other choice would eventually yield the same result, but it could be much harder). In particular, it can be chosen to match the first index of $\partial / \partial A_{ij}$: $\frac{\partial \det(A)}{\partial A_{ij}} = \frac{\partial \left(\sum_k A_{ik}\,\operatorname{adj}^{\mathsf T}(A)_{ik}\right)}{\partial A_{ij}}$. Thus, by the product rule, $\frac{\partial \det(A)}{\partial A_{ij}} = \sum_k \frac{\partial A_{ik}}{\partial A_{ij}}\,\operatorname{adj}^{\mathsf T}(A)_{ik} + \sum_k A_{ik}\,\frac{\partial\,\operatorname{adj}^{\mathsf T}(A)_{ik}}{\partial A_{ij}}$. Now, if an element of a matrix $A_{ij}$ and a cofactor $\operatorname{adj}^{\mathsf T}(A)_{ik}$ of element $A_{ik}$ lie on the same row (or column), then the cofactor will not be a function of $A_{ij}$, because the cofactor of $A_{ik}$ is expressed in terms of elements not in its own row (nor column). Thus, $\frac{\partial\,\operatorname{adj}^{\mathsf T}(A)_{ik}}{\partial A_{ij}} = 0$, so $\frac{\partial \det(A)}{\partial A_{ij}} = \sum_k \operatorname{adj}^{\mathsf T}(A)_{ik}\,\frac{\partial A_{ik}}{\partial A_{ij}}$. All the elements of A are independent of each other, i.e. $\frac{\partial A_{ik}}{\partial A_{ij}} = \delta_{jk}$, where δ is the Kronecker delta, so
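Jacobi's formula is easy to verify numerically. A small Python sketch, restricted to 2×2 matrices so that the determinant and adjugate have simple closed forms (det A = ad − bc, adj A = [[d, −b], [−c, a]]); the matrix curve A(t) below is an arbitrary illustrative choice:

```python
import math

def det2(A):
    (a, b), (c, d) = A
    return a * d - b * c

def adj2(A):
    # adjugate of a 2x2 matrix
    (a, b), (c, d) = A
    return [[d, -b], [-c, a]]

def trace_prod(X, Y):
    # tr(X Y) for 2x2 matrices, without forming the full product
    return sum(X[i][k] * Y[k][i] for i in range(2) for k in range(2))

def A(t):
    # an arbitrary smooth matrix-valued curve
    return [[math.cos(t), t], [t * t, math.exp(t)]]

def dA(t):
    # its elementwise derivative
    return [[-math.sin(t), 1.0], [2 * t, math.exp(t)]]

t, h = 0.7, 1e-6
numeric = (det2(A(t + h)) - det2(A(t - h))) / (2 * h)  # central difference for d/dt det A(t)
jacobi = trace_prod(adj2(A(t)), dA(t))                 # tr(adj(A(t)) dA/dt)
print(abs(numeric - jacobi) < 1e-6)  # True
```

The central difference agrees with the trace expression to roughly the truncation error of the finite difference, as the formula predicts.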
https://en.wikipedia.org/wiki/Carte%20Bleue
Carte Bleue () was a major debit card payment system operating in France. Unlike Visa Electron or Maestro debit cards, Carte Bleue transactions worked without requiring authorization from the cardholder's bank. In many situations, the card worked like a credit card but without fees for the cardholder. The system has now been integrated into a wider scheme called CB or carte bancaire ("banking card"). All Carte Bleue cards were part of CB, but not all CB cards were Carte Bleue. The system was national, and pure Carte Bleue cards did not operate outside France. However, it is possible and commonplace to get a CB Visa card that operates outside France. Carte Bleue was, technically speaking, the local Visa affiliate. Carte Bleue started in 1967, associating six French banks: BNP, CCF, Crédit du Nord, CIC, Crédit Lyonnais, and Société Générale. Combined Visa cards have existed since 1973 under the name Carte Bleue Internationale, changing to Carte Bleue Visa in 1976. From 1992 on, all Cartes Bleues / CB have been smart cards. When using a Carte Bleue at a French merchant, the PIN of the card must be used, and a microchip on the card verifies and authenticates the transaction. Only some very limited transactions, such as motorway tolls or parking fees, are paid without PIN. Since automatic teller machines also check for the PIN, this measure strongly reduces the incentive to steal Cartes Bleues, since the cards are essentially useless without the PIN (though one may try using the card number for mail-order or e-retailing). Foreign cards without microchips can still be used at French merchants if they accept them, with the usual procedure of swiping the magnetic stripe and signing the receipt. In 2000, Serge Humpich, after failing to convince the makers of a serious flaw he had found two years before, purchased some metro tickets to prove it. He sent the proof to Groupement des Cartes Bancaires. They then initiated criminal action against him, and he was convicted and
https://en.wikipedia.org/wiki/Ethyl%20cinnamate
Ethyl cinnamate is the ester of cinnamic acid and ethanol. It is present in the essential oil of cinnamon. Pure ethyl cinnamate has a "fruity and balsamic odor, reminiscent of cinnamon with an amber note". The p-methoxy derivative is reported to be a monoamine oxidase inhibitor. It can be synthesized by the esterification reaction involving ethanol and cinnamic acid in the presence of sulfuric acid. List of plants that contain the chemical Kaempferia galanga
https://en.wikipedia.org/wiki/Thermal%20runaway
Thermal runaway describes a process that is accelerated by increased temperature, in turn releasing energy that further increases temperature. Thermal runaway occurs in situations where an increase in temperature changes the conditions in a way that causes a further increase in temperature, often leading to a destructive result. It is a kind of uncontrolled positive feedback. In chemistry (and chemical engineering), thermal runaway is associated with strongly exothermic reactions that are accelerated by temperature rise. In electrical engineering, thermal runaway is typically associated with increased current flow and power dissipation. Thermal runaway can occur in civil engineering, notably when the heat released by large amounts of curing concrete is not controlled. In astrophysics, runaway nuclear fusion reactions in stars can lead to nova and several types of supernova explosions, and also occur as a less dramatic event in the normal evolution of solar-mass stars, the "helium flash". Chemical engineering Chemical reactions involving thermal runaway are also called thermal explosions in chemical engineering, or runaway reactions in organic chemistry. It is a process by which an exothermic reaction goes out of control: the reaction rate increases due to an increase in temperature, causing a further increase in temperature and hence a further rapid increase in the reaction rate. This has contributed to industrial chemical accidents, most notably the 1947 Texas City disaster from overheated ammonium nitrate in a ship's hold, and the 1976 explosion of zoalene, in a drier, at King's Lynn. Frank-Kamenetskii theory provides a simplified analytical model for thermal explosion. Chain branching is an additional positive feedback mechanism which may also cause temperature to skyrocket because of rapidly increasing reaction rate. Chemical reactions are either endothermic or exothermic, as expressed by their change in enthalpy. Many reactions are highly exothermic, so ma
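The feedback loop described for exothermic reactions can be illustrated with a toy Euler simulation: an Arrhenius-type heat source, whose rate grows exponentially with temperature, competing against linear Newtonian cooling. This is only a qualitative sketch; all constants below are illustrative, not physical data:

```python
import math

def simulate(heat_scale, steps=20000, dt=0.01):
    """Integrate dT/dt = heating(T) - cooling * (T - T_env) and return the
    final temperature, stopping early once the model has 'run away'."""
    T, T_env = 300.0, 300.0      # kelvin
    activation = 8000.0          # Ea / R, in kelvin (illustrative)
    cooling = 1.0                # heat-loss coefficient (illustrative)
    for _ in range(steps):
        heating = heat_scale * math.exp(-activation / T)  # Arrhenius-type source
        T += (heating - cooling * (T - T_env)) * dt
        if T > 2000.0:           # heating has decisively outpaced cooling
            return T
    return T

print(simulate(heat_scale=1e9))   # weak source: settles just above ambient
print(simulate(heat_scale=1e13))  # strong source: thermal runaway
```

With the weak source, heat loss balances heat release at a temperature barely above ambient; with the strong source, every degree of self-heating accelerates the reaction faster than cooling can remove the energy, which is exactly the uncontrolled positive feedback described above.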
https://en.wikipedia.org/wiki/Glycolic%20acid
Glycolic acid (or hydroxyacetic acid; chemical formula ) is a colorless, odorless and hygroscopic crystalline solid, highly soluble in water. It is used in various skin-care products. Glycolic acid is widespread in nature. A glycolate (sometimes spelled "glycollate") is a salt or ester of glycolic acid. History The name "glycolic acid" was coined in 1848 by French chemist Auguste Laurent (1807–1853). He proposed that the amino acid glycine—which was then called glycocolle—might be the amine of a hypothetical acid, which he called "glycolic acid" (acide glycolique). Glycolic acid was first prepared in 1851 by German chemist Adolph Strecker (1822–1871) and Russian chemist Nikolai Nikolaevich Sokolov (1826–1877). They produced it by treating hippuric acid with nitric acid and nitrogen dioxide to form an ester of benzoic acid and glycolic acid (), which they called "benzoglycolic acid" (Benzoglykolsäure; also benzoyl glycolic acid). They boiled the ester for days with dilute sulfuric acid, thereby obtaining benzoic acid and glycolic acid (Glykolsäure). Preparation Glycolic acid can be synthesized in various ways. The predominant approaches use a catalyzed reaction of formaldehyde with synthesis gas (carbonylation of formaldehyde), for its low cost. It is also prepared by the reaction of chloroacetic acid with sodium hydroxide followed by re-acidification. Other methods, not noticeably in use, include hydrogenation of oxalic acid, and hydrolysis of the cyanohydrin derived from formaldehyde. Some of today's glycolic acids are formic acid-free. Glycolic acid can be isolated from natural sources, such as sugarcane, sugar beets, pineapple, cantaloupe and unripe grapes. Glycolic acid can also be prepared using an enzymatic biochemical process that may require less energy. Properties Glycolic acid is slightly stronger than acetic acid due to the electron-withdrawing power of the terminal hydroxyl group. The carboxylate group can coordinate to metal ions forming coord
https://en.wikipedia.org/wiki/Broadcast%20reference%20monitor
A video reference monitor, also called a broadcast reference monitor or just reference monitor, is a specialized display device similar to a television set, used to monitor the output of a video-generating device, such as playout from a video server, IRD, video camera, VCR, or DVD player. It may or may not have professional audio monitoring capability. Unlike a television set, a video monitor has no tuner and, as such, is unable independently to tune into an over-the-air broadcast like a television receiver. One common use of video monitors is in television stations, television studios, production trucks and in outside broadcast vehicles, where broadcast engineers use them for confidence checking of analog signal and digital signals throughout the system. They can also be used for color grading if calibrated, during post-production. Common display types for video monitors Cathode ray tube Liquid crystal display Plasma display OLED Common monitoring formats for security Composite video S-Video Broadcast reference monitor Broadcast reference monitors must be used for video compliance at television or television studio facilities, because they do not perform any video enhancements and try to produce as accurate an image as possible. For quality control purposes, it is necessary for a broadcast reference monitor to produce (reasonably) consistent images from facility to facility, to reveal any flaws in the material, and also not to introduce any image artifacts (such as aliasing) that is not in the source material. Broadcast monitors will try to avoid post processing such as a video scaler, line doubling and any image enhancements such as dynamic contrast. However, display technologies with fixed pixel structures (e.g. LCD, plasma) must perform image scaling when displaying SD signals as the signal contains non-square pixels while the display has square pixels. LCDs and plasmas are also inherently progressive displays and may need to perform deinterlacing
https://en.wikipedia.org/wiki/Unit%20function
In number theory, the unit function is a completely multiplicative function on the positive integers defined as $\varepsilon(n) = 1$ if $n = 1$, and $\varepsilon(n) = 0$ if $n > 1$. It is called the unit function because it is the identity element for Dirichlet convolution. It may be described as the "indicator function of 1" within the set of positive integers. It is also written as u(n) (not to be confused with μ(n), which generally denotes the Möbius function). See also Möbius inversion formula Heaviside step function Kronecker delta
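The identity property under Dirichlet convolution — (f ∗ ε)(n) = Σ over d | n of f(d) ε(n/d) = f(n) — is easy to check numerically. A small Python sketch (function names are illustrative):

```python
def epsilon(n):
    # the unit function: indicator of n == 1
    return 1 if n == 1 else 0

def dirichlet(f, g, n):
    # Dirichlet convolution (f * g)(n) = sum over divisors d of n of f(d) g(n // d)
    return sum(f(d) * g(n // d) for d in range(1, n + 1) if n % d == 0)

def num_divisors(n):
    # an arbitrary arithmetic function to test against
    return sum(1 for d in range(1, n + 1) if n % d == 0)

# (f * epsilon)(n) == f(n) for every n: epsilon is the Dirichlet identity.
for n in range(1, 30):
    assert dirichlet(num_divisors, epsilon, n) == num_divisors(n)
print("epsilon is the identity for Dirichlet convolution")
```

Only the d = n term of the sum survives, since ε(n/d) vanishes unless n/d = 1, which is the whole content of the identity property.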
https://en.wikipedia.org/wiki/Philips%20circle%20pattern
The Philips circle pattern (also referred to as the Philips pattern or PTV Circle pattern) refers to a family of related electronically generated complex television station colour test cards. The content and layout of the original colour circle pattern was designed by Danish engineer Finn Hendil (1939–2011) in the Philips TV & Test Equipment laboratory in Brøndby Municipality near Copenhagen under supervision of chief engineer Erik Helmer Nielsen in 1966–67, largely building on their previous work with the monochrome PM5540 pattern. The first piece of equipment, the PM5544 colour pattern generator, which generates the pattern, was made by Finn Hendil and his group in 1968–69. The same team would also develop the Spanish TVE colour test card in 1973. Since the widespread introduction of the original PM5544 from the early-1970s, the Philips Pattern has become one of the most commonly used test cards, with only the SMPTE and EBU colour bars as well as the BBC's Test Card F coming close to its usage. The Philips circle pattern was later incorporated into other test pattern generators from Philips itself, as well as test pattern generators from various other manufacturers. Equipment from Philips and succeeding companies which generate the circle pattern are the PM5544, PM5534, PM5535, PM5644, PT5210, PT5230 and PT5300. Other related (non circle pattern) test card generators by Philips are the PM5400 (TV serviceman) family, PM5515/16/18, PM5519, PM5520 (monochrome), PM5522 (PAL), PM5540 (monochrome), PM5547, PM5552 and PM5631. Operation Rather than previous test card approaches that worked by a live camera or monoscope filming a printed card, the Philips PM5544 generates the test patterns fully using electronic circuits, with separate paths for Y, R-Y and B-Y colour components, allowing engineers to reliably test and adjust transmitters and receivers for signal disturbances and colour separation, for instance for PAL broadcasts. In simple terms, the displayed pattern provid
https://en.wikipedia.org/wiki/Strategic%20dominance
In game theory, strategic dominance (commonly called simply dominance) occurs when one strategy is better than another strategy for one player, no matter how that player's opponents may play. Many simple games can be solved using dominance. The opposite, intransitivity, occurs in games where one strategy may be better or worse than another strategy for one player, depending on how the player's opponents may play. Terminology When a player tries to choose the "best" strategy among a multitude of options, that player may compare two strategies A and B to see which one is better. The result of the comparison is one of: B is equivalent to A: choosing B always gives the same outcome as choosing A, no matter what the other players do. B strictly dominates A: choosing B always gives a better outcome than choosing A, no matter what the other players do. B weakly dominates A: choosing B always gives at least as good an outcome as choosing A, no matter what the other players do, and there is at least one set of opponents' actions for which B gives a better outcome than A. (Notice that if B strictly dominates A, then B weakly dominates A. Therefore, we can say "B dominates A" as synonymous with "B weakly dominates A".) B and A are intransitive: B and A are not equivalent, and B neither dominates, nor is dominated by, A. Choosing A is better in some cases, while choosing B is better in other cases, depending on exactly how the opponent chooses to play. For example, B is "throw rock" while A is "throw scissors" in Rock, Paper, Scissors. B is weakly dominated by A: there is at least one set of opponents' actions for which B gives a worse outcome than A, while all other sets of opponents' actions give A the same payoff as B. (Strategy A weakly dominates B). B is strictly dominated by A: choosing B always gives a worse outcome than choosing A, no matter what the other player(s) do. (Strategy A strictly dominates B). This notion can be generalized beyond the comparison of two s
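The strict and weak dominance comparisons above translate directly into code. A small Python sketch over a row player's payoff table, using the conventional textbook Prisoner's Dilemma payoffs as the illustration:

```python
def strictly_dominates(b, a, payoffs):
    """True if strategy b strictly dominates strategy a:
    b is better against every opponent action."""
    return all(pb > pa for pb, pa in zip(payoffs[b], payoffs[a]))

def weakly_dominates(b, a, payoffs):
    """True if b weakly dominates a: b is at least as good against every
    opponent action, and strictly better against at least one."""
    rows = list(zip(payoffs[b], payoffs[a]))
    return all(pb >= pa for pb, pa in rows) and any(pb > pa for pb, pa in rows)

# payoffs[s] lists the row player's payoff for strategy s against each
# opponent action, here [opponent cooperates, opponent defects].
payoffs = {
    "cooperate": [3, 0],
    "defect":    [5, 1],
}
print(strictly_dominates("defect", "cooperate", payoffs))  # True
print(weakly_dominates("defect", "cooperate", payoffs))    # True (strict implies weak)
```

As the terminology section notes, strict dominance implies weak dominance, which the two checks reflect: any profile passing the `>` test everywhere also passes the `>=`-everywhere-plus-`>`-somewhere test.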
https://en.wikipedia.org/wiki/Solar%20cell
A solar cell or photovoltaic cell (PV cell) is an electronic device that converts the energy of light directly into electricity by means of the photovoltaic effect. It is a form of photoelectric cell, a device whose electrical characteristics (such as current, voltage, or resistance) vary when exposed to light. Individual solar cell devices are often the electrical building blocks of photovoltaic modules, known colloquially as "solar panels". The common single-junction silicon solar cell can produce a maximum open-circuit voltage of approximately 0.5 to 0.6 volts. Photovoltaic cells may operate under sunlight or artificial light. In addition to producing energy, they can be used as a photodetector (for example infrared detectors), detecting light or other electromagnetic radiation near the visible range, or measuring light intensity. The operation of a PV cell requires three basic attributes: The absorption of light, generating excitons (bound electron-hole pairs), unbound electron-hole pairs (via excitons), or plasmons. The separation of charge carriers of opposite types. The separate extraction of those carriers to an external circuit. In contrast, a solar thermal collector supplies heat by absorbing sunlight, for the purpose of either direct heating or indirect electrical power generation from heat. A "photoelectrolytic cell" (photoelectrochemical cell), on the other hand, refers either to a type of photovoltaic cell (like that developed by Edmond Becquerel and modern dye-sensitized solar cells), or to a device that splits water directly into hydrogen and oxygen using only solar illumination. Photovoltaic cells and solar collectors are the two means of producing solar power. Applications Assemblies of solar cells are used to make solar modules that generate electrical power from sunlight, as distinguished from a "solar thermal module" or "solar hot water panel". A solar array generates solar power using solar energy. Vehicular applications Application
https://en.wikipedia.org/wiki/Joint%20constraints
Joint constraints are rotational constraints on the joints of an artificial system. They are used in an inverse kinematics chain, in fields including 3D animation and robotics. Joint constraints can be implemented in a number of ways, but the most common method is to limit rotation about the X, Y and Z axes independently. An elbow, for instance, could be represented by limiting rotation on the X and Z axes to 0 degrees and constraining the Y-axis rotation to 130 degrees. To simulate joint constraints more accurately, dot products can be used with an independent axis to repulse the child bone's orientation from the unreachable axis. Limiting the orientation of the child bone to a border of vectors tangent to the surface of the joint, repulsing the child bone away from the border, can also be useful in the precise restriction of shoulder movement.
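A minimal sketch of the per-axis limiting described above, assuming Euler angles in degrees; the function and limit layout are illustrative, not from any particular animation package:

```python
def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def apply_joint_limits(rotation, limits):
    """rotation: (x, y, z) Euler angles; limits: per-axis (min, max) tuples."""
    return tuple(clamp(angle, lo, hi)
                 for angle, (lo, hi) in zip(rotation, limits))

# Elbow as in the text: X and Z locked at 0 degrees, Y allowed to bend 0..130.
elbow_limits = [(0, 0), (0, 130), (0, 0)]
print(apply_joint_limits((25.0, 150.0, -10.0), elbow_limits))  # (0, 130, 0)
```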
https://en.wikipedia.org/wiki/Eisenstein%20integer
In mathematics, the Eisenstein integers (named after Gotthold Eisenstein), occasionally also known as Eulerian integers (after Leonhard Euler), are the complex numbers of the form z = a + bω, where a and b are integers and ω = (−1 + i√3)/2 = e^(2πi/3) is a primitive (hence non-real) cube root of unity. The Eisenstein integers form a triangular lattice in the complex plane, in contrast with the Gaussian integers, which form a square lattice in the complex plane. The Eisenstein integers are a countably infinite set.

Properties

The Eisenstein integers form a commutative ring of algebraic integers in the algebraic number field Q(ω) – the third cyclotomic field. To see that the Eisenstein integers are algebraic integers, note that each z = a + bω is a root of the monic polynomial z² − (2a − b)z + (a² − ab + b²). In particular, ω satisfies the equation ω² + ω + 1 = 0. The product of two Eisenstein integers a + bω and c + dω is given explicitly by (a + bω)(c + dω) = (ac − bd) + (ad + bc − bd)ω. The 2-norm of an Eisenstein integer is just its squared modulus, and is given by |a + bω|² = a² − ab + b², which is clearly a positive ordinary (rational) integer. Also, the complex conjugate of ω satisfies ω̄ = ω² = −1 − ω. The group of units in this ring is the cyclic group formed by the sixth roots of unity in the complex plane: {±1, ±ω, ±ω²}, the Eisenstein integers of norm 1.

Euclidean domain

The ring of Eisenstein integers forms a Euclidean domain whose norm is given by the square modulus, as above: N(a + bω) = a² − ab + b². A division algorithm, applied to any dividend α and divisor β ≠ 0, gives a quotient κ and a remainder ρ smaller than the divisor, satisfying α = κβ + ρ with N(ρ) < N(β). Here, α, β, κ, ρ are all Eisenstein integers. This algorithm implies the Euclidean algorithm, which proves Euclid's lemma and the unique factorization of Eisenstein integers into Eisenstein primes.

One division algorithm is as follows. First perform the division in the field of complex numbers, and write the quotient in terms of ω: α/β = a + bω for rational a, b. Then obtain the Eisenstein integer quotient by rounding the rational coefficients to the nearest integer: κ = ⌊a⌉ + ⌊b⌉ω. Here ⌊·⌉ may denote any of the standard rounding-to-integer functions. The reason this satisfies N(ρ) < N(β) is that rounding moves each rational coefficient by at most 1/2, and for |x| ≤ 1/2 and |y| ≤ 1/2 the norm x² − xy + y² is at most 3/4, which is strictly less than 1.
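The division algorithm can be sketched directly from these formulas, representing an Eisenstein integer a + bω (with ω = (−1 + i√3)/2) as the integer pair (a, b); the helper names are mine:

```python
def norm(x):
    a, b = x
    return a * a - a * b + b * b

def multiply(x, y):
    # (a + bω)(c + dω) = (ac − bd) + (ad + bc − bd)ω, using ω² = −1 − ω.
    a, b = x
    c, d = y
    return (a * c - b * d, a * d + b * c - b * d)

def divmod_eisenstein(alpha, beta):
    """Return (kappa, rho) with alpha = kappa*beta + rho and norm(rho) < norm(beta)."""
    a, b = alpha
    c, d = beta
    n = norm(beta)
    # alpha / beta = alpha * conj(beta) / norm(beta); conj(c + dω) = (c − d) − dω.
    num = multiply(alpha, (c - d, -d))
    # Round each rational coefficient of the exact quotient to the nearest integer.
    kappa = (round(num[0] / n), round(num[1] / n))
    prod = multiply(kappa, beta)
    rho = (a - prod[0], b - prod[1])
    return kappa, rho

kappa, rho = divmod_eisenstein((7, 3), (2, 1))
assert norm(rho) < norm((2, 1))  # remainder strictly smaller than divisor
```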
https://en.wikipedia.org/wiki/Wake%20therapy
Wake therapy (sometimes sleep deprivation therapy) is a specific application of intentional sleep deprivation. It encompasses many sleep-restricting paradigms that aim to address mood disorders with a form of non-pharmacological therapy.

Description

Wake therapy was first popularized in 1966 and 1971 following articles by Schulte and by Pflug and Tölle describing striking symptom relief in depressed individuals after one night of total sleep deprivation. Wake therapy can involve partial sleep deprivation, which usually consists of restricting sleep to 4–6 hours, or total sleep deprivation, in which an individual stays up for more than 24 consecutive hours. During total sleep deprivation, an individual typically stays up about 36 hours, spanning a normal awakening time until the evening after the deprivation. It can also involve shifting the sleep schedule to be later or earlier than a typical schedule (e.g. going to bed at 5 am), which is called sleep phase advancement. Older studies involved the repetition of sleep deprivation in the treatment of depression, either until the person showed a response to the treatment or until the person had reached a threshold for the possible number of sleep deprivation treatments. Chronic sleep deprivation is dangerous, and in modern research studies it is no longer common to repeat sleep deprivation treatments close together in time. Recent closed-loop paradigms have been developed to selectively deprive individuals of some stages of sleep but not others. During slow-wave sleep deprivation, researchers monitor the electrical activity on an individual's scalp with electroencephalography (EEG) in real time, and use sensory stimulation (e.g. playing a sound such as a burst of pink noise) to disrupt that stage of sleep (e.g. to interfere with slow waves during non-rapid eye movement sleep). The goal of these paradigms is to investigate how the deprivation of one type of sleep stage, or of sleep microarchitecture, affects outcome measures.
https://en.wikipedia.org/wiki/One-dimensional%20symmetry%20group
A one-dimensional symmetry group is a mathematical group that describes symmetries in one dimension (1D). A pattern in 1D can be represented as a function f(x) for, say, the color at position x. The only nontrivial point group in 1D is a simple reflection. It can be represented by the simplest Coxeter group, A1, [ ], or its Coxeter–Dynkin diagram. Affine symmetry groups represent translation. Isometries which leave the function unchanged are translations x + a with a such that f(x + a) = f(x) and reflections a − x with a such that f(a − x) = f(x). The reflections can be represented by the affine Coxeter group [∞], representing two reflections, and the translational symmetry as [∞]+, the composite of two reflections.

Point group

For a pattern without translational symmetry there are the following possibilities (1D point groups):
the symmetry group is the trivial group (no symmetry)
the symmetry group is one of the groups each consisting of the identity and reflection in a point (isomorphic to Z2)

Discrete symmetry groups

These affine symmetries can be considered limiting cases of the 2D dihedral and cyclic groups.

Translational symmetry

Consider all patterns in 1D which have translational symmetry, i.e., functions f(x) such that for some a > 0, f(x + a) = f(x) for all x. For these patterns, the values of a for which this property holds form a group. We first consider patterns for which the group is discrete, i.e., for which the positive values in the group have a minimum. By rescaling we make this minimum value 1. Such patterns fall in two categories, the two 1D space groups or line groups. In the simpler case the only isometries of R which map the pattern to itself are translations; this applies, e.g., for the pattern

− −−− − −−− − −−− − −−−

Each isometry can be characterized by an integer, namely plus or minus the translation distance. Therefore the symmetry group is Z. In the other case, among the isometries that map the pattern to itself there are also reflections.
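As a small illustration of the discrete case, here is a sketch under two simplifying assumptions of mine: one period of a pattern is encoded as a string, and cyclic comparison stands in for an infinitely repeated pattern. Every translation fixing the pattern is then an integer multiple of the minimal period, which is why the group is isomorphic to Z.

```python
def minimal_period(pattern):
    """Smallest a > 0 such that shifting the (cyclic) pattern by a fixes it."""
    n = len(pattern)
    for a in range(1, n + 1):
        if all(pattern[(i + a) % n] == pattern[i] for i in range(n)):
            return a
    return n

# Motif like the "− −−−" pattern in the text, with '.' marking gaps:
print(minimal_period("X.XXX."))  # 6: only the whole motif repeats
print(minimal_period("XX.XX."))  # 3: a shorter sub-period exists
```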
https://en.wikipedia.org/wiki/The%20Advanced%20Visualizer
The Advanced Visualizer (TAV), a 3D graphics software package, was the flagship product of Wavefront Technologies from the 1980s until the 1990s.

History

The software package became famous for its use in the production of numerous Oscar-winning movies such as The Abyss, Terminator 2: Judgment Day and Jurassic Park.

Alias|Wavefront Merger

The merger was widely seen as the result of Microsoft purchasing Softimage in an attempt to take over the 3D computer graphics market. Silicon Graphics responded by purchasing Alias Systems Corporation and its two major competitors, Wavefront and the French company TDI (Thomson Digital Images), the latter for its Explore, IPR, and GUI technologies. Thus SGI created the super-company Alias|Wavefront. Wavefront's programmers continued to reside in California, but the management of the company was carried out in Toronto, Canada.

Autodesk Era

In 1996 Alias|Wavefront announced the release of Maya, which incorporated aspects of all three software suites. Wavefront was renamed to Alias Technologies and acquired by Autodesk in 2005. Some of the technology under Autodesk's ownership is still sold today as part of Maya.

Architecture

In contrast to much modern-day (2011) computer graphics animation software, TAV was a set of independent programs that each focused on one aspect of image synthesis, as opposed to a monolithic product. The collection of these smaller programs formed the entire suite based on simple interchange of mostly ASCII file formats such as OBJ. The major components of the TAV software suite included Model, Paint, Dynamation, Kinemation, Preview, and fcheck. Composer was also available as an add-on for compositing of imagery. Many primitive utility programs such as graphics converters were included in the toolkit and were frequently employed for batch processing via shell scripts. The modular nature allowed these loosely coupled lightweight programs to start up quickly with relatively small memory footprints. It was not uncommon to run several of these programs simultaneously.
https://en.wikipedia.org/wiki/Carl%20Johnson%20%28Grand%20Theft%20Auto%29
Carl "CJ" Johnson is a fictional character and the playable protagonist of the 2004 video game Grand Theft Auto: San Andreas, the fifth main installment in Rockstar Games' Grand Theft Auto series. He is voiced by Young Maylay, who also served as the likeness for the character. In the game's storyline, Carl is depicted as the underboss of the Grove Street Families, a fictional street gang based in his home city of Los Santos, San Andreas (a fictional parody of Los Angeles, California). The gang is led by his older brother, "Sweet" Johnson, with whom Carl became estranged after the death of their younger brother, Brian, in a gang-related attack prior to the events of the game. Feeling his life in Los Santos is unpromising, Carl eventually decides to move to Liberty City in 1987, only to return home five years later after his mother, Beverly, is similarly killed. While seeking to make amends with friends and family for abandoning them, Carl returns to his previous gangster lifestyle as he embarks on an eventful journey across the entire state of San Andreas, which sees him making new allies and clashing with several powerful criminal organisations and corrupt authorities. Although initially portrayed as somewhat naïve, clumsy, and selfish, Carl develops over the course of San Andreas' storyline, both on a professional level, becoming a successful criminal and businessman, and on an emotional level, as he learns to appreciate his roots and those around him. Carl received critical acclaim, with praise going to his complexity, lack of stereotype and his sense of conscience, and is regarded as one of the greatest video game characters of all time. Concept and design Portrayal Carl's physical appearance is modeled after Los Angeles-based rapper and actor Young Maylay, who also provided the character's voice and motion-capture work. When asked about the character model for Carl, Young Maylay stated that the development team took "very professional" photographs of him to
https://en.wikipedia.org/wiki/Jericho%20Forum
The Jericho Forum was an international group working to define and promote de-perimeterisation. It was initiated by David Lacey of the Royal Mail, and grew out of a loose affiliation of interested corporate CISOs (chief information security officers) discussing the topic from the summer of 2003, after an initial meeting hosted by Cisco; it was officially founded in January 2004. It declared success, and merged with The Open Group industry consortium's Security Forum in 2014.

The problem

It was created because the founding members claimed that no one else was appropriately discussing the problems surrounding de-perimeterisation. They felt the need to create a forum to consistently define and solve such issues. One of the earlier outputs of the group is a position paper entitled the Jericho Forum Commandments, a set of principles that describe how best to survive in a de-perimeterised world.

Membership

The Jericho Forum consisted of "user members" and "vendor members". Originally, only user members were allowed to stand for election. In December 2008 this was relaxed, allowing either vendor or user members to be eligible for election. The day-to-day management was provided by The Open Group. While the Jericho Forum had its foundations in the UK, nearly all the initial members worked for corporates with global responsibilities, and involvement grew to Europe, North America and Asia Pacific.

Results

After the initial focus on defining the problem, de-perimeterisation, the Forum moved on to defining the solution, which it delivered in the publication of the Collaboration Oriented Architecture (COA) paper and the COA Framework paper. The next focus of the Jericho Forum was "Securely Collaborating in Clouds", which involves applying the COA concepts to the emerging cloud computing paradigm. The basic premise is that a collaborative approach is essential to gain most value from "the cloud". Much of this work was transferred to the Cloud
https://en.wikipedia.org/wiki/Doenjang
Doenjang (; "thick sauce") or soybean paste is a type of fermented bean paste made entirely of soybean and brine. It is also a byproduct of soup soy sauce production. It is sometimes used as a relish. History The earliest soybean fermentations in Korea seem to have begun prior to the era of the Three Kingdoms. The Records of the Three Kingdoms, a Chinese historical text written and published in the third century AD, mentions that "Goguryeo people are good at brewing fermented soybeans" in the section named Dongyi (Eastern foreigners), in the Book of Wei. Jangdoks used for doenjang production are found in the mural paintings of Anak Tomb No.3 from the 4th century Goguryeo. In Samguk Sagi, a historical record of the Three Kingdoms era, it is written that doenjang and ganjang along with meju and jeotgal were prepared for the wedding ceremony of the King Sinmun in February 683. Sikhwaji, a section from Goryeosa (History of Goryeo), recorded that doenjang and ganjang were included in the relief supplies in 1018, after a Khitan invasion, and in 1052, when a famine occurred. Joseon texts such as Guhwangchwaryo and Jeungbo sallim gyeongje contain detailed procedures on how to brew good-quality doenjang and ganjang. Gyuhap chongseo explains how to pick a date for brewing, what to forbear, and how to keep and preserve doenjang and ganjang. Production Doenjang is made entirely of fermented soybean and brine. Soup soy sauce is also made during the doenjang production. Meju, Korean soybean brick, is made around ipdong in early November. Soybeans are soaked overnight, boiled in salt water, and then pounded in a mortar (jeolgu) or coarsely ground in a millstone. About a doe (≈1.8 litres) or two does of pounded soybean is chunked, compressed, and shaped into a cube or a sphere called meju. The meju bricks are then dried in a cool, shaded area for a week to several weeks until firm. When the bricks harden, they are tied with rice straws to the eaves of the house, or put in th
https://en.wikipedia.org/wiki/Droste%20effect
The Droste effect, known in art as an example of mise en abyme, is the effect of a picture recursively appearing within itself, in a place where a similar picture would realistically be expected to appear. This produces a loop which in theory could go on forever, but in practice only continues as far as the image's resolution allows. The effect is named after Droste, a Dutch brand of cocoa, with an image designed by Jan Misset in 1904. The Droste effect has since been used in the packaging of a variety of products. Apart from advertising, the effect is also seen in the Dutch artist M. C. Escher's 1956 lithograph Print Gallery, which portrays a gallery that depicts itself. The effect has been widely used on the covers of comic books, mainly in the 1940s.

Origins

The Droste effect is named after the image on the tins and boxes of Droste cocoa powder, which displayed a nurse carrying a serving tray with a cup of hot chocolate and a box bearing the same image, designed by Jan Misset. This familiar image was introduced in 1904 and maintained for decades, with slight variations from 1912 by artists including Adolphe Mouron. The poet and columnist Nico Scheepmaker introduced wider usage of the term in the late 1970s.

Mathematics

The appearance is recursive: the smaller version contains an even smaller version of the picture, and so on. Only in theory could this go on forever, as fractals do; practically, it continues only as long as the resolution of the picture allows, which is relatively short, since each iteration geometrically reduces the picture's size.

Medieval art

The Droste effect was anticipated by Giotto early in the 14th century, in his Stefaneschi Triptych. The altarpiece portrays in its centre panel Cardinal Giacomo Gaetani Stefaneschi offering the triptych itself to St. Peter. There are also several examples from medieval times of books featuring images containing the book itself, or window panels in churches depicting miniature copies of the window itself.
https://en.wikipedia.org/wiki/Atkins%20Nutritionals
Atkins Nutritionals, Inc. was founded by Robert Atkins in order to promote the low-carbohydrate packaged foods of the Atkins diet. As of 2017, it is part of The Simply Good Foods Company. The company sells low-carbohydrate bars, shakes, and snacks.

History

Atkins Nutritionals, Inc. was originally founded as Complementary Formulations in 1989 and was renamed Atkins Nutritionals in 1998. It was founded to sell products complementing the Atkins diet. The diet was developed after Atkins read a research paper in the Journal of the American Medical Association. The paper, entitled "Weight Reduction", was published by Alfred W. Pennington in 1958. Atkins used information from the study to resolve his own overweight condition. In October 2003, Parthenon Capital LLC and Goldman Sachs both acquired stakes in the company. Following the death of its founder in 2003, the popularity of the diet and demand for Atkins products waned, causing Atkins Nutritionals Inc. to file for bankruptcy in July 2005, citing losses of $340 million. The company emerged from bankruptcy in 2007, owned by North Castle Partners. Roark Capital Group bought the company in 2010. In February 2015 it was reported that Roark was seeking to sell the company. In April 2017, it was reported that Conyers Park Acquisition Corporation had acquired Atkins from Roark. Conyers Park and Atkins combined under a new company called The Simply Good Foods Company.

See also

List of food companies
South Beach Living
https://en.wikipedia.org/wiki/Heat%20shock%20response
The heat shock response (HSR) is a cell stress response that increases the number of molecular chaperones to combat the negative effects on proteins caused by stressors such as increased temperatures, oxidative stress, and heavy metals. In a normal cell, proteostasis (protein homeostasis) must be maintained because proteins are the main functional units of the cell. Many proteins take on a defined configuration in a process known as protein folding in order to perform their biological functions. If these structures are altered, critical processes could be affected, leading to cell damage or death. The heat shock response can be employed under stress to induce the expression of heat shock proteins (HSP), many of which are molecular chaperones, that help prevent or reverse protein misfolding and provide an environment for proper folding. Protein folding is already challenging due to the crowded intracellular space where aberrant interactions can arise; it becomes more difficult when environmental stressors can denature proteins and cause even more non-native folding to occur. If the work by molecular chaperones is not enough to prevent incorrect folding, the protein may be degraded by the proteasome or autophagy to remove any potentially toxic aggregates. Misfolded proteins, if left unchecked, can lead to aggregation that prevents the protein from moving into its proper conformation and eventually leads to plaque formation, which may be seen in various diseases. Heat shock proteins induced by the HSR can help prevent protein aggregation that is associated with common neurodegenerative diseases such as Alzheimer's, Huntington's, or Parkinson's disease. Induction of the heat shock response With the introduction of environmental stressors, the cell must be able to maintain proteostasis. Acute or chronic subjection to these harmful conditions elicits a cytoprotective response to promote stability to the proteome. HSPs (e.g. HSP70, HSP90, HSP60, etc.) are present under
https://en.wikipedia.org/wiki/Winged%20edge
In computer graphics, the winged edge data structure is a way to represent polygon meshes in computer memory. It is a type of boundary representation and describes both the geometry and topology of a model. Three types of records are used: vertex records, edge records, and face records. Given a reference to an edge record, one can answer several types of adjacency queries (queries about neighboring edges, vertices and faces) in constant time. This kind of adjacency information is useful for algorithms such as subdivision surfaces.

Features

The winged edge data structure explicitly describes the geometry and topology of faces, edges, and vertices when three or more surfaces come together and meet at a common edge. The ordering is such that the surfaces are ordered counter-clockwise with respect to the innate orientation of the intersection edge. Moreover, the representation allows numerically unstable situations like that depicted below. The winged edge data structure allows for quick traversal between faces, edges, and vertices due to the explicitly linked structure of the network. It serves adjacency queries in constant time with little storage overhead. This rich form of specifying an unstructured grid is in contrast to simpler specifications of polygon meshes, such as a node and element list, or the implied connectivity of a regular grid. An alternative to the winged edge data structure is the half-edge data structure.

Structure and pseudocode

The face and vertex records are relatively simple, while the edge record is more complex. For each vertex, its record stores only the vertex's position (e.g. coordinates) and a reference to one incident edge. The other edges can be found by following further references in the edge. Similarly, each face record only stores a reference to one of the edges surrounding the face. There is no need to store the direction of the edge relative to the face (CCW or CW), as the face can be trivially compared to the edge's own left face.
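The three record types can be sketched as follows. Field names are illustrative, not a standard API, and real implementations differ in how they orient the four "wing" references:

```python
from dataclasses import dataclass

@dataclass
class Vertex:
    position: tuple          # e.g. (x, y, z) coordinates
    edge: "Edge" = None      # any one incident edge

@dataclass
class Face:
    edge: "Edge" = None      # any one edge bounding the face

@dataclass
class Edge:
    start: Vertex = None
    end: Vertex = None
    left: Face = None        # face on the left of start -> end
    right: Face = None
    # The four wings: previous/next edges around each adjacent face.
    left_prev: "Edge" = None
    left_next: "Edge" = None
    right_prev: "Edge" = None
    right_next: "Edge" = None

def edges_of_face(face):
    """Walk the edges around a face, constant time per step."""
    e = face.edge
    while True:
        yield e
        # Compare the face to the edge's left face to pick the right wing.
        e = e.left_next if e.left is face else e.right_next
        if e is face.edge:
            break
```

The `edges_of_face` walk shows the constant-time adjacency query mentioned above: no searching is needed because every step follows one stored reference.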
https://en.wikipedia.org/wiki/Fly-killing%20device
A fly-killing device is used for pest control of flying insects, such as houseflies, wasps, moths, gnats, and mosquitoes.

Flyswatter

A flyswatter (or fly-swat, fly swatter) usually consists of a small rectangular or round sheet of a lightweight, flexible, vented material (usually thin metallic, rubber, or plastic mesh), attached to a lightweight handle of wire, wood, plastic, or metal. The venting or perforations minimize the disruption of air currents, which are detected by an insect and allow escape, and also reduce air resistance, making it easier to hit a fast-moving target. A flyswatter is ideally lightweight and stiff, allowing quick acceleration to overcome the fast reaction time of the fly (six to ten times faster than a human's), while also minimizing damage caused by hitting other objects. The flyswatter usually works by mechanically crushing the fly against a hard surface, after the user has waited for the fly to land somewhere. However, users can also injure or stun an airborne insect mid-flight by whipping the swatter through the air at extreme speed.

History

Warding off insects with short horsetail staffs and fans is an ancient practice, dating back to the Egyptian pharaohs. The earliest flyswatters were in fact nothing more than some sort of striking surface attached to the end of a long stick. An early patent on a commercial flyswatter was issued in 1900 to Robert R. Montgomery, who called it a fly-killer. Montgomery sold his patent to John L. Bennett, a wealthy inventor and industrialist who made further improvements on the design. The origin of the name "flyswatter" comes from Dr. Samuel Crumbine, a member of the Kansas board of health, who wanted to raise public awareness of the health issues caused by flies. He was inspired by a chant at a local Topeka softball game: "swat the ball". In a health bulletin published soon afterwards, he exhorted Kansans to "swat the fly".
https://en.wikipedia.org/wiki/The%20Fountain%20of%20Age
The Fountain of Age is a book written by Betty Friedan, who also wrote The Feminine Mystique. It is a study of aging and how people face aging. External links Booknotes interview with Friedan on Fountain of Age, November 28, 1993. Geriatrics
https://en.wikipedia.org/wiki/River%20Raid
River Raid is a vertically scrolling shooter designed and programmed by Carol Shaw and published by Activision in 1982 for the Atari 2600 video game console. Over a million game cartridges were sold. Activision later ported the title to the Atari 5200, ColecoVision, and Intellivision consoles, as well as to the Commodore 64, IBM PCjr, MSX, ZX Spectrum, and Atari 8-bit family. Shaw did the Atari 8-bit and Atari 5200 ports herself. Activision published a less successful sequel in 1988 without Shaw's involvement.

Gameplay

Viewed from a top-down perspective, the player flies a fighter jet over the River of No Return in a raid behind enemy lines. The player's jet can only move left and right—it cannot maneuver up and down the screen—but it can accelerate and decelerate. The player's jet crashes if it collides with the riverbank or an enemy craft, or if the jet runs out of fuel. Assuming fuel can be replenished, and if the player evades damage, gameplay is essentially unlimited. The player scores points for shooting enemy tankers (30 points), helicopters (60 points), fuel depots (80 points), jets (100 points), and bridges (500 points). The jet refuels when it flies over a fuel depot. A bridge marks the end of a game level. Non-Atari 2600 ports of the game add hot air balloons that are worth 60 points when shot, as well as tanks along the sides of the river that shoot at the player's jet. Destroyed bridges also serve as the game's checkpoints: if the player crashes, the next jet starts at the last destroyed bridge.

Development

For its time, River Raid provided an inordinate amount of non-random, repeating terrain despite constrictive computer memory limits. For the Atari 2600, the game's program code and graphics had to fit into a 4 KB ROM. The game program does not actually store the sequence of terrain and other objects. Instead, a procedural generation algorithm employing a linear-feedback shift register with a hard-coded starting value recreates the same sequence on every play.
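This is not Shaw's actual code, but a linear-feedback shift register with a hard-coded seed can be sketched to show why no terrain storage is needed: the same seed always reproduces the same stream, so the "map" costs only the generator and its starting value.

```python
def lfsr8(seed):
    """8-bit Fibonacci LFSR; taps for the maximal polynomial x^8+x^6+x^5+x^4+1."""
    state = seed
    while True:
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 4)) & 1
        state = (state >> 1) | (bit << 7)
        yield state

# The seed value here is arbitrary, standing in for the game's hard-coded one.
gen_a = lfsr8(0xB7)
gen_b = lfsr8(0xB7)
first = [next(gen_a) for _ in range(8)]
again = [next(gen_b) for _ in range(8)]
assert first == again  # same seed, same "terrain" every time
```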
https://en.wikipedia.org/wiki/Genome%20size
Genome size is the total amount of DNA contained within one copy of a single complete genome. It is typically measured in terms of mass in picograms (trillionths (10⁻¹²) of a gram, abbreviated pg) or less frequently in daltons, or as the total number of nucleotide base pairs, usually in megabases (millions of base pairs, abbreviated Mb or Mbp). One picogram is equal to 978 megabases. In diploid organisms, genome size is often used interchangeably with the term C-value. An organism's complexity is not directly proportional to its genome size; total DNA content is widely variable between biological taxa. Some single-celled organisms have much more DNA than humans, for reasons that remain unclear (see non-coding DNA and C-value enigma).

Origin of the term

The term "genome size" is often erroneously attributed to a 1976 paper by Ralph Hinegardner, even in discussions dealing specifically with terminology in this area of research (e.g., Greilhuber 2005). Notably, Hinegardner used the term only once: in the title. The term actually seems to have first appeared in 1968, when Hinegardner wondered, in the last paragraph of another article, whether "cellular DNA content does, in fact, reflect genome size". In this context, "genome size" was being used in the sense of genotype to mean the number of genes. In a paper submitted only two months later, Wolf et al. (1969) used the term "genome size" throughout and in its present usage; therefore these authors should probably be credited with originating the term in its modern sense. By the early 1970s, "genome size" was in common usage with its present definition, probably as a result of its inclusion in Susumu Ohno's influential book Evolution by Gene Duplication, published in 1970.

Variation in genome size and gene content

With the emergence of various molecular techniques in the past 50 years, the genome sizes of thousands of eukaryotes have been analyzed, and these data are available in online databases for animals and plants.
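The stated conversion (1 pg ≈ 978 Mbp) in a minimal sketch; the function names are mine and the example mass is arbitrary:

```python
MBP_PER_PG = 978  # one picogram of DNA corresponds to about 978 megabases

def picograms_to_megabases(pg):
    return pg * MBP_PER_PG

def megabases_to_picograms(mbp):
    return mbp / MBP_PER_PG

# An illustrative genome mass of 1.5 pg:
print(round(picograms_to_megabases(1.5)))  # 1467
```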
https://en.wikipedia.org/wiki/M%C3%BCllerian%20mimicry
Müllerian mimicry is a natural phenomenon in which two or more well-defended species, often foul-tasting and sharing common predators, have come to mimic each other's honest warning signals, to their mutual benefit. The benefit to Müllerian mimics is that predators only need one unpleasant encounter with one member of a set of Müllerian mimics, and thereafter avoid all similar coloration, whether or not it belongs to the same species as the initial encounter. It is named after the German naturalist Fritz Müller, who first proposed the concept in 1878, supporting his theory with the first mathematical model of frequency-dependent selection, one of the first such models anywhere in biology. Müllerian mimicry was first identified in tropical butterflies that shared colourful wing patterns, but it is found in many groups of insects such as bumblebees, and other animals such as poison frogs and coral snakes. The mimicry need not be visual; for example, many snakes share auditory warning signals. Similarly, the defences involved are not limited to toxicity; anything that tends to deter predators, such as foul taste, sharp spines, or defensive behaviour can make a species unprofitable enough to predators to allow Müllerian mimicry to develop. Once a pair of Müllerian mimics has formed, other mimics may join them by advergent evolution (one species changing to conform to the appearance of the pair, rather than mutual convergence), forming mimicry rings. Large rings are found for example in velvet ants. Since the frequency of mimics is positively correlated with survivability, rarer mimics are likely to adapt to resemble commoner models, favouring both advergence and larger Müllerian mimicry rings. Where mimics are not strongly protected by venom or other defences, honest Müllerian mimicry becomes, by degrees, the better-known bluffing of Batesian mimicry. History Origins Müllerian mimicry was proposed by the German zoologist and naturalist Fritz Müller (1821–1897). An
https://en.wikipedia.org/wiki/Narrow%20bipolar%20pulse
Narrow bipolar pulses are high-energy, high-altitude, intra-cloud electrical discharges associated with thunderstorms. NBPs are similar to other lightning events such as return strokes and dart leaders, but produce optical emissions at least an order of magnitude weaker. They typically occur in the 10–20 km altitude range and can emit a power on the order of a few hundred gigawatts. They produce far-field asymmetric bipolar electric field change signatures (called narrow bipolar events).
https://en.wikipedia.org/wiki/GBR%20code
The GBR code (or Guy–Blandford–Roycroft code) is a system of representing the position of chess pieces on a chessboard. Publications such as EG use it to classify endgame types and to index endgame studies. The code is named after Richard Guy, Hugh Blandford and John Roycroft. The first two devised the original system (the Guy–Blandford code) using different figures to represent the number of pieces. Roycroft suggested counting one for a white piece and three for a black piece in order to make the code easier to memorise. Definition In the GBR code, every chess position is represented by six digits, in the following format: abcd.ef a = queens b = rooks c = bishops d = knights e = white pawns f = black pawns For the first four digits, each of the first two white pieces counts as 1, and each of the first two black pieces counts as 3. Thus, for example, if White has two knights and Black has one knight, numeral d = 1 + 1 + 3 = 5. If that is all the material other than the kings, the position is classified 0005. Values 0 through 8 represent all normal permutations of force. 9 is used if either side has three or more pieces of the same non-pawn type; these positions are possible in standard chess due to pawn promotion. The last two digits of the code represent the number of white and black pawns, respectively. Usage GBR code can be used to refer to a general class of material. For example, the endgame of two knights against pawn (as famously analysed by A.A. Troitsky, leading to his discovery of the Troitsky line), is GBR class 0002.01. When indexing or referring to specific positions, rather than generalised material imbalances, the code may be extended in various ways. Two common ones are to prefix "+" to indicate the stipulation "White to play and win" or "=" for "White to play and draw"; and to suffix the position of the white and black kings. With these additions, the position to the right, a draw study by Leonid Kubbel (First Prize, Shakhmaty, 1925), is clas
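The digit rule described above can be captured in a few lines. This is a sketch (function names are my own); piece counts are given per type, excluding the kings:

```python
def gbr_digit(white, black):
    """One digit of the GBR code for a single piece type (Q, R, B or N).
    9 flags three or more pieces of that type on either side."""
    if white >= 3 or black >= 3:
        return 9
    return white + 3 * black  # each white piece counts 1, each black piece 3

def gbr_code(white_counts, black_counts):
    """counts: dicts keyed by piece letter Q/R/B/N/P (pieces besides kings)."""
    digits = "".join(
        str(gbr_digit(white_counts.get(p, 0), black_counts.get(p, 0)))
        for p in "QRBN"
    )
    # Last two digits: the white and black pawn counts, respectively.
    return f"{digits}.{white_counts.get('P', 0)}{black_counts.get('P', 0)}"

# Two knights against pawn, as in the Troitsky example above:
print(gbr_code({"N": 2}, {"P": 1}))  # 0002.01
```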
https://en.wikipedia.org/wiki/Ethnobiology
Ethnobiology is the scientific study of the way living things are treated or used by different human cultures. It studies the dynamic relationships between people, biota, and environments, from the distant past to the immediate present. "People-biota-environment" interactions around the world are documented and studied through time, across cultures, and across disciplines in a search for valid, reliable answers to two 'defining' questions: "How and in what ways do human societies use nature, and how and in what ways do human societies view nature?" History Beginnings (15th century–19th century) Biologists have been interested in local biological knowledge since the time Europeans started colonising the world, from the 15th century onwards. As Paul Sillitoe has noted, local biological knowledge, collected and sampled over these early centuries, significantly informed the early development of modern biology: during the 17th century, Georg Eberhard Rumphius benefited from local biological knowledge in producing his catalogue, "Herbarium Amboinense", covering more than 1,200 species of plants in Indonesia; during the 18th century, Carl Linnaeus relied upon Rumphius's work, and also corresponded with other people all around the world when developing the biological classification scheme that now underlies the arrangement of much of the accumulated knowledge of the biological sciences; during the 19th century, Charles Darwin, the 'father' of evolutionary theory, on his Voyage of the Beagle took interest in the local biological knowledge of peoples he encountered. Phase I (1900s–1940s) Ethnobiology itself, as a distinctive practice, only emerged during the 20th century as part of the records then being made about other peoples, and other cultures. As a practice, it was nearly always ancillary to other pursuits when documenting others' languages, folklore, and natural resource use. Roy Ellen commented that: This 'first phase' in the development of ethnobiology as a
https://en.wikipedia.org/wiki/Folk%20biology
Folk biology (or folkbiology) is the cognitive study of how people classify and reason about the organic world. Humans everywhere classify animals and plants into obvious species-like groups. The relationship between a folk taxonomy and a scientific classification can assist in understanding how evolutionary theory deals with the apparent constancy of "common species" and the organic processes centering on them. From the vantage of evolutionary psychology, such natural systems are arguably routine "habits of mind", a sort of heuristic used to make sense of the natural world.
https://en.wikipedia.org/wiki/Microsoft%20Cluster%20Server
Microsoft Cluster Server (MSCS) is a computer program that allows server computers to work together as a computer cluster, to provide failover and increased availability of applications, or parallel calculating power in the case of high-performance computing (HPC) clusters (as in supercomputing). Microsoft has three technologies for clustering: Microsoft Cluster Service (MSCS, an HA clustering service), Component Load Balancing (CLB) (part of Application Center 2000), and Network Load Balancing Services (NLB). With the release of Windows Server 2008, the MSCS service was renamed to Windows Server Failover Clustering (WSFC), and the Component Load Balancing (CLB) feature became deprecated. Prior to Windows Server 2008, clustering required (per Microsoft KBs) that all nodes in a cluster be as identical as possible, from hardware, drivers, and firmware through to software. From Windows Server 2008 onwards, however, Microsoft relaxed the requirements, stating that only the operating system needs to be at the same level (such as patch level). Background Cluster Server was codenamed "Wolfpack" during its development. Windows NT Server 4.0, Enterprise Edition was the first version of Windows to include the MSCS software. The software has since been updated with each new server release. The cluster software evaluates the resources of servers in the cluster and chooses which are used based on criteria set in the administration module. In June 2006, Microsoft released Windows Compute Cluster Server 2003, the first high-performance computing (HPC) cluster technology offering from Microsoft. History Microsoft's first attempt at a cluster server, originally priced at $10,000, ran into problems: buggy software caused spurious fail-overs, forcing the workload of two servers onto a single server. This resulted in poor allocation of resources, poor server performance, and very poor reviews from analysts. The announc
https://en.wikipedia.org/wiki/Film%20frame
In filmmaking, video production, animation, and related fields, a frame is one of the many still images which compose the complete moving picture. The term is derived from the historical development of film stock, in which the sequentially recorded single images look like a framed picture when examined individually. The term may also be used more generally as a noun or verb to refer to the edges of the image as seen in a camera viewfinder or projected on a screen. Thus, the camera operator can be said to keep a car in frame by panning with it as it speeds past. Overview When the moving picture is displayed, each frame is flashed on a screen for a short time (nowadays, usually 1/24, 1/25 or 1/30 of a second) and then immediately replaced by the next one. Persistence of vision blends the frames together, producing the illusion of a moving image. The frame is also sometimes used as a unit of time, so that a momentary event might be said to last six frames, the actual duration of which depends on the frame rate of the system, which varies according to the video or film standard in use. In North America and Japan, 30 frames per second (fps) is the broadcast standard, with 24 frames/s now common in production for high-definition video shot to look like film. In much of the rest of the world, 25 frames/s is standard. In systems historically based on NTSC standards, for reasons originally related to the Chromilog NTSC TV systems, the exact frame rate is actually (3579545 / 227.5) / 525 = 29.97002616 fps. This leads to many synchronization problems which are unknown outside the NTSC world, and also brings about hacks such as drop-frame timecode. In film projection, 24 fps is the norm, except in some special venue systems, such as IMAX, Showscan and Iwerks 70, where 30, 48 or even 60 frame/s have been used. Silent films and 8 mm amateur movies used 16 or 18 frame/s. Physical film frames In a strip of movie film, individual frames are separated by frame lines. Normal
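The exact NTSC frame rate quoted above follows directly from the colour subcarrier frequency and the line structure; a quick check:

```python
# NTSC colour systems: 3,579,545 Hz colour subcarrier, 227.5 subcarrier
# cycles per scan line, and 525 scan lines per frame.
subcarrier_hz = 3579545
line_rate = subcarrier_hz / 227.5   # scan lines per second
frame_rate = line_rate / 525        # frames per second
print(f"{frame_rate:.8f}")          # 29.97002616
```

The fractional rate (rather than an even 30 fps) is what makes drop-frame timecode necessary: timecode counted at a nominal 30 fps drifts against real time by about 3.6 seconds per hour.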
https://en.wikipedia.org/wiki/C%20Sharp%20%28programming%20language%29
C# (pronounced "see sharp") is a general-purpose, high-level programming language supporting multiple paradigms. C# encompasses static typing, strong typing, lexical scoping, and imperative, declarative, functional, generic, object-oriented (class-based), and component-oriented programming disciplines. The C# programming language was designed by Anders Hejlsberg from Microsoft in 2000 and was later approved as an international standard by Ecma (ECMA-334) in 2002 and ISO/IEC (ISO/IEC 23270) in 2003. Microsoft introduced C# along with .NET Framework and Visual Studio, both of which were closed-source. At the time, Microsoft had no open-source products. Four years later, in 2004, a free and open-source project called Mono began, providing a cross-platform compiler and runtime environment for the C# programming language. A decade later, Microsoft released Visual Studio Code (code editor), Roslyn (compiler), and the unified .NET platform (software framework), all of which support C# and are free, open-source, and cross-platform. Mono also joined Microsoft but was not merged into .NET. The most recent stable version of the language is C# 11.0, which was released in 2022 with .NET 7.0. Design goals The Ecma standard lists these design goals for C#: The language is intended to be a simple, modern, general-purpose, object-oriented programming language. The language, and implementations thereof, should provide support for software engineering principles such as strong type checking, array bounds checking, detection of attempts to use uninitialized variables, and automatic garbage collection. Software robustness, durability, and programmer productivity are important. The language is intended for use in developing software components suitable for deployment in distributed environments. Portability is very important for source code and programmers, especially those already familiar with C and C++. Support for internationalization is very important. C# is intended to be suitable for wr
https://en.wikipedia.org/wiki/Deltatheridium
Deltatheridium (meaning triangle beast or delta beast) is an extinct genus of metatherian. It lived in what is now Mongolia during the Upper Cretaceous, circa 80 million years ago. A study in 2022 strongly suggested that Deltatheridium was a marsupial, making it the earliest known member of this group. It had a length of about . Its teeth indicate it was carnivorous. One specimen of Archaeornithoides may attest to an attack by this mammal, the skull bearing tooth marks that match its teeth. Other Mesozoic mammals from Mongolia Kamptobaatar Zalambdalestes
https://en.wikipedia.org/wiki/MS-CHAP
MS-CHAP is the Microsoft version of the Challenge-Handshake Authentication Protocol (CHAP). Versions The protocol exists in two versions, MS-CHAPv1 (defined in RFC 2433) and MS-CHAPv2 (defined in RFC 2759). MS-CHAPv2 was introduced with pptp3-fix, which was included in Windows NT 4.0 SP4, and was added to Windows 98 in the "Windows 98 Dial-Up Networking Security Upgrade Release" and to Windows 95 in the "Dial Up Networking 1.3 Performance & Security Update for MS Windows 95" upgrade. Windows Vista dropped support for MS-CHAPv1. Applications MS-CHAP is used as one authentication option in Microsoft's implementation of the PPTP protocol for virtual private networks. It is also used as an authentication option with RADIUS servers which are used with IEEE 802.1X (e.g., WiFi security using the WPA-Enterprise protocol). It is further used as the main authentication option of the Protected Extensible Authentication Protocol (PEAP). Features Compared with CHAP, MS-CHAP works by negotiating CHAP Algorithm 0x80 (0x81 for MS-CHAPv2) in LCP option 3, Authentication Protocol. It provides an authenticator-controlled password change mechanism. It provides an authenticator-controlled authentication retry mechanism and defines failure codes returned in the Failure packet message field. MS-CHAPv2 provides mutual authentication between peers by piggybacking a peer challenge on the response packet and an authenticator response on the success packet. MS-CHAP requires each peer to know either the plaintext password or an MD4 hash of the password, and does not transmit the password over the link. As such, it is not compatible with most password storage formats. Flaws Weaknesses have been identified in MS-CHAP and MS-CHAPv2. The DES encryption used in NTLMv1 and MS-CHAPv2 to encrypt the NTLM password hash enables custom hardware attacks utilizing brute force. As of 2012, MS-CHAP had been completely broken. After Windows 11 22H2, with the default activation of Windo
https://en.wikipedia.org/wiki/Gerber%20Legendary%20Blades
Gerber Legendary Blades is an American maker of knives, multitools, and other tools for the outdoors and military, headquartered in Portland, Oregon. Gerber is owned by the Finnish outdoors products company Fiskars. Gerber was established in 1939 by Joseph Gerber. Gerber is the "largest maker of knives and multi-tools for the United States armed forces." The LMF II Infantry Knife features a partial-tang blade instead of a full-tang blade, ostensibly to avoid electric shocks, because the knife was designed to free pilots from downed aircraft. Gerber was the first knife company to collaborate with a custom knife maker when it collaborated with World War II knife maker David Murphy. In 2010 Bear Grylls designed a line of Gerber survival knives, including the best-selling Ultimate knife. The Bear Grylls range from Gerber expanded to include items such as a water bottle, survival kit and tinder grinder. By 2019 the cooperation between Bear Grylls and Gerber had ended. History In 1910, the Gerber family started an advertising firm in Portland, Oregon. While working for the family business, Joseph Gerber mailed twenty-four sets of kitchen knives to clients during the holidays. These handmade knives were very popular, with then-catalog retailer Abercrombie & Fitch requesting more of these knives from Gerber to sell in their catalog in 1939. Gerber started Gerber Legendary Blades that same year. In 1966, the company relocated to new headquarters in Tigard, Oregon. Finnish company Fiskars purchased the private company in 1987. Chad Vincent was hired as chief executive officer in July 2001. By October 2003, the company employed three hundred people, had revenues near $100 million, and was the second-leading seller of multitools in the United States, after Leatherman, another company based in the Portland area. Designs Designers who have designed knives for Gerber include: Bob Loveless, Paul Poehlmann, Blackie Collins, William Harsey Jr., Fred Carter, Rick Hinderer, Bra
https://en.wikipedia.org/wiki/Syntype
In biological nomenclature, a syntype is any one of two or more biological types that is listed in a description of a taxon where no holotype was designated. Precise definitions of this and related terms for types have been established as part of the International Code of Zoological Nomenclature and the International Code of Nomenclature for algae, fungi, and plants. In zoology In zoological nomenclature, a syntype is defined as "Each specimen of a type series (q.v.) from which neither a holotype nor a lectotype has been designated [Arts. 72.1.2, 73.2, 74]. The syntypes collectively constitute the name-bearing type." (Glossary of the zoological Code). Historically, syntypes were often explicitly designated as such, and under the present ICZN this is a requirement (Art. 72.3), but modern attempts to publish species or subspecies descriptions based on syntypes are generally frowned upon by practicing taxonomists, and most are gradually being replaced by lectotypes. Those that still exist are still considered name-bearing types. A lectotype may be designated from among the syntypes, reducing the other specimens to the status of paralectotype. They are no longer name-bearing types, though if the lectotype is lost or destroyed, it is generally preferable to use a paralectotype as a replacement (neotype). Where specimens in a syntype series are found to belong to different taxa, this may cause nomenclatural instability, since the nominal species can be interpreted in different ways. In botany In botanical nomenclature, a syntype can be made in the description of a species or an infraspecific taxon. It is defined as "any specimen cited in the protologue when there is no holotype, or any one of two or more specimens simultaneously designated as types." (Art. 9.5). See also Type (biology)
https://en.wikipedia.org/wiki/Hemifacial%20microsomia
Hemifacial microsomia (HFM) is a congenital disorder that affects the development of the lower half of the face, most commonly the ears, the mouth and the mandible. It usually occurs on one side of the face, but both sides are sometimes affected. If severe, it may result in difficulties in breathing due to obstruction of the trachea—sometimes even requiring a tracheotomy. With an incidence in the range of 1:3500 to 1:4500, it is the second most common birth defect of the face, after cleft lip and cleft palate. HFM shares many similarities with Treacher Collins syndrome. Presentation The clinical presentation of HFM is quite variable. The severity may depend on the extent of the area with an insufficient blood supply in utero, and the gestational age of the fetus at which this occurs. In some people, the only physical manifestation may be a small and underdeveloped external ear. In more severe cases, multiple parts of the face may be affected. Some people with HFM may have sensorineural hearing loss and decreased visual acuity or even blindness. In particularly severe forms of HFM, extracranial anomalies are also present to some extent. Some of the internal organs (especially the heart, kidneys, and lungs) may be underdeveloped, or in some cases even absent altogether. The affected organs are typically on the same side as the affected facial features, but bilateral involvement occurs in approximately 10% of cases. Deformities of the vertebral column such as scoliosis may also be observed. While there is no universally accepted grading scale, the OMENS scale (standing for Orbital, Mandible, Ear, Nerves and Soft tissue) was developed to help describe the heterogeneous phenotype that makes up this sequence or syndrome. Intellectual disability is not typically seen in people with HFM. Hemifacial microsomia sometimes results in temporomandibular joint disorders. Cause The condition develops in the fetus at approximately 4 weeks gestational
https://en.wikipedia.org/wiki/Borel%20summation
In mathematics, Borel summation is a summation method for divergent series, introduced by Émile Borel in 1899. It is particularly useful for summing divergent asymptotic series, and in some sense gives the best possible sum for such series. There are several variations of this method that are also called Borel summation, and a generalization of it called Mittag-Leffler summation. Definition There are (at least) three slightly different methods called Borel summation. They differ in which series they can sum, but are consistent, meaning that if two of the methods sum the same series they give the same answer. Throughout let $A(z) = \sum_{k=0}^{\infty} a_k z^k$ denote a formal power series and define the Borel transform of $A$ to be its equivalent exponential series $\mathcal{B}A(t) = \sum_{k=0}^{\infty} \frac{a_k}{k!} t^k$. Borel's exponential summation method Let $A_n(z) = \sum_{k=0}^{n} a_k z^k$ denote the partial sum. A weak form of Borel's summation method defines the Borel sum of $A$ to be $\lim_{t \to \infty} e^{-t} \sum_{n=0}^{\infty} \frac{t^n}{n!} A_n(z)$. If this converges at $z \in \mathbb{C}$ to some function $a(z)$, we say that the weak Borel sum of $A$ converges at $z$, and write $\sum a_k z^k = a(z)\ (\mathbf{wB})$. Borel's integral summation method Suppose that the Borel transform converges for all positive real numbers to a function growing sufficiently slowly that the following integral is well defined (as an improper integral); the Borel sum of $A$ is given by $\int_0^{\infty} e^{-t} \mathcal{B}A(tz)\, dt$. If the integral converges at $z \in \mathbb{C}$ to some $a(z)$, we say that the Borel sum of $A$ converges at $z$, and write $\sum a_k z^k = a(z)\ (\mathbf{B})$. Borel's integral summation method with analytic continuation This is similar to Borel's integral summation method, except that the Borel transform need not converge for all $t$, but converges to an analytic function of $t$ near 0 that can be analytically continued along the positive real axis. Basic properties Regularity The methods $(\mathbf{B})$ and $(\mathbf{wB})$ are both regular summation methods, meaning that whenever $A(z)$ converges (in the standard sense), then the Borel sum and weak Borel sum also converge, and do so to the same value, i.e. $\sum_{k=0}^{\infty} a_k z^k = A(z) < \infty \ \Rightarrow\ A(z) = a(z)\ (\mathbf{B}) = a(z)\ (\mathbf{wB})$. Regularity of $(\mathbf{B})$ is easily seen by a change in order of integration, which is valid due to absolute convergence: if $A(z)$ is convergent at $z$, then $A(z) = \sum_{k=0}^{\infty} a_k z^k = \sum_{k=0}^{\infty} a_k \left( \int_0^{\infty} e^{-t} t^k \, dt \right) \frac{z^k}{k!} = \int_0^{\infty} e^{-t} \mathcal{B}A(tz)\, dt$, where the rightmost exp
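A standard worked example (the classic one, not drawn from the text above) is Euler's divergent series, which the integral method sums:

```latex
% Euler's series: divergent for every z \neq 0
A(z) = \sum_{k=0}^{\infty} (-1)^k \, k! \, z^k
% Its Borel transform: the factorials cancel, leaving a geometric series
\mathcal{B}A(t) = \sum_{k=0}^{\infty} (-1)^k t^k = \frac{1}{1+t}, \qquad |t| < 1,
% which continues analytically along the positive real axis, so for z > 0
\sum_{k=0}^{\infty} (-1)^k \, k! \, z^k = \int_0^{\infty} \frac{e^{-t}}{1 + zt}\, dt \quad (\mathbf{B}).
```

Since the Borel transform only converges for |t| < 1 but extends analytically to all t > 0, this example needs the third method (integral summation with analytic continuation).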
https://en.wikipedia.org/wiki/Simple%20Gateway%20Monitoring%20Protocol
Simple Gateway Monitoring Protocol (SGMP), defined in RFC 1028, allows commands to be issued to application protocol entities to set or retrieve values (integer or octet string types) for use in monitoring the gateways on which the application protocol entities reside. Messages are exchanged using UDP, an unreliable transport, on UDP port 153. Some examples of things that can be monitored are listed below. Network Type for interfaces: IEEE 802.3 MAC, IEEE 802.4 MAC, IEEE 802.5 MAC, Ethernet, ProNET-80, ProNET-10, FDDI, X.25, Point-to-Point Serial, ARPA 1822 HDH, ARPA 1822, AppleTalk, StarLAN Interface Status (down, up, attempting, etc.) Route Type (local, remote, sub-network, etc.) Routing Protocol (RIP, EGP, GGP, IGRP, Hello) The protocol was replaced by SNMP (Simple Network Management Protocol). Sources RFC 1028
https://en.wikipedia.org/wiki/Moxi%20%28DVR%29
Moxi was a line of high-definition digital video recorders produced by Moxi Digital, Digeo, and Arris International. Moxi was originally released only to cable operators, but in December 2008 it was released as a retail product. Moxi was removed from the market in November 2011. The former retail product, the Moxi HD DVR, provided a high-definition user interface with support for either two or three CableCARD TV tuners. Arris also offered a companion appliance, the Moxi Mate, which could stream live or recorded TV from a Moxi HD DVR. History Digeo was founded in 1999 (originally under the name Broadband Partners, Inc.) by Microsoft co-founder Paul Allen, with headquarters in Kirkland, Washington. In the same year, Rearden Steel was started by Steve Perlman, founder of WebTV, under a veil of secrecy. In 2000, Rearden Steel was renamed Moxi Digital while unveiling a line of media centers designed to bridge the gap between personal computers and televisions. Digeo, Inc. purchased Moxi Digital in 2002. Digeo kept its own name but adopted Moxi as its product family name. Its Palo Alto offices and most of Moxi Digital's staff were kept. Digeo also adopted most of the Moxi hardware (originally focused on satellite consumer electronics), as well as some of the Linux extensions, which were merged into Digeo's own Linux-based infrastructure and cable-specific hardware with Digeo's Emmy award-winning user interface, known as Moxi Menu. On September 22, 2009, the assets of Digeo, Inc. were purchased by Arris International. Arris announced it would continue to develop and market the Moxi product line to both retail customers and cable operators. Retail DVR products The Moxi HD DVR was a high-definition digital video recorder (DVR) with both three-tuner and two-tuner models available, though the two-tuner model was produced only briefly before being updated. It was designed for use with cable television and supported multi-stream CableCARDs, as well as channel scanning f
https://en.wikipedia.org/wiki/End%20system
In networking jargon, a computer, phone, or internet of things device connected to a computer network is sometimes referred to as an end system or end station, because it sits at the edge of the network. The end user directly interacts with an end system that provides information or services. End systems that are connected to the Internet are also referred to as internet hosts; this is because they host (run) internet applications such as a web browser or an email retrieval program. The Internet's end systems include some computers with which the end user does not directly interact. These include mail servers, web servers, or database servers. With the emergence of the internet of things, household items (such as toasters and refrigerators) as well as portable, handheld computers and digital cameras are all being connected to the internet as end systems. End systems are generally connected to each other using switching devices known as routers rather than using a single communication link. The path that transmitted information takes from the sending end system, through a series of communications links and routers, to the receiving end system is known as a route or path through the network. The sending and receiving route can be different, and can be reallocated during transmission due to changes in the network topology. Normally the cheapest or fastest route is chosen. For the end user the actual routing should be completely transparent. See also Communication endpoint Data terminal equipment Edge device End instrument Host (network) Node (networking) Terminal (telecommunication)
https://en.wikipedia.org/wiki/Request%20Tracker
Request Tracker, commonly abbreviated to RT, is an open source tool that organizations of all sizes use to track and manage workflows, customer requests, and internal project tasks of all sorts, with features including email integration, custom ticket lifecycles, configurable automation, and detailed permissions and roles. Request Tracker began as ticket-tracking software written in Perl, used to coordinate tasks and manage requests among an online community of users. RT's first release in 1996 was written by Jesse Vincent, who later formed Best Practical Solutions LLC to distribute, develop, and support the package. RT is free and open-source software (FOSS) distributed under the GNU General Public License. Request Tracker for Incident Response (RTIR) is a special distribution of RT built to fulfill the specific needs of CERT teams. RTIR is at once a tool specific to incident management, a general-purpose tool teams can use for other tasks, and a tool that can be—and very often is—extended into a fully customized system built on layers of user integrations and customizations. It was initially developed in cooperation with JANET-CERT, and in 2006 was upgraded and expanded with joint funding from nine Computer Security Incident Response Teams (CSIRTs) in Europe. Technology RT is written in Perl and runs on the Apache and lighttpd web servers using mod_perl or FastCGI, with data stored in either MySQL, PostgreSQL, Oracle or SQLite. It is possible to extend the RT interface using plug-ins written in Perl. History Jesse Vincent, while enrolled at Wesleyan University in 1994, worked for Wesleyan's computing help desk and was responsible for improving the help desk and residential networking software infrastructure. This task included setting up a ticketing system for the help desk. Initially he set up a Linux server to run "req", but later he identified that the command line interface was limiting usage. Over the next two years he created and maintained WebReq, a web-based interface for re
https://en.wikipedia.org/wiki/Odometry
Odometry is the use of data from motion sensors to estimate change in position over time. It is used in robotics by some legged or wheeled robots to estimate their position relative to a starting location. This method is sensitive to errors due to the integration of velocity measurements over time to give position estimates. Rapid and accurate data collection, instrument calibration, and processing are required in most cases for odometry to be used effectively. The word odometry is composed of the Greek words odos (meaning "route") and metron (meaning "measure"). Example Suppose a robot has rotary encoders on its wheels or on its legged joints. It drives forward for some time and then would like to know how far it has traveled. It can measure how far the wheels have rotated, and if it knows the circumference of its wheels, compute the distance. Train operations are also frequent users of odometrics. Typically, a train gets an absolute position by passing over stationary sensors in the tracks, while odometry is used to calculate relative position while the train is between the sensors. More sophisticated example Suppose that a simple robot has two wheels which can both move forward or reverse and that they are positioned parallel to one another, and equidistant from the center of the robot. Further, assume that each motor has a rotary encoder, and so one can determine if either wheel has traveled one "unit" forward or reverse along the floor. This unit is the ratio of the circumference of the wheel to the resolution of the encoder. If the left wheel were to move forward one unit while the right wheel remained stationary, then the right wheel acts as a pivot, and the left wheel traces a circular arc in the clockwise direction. Since one's unit of distance is usually tiny, one can approximate by assuming that this arc is a line. Thus, the original position of the left wheel, the final position of the left wheel, and the position of the right wheel form a triang
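The two-wheel estimate described above can be sketched in code. This is a minimal illustration under the small-motion approximation the text mentions (each wheel arc treated as a straight segment); the function name and the midpoint-heading choice are mine, not a standard API:

```python
import math

def odometry_update(x, y, theta, d_left, d_right, axle_width):
    """Dead-reckoning update for a two-wheel differential-drive robot.

    d_left, d_right: distance each wheel travelled since the last update,
    i.e. encoder ticks times (wheel circumference / encoder resolution).
    axle_width: distance between the two wheels.
    """
    d_center = (d_left + d_right) / 2.0          # distance moved by the centre
    d_theta = (d_right - d_left) / axle_width    # heading change (radians)
    # Advance along the average heading over the step (midpoint approximation).
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Both wheels forward one unit: straight-line motion along the heading.
print(odometry_update(0.0, 0.0, 0.0, 1.0, 1.0, 0.5))  # (1.0, 0.0, 0.0)
```

Because each update integrates a relative measurement, small per-step errors accumulate, which is exactly the drift sensitivity noted above.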
https://en.wikipedia.org/wiki/Mathematical%20methods%20in%20electronics
Mathematical methods are integral to the study of electronics. Mathematics in electronics Electronics engineering careers usually include courses in calculus (single and multivariable), complex analysis, differential equations (both ordinary and partial), linear algebra and probability. Fourier analysis and Z-transforms are also subjects which are usually included in electrical engineering programs. The Laplace transform can simplify computing the behaviour of RLC circuits. Basic applications A number of electrical laws apply to all electrical networks. These include: Faraday's law of induction: any change in the magnetic environment of a coil of wire will cause a voltage (emf) to be "induced" in the coil. Gauss's law: the total electric flux out of a closed surface is equal to the charge enclosed divided by the permittivity. Kirchhoff's current law: the sum of all currents entering a node is equal to the sum of all currents leaving the node; equivalently, the total current at a junction is zero. Kirchhoff's voltage law: the directed sum of the electrical potential differences around a circuit must be zero. Ohm's law: the voltage across a resistor is the product of its resistance and the current flowing through it, at constant temperature. Norton's theorem: any two-terminal collection of voltage sources and resistors is electrically equivalent to an ideal current source in parallel with a single resistor. Thévenin's theorem: any two-terminal combination of voltage sources and resistors is electrically equivalent to a single voltage source in series with a single resistor. Millman's theorem: the voltage on the ends of branches in parallel is equal to the sum of the currents flowing in every branch divided by the total equivalent conductance. See also Analysis of resistive circuits. Circuit analysis is the study of methods to solve linear systems for an unknown variable. Circuit analysis Components There are many electronic components currently used and they all have thei
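As a minimal worked example of two of these laws (the component values are illustrative, chosen by me), Ohm's law and Kirchhoff's voltage law together solve a two-resistor voltage divider:

```python
# Voltage divider: source Vs across R1 in series with R2 (illustrative values).
Vs, R1, R2 = 12.0, 1000.0, 2000.0

# Kirchhoff's voltage law around the loop: Vs = I*R1 + I*R2,
# so the single loop current is
I = Vs / (R1 + R2)

# Ohm's law then gives the voltage across each resistor.
V1, V2 = I * R1, I * R2
print(I, V1, V2)  # 0.004 4.0 8.0
```

Note that V1 + V2 sums back to Vs, which is just Kirchhoff's voltage law restated; the same circuit could equally be reduced via Thévenin's theorem.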
https://en.wikipedia.org/wiki/RCA%20Lyra
Lyra is a series of MP3 and portable media players (PMP). Initially it was developed and sold by Indianapolis-based Thomson Consumer Electronics Inc., a part of Thomson Multimedia, from 1999 under its RCA brand in the United States and under the Thomson brand in Europe. There were also RCA/Thomson PMPs without the Lyra name, such as the RCA Kazoo (RD1000), RCA Opal and RCA Perl. In January 2008, Thomson sold its Consumer Electronics part including the RCA brand and Lyra line to AudioVox. RCA-branded PMPs are still being made today in its domestic market but no longer under the Lyra name. The Lyra was an early pioneer in digital audio players, although in later years most of its output was OEM products. Players Lyra (RD2201/RD2204) The first Lyra was released in 1999 as a CompactFlash (CF) based player. It was sold in two models: the RD2201 with a 32 MB CF card ($199.99 list price), and the RD2204 (sold as the Thomson PDP2201 outside the U.S.) with a 64 MB CF card ($249.99 list price). It was the first MP3 player that could be updated through software downloads. The Lyra was developed in partnership between Thomson Multimedia and RealNetworks - it has integration with the RealJukebox Windows software and, alongside encrypted MP3, can also play Real's G2 format audio files. A later firmware also allows WMA format playback. It has a 1" × 3/4", 6-line, backlit monochrome display which at that time (1999-2000) was relatively large. It has a software-based five-band graphic equalizer, and an external power jack. This series of players requires a proprietary CF reader used in conjunction with specific media players in Windows in order to write files to the card. A supported setup would take a blank CF card, recognize the correct reader attached to the PC, and then, while syncing songs to the device, convert them to an encrypted version of RealAudio, MP3, MP3Pro, and later WMA format that is unrecognizable to any other device. It also drops a folder titled 'Pmp' onto
https://en.wikipedia.org/wiki/Electrorotation
Electrorotation is the circular movement of an electrically polarized particle. Similar to the slip of an electric motor, it can arise from a phase lag between an applied rotating electric field and the respective relaxation processes and may thus be used to investigate the processes or, if these are known or can be accurately described by models, to determine particle properties. The method is popular in cellular biophysics, as it allows measuring cellular properties like conductivity and permittivity of cellular compartments and their surrounding membranes. See also Dielectric relaxation Dielectrophoresis Membrane potential Biophysics Electric and magnetic fields in matter
https://en.wikipedia.org/wiki/COPS%20%28software%29
The Computer Oracle and Password System (COPS) was the first vulnerability scanner for Unix operating systems to achieve widespread use. It was created by Dan Farmer while he was a student at Purdue University. Gene Spafford helped Farmer start the project in 1989. Features COPS is a software suite comprising at least 12 small vulnerability scanners, each programmed to audit one part of the operating system: File permissions, including device permissions/nodes Password strength Content, format, and security of password and group files (e.g., passwd) Programs and files run in /etc/rc* and cron(tab) files Root-SUID files: Which users can modify them? Are they shell scripts? A cyclic redundancy check of important files Writability of users' home directories and startup files Anonymous FTP configuration Unrestricted TFTP, decode alias in sendmail, SUID uudecode problems, hidden shells inside inetd.conf, rexd in inetd.conf Various root checks: Is the current directory in the search path? Is there a plus sign ("+") in the /etc/host.equiv file? Are NFS mounts unrestricted? Is root in /etc/ftpusers? Compare the modification dates of crucial files with dates of advisories from the CERT Coordination Center Kuang expert system After COPS, Farmer developed another vulnerability scanner called SATAN (Security Administrator Tool for Analyzing Networks). COPS is generally considered obsolete, but it is not uncommon to find systems which are set up in an insecure manner that COPS will identify.
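The flavor of these audits is easy to sketch. The toy check below is my own illustration of one of the permission problems COPS looks for (world-writable files), not COPS's actual code, and it assumes POSIX permission semantics:

```python
# COPS-style permission audit: flag files writable by "other".
import os
import stat
import tempfile

def is_world_writable(path):
    """True if the file's permission bits include write access for 'other'."""
    return bool(os.stat(path).st_mode & stat.S_IWOTH)

def audit(paths):
    """One COPS-style pass: report any world-writable files."""
    return [p for p in paths if is_world_writable(p)]

# Demo on a throwaway file.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o646)           # deliberately world-writable
print(audit([path]) == [path])  # True: the file is flagged
os.chmod(path, 0o600)
print(audit([path]) == [])      # True: nothing to report
os.unlink(path)
```

A real scanner would walk whole directory trees and check many more conditions (SUID bits, startup files, configuration contents), but each individual check is this small, which is why COPS could be built as a suite of a dozen tiny tools.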
https://en.wikipedia.org/wiki/UniFLEX
UniFLEX is a Unix-like operating system developed by Technical Systems Consultants (TSC) for the Motorola 6809 family which allowed multitasking and multiprocessing. It was released for DMA-capable 8" floppy, extended memory addressing hardware (software-controlled 4 KiB paging of up to 768 KiB RAM), Motorola 6809 based computers. Examples included machines from SWTPC, Gimix and Goupil (France). On SWTPC machines, UniFLEX also supported a 20 MB, 14" hard drive (OEM'd from Century Data Systems) in 1979. Later on, it also supported larger 14" drives (up to 80 MB), 8" hard drives, and 5-1/4" floppies. In 1982 other machines also supported the first widely available 5-1/4" hard disks using the ST506 interface, such as the 5 MB BASF 6182 and the removable SyQuest SQ306RD of the same capacity. Due to the limited address space of the 6809 (64 kB) and hardware limitations, the main memory space for the UniFLEX kernel as well as for any running process had to be smaller than 56 kB (code + data); processes could be up to 64 kB minus 512 bytes. This was achieved by writing the kernel and most user space code entirely in assembly language, and by removing a few classic Unix features, such as group permissions for files. Otherwise, UniFLEX was very similar to Unix Version 7, though some command names were slightly different. There was no technical reason for the renaming apart from achieving some level of command-level compatibility with its single-user sibling FLEX. By simply restoring the Unix-style names, a considerable degree of "Unix Look & Feel" could be established, though due to memory limitations the command line interpreter (shell) was less capable than the Bourne Shell known from Unix Version 7. Memory management included swapping to a dedicated portion of the system disk (even on floppies), but only whole processes could be swapped in and out, not individual pages. This caused swapping to be a serious drag on system responsiveness, so memory had to be sized appropriate
https://en.wikipedia.org/wiki/Enterprise%20risk%20management
Enterprise risk management (ERM) in business includes the methods and processes used by organizations to manage risks and seize opportunities related to the achievement of their objectives. ERM provides a framework for risk management, which typically involves identifying particular events or circumstances relevant to the organization's objectives (threats and opportunities), assessing them in terms of likelihood and magnitude of impact, determining a response strategy, and monitoring process. By identifying and proactively addressing risks and opportunities, business enterprises protect and create value for their stakeholders, including owners, employees, customers, regulators, and society overall. ERM can also be described as a risk-based approach to managing an enterprise, integrating concepts of internal control, the Sarbanes–Oxley Act, data protection and strategic planning. ERM is evolving to address the needs of various stakeholders, who want to understand the broad spectrum of risks facing complex organizations to ensure they are appropriately managed. Regulators and debt rating agencies have increased their scrutiny on the risk management processes of companies. According to Thomas Stanton of Johns Hopkins University, the point of enterprise risk management is not to create more bureaucracy, but to facilitate discussion on what the really big risks are. ERM frameworks defined There are various important ERM frameworks, each of which describes an approach for identifying, analyzing, responding to, and monitoring risks and opportunities, within the internal and external environment facing the enterprise. Management selects a risk response strategy for specific risks identified and analyzed, which may include: Avoidance: exiting the activities giving rise to risk Reduction: taking action to reduce the likelihood or impact related to the risk Alternative Actions: deciding and considering other feasible steps to minimize risks Share or Insure: transfer
https://en.wikipedia.org/wiki/Auricular%20branch%20of%20vagus%20nerve
The auricular branch of the vagus nerve is often termed the Alderman's nerve or Arnold's nerve. The latter name is an eponym for Friedrich Arnold. The auricular branch of the vagus nerve supplies sensory innervation to the skin of the ear canal, tragus, and auricle. Path It arises from the superior ganglion of the vagus nerve, and is joined soon after its origin by a filament from the petrous ganglion of the glossopharyngeal; it passes behind the internal jugular vein, and enters the mastoid canaliculus on the lateral wall of the jugular fossa. Traversing the substance of the temporal bone, it crosses the facial canal about above the stylomastoid foramen, and here it gives off an ascending branch which joins the facial nerve. The nerve reaches the surface by passing through the tympanomastoid fissure between the mastoid process and the tympanic part of the temporal bone, and divides into two branches: one joins the posterior auricular nerve. the other is distributed to the skin of the back of the ear (auricle) and to the posterior part of the ear canal. Clinical significance This nerve may be involved by the glomus jugulare tumour. Laryngeal cancer can present with pain behind the ear and in the ear - this is a referred pain through the vagus nerve to the nerve of Arnold. In a small portion of individuals, the auricular nerve is the afferent limb of the Ear-Cough or Arnold Reflex. Physical stimulation of the external acoustic meatus innervated by the auricular nerve elicits a cough, much like the other cough reflexes associated with the vagus nerve. Rarely, on introduction of speculum in the external ear, patients have experienced syncope due to the stimulation of the auricular branch of the vagus nerve. Clinical application This nerve may be stimulated as a diagnostic or therapeutic technique Transcutaneous vagus nerve stimulation (tVNS) was proposed by Ventureya (2000) for seizures. In 2003 Fallgatter et al. published "Far field potentials from the br
https://en.wikipedia.org/wiki/Valentin%20Turchin
Valentin Fyodorovich Turchin (, 14 February 1931 in Podolsk – 7 April 2010 in Oakland, New Jersey) was a Soviet and American physicist, cybernetician, and computer scientist. He developed the Refal programming language, the theory of metasystem transitions and the notion of supercompilation. He was a pioneer in artificial intelligence and a proponent of the global brain hypothesis. Biography Turchin was born in 1931 in Podolsk, Soviet Union. In 1952, he graduated from Moscow University in theoretical physics, and got his Ph.D. in 1957. After working on neutron and solid-state physics at the Institute for Physics of Energy in Obninsk, in 1964 he accepted a position at the Keldysh Institute of Applied Mathematics in Moscow. There he worked on statistical regularization methods and authored Refal, one of the first AI languages and the AI language of choice in the Soviet Union. In the 1960s, Turchin became politically active. In Fall 1968, he wrote the pamphlet The Inertia of Fear, which circulated quite widely in samizdat; from 1976 it circulated in Moscow under the title The Inertia of Fear: Socialism and Totalitarianism. Following its publication in the underground press, he lost his research laboratory. In 1970 he authored "The Phenomenon of Science", a grand cybernetic meta-theory of universal evolution, which broadened and deepened the earlier work. By 1973, Turchin had founded the Moscow chapter of Amnesty International with Andrey Tverdokhlebov and was working closely with the well-known physicist and Soviet dissident Andrei Sakharov. In 1974 he lost his position at the Institute and was persecuted by the KGB. Facing almost certain imprisonment, he and his family were forced to emigrate from the Soviet Union in 1977. He went to New York, where he joined the faculty of the City University of New York in 1979. In 1990, together with Cliff Joslyn and Francis Heylighen, he founded the Principia Cybernetica Project, a worldwide organiz
https://en.wikipedia.org/wiki/Hydroxypropyl%20cellulose
Hydroxypropyl cellulose (HPC) is a derivative of cellulose with both water solubility and organic solubility. It is used as an excipient, and as a topical ophthalmic protectant and lubricant. Chemistry HPC is an ether of cellulose in which some of the hydroxyl groups in the repeating glucose units have been hydroxypropylated, forming -OCH2CH(OH)CH3 groups, using propylene oxide. The average number of substituted hydroxyl groups per glucose unit is referred to as the degree of substitution (DS). Complete substitution would provide a DS of 3. Because the hydroxypropyl group added contains a hydroxyl group, this can also be etherified during preparation of HPC. When this occurs, the number of moles of hydroxypropyl groups per glucose ring, the moles of substitution (MS), can be higher than 3. Because cellulose is very crystalline, HPC must have an MS of about 4 in order to reach good solubility in water. HPC has a combination of hydrophobic and hydrophilic groups, so it has a lower critical solution temperature (LCST) at 45 °C. At temperatures below the LCST, HPC is readily soluble in water; above the LCST, HPC is not soluble. HPC forms liquid crystals and many mesophases according to its concentration in water. Such mesophases include isotropic, anisotropic, nematic and cholesteric. The last one gives many colors such as violet, green and red. Uses Lacrisert, manufactured by Aton Pharma, is a formulation of HPC used for artificial tears. It is used to treat medical conditions characterized by insufficient tear production such as keratoconjunctivitis sicca, recurrent corneal erosions, decreased corneal sensitivity, and exposure and neuroparalytic keratitis. HPC is also used as a lubricant for artificial eyes. HPC is used as a thickener, a low-level binder and as an emulsion stabiliser with E number E463. In pharmaceuticals it is used as a binder in tablets. HPC is used as a sieving matrix for DNA separations by capillary and microchip electrophoresis. HPC is used as
https://en.wikipedia.org/wiki/Windows%20Server%202008
Windows Server 2008, codenamed "Longhorn Server", is the fourth release of the Windows Server operating system produced by Microsoft as part of the Windows NT family of operating systems. It was released to manufacturing on February 4, 2008, and generally to retail on February 27, 2008. Derived from Windows Vista, Windows Server 2008 is the successor of Windows Server 2003 and the predecessor to Windows Server 2008 R2. Windows Server 2008 removed support for processors without ACPI. It is the first version of Windows Server that includes Hyper-V and is also the final version of Windows Server that supports IA-32-based processors (also known as 32-bit processors). Its successor, Windows Server 2008 R2, requires a 64-bit processor in any supported architecture (x86-64 for x86 and Itanium). As of July 2019, 60% of Windows servers were running Windows Server 2008. History Microsoft had released Windows Vista to mixed reception, and their last Windows Server release was based on Windows XP. The operating system's working title was Windows Server Codename "Longhorn", but was later changed to Windows Server 2008 when Microsoft chairman Bill Gates announced it during his keynote address at WinHEC on May 16, 2007. Beta 1 was released on July 27, 2005; Beta 2 was announced and released on May 23, 2006, at WinHEC 2006; and Beta 3 was released publicly on April 25, 2007. Release Candidate 0 was released to the general public on September 24, 2007, and Release Candidate 1 was released to the general public on December 5, 2007. Windows Server 2008 was released to manufacturing on February 4, 2008, and officially launched on the 27th of that month. Features Windows Server 2008 is built from the same codebase as Windows Vista and thus shares much of the same architecture and functionality. Since the codebase is common, Windows Server 2008 inherits most of the technical, security, management and administrative features new to Windows Vista, such as the rewritten networki
https://en.wikipedia.org/wiki/ThreadX
Azure RTOS ThreadX is a highly deterministic, embedded real-time operating system (RTOS) programmed mostly in the language C. Overview ThreadX was originally developed and marketed by Express Logic of San Diego, California, United States. The author of ThreadX is William Lamie, who was also the original author of the Nucleus RTOS in 1990. William Lamie was President and CEO of Express Logic. Express Logic was purchased for an undisclosed sum by Microsoft on April 18, 2019. The name ThreadX is derived from the threads that are used as the executable elements, and the letter X represents context switching, i.e., it switches threads. ThreadX provides priority-based, preemptive scheduling, fast interrupt response, memory management, interthread communication, mutual exclusion, event notification, and thread synchronization features. Major distinguishing technology characteristics of ThreadX include preemption-threshold, priority inheritance, efficient timer management, fast software timers, picokernel design, event-chaining, and small size: minimal size on an ARM architecture processor is about 2 KB. ThreadX supports multi-core processor environments via either asymmetric multiprocessing (AMP) or symmetric multiprocessing (SMP). Application thread isolation with memory management unit (MMU) or memory protection unit (MPU) memory protection is available with ThreadX Modules. ThreadX has extensive safety certifications from Technischer Überwachungsverein (TÜV, English: Technical Inspection Association) and UL (formerly Underwriters Laboratories) and is Motor Industry Software Reliability Association MISRA C compliant. ThreadX is the foundation of Express Logic's X-Ware Internet of things (IoT) platform, which also includes embedded file system support (FileX), embedded UI support (GUIX), embedded Internet protocol suite (TCP/IP) and cloud connectivity (NetX/NetX Duo), and Universal Serial Bus (USB) support (USBX). ThreadX has won high appraisal from developers and i
https://en.wikipedia.org/wiki/List%20of%20proteins
Proteins are a class of macromolecular organic compounds that are essential to life. They consist of a long polypeptide chain that usually adopts a single stable three-dimensional structure. They fulfill a wide variety of functions, including providing structural stability to cells, catalyzing chemical reactions that produce or store energy or synthesize other biomolecules including nucleic acids and proteins, transporting essential nutrients, and serving other roles such as signal transduction. They are selectively transported to various compartments of the cell or, in some cases, secreted from the cell. This list aims to organize information on how proteins are most often classified: by structure, by function, or by location. Structure Proteins may be classified as to their three-dimensional structure (also known as the protein fold). The two most widely used classification schemes are: CATH database Structural Classification of Proteins database (SCOP) Both classification schemes are based on a hierarchy of fold types. At the top level are all alpha proteins (domains consisting of alpha helices), all beta proteins (domains consisting of beta sheets), and mixed alpha helix/beta sheet proteins. While most proteins adopt a single stable fold, a few proteins can rapidly interconvert between one or more folds. These are referred to as metamorphic proteins. Finally, other proteins appear not to adopt any stable conformation and are referred to as intrinsically disordered. Proteins frequently contain two or more domains, each having a different fold, separated by intrinsically disordered regions. These are referred to as multi-domain proteins. Function Proteins may also be classified based on their cellular function. A widely used classification is the PANTHER (protein analysis through evolutionary relationships) classification system. Structural: structural proteins. Catalytic: enzymes, classified according to their Enzyme Commission number (EC). Note that strictly speaki
https://en.wikipedia.org/wiki/David%20E.%20Shaw
David Elliot Shaw (born March 29, 1951) is an American billionaire scientist and former hedge fund manager. He founded D. E. Shaw & Co., a hedge fund company which was once described by Fortune magazine as "the most intriguing and mysterious force on Wall Street". A former assistant professor in the computer science department at Columbia University, Shaw made his fortune exploiting inefficiencies in financial markets with the help of state-of-the-art high speed computer networks. In 1996, Fortune magazine referred to him as "King Quant" because of his firm's pioneering role in high-speed quantitative trading. In 2001, Shaw turned to full-time scientific research in computational biochemistry, more specifically molecular dynamics simulations of proteins. Early life and education Shaw was raised in Los Angeles, California. His father was a theoretical physicist who specialised in plasma and fluid flows, and his mother is an artist and educator. They divorced when he was 12. His stepfather, Irving Pfeffer, was professor of finance at University of California, Los Angeles, and the author of papers supporting the efficient market hypothesis. Shaw earned a bachelor's degree summa cum laude from the University of California, San Diego, a PhD from Stanford University in 1980, and then became an assistant professor of the department of computer science at Columbia University. While at Columbia, Shaw conducted research in massively parallel computing with the NON-VON supercomputer. This supercomputer was composed of processing elements in a tree structure meant to be used for fast relational database searches. Earlier in his career, he founded Stanford Systems Corporation. Investment career In 1986, he joined Morgan Stanley, as Vice President for Technology in Nunzio Tartaglia's automated proprietary trading group. In 1994, Shaw was appointed by President Clinton to the President's Council of Advisors on Science and Technology, where he was chairman of the Panel on Educat
https://en.wikipedia.org/wiki/Caml
Caml (originally an acronym for Categorical Abstract Machine Language) is a multi-paradigm, general-purpose programming language which is a dialect of the ML programming language family. Caml was developed in France at INRIA and ENS. Caml is statically typed, strictly evaluated, and uses automatic memory management. OCaml, the main descendant of Caml, adds many features to the language, including an object layer. Examples In the following, # represents the Caml prompt. Hello World print_endline "Hello, world!";; Factorial function (recursion and purely functional programming) Many mathematical functions, such as factorial, are most naturally represented in a purely functional form. The following recursive, purely functional Caml function implements factorial: let rec fact n = if n=0 then 1 else n * fact(n - 1);; The function can be written equivalently using pattern matching: let rec fact = function | 0 -> 1 | n -> n * fact(n - 1);; This latter form is the mathematical definition of factorial as a recurrence relation. Note that the compiler inferred the type of this function to be int -> int, meaning that this function maps ints onto ints. For example, 12! is: # fact 12;; - : int = 479001600 Numerical derivative (higher-order functions) Since Caml is a functional programming language, it is easy to create and pass around functions in Caml programs. This capability has an enormous number of applications. Calculating the numerical derivative of a function is one such application. The following Caml function computes the numerical derivative of a given function f at a given point x: let d delta f x = (f (x +. delta) -. f (x -. delta)) /. (2. *. delta);; This function requires a small value delta. A good choice for delta is the cube root of the machine epsilon. The type of the function indicates that it maps a float onto another function with the type (float -> float) -> float -> float. This allows us to partially apply arguments. This functional style is known as currying. In this case, it is useful to part
https://en.wikipedia.org/wiki/Operation%20RAFTER
RAFTER was a code name for the MI5 radio receiver detection technique, mostly used against clandestine Soviet agents and monitoring of domestic radio transmissions by foreign embassy personnel from the 1950s on. Explanation Most radio receivers of the period were of the AM superhet design, with local oscillators which generate a signal typically 455 kHz above or sometimes below the frequency to be received. There is always some oscillator radiation leakage from such receivers, and in the initial stages of RAFTER, MI5 simply attempted to locate clandestine receivers by detecting the leaked signal with a sensitive custom-built receiver. This was complicated by domestic radios in people's homes also leaking radiation. By accident, one such receiver for MI5 mobile radio transmissions was being monitored when a passing transmitter produced a powerful signal which overloaded the receiver, producing an audible change in the received signal. The agency realized that they could identify the actual frequency being monitored if they produced their own transmissions and listened for the change in the superhet tone. Soviet transmitters Soviet short-wave transmitters were extensively used to broadcast messages to clandestine agents, the transmissions consisting simply of number sequences read aloud and decoded using a one-time pad. It was realized that this new technique could be used to track down such agents. Specially equipped aircraft would fly over urban areas at times when the agents were receiving Soviet transmissions, and attempt to locate receivers tuned to the transmissions. Tactics Like many secret technologies, RAFTER's use was attended by the fear of over-use, alerting the quarry and causing a shift in tactics which would neutralize the technology. As a technical means of intelligence, it was also not well supported by the more traditional factions in MI5. Its part in the successes and failures of MI5 at the time is not entirely known. In his book Spycatcher,
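The arithmetic behind the technique is simple: a superhet receiver tuned to frequency f leaks its local oscillator at roughly f + 455 kHz (or f − 455 kHz for low-side injection), so a detected leak narrows the monitored frequency to two candidates. A sketch with illustrative numbers of my own:

```python
# Inferring what frequency a superhet receiver is tuned to from its
# detected local-oscillator (LO) leakage. Frequencies are illustrative.

IF_HZ = 455_000  # standard AM intermediate frequency

def tuned_frequencies(lo_leak_hz, if_hz=IF_HZ):
    """Candidate monitored frequencies for a detected LO leak:
    high-side injection (LO above the signal) and low-side (LO below)."""
    return (lo_leak_hz - if_hz, lo_leak_hz + if_hz)

# An LO leak detected at 7.555 MHz implies the set is tuned to
# 7.100 MHz (high-side injection) or 8.010 MHz (low-side).
print(tuned_frequencies(7_555_000))  # (7100000, 8010000)
```

The second half of RAFTER resolved this ambiguity actively: transmitting on a candidate frequency and listening for the audible change in the superhet tone confirmed which frequency was actually being monitored.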
https://en.wikipedia.org/wiki/Eckert%20number
The Eckert number (Ec) is a dimensionless number used in continuum mechanics. It expresses the relationship between a flow's kinetic energy and the boundary layer enthalpy difference, and is used to characterize heat transfer dissipation. It is named after Ernst R. G. Eckert. It is defined as Ec = u^2 / (cp ΔT), where u is the local flow velocity of the continuum, cp is the constant-pressure local specific heat of the continuum, and ΔT is the difference between wall temperature and local temperature.
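The Eckert number Ec = u²/(cp ΔT) is straightforward to evaluate; a small numeric check with illustrative values roughly matching air:

```python
def eckert(u, cp, delta_t):
    """Ec = u^2 / (cp * (T_wall - T_local)); all quantities in SI units."""
    return u**2 / (cp * delta_t)

# Air at 50 m/s, cp ~ 1005 J/(kg K), 20 K wall-to-flow temperature difference.
ec = eckert(u=50.0, cp=1005.0, delta_t=20.0)
print(round(ec, 4))  # 0.1244
```

A value well below 1, as here, indicates that viscous dissipation contributes little compared with the enthalpy difference across the boundary layer.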
https://en.wikipedia.org/wiki/Aaargh%21
Aaargh! is a single-player action video game in which the player controls a giant monster with the goal of obtaining eggs by destroying buildings in different cities across a lost island. It was designed for Mastertronic's Arcadia Systems, an arcade machine based on the custom hardware of the Amiga, and was released in 1987. It was ported to a range of other platforms and released on these across 1988 and 1989. Electronic Arts distributed the Amiga version of the game. Gameplay The goal of the game is to find the golden dragon's egg. The player controls one of two monsters who must destroy buildings in order to find Roc eggs, the discovery of each of which triggers a fight with a rival monster. When five eggs are found, the two monsters fight on a volcano to claim the dragon's egg. The game is an action game with fighting game elements. The player chooses to play as either a dragon-like lizard or an ogre (depicted as a cyclops in the game); the character that the player does not select becomes the player's rival to obtain the egg. In the arcade version of the game either one or two players could play simultaneously, whereas on the ports only one player could play at a time. Gameplay takes place across the ten cities of the Lost Island, each representing a different era of civilisation (such as ancient Egypt and the Wild West) and each comprising one level of the game. Each city is represented by a single static playing area that uses a form of 2.5D projection in order to give the impression of depth on the screen. Reception The game received mixed reviews from gaming magazines across the platforms to which it was ported, with scores ranging from around 2/10 (or equivalent) up to almost 9/10. While reviewers praised the graphics and sound, particularly on the Amiga port, they criticised the gameplay. ACE magazine said that although the game had "good graphics, atmospheric sound and good gameplay" there was not enough challenge to the game and that players woul
https://en.wikipedia.org/wiki/Matter%20creation
Even restricting the discussion to physics, scientists do not have a unique definition of what matter is. In the currently known particle physics, summarised by the standard model of elementary particles and interactions, it is possible to distinguish in an absolute sense particles of matter and particles of antimatter. This is particularly easy for those particles that carry electric charge, such as electrons, protons or quarks, while the distinction is more subtle in the case of neutrinos, fundamental elementary particles that do not carry electric charge. In the standard model, it is not possible to create a net amount of matter particles; more precisely, it is not possible to change the net number of leptons or of quarks in any perturbative reaction among particles. This remark is consistent with all existing observations. However, similar processes are not considered to be impossible and are expected in other models of the elementary particles that extend the standard model. They are necessary in speculative theories that aim to explain the cosmic excess of matter over antimatter, such as leptogenesis and baryogenesis. They could even manifest themselves in the laboratory, as proton decay or as creation of electrons in the so-called neutrinoless double beta decay. The latter case occurs if the neutrinos are Majorana particles, being at the same time matter and antimatter, according to the definition given just above. In a wider sense, one can use the word matter simply to refer to fermions. In this sense, matter and antimatter particles (such as an electron and a positron) are identified beforehand. The process inverse to particle annihilation can be called matter creation; more precisely, we are considering here the process obtained under time reversal of the annihilation process. This process is also known as pair production, and can be described as the conversion of light particles (i.e., photons) into one or more massive particles. The most common and w
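For the electron-positron case, the threshold follows from requiring the photon energy to exceed the rest energy of the created pair, E ≥ 2mₑc² ≈ 1.022 MeV (pair production occurs near a nucleus, which absorbs the recoil momentum). A quick check with CODATA constants:

```python
# Threshold energy for photon -> electron + positron pair production.
M_E = 9.109_383_7015e-31   # electron mass, kg (CODATA)
C = 2.997_924_58e8         # speed of light, m/s (exact)
EV = 1.602_176_634e-19     # joules per electron-volt (exact)

# E_threshold = 2 * m_e * c^2, converted from joules to MeV.
threshold_mev = 2 * M_E * C**2 / EV / 1e6
print(round(threshold_mev, 3))  # 1.022
```

Any photon energy above this threshold goes into the kinetic energy of the created pair, which is why pair production dominates photon interactions only at MeV-scale energies.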
https://en.wikipedia.org/wiki/Fleur%20de%20Lys%20%28superhero%29
Fleur de Lys is a superheroine from Quebec and an ally of Northguard, created in 1984 by Mark Shainblum and Gabriel Morrissette in the comic New Triumph featuring Northguard. The name of the character is inspired by the heraldic symbol of the fleur de lys, which is the official emblem of Quebec and a prominent part of the Flag of Quebec. The character was honored with a Canadian postage stamp in 1995, with fellow superheroes Superman, Nelvana of the Northern Lights, Johnny Canuck and Captain Canuck. Fleur de Lys uses a fleur-de-lys-shaped, non-lethal light saber to vanquish her foes. The character's civilian identity is martial-arts expert Manon Deschamps, from Quebec. In 2010, Fleur de Lys was featured in an animated web series, Heroes of the North, where it was revealed that the Quebecois heroine has brothers who are separatist terrorists.
https://en.wikipedia.org/wiki/Internal%20conversion%20%28chemistry%29
Internal conversion is a transition from a higher to a lower electronic state in a molecule or atom. It is sometimes called "radiationless de-excitation", because no photons are emitted. It differs from intersystem crossing in that, while both are radiationless methods of de-excitation, the molecular spin state remains the same in internal conversion, whereas it changes in intersystem crossing. The energy of the electronically excited state is given off to vibrational modes of the molecule; the excitation energy is transformed into heat. Examples A classic example of this process is the fluorescence of quinine sulfate, which can be quenched by various halide salts. The excited molecule can de-excite by increasing the thermal energy of the surrounding solvated ions. Several natural molecules perform fast internal conversion. The ability to transform the excitation energy of a photon into heat can be a crucial property for photoprotection by molecules such as melanin. Fast internal conversion shortens the excited-state lifetime and thereby prevents bimolecular reactions. Bimolecular electron transfer always produces reactive chemical species, namely free radicals. Nucleic acids (more precisely, single free nucleotides, not those bound in a DNA/RNA strand) have an extremely short excited-state lifetime due to fast internal conversion. Both melanin and DNA have some of the fastest internal conversion rates. In applications that make use of bimolecular electron transfer, internal conversion is undesirable. For example, it is advantageous to have long-lived excited states in Grätzel cells (dye-sensitized solar cells). See also Fluorescence spectroscopy Förster resonance energy transfer
https://en.wikipedia.org/wiki/Mutation%20testing
Mutation testing (or mutation analysis or program mutation) is used to design new software tests and evaluate the quality of existing software tests. Mutation testing involves modifying a program in small ways. Each mutated version is called a mutant, and a test detects and rejects a mutant when the behaviour of the mutant differs from that of the original version. This is called killing the mutant. Test suites are measured by the percentage of mutants that they kill. New tests can be designed to kill additional mutants. Mutants are based on well-defined mutation operators that either mimic typical programming errors (such as using the wrong operator or variable name) or force the creation of valuable tests (such as dividing each expression by zero). The purpose is to help the tester develop effective tests or locate weaknesses in the test data used for the program or in sections of the code that are seldom or never accessed during execution. Mutation testing is a form of white-box testing. Introduction Most of this article is about "program mutation", in which the program is modified. A more general definition of mutation analysis is using well-defined rules defined on syntactic structures to make systematic changes to software artifacts. Mutation analysis has been applied to other problems, but is usually applied to testing. So mutation testing is defined as using mutation analysis to design new software tests or to evaluate existing software tests. Thus, mutation analysis and testing can be applied to design models, specifications, databases, tests, XML, and other types of software artifacts, although program mutation is the most common. Overview Tests can be created to verify the correctness of the implementation of a given software system, but the creation of tests still poses the question whether the tests are correct and sufficiently cover the requirements that have originated the implementation. (This technological problem is itself an instance of a deeper phi
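The idea of killing a mutant can be shown in a few lines. The sketch below is hypothetical (all names are invented for illustration): a relational-operator mutant of a small predicate, a weak test suite that the mutant survives, and a boundary-value test that kills it.

```python
def is_adult(age):
    """Original implementation: adulthood starts at 18."""
    return age >= 18

def is_adult_mutant(age):
    """Mutant: '>=' replaced by '>', a typical operator mutation."""
    return age > 18

def suite_passes(impl, cases):
    """A test suite is a list of (input, expected) pairs;
    a suite 'kills' a mutant when the mutant fails at least one case."""
    return all(impl(x) == expected for x, expected in cases)

weak_suite = [(17, False), (30, True)]    # misses the boundary value
strong_suite = weak_suite + [(18, True)]  # exercises the boundary, kills the mutant
```

Note that the mutant survives the weak suite precisely because no test exercises the boundary input 18; mutation testing surfaces that gap mechanically, without anyone having to inspect the suite by hand.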
https://en.wikipedia.org/wiki/Hyperbolic%20growth
When a quantity grows towards a singularity under a finite variation (a "finite-time singularity") it is said to undergo hyperbolic growth. More precisely, the reciprocal function x ↦ 1/x has a hyperbola as a graph, and has a singularity at 0, meaning that the limit as x → 0 is infinite: any similar graph is said to exhibit hyperbolic growth. Description If the output of a function is inversely proportional to its input, or inversely proportional to the difference between its input and a given value x0, as in 1/(x0 − x), the function will exhibit hyperbolic growth, with a singularity at x0. In the real world hyperbolic growth is created by certain non-linear positive feedback mechanisms. Comparisons with other growth Like exponential growth and logistic growth, hyperbolic growth is highly nonlinear, but differs in important respects. These functions can be confused, as exponential growth, hyperbolic growth, and the first half of logistic growth are convex functions; however their asymptotic behavior (behavior as input gets large) differs dramatically: logistic growth is constrained (it has a finite limit, even as time goes to infinity), exponential growth grows to infinity as time goes to infinity (but is always finite for finite time), and hyperbolic growth has a singularity in finite time (it grows to infinity at a finite time). Applications Population Certain mathematical models suggest that until the early 1970s the world population underwent hyperbolic growth (see, e.g., Introduction to Social Macrodynamics by Andrey Korotayev et al.). It was also shown that until the 1970s the hyperbolic growth of the world population was accompanied by quadratic-hyperbolic growth of the world GDP, and researchers developed a number of mathematical models describing both this phenomenon and the World System's withdrawal from the blow-up regime observed in recent decades. The hyperbolic growth of the world population and quadratic-hyperbolic growth of the world GDP observed till the 1970s have been correlated by Andrey Korotayev and his
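The contrast between a finite-time singularity and ordinary exponential growth can be sketched numerically. In the sketch below the critical time t_c = 10 is an arbitrary illustrative choice:

```python
import math

T_C = 10.0  # illustrative critical time of the singularity

def hyperbolic(t):
    """x(t) = 1/(t_c - t): blows up as t approaches t_c from below."""
    return 1.0 / (T_C - t)

def exponential(t):
    """Exponential growth: finite for every finite t."""
    return math.exp(t)

# Hyperbolic growth diverges near t_c, while exp(t) is still finite there.
samples = [hyperbolic(t) for t in (0.0, 5.0, 9.0, 9.9, 9.99)]
```

Evaluating `hyperbolic` ever closer to t_c produces arbitrarily large values at finite times, whereas `exponential(T_C)` is merely a (large) finite number; this is exactly the asymptotic distinction drawn in the paragraph above.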
https://en.wikipedia.org/wiki/Physics%20Today
Physics Today is the membership magazine of the American Institute of Physics. First published in May 1948, it is issued on a monthly schedule and provided to the members of ten physics societies, including the American Physical Society. It is also available to non-members as a paid annual subscription. The magazine informs readers about important developments through overview articles written by experts and shorter review articles written internally by staff, and also discusses issues and events of importance to the science community in politics, education, and other fields. The magazine provides a historical record of events associated with physics. For example, it discussed debunking the physics of the Star Wars program of the 1980s, and the state of physics in China and the Soviet Union during the 1950s and 1970s. According to the Journal Citation Reports, the journal has a 2017 impact factor of 4.370.
https://en.wikipedia.org/wiki/Algebra%20of%20physical%20space
In physics, the algebra of physical space (APS) is the use of the Clifford or geometric algebra Cl3,0(R) of the three-dimensional Euclidean space as a model for (3+1)-dimensional spacetime, representing a point in spacetime via a paravector (3-dimensional vector plus a 1-dimensional scalar). The Clifford algebra Cl3,0(R) has a faithful representation, generated by Pauli matrices, on the spin representation C2; further, Cl3,0(R) is isomorphic to the even subalgebra of the Clifford algebra Cl3,1(R). APS can be used to construct a compact, unified and geometrical formalism for both classical and quantum mechanics. APS should not be confused with spacetime algebra (STA), which concerns the Clifford algebra Cl1,3(R) of the four-dimensional Minkowski spacetime. Special relativity Spacetime position paravector In APS, the spacetime position is represented as the paravector x = x0 + x1 e1 + x2 e2 + x3 e3, where the time is given by the scalar part x0 = t, and e1, e2, e3 are the standard basis for position space. Throughout, units such that c = 1 are used, called natural units. In the Pauli matrix representation, the unit basis vectors are replaced by the Pauli matrices and the scalar part by the identity matrix. This means that the Pauli matrix representation of the space-time position is the Hermitian 2×2 matrix x0 I + x1 σ1 + x2 σ2 + x3 σ3. Lorentz transformations and rotors The restricted Lorentz transformations that preserve the direction of time and include rotations and boosts can be performed by an exponentiation of the spacetime rotation biparavector W, giving the Lorentz rotor L = exp(W/2). In the matrix representation, the Lorentz rotor is seen to form an instance of the SL(2,C) group (special linear group of degree 2 over the complex numbers), which is the double cover of the Lorentz group. The unimodularity of the Lorentz rotor is translated into the condition L L̄ = 1, where L̄ denotes the Clifford conjugate of the Lorentz rotor. This Lorentz rotor can always be decomposed into two factors, one Hermitian (B† = B) and the other unitary (R† R = 1), such that L = BR. The unitary element R is cal
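The Pauli matrix representation of the position paravector can be checked numerically: the determinant of the 2×2 matrix reproduces the Minkowski interval t² − x² − y² − z². The sketch below uses NumPy and an arbitrary sample point, chosen only for illustration:

```python
import numpy as np

# Pauli matrices: the matrix representation of the basis vectors e1, e2, e3
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def paravector(t, x, y, z):
    """Spacetime point as a 2x2 Hermitian matrix: t*I + x*s1 + y*s2 + z*s3 (c = 1)."""
    return t * I2 + x * s1 + y * s2 + z * s3

# det reproduces the Minkowski interval t^2 - x^2 - y^2 - z^2
X = paravector(2.0, 1.0, 0.5, 0.25)
interval = np.linalg.det(X).real  # 4 - 1 - 0.25 - 0.0625 = 2.6875
```

This is the same mechanism by which SL(2,C) matrices L act as Lorentz transformations: the map X ↦ L X L† preserves det(X), and hence the interval.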
https://en.wikipedia.org/wiki/Dipole%20anisotropy
Dipole anisotropy is a form of anisotropy: a progressive difference in the frequency of radiation arriving from opposite directions, due to the motion of the observer relative to the source. As a result of that motion, radiation from the direction of motion is blueshifted, whereas radiation from the opposite direction is redshifted. The effect is notable in measurements of the cosmic microwave background radiation, due to the motion of the Earth.
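The size of the CMB dipole can be estimated from the first-order Doppler formula T(θ) = T0 (1 + (v/c) cos θ). The sketch below assumes the commonly quoted solar velocity of roughly 370 km/s relative to the CMB rest frame; the names are illustrative:

```python
import math

C = 299_792_458.0   # speed of light, m/s
T_CMB = 2.725       # mean CMB temperature, K
V_SUN = 370e3       # approximate solar speed relative to the CMB frame, m/s

def dipole_temperature(theta_rad, v=V_SUN):
    """First-order Doppler dipole: T(theta) = T0 * (1 + (v/c) * cos(theta))."""
    return T_CMB * (1 + (v / C) * math.cos(theta_rad))

# Amplitude of the dipole: a few millikelvin on top of the 2.725 K mean
amplitude = dipole_temperature(0.0) - T_CMB
```

The resulting amplitude of a few millikelvin is roughly a hundred times larger than the intrinsic CMB anisotropies, which is why the dipole must be subtracted before the cosmological signal can be studied.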
https://en.wikipedia.org/wiki/Natta%20projection
In chemistry, the Natta projection (named for Italian chemist Giulio Natta) is a way to depict molecules with complete stereochemistry in two dimensions in a skeletal formula. In a hydrocarbon molecule with all carbon atoms making up the backbone in a tetrahedral molecular geometry, the zigzag backbone is in the paper plane (chemical bonds depicted as solid line segments) with the substituents either sticking out of the paper toward the viewer (chemical bonds depicted as solid wedges) or away from the viewer (chemical bonds depicted as dashed wedges). The Natta projection is useful for representing the tacticity of a polymer. See also Structural formula Wedge-and-dash notation in skeletal formulas Haworth projection Newman projection Fischer projection
https://en.wikipedia.org/wiki/DirectX%20Video%20Acceleration
DirectX Video Acceleration (DXVA) is a Microsoft API specification for the Microsoft Windows and Xbox 360 platforms that allows video decoding to be hardware-accelerated. The pipeline allows certain CPU-intensive operations such as iDCT, motion compensation and deinterlacing to be offloaded to the GPU. DXVA 2.0 allows more operations, including video capturing and processing operations, to be hardware-accelerated as well. DXVA works in conjunction with the video rendering model used by the video card. DXVA 1.0, which was introduced as a standardized API with Windows 2000 and is currently available on Windows 98 or later, can use either the overlay rendering mode or VMR 7/9. DXVA 2.0, available only on Windows Vista, Windows 7, Windows 8 and later operating systems, integrates with Media Foundation (MF) and uses the Enhanced Video Renderer (EVR) present in MF. Overview DXVA is used by software video decoders to define a codec-specific pipeline for hardware-accelerated decoding and rendering of the codec. The pipeline starts at the CPU, which is used for parsing the media stream and conversion to DXVA-compatible structures. DXVA specifies a set of operations that can be hardware-accelerated, along with device driver interfaces (DDIs) that the graphics driver can implement to accelerate the operations. If the codec needs to perform any of the defined operations, it can use these interfaces to access the hardware-accelerated implementation of those operations. If the graphics driver does not implement one or more of the interfaces, it is up to the codec to provide a software fallback. The decoded video is handed over to the hardware video renderer, where further video post-processing might be applied before it is rendered to the device. The resulting pipeline is usable in a DirectShow-compatible application. DXVA specifies the Motion Compensation DDI, which specifies the interfaces for iDCT operations, Huffman coding, motion compensation, alpha blending, inverse quantization, col
https://en.wikipedia.org/wiki/GPRS%20Tunnelling%20Protocol
GPRS Tunnelling Protocol (GTP) is a group of IP-based communications protocols used to carry general packet radio service (GPRS) within GSM, UMTS, LTE and 5G NR radio networks. In 3GPP architectures, GTP and Proxy Mobile IPv6 based interfaces are specified on various interface points. GTP can be decomposed into separate protocols: GTP-C, GTP-U and GTP'. GTP-C is used within the GPRS core network for signaling between gateway GPRS support nodes (GGSN) and serving GPRS support nodes (SGSN). This allows the SGSN to activate a session on a user's behalf (PDP context activation), to deactivate the same session, to adjust quality of service parameters, or to update a session for a subscriber who has just arrived from another SGSN. GTP-U is used for carrying user data within the GPRS core network and between the radio access network and the core network. The user data transported can be packets in any of IPv4, IPv6, or PPP formats. GTP' (GTP prime) uses the same message structure as GTP-C and GTP-U, but has an independent function. It can be used for carrying charging data from the charging data function (CDF) of the GSM or UMTS network to the charging gateway function (CGF). In most cases, this means carrying charging data from many individual network elements, such as the GGSNs, to a centralized computer that delivers the charging data more conveniently to the network operator's billing center. Different GTP variants are implemented by RNCs, SGSNs, GGSNs and CGFs within 3GPP networks. GPRS mobile stations (MSs) are connected to an SGSN without being aware of GTP. GTP can be used with UDP or TCP. UDP is either recommended or mandatory, except for tunnelling X.25 in version 0. GTP version 1 is used only on UDP. General features All variants of GTP have certain features in common. The structure of the messages is the same, with a GTP header following the UDP/TCP header. Header GTP version 1 GTPv1 headers contain the following fields: Version: a 3-bit field. For GTPv1, this
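The fixed, mandatory part of the GTPv1 header (3-bit version, protocol-type bit, E/S/PN flags, message type, length, and TEID in the first 8 bytes) can be parsed with a short sketch; the helper name and sample bytes below are invented for illustration:

```python
import struct

def parse_gtpv1_header(data: bytes) -> dict:
    """Parse the 8-byte mandatory GTPv1 header (illustrative helper)."""
    if len(data) < 8:
        raise ValueError("GTPv1 mandatory header is 8 bytes")
    flags, msg_type, length = struct.unpack_from("!BBH", data, 0)
    (teid,) = struct.unpack_from("!I", data, 4)
    return {
        "version": flags >> 5,              # 3-bit version field (1 for GTPv1)
        "protocol_type": (flags >> 4) & 1,  # 1 = GTP, 0 = GTP'
        "has_extension": bool(flags & 0x04),
        "has_seq_num": bool(flags & 0x02),
        "has_npdu_num": bool(flags & 0x01),
        "message_type": msg_type,
        "length": length,                   # payload length after the 8-byte header
        "teid": teid,                       # tunnel endpoint identifier
    }

# Sample: a G-PDU (message type 0xFF) carrying 4 bytes of user data, TEID 0x12345678
hdr = parse_gtpv1_header(bytes([0x30, 0xFF, 0x00, 0x04, 0x12, 0x34, 0x56, 0x78]))
```

When the E, S, or PN flag is set, four further optional bytes (sequence number, N-PDU number, next-extension-header type) follow the mandatory header; the sketch stops at the fixed part for simplicity.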