https://en.wikipedia.org/wiki/Lustre%20%28file%20system%29
Lustre is a type of parallel distributed file system, generally used for large-scale cluster computing. The name Lustre is a portmanteau word derived from Linux and cluster. Lustre file system software is available under the GNU General Public License (version 2 only) and provides high-performance file systems for computer clusters ranging in size from small workgroup clusters to large-scale, multi-site systems. Since June 2005, Lustre has consistently been used by at least half of the top ten, and more than 60 of the top 100, fastest supercomputers in the world, including the world's No. 1 ranked TOP500 supercomputer in November 2022, Frontier, as well as previous top supercomputers such as Fugaku, Titan and Sequoia. Lustre file systems are scalable and can be part of multiple computer clusters with tens of thousands of client nodes, hundreds of petabytes (PB) of storage on hundreds of servers, and tens of terabytes per second (TB/s) of aggregate I/O throughput. This makes Lustre file systems a popular choice for businesses with large data centers, including those in industries such as meteorology, simulation, artificial intelligence and machine learning, oil and gas, life science, rich media, and finance. The I/O performance of Lustre has a widespread impact on these applications and has attracted broad attention. History The Lustre file system architecture was started as a research project in 1999 by Peter J. Braam, who was on the staff of Carnegie Mellon University (CMU) at the time. Braam went on to found his own company, Cluster File Systems, in 2001, starting from work on the InterMezzo file system in the Coda project at CMU. Lustre was developed under the Accelerated Strategic Computing Initiative Path Forward project funded by the United States Department of Energy, which included Hewlett-Packard and Intel. In September 2007, Sun Microsystems acquired the assets of Cluster File Systems Inc., including its "intellectual property". Sun included Lustre with its high-p
https://en.wikipedia.org/wiki/The%20Seven%20Daughters%20of%20Eve
The Seven Daughters of Eve is a 2001 semi-fictional book by Bryan Sykes that presents the science of human origin in Africa and their dispersion to a general audience. Sykes explains the principles of genetics and human evolution, the particularities of mitochondrial DNA, and analyses of ancient DNA to genetically link modern humans to prehistoric ancestors. Following the developments of mitochondrial genetics, Sykes traces back human migrations, discusses the "out of Africa theory" and casts serious doubt upon Thor Heyerdahl's theory of the Peruvian origin of the Polynesians, which opposed the theory of their origin in Indonesia. He also describes the use of mitochondrial DNA in identifying the remains of Emperor Nicholas II of Russia, and in assessing the genetic makeup of modern Europe. The title of the book comes from one of the principal achievements of mitochondrial genetics, which is the classification of all modern Europeans into seven groups, the mitochondrial haplogroups. Each haplogroup is defined by a set of characteristic mutations on the mitochondrial genome, and can be traced along a person's maternal line to a specific prehistoric woman. Sykes refers to these women as "clan mothers", though these women did not all live concurrently. All these women in turn shared a common maternal ancestor, the Mitochondrial Eve. The last third of the book is spent on a series of fictional narratives, written by Sykes, describing his creative guesses about the lives of each of these seven "clan mothers". This latter half generally met with mixed reviews in comparison with the first part. Mitochondrial haplogroups in The Seven Daughters of Eve The seven "clan mothers" mentioned by Sykes each correspond to one (or more) human mitochondrial haplogroups. Ursula: corresponds to Haplogroup U (specifically U5, and excluding its subgroup K) Xenia: corresponds to Haplogroup X Helena: corresponds to Haplogroup H Velda: corresponds to Haplogroup V, found with part
https://en.wikipedia.org/wiki/Limonene
Limonene is a colorless liquid aliphatic hydrocarbon classified as a cyclic monoterpene, and is the major component in the volatile oil of citrus fruit peels. The d-isomer, occurring more commonly in nature as the fragrance of oranges, is a flavoring agent in food manufacturing. It is also used in chemical synthesis as a precursor to carvone and as a renewables-based solvent in cleaning products. The less common l-isomer has a piny, turpentine-like odor, and is found in the edible parts of such plants as caraway, dill, and bergamot orange. Limonene takes its name from Italian limone ("lemon"). Limonene is a chiral molecule, and biological sources produce one enantiomer: the principal industrial source, citrus fruit, contains d-limonene ((+)-limonene), which is the (R)-enantiomer. Racemic limonene is known as dipentene. d-Limonene is obtained commercially from citrus fruits through two primary methods: centrifugal separation or steam distillation. Chemical reactions Limonene is a relatively stable monoterpene and can be distilled without decomposition, although at elevated temperatures it cracks to form isoprene. It oxidizes easily in moist air to produce carveol, carvone, and limonene oxide. With sulfur, it undergoes dehydrogenation to p-cymene. Limonene occurs commonly as the (R)-enantiomer, but racemizes to dipentene at 300 °C. When warmed with mineral acid, limonene isomerizes to the conjugated diene α-terpinene (which can also easily be converted to p-cymene). Evidence for this isomerization includes the formation of Diels–Alder adducts between α-terpinene and maleic anhydride. It is possible to effect reaction at one of the double bonds selectively. Anhydrous hydrogen chloride reacts preferentially at the disubstituted alkene, whereas epoxidation with mCPBA occurs at the trisubstituted alkene. In another synthetic method, Markovnikov addition of trifluoroacetic acid followed by hydrolysis of the acetate gives terpineol. The most widely practice
https://en.wikipedia.org/wiki/Twin%20vector%20quantization
In data compression, twin vector quantization is related to vector quantization, but the speed of the quantization is doubled by the secondary vector analyzer. By using a subdimensional vector space useless hyperspace will be destroyed in the process. The formula for calculating the amount of destroyed hyperspace is: H(x) = 5.22 / 4m Data compression
https://en.wikipedia.org/wiki/List%20of%20office%20suites
In computing, an office suite is a collection of productivity software usually containing at least a word processor, spreadsheet and a presentation program. There are many different brands and types of office suites. Office suites Free and open source suites AndrOpen Office - available for Android Apache OpenOffice - available for Linux, macOS and Windows Calligra Suite - available for FreeBSD, Linux, macOS and Windows Collabora Online - available for Android, ChromeOS, iOS, iPadOS, Linux, macOS, online and Windows LibreOffice - available for Linux, macOS and Windows, and unofficially for Android, ChromeOS, FreeBSD, Haiku, iOS, iPadOS, OpenBSD, NetBSD and Solaris NeoOffice - available for macOS Freeware and proprietary suites Ability Office - available for Windows Google Workspace - available for Android, ChromeOS, iOS, iPadOS, Linux, macOS, online and Windows Hancom Office - available for Windows iWork - available for iOS, iPadOS, macOS and online Ichitaro - a Japanese-language suite available for Windows Microsoft 365 - available for Android, iOS, iPadOS, macOS, online and Windows MobiSystems OfficeSuite - available for Android, iOS and Windows ONLYOFFICE - available for Android, iOS, Linux, macOS, online and Windows Polaris Office - available for iOS, macOS and Windows SoftMaker Office - available for Android, iOS, iPadOS, Linux, macOS and Windows Tiki Wiki CMS Groupware - online content management WordPerfect Office - available for Windows WPS Office - available for Android, iOS, macOS and Windows Discontinued office suites Aster*x AUIS - an office suite developed by Carnegie Mellon University and named after Andrew Carnegie Breadbox Office - DOS software EasyOffice AppleWorks Corel WordPerfect for DOS Hancom Office Suite (formerly ThinkFree Office) IBM Lotus SmartSuite IBM Lotus Symphony IBM Works – an office suite for the IBM OS/2 operating
https://en.wikipedia.org/wiki/Plantronics
Plantronics, Inc. is an American electronics company — branded Poly to reflect its dual Plantronics and Polycom heritage — producing audio communications equipment for business and consumers. Its products support unified communications, mobile use, gaming and music. Plantronics is headquartered in Santa Cruz, California, and most of its products are produced in China and Mexico. On March 18, 2019, Plantronics announced that it would change its name to Poly following its acquisition of Polycom, although it continues to trade on the New York Stock Exchange as Plantronics, Inc. (POLY; listed as PLT until May 24, 2021). On March 28, 2022, HP Inc. announced its intent to acquire Poly for $1.7 billion in cash as it looked to bolster its hybrid work offerings, such as headsets and videoconferencing hardware. Including debt, the deal was valued at $3.3 billion and closed in August 2022. History In the early 1960s, airline headsets were so large and cumbersome that many pilots had switched back to the use of handheld microphones for communications. The speed and complexity of jet airliners created a need for the introduction of small, lightweight headsets into the cockpit. In 1961, United Airlines solicited new designs from anyone who was interested. Courtney Graham, a United Airlines pilot, was one of the many who thought the heavy headsets should be replaced by something lighter. He collaborated with his pilot friend Keith Larkin to create a small, functional design which was robust enough to pass airline standards. (Larkin had been working for a small company called Plane-Aids, a Japanese import company which offered spectacles and sunglasses that contained transistor radios in their temple pieces.) The final design, incorporating two small hearing aid-style transducers attached to a headband, was submitted to United Airlines for approval. UAL's approval of the innovative design led Graham and Larkin to incorporate as Pacific Plantronics (now called Plantronics, Inc.) on May 1
https://en.wikipedia.org/wiki/Acetoin
Acetoin, also known as 3-hydroxybutanone or acetyl methyl carbinol, is an organic compound with the formula CH3CH(OH)C(O)CH3. It is a colorless liquid with a pleasant, buttery odor. It is chiral. The form produced by bacteria is (R)-acetoin. Production in bacteria Acetoin is a neutral, four-carbon molecule used as an external energy store by a number of fermentative bacteria. It is produced by the decarboxylation of alpha-acetolactate, a common precursor in the biosynthesis of branched-chain amino acids. Owing to its neutral nature, production and excretion of acetoin during exponential growth prevents over-acidification of the cytoplasm and the surrounding medium that would result from accumulation of acidic metabolic products, such as acetic acid and citric acid. Once superior carbon sources are exhausted, and the culture enters stationary phase, acetoin can be used to maintain the culture density. The conversion of acetoin into acetyl-CoA is catalysed by the acetoin dehydrogenase complex, following a mechanism largely analogous to the pyruvate dehydrogenase complex; however, as acetoin is not a 2-oxoacid, it does not undergo decarboxylation by the E1 enzyme; instead, a molecule of acetaldehyde is released. In some bacteria, acetoin can also be reduced to 2,3-butanediol by acetoin reductase/2,3-butanediol dehydrogenase. The Voges-Proskauer test is a commonly used microbiological test for acetoin production. Uses Food ingredients Acetoin, along with diacetyl, is one of the compounds that gives butter its characteristic flavor. Because of this, manufacturers of partially hydrogenated oils typically add artificial butter flavor – acetoin and diacetyl – (along with beta carotene for the yellow color) to the final product. Acetoin can be found in apples, yogurt, asparagus, blackcurrants, blackberries, wheat, broccoli, brussels sprouts, cantaloupes, and maple syrup. Acetoin is used as a food flavoring (in baked goods) and as a fragrance. Electronic cigarettes
https://en.wikipedia.org/wiki/List%20of%20collaborative%20software
This list is divided into proprietary or free software, and open source software, with several comparison tables of different product and vendor characteristics. It also includes a section of project collaboration software, which is a standard feature in collaboration platforms. Collaborative software Comparison of notable software Systems listed on a light purple background are no longer in active development. General Information Comparison of unified communications features Comparison of collaborative software features Comparison of targets Open source software The following are open source applications for collaboration: Standard client–server software Access Grid, for audio and video-based collaboration Axigen Citadel/UX, with support for native groupware clients (Kontact, Novell Evolution, Microsoft Outlook) and web interface Cyn.in EGroupware, with support for native groupware clients (Kontact, Novell Evolution, Microsoft Outlook) and web interface Group-Office groupware and CRM Kolab, various native PIM clients Kopano OpenGroupware.org phpGroupWare Scalix SOGo, integrated email, calendaring with Apple iCal, Mozilla Thunderbird and native Outlook compatibility Teambox, Basecamp-style project management software with focus on GTD task management and conversations. (Only V3 and prior are open-source.) Zarafa Zentyal, with support for native groupware clients (Kontact, Novell Evolution) natively for Microsoft Outlook and web interface Zimbra Zulip Groupware: Web-based software Axigen Bricolage, content management system BigBlueButton, Web meetings Collabora Online, Enterprise-ready edition of LibreOffice enabling real-time collaborative editing of documents, spreadsheets, presentations and graphics DotNetNuke, also called DNN: module-based, evolved from ASP 1.0 demo applications EGroupware, a free open source groupware software intended for businesses from small to enterprises EtherPad, collaborative drafting with chat Feng Office Community Edition Fu
https://en.wikipedia.org/wiki/Spearmint%20%28flavour%29
Spearmint is a flavour that is either naturally or artificially created to taste like the oil of the herbaceous Mentha spicata (spearmint) plant. Uses The most common uses for spearmint flavour are in chewing gum and toothpaste. However, it is also used in a number of other products, mainly confectionery. It is also popular as a seasonal (usually around St. Patrick's Day) milkshake flavouring in Canada and the U.S. Trademark in the UK The words "WRIGLEY'S SPEARMINT" are trademarked in the UK. In 1959, skiffle artist Lonnie Donegan covered the 1924 Rose, Breuer, and Bloom song "Does the Spearmint Lose its Flavor on the Bedpost Overnight?", but the BBC, not wanting to risk breaching trademark laws, refused to play it under that title. Donegan renamed the song "Does Your Chewing Gum Lose Its Flavor (On the Bedpost Overnight)", which then went on to become a top-10 hit in the UK and US.
https://en.wikipedia.org/wiki/Relaxation%20%28physics%29
In the physical sciences, relaxation usually means the return of a perturbed system into equilibrium. Each relaxation process can be categorized by a relaxation time τ. The simplest theoretical description of relaxation as a function of time t is an exponential law (exponential decay). In simple linear systems Mechanics: Damped unforced oscillator Let the homogeneous differential equation m(d²y/dt²) + γ(dy/dt) + ky = 0 model damped unforced oscillations of a weight on a spring. The displacement will then be of the form y(t) = A e^(−t/T) cos(μt − δ). The constant T (= 2m/γ) is called the relaxation time of the system and the constant μ is the quasi-frequency. Electronics: RC circuit In an RC circuit containing a charged capacitor and a resistor, the voltage decays exponentially: V(t) = V₀ e^(−t/(RC)). The constant τ = RC is called the relaxation time or RC time constant of the circuit. A nonlinear oscillator circuit which generates a repeating waveform by the repetitive discharge of a capacitor through a resistance is called a relaxation oscillator. In condensed matter physics In condensed matter physics, relaxation is usually studied as a linear response to a small external perturbation. Since the underlying microscopic processes are active even in the absence of external perturbations, one can also study "relaxation in equilibrium" instead of the usual "relaxation into equilibrium" (see fluctuation-dissipation theorem). Stress relaxation In continuum mechanics, stress relaxation is the gradual disappearance of stresses from a viscoelastic medium after it has been deformed. Dielectric relaxation time In dielectric materials, the dielectric polarization P depends on the electric field E. If E changes, P(t) reacts: the polarization relaxes towards a new equilibrium, i.e., the surface charges equalize. It is important in dielectric spectroscopy. Very long relaxation times are responsible for dielectric absorption. The dielectric relaxation time is closely related to the electrical conductivity. In a semiconductor it is a measure of how long it takes
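The RC decay law above, V(t) = V₀ e^(−t/(RC)), can be sketched numerically. The component values below are illustrative assumptions, not taken from the article:

```python
import math

# Illustrative (assumed) component values for an RC discharge.
R = 10e3      # resistance in ohms (10 kΩ)
C = 100e-6    # capacitance in farads (100 µF)
V0 = 5.0      # initial capacitor voltage in volts

tau = R * C   # relaxation time (RC time constant), here 1.0 s

def voltage(t):
    """Capacitor voltage at time t during exponential discharge."""
    return V0 * math.exp(-t / tau)

# After one relaxation time, the voltage has fallen to 1/e ≈ 36.8% of V0.
print(voltage(tau))   # ≈ 1.84 V
```

The same exponential-decay form, with a different interpretation of τ, applies to the other relaxation processes the article lists, which is why a single relaxation time suffices to characterize each simple linear system.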
https://en.wikipedia.org/wiki/Clausius%E2%80%93Clapeyron%20relation
The Clausius–Clapeyron relation, in chemical thermodynamics, specifies the temperature dependence of pressure, most importantly vapor pressure, at a discontinuous phase transition between two phases of matter of a single constituent. It is named after Rudolf Clausius and Benoît Paul Émile Clapeyron. Its relevance to meteorology and climatology is the increase of the water-holding capacity of the atmosphere by about 7% for every 1 °C (1.8 °F) rise in temperature. James Thomson and William Thomson confirmed the relation experimentally in 1849–50, and it was historically important as a very early successful application of theoretical thermodynamics. Definition Exact Clapeyron equation On a pressure–temperature (P–T) diagram, for any phase change the line separating the two phases is known as the coexistence curve. The Clapeyron relation gives the slope of the tangents to this curve. Mathematically, dP/dT = L/(T Δv) = Δs/Δv, where dP/dT is the slope of the tangent to the coexistence curve at any point, L is the specific latent heat, T is the temperature, Δv is the specific volume change of the phase transition, and Δs is the specific entropy change of the phase transition. Clausius–Clapeyron equation The Clausius–Clapeyron equation applies to vaporization of liquids, where the vapor follows the ideal gas law with ideal gas constant R and the liquid volume is neglected as being much smaller than the vapor volume V. It is often used to calculate the vapor pressure of a liquid. The equation expresses this in a more convenient form just in terms of the latent heat, for moderate temperatures and pressures: d(ln P)/dT = L/(RT²). Derivations Derivation from state postulate Using the state postulate, take the specific entropy s for a homogeneous substance to be a function of specific volume v and temperature T. The Clausius–Clapeyron relation describes a phase transition in a closed system composed of two contiguous phases, condensed matter and ideal gas, of a single substance, in mutual thermodynamic equilibrium, at constant temperat
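Integrating d(ln P)/dT = L/(RT²) with L assumed constant gives ln(P₂/P₁) = −(L/R)(1/T₂ − 1/T₁), which lets one estimate vapor pressure from a single known reference point. A minimal sketch for water, using standard textbook values (assumed here, not from the article):

```python
import math

# Integrated Clausius–Clapeyron equation (assumes latent heat L is
# constant over the temperature range of interest):
#   ln(P2/P1) = -(L/R) * (1/T2 - 1/T1)
R = 8.314        # ideal gas constant, J/(mol·K)
L = 40660.0      # molar latent heat of vaporization of water, J/mol
P1 = 101325.0    # reference vapor pressure: 1 atm, Pa
T1 = 373.15      # reference temperature: normal boiling point of water, K

def vapor_pressure(T2):
    """Estimate the vapor pressure of water at temperature T2 (kelvin)."""
    return P1 * math.exp(-(L / R) * (1.0 / T2 - 1.0 / T1))

# Estimate at 80 °C (353.15 K): roughly 48 kPa, close to the
# measured value of about 47.4 kPa.
print(vapor_pressure(353.15))
```

The small discrepancy from the measured value reflects the constant-L assumption; over wider temperature ranges the latent heat's own temperature dependence must be taken into account.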
https://en.wikipedia.org/wiki/Feynman%27s%20Lost%20Lecture
Feynman's Lost Lecture: The Motion of Planets Around the Sun is a book based on a lecture by Richard Feynman. Restoration of the lecture notes and conversion into book form was undertaken by Caltech physicist David L. Goodstein and archivist Judith R. Goodstein. Feynman had given the lecture on the motion of bodies at Caltech on March 13, 1964, but the notes and pictures were lost for a number of years and consequently not included in The Feynman Lectures on Physics series. The lecture notes were later found, but without the photographs of his illustrative chalkboard drawings. One of the editors, David L. Goodstein, stated that at first without the photographs, it was very hard to figure out what diagrams he was referring to in the audiotapes, but a later finding of his own private lecture notes made it possible to understand completely the logical framework with which Feynman delivered the lecture. Overview You can explain to people who don't know much of the physics, the early history... how Newton discovered... Kepler's Laws, and equal areas, and that means it's toward the sun, and all this stuff. And then the key - they always ask then, "Well, how do you see that it's an ellipse if it's the inverse square?" Well, it's God damned hard, there's no question of that. But I tried to find the simplest one I could. In a non-course lecture delivered to a freshman physics audience, Feynman undertakes to present an elementary, geometric demonstration of Newton's discovery of the fact that Kepler's first observation, that the planets travel in elliptical orbits, is a necessary consequence of Kepler's other two observations. The structure of Feynman's lecture: A historical introduction to the material An overview of some geometric properties of an ellipse Newton's demonstration that equal areas in equal times is equivalent to forces toward the sun Feynman's demonstration that equal changes in velocity occur in equal angles in the orbit Feynman's demonstration,
https://en.wikipedia.org/wiki/United%20States%20Army%20Command%2C%20Control%2C%20Communication%2C%20Computers%2C%20Cyber%2C%20Intelligence%2C%20Surveillance%20and%20Reconnaissance%20Center
The Combat Capabilities Development Command (CCDC) C5ISR Center, formerly the Communications-Electronics RD&E Center (CERDEC), is the United States Army's information technologies and integrated systems center. CCDC C5ISR Center is headquartered at Aberdeen Proving Ground in Maryland, with activities at Fort Belvoir in Virginia and Joint Base McGuire-Dix-Lakehurst in New Jersey. As one of the 10 organizations that make up the Combat Capabilities Development Command, a subordinate organization of the Army Futures Command, CCDC C5ISR Center supplies Command, Control, Communications, Computers, Cyber, Intelligence, Surveillance and Reconnaissance (C5ISR) capabilities, technologies and integrated solutions for the Soldier. Core competencies CCDC C5ISR Center's six directorates and Product Director (PD) aim to integrate C5ISR technologies in order to provide systems-of-systems products for soldiers. C5ISR is the ability to direct, coordinate and control the assets necessary for accomplishing the mission and reporting battlefield situations and activities. CCDC C5ISR Center develops new technologies, and adapts technologies from other Army R&D centers and laboratories, Department of Defense partners, government and national laboratories, academia and industry. Additionally, the group provides products to other system developers (for platform integration). The group utilizes Modeling and Simulation (M&S) capabilities to provide the Army and Joint Forces with system-of-systems assessments of C5ISR technologies and concepts. CCDC C5ISR Center's product manager for C5ISR On-the-Move assesses the effectiveness of inserting new technologies into an operationally relevant environment. CCDC C5ISR Center collaborates with Army, DoD and other stakeholders to provide C5ISR models, simulated architectures and automated tools in support of requirement definition, design and engineering, manufacturing, and test and evaluation. Directorates CCDC C5ISR Center is subdivided into several
https://en.wikipedia.org/wiki/The%20Hot%20Zone
The Hot Zone: A Terrifying True Story is a best-selling 1994 nonfiction thriller by Richard Preston about the origins of, and incidents involving, viral hemorrhagic fevers, particularly ebolaviruses and marburgviruses. The basis of the book was Preston's 1992 New Yorker article "Crisis in the Hot Zone". The filoviruses—including Ebola virus, Sudan virus, Marburg virus, and Ravn virus—are Biosafety Level 4 agents, extremely dangerous to humans because they are very infectious, have a high fatality rate, and most have no known prophylactic measures, treatments, or cures. Along with describing the history of the devastation caused by two of these Central African diseases, Ebola virus disease and Marburg virus disease, Preston describes a 1989 incident in which a relative of Ebola virus, Reston virus, was discovered at a primate quarantine facility in Reston, Virginia, less than 15 miles (24 km) from Washington, D.C. Synopsis The book is in four sections: "The Shadow of Mount Elgon" delves into the history of filoviruses, as well as speculation about the origins of AIDS. Preston recounts the story of "Charles Monet" (a pseudonym), who might have caught Marburg virus from visiting Kitum Cave on Mount Elgon in Kenya. The author describes the progression of the disease, from the initial headache and backache to the final stage, in which Monet's internal organs fail and he hemorrhages extensively in a waiting room of a Nairobi hospital. This part also introduces a promising young physician who became infected with Marburg virus while treating Monet. Nancy Jaax's story is told, and viruses, biosafety levels and procedures are described. The Ebola virus disease outbreaks caused by Ebola virus and its cousin, Sudan virus, are mentioned. Preston talks to the man who named the Ebola virus. "The Monkey House" chronicles the discovery of Reston virus among imported monkeys in Reston, Virginia, and the subsequent actions taken by the U.S. Army and Centers for Disease Control. It
https://en.wikipedia.org/wiki/Heirloom%20plant
An heirloom plant, heirloom variety, heritage fruit (Australia and New Zealand), or heirloom vegetable (especially in Ireland and the UK) is an old cultivar of a plant used for food that is grown and maintained by gardeners and farmers, particularly in isolated communities of the Western world. These were commonly grown during earlier periods in human history, but are not used in modern large-scale agriculture. In some parts of the world, it is illegal to sell seeds of cultivars that are not listed as approved for sale. The Henry Doubleday Research Association, now known as Garden Organic, responded to this legislation by setting up the Heritage Seed Library to preserve seeds of as many of the older cultivars as possible. However, seed banks alone have not been able to provide sufficient insurance against catastrophic loss. In some jurisdictions, like Colombia, laws have been proposed that would make seed saving itself illegal. Many heirloom vegetables have kept their traits through open pollination, while fruit varieties such as apples have been propagated over the centuries through grafts and cuttings. The trend of growing heirloom plants in gardens has been returning in popularity in North America and Europe. Origin Before the industrialization of agriculture, a much wider variety of plant foods were grown for human consumption, largely due to farmers and gardeners saving seeds and cuttings for future planting. From the 16th century through the early 20th centuries, the diversity was huge. Old nursery catalogues were filled with plums, peaches, pears and apples of numerous varieties and seed catalogs offered legions of vegetable varieties. Valuable and carefully selected seeds were sold and traded using these catalogs along with useful advice on cultivation. Since World War II, agriculture in the industrialized world has mostly consisted of food crops which are grown in large, monocultural plots. In order to maximize consistency, few varieties of each type of
https://en.wikipedia.org/wiki/Applied%20behavior%20analysis
Applied behavior analysis (ABA), also called behavioral engineering, is a psychological intervention that applies approaches based upon the principles of respondent and operant conditioning to change behavior of social significance. It is the applied form of behavior analysis; the other two forms are radical behaviorism (or the philosophy of the science) and the experimental analysis of behavior (or basic experimental laboratory research). The name applied behavior analysis has replaced behavior modification because the latter approach suggested attempting to change behavior without clarifying the relevant behavior-environment interactions. In contrast, ABA changes behavior by first assessing the functional relationship between a targeted behavior and the environment. Further, the approach often seeks to develop socially acceptable alternatives for aberrant behaviors. Although service delivery providers overwhelmingly specialize in utilizing structured and naturalistic early behavioral interventions for individuals with autism, ABA has also been utilized in a range of other areas. ABA is controversial, especially among members of the autism rights movement, for a number of reasons. Some ABA interventions emphasize normalization instead of acceptance, and there is a history of, in some forms of ABA and its predecessors, the use of aversives, such as electric shocks. ABA is also controversial due to concerns about its evidence base. In the last few years, there have been reforms in some types of ABA interventions to address these criticisms and concerns, especially regarding masking. Definition ABA is an applied science devoted to developing procedures which will produce observable changes in behavior. It is to be distinguished from the experimental analysis of behavior, which focuses on basic experimental laboratory research, but it uses principles developed by such research, in particular operant conditioning and classical conditioning. Behavior analysis adopts
https://en.wikipedia.org/wiki/PC/104
PC/104 (or PC104) is a family of embedded computer standards, maintained by the PC/104 Consortium, which define both form factors and computer buses. Its name derives from the 104 pins on the interboard (ISA) connector in the original PC/104 specification and has been retained in subsequent revisions, despite changes to the connectors. PC/104 is intended for specialized environments where a small, rugged computer system is required. The standard is modular, and allows consumers to stack together boards from a variety of COTS manufacturers to produce a customized embedded system. The original PC/104 form factor, at 3.550 × 3.775 inches (90 × 96 mm), is somewhat smaller than a desktop PC motherboard. Unlike other popular computer form factors such as ATX, which rely on a motherboard or backplane, PC/104 boards are stacked on top of each other like building blocks. The PC/104 specification defines four mounting holes at the corners of each module, which allow the boards to be fastened to each other using standoffs. The stackable bus connectors and use of standoffs provide a more rugged mounting than the slot boards found in desktop PCs. The compact board size further contributes to the ruggedness of the form factor by reducing the possibility of PCB flexing under shock and vibration. A typical PC/104 system (commonly referred to as a "stack") will include a CPU board, power supply board, and one or more peripheral boards, such as a data acquisition module, GPS receiver, or wireless LAN controller. A wide array of peripheral boards is available from various vendors, and users may design a stack that incorporates boards from multiple vendors. The overall height, weight, and power consumption of the stack can vary depending on the number of boards that are used. PC/104 is sometimes referred to as a "stackable PC", as most of the architecture derives from the desktop PC. The majority of PC/104 CPU boards are x86-compatible and include standard PC interfaces such as serial ports, USB, Ethernet, and VGA. An x86 PC/104 s
https://en.wikipedia.org/wiki/Institute%20of%20Space%20and%20Astronautical%20Science
The Institute of Space and Astronautical Science, or ISAS, is a Japanese national research organization for astrophysics using rockets, astronomical satellites and interplanetary probes, which played a major role in Japan's space development. Since 2003, it has been a division of the Japan Aerospace Exploration Agency (JAXA). History ISAS originated as part of the Institute of Industrial Science of the University of Tokyo, where Hideo Itokawa experimented with miniature solid-fuel rockets (the Pencil Rocket) in the 1950s. This experimentation eventually led to the development of the Κ (Kappa) sounding rocket, which was used for observations during the International Geophysical Year (IGY). By 1960, the Κ-8 rocket had reached an altitude of 200 km. In 1964, the rocket group and the Institute of Aeronautics, along with the scientific ballooning team, were merged to form a new institute within the University of Tokyo. The rocket line evolved into the L (Lambda) series, and, in 1970, an L-4S-5 vehicle launched Japan's first artificial satellite, Ohsumi. Although the Lambda rockets were only sounding rockets, the next generation of M (Mu) rockets was intended from the start to serve as satellite launch vehicles. Beginning in 1971, ISAS launched a series of scientific satellites to observe the ionosphere and magnetosphere. Since the launch of Hakucho in 1979, ISAS kept X-ray astronomy satellites consecutively in orbit, until the run was briefly interrupted by the launch failure of ASTRO-E. In 1981, as part of a university system reform and to allow its missions to expand, ISAS was spun out from the University of Tokyo as an inter-university national research organization, the Institute of Space and Astronautical Science. ISAS was responsible for launching Japan's first interplanetary probes, Sakigake and Suisei, to Halley's Comet in 1985. It also launched Hiten, Japan's first lunar probe, in 1990. The Nozomi probe was launched in 1998 in an attempt to orbit Mars, but the spacecraft suffered system failures and was unable to enter orbit.
https://en.wikipedia.org/wiki/Processed%20meat
Processed meat is any meat which has been modified in order either to improve its taste or to extend its shelf life. Methods of meat processing include salting, curing, fermentation, smoking, boiling, frying, and the addition of chemical preservatives. Processed meat is usually composed of pork or beef, but may also include poultry, offal, or meat by-products such as blood. Processed meat products include bacon, ham, sausages, salami, corned beef, jerky, hot dogs, lunch meat, canned meat, chicken nuggets, and meat-based sauces. Meat processing includes all the processes that change fresh meat, with the exception of simple mechanical processes such as cutting, grinding or mixing. Meat processing began as soon as people realized that cooking and salting prolong the life of fresh meat. It is not known when this first took place; however, the process of salting and sun-drying was recorded in Ancient Egypt, the use of ice and snow is credited to the early Romans, and canning was developed by Nicolas Appert, who in 1810 received a prize for his invention from the French government. Preservatives Nitrate and sodium nitrite found in processed meats can be converted by the human body into nitrosamines that can be carcinogenic, causing mutations in the colorectal cell line and thereby leading to tumorigenesis and eventually cancer. Processed meat is more carcinogenic than unprocessed red meat because of the abundance of potent nitrosyl-heme molecules that form N-nitroso compounds. A principal concern about sodium nitrite is nitrosation (or nitrosylation), the formation of carcinogenic nitroso compounds in meats containing sodium nitrite or potassium nitrate, especially nitrosyl-haem (nitrosyl heme). In addition to nitrosyl-haem, carcinogenic nitrosamines can be formed from the reaction of nitrite with secondary amines under acidic conditions (such as occur in the human stomach) as well as during the curing process used to preserve meats.
https://en.wikipedia.org/wiki/Single-letter%20second-level%20domain
Single-letter second-level domains are domains in which the second-level label of the domain name consists of only one letter. In 1993, the Internet Assigned Numbers Authority (IANA) explicitly reserved all single-letter and single-digit second-level domains under the top-level domains com, net, and org, and grandfathered those that had already been assigned. In December 2005, ICANN considered auctioning these domain names. Active single-letter domains On December 1, 1993, the Internet Assigned Numbers Authority (IANA) explicitly reserved the remaining single-letter and single-digit domain names. The six single-letter domains under .com, .net and .org that were already assigned at that time were grandfathered in and continued to exist. The .org TLD was subsequently reopened for single-letter domain registrations. These and selected other gTLD and ccTLD single-letter domain names currently in use, typically as shortcuts, are listed below. Many other single-letter second-level domains have been registered under country code top-level domains. The country code top-level domains which have been identified as allowing single-letter domains are: .ac .af .ag .ai .am .bo .by .bz .cm .cn .co .cr .cx .cz .de .dk .fm .fr .gd .gg .gl .gp .gs .gt .gy .hn .hr .ht .ie .im .io .is .je .kg .ki .kw .la .lb .lc .ly .md .mg .mk .mp .ms .mw .mx .mu .nf .np .nz .pe .ph .pk .pl .pn .pr .pw .ro .sh .st .tc .tl .tt .to .tv .ua .vc .vg .vn .vu .ws Non-ASCII single-character domains Single-character non-ASCII second-level domains, also known as internationalized domain names (IDNs), also exist; for DNS purposes, these domains are actually registered as their Punycode translations, which are more than a single character long.
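The Punycode mapping mentioned above can be illustrated with a short sketch. It uses Python's built-in "punycode" codec; the helper names and the sample label are purely illustrative, and full IDNA processing adds normalization steps not shown here.

```python
# Sketch: converting a non-ASCII label to and from its ASCII ("xn--") DNS form.
# Helper names (to_ace, from_ace) are illustrative, not from any registry API.

def to_ace(label: str) -> str:
    """Return the ASCII Compatible Encoding (ACE) of a single domain label."""
    if label.isascii():
        return label
    return "xn--" + label.encode("punycode").decode("ascii")

def from_ace(ace: str) -> str:
    """Decode an ACE label back to its Unicode form."""
    if not ace.startswith("xn--"):
        return ace
    return ace[len("xn--"):].encode("ascii").decode("punycode")

if __name__ == "__main__":
    label = "bücher"            # a single non-ASCII character would work the same way
    ace = to_ace(label)
    print(ace)                  # xn--bcher-kva
    assert from_ace(ace) == label
```

The ACE form is what is actually stored in the DNS, which is why a "single-character" IDN is registered under a name of more than one ASCII character.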
https://en.wikipedia.org/wiki/Input%20kludge
In computer programming, an input kludge is a type of failure in software (an anti-pattern) in which simple user input is not handled gracefully. For example, if a computer program accepts free-text input from the user, an ad hoc algorithm will mishandle many combinations of legal and illegal input strings. Input kludges are usually difficult for a programmer to detect in a unit test, but very easy for the end user to find: end users can often crash software that fails to correctly handle user input, and the buffer overflow security hole is a well-known example of the problems caused. To remedy input kludges, one may use input validation algorithms to handle all user input. A monkey test can be used to detect an input kludge problem. A common first test to discover this problem is to roll one's hand across the computer keyboard, or to 'mash' the keyboard, to produce a large junk input, but such an action often lacks reproducibility. Greater systematicity and reproducibility may be obtained by using fuzz testing software. See also Garbage in, garbage out Guard (computer science) Kludge
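A reproducible variant of the keyboard-mash test described above can be sketched as follows. Here `parse_age` is a hypothetical input handler invented for illustration, and the fixed RNG seed is what restores the reproducibility that hand-mashing lacks.

```python
import random
import string

def parse_age(text: str) -> int:
    """A toy input handler: validates free-text input instead of trusting it."""
    text = text.strip()
    if not text or not text.lstrip("+-").isdigit():
        raise ValueError(f"not an integer: {text!r}")
    age = int(text)
    if not 0 <= age <= 150:
        raise ValueError(f"age out of range: {age}")
    return age

def monkey_test(runs: int = 1000, seed: int = 42) -> int:
    """Feed reproducible random 'keyboard mash' to the handler.

    Counts inputs rejected cleanly via ValueError; any other exception
    (i.e. a crash caused by an input kludge) is allowed to propagate.
    """
    rng = random.Random(seed)      # fixed seed => the same junk every run
    rejected = 0
    for _ in range(runs):
        junk = "".join(rng.choice(string.printable)
                       for _ in range(rng.randint(0, 20)))
        try:
            parse_age(junk)
        except ValueError:
            rejected += 1          # graceful rejection, not a crash
    return rejected

if __name__ == "__main__":
    print(monkey_test())
```

Running the same seed twice produces identical input, so a crash found this way can be replayed; dedicated fuzzers apply the same idea with smarter input generation.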
https://en.wikipedia.org/wiki/Ecological%20anthropology
Ecological anthropology is a sub-field of anthropology defined as the "study of cultural adaptations to environments". The sub-field is also defined as "the study of relationships between a population of humans and their biophysical environment". The focus of its research concerns "how cultural beliefs and practices helped human populations adapt to their environments, and how people used elements of their culture to maintain their ecosystems". Ecological anthropology developed from the approach of cultural ecology, and it provided a conceptual framework more suitable for scientific inquiry than the cultural ecology approach. Research pursued under this approach aims to study a wide range of human responses to environmental problems. The ecological anthropologist Conrad Kottak has argued that there is an original, older 'functionalist', apolitical style of ecological anthropology and that, as of his writing in 1999, a 'new ecological anthropology' was emerging, consisting of a more complex approach spanning intersecting global, national, regional and local systems. History of the domain and leading researchers In the 1960s, ecological anthropology first appeared as a response to cultural ecology, a sub-field of anthropology led by Julian Steward. Steward focused on studying different modes of subsistence as methods of energy transfer and then analyzed how they determine other aspects of culture. Culture became the unit of analysis. The first ecological anthropologists explored the idea that humans as ecological populations should be the unit of analysis, and culture became the means by which that population alters and adapts to the environment. The approach was characterised by systems theory, functionalism and negative feedback analysis. Benjamin S. Orlove has noted that the development of ecological anthropology has occurred in stages: "Each stage is a reaction to the previous one rather than merely an addition to it".
https://en.wikipedia.org/wiki/Harold%20S.%20Shapiro
Harold Seymour Shapiro (2 April 1928 – 5 March 2021) was a professor of mathematics at the Royal Institute of Technology in Stockholm, Sweden, best known for inventing the so-called Shapiro polynomials (also known as Golay–Shapiro polynomials or Rudin–Shapiro polynomials) and for work on quadrature domains. His main research areas were approximation theory, complex analysis, functional analysis, and partial differential equations. He was also interested in the pedagogy of problem-solving. Born and raised in Brooklyn, New York, to a Jewish family, Shapiro earned a B.Sc. from the City College of New York in 1949 and an M.S. degree from the Massachusetts Institute of Technology in 1951. He received his Ph.D. in 1952 from MIT; his thesis was written under the supervision of Norman Levinson. He was the father of cosmologist Max Tegmark, a graduate of the Royal Institute of Technology and now a professor at MIT. Shapiro died on 5 March 2021, aged 92. See also Rudin–Shapiro sequence List of Jewish mathematicians
https://en.wikipedia.org/wiki/Cocktail%20party%20effect
The cocktail party effect refers to the phenomenon wherein the brain focuses a person's attention on a particular stimulus, usually auditory. This focus excludes a range of other stimuli from conscious awareness, as when a partygoer follows a single conversation in a noisy room. The ability is widely distributed among humans, with most listeners able, more or less easily, to partition the totality of sound detected by the ears into distinct streams, and subsequently to decide which streams are most pertinent, excluding all or most others. It has been proposed that a person's sensory memory subconsciously parses all stimuli and identifies discrete portions of these sensations according to their salience. This allows most people to tune effortlessly into a single voice while tuning out all others. The phenomenon is often described as "selective attention" or "selective hearing". The term may also describe a similar phenomenon that occurs when one immediately detects words of importance originating from unattended stimuli, for instance hearing one's name among a wide range of auditory input. A person who lacks the ability to segregate stimuli in this way is often said to display the cocktail party problem or cocktail party deafness, which may also be described as auditory processing disorder or King-Kopetzky syndrome. Neurological basis (and binaural processing) Auditory attention with regard to the cocktail party effect primarily occurs in the left hemisphere of the superior temporal gyrus, a non-primary region of auditory cortex; a fronto-parietal network involving the inferior frontal gyrus, superior parietal sulcus, and intraparietal sulcus also accounts for the acts of attention-shifting, speech processing, and attention control. Both the target stream (the more important information being attended to) and competing/interfering streams are processed in the same pathway within the left hemisphere, but fMRI scans show that target streams are treated with more attention.
https://en.wikipedia.org/wiki/Sweetness
Sweetness is a basic taste most commonly perceived when eating foods rich in sugars. Sweet tastes are generally regarded as pleasurable. In addition to sugars like sucrose, many other chemical compounds are sweet, including aldehydes, ketones, and sugar alcohols. Some are sweet at very low concentrations, allowing their use as non-caloric sugar substitutes. Such non-sugar sweeteners include saccharin and aspartame. Other compounds, such as miraculin, may alter the perception of sweetness itself. The perceived intensity of sugars and high-potency sweeteners, such as aspartame and neohesperidin dihydrochalcone, is heritable, with genetic effects accounting for approximately 30% of the variation. The chemosensory basis for detecting sweetness, which varies between both individuals and species, has only begun to be understood since the late 20th century. One theoretical model of sweetness is the multipoint attachment theory, which involves multiple binding sites between a sweetness receptor and a sweet substance. Studies indicate that responsiveness to sugars and sweetness has very ancient evolutionary beginnings, being manifest as chemotaxis even in motile bacteria such as E. coli. Newborn human infants also demonstrate preferences for high sugar concentrations and prefer solutions that are sweeter than lactose, the sugar found in breast milk. Sweetness appears to have the highest taste recognition threshold, being detectable at around 1 part in 200 of sucrose in solution. By comparison, bitterness appears to have the lowest detection threshold, at about 1 part in 2 million for quinine in solution. In the natural settings in which humanity's primate ancestors evolved, sweetness intensity should indicate energy density, while bitterness tends to indicate toxicity. The high sweetness detection threshold and low bitterness detection threshold would have predisposed our primate ancestors to seek out sweet-tasting (and energy-dense) foods and to avoid bitter-tasting foods.
https://en.wikipedia.org/wiki/Dipping%20sauce
A dip or dip sauce is a common condiment for many types of food. Dips are used to add flavor or texture to a food, such as pita bread, dumplings, crackers, chopped raw vegetables, fruits, seafood, cubed pieces of meat and cheese, potato chips, tortilla chips, falafel, and sometimes even whole sandwiches in the case of jus. Unlike other sauces, instead of applying the sauce to the food, the food is typically placed or dipped into the sauce. Dips are commonly used for finger foods, appetizers, and other food types. Thick dips based on sour cream, crème fraîche, milk, yogurt, mayonnaise, soft cheese, or beans are a staple of American hors d'oeuvres and are thicker than spreads, which can be thinned to make dips. Celebrity chef Alton Brown suggests that a dip is defined by its ability to "maintain contact with its transport mechanism" while carried over white carpet. Dips in various forms are eaten all over the world, and people have been using sauces for dipping for thousands of years. List of dips Some types of dip include: Ajika, a spicy, subtly flavoured dip in Caucasian cuisine, based on hot red pepper, garlic, herbs and spices Ajvar, made from red bell peppers with garlic, found in Bosnian cuisine and Serbian cuisine Artichoke dip Au jus, a salty beef broth or gravy especially used for dipping french dip sandwiches Baba ghanoush, a dip made from eggplant, popular in the Eastern Mediterranean and parts of South Asia Bagna càuda, a regional dish of the Italian Piedmont Banana ketchup, a Filipino condiment made from bananas, used similarly to tomato ketchup Barbecue sauce, often used for grilled and fried meats in the United States Bean dip, a dip made from refried beans Blue cheese dressing, commonly used as a dip for raw vegetables or buffalo wings Buffalo sauce, often used both as a coating for Buffalo wings and as a standalone dipping sauce for other foods Cheese sauce Chile con queso, used in Tex-Mex cuisine with tortilla chips Chili oil
https://en.wikipedia.org/wiki/Yania%20Tierra
Yania Tierra is a Spanish-language documentary poem written by Aída Cartagena Portalatín and published in 1981 as Yania Tierra: Poema Documento (Yania Tierra: Document Poem). It traces the history of the Dominican Republic, beginning with the time of Columbus, using Yania Tierra as a viewpoint character. There is also a bilingual version of the poem. See also National personification Columbia
https://en.wikipedia.org/wiki/Billy%20Yank
Billy Yank or Billy Yankee is the personification of the United States soldier (volunteer or Regular) during the American Civil War. The latter part of the name is derived from Yankee, previously a term for New Englanders, and possibly deriving from a term for Dutch settlers of New Netherland before that, extended by American Southerners to refer to Americans from above the Mason-Dixon Line (and by the British to refer to anyone from the United States). Although little evidence exists to suggest that the name was used widely during the Civil War, unlike its rebel counterpart Johnny Reb, early 20th century political cartoonists introduced 'Billy Yank' to symbolize U.S. combatants in the American Civil War of the 1860s. Billy Yank is usually pictured wearing a regulation U.S. Army blue wool uniform that included the fatigue blouse, a light-weight wool coat with an inside pocket and four brass buttons on the front, with a kepi-style forage cap made of wool broadcloth with a rounded, flat top, cotton lining, and leather visor. See also Johnny Reb Johnny Reb & Billy Yank, 1905 novel Johnny Reb and Billy Yank (comic strip)
https://en.wikipedia.org/wiki/Stirling%20transform
In combinatorial mathematics, the Stirling transform of a sequence { a_n : n = 1, 2, 3, ... } of numbers is the sequence { b_n : n = 1, 2, 3, ... } given by $b_n = \sum_{k=1}^{n} S(n,k)\, a_k$, where $S(n,k)$ (with a capital S) is the Stirling number of the second kind, the number of partitions of a set of size n into k parts. The inverse transform is $a_n = \sum_{k=1}^{n} s(n,k)\, b_k$, where $s(n,k)$ (with a lower-case s) is a Stirling number of the first kind. Bernstein and Sloane state "If a_n is the number of objects in some class with points labeled 1, 2, ..., n (with all labels distinct, i.e. ordinary labeled structures), then b_n is the number of objects with points labeled 1, 2, ..., n (with repetitions allowed)." If $f(x) = \sum_{n \ge 1} a_n \frac{x^n}{n!}$ is a formal power series and $g(x) = \sum_{n \ge 1} b_n \frac{x^n}{n!}$, with a_n and b_n as above, then $g(x) = f(e^x - 1)$. Likewise, the inverse transform leads to the generating function identity $f(x) = g(\log(1 + x))$. See also Binomial transform Generating function transformation List of factorial and binomial topics
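As a concrete sketch of the transform (function names here are illustrative), the standard recurrences for both kinds of Stirling numbers give a direct implementation; applying it to the all-ones sequence yields the Bell numbers, and the inverse transform recovers the input.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def S2(n, k):
    """Stirling number of the second kind, via S(n,k) = k*S(n-1,k) + S(n-1,k-1)."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * S2(n - 1, k) + S2(n - 1, k - 1)

@lru_cache(maxsize=None)
def s1(n, k):
    """Signed Stirling number of the first kind, via s(n,k) = s(n-1,k-1) - (n-1)*s(n-1,k)."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return s1(n - 1, k - 1) - (n - 1) * s1(n - 1, k)

def stirling_transform(a):
    """b_n = sum_k S(n,k) * a_k, for a 1-indexed sequence given as a list."""
    return [sum(S2(n, k) * a[k - 1] for k in range(1, n + 1))
            for n in range(1, len(a) + 1)]

def inverse_stirling_transform(b):
    """a_n = sum_k s(n,k) * b_k, the inverse transform."""
    return [sum(s1(n, k) * b[k - 1] for k in range(1, n + 1))
            for n in range(1, len(b) + 1)]

ones = [1, 1, 1, 1, 1]
bells = stirling_transform(ones)            # Bell numbers: [1, 2, 5, 15, 52]
assert inverse_stirling_transform(bells) == ones
```

The round trip through both transforms checks, term by term, that the two kinds of Stirling numbers form inverse matrices.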
https://en.wikipedia.org/wiki/Gas%20generator
A gas generator is a device for generating gas. A gas generator may create gas by a chemical reaction or from a solid or liquid source, when storing a pressurized gas is undesirable or impractical. The term often refers to a device that uses a rocket propellant to generate large quantities of gas. The gas is typically used to drive a turbine rather than to provide thrust as in a rocket engine. Gas generators of this type are used to power turbopumps in rocket engines, in a gas-generator cycle, and by some auxiliary power units to power electric generators and hydraulic pumps. Another common use of the term is in the industrial gases industry, where gas generators are used to produce gaseous chemicals for sale; an example is the chemical oxygen generator, which delivers breathable oxygen at a controlled rate over a prolonged period. During World War II, portable gas generators that converted coke to producer gas were used to power vehicles as a way of alleviating petrol shortages. Other types include the gas generator in an automobile airbag, which is designed to rapidly produce a specific quantity of inert gas. Common applications As a power source The V-2 rocket used hydrogen peroxide decomposed by a liquid sodium permanganate catalyst solution as a gas generator. This was used to drive a turbopump that fed the main LOX-ethanol propellants. In the Saturn V F-1 and Space Shuttle main engine, some of the main propellant was burned to drive the turbopump (see gas-generator cycle and staged combustion cycle). The gas generator in these designs uses a highly fuel-rich mixture to keep flame temperatures relatively low. The Space Shuttle auxiliary power unit and the F-16 emergency power unit (EPU) use hydrazine as a fuel. The gas drives a turbine which drives hydraulic pumps; in the F-16 EPU it also drives an electric generator. Gas generators have also been used to power torpedoes. For example, the US Navy Mark 16 torpedo was powered by hydrogen peroxide.
https://en.wikipedia.org/wiki/Bonini%27s%20paradox
Bonini's paradox, named after Stanford business professor Charles Bonini, explains the difficulty in constructing models or simulations that fully capture the workings of complex systems (such as the human brain). Statements In modern discourse, the paradox was articulated by John M. Dutton and William H. Starbuck: "As a model of a complex system becomes more complete, it becomes less understandable. Alternatively, as a model grows more realistic, it also becomes just as difficult to understand as the real-world processes it represents." This paradox may be used by researchers to explain why complete models of the human brain and thinking processes have not been created and will undoubtedly remain difficult to create for years to come. The same paradox was observed earlier in a quote by philosopher-poet Paul Valéry (1871–1945): "Ce qui est simple est toujours faux. Ce qui ne l'est pas est inutilisable." ("What is simple is always false. What is not is unusable.") The same topic has also been discussed by Richard Levins in his classic essay "The Strategy of Model Building in Population Biology", which states that complex models have "too many parameters to measure, leading to analytically insoluble equations that would exceed the capacity of our computers, but the results would have no meaning for us even if they could be solved". Related issues Bonini's paradox can be seen as a case of the map–territory relation: simpler maps are less accurate though more useful representations of the territory. An extreme form is given in the fictional stories Sylvie and Bruno Concluded and "On Exactitude in Science", which imagine a map at a scale of 1:1 (the same size as the territory), which is precise but unusable, illustrating one extreme of Bonini's paradox. Isaac Asimov's fictional science of "psychohistory" in his Foundation series also faces this dilemma; Asimov even had one of his psychohistorians discuss the paradox.
https://en.wikipedia.org/wiki/Typhlosole
A typhlosole is an internal fold of the intestine or intestine inner wall. Typhlosoles occur in bivalve mollusks, lampreys and some annelids and echinoderms. In earthworms, it is a dorsal flap of the intestine that runs along most of its length, effectively forming a tube within a tube, and increasing the absorption area by that of its inner surface. Its function is to increase intestine surface area for more efficient absorption of digested nutrients. In different earthworm families, the typhlosole appears to have multiple origins. The Lumbricidae, for example, have a typhlosole which is an infolding of all layers of the intestine wall, whereas in some other families (e.g. Megascolecidae), it is an infolding of only the inner layer, and in many earthworms it is absent. See also body cavity (coelom)
https://en.wikipedia.org/wiki/Groin
In human anatomy, the groin, also known as the inguinal region or iliac region, is the junctional area between the torso and the thigh. The groin is at the front of the body on either side of the pubic tubercle, where the lower part of the abdominal wall meets the thigh. A fold or crease formed at this junction is known as the inguinal groove, or inguinal crease. This is also the area of the medial compartment of the thigh that contains attachments of the adductor muscles of the hip, or the groin muscles. The groin is a common site for a hernia. Gross anatomy Where the lower part of the anterior abdominal wall meets the thigh, a crease is formed, known as the inguinal groove or crease. The junction is the area of the medial compartment of the thigh that contains the attachments of the adductor muscles of the hip, also known as the groin muscles. The adductor muscles that make up the groin consist of the adductor brevis, adductor longus, adductor magnus, gracilis, and pectineus. These groin muscles adduct the thigh (bring the thigh and knee closer to the midline). The groin is innervated by branches of the lumbar plexus; the pectineus muscle is innervated by the femoral nerve, and the hamstring portion of the adductor magnus is innervated by the tibial nerve. In the groin, underneath the skin, there are three to five deep inguinal lymph nodes that play a role in the immune system. These can be swollen due to certain diseases, the most common cause being a simple infection and, less likely, cancer. A chain of superficial inguinal lymph nodes drains to the deep nodes. There are two depressions called fossae in an area called the inguinal triangle: the lateral inguinal fossa and the medial inguinal fossa. The inguinal ligament runs from the pubic tubercle to the anterior superior iliac spine, and its anatomy is very important for hernia operations. Clinical significance A pulled groin muscle usually refers to a painful strain of the hip adductor muscles. An inguinal hernia is a protrusion of abdominal contents through the inguinal canal.
https://en.wikipedia.org/wiki/Ralink
Ralink Technology, Corp. is a Wi-Fi chipset manufacturer mainly known for its IEEE 802.11 (wireless LAN) chipsets. Ralink was founded in 2001 in Cupertino, California, and later moved its headquarters to Hsinchu, Taiwan. On 5 May 2011, Ralink was acquired by MediaTek. Some of Ralink's 802.11n RT2800 chipsets were accepted into the Wi-Fi Alliance 802.11n draft 2.0 core technology testbed. They were also selected for the Wi-Fi Protected Setup (WPS) and Wireless Multimedia Extensions Power Save (WMM-PS) testbeds. Ralink was a participant in the Wi-Fi Alliance and the IEEE 802.11 standards committees. Ralink chipsets are used in various consumer-grade routers made by Gigabyte Technology, Linksys, D-Link, Asus and Belkin, as well as in Wi-Fi adapters for USB, PCI, ExpressCard, PC Card, and PCI Express interfaces. An example of such an adapter is the Nintendo Wi-Fi USB Connector, which uses the Ralink RT2570 chipset to allow a Nintendo DS or Wii to be internetworked via a home computer. Operating systems support Ralink provides some documentation without requiring a non-disclosure agreement. This includes datasheets for its PCI and PCIe chipsets, but does not yet include documentation for its systems on a chip used in wireless routers. Linux Drivers for MediaTek Ralink wireless network interface controllers were mainlined into Linux kernel version 2.6.24. (See Comparison of open-source wireless drivers.) Ralink provides GNU General Public License-licensed (GPL) drivers for the Linux kernel. While Linux drivers for the older RT2500 chipsets are no longer updated by Ralink, they are now maintained by Serialmonkey's rt2x00 project. Current Ralink chipsets require a firmware image to be loaded; Ralink allows the use and redistribution of this firmware, but does not allow its modification. In February 2011, Greg Kroah-Hartman praised Ralink for its change in attitude towards the Linux kernel developer community. See also List of companies of Taiwan
https://en.wikipedia.org/wiki/Mills%27%20constant
In number theory, Mills' constant is defined as the smallest positive real number A such that the floor of the double exponential expression, $\lfloor A^{3^n} \rfloor$, is a prime number for all positive natural numbers n. The constant is named after William Harold Mills, who proved in 1947 the existence of A based on results of Guido Hoheisel and Albert Ingham on prime gaps. Its exact value is unknown, but if the Riemann hypothesis is true, it is approximately 1.3063778838630806904686144926... . Mills primes The primes generated by Mills' constant are known as Mills primes; if the Riemann hypothesis is true, the sequence begins 2, 11, 1361, 2521008887, ... . If $a_i$ denotes the i-th prime in this sequence, then $a_i$ can be calculated as the smallest prime number larger than $a_{i-1}^3$. In order to ensure that rounding $A^{3^n}$, for n = 1, 2, 3, …, produces this sequence of primes, it must be the case that $a_{i+1} < (a_i + 1)^3$. The Hoheisel–Ingham results guarantee that there exists a prime between any two sufficiently large cube numbers, which is sufficient to prove this inequality if we start from a sufficiently large first prime $a_1$. The Riemann hypothesis implies that there exists a prime between any two consecutive cubes, allowing the "sufficiently large" condition to be removed and the sequence of Mills primes to begin at $a_1 = 2$. It is also known unconditionally that for all a above some explicit (but astronomically large) bound, there is at least one prime between $a^3$ and $(a+1)^3$. That bound is much too large to be practical, as it is infeasible to check every number below it; however, the value of Mills' constant can be verified by calculating the first prime in the sequence that is greater than that figure. As of April 2017, the 11th number in the sequence is the largest one that has been proved prime; it has 20562 digits. The largest known Mills probable prime (under the Riemann hypothesis) is 555,154 digits long. Numerical calculation By calculating the sequence of Mills primes, one can approximate Mills' constant as $A \approx a_n^{1/3^n}$. Caldwell and Cheng used this method to compute 6850 base-10 digits of Mills' constant.
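Under the assumption that the Riemann hypothesis holds (so the sequence may start at a_1 = 2), the first few Mills primes and the resulting approximation of A can be computed with a short sketch; the helper names are illustrative, and trial division suffices at this small scale.

```python
def is_prime(n: int) -> bool:
    """Deterministic trial division; adequate for the small numbers used here."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def mills_primes(count: int):
    """First `count` Mills primes: each is the smallest prime > (previous)^3."""
    primes = [2]                    # a_1 = 2, valid assuming the Riemann hypothesis
    while len(primes) < count:
        q = primes[-1] ** 3 + 1     # search upward from the cube of the last term
        while not is_prime(q):
            q += 1
        primes.append(q)
    return primes[:count]

seq = mills_primes(4)               # [2, 11, 1361, 2521008887]
A = seq[-1] ** (3.0 ** -4)          # a_n^(1/3^n) approximates Mills' constant
print(seq)
print(A)                            # ≈ 1.30637788...
```

Each extra term cubes the size of the numbers involved, which is why proving primality of later Mills primes (20562 digits by the 11th term) requires far heavier machinery than this sketch.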
https://en.wikipedia.org/wiki/ZTE
ZTE Corporation is a Chinese partially state-owned technology company that specializes in telecommunications. Founded in 1985, ZTE is listed on both the Hong Kong and Shenzhen stock exchanges. ZTE's core business is wireless, exchange, optical transmission, and data telecommunications gear, telecommunications software, and mobile phones. ZTE primarily sells products under its own name, but it is also an OEM. The company has faced criticism in the United States, India, and Sweden over ties to the Chinese government that could enable mass surveillance. In 2017, ZTE was fined for illegally exporting U.S. technology to Iran and North Korea in violation of economic sanctions. In April 2018, after the company failed to properly reprimand the employees involved, the U.S. Department of Commerce banned U.S. companies from exporting components such as semiconductors to ZTE for seven years. The ban was lifted in July 2018 after ZTE replaced its senior management, agreed to pay additional fines, and agreed to maintain an internal compliance team for 10 years. In June 2020, the Federal Communications Commission (FCC) designated ZTE a national security threat. In 2023, the European Commission banned ZTE from providing telecommunications services to the Commission. History ZTE was initially founded as Zhongxing Semiconductor Co., Ltd in Shenzhen, Guangdong province, in 1985, incorporated by a group of investors associated with China's Ministry of Aerospace Industry. In March 1993, Zhongxing Semiconductor changed its name to Zhongxing New Telecommunications Equipment Co., Ltd, with capital of RMB 3 million, and created a new business model as a "state-owned and private-operating" economic entity. ZTE made an initial public offering (IPO) on the Shenzhen stock exchange in 1997 and another on the Hong Kong stock exchange in December 2004. While the company initially profited from domestic sales, it vowed to use the proceeds of its 2004 Hong Kong IPO to further expand R&D, overseas sales to developed nations, and overseas production.
https://en.wikipedia.org/wiki/Semiconductor%20intellectual%20property%20core
In electronic design, a semiconductor intellectual property core (SIP core), IP core, or IP block is a reusable unit of logic, cell, or integrated circuit layout design that is the intellectual property of one party. IP cores can be licensed to another party or owned and used by a single party. The term comes from the licensing of the patent or source code copyright that exists in the design. Designers of system on chip (SoC), application-specific integrated circuits (ASIC) and systems of field-programmable gate array (FPGA) logic can use IP cores as building blocks. History The licensing and use of IP cores in chip design came into common practice in the 1990s. There were many licensors and also many foundries competing on the market. In 2013, the most widely licensed IP cores were from Arm Holdings (43.2% market share), Synopsys Inc. (13.9% market share), Imagination Technologies (9% market share) and Cadence Design Systems (5.1% market share). Types of IP cores The use of an IP core in chip design is comparable to the use of a library for computer programming or a discrete integrated circuit component for printed circuit board design. Each is a reusable component of design logic with a defined interface and behavior that has been verified by its creator and is integrated into a larger design. Soft cores IP cores are commonly offered as synthesizable RTL in a hardware description language such as Verilog or VHDL. These are analogous to high-level languages such as C in the field of computer programming. IP cores delivered to chip designers as RTL permit chip designers to modify designs at the functional level, though many IP vendors offer no warranty or support for modified designs. IP cores are also sometimes offered as generic gate-level netlists. The netlist is a Boolean-algebra representation of the IP's logical function implemented as generic gates or process-specific standard cells. An IP core implemented as generic gates can be compiled for any proce
https://en.wikipedia.org/wiki/Lipid%20A
Lipid A is a lipid component of an endotoxin held responsible for the toxicity of gram-negative bacteria. It is the innermost of the three regions of the lipopolysaccharide (LPS), also called endotoxin molecule, and its hydrophobic nature allows it to anchor the LPS to the outer membrane. While its toxic effects can be damaging, the sensing of lipid A by the immune system may also be critical for the onset of immune responses to gram-negative infection, and for the subsequent successful fight against the infection. Chemical composition Lipid A consists of two glucosamine (aminosugar) units, in a β(1→6) linkage, with attached acyl chains ("fatty acids"), and normally containing one phosphate group on each carbohydrate. The optimal immune-activating lipid A structure is believed to contain 6 acyl chains. Four acyl chains attached directly to the glucosamine sugars are beta hydroxy acyl chains usually between 10 and 16 carbons in length. Two additional acyl chains are often attached to the beta hydroxy group. E. coli lipid A, as an example, typically has four C14 hydroxy acyl chains attached to the sugars and one C14 and one C12 attached to the beta hydroxy groups. The biosynthetic pathway for lipid A in E. coli has been determined over more than 32 years by the work of Christian R. H. Raetz. Lipid A structure and effects on eukaryotic cells have been determined and examined, among others, by the groups of Otto Westphal, Chris Galanos, Ernst T. Rietschel and Hajime Takahashi, beginning in the 1960s (Gmeiner, Luederitz, Westphal. Eur J Biochem 1969)(Kamio&Takahashi J Biochem 1971)(Luederitz, Galanos et al., J Infect Dis 1973). Biosynthesis The enzymes involved in lipid A synthesis are conserved among Pseudomonas aeruginosa, Escherichia coli, Bordetella bronchiseptica, and Salmonella. Inhibition and activation of immune response Many of the immune-activating abilities of LPS can be attributed to the lipid A unit. It is a very potent stimulant of the immune sys
https://en.wikipedia.org/wiki/Helen%20%28unit%29
A helen is a humorous unit of measurement based on the concept that Helen of Troy had a "face that launched a thousand ships". The helen is thus used to measure quantities of beauty in terms of the theoretical action that could be accomplished by the wielder of such beauty. Origin The classic reference to Helen's beauty is Marlowe's lines from the 1592 play The Tragical History of Doctor Faustus, "Was this the face that launch'd a thousand ships / And burnt the topless towers of Ilium?" In the tradition of humorous pseudounits, then, 1 millihelen is the amount of beauty needed to launch a single ship. In his 1992 collection of jokes and limericks, Isaac Asimov claimed to have invented the term in the 1940s as a graduate student. In a 1958 letter to the New Scientist, R.C. Winton proposed the millihelen as the amount of beauty required to launch one ship. In response, P. Lockwood noted that the unit had been independently proposed by Edgar J. Westbury and extended by the pair to negative values, where −1 millihelen was the amount of ugliness required to sink a battleship. The earliest known print citation is found in Punch magazine dated 23 June 1954 and attributed to an unnamed "professor of natural philosophy". Derived units The Catalogue of Ships from Book II of The Iliad, which describes in detail the commanders who came to fight for Helen and the ships they brought with them, details a total of 1,186 ships which came to fight the Trojan War. As such, Helen herself has a beauty rating of 1.186 helens, capable of launching more than one thousand ships. The "system" has been expanded by some writers, such as conceiving of negative values as measures of the ugliness required to beach a thousand ships. Writing a humorous article about the concept, David Goines considered a range of metric prefixes to the unit, ranging from the attohelen (ah) which could merely "light up a Lucky while strolling past a shipyard", to the terahelen (Th) able to "launch the equival
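The unit conversions above are trivial but lend themselves to a small worked example. The sketch below simply encodes the definition from the text (1 millihelen launches one ship, so 1,000 ships per helen); the function name is my own invention:

```python
SHIPS_PER_HELEN = 1000  # by definition: 1 millihelen launches a single ship

def helens(ships):
    """Beauty, in helens, required to launch the given number of ships."""
    return ships / SHIPS_PER_HELEN

# Homer's Catalogue of Ships totals 1,186 vessels, giving Helen's own rating:
print(helens(1186))   # 1.186
print(helens(1))      # 0.001, i.e. one millihelen
```

Negative inputs extend the scale the same way the text describes, with −0.001 helen being the ugliness required to sink (or beach) a single ship.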
https://en.wikipedia.org/wiki/Leonhard%20Seppala
Leonhard "Sepp" Seppala (September 14, 1877 – January 28, 1967) was a Norwegian-Finnish-American sled dog breeder, trainer and musher who with his dogs played a pivotal role in the 1925 serum run to Nome, and participated in the 1932 Winter Olympics. Seppala introduced the work dogs used by Native Siberians at the time to the American public; the breed came to be known as the Siberian Husky in the English-speaking world. The Leonhard Seppala Humanitarian Award, which honors excellence in sled dog care, is named in his honour. Background Seppala was born in Lyngen, Troms og Finnmark, Northern Norway. He was the eldest child of Isak Isaksen Seppälä (born in Sweden of Finnish descent) and Anne Henrikke Henriksdatter. His father's family name is of Finnish origin. Leonhard is considered to have been Kven. When Seppala was two years old, his family moved within Troms county to nearby Skjervøy municipality on the island of Skjervøya. While in Skjervøy, his father worked as a blacksmith and fisherman, building up a relatively large estate. Seppala initially followed in his father's footsteps as both a blacksmith and a fisherman. However, in 1900, he emigrated to Alaska during the Nome gold rush. His friend Jafet Lindeberg had returned from Alaska and convinced Seppala to come to work for his mining company in Nome. He became a naturalized citizen in 1906. During his first winter in Alaska, Seppala became a dogsled driver for Lindeberg's company. He enjoyed the task from his first run, which he recalled clearly for the rest of his life. He expressed pleasure in the rhythmic patter of the dogs' feet and the feeling of the sled gliding along the snow. While most drivers considered a long run, Seppala travelled between and most days. This also meant he worked as long as 12 hours a day. He kept his dogs in form during the summer by having them pull a cart on wheels instead of a sled. It was unusual at that time to keep sled dogs working when the snow thawed, or to
https://en.wikipedia.org/wiki/Eb/N0
In digital communication or data transmission, Eb/N0 (energy per bit to noise power spectral density ratio) is a normalized signal-to-noise ratio (SNR) measure, also known as the "SNR per bit". It is especially useful when comparing the bit error rate (BER) performance of different digital modulation schemes without taking bandwidth into account. As the description implies, Eb is the signal energy associated with each user data bit; it is equal to the signal power divided by the user bit rate (not the channel symbol rate). If signal power is in watts and bit rate is in bits per second, Eb is in units of joules (watt-seconds). N0 is the noise spectral density, the noise power in a 1 Hz bandwidth, measured in watts per hertz or joules. These are the same units as Eb, so the ratio Eb/N0 is dimensionless; it is frequently expressed in decibels. Eb/N0 directly indicates the power efficiency of the system without regard to modulation type, error correction coding or signal bandwidth (including any use of spread spectrum). This also avoids any confusion as to which of several definitions of "bandwidth" to apply to the signal. But when the signal bandwidth is well defined, Eb/N0 is also equal to the signal-to-noise ratio (SNR) in that bandwidth divided by the "gross" link spectral efficiency in bit/s⋅Hz, where the bits in this context again refer to user data bits, irrespective of error correction information and modulation type. Eb/N0 must be used with care on interference-limited channels since additive white noise (with constant noise density N0) is assumed, and interference is not always noise-like. In spread spectrum systems (e.g., CDMA), the interference is sufficiently noise-like that it can be represented as I0 and added to the thermal noise N0 to produce the overall ratio Eb/(N0 + I0). Relation to carrier-to-noise ratio Eb/N0 is closely related to the carrier-to-noise ratio (CNR or C/N), i.e. the signal-to-noise ratio (SNR) of the received signal, after the receiver filter but before detection
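The two relations described above (Eb = signal power / bit rate, and Eb/N0 = SNR divided by gross spectral efficiency) can be sketched numerically. This is a minimal illustration; the power, bit rate, and noise-density figures are assumed example values, not from the text:

```python
from math import log10

def ebn0_from_power(signal_power_w, bit_rate_bps, n0_w_per_hz):
    """Eb/N0 (linear): energy per bit Eb = S / R, divided by noise density N0."""
    eb = signal_power_w / bit_rate_bps   # joules per bit
    return eb / n0_w_per_hz              # dimensionless ratio

def ebn0_db_from_snr(snr_db, spectral_efficiency_bps_per_hz):
    """Eb/N0 in dB from the SNR in the signal bandwidth and the gross
    link spectral efficiency in bit/s/Hz."""
    snr_linear = 10 ** (snr_db / 10)
    return 10 * log10(snr_linear / spectral_efficiency_bps_per_hz)

# Illustrative numbers: 1 mW signal, 1 Mbit/s, N0 = 1e-12 W/Hz
ratio = ebn0_from_power(1e-3, 1e6, 1e-12)
print(10 * log10(ratio))        # 30.0 dB

# A 10 dB SNR link running at 2 bit/s/Hz:
print(ebn0_db_from_snr(10.0, 2.0))   # about 6.99 dB
```

Note that both routes give the same quantity only when the SNR is measured in the same bandwidth over which the spectral efficiency is defined.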
https://en.wikipedia.org/wiki/Mark%20Ridley%20%28zoologist%29
Mark Ridley (born 1956) is a British zoologist and writer on evolution. He studied at both Oxford and Cambridge in the 1980s (his doctoral advisor being Richard Dawkins), and later worked at Emory University. He worked as a research assistant at the Department of Zoology, Oxford University. Ridley has worked on the evolution of reproductive behaviour and written a number of popular accounts of evolutionary biology, including articles for the New York Times, The Sunday Times, Nature, New Scientist and The Times Literary Supplement. He is sometimes confused with Matt Ridley, another writer on evolution who is also from the UK. Published works
Evolution (1993), Blackwell; 2nd ed 1996, Blackwell; 3rd ed 2003, Wiley. A comprehensive textbook: case studies, commentary, dedicated website and CD.
Mendel's Demon: Gene Justice and the Complexity of Life (2001); released in the US as The Cooperative Gene: How Mendel's Demon Explains the Evolution of Complex Beings (2001)
Animal Behavior: An Introduction to Behavioral Mechanisms, Development, and Ecology (1995)
The Problems of Evolution (1985)
The Darwin Reader (2nd edition, 1996)
How to Read Darwin (2006)
Evolution and Classification: The Reformation of Cladism (1986)
Narrow Roads of Gene Land (with W. D. Hamilton) (2006)
The Explanation of Organic Diversity: The Comparative Method and Adaptations for Mating (1983)
Oxford Surveys in Evolutionary Biology (with Richard Dawkins): Vol. 2 (1986), Vol. 3 (1987)
Animal Behaviour: A Concise Introduction (1995)
Richard Dawkins: How a Scientist Changed the Way We Think (editor, with Alan Grafen) (2007)
https://en.wikipedia.org/wiki/Technetium-99m%20generator
A technetium-99m generator, or colloquially a technetium cow or moly cow, is a device used to extract the metastable isotope 99mTc of technetium from a decaying sample of molybdenum-99. 99Mo has a half-life of 66 hours and can be easily transported over long distances to hospitals where its decay product technetium-99m (with a half-life of only 6 hours, inconvenient for transport) is extracted and used for a variety of nuclear medicine diagnostic procedures, where its short half-life is very useful. Parent isotope source 99Mo can be obtained by the neutron activation (n,γ reaction) of 98Mo in a high neutron flux reactor. However, the most frequently used method is through fission of uranium-235 in a nuclear reactor. While most reactors currently engaged in 99Mo production use highly enriched uranium-235 targets, proliferation concerns have prompted some producers to transition to low-enriched uranium targets. The target is irradiated with neutrons to form 99Mo as a fission product (with 6.1% yield). Molybdenum-99 is then separated from unreacted uranium and other fission products in a hot cell. Generator invention and history 99mTc remained a scientific curiosity until the 1950s when Powell Richards realized the potential of technetium-99m as a medical radiotracer and promoted its use among the medical community. While Richards was in charge of the radioisotope production at the Hot Lab Division of the Brookhaven National Laboratory, Walter Tucker and Margaret Greene were working on improving the separation purity of the short-lived eluted daughter product iodine-132 from tellurium-132, its 3.2-day parent, produced in the Brookhaven Graphite Research Reactor. They detected a trace contaminant which proved to be 99mTc, which was coming from 99Mo and was following tellurium in the chemistry of the separation process for other fission products. Based on the similarities between the chemistry of the tellurium-iodine parent-daughter pair, Tucker and Green
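The parent-daughter decay described above (66-hour 99Mo feeding 6-hour 99mTc) follows the standard Bateman equation for a freshly eluted, initially technetium-free generator. The sketch below uses the half-lives given in the text; the branching fraction (the share of 99Mo decays that actually feed 99mTc, here 0.875) is an assumed literature value, not from the text:

```python
from math import exp, log

T_HALF_MO = 66.0   # hours, 99Mo (from the text)
T_HALF_TC = 6.0    # hours, 99mTc (from the text)
LAM_MO = log(2) / T_HALF_MO
LAM_TC = log(2) / T_HALF_TC

def tc99m_activity(a_mo_initial, t_hours, branching=0.875):
    """Daughter (99mTc) activity via the Bateman equation, assuming the
    generator starts with no 99mTc (e.g. right after elution).
    branching ~0.875 is an assumed value for the 99Mo -> 99mTc fraction."""
    return (branching * a_mo_initial * LAM_TC / (LAM_TC - LAM_MO)
            * (exp(-LAM_MO * t_hours) - exp(-LAM_TC * t_hours)))

# 99mTc builds up after elution and peaks roughly a day later:
for t in (6, 12, 24, 48):
    print(t, "h:", round(tc99m_activity(1.0, t), 3))
```

With these half-lives the 99mTc activity peaks near t = ln(λTc/λMo)/(λTc − λMo) ≈ 23 hours, which is why generators are typically "milked" about once a day.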
https://en.wikipedia.org/wiki/Stomachic
Stomachic is a historic term for a medicine that serves to tone the stomach, improving its function and increasing appetite. While many herbal remedies claim stomachic effects, modern pharmacology does not have an equivalent term for this type of action. Herbs with putative stomachic effects include:
Agrimony
Aloe
Anise
Avens (Geum urbanum)
Barberry
Bitterwood (Picrasma excelsa)
Cannabis
Cayenne
Centaurium
Cleome
Colombo (herb) (Frasera carolinensis)
Dandelion
Elecampane
Ginseng
Goldenseal
Grewia asiatica (Phalsa or Falsa)
Hops
Holy thistle
Juniper berry
Mint
Mugwort
Oregano
Peach bark
Rhubarb
White mustard seeds
Rose hips
Rue
Sweet flag (Acorus calamus)
Wormwood (Artemisia absinthium)
The purported stomachic mechanism of action of these substances is to stimulate the appetite by increasing the gastric secretions of the stomach; however, the actual therapeutic value of some of these compounds is dubious. Some other important agents used are:
Bitters: used to stimulate the taste buds, thus producing reflex secretion of gastric juices. Quassia, Aristolochia, gentian, and chirata are commonly used.
Alcohol: increases gastric secretion by direct action and also by the reflex stimulation of taste buds.
Miscellaneous compounds: including insulin, which increases gastric secretion by producing hypoglycemia, and histamine, which produces direct stimulation of the gastric glands.
https://en.wikipedia.org/wiki/Tantrum
A tantrum, temper tantrum, lash out, meltdown, fit or hissy fit is an emotional outburst, usually associated with those in emotional distress, that is typically characterized by stubbornness, crying, screaming, violence, defiance, angry ranting, a resistance to attempts at pacification, and, in some cases, hitting and other physically violent behavior. Physical control may be lost; the person may be unable to remain still; and even if the "goal" of the person is met, they may not be calmed. For older school-age children, throwing a temper tantrum can lead to detention or suspension from school. A tantrum may be expressed in a tirade: a protracted, angry speech. In early childhood Tantrums are one of the most common forms of problematic behavior in young children but tend to decrease in frequency and intensity as the child gets older. For a toddler, tantrums can be considered as normal, and even as gauges of developing strength of character. While tantrums are sometimes seen as a predictor of future anti-social behaviour, in another sense they are simply an age-appropriate sign of excessive frustration, and will diminish over time given calm and consistent handling. Parental containment where a child cannot contain themself—rather than what the child is ostensibly demanding—may be what is really required. Selma Fraiberg warned against "too much pressure or forceful methods of control from the outside" in child-rearing: "if we turn every instance of pants changing, treasure hunting, napping, puddle wading and garbage distribution into a governmental crisis we can easily bring on fierce defiance, tantrums, and all the fireworks of revolt in the nursery". Intellectual and developmental disorders Some people who have developmental disorders such as autism, Asperger syndrome, ADHD, and intellectual disability could be more vulnerable to tantrums than others, although anyone experiencing brain damage (temporary or permanent) can suffer from tantr
https://en.wikipedia.org/wiki/Canada/USA%20Mathcamp
Canada/USA Mathcamp is a five-week academic summer program for middle and high school students in mathematics. Mathcamp was founded in 1993 by Dr. George Thomas, who believed that students interested in mathematics frequently lacked the resources and camaraderie to pursue their interest. Mira Bernstein became the director when Thomas left in 2002 to found MathPath, a program for younger students. Mathcamp is held each year at a college campus in the United States or Canada. Past locations have included the University of Toronto, the University of Washington, Colorado College, Reed College, University of Puget Sound, Colby College, the University of British Columbia, Mount Holyoke College, and the Colorado School of Mines. Mathcamp enrolls about 120 students yearly, 45–55 returning and 65–75 new. The application process for new students includes an entrance exam (the "Qualifying Quiz"), personal essay, and two letters of recommendation, but no grade reports. The process is intended to ensure that the students who are most passionate about math come to camp. Admission is selective: in 2016, the acceptance rate was 15%. Mathcamp courses cover various branches of recreational and college-level mathematics. Classes at Mathcamp come in four difficulty levels. The easier classes often include basic proof techniques, number theory, graph theory, and combinatorial game theory, while the more difficult classes cover advanced topics in abstract algebra, topology, theoretical computer science, category theory, and mathematical analysis. There are generally four class periods each day and five classes offered during each period intended for varying student interests and backgrounds. Graduate student mentors teach most of the classes, while undergraduate junior counselors, all of them Mathcamp alumni, do most of the behind-the-scenes work. Mathcamp has had a number of renowned guest speakers, including John Conway, Avi Wigderson, and Serge Lang. Culture In 2004, some campe
https://en.wikipedia.org/wiki/Photoacoustic%20spectroscopy
Photoacoustic spectroscopy is the measurement of the effect of absorbed electromagnetic energy (particularly of light) on matter by means of acoustic detection. The discovery of the photoacoustic effect dates to 1880 when Alexander Graham Bell showed that thin discs emitted sound when exposed to a beam of sunlight that was rapidly interrupted with a rotating slotted disk. The absorbed energy from the light causes local heating, generating a thermal expansion which creates a pressure wave or sound. Later Bell showed that materials exposed to the non-visible portions of the solar spectrum (i.e., the infrared and the ultraviolet) can also produce sounds. A photoacoustic spectrum of a sample can be recorded by measuring the sound at different wavelengths of the light. This spectrum can be used to identify the absorbing components of the sample. The photoacoustic effect can be used to study solids, liquids and gases. Uses and techniques Photoacoustic spectroscopy has become a powerful technique to study concentrations of gases at the part per billion or even part per trillion levels. Modern photoacoustic detectors still rely on the same principles as Bell's apparatus; however, to increase the sensitivity, several modifications have been made. Instead of sunlight, intense lasers are used to illuminate the sample, since the intensity of the generated sound is proportional to the light intensity; this technique is referred to as laser photoacoustic spectroscopy (LPAS). The ear has been replaced by sensitive microphones. The microphone signals are further amplified and detected using lock-in amplifiers. By enclosing the gaseous sample in a cylindrical chamber, the sound signal is amplified by tuning the modulation frequency to an acoustic resonance of the sample cell. Cantilever-enhanced photoacoustic spectroscopy improves sensitivity still further, enabling reliable monitoring of gases at the ppb level. Example The following example illustrates the potentia
https://en.wikipedia.org/wiki/Animal%20sexual%20behaviour
Animal sexual behaviour takes many different forms, including within the same species. Common mating or reproductively motivated systems include monogamy, polygyny, polyandry, polygamy and promiscuity. Other sexual behaviour may be reproductively motivated (e.g. sex apparently due to duress or coercion and situational sexual behaviour) or non-reproductively motivated (e.g. homosexual sexual behaviour, bisexual sexual behaviour, cross-species sex, sexual arousal from objects or places, sex with dead animals, etc.). When animal sexual behaviour is reproductively motivated, it is often termed mating or copulation; for most non-human mammals, mating and copulation occur at oestrus (the most fertile period in the mammalian female's reproductive cycle), which increases the chances of successful impregnation. Some animal sexual behaviour involves competition, sometimes fighting, between multiple males. Females often select males for mating only if they appear strong and able to protect themselves. The male that wins a fight may also have the chance to mate with a larger number of females and will therefore pass on his genes to their offspring. Historically, it was believed that only humans and a small number of other species performed sexual acts other than for reproduction, and that animals' sexuality was instinctive and a simple "stimulus-response" behaviour. However, in addition to homosexual behaviours, a range of species masturbate and may use objects as tools to help them do so. Sexual behaviour may be tied more strongly to the establishment and maintenance of complex social bonds across a population which support its success in non-reproductive ways. Both reproductive and non-reproductive behaviours can be related to expressions of dominance over another animal or survival within a stressful situation (such as sex due to duress or coercion). Mating systems In sociobiology and behavioural ecology, the term "mating system" is used to describe the ways in which anim
https://en.wikipedia.org/wiki/List%20of%20version-control%20software
This is a list of notable software for version control.
Local data model
In the local-only approach, all developers must use the same file system.
Open source
Revision Control System (RCS) – stores the latest version and backward deltas for fastest access to the trunk tip compared to SCCS, and an improved user interface, at the cost of slow branch tip access and missing support for included/excluded deltas.
Source Code Control System (SCCS) – part of UNIX; based on interleaved deltas, can construct versions as arbitrary sets of revisions. Extracting an arbitrary version takes essentially the same time regardless of the version, making SCCS more useful in environments that rely heavily on branching and merging with multiple "current" and identical versions.
Proprietary
The Librarian – around since 1969, source control for IBM mainframe computers; from Applied Data Research, later acquired by Computer Associates.
Panvalet – around since the 1970s, source and object control for IBM mainframe computers.
Client–server model
In the client–server model, developers use one shared repository.
Open source
Concurrent Versions System (CVS) – originally built on RCS, licensed under the GPL.
CVSNT – cross-platform port of CVS that allows case-insensitive file names, among other changes.
OpenCVS – unreleased CVS clone under a BSD license, emphasising security and source code correctness.
Subversion (SVN) – versioning control system inspired by CVS.
Vesta – build system with a versioning file system and support for distributed repositories.
Proprietary
AccuRev – source configuration management tool with integrated issue tracking based on "Streams" that efficiently manages parallel and global development; a replication server is also available. Now owned by Micro Focus.
Autodesk Vault – version control tool specifically designed for Autodesk applications, managing the complex relationships between design files such as AutoCAD and Autodesk Inventor.
CADES – designer productivity and version contr
https://en.wikipedia.org/wiki/Histiocyte
A histiocyte is a vertebrate cell that is part of the mononuclear phagocyte system (also known as the reticuloendothelial system or lymphoreticular system). The mononuclear phagocytic system is part of the organism's immune system. The histiocyte is a tissue macrophage or a dendritic cell (histio, diminutive of histo, meaning tissue, and cyte, meaning cell). Among their functions is clearing out neutrophils that have reached the end of their lifespan. Development Histiocytes are derived from the bone marrow by multiplication from a stem cell. The derived cells migrate from the bone marrow to the blood as monocytes. They circulate through the body and enter various organs, where they undergo differentiation into histiocytes, which are part of the mononuclear phagocytic system (MPS). However, the term histiocyte has been used for multiple purposes in the past, and some cells called "histiocytes" do not appear to derive from monocytic-macrophage lines. The term histiocyte can also simply refer to a cell of monocyte origin outside the blood system, such as in a tissue (as in rheumatoid arthritis, where palisading histiocytes surround the fibrinoid necrosis of rheumatoid nodules). Some sources consider Langerhans cell derivatives to be histiocytes. Langerhans cell histiocytosis embeds this interpretation into its name. Structure Histiocytes have common histological and immunophenotypical characteristics (demonstrated by immunostains). Their cytoplasm is eosinophilic and contains variable amounts of lysosomes. They bear membrane receptors for opsonins, such as IgG and the fragment C3b of complement. They express LCAs (leucocyte common antigens) CD45, CD14, CD33, and CD4 (also expressed by T helper cells). Macrophages and dendritic cells These histiocytes are part of the immune system by way of two distinct functions: phagocytosis and antigen presentation. Phagocytosis is the main process of macrophages and antigen presentation the main property of dendritic cel
https://en.wikipedia.org/wiki/Falcarinol
Falcarinol (also known as carotatoxin or panaxynol) is a natural pesticide and fatty alcohol found in carrots (Daucus carota), red ginseng (Panax ginseng) and ivy. In carrots, it occurs in a concentration of approximately 2 mg/kg. As a toxin, it protects roots from fungal diseases, such as liquorice rot, which causes black spots on the roots during storage. The compound is best stored frozen, since it is sensitive to light and heat. Chemistry Falcarinol is a polyyne with two carbon-carbon triple bonds and two double bonds. The double bond at the carbon 9 position has cis stereochemistry; it was introduced by desaturation, which requires oxygen and NADPH (or NADH) cofactors, and it creates a bend in the molecule that prevents the fatty acid from solidifying in oils and cellular membranes. It is structurally related to oenanthotoxin and cicutoxin. Biological effects Falcarinol is an irritant that can cause allergic reactions and contact dermatitis. It was shown that falcarinol acts as a covalent cannabinoid receptor type 1 inverse agonist and blocks the effect of anandamide in keratinocytes, leading to pro-allergic effects in human skin. Normal consumption of carrots has no toxic effect in humans. Biosynthesis Starting with oleic acid (1), which possesses a cis double bond at the carbon 9 position from desaturation and is bound to phospholipids (-PL), a bifunctional desaturase/acetylenase system acts with oxygen (a) to introduce a second cis double bond at the carbon 12 position, forming linoleic acid (2). This step is then repeated to turn the cis double bond at the carbon 12 position into a triple bond (also called an acetylenic bond), forming crepenynic acid (3). Crepenynic acid reacts with oxygen (b) to form a second cis double bond at the carbon 14 position (conjugated position), leading to the formation of dehydrocrepenynic acid (4). Allylic isomerization (c) is responsible for the change from the cis double bond at the carbon 14 po
https://en.wikipedia.org/wiki/Active%20shutter%203D%20system
An active shutter 3D system (a.k.a. alternate frame sequencing, alternate image, AI, alternating field, field sequential or eclipse method) is a technique of displaying stereoscopic 3D images. It works by only presenting the image intended for the left eye while blocking the right eye's view, then presenting the right-eye image while blocking the left eye, and repeating this so rapidly that the interruptions do not interfere with the perceived fusion of the two images into a single 3D image. Modern active shutter 3D systems generally use liquid crystal shutter glasses (also called "LC shutter glasses" or "active shutter glasses"). Each eye's glass contains a liquid crystal layer which has the property of becoming opaque when voltage is applied, being otherwise transparent. The glasses are controlled by a timing signal that allows the glasses to alternately block one eye, and then the other, in synchronization with the refresh rate of the screen. The timing synchronization to the video equipment may be achieved via a wired signal, or wirelessly by either an infrared or radio frequency (e.g. Bluetooth, DLP link) transmitter. Historic systems also used spinning discs, for example the Teleview system. Active shutter 3D systems are used to present 3D films in some theaters, and they can be used to present 3D images on CRT, plasma, LCD, projectors and other types of video displays. Advantages and disadvantages Although virtually all ordinary unmodified video and computer systems can be used to display 3D by adding a plug-in interface and active shutter glasses, disturbing levels of flicker or ghosting may be apparent with systems or displays not designed for such use. The rate of alternation required to eliminate noticeable flicker depends on image brightness and other factors, but is typically well over 30 image pair cycles per second, the maximum possible with a 60 Hz display. A 120 Hz display, allowing 60 images per second per eye, is widely accepted as flicker-fr
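The flicker arithmetic in the last paragraph is easy to make concrete: because the glasses alternately block each eye, the image-pair rate is half the display refresh rate, and the text puts the flicker-free threshold at well over 30 pairs per second. A minimal sketch (threshold constant taken directly from the text's figure):

```python
FLICKER_THRESHOLD_PAIRS_PER_S = 30  # "typically well over 30" per the text

def image_pairs_per_second(display_hz):
    # Each eye sees every other refresh, so the 3D pair rate
    # is half the display refresh rate.
    return display_hz / 2

for hz in (60, 100, 120):
    pairs = image_pairs_per_second(hz)
    verdict = "flicker likely" if pairs <= FLICKER_THRESHOLD_PAIRS_PER_S else "above threshold"
    print(f"{hz} Hz display -> {pairs} pairs/s ({verdict})")
```

This is why a 60 Hz display sits exactly at the marginal 30 pairs/s while a 120 Hz display comfortably delivers 60 images per second per eye.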
https://en.wikipedia.org/wiki/Heart%20rate%20variability
Heart rate variability (HRV) is the physiological phenomenon of variation in the time interval between heartbeats. It is measured by the variation in the beat-to-beat interval. Other terms used include "cycle length variability", "R–R variability" (where R is a point corresponding to the peak of the QRS complex of the ECG wave; and RR is the interval between successive Rs), and "heart period variability". Methods used to detect beats include ECG, blood pressure, ballistocardiograms, and the pulse wave signal derived from a photoplethysmograph (PPG). ECG is considered the gold standard for HRV measurement because it provides a direct reflection of cardiac electric activity. Clinical significance Reduced HRV has been shown to be a predictor of mortality after myocardial infarction although others have shown that the information in HRV relevant to acute myocardial infarction survival is fully contained in the mean heart rate. A range of other outcomes and conditions may also be associated with modified (usually lower) HRV, including congestive heart failure, diabetic neuropathy, post–cardiac-transplant depression, susceptibility to SIDS and poor survival in premature babies, as well as fatigue severity in chronic fatigue syndrome. Psychological and social aspects There is interest in HRV in the field of psychophysiology. For example, HRV is related to emotional arousal. High-frequency (HF) activity has been found to decrease under conditions of acute time pressure and emotional strain and elevated anxiety state, presumably related to focused attention and motor inhibition. HRV has been shown to be reduced in individuals reporting to worry more. In individuals with post-traumatic stress disorder (PTSD), HRV and its HF component (see below) is reduced whilst the low-frequency (LF) component is elevated. Furthermore, PTSD patients demonstrated no LF or HF reactivity to recalling a traumatic event. The neurovisceral integration is a model of HRV that views the cent
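The beat-to-beat (R–R) intervals described above are the raw input for HRV analysis. As a concrete illustration, the sketch below computes two standard time-domain statistics, SDNN and RMSSD; these particular metric names are conventional in the HRV literature rather than taken from the text, and the R–R series is synthetic:

```python
from math import sqrt
from statistics import stdev

def hrv_metrics(rr_ms):
    """SDNN (standard deviation of R-R intervals, overall variability) and
    RMSSD (root mean square of successive differences, beat-to-beat
    variability) from a list of R-R intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    sdnn = stdev(rr_ms)                                   # sample std dev
    rmssd = sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd

# Illustrative (synthetic) R-R series around 800 ms, i.e. ~75 bpm:
rr = [790, 815, 800, 825, 795, 810, 805]
sdnn, rmssd = hrv_metrics(rr)
print(round(sdnn, 1), round(rmssd, 1))
```

In clinical use these statistics are computed over much longer recordings (minutes to 24 hours), and, as the text notes, reduced values are what carry prognostic significance.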
https://en.wikipedia.org/wiki/Dewetting
In fluid mechanics, dewetting is one of the processes that can occur at a solid–liquid, solid–solid or liquid–liquid interface. Generally, dewetting describes the process of retraction of a fluid from a non-wettable surface it was forced to cover. The opposite process—spreading of a liquid on a substrate—is called wetting. The factor determining the spontaneous spreading and dewetting for a drop of liquid placed on a solid substrate with ambient gas is the so-called spreading coefficient S: S = γSG − γSL − γLG, where γSG is the solid–gas surface tension, γSL is the solid–liquid surface tension and γLG is the liquid–gas surface tension (measured for the media before they are brought in contact with each other). When S > 0, spontaneous spreading occurs, and if S < 0, partial wetting is observed, meaning the liquid will only cover the substrate to some extent. The equilibrium contact angle is determined from the Young–Laplace equation. Spreading and dewetting are important processes for many applications, including adhesion, lubrication, painting, printing, and protective coating. For most applications, dewetting is an unwanted process, because it destroys the applied liquid film. Dewetting can be inhibited or prevented by photocrosslinking the thin film prior to annealing, or by incorporating nanoparticle additives into the film. Surfactants can have a significant effect on the spreading coefficient. When a surfactant is added, its amphiphilic properties cause it to be more energetically favorable to migrate to the surface, decreasing the interfacial tension and thus increasing the spreading coefficient (i.e. making S more positive). As more surfactant molecules are adsorbed onto the interface, the free energy of the system decreases in tandem with the surface tension decreasing, eventually causing the system to become completely wetting. In biology, by analogy with the physics of liquid dewetting, the process of tunnel formation through endothelial cells has been referred to as cellular dewe
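The spreading coefficient S = γSG − γSL − γLG can be evaluated directly; the surface-tension values below are invented for illustration (in mN/m), not measured data for any real system:

```python
def spreading_coefficient(gamma_sg, gamma_sl, gamma_lg):
    # S = gamma_SG - gamma_SL - gamma_LG, all tensions in the same units.
    return gamma_sg - gamma_sl - gamma_lg

# Illustrative tensions for a liquid drop on a solid in ambient gas (mN/m):
S = spreading_coefficient(gamma_sg=70.0, gamma_sl=30.0, gamma_lg=25.0)
print(S)  # -> 15.0
print("complete wetting" if S > 0 else "partial wetting")  # -> complete wetting
```

With these numbers S > 0, so the liquid would spread spontaneously; lowering γSG or raising γSL or γLG until S < 0 would instead give partial wetting, and dewetting of a forced film.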
https://en.wikipedia.org/wiki/Mathematics%20of%20general%20relativity
When studying and formulating Albert Einstein's theory of general relativity, various mathematical structures and techniques are utilized. The main tools used in this geometrical theory of gravitation are tensor fields defined on a Lorentzian manifold representing spacetime. This article is a general description of the mathematics of general relativity. Note: General relativity articles using tensors will use the abstract index notation. Tensors The principle of general covariance was one of the central principles in the development of general relativity. It states that the laws of physics should take the same mathematical form in all reference frames. The term 'general covariance' was used in the early formulation of general relativity, but the principle is now often referred to as 'diffeomorphism covariance'. Diffeomorphism covariance is not the defining feature of general relativity,[1] and controversies remain regarding its present status in general relativity. However, the invariance property of physical laws implied in the principle, coupled with the fact that the theory is essentially geometrical in character (making use of non-Euclidean geometries), suggested that general relativity be formulated using the language of tensors. This will be discussed further below. Spacetime as a manifold Most modern approaches to mathematical general relativity begin with the concept of a manifold. More precisely, the basic physical construct representing a curved spacetime is modelled by a four-dimensional, smooth, connected, Lorentzian manifold. Other physical descriptors are represented by various tensors, discussed below. The rationale for choosing a manifold as the fundamental mathematical structure is to reflect desirable physical properties. For example, in the theory of manifolds, each point is contained in a (by no means unique) coordinate chart, and this chart can be thought of as representing the 'local spacetime' around the observer (represented by the point). The
https://en.wikipedia.org/wiki/Ecosynthesis
Ecosynthesis is the use of introduced species to fill niches in a disrupted environment, with the aim of increasing the speed of ecological restoration. This decreases the amount of physical damage done in a disrupted landscape. An example is using willow in a stream corridor for sediment and phosphorus capture. It aims to aid ecological restoration, the practice of renewing and restoring degraded, damaged, or destroyed ecosystems and habitats in the environment by active human intervention and action. Humans use ecosynthesis to make environments more suitable for life, through restoration ecology (introduced species, vegetation mapping, habitat enhancement, remediation, and mitigation). Restoration ecology Ecological restoration aims to recreate, initiate, or accelerate the recovery of an ecosystem that has been disturbed. Revegetation: the establishment of vegetation on sites where it has been previously lost, often with erosion control as the primary goal. Habitat enhancement: the process of increasing the suitability of a site as habitat for some desired species. Remediation: improving an existing ecosystem or creating a new one with the aim of replacing another that has deteriorated or been destroyed. Mitigation: legally mandated remediation for loss of protected species or an ecosystem. Through restoration ecology, humans can help ecosystems that we have either harmed or disturbed be brought back to a functional state. Trophic cascade A clear example of humans ecosynthesizing would be the introduction of a species to cause a trophic cascade, which is the result of indirect effects between nonadjacent trophic levels in a food chain or food web, such as the top predator in a food chain and a plant. The most famous example of a trophic cascade is the introduction of wolves to Yellowstone National Park, which had extraordinary effects on the ecosystem. Yellowstone had a massive population of elk because they had no predators, which ca
https://en.wikipedia.org/wiki/Wilfrid%20Hodges
Wilfrid Augustine Hodges, FBA (born 27 May 1941) is a British mathematician and logician known for his work in model theory. Life Hodges attended New College, Oxford (1959–65), where he received degrees in both Literae Humaniores and (Christian) Theology. In 1970 he was awarded a doctorate for a thesis in Logic. He lectured in both Philosophy and Mathematics at Bedford College, University of London. He has held visiting appointments in the department of philosophy at the University of California and in the department of mathematics at the University of Colorado. Hodges was Professor of Mathematics at Queen Mary College, University of London from 1987 to 2006 and is the author of books on logic. Honors and awards Hodges was President of the British Logic Colloquium, of the European Association for Logic, Language and Information and of the Division of Logic, Methodology, and Philosophy of Science. In 2009 he was elected a Fellow of the British Academy. Writing style Hodges' books are written in an informal style. The "Notes on Notation" in his book "Model theory" end with the following characteristic sentence: 'I' means I, 'we' means we. When this 780-page book appeared in 1993, it became one of the standard textbooks on model theory. Due to its success an abbreviated version (but with a new chapter on stability theory) was published as a paperback. Bibliography Only first editions are listed.
https://en.wikipedia.org/wiki/Bounded%20set%20%28topological%20vector%20space%29
In functional analysis and related areas of mathematics, a set in a topological vector space is called bounded or von Neumann bounded, if every neighborhood of the zero vector can be inflated to include the set. A set that is not bounded is called unbounded. Bounded sets are a natural way to define locally convex polar topologies on the vector spaces in a dual pair, as the polar set of a bounded set is an absolutely convex and absorbing set. The concept was first introduced by John von Neumann and Andrey Kolmogorov in 1935. Definition Suppose X is a topological vector space (TVS) over a field K. A subset B of X is called von Neumann bounded or just bounded in X if any of the following equivalent conditions are satisfied: (1) For every neighborhood V of the origin there exists a real r > 0 such that B ⊆ sV for all scalars s satisfying |s| ≥ r. This was the definition introduced by John von Neumann in 1935. (2) B is absorbed by every neighborhood of the origin. (3) For every neighborhood V of the origin there exists a scalar s such that B ⊆ sV. (4) For every neighborhood V of the origin there exists a real r > 0 such that sB ⊆ V for all scalars s satisfying |s| ≤ r. (5) For every neighborhood V of the origin there exists a real r > 0 such that tB ⊆ V for all real 0 < t ≤ r. (6) Any one of statements (1) through (5) above but with the word "neighborhood" replaced by any of the following: "balanced neighborhood," "open balanced neighborhood," "closed balanced neighborhood," "open neighborhood," "closed neighborhood". e.g. Statement (2) may become: B is bounded if and only if B is absorbed by every balanced neighborhood of the origin. If X is locally convex then the adjective "convex" may also be added to any of these 5 replacements. (7) For every sequence of scalars (s_i) that converges to 0 and every sequence (x_i) in B, the sequence (s_i x_i) converges to 0 in X. This was the definition of "bounded" that Andrey Kolmogorov used in 1934, which is the same as the definition introduced by Stanisław Mazur and Władysław Orlicz in 1933 for metrizable TVS. Kolmogorov used this definition to prove that a TVS is seminor
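Kolmogorov's sequence criterion — B is bounded iff s_i x_i → 0 for every scalar sequence s_i → 0 and every sequence x_i drawn from B — can be illustrated numerically in the familiar TVS R² with the sup norm. The particular sequences below are illustrative; a genuine proof of unboundedness only needs one witnessing pair of sequences:

```python
# Numerical sketch of the sequence criterion: for a bounded set, any
# scalar sequence tending to 0 dominates any sequence of points from it.
def sup_norm(v):
    return max(abs(c) for c in v)

s = [1.0 / (i + 1) for i in range(1000)]              # scalars s_i -> 0

# Bounded set: the closed unit ball; every point has sup-norm <= 1.
x_bounded = [(1.0, -1.0)] * 1000
# Unbounded set: the whole x-axis; points can be chosen to escape.
x_unbounded = [(i + 1.0, 0.0) for i in range(1000)]

tail_bounded = sup_norm((s[-1] * x_bounded[-1][0], s[-1] * x_bounded[-1][1]))
tail_unbounded = sup_norm((s[-1] * x_unbounded[-1][0], s[-1] * x_unbounded[-1][1]))
print(tail_bounded, tail_unbounded)  # first tends to 0, second does not
```

For the unit ball, s_i x_i shrinks like 1/i; for the x-axis, choosing x_i = (i, 0) makes s_i x_i stay at norm 1, witnessing unboundedness.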
https://en.wikipedia.org/wiki/ITnet
ITnet (Institute of Technology Network) was a PoS-based multi-Mbit/s network created for the Institutes of Technology in Ireland. ITnet used 45 Mbit/s links to each of the institutions and an international link of 310 Mbit/s via HEAnet. The system was proposed in 1991 between Regional Technical College, Cork and Regional Technical College, Carlow, as they were then called. In 1993 an agreement was reached to create RTCnet. The system was running by 1994 with its hub in University College Dublin and service management at Carlow. All Regional Technical Colleges were connected to the system. In 1998 RTCnet became ITnet to reflect the change in status of the Regional Technical College system to Institutes of Technology. The hub of the system and service management moved to Institute of Technology, Tallaght where an international link was provided via HEAnet. ITnet was incorporated into HEAnet during 2007 and 2008 and has now ceased to exist as an independent network for the IoTs. All internet services are now provided directly to the Institutes by HEAnet. See also Communications in Ireland List of higher education institutions in the Republic of Ireland External links Official website - ITnet Academic computer network organizations Education in the Republic of Ireland Internet in Ireland
https://en.wikipedia.org/wiki/Methylotroph
Methylotrophs are a diverse group of microorganisms that can use reduced one-carbon compounds, such as methanol or methane, as the carbon source for their growth; and multi-carbon compounds that contain no carbon-carbon bonds, such as dimethyl ether and dimethylamine. This group of microorganisms also includes those capable of assimilating reduced one-carbon compounds by way of carbon dioxide using the ribulose bisphosphate pathway. These organisms should not be confused with methanogens, which, on the contrary, produce methane as a by-product from various one-carbon compounds such as carbon dioxide. Some methylotrophs can degrade the greenhouse gas methane, and in this case they are called methanotrophs. The abundance, purity, and low price of methanol compared to commonly used sugars make methylotrophs competent organisms for production of amino acids, vitamins, recombinant proteins, single-cell proteins, co-enzymes and cytochromes. Metabolism The key intermediate in methylotrophic metabolism is formaldehyde, which can be diverted to either assimilatory or dissimilatory pathways. Methylotrophs produce formaldehyde through oxidation of methanol and/or methane. Methane oxidation requires the enzyme methane monooxygenase (MMO). Methylotrophs with this enzyme are given the name methanotrophs. The oxidation of methane (or methanol) can be assimilatory or dissimilatory in nature (see figure). If dissimilatory, the formaldehyde intermediate is oxidized completely to CO2 to produce reductant and energy. If assimilatory, the formaldehyde intermediate is used to synthesize a 3-carbon (C3) compound for the production of biomass. Many methylotrophs use multi-carbon compounds for anabolism, thus limiting their use of formaldehyde to dissimilatory processes; methanotrophs, however, are generally limited to only C1 metabolism. Catabolism Methylotrophs use the electron transport chain to conserve energy produced from the oxidation of C1 compounds. An additional activation ste
https://en.wikipedia.org/wiki/Traitor%20tracing
Traitor tracing schemes help trace the source of leaks when secret or proprietary data is sold to many customers. In a traitor tracing scheme, each customer is given a different personal decryption key. (Traitor tracing schemes are often combined with conditional access systems so that, once the traitor tracing algorithm identifies a personal decryption key associated with the leak, the content distributor can revoke that personal decryption key, allowing honest customers to continue to watch pay television while the traitor and all the unauthorized users using the traitor's personal decryption key are cut off.) Traitor tracing schemes are used in pay television to discourage pirate decryption – to discourage legitimate subscribers from giving away decryption keys. Traitor tracing schemes are ineffective if the traitor rebroadcasts the entire (decrypted) original content. There are other kinds of schemes that discourage pirate rebroadcast – i.e., discourage legitimate subscribers from giving away decrypted original content. These other schemes use tamper-resistant digital watermarking to generate different versions of the original content. Traitor tracing key assignment schemes can be translated into such digital watermarking schemes. Traitor tracing is a copyright infringement detection system which works by tracing the source of leaked files rather than by direct copy protection. The method is that the distributor adds a unique salt to each copy given out. When a copy is leaked to the public, the distributor can check the value on it and trace it back to the "leak". Primary methods Activation controls The main concept is that each licensee (the user) is given a unique key which unlocks the software or allows the media to be decrypted. If the key is made public, the content owner then knows exactly who did it from their database of assigned codes. A major attack on this strategy is the key generator (keygen). By reverse engineering the software, the c
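The per-copy salting described above can be sketched as follows. The customer names, salt strings, and the use of SHA-256 as the fingerprint are all illustrative assumptions for this sketch, not part of any particular deployed system; a real scheme would embed the mark robustly inside the content rather than merely hashing it:

```python
import hashlib

# Each customer's copy carries a unique salt; the distributor keeps a
# fingerprint -> customer map so a leaked copy can be traced back.
CUSTOMERS = {"alice": "salt-91f2", "bob": "salt-7c3a", "carol": "salt-d410"}

def fingerprint(content: bytes, salt: str) -> str:
    return hashlib.sha256(content + salt.encode()).hexdigest()

content = b"proprietary report v1"
issued = {fingerprint(content, salt): name for name, salt in CUSTOMERS.items()}

# A leaked file surfaces; recompute its fingerprint and look up the source.
leaked = fingerprint(content, CUSTOMERS["bob"])
print(issued[leaked])  # -> bob
```

This illustrates the bookkeeping only; it does not defend against collusion, where several traitors compare copies to locate and strip the marks — resisting that is what dedicated traitor tracing codes are designed for.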
https://en.wikipedia.org/wiki/Mackey%20topology
In functional analysis and related areas of mathematics, the Mackey topology, named after George Mackey, is the finest topology for a topological vector space which still preserves the continuous dual. In other words, the Mackey topology does not make linear functions continuous which were discontinuous in the default topology. A topological vector space (TVS) is called a Mackey space if its topology is the same as the Mackey topology. The Mackey topology is the opposite of the weak topology, which is the coarsest topology on a topological vector space which preserves the continuity of all linear functions in the continuous dual. The Mackey–Arens theorem states that all possible dual topologies are finer than the weak topology and coarser than the Mackey topology. Definition Definition for a pairing Given a pairing (X, Y, b), the Mackey topology on X induced by (X, Y, b), denoted by τ(X, Y, b), is the polar topology defined on X by using the set of all σ(Y, X, b)-compact disks in Y. When X is endowed with the Mackey topology then it will be denoted by X_τ(X, Y, b) or simply X_τ(X, Y) or X_τ if no ambiguity can arise. A linear map F : X → W is said to be Mackey continuous (with respect to pairings (X, Y, b) and (W, Z, c)) if F : (X, τ(X, Y, b)) → (W, τ(W, Z, c)) is continuous. Definition for a topological vector space The definition of the Mackey topology for a topological vector space (TVS) is a specialization of the above definition of the Mackey topology of a pairing. If X is a TVS with continuous dual space X′, then the evaluation map (x, x′) ↦ x′(x) on X × X′ is called the canonical pairing. The Mackey topology on a TVS X, denoted by τ(X, X′), is the Mackey topology on X induced by the canonical pairing. That is, the Mackey topology is the polar topology on X obtained by using the set of all weak*-compact disks in X′. When X is endowed with the Mackey topology then it will be denoted by X_τ(X, X′) or simply X_τ if no ambiguity can arise. A linear map F : X → W between TVSs is Mackey continuous if F : (X, τ(X, X′)) → (W, τ(W, W′)) is continuous. Examples Every metrizable locally convex space (X, ξ) with continuous dual X′ carries the Mackey topology, that is ξ = τ(X, X′), or to put it more succin
https://en.wikipedia.org/wiki/Loss%20of%20heterozygosity
Loss of heterozygosity (LOH) is a type of genetic abnormality in diploid organisms in which one copy of an entire gene and its surrounding chromosomal region are lost. Since diploid cells have two copies of their genes, one from each parent, a single copy of the lost gene still remains when this happens, but any heterozygosity (slight differences between the versions of the gene inherited from each parent) is no longer present. In cancer The loss of heterozygosity is a common occurrence in cancer development. Originally, a heterozygous state is required and indicates the absence of a functional tumor suppressor gene copy in the region of interest. However, many people remain healthy with such a loss, because there still is one functional gene left on the other chromosome of the chromosome pair. The remaining copy of the tumor suppressor gene can be inactivated by a point mutation or via other mechanisms, resulting in a loss of heterozygosity event, and leaving no tumor suppressor gene to protect the body. Loss of heterozygosity does not imply a homozygous state (which would require the presence of two identical alleles in the cell). Knudson two-hit hypothesis of tumorigenesis First Hit: The first hit is classically thought of as a point mutation, but generally arises due to epigenetic events which inactivate one copy of a tumor suppressor gene (TSG), such as Rb1. In hereditary cancer syndromes, individuals are born with the first hit. The individual does not develop cancer at this point because the remaining TSG allele on the other locus is still functioning normally. Second Hit: While the second hit is commonly assumed to be a deletion that results in loss of the remaining functioning TSG allele, the original published mechanism of RB1 LOH was mitotic recombination/gene conversion/copy-neutral LOH, not deletion. There is a critical difference between deletion and CN-LOH, as the latter mechanism cannot be detected by comparative genomic hybridization (CGH)-based
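The distinction drawn above between LOH via deletion and copy-neutral LOH (CN-LOH) can be illustrated with a toy classifier over two quantities commonly reported by genotyping arrays: total copy number (CN) and B-allele frequency (BAF) at a SNP locus. The thresholds below are invented for the example and are not clinical values:

```python
# Toy classifier: a heterozygous SNP has BAF near 0.5; LOH pushes BAF
# toward 0 or 1. Copy number then separates deletion from CN-LOH.
def classify_locus(copy_number: float, baf: float) -> str:
    heterozygous = 0.3 < baf < 0.7
    if heterozygous:
        return "retained heterozygosity"
    if copy_number < 1.5:
        return "LOH via deletion"    # copy loss, visible to CGH
    return "copy-neutral LOH"        # CN ~ 2, invisible to CGH

print(classify_locus(1.0, 0.02))  # -> LOH via deletion
print(classify_locus(2.0, 0.98))  # -> copy-neutral LOH
print(classify_locus(2.0, 0.50))  # -> retained heterozygosity
```

The second case shows why CGH alone misses CN-LOH: the total copy number looks normal, and only the allele-frequency signal reveals the loss of heterozygosity.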
https://en.wikipedia.org/wiki/History%20of%20special%20relativity
The history of special relativity consists of many theoretical results and empirical findings obtained by Albert A. Michelson, Hendrik Lorentz, Henri Poincaré and others. It culminated in the theory of special relativity proposed by Albert Einstein and subsequent work of Max Planck, Hermann Minkowski and others. Introduction Although Isaac Newton based his physics on absolute time and space, he also adhered to the principle of relativity of Galileo Galilei restating it precisely for mechanical systems. This can be stated as: as far as the laws of mechanics are concerned, all observers in inertial motion are equally privileged, and no preferred state of motion can be attributed to any particular inertial observer. However, as to electromagnetic theory and electrodynamics, during the 19th century the wave theory of light as a disturbance of a "light medium" or luminiferous aether was widely accepted, the theory reaching its most developed form in the work of James Clerk Maxwell. According to Maxwell's theory, all optical and electrical phenomena propagate through that medium, which suggested that it should be possible to experimentally determine motion relative to the aether. The failure of any known experiment to detect motion through the aether led Hendrik Lorentz, starting in 1892, to develop a theory of electrodynamics based on an immobile luminiferous aether (about whose material constitution Lorentz did not speculate), physical length contraction, and a "local time" in which Maxwell's equations retain their form in all inertial frames of reference. Working with Lorentz's aether theory, Henri Poincaré, having earlier proposed the "relativity principle" as a general law of nature (including electrodynamics and gravitation), used this principle in 1905 to correct Lorentz's preliminary transformation formulas, resulting in an exact set of equations that are now called the Lorentz transformations. A little later in the same year Albert Einstein published his origin
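The Lorentz transformations mentioned above can be checked numerically. The sketch below applies a boost along one axis and verifies the defining invariance of the spacetime interval c²t² − x²; the event coordinates and boost velocity are arbitrary illustrative values:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_boost(t, x, v):
    """Transform event (t, x) into a frame moving at speed v along x."""
    beta = v / C
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return gamma * (t - v * x / C**2), gamma * (x - v * t)

# The interval c^2 t^2 - x^2 is left invariant by the transformation.
t, x = 1.0, 1.0e8
tp, xp = lorentz_boost(t, x, v=0.6 * C)
inv_before = (C * t) ** 2 - x ** 2
inv_after = (C * tp) ** 2 - xp ** 2
print(math.isclose(inv_before, inv_after, rel_tol=1e-9))  # -> True
```

At v = 0.6c the Lorentz factor is γ = 1.25, so the transformed coordinates differ substantially from the originals, yet the interval agrees to floating-point precision — the algebraic content of the exact equations Poincaré and Einstein arrived at in 1905.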
https://en.wikipedia.org/wiki/Smart%20antenna
Smart antennas (also known as adaptive array antennas, digital antenna arrays, multiple antennas and, recently, MIMO) are antenna arrays with smart signal processing algorithms used to identify spatial signal signatures such as the direction of arrival (DOA) of the signal, and use them to calculate beamforming vectors which are used to track and locate the antenna beam on the mobile/target. Smart antennas should not be confused with reconfigurable antennas, which have similar capabilities but are single element antennas and not antenna arrays. Smart antenna techniques are used notably in acoustic signal processing, track and scan radar, radio astronomy and radio telescopes, and mostly in cellular systems like W-CDMA, UMTS, and LTE and 5G-NR. Smart antennas have many functions: DOA estimation, beamforming, interference nulling, and constant modulus preservation. Direction of arrival (DOA) estimation The smart antenna system estimates the direction of arrival of the signal, using techniques such as MUSIC (MUltiple SIgnal Classification), estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithms, Matrix Pencil method or one of their derivatives. They involve finding a spatial spectrum of the antenna/sensor array, and calculating the DOA from the peaks of this spectrum. These calculations are computationally intensive. Matrix Pencil is very efficient in case of real time systems, and under the correlated sources. Beamforming Beamforming is the method used to create the radiation pattern of the antenna array by adding constructively the phases of the signals in the direction of the targets/mobiles desired, and nulling the pattern of the targets/mobiles that are undesired/interfering targets. This can be done with a simple Finite Impulse Response (FIR) tapped delay line filter. The weights of the FIR filter may also be changed adaptively, and used to provide optimal beamforming, in the sense that it reduces the Minimum Mean Square Er
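As a rough illustration of the DOA-scanning idea above, the sketch below runs a conventional (delay-and-sum) spatial scan for a uniform linear array with half-wavelength element spacing. It is a minimal, noiseless, single-source sketch — not MUSIC or ESPRIT — and the parameters (8 elements, a 20° arrival) are invented for the example:

```python
import cmath
import math

N = 8                # number of array elements
theta_true = 20.0    # true direction of arrival, degrees

def steering(theta_deg):
    """Steering vector for a uniform linear array with d = lambda/2."""
    s = math.sin(math.radians(theta_deg))
    return [cmath.exp(-1j * math.pi * n * s) for n in range(N)]

x = steering(theta_true)  # noiseless array snapshot from the true angle

def scan_power(theta_deg):
    # Correlate the snapshot against the steering vector for this angle.
    a = steering(theta_deg)
    return abs(sum(ai.conjugate() * xi for ai, xi in zip(a, x)))

angles = [t / 2 for t in range(-180, 181)]   # scan -90..90 in 0.5 deg steps
estimate = max(angles, key=scan_power)
print(estimate)  # -> 20.0
```

The spatial spectrum peaks where the scan angle matches the arrival angle, exactly the "find the peaks of the spatial spectrum" step described above; subspace methods like MUSIC sharpen those peaks and resolve closely spaced sources at the cost of more computation.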
https://en.wikipedia.org/wiki/Railway%20spine
Railway spine was a nineteenth-century diagnosis for the post-traumatic symptoms of passengers involved in railroad accidents. The first full-length medical study of the condition was John Eric Erichsen's classic book, On Railway and Other Injuries of the Nervous System. For this reason, railway spine is often known as Erichsen's disease. Railway collisions were a frequent occurrence in the early 19th century. Exacerbating the problem was that railway cars were flimsy, wooden structures with no protection for the occupants. Soon a group of people started coming forward who claimed that they had been injured in train crashes, but had no obvious evidence of injury. The railroads rejected these claims as fake. The nature of symptoms caused by "railway spine" was hotly debated in the late 19th century, notably at the meetings of the (Austrian) Imperial Society of Physicians in Vienna, 1886. Germany's leading neurologist, Hermann Oppenheim, claimed that all railway spine symptoms were due to physical damage to the spine or brain, whereas French and British scholars, notably Jean-Martin Charcot and Herbert Page, insisted that some symptoms could be caused by hysteria (now known as conversion disorder). Erichsen observed that those most likely to be injured in a railway crash were those sitting with their backs to the acceleration. This is the same injury mechanism found in whiplash. As with automobile accidents, railway and airplane accidents are now known to cause posttraumatic stress disorder (PTSD) and other psychosomatic symptoms in addition to physical trauma. See also History of rail transport Whiplash Notes Injuries Traumatology Anxiety disorders Obsolete terms for mental disorders Railway accidents and incidents
https://en.wikipedia.org/wiki/Normality%20%28behavior%29
Normality is a behavior that can be normal for an individual (intrapersonal normality) when it is consistent with the most common behavior for that person. Normal is also used to describe individual behavior that conforms to the most common behavior in society (known as conformity). However, normal behavior is often only recognized in contrast to abnormality. In many cases normality is used to make moral judgements, such that normality is seen as good while abnormality is seen as bad, or conversely normality can be seen as boring and uninteresting. Someone being seen as normal or not normal can have social ramifications, such as being included, excluded or stigmatized by wider society. Measuring Many difficulties arise in measuring normal behaviors—biologists come across parallel issues when defining normality. One complication that arises regards whether 'normality' is used correctly in everyday language. People say "this heart is abnormal" if only a portion of it is not working correctly, yet it may be inaccurate to include the entirety of the heart under the description of 'abnormal'. There can be a difference between the normality of a body part's structure and its function. Similarly, a behavioral pattern may not conform to social norms, but still be effective and non-problematic for that individual. Where there is a dichotomy between appearance and function of a behavior, it may be difficult to measure its normality. This is applicable when trying to diagnose a pathology and is addressed in the Diagnostic and Statistical Manual of Mental Disorders. Statistical normality In general, 'normal' refers to a lack of significant deviation from the average. The word normal is used in a more narrow sense in mathematics, where a normal distribution describes a population whose characteristics center around the average or the norm. When looking at a specific behavior, such as the frequency of lying, a researcher may use a Gaussian bell curve to plot all reactions, and a n
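The statistical reading of normality — lack of significant deviation from the average under a Gaussian model — can be made concrete with Python's statistics.NormalDist. The IQ-style mean and standard deviation below are purely illustrative:

```python
from statistics import NormalDist

# Under a Gaussian model, "statistically normal" is often operationalized
# as falling within some number of standard deviations of the mean.
model = NormalDist(mu=100, sigma=15)  # illustrative IQ-style scale

def within(z):
    """Fraction of the population within z standard deviations of the mean."""
    return model.cdf(model.mean + z * model.stdev) - model.cdf(model.mean - z * model.stdev)

print(round(within(1.96), 3))  # -> 0.95
```

So a cutoff of roughly two standard deviations labels about 95% of a Gaussian population "normal" — which also shows how arbitrary the boundary is, since moving the cutoff moves the share of people classified as abnormal.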
https://en.wikipedia.org/wiki/Chemotroph
A chemotroph is an organism that obtains energy by the oxidation of electron donors in their environments. These molecules can be organic (chemoorganotrophs) or inorganic (chemolithotrophs). The chemotroph designation is in contrast to phototrophs, which use photons. Chemotrophs can be either autotrophic or heterotrophic. Chemotrophs can be found in areas where electron donors are present in high concentration, for instance around hydrothermal vents. Chemoautotroph Chemoautotrophs, in addition to deriving energy from chemical reactions, synthesize all necessary organic compounds from carbon dioxide. Chemoautotrophs can use inorganic energy sources such as hydrogen sulfide, elemental sulfur, ferrous iron, molecular hydrogen, and ammonia or organic sources to produce energy. Most chemoautotrophs are extremophiles, bacteria or archaea that live in hostile environments (such as deep sea vents) and are the primary producers in such ecosystems. Chemoautotrophs generally fall into several groups: methanogens, sulfur oxidizers and reducers, nitrifiers, anammox bacteria, and thermoacidophiles. An example of one of these prokaryotes would be Sulfolobus. Chemolithotrophic growth can be dramatically fast, such as Hydrogenovibrio crunogenus with a doubling time around one hour. The term "chemosynthesis", coined in 1897 by Wilhelm Pfeffer, originally was defined as the energy production by oxidation of inorganic substances in association with autotrophy—what would be named today as chemolithoautotrophy. Later, the term would include also the chemoorganoautotrophy, that is, it can be seen as a synonym of chemoautotrophy. Chemoheterotroph Chemoheterotrophs (or chemotrophic heterotrophs) are unable to fix carbon to form their own organic compounds. Chemoheterotrophs can be chemolithoheterotrophs, utilizing inorganic electron sources such as sulfur, or, much more commonly, chemoorganoheterotrophs, utilizing organic electron sources such as carbohydrates, lipids, and proteins.
https://en.wikipedia.org/wiki/Two-level%20game%20theory
Two-level game theory is a political model, derived from game theory, that illustrates the domestic-international interactions between states. It was originally introduced in 1988 by Robert D. Putnam in his publication "Diplomacy and Domestic Politics: The Logic of Two-Level Games". Putnam had been involved in research around the G7 summits between 1976 and 1979. However, at the fourth summit, held in Bonn in 1978, he observed a qualitative shift in how the negotiations worked. He noted that attending countries agreed to adopt policies in contrast to what they might have in the absence of their international counterparts. However, the agreement was only viable due to strong domestic influence - within each international government - in favour of implementing the agreement internationally. This culminated in international policy co-ordination as a result of the entanglement of international and domestic agendas. The Model The model views international negotiations between states as consisting of simultaneous negotiations at two levels. Level 1: The international level (between governments), and Level 2: The intranational level (domestic). At the international level, the national government (i.e., chief negotiator) seeks an agreement, with an opposing country, relating to topics of concern. At the domestic level, societal actors pressure the chief negotiator for favourable policies. The chief negotiator absorbs the concerns of societal actors and builds coalitions with them. Simultaneously, the chief negotiator seeks to maximise the domestic concerns, yet minimise the impact of any contrary views from the opposing country. Win-Sets At the international level, countries will approach negotiations with a defined set of objectives. It is expected that chief negotiators of both states arrive at a range of outcomes where their objectives overlap. However, before committing to this, the chief negotiator must seek approval from domestic actors. This ratificat
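If each side's win-set is modeled as an interval of domestically ratifiable outcomes on a single negotiation dimension — a strong simplification, with invented numbers — the existence and extent of a possible agreement is just the interval overlap:

```python
# Toy win-set model: each state's win-set is the interval of outcomes its
# domestic actors would ratify. Agreement requires a non-empty overlap.
def overlap(win_set_a, win_set_b):
    lo = max(win_set_a[0], win_set_b[0])
    hi = min(win_set_a[1], win_set_b[1])
    return (lo, hi) if lo <= hi else None

state_a = (40, 70)   # outcomes State A's domestic level would ratify
state_b = (55, 90)   # outcomes State B's domestic level would ratify
print(overlap(state_a, state_b))   # -> (55, 70): zone of possible agreement
print(overlap((0, 10), (20, 30)))  # -> None: no ratifiable deal exists
```

This also makes Putnam's central intuition visible: shrinking a domestic win-set (tighter ratification constraints) narrows or empties the overlap, which is why domestic politics can strengthen or doom an international bargain.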
https://en.wikipedia.org/wiki/Combo%20television%20unit
A combo television unit, or a TV/VCR combo, sometimes known as a televideo, is a television with a VCR, DVD player, or sometimes both, built into a single unit. These converged devices have the advantages (compared to a separate TV and VCR) of saving space and increasing portability. Such units entered the market during the mid-to-late 1980s when VCRs had become ubiquitous household devices. By this time, the VHS format had become standard; thus the vast majority of TV/VCR combos are VHS-based. Most combo units have composite inputs on the front and/or back to connect a home video game console, a second VCR, a DVD player, or a camcorder. Some units may also include a headphone jack or S-Video inputs. Nearly all TV/VCR combination sets have monaural (mono) sound, though with stereo soundtrack compatibility; there are, however, a large number of TV/VCR combos with a stereo TV tuner but a mono VCR (some may even include a mono sound input alongside a composite video input). Some models from Panasonic also included an FM tuner. When DVDs were released, brands such as Toshiba introduced a TV/DVD/VCR combo. However, many of these units turned out to have unreliable DVD players and never had a large place in the TV market. Modern televisions tend to be mostly composed of solid state components, while VCRs require physical movement for both the inner workings of the VCR and the actual viewing of a VHS tape. VCRs also tend to require occasional servicing, such as cleaning the heads, capstan, and pinch rollers. For this reason, it is not uncommon for the included VCR to cease functioning or to become unreliable years before a similar fate befalls the television component due to lack of easy maintenance. Beginning around 2006, flat-panel TVs with integrated DVD players appeared on the market, and integrated TV/DVD sets started overtaking the TV/VCR market. This is due to both the low price and overwhelming availability of DVDs and more compact form f
https://en.wikipedia.org/wiki/Failure%20to%20thrive
Failure to thrive (FTT), also known as weight faltering or faltering growth, indicates insufficient weight gain or absence of appropriate physical growth in children. FTT is usually defined in terms of weight, and can be evaluated either by a low weight for the child's age, or by a low rate of increase in the weight. The term "failure to thrive" has been used in different ways, as there is no objective standard or universally accepted definition for when to diagnose FTT. One definition describes FTT as a fall in one or more weight centile spaces on a World Health Organization (WHO) growth chart depending on birth weight or when weight is below the 2nd percentile of weight for age irrespective of birth weight. Another definition of FTT is a weight for age that is consistently below the 5th percentile or weight for age that falls by at least two major percentile lines on a growth chart. While weight loss after birth is normal and most babies return to their birth weight by three weeks of age, clinical assessment for FTT is recommended for babies who lose more than 10% of their birth weight or do not return to their birth weight after three weeks. Failure to thrive is not a specific disease, but a sign of inadequate weight gain. In veterinary medicine, FTT is also referred to as ill-thrift. Signs and symptoms Failure to thrive is most commonly diagnosed before two years of age, when growth rates are highest, though FTT can present among children and adolescents of any age. Caretakers may express concern about poor weight gain or smaller size compared to peers of a similar age. Physicians often identify failure to thrive during routine office visits, when a child's growth parameters such as height and weight are not increasing appropriately on growth curves. Other signs and symptoms may vary widely depending on the etiology of FTT. It is also important to differentiate stunting from wasting, as they can indicate different causes of FTT. "Wasting" refers to a deceler
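The screening thresholds quoted above (weight below the 5th percentile, a fall of two or more major percentile lines, loss of more than 10% of birth weight, or failure to regain birth weight by three weeks) can be collected into a single check. The sketch below is illustrative arithmetic only, not clinical guidance; the function names and the particular set of major percentile lines are assumptions:

```python
# Illustrative sketch of the FTT screening heuristics quoted in the text.
# NOT clinical guidance; helper names and line choices are hypothetical.
MAJOR_PERCENTILES = [5, 10, 25, 50, 75, 90, 95]

def crossed_major_lines(prev_percentile, curr_percentile):
    """How many major percentile lines a child fell through between visits."""
    return sum(1 for p in MAJOR_PERCENTILES
               if curr_percentile < p <= prev_percentile)

def ftt_flags(weight_percentile, prev_percentile=None,
              birth_weight_g=None, current_weight_g=None, age_days=None):
    """Return the screening criteria (from the text) that a child meets."""
    flags = []
    if weight_percentile < 5:
        flags.append("weight-for-age below 5th percentile")
    if prev_percentile is not None and \
            crossed_major_lines(prev_percentile, weight_percentile) >= 2:
        flags.append("fell >= 2 major percentile lines")
    if birth_weight_g and current_weight_g and age_days is not None:
        if current_weight_g < 0.9 * birth_weight_g:
            flags.append("lost > 10% of birth weight")
        elif age_days > 21 and current_weight_g < birth_weight_g:
            flags.append("not back to birth weight after three weeks")
    return flags
```

A child tracking along the 50th percentile with normal newborn weight loss would return no flags; any returned string is merely a prompt for clinical assessment, as the text describes.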
https://en.wikipedia.org/wiki/Borel%E2%80%93Carath%C3%A9odory%20theorem
In mathematics, the Borel–Carathéodory theorem in complex analysis shows that an analytic function may be bounded by its real part. It is an application of the maximum modulus principle. It is named for Émile Borel and Constantin Carathéodory. Statement of the theorem Let a function $f$ be analytic on a closed disc of radius $R$ centered at the origin. Suppose that $r < R$. Then, we have the following inequality: $\|f\|_r \le \frac{2r}{R-r} \sup_{|z| \le R} \operatorname{Re} f(z) + \frac{R+r}{R-r} |f(0)|.$ Here, the norm on the left-hand side denotes the maximum value of $f$ in the closed disc: $\|f\|_r = \max_{|z| \le r} |f(z)| = \max_{|z| = r} |f(z)|$ (where the last equality is due to the maximum modulus principle). Proof Define $A$ by $A = \sup_{|z| \le R} \operatorname{Re} f(z).$ If $f$ is constant $c$, the inequality follows from $(R+r)|c| + 2r \operatorname{Re} c \ge (R-r)|c|$, so we may assume $f$ is nonconstant. First let $f(0) = 0$. Since $\operatorname{Re} f$ is harmonic, $\operatorname{Re} f(0)$ is equal to the average of its values around any circle centered at 0. That is, $\operatorname{Re} f(0) = \frac{1}{2\pi} \int_0^{2\pi} \operatorname{Re} f(Re^{i\theta})\, d\theta.$ Since $f$ is regular and nonconstant, we have that $\operatorname{Re} f$ is also nonconstant. Since $\operatorname{Re} f(0) = 0$, we must have $\operatorname{Re} f(z) > 0$ for some $z$ on the circle $|z| = R$, so we may take $A > 0$. Now $f$ maps the disc into the half-plane $P$ to the left of the $x = A$ line. Roughly, our goal is to map this half-plane to a disk, apply Schwarz's lemma there, and make out the stated inequality. The map $w \mapsto w - A$ sends $P$ to the standard left half-plane; the map $z \mapsto R\frac{z+A}{z-A}$ sends the left half-plane to the circle of radius $R$ centered at the origin. The composite, which maps 0 to 0, is the desired map: $w \mapsto \frac{Rw}{w - 2A}.$ From Schwarz's lemma applied to the composite of this map and $f$, we have $\left|\frac{R f(z)}{f(z) - 2A}\right| \le |z|.$ Take $|z| \le r$. The above becomes $R|f(z)| \le r|f(z) - 2A| \le r|f(z)| + 2Ar,$ so $|f(z)| \le \frac{2Ar}{R-r}$, as claimed. In the general case, we may apply the above to $f(z) - f(0)$: $\|f\|_r - |f(0)| \le \|f - f(0)\|_r \le \frac{2r}{R-r} \sup_{|z| \le R} \operatorname{Re}(f(z) - f(0)) \le \frac{2r}{R-r}\left(A + |f(0)|\right),$ which, when rearranged, gives the claim. Alternative result and proof We start with the following result: Applications Borel–Carathéodory is often used to bound the logarithm of derivatives, such as in the proof of the Hadamard factorization theorem. The following example is a strengthening of Liouville's theorem.
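One standard strengthening of Liouville's theorem obtained from Borel–Carathéodory bounds an entire function using a one-sided growth bound on its real part alone. This sketch is offered as a representative example (it may or may not be the example the article intends), stated with the theorem's inequality specialized to $r = R/2$:

```latex
\textbf{Claim.} Let $f$ be entire and suppose
$\operatorname{Re} f(z) \le C|z|^{n}$ for all sufficiently large $|z|$.
Then $f$ is a polynomial of degree at most $n$.

\textbf{Sketch.} Apply Borel--Carath\'eodory with $r = R/2$, so that
$\tfrac{2r}{R-r} = 2$ and $\tfrac{R+r}{R-r} = 3$; for large $R$,
\[
  \|f\|_{R/2}
  \;\le\; 2\sup_{|z|\le R}\operatorname{Re} f(z) + 3|f(0)|
  \;\le\; 2CR^{n} + 3|f(0)|.
\]
By the Cauchy estimates, for each $k > n$,
\[
  |f^{(k)}(0)| \;\le\; \frac{k!\,\|f\|_{R/2}}{(R/2)^{k}}
  \;\le\; \frac{2^{k}\,k!\,\bigl(2CR^{n} + 3|f(0)|\bigr)}{R^{k}}
  \;\xrightarrow[\;R\to\infty\;]{} 0,
\]
so every Taylor coefficient of degree greater than $n$ vanishes.
```

Taking $n = 0$ recovers the familiar statement that an entire function with bounded real part is constant.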
https://en.wikipedia.org/wiki/Tandem%20repeat%20locus
Variable number of tandem repeat locus (VNTR locus) is any DNA sequence that exists in multiple copies strung together in a variety of tandem lengths. The number of repeat copies present at a locus can be visualized by means of a Multi-locus or Multiple Loci VNTR Analysis (MLVA). In short, oligonucleotide primers are developed for each specific tandem repeat locus, followed by PCR and agarose gel electrophoresis. When the length of the repeat and the size of the flanking regions are known, the number of repeats can be calculated. Analysis of multiple loci will result in a genotype.
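The repeat-count calculation in the last step is simple arithmetic on fragment sizes: subtract the known flanking sequence from the amplicon length and divide by the repeat-unit length. A minimal sketch (the sizes in the example are hypothetical, not from any particular locus):

```python
# Sketch of the MLVA repeat-count arithmetic described above.
# All sizes are in base pairs; example values are hypothetical.
def repeat_copies(amplicon_bp, flanking_bp, repeat_bp):
    """Number of tandem repeat copies in a PCR product, given the
    combined size of the flanking regions and the repeat-unit size."""
    insert = amplicon_bp - flanking_bp
    if insert < 0 or insert % repeat_bp:
        raise ValueError("sizes inconsistent with a whole number of repeats")
    return insert // repeat_bp

# A 520 bp amplicon with 100 bp of flanking sequence and a 60 bp
# repeat unit implies (520 - 100) / 60 = 7 copies at this locus.
```

Repeating this for several loci, each allele count becomes one coordinate of the MLVA genotype.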
https://en.wikipedia.org/wiki/Sasa%20%28video%20game%29
Sasa is an arcade video game released for the MSX1 in 1984, and later for the Family Computer in 1985. This video game involved obtaining capsules with an 'E' on them, sometimes suspended by balloons. The main character could only use bullets to propel himself, and when the bullet count reaches 0, the game ends. A player can also lose bullets by colliding with an enemy, the other player, or the other player's bullets.
https://en.wikipedia.org/wiki/Maximal%20independent%20set
In graph theory, a maximal independent set (MIS) or maximal stable set is an independent set that is not a subset of any other independent set. In other words, there is no vertex outside the independent set that may join it, because the set is maximal with respect to the independent set property. For example, in the graph P3, a path with three vertices a, b, and c, and two edges ab and bc, the sets {b} and {a, c} are both maximal independent sets. The set {a} is independent, but is not maximal independent, because it is a subset of the larger independent set {a, c}. In this same graph, the maximal cliques are the sets {a, b} and {b, c}. A MIS is also a dominating set in the graph, and every dominating set that is independent must be maximal independent, so MISs are also called independent dominating sets. A graph may have many MISs of widely varying sizes; the largest, or possibly several equally large, MISs of a graph is called a maximum independent set. The graphs in which all maximal independent sets have the same size are called well-covered graphs. The phrase "maximal independent set" is also used to describe maximal subsets of independent elements in mathematical structures other than graphs, and in particular in vector spaces and matroids. Two algorithmic problems are associated with MISs: finding a single MIS in a given graph and listing all MISs in a given graph. Definition For a graph G = (V, E), an independent set S is a maximal independent set if for every vertex v ∈ V, one of the following is true: v ∈ S, or N(v) ∩ S ≠ ∅, where N(v) denotes the neighbors of v. The above can be restated as: a vertex either belongs to the independent set or has at least one neighbor vertex that belongs to the independent set. As a result, every edge of the graph has at least one endpoint not in S. However, it is not true that every edge of the graph has at least one endpoint in S. Notice that any neighbor to a vertex in the independent set cannot be in S, because these vertices are disjoint by the independent set definition. Related vertex sets If S is a
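The first algorithmic problem, finding a single MIS, has a simple greedy solution: scan the vertices in any order and add each vertex that has no already-chosen neighbor. Different scan orders can return different maximal independent sets. A minimal sketch using the three-vertex path from the text:

```python
# Greedy construction of a maximal independent set. Any vertex order
# works; the result is always maximal, though not necessarily maximum.
def maximal_independent_set(vertices, edges):
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    mis = set()
    for v in vertices:
        if adj[v].isdisjoint(mis):  # no neighbor of v is already chosen
            mis.add(v)
    return mis

# On the path a - b - c: scanning a, b, c yields {a, c};
# scanning b first yields {b}. Both are maximal independent sets.
```

Maximality holds by construction: any vertex left out was skipped precisely because one of its neighbors is in the set.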
https://en.wikipedia.org/wiki/Vienna%20Ab%20initio%20Simulation%20Package
The Vienna Ab initio Simulation Package, better known as VASP, is a package written primarily in Fortran for performing ab initio quantum mechanical calculations using either Vanderbilt pseudopotentials, or the projector augmented wave method, and a plane wave basis set. The basic methodology is density functional theory (DFT), but the code also allows use of post-DFT corrections such as hybrid functionals mixing DFT and Hartree–Fock exchange (e.g. HSE, PBE0 or B3LYP), many-body perturbation theory (the GW method) and dynamical electronic correlations within the random phase approximation (RPA) and MP2. Originally, VASP was based on code written by Mike Payne (then at MIT), which was also the basis of CASTEP. It was then brought to the University of Vienna, Austria, in July 1989 by Jürgen Hafner. The main program was written by Jürgen Furthmüller, who joined the group at the Institut für Materialphysik in January 1993, and Georg Kresse. VASP is currently being developed by Georg Kresse; recent additions include the extension of methods frequently used in molecular quantum chemistry to periodic systems. VASP is currently used by more than 1400 research groups in academia and industry worldwide on the basis of software licence agreements with the University of Vienna. Incomplete version history: VASP.6.3.2 was released on 28 June 2022, VASP.6.4.1 on 7 April, 2023 and VASP.6.4.2 on 20 July, 2023. See also Quantum chemistry computer programs
https://en.wikipedia.org/wiki/Richard%20Friederich%20Arens
Richard Friederich Arens (24 April 1919 – 3 May 2000) was an American mathematician. He was born in Iserlohn, Germany. He emigrated to the United States in 1925. Arens received his Ph.D. in 1945 from Harvard University. He was several times a visiting scholar at the Institute for Advanced Study (1945–46, 1946–47, and 1953–54). He was an Invited Speaker at the ICM in 1950 in Cambridge, Massachusetts. Arens worked in functional analysis, and was a professor at UCLA for more than 40 years. He served on the editorial board of the Pacific Journal of Mathematics for 14 years (1965–1979). There are three topological spaces named for Arens in the book Counterexamples in Topology, including Arens–Fort space. Arens died in Los Angeles, California. See also Arens square Mackey–Arens theorem
https://en.wikipedia.org/wiki/TCP/IP%20stack%20fingerprinting
TCP/IP stack fingerprinting is the remote detection of the characteristics of a TCP/IP stack implementation. The combination of parameters may then be used to infer the remote machine's operating system (aka, OS fingerprinting), or incorporated into a device fingerprint. TCP/IP Fingerprint Specifics Certain parameters within the TCP protocol definition are left up to the implementation. Different operating systems, and different versions of the same operating system, set different defaults for these values. By collecting and examining these values, one may differentiate among various operating systems and implementations of TCP/IP. The TCP/IP fields that may vary include the following: Initial packet size (16 bits) Initial TTL (8 bits) Window size (16 bits) Max segment size (16 bits) Window scaling value (8 bits) "don't fragment" flag (1 bit) "sackOK" flag (1 bit) "nop" flag (1 bit) These values may be combined to form a 67-bit signature, or fingerprint, for the target machine. Just inspecting the Initial TTL and window size fields is often enough to successfully identify an operating system, which eases the task of performing manual OS fingerprinting. Protection against and detecting fingerprinting Protection against fingerprinting as a doorway to attack is achieved by limiting the type and amount of traffic a defensive system responds to. Examples include blocking address masks and timestamps from outgoing ICMP control-message traffic, and blocking ICMP echo replies. A security tool can alert to potential fingerprinting: it can identify another machine as running a fingerprinting tool by detecting that tool's own fingerprint. Disallowing TCP/IP fingerprinting provides protection from vulnerability scanners looking to target machines running a certain operating system. Fingerprinting facilitates attacks, and blocking those ICMP messages is only one of an array of defenses required for full protection against them. Targeting the ICMP datagram, an obfuscator ru
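The 67-bit figure comes from summing the listed field widths (16 + 8 + 16 + 16 + 8 + 1 + 1 + 1). One way to make the "combined signature" concrete is to pack the observed values into a single integer; the field order and the sample values below are illustrative assumptions, not a standard encoding:

```python
# Sketch: packing the fields listed in the text into one 67-bit value.
# The ordering and sample values are illustrative assumptions.
FIELDS = [  # (name, bit width)
    ("initial_packet_size", 16),
    ("initial_ttl", 8),
    ("window_size", 16),
    ("max_segment_size", 16),
    ("window_scaling", 8),
    ("dont_fragment", 1),
    ("sack_ok", 1),
    ("nop", 1),
]

def pack_signature(values):
    sig = 0
    for name, width in FIELDS:
        v = values[name]
        assert 0 <= v < (1 << width), name
        sig = (sig << width) | v
    return sig  # fits in 16+8+16+16+8+1+1+1 = 67 bits

# Hypothetical observation (values loosely typical of older Linux stacks):
sample = {"initial_packet_size": 60, "initial_ttl": 64, "window_size": 5840,
          "max_segment_size": 1460, "window_scaling": 7,
          "dont_fragment": 1, "sack_ok": 1, "nop": 1}
```

Two machines producing the same packed integer are indistinguishable under this (simplified) signature; real tools compare such observations against a database of known stacks.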
https://en.wikipedia.org/wiki/Inner%20class
In object-oriented programming (OOP), an inner class or nested class is a class declared entirely within the body of another class or interface. It is distinguished from a subclass. Overview An instance of a normal or top-level class can exist on its own. By contrast, an instance of an inner class cannot be instantiated without being bound to a top-level class. Let us take the abstract notion of a Car with four Wheels. Our Wheels have a specific feature that relies on being part of our Car. This notion does not represent the Wheels as Wheels in a more general form that could be part of any vehicle. Instead, it represents them as specific to a Car. We can model this notion using inner classes as follows: We have the top-level class Car. Instances of class Car are composed of four instances of the class Wheel. This particular implementation of Wheel is specific to a car, so the code does not model the general notion of a wheel that would be better represented as a top-level class. Therefore, it is semantically connected to the class Car, and the code of Wheel is in some way coupled to its outer class, being a composition unit of a car. The wheel for a particular car is unique to that car, but for generalization, the wheel is an aggregation unit to the car. Inner classes provide a mechanism to accurately model this connection. We can refer to our Wheel class as Car.Wheel, Car being the top-level class and Wheel being the inner class. Inner classes therefore allow for the object orientation of certain parts of the program that would otherwise not be encapsulated into a class. Larger segments of code within a class might be better modeled or refactored as a separate top-level class, rather than an inner class. This would make the code more general in its application and therefore more reusable, but potentially might be premature generalization; it tends to prove more effective when code would otherwise have many inner classes with shared functionality. Types of nested classes in Ja
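The Car.Wheel relationship described above can be sketched in code. Java is the usual context for inner classes, and a Java non-static inner class is implicitly bound to its enclosing instance; the sketch below uses Python instead, where the nested class must be handed its outer instance explicitly, so the binding is made by hand. All names are illustrative:

```python
# Sketch of the Car.Wheel composition from the text. In Java, an inner
# class instance would be bound to its Car implicitly; in Python we
# pass the outer instance explicitly to get the same coupling.
class Car:
    class Wheel:
        def __init__(self, car, position):
            self.car = car            # explicit link to the outer instance
            self.position = position  # e.g. "front-left"

        def describe(self):
            return f"{self.position} wheel of {self.car.model}"

    def __init__(self, model):
        self.model = model
        # A Car is composed of exactly four Wheels, each bound to it.
        self.wheels = [Car.Wheel(self, p)
                       for p in ("front-left", "front-right",
                                 "rear-left", "rear-right")]

car = Car("roadster")
```

The qualified name Car.Wheel mirrors the article's notation, and each Wheel is meaningless without the Car it belongs to, which is exactly the coupling the text describes.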
https://en.wikipedia.org/wiki/Genetic%20marker
A genetic marker is a gene or DNA sequence with a known location on a chromosome that can be used to identify individuals or species. It can be described as a variation (which may arise due to mutation or alteration in the genomic loci) that can be observed. A genetic marker may be a short DNA sequence, such as a sequence surrounding a single base-pair change (single nucleotide polymorphism, SNP), or a long one, like minisatellites. Background For many years, gene mapping was limited to identifying organisms by traditional phenotype markers. This included genes that encoded easily observable characteristics such as blood types or seed shapes. The insufficient number of these types of characteristics in several organisms limited the mapping efforts that could be done. This prompted the development of gene markers which could identify genetic characteristics that are not readily observable in organisms (such as protein variation). Types Some commonly used types of genetic markers are: RFLP (or Restriction fragment length polymorphism) SSLP (or Simple sequence length polymorphism) AFLP (or Amplified fragment length polymorphism) RAPD (or Random amplification of polymorphic DNA) VNTR (or Variable number tandem repeat) Microsatellite polymorphism (or Simple sequence repeat) SNP (or Single nucleotide polymorphism) STR (or Short tandem repeat) SFP (or Single feature polymorphism) DArT (or Diversity Arrays Technology) RAD markers (or Restriction site associated DNA markers) (using Sequence-tagged sites) Molecular genetic markers can be divided into two classes: a) biochemical markers, which detect variation at the gene product level, such as changes in proteins and amino acids, and b) molecular markers, which detect variation at the DNA level, such as nucleotide changes: deletion, duplication, inversion and/or insertion. Markers can exhibit two modes of inheritance, i.e. dominant/recessive or co-dominant. If the genetic pattern of homozygotes can be distin
https://en.wikipedia.org/wiki/Candidate%20gene
The candidate gene approach to conducting genetic association studies focuses on associations between genetic variation within pre-specified genes of interest, and phenotypes or disease states. This is in contrast to genome-wide association studies (GWAS), which is a hypothesis-free approach that scans the entire genome for associations between common genetic variants (typically SNPs) and traits of interest. Candidate genes are most often selected for study based on a priori knowledge of the gene's biological functional impact on the trait or disease in question. The rationale behind focusing on allelic variation in specific, biologically relevant regions of the genome is that certain alleles within a gene may directly impact the function of the gene in question and lead to variation in the phenotype or disease state being investigated. This approach often uses the case-control study design to try to answer the question, "Is one allele of a candidate gene more frequently seen in subjects with the disease than in subjects without the disease?" Candidate genes hypothesized to be associated with complex traits have generally not been replicated by subsequent GWASs or highly powered replication attempts. The failure of candidate gene studies to shed light on the specific genes underlying such traits has been ascribed to insufficient statistical power, low prior probability that scientists can correctly guess a specific allele within a specific gene that is related to a trait, poor methodological practices, and data dredging. Selection Suitable candidate genes are generally selected based on known biological, physiological, or functional relevance to the disease in question. This approach is limited by its reliance on existing knowledge about known or theoretical biology of disease. However, molecular tools are allowing insight into disease mechanisms and pinpointing potential regions of interest in the genome. Genome-wide association studies (GWAS) and quantitative tra
https://en.wikipedia.org/wiki/Pronucleus
A pronucleus (plural: pronuclei) denotes the nucleus found in either a sperm or egg cell during the process of fertilization. The sperm cell undergoes a transformation into a pronucleus after entering the egg cell but prior to the fusion of the genetic material of both the sperm and egg. In contrast, the egg cell possesses a pronucleus once it becomes haploid, not upon the arrival of the sperm cell. Haploid cells, such as sperm and egg cells in humans, carry half the number of chromosomes present in somatic cells, with 23 chromosomes compared to the 46 found in somatic cells. It is noteworthy that the male and female pronuclei do not physically merge, although their genetic material does. Instead, their membranes dissolve, eliminating any barriers between the male and female chromosomes, facilitating the combination of their chromosomes into a single diploid nucleus in the resulting embryo, which contains a complete set of 46 chromosomes. The presence of two pronuclei serves as the initial indication of successful fertilization, often observed around 18 hours after insemination or intracytoplasmic sperm injection (ICSI) during in vitro fertilization. At this stage, the zygote is termed a two-pronuclear zygote (2PN). Two-pronuclear zygotes transitioning through 1PN or 3PN states tend to yield poorer-quality embryos compared to those maintaining 2PN status throughout development, and this distinction may hold significance in the selection of embryos during in vitro fertilization (IVF) procedures. History The pronucleus was discovered in the 1870s, microscopically, using staining techniques combined with microscopes with improved magnification levels. The pronucleus was originally found during the first studies on meiosis. Edouard Van Beneden published a paper in 1875 in which he first mentions the pronucleus by studying the eggs of rabbits and bats. He stated that the two pronuclei form together in the center of the cell to form the embryonic nucleus. Van Beneden also found t
https://en.wikipedia.org/wiki/Insertion%20%28genetics%29
In genetics, an insertion (also called an insertion mutation) is the addition of one or more nucleotide base pairs into a DNA sequence. This can often happen in microsatellite regions due to the DNA polymerase slipping. Insertions can be anywhere in size from one base pair incorrectly inserted into a DNA sequence to a section of one chromosome inserted into another. The mechanism of the smallest single base insertion mutations is believed to be through base-pair separation between the template and primer strands followed by non-neighbor base stacking, which can occur locally within the DNA polymerase active site. On a chromosome level, an insertion refers to the insertion of a larger sequence into a chromosome. This can happen due to unequal crossover during meiosis. N region addition is the addition of non-coded nucleotides during recombination by terminal deoxynucleotidyl transferase. P nucleotide insertion is the insertion of palindromic sequences encoded by the ends of the recombining gene segments. Trinucleotide repeats are classified as insertion mutations and sometimes as a separate class of mutations. Methods Zinc finger nucleases (ZFN), transcription activator-like effector nucleases (TALEN), and CRISPR gene editing are the three main methods that have been used to achieve targeted gene insertion, and CRISPR/Cas tools have become among the most widely used methods in present-day research. Based on CRISPR/Cas tools, different systems have been developed to achieve specific functions. For example, one strategy is a double-strand nuclease cutting system, which uses the normal Cas9 protein with a single guide RNA (sgRNA) and then achieves the gene insertion through end-joining or, in dividing cells, the DNA repair system. Another example is the prime editing system, which uses Cas9 nickase and the prime editing guide RNA (pegRNA) carrying the target genes. One limitation of current technology is that the size for precise DNA insertion is not large enough to
https://en.wikipedia.org/wiki/Acetone
Acetone (2-propanone or dimethyl ketone) is an organic compound with the formula (CH3)2CO. It is the simplest and smallest ketone. It is a colorless, highly volatile and flammable liquid with a characteristic pungent odor. Acetone is miscible with water and serves as an important organic solvent in industry, home, and laboratory. About 6.7 million tonnes were produced worldwide in 2010, mainly for use as a solvent and for production of methyl methacrylate and bisphenol A, which are precursors to widely used plastics. It is a common building block in organic chemistry. It serves as a solvent in household products such as nail polish remover and paint thinner. It has volatile organic compound (VOC)-exempt status in the United States. Acetone is produced and disposed of in the human body through normal metabolic processes. It is normally present in blood and urine. People with diabetic ketoacidosis produce it in larger amounts. Ketogenic diets that increase ketone bodies (acetone, β-hydroxybutyric acid and acetoacetic acid) in the blood are used to counter epileptic attacks in children who suffer from refractory epilepsy. Name From the 17th century and before modern developments in organic chemistry nomenclature, acetone was given many different names. Those names include spirit of Saturn, which was given when it was thought to be a compound of lead, and later pyro-acetic spirit and pyro-acetic ester. Prior to the "acetone" name given by Antoine Bussy, it was named "mesit" (from the Greek μεσίτης, meaning mediator) by Carl Reichenbach, who also claimed that methyl alcohol consisted of mesit and ethyl alcohol. Names derived from mesit include mesitylene and mesityl oxide, which were first synthesised from acetone. Unlike many compounds with the acet- prefix having a 2-carbon chain, acetone has a 3-carbon chain, which has caused confusion since there cannot be a ketone with 2 carbons. The prefix refers to acetone's relation to vinegar (acetum in Latin, also the source of
https://en.wikipedia.org/wiki/Montel%20space
In functional analysis and related areas of mathematics, a Montel space, named after Paul Montel, is any topological vector space (TVS) in which an analog of Montel's theorem holds. Specifically, a Montel space is a barrelled topological vector space in which every closed and bounded subset is compact. Definition A topological vector space (TVS) has the Heine–Borel property if every closed and bounded subset is compact. A Montel space is a barrelled topological vector space with the Heine–Borel property. Equivalently, it is an infrabarrelled semi-Montel space, where a Hausdorff locally convex topological vector space is called a semi-Montel space if every bounded subset is relatively compact. A subset of a TVS is compact if and only if it is complete and totally bounded. A Fréchet–Montel space is a Fréchet space that is also a Montel space. Characterizations A separable Fréchet space is a Montel space if and only if each weak-* convergent sequence in its continuous dual is strongly convergent. A Fréchet space is a Montel space if and only if every bounded continuous function sends closed bounded absolutely convex subsets of to relatively compact subsets of Moreover, if denotes the vector space of all bounded continuous functions on a Fréchet space then is Montel if and only if every sequence in that converges to zero in the compact-open topology also converges uniformly to zero on all closed bounded absolutely convex subsets of Sufficient conditions Semi-Montel spaces A closed vector subspace of a semi-Montel space is again a semi-Montel space. The locally convex direct sum of any family of semi-Montel spaces is again a semi-Montel space. The inverse limit of an inverse system consisting of semi-Montel spaces is again a semi-Montel space. The Cartesian product of any family of semi-Montel spaces (resp. Montel spaces) is again a semi-Montel space (resp. a Montel space). Montel spaces The strong dual of a Montel space is Montel. A barrelled quasi-complete nuclear space is a Montel space. Every product an
https://en.wikipedia.org/wiki/Linde%E2%80%93Buzo%E2%80%93Gray%20algorithm
The Linde–Buzo–Gray algorithm (introduced by Yoseph Linde, Andrés Buzo and Robert M. Gray in 1980) is a vector quantization algorithm to derive a good codebook. It is similar to the k-means method in data clustering. The algorithm At each iteration, each code vector is split into two new vectors. The stages (originally figure labels) are: A) initial state: centroid of the training sequence; B) initial estimation #1: code book of size 2; C) final estimation after the LGA: optimal code book with 2 vectors; D) initial estimation #2: code book of size 4; E) final estimation after the LGA: optimal code book with 4 vectors. The final two code vectors are split into four and the process is repeated until the desired number of code vectors is obtained.
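The split-then-refine loop described above can be sketched compactly: start from the centroid of the training data, double the codebook by perturbing every code vector, then run k-means (Lloyd) passes until the next split. The epsilon and iteration count below are arbitrary illustrative choices, and the target codebook size is assumed to be a power of two:

```python
# Minimal LBG sketch: centroid -> split by (1 +/- eps) -> Lloyd refinement,
# repeated until the codebook reaches the requested size.
def lbg(data, n_codevectors, eps=0.01, iters=20):
    dim = len(data[0])
    codebook = [[sum(x[d] for x in data) / len(data) for d in range(dim)]]
    while len(codebook) < n_codevectors:
        # Split: each code vector c becomes c*(1+eps) and c*(1-eps).
        codebook = [[v * (1 + s) for v in c]
                    for c in codebook for s in (eps, -eps)]
        for _ in range(iters):  # Lloyd refinement of the doubled codebook
            cells = [[] for _ in codebook]
            for x in data:
                nearest = min(range(len(codebook)),
                              key=lambda i: sum((a - b) ** 2
                                                for a, b in zip(x, codebook[i])))
                cells[nearest].append(x)
            # Move each code vector to the centroid of its cell (if nonempty).
            codebook = [[sum(x[d] for x in cell) / len(cell) for d in range(dim)]
                        if cell else c
                        for cell, c in zip(cells, codebook)]
    return codebook
```

On the one-dimensional training set [0.0], [0.1], [1.0], [1.1], this converges to code vectors near 0.05 and 1.05, mirroring stages A through C of the figure description.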
https://en.wikipedia.org/wiki/Visibility
The visibility is the measure of the distance at which an object or light can be clearly discerned. In meteorology it depends on the transparency of the surrounding air and as such, it is unchanging no matter the ambient light level or time of day. It is reported within surface weather observations and METAR code either in meters or statute miles, depending upon the country. Visibility affects all forms of traffic: roads, railways, sailing and aviation. The geometric range of vision is limited by the curvature of the Earth and depends on the eye level and the height of the object being viewed. In geodesy, the atmospheric refraction must be taken into account when calculating geodetic visibility. Meteorological visibility Definition ICAO Annex 3 Meteorological Service for International Air Navigation contains the following definitions and note: a) the greatest distance at which a black object of suitable dimensions, situated near the ground, can be seen and recognized when observed against a bright background; b) the greatest distance at which lights of 1,000 candelas can be seen and identified against an unlit background. Note.— The two distances have different values in air of a given extinction coefficient, and the latter b) varies with the background illumination. The former a) is represented by the meteorological optical range (MOR). Annex 3 also defines Runway Visual Range (RVR) as: The range over which the pilot of an aircraft on the centre line of a runway can see the runway surface markings or the lights delineating the runway or identifying its centre line. In extremely clean air in Arctic or mountainous areas, the visibility can be up to 240 km (150 miles) where there are large markers such as mountains or high ridges. However, visibility is often reduced somewhat by air pollution and high humidity. Various weather stations report this as haze (dry) or mist (moist). Fog and smoke can reduce visibility to near zero, making driving extremely dangero
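The note in Annex 3 links the two visibility distances to the air's extinction coefficient. The standard quantitative link (assumed here from meteorological optics, not stated in the article) is the Koschmieder relation: with a 5% contrast threshold, the meteorological optical range is MOR = ln(1/0.05)/σ ≈ 3.912/σ, where σ is the extinction coefficient per metre:

```python
import math

# Koschmieder relation (standard meteorological optics, assumed here):
# MOR = ln(1 / contrast_threshold) / sigma, threshold conventionally 5%.
def mor_metres(extinction_per_m, contrast_threshold=0.05):
    """Meteorological optical range in metres for a given
    extinction coefficient (per metre)."""
    return math.log(1.0 / contrast_threshold) / extinction_per_m

# Very clean air, sigma ~ 1.5e-5 /m  -> MOR on the order of 200 km,
# matching the Arctic/mountain figure quoted in the text.
# Dense fog, sigma ~ 0.03 /m        -> MOR of roughly 100 m.
```

This also shows why the lights-based distance in definition b) behaves differently: it depends on background illumination as well as σ, whereas MOR depends on σ alone.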
https://en.wikipedia.org/wiki/Nurikabe%20%28puzzle%29
Nurikabe (hiragana: ぬりかべ) is a binary determination puzzle named for Nurikabe, an invisible wall in Japanese folklore that blocks roads and delays foot travel. Nurikabe was apparently invented and named by Nikoli; other names (and attempts at localization) for the puzzle include Cell Structure and Islands in the Stream. Rules The puzzle is played on a typically rectangular grid of cells, some of which contain numbers. Cells are initially of unknown color, but can only be black or white. Two same-color cells are considered "connected" if they are adjacent vertically or horizontally, but not diagonally. Connected white cells form "islands", while connected black cells form the "sea". The challenge is to paint each cell black or white, subject to the following rules: Each numbered cell is an island cell; the number in it is the number of cells in that island. Each island must contain exactly one numbered cell. There must be only one sea, which is not allowed to contain "pools", i.e. 2×2 areas of black cells. Human solvers typically dot the non-numbered cells they've determined to be certain to belong to an island. Like most other pure-logic puzzles, a unique solution is expected, and a grid containing random numbers is highly unlikely to provide a uniquely solvable Nurikabe puzzle. History Nurikabe was first developed by "renin (れーにん)," whose pen name is the Japanese pronunciation of "Lenin" and whose autonym can be read as such, in the 33rd issue of (Puzzle Communication) Nikoli in March 1991. It soon created a sensation, and has appeared in all issues of that publication from the 38th to the present. As of 2005, seven books consisting entirely of Nurikabe puzzles have been published by Nikoli. (This paragraph mainly depends on "Nikoli complete works of interesting-puzzles(ニコリ オモロパズル大全集)." https://web.archive.org/web/20060707011243/http://www.nikoli.co.jp/storage/addition/omopadaizen/) Solution methods No blind guessing should be required to solve a Nu
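One of the rules above, the ban on 2×2 "pools" of sea cells, is purely local and easy to check mechanically. A minimal validator sketch (the grid encoding with '#' for a black cell is an assumption; a full solution checker would also verify island sizes and sea connectivity):

```python
# Check the Nurikabe "no pools" rule: no 2x2 block may be all black.
# grid is a list of equal-length strings; '#' marks a black (sea) cell.
def has_pool(grid):
    for r in range(len(grid) - 1):
        for c in range(len(grid[0]) - 1):
            if (grid[r][c] == grid[r][c + 1] ==
                    grid[r + 1][c] == grid[r + 1][c + 1] == '#'):
                return True
    return False
```

A candidate solution containing any pool is immediately invalid, which is why human solvers use the rule aggressively: three black cells in an L-shape force the fourth corner white.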
https://en.wikipedia.org/wiki/Air%20Force%20Two
Air Force Two is the air traffic control designated call sign held by any United States Air Force aircraft carrying the vice president of the United States, but not the president. The term is often associated with the Boeing C-32, a modified 757 which is most commonly used as the vice president's transport. Other 89th Airlift Wing aircraft, such as the Boeing C-40 Clipper, C-20B, C-37A, and C-37B, have also served in this role. The VC-25A, the aircraft most often used by the president as Air Force One, has also been used by the vice president as Air Force Two. History Richard Nixon was one of the first senior officials in American government to travel internationally via jet aircraft on official business, taking a Boeing VC-137A Stratoliner on his visit to the Soviet Union in July 1959 for the Kitchen Debates as Eisenhower's vice president. Domestically, non-presidential VIP travel still relied on the prop-powered Convair VC-131 Samaritan aircraft until Nelson Rockefeller was named Gerald Ford's vice president in 1974. Rockefeller personally owned a Grumman Gulfstream II jet that he preferred to the much slower Convair; Rockefeller's Gulfstream II then used the "Executive Two" call sign while he was in office. This would prompt the 89th Airlift Wing's acquisition of three McDonnell Douglas VC-9Cs in 1975, adding to its three VC-137 jets used for senior executive international travel. Prior senior executive aircraft included the former presidential Douglas VC-54 Skymaster, Douglas VC-118A, and Lockheed C-121 Constellations, held in reserve as back-up aircraft for the newer aircraft designated for presidential travel. Design Aircraft allocated for use by the vice president and senior executives authorized to travel under the Special Air Mission designation operated by the 89th Airlift Wing can be distinguished from the distinctive Raymond Loewy Air Force One livery by the lack of the steel blue cheatline and cap over the cockpit. Former presidential aircraft that has
https://en.wikipedia.org/wiki/Archos
Archos (stylized as ARCHOS) is a French multinational electronics company that was established in 1988 by Henri Crohas. Archos manufactures tablets, smartphones, portable media players and portable data storage devices. The name is an anagram of Crohas' last name; in Greek, -αρχος is also a suffix used in nouns indicating a person with power. The company's slogan has changed over the years from "Think Smaller" to "On The Go" and, currently, "Entertainment your way". Archos has developed a variety of products, including digital audio players, portable video players (PVP), digital video recorders, a personal digital assistant, netbooks, more recently tablet computers using Google Android and Microsoft Windows (tablet PCs), and smartphones (manufactured by ZTE under the "Archos" brand name). Success and decline By 2000, Archos had become an important player in the portable media player market, as demonstrated by its groundbreaking release that year of the first disk-based digital audio player (DAP), the Jukebox 6000. This product paved the way for high-capacity DAPs, which finally resulted in the wide adoption of digital audio and MP3 players. Archos' success during this period was attributed to its strategy of technological leadership: releasing successive iterations of a product line, each with better specifications and technology than its predecessors. In the latter part of the 2000s, Archos began to lose ground to Apple, which, for its part, introduced its own portable devices such as the iPod. This development highlighted a trend in the technology industry where beating competitors to the market and equipping products with the most advanced technology available do not always translate to success. The company started phasing out its portable media players in 2008 to focus more on its Android tablet range. In 2013, the company entered the mobile phone market by launching a series of smartphone models for ins
https://en.wikipedia.org/wiki/FOIL%20method
In elementary algebra, FOIL is a mnemonic for the standard method of multiplying two binomials—hence the method may be referred to as the FOIL method. The word FOIL is an acronym for the four terms of the product: First ("first" terms of each binomial are multiplied together) Outer ("outside" terms are multiplied—that is, the first term of the first binomial and the second term of the second) Inner ("inside" terms are multiplied—second term of the first binomial and first term of the second) Last ("last" terms of each binomial are multiplied) The general form is (a + b)(c + d) = ac + ad + bc + bd. Note that a is both a "first" term and an "outer" term; b is both a "last" and "inner" term, and so forth. The order of the four terms in the sum is not important and need not match the order of the letters in the word FOIL. History The FOIL method is a special case of a more general method for multiplying algebraic expressions using the distributive law. The word FOIL was originally intended solely as a mnemonic for high-school students learning algebra. The term appears in William Betz's 1929 text Algebra for Today, where he states: ... first terms, outer terms, inner terms, last terms. (The rule stated above may also be remembered by the word FOIL, suggested by the first letters of the words first, outer, inner, last.) William Betz was active in the movement to reform mathematics in the United States at that time, had written many texts on elementary mathematics topics and had "devoted his life to the improvement of mathematics education". Many students and educators in the US now use the word "FOIL" as a verb meaning "to expand the product of two binomials". Examples The method is most commonly used to multiply linear binomials. For example, (x + 2)(x + 3) = x·x + 3x + 2x + 6 = x² + 5x + 6. If either binomial involves subtraction, the corresponding terms must be negated. For example, (x − 2)(x + 3) = x·x + 3x − 2x − 6 = x² + x − 6. The distributive law The FOIL method is equivalent to a two-step process involving the distributive law: (a + b)(c + d) = a(c + d) + b(c + d). In the first step, the (c + d) is distributed over the a
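The four FOIL products can be sketched in code. In this minimal Python illustration (the `foil` helper and the representation of a linear binomial px + q as a coefficient pair are my own, not part of any standard library):

```python
def foil(b1, b2):
    """Multiply binomials (p*x + q) and (r*x + s) via First/Outer/Inner/Last.

    Each binomial is a coefficient pair (p, q); the result is the
    coefficient triple (a, b, c) of a*x**2 + b*x + c.
    """
    p, q = b1
    r, s = b2
    first = p * r          # "first" terms:  p*x * r*x
    outer = p * s          # "outer" terms:  p*x * s
    inner = q * r          # "inner" terms:  q   * r*x
    last = q * s           # "last" terms:   q   * s
    return (first, outer + inner, last)

# (x + 2)(x + 3) = x**2 + 5x + 6
print(foil((1, 2), (1, 3)))   # -> (1, 5, 6)
# (2x - 1)(x + 4) = 2x**2 + 7x - 4; subtraction is handled by negative coefficients
print(foil((2, -1), (1, 4)))  # -> (2, 7, -4)
```

Note how the "outer" and "inner" products are summed into the single x coefficient, matching the remark above that the order of the four terms is not important.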
https://en.wikipedia.org/wiki/Pulation%20square
In category theory, a branch of mathematics, a pulation square (also called a Doolittle diagram) is a diagram that is simultaneously a pullback square and a pushout square. It is a self-dual concept.
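The definition can be pictured with a diagram. A minimal LaTeX sketch follows (assuming the tikz-cd package; the object names P, A, B, C and arrow labels are my own choices for illustration):

```latex
% A commutative square that is both a pullback and a pushout
% (a pulation, or Doolittle, square).
\documentclass{standalone}
\usepackage{tikz-cd}
\begin{document}
\begin{tikzcd}
P \arrow[r, "p_1"] \arrow[d, "p_2"'] & A \arrow[d, "f"] \\
B \arrow[r, "g"']                    & C
\end{tikzcd}
\end{document}
```

The square is a pulation square when (P, p_1, p_2) is a pullback of the cospan f, g and, simultaneously, (C, f, g) is a pushout of the span p_1, p_2. In an abelian category this is equivalent to exactness of the sequence 0 → P → A ⊕ B → C → 0, which makes the self-duality of the concept visible: reversing all arrows exchanges the pullback and pushout roles.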
https://en.wikipedia.org/wiki/Glycerophospholipid
Glycerophospholipids or phosphoglycerides are glycerol-based phospholipids. They are the main component of biological membranes. Two major classes are known: those for bacteria and eukaryotes and a separate family for archaea. Structures The term glycerophospholipid signifies any derivative of glycerophosphoric acid that contains at least one O-acyl, O-alkyl, or O-alk-1'-enyl residue attached to the glycerol moiety. The phosphate group forms an ester linkage to the glycerol. The long-chained hydrocarbons are typically attached through ester linkages in bacteria/eukaryotes and by ether linkages in archaea. In bacteria and eukaryotes, the lipids consist of diesters, commonly of C16 or C18 fatty acids. These acids are straight-chained and, especially for the C18 members, can be unsaturated. For archaea, the hydrocarbon chains have chain lengths of C10, C15, C20, etc., since they are derived from isoprene units. These chains are branched, with one methyl substituent per C5 subunit, and are linked to the glycerol phosphate by ether linkages. The two hydrocarbon chains attached to the glycerol are hydrophobic, while the polar head, which mainly consists of the phosphate group attached to the third carbon of the glycerol backbone, is hydrophilic. This dual characteristic gives glycerophospholipids their amphipathic nature. They are usually organized into a bilayer in membranes, with the polar hydrophilic heads facing outwards to the aqueous environment and the non-polar hydrophobic tails pointing inwards. Glycerophospholipids encompass diverse species that usually differ only slightly in structure. The most basic structure is a phosphatidate, an important intermediate in the synthesis of many phosphoglycerides. The presence of an additional group attached to the phosphate allows for many different phosphoglycerides. By convention, structures of these compounds show the 3 glycerol carbon atoms vertically with the phosphate att
https://en.wikipedia.org/wiki/Numerical%20relativity
Numerical relativity is the branch of general relativity that uses numerical methods and algorithms to solve and analyze problems. To this end, supercomputers are often employed to study black holes, gravitational waves, neutron stars and many other phenomena governed by Einstein's theory of general relativity. A currently active field of research in numerical relativity is the simulation of relativistic binaries and their associated gravitational waves. Overview A primary goal of numerical relativity is to study spacetimes whose exact form is not known. The spacetimes found computationally can be fully dynamical, stationary or static, and may contain matter fields or vacuum. In the case of stationary and static solutions, numerical methods may also be used to study the stability of the equilibrium spacetimes. In the case of dynamical spacetimes, the problem may be divided into the initial value problem and the evolution, each requiring different methods. Numerical relativity is applied to many areas, such as cosmological models, critical phenomena, perturbed black holes and neutron stars, and the coalescence of black holes and neutron stars. In any of these cases, Einstein's equations can be formulated in several ways that allow the dynamics to be evolved. While Cauchy methods have received the majority of the attention, characteristic and Regge-calculus-based methods have also been used. All of these methods begin with a snapshot of the gravitational fields on some hypersurface, the initial data, and evolve these data to neighboring hypersurfaces. As in all problems in numerical analysis, careful attention is paid to the stability and convergence of the numerical solutions. Along these lines, much attention is paid to the gauge conditions, coordinates, and various formulations of the Einstein equations and the effect they have on the ability to produce accurate numerical solutions. Numerical relativity research is distinct from work on cl
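The Cauchy evolution idea described above (initial data given on one hypersurface, then advanced slice by slice under a stability constraint) can be illustrated far from full general relativity by a simple hyperbolic equation. The following Python sketch is my own stand-in example using the 1D wave equation, not a solver for the Einstein equations:

```python
import math

# Toy "Cauchy evolution": data on the t = 0 slice advanced to neighboring
# slices with a leapfrog scheme for u_tt = u_xx, periodic boundaries.
N = 100
dx = 1.0 / N
c = 0.5                  # Courant factor dt/dx; c <= 1 is required for stability
dt = c * dx

u_prev = [math.sin(2 * math.pi * i * dx) for i in range(N)]  # initial data
# First step (zero initial velocity) via a second-order Taylor expansion.
u_curr = [u_prev[i] + 0.5 * c ** 2 *
          (u_prev[(i + 1) % N] - 2 * u_prev[i] + u_prev[i - 1])
          for i in range(N)]

for _ in range(400):     # evolve the data to successive hypersurfaces
    u_next = [2 * u_curr[i] - u_prev[i] + c ** 2 *
              (u_curr[(i + 1) % N] - 2 * u_curr[i] + u_curr[i - 1])
              for i in range(N)]
    u_prev, u_curr = u_curr, u_next

amp = max(abs(v) for v in u_curr)
print(f"max |u| after evolution: {amp:.3f}")  # stays bounded (~1) when c <= 1
```

With a Courant factor above 1 the same loop blows up exponentially, a small-scale analogue of the stability concerns that dominate numerical-relativity formulations.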
https://en.wikipedia.org/wiki/Mural%20crown
A mural crown (Latin: corona muralis) is a crown or headpiece representing city walls, towers, or fortresses. In classical antiquity, it was an emblem of tutelary deities who watched over a city, and among the Romans a military decoration. Later the mural crown developed into a symbol of European heraldry, mostly for cities and towns, and in the 19th and 20th centuries it was used in some republican heraldry. Usage in ancient times In Hellenistic culture, a mural crown identified tutelary deities such as the goddess Tyche (the embodiment of the fortunes of a city, familiar to Romans as Fortuna) and Hestia (the embodiment of the protection of a city, familiar to Romans as Vesta). The high cylindrical polos of Rhea/Cybele too could be rendered as a mural crown in Hellenistic times, specifically designating the mother goddess as patron of a city. The mural crown became an ancient Roman military decoration. The corona muralis (Latin for "walled crown") was a golden crown, or a circle of gold intended to resemble a battlement, bestowed upon the soldier who first climbed the wall of a besieged city or fortress and successfully placed the standard (flag) of the attacking army upon it. The Roman mural crown was made of gold and decorated with turrets, as is the heraldic version. As it was among the highest order of military decorations, it was not awarded to a claimant until after a strict investigation. The rostrata mural crown, composed of the rostra indicative of captured ships, was assigned as a naval prize to the first man in a boarding party, similar to the naval crown. The Graeco-Roman goddess Roma's attributes on Greek coinage usually include her mural crown, signifying Rome's status as a loyal protector of Hellenic city-states. Heraldic use The Roman military decoration was subsequently employed in European heraldry, where the term denoted a crown modeled after the walls of a castle, which may be tinctured or (gold), argent (silver), gules (red), or proper (i.e. stone-coloured). In 19t
https://en.wikipedia.org/wiki/Euro%20sign
The euro sign ⟨€⟩ is the currency sign used for the euro, the official currency of the eurozone, also adopted, although not required to do so, by Kosovo and Montenegro. The design was presented to the public by the European Commission on 12 December 1996. It consists of a stylized letter E (or epsilon), crossed by two lines instead of one. Depending on the convention in each nation, the symbol can either precede the value (for instance, €10) or follow it (for instance, 10€), often with an intervening space. Design There were originally 30 proposed designs for a symbol for Europe's new common currency; the Commission short-listed these to ten candidates. These ten were put to a public survey. The President of the European Commission at the time (Jacques Santer) and the European Commissioner with responsibility for the euro (Yves-Thibault de Silguy) then chose the winning design. The other designs that were considered are not available for the public to view, nor is any information regarding the designers available for public query. The Commission considers the design process to have been internal and keeps these records secret. The eventual winner was a design created by a team of four experts whose identities have not been revealed. It is believed by some that the Belgian graphic designer Alain Billiet designed the selected euro sign. The official story of the design history of the euro sign is disputed by Arthur Eisenmenger, a former chief graphic designer for the European Economic Community, who says he had the idea 25 years before the Commission's decision. The Commission specified a euro logo with exact proportions and colours (PMS Yellow foreground, PMS Reflex Blue background) for use in public-relations material related to the euro introduction. While the Commission intended the logo to be a prescribed glyph shape, type designers made it clear that they intended instead to adapt the design to be consistent with the typefaces to which it was t
https://en.wikipedia.org/wiki/Pink%20Floyd%20pigs
Inflatable flying pigs were one of the staple props of Pink Floyd's live shows. The first balloon was a sow, with a male pig balloon later introduced on their 1987 tour. Pigs appeared numerous times in concerts by the band, promoting concerts and record releases, and on the cover of their 1977 album Animals. The image rights for the pigs passed to Roger Waters when he split from the rest of the group, though pigs continued to be used by both Pink Floyd and Roger Waters in their shows after his departure. Animals The original Pink Floyd pig, a helium-filled balloon, was designed by Roger Waters and built in December 1976 by the artist Jeffrey Shaw with the help of the design team Hipgnosis, in preparation for shooting the cover of the Animals album. Plans were made to fly it over Battersea Power Station for a three-day photo shoot, with a marksman standing by to shoot the pig down if it broke free. On the first day, poor weather and delays meant the pig was not launched, and the marksman was told he was not needed. On the second day, 3 December 1976, the marksman was again not present because no one had told him to return. A gust of wind tore the pig loose. It disappeared from sight within five minutes and was spotted by airline pilots at thirty thousand feet. Flights at Heathrow Airport were cancelled as the huge inflatable pig continued eastwards across flight lanes and out over the English Channel, finally landing that night on a farm in Kent, where it frightened a herd of cows. The pig was recovered and repaired for the resumption of photography for the album cover, but the sky was by then cloudless and blue, and thus "boring". However, the pictures of the sky from the first day were suitable; the album cover was created using a composite of photos from the first and third days. The pig that was originally floated above Battersea Power Station was called "Algie". In the Flesh After the album Animals was released in 1977, Pink Floyd began their "In the Flesh"
https://en.wikipedia.org/wiki/Rhythms%20NetConnections
Rhythms NetConnections Inc. (former NASDAQ: RTHM) was in the business of providing broadband local-access communication services to large enterprises, telecommunications carriers and their internet service provider (ISP) affiliates, and other ISPs. The company's services included a range of high-speed, always-on connections designed to offer customers both cost and performance advantages when accessing the Internet or private networks. The company used multiple digital subscriber line (DSL) technologies to provide data transfer rates ranging from 128 kbit/s to 8.0 Mbit/s delivering data to the end user, and from 128 kbit/s to 1.5 Mbit/s receiving data from the end user. The company was delisted from NASDAQ in May 2001. On August 2, 2001, the company and all of its wholly owned United States subsidiaries voluntarily filed for reorganization under Chapter 11 of the United States Bankruptcy Code. Also in August 2001, the company sent 31-day service termination letters to all of its customers. Legal Action A protracted class action securities lawsuit against the officers and directors of Rhythms NetConnections was settled on April 3, 2009. Judge John K. Lane of the U.S. District Court for the District of Colorado gave his final approval to a $17.5 million settlement. Judge Lane also awarded the plaintiffs' attorneys, Milberg LLP, Stull Stull & Brody and The Shuman Law Firm, 30% of the settlement fund and an additional $2.6 million in expenses from the fund. The case was brought by a class of shareholders who purchased shares in Rhythms NetConnections between Jan. 6, 2000, and April 2, 2001. The lawsuit alleged that the officers and directors “knowingly or recklessly” made false statements about the company's subscriber line count, growth and financial condition in an effort to inflate its stock price. At one time, Enron owned 5.4 million shares of Rhythms NetConnections stock. See also Covad Communications Dot-com bubble NorthPoint Communications
https://en.wikipedia.org/wiki/IEC%2061131-3
IEC 61131-3 is the third part (of 10) of the international standard IEC 61131 for programmable logic controllers. It was first published in December 1993 by the IEC; the current (third) edition was published in February 2013. Part 3 of IEC 61131 deals with the basic software architecture and programming languages of the control program within a PLC. It defines three graphical and two textual programming language standards: Ladder diagram (LD), graphical; Function block diagram (FBD), graphical; Structured text (ST), textual; Instruction list (IL), textual (deprecated in the 3rd edition of the standard); Sequential function chart (SFC), graphical, with elements to organize programs for sequential and parallel control processing. Data types Elementary Data Type Bit Strings – groups of on/off values BOOL – 1 bit (0,1) BYTE – 8 bit (1 byte) WORD – 16 bit (2 byte) DWORD – 32 bit (4 byte) LWORD – 64 bit (8 byte) INTEGER – whole numbers (assuming 8-bit bytes) SINT – signed short integer (1 byte) INT – signed integer (2 byte) DINT – signed double integer (4 byte) LINT – signed long integer (8 byte) USINT – unsigned short integer (1 byte) UINT – unsigned integer (2 byte) UDINT – unsigned double integer (4 byte) ULINT – unsigned long integer (8 byte) REAL – floating point per IEC 60559 (same as IEEE 754-2008) REAL – (4 byte) LREAL – (8 byte) Duration TIME – (implementer specific). Literals in the form of T#5m90s15ms LTIME – (8 byte). Literals extend to nanoseconds in the form of T#5m90s15ms542us15ns Date DATE – calendar date (implementer specific) LDATE – calendar date (8 byte, nanoseconds since 1970-01-01, restricted to multiples of one day) Time of day TIME_OF_DAY / TOD – clock time (implementer specific) LTIME_OF_DAY / LTOD – clock time (8 byte) Date and time of day DATE_AND_TIME / DT – time and date (implementer specific) LDATE_AND_TIME / LDT – time and date (8 byte, nanoseconds since 1970-01-01) Character / Character string CHAR – S
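The fixed-size elementary types above can be cross-checked with Python's struct module. In this sketch, the mapping from IEC 61131-3 type names to struct format codes is my own assumption for illustration (the standard itself says nothing about struct); the "=" prefix forces struct's standard, platform-independent sizes:

```python
import struct

# Assumed correspondence: IEC 61131-3 elementary type -> struct format code.
# Sizes should match the list above (BYTE = 1, WORD = 2, DWORD = 4, LWORD = 8, ...).
IEC_TYPES = {
    "BYTE": "B", "WORD": "H", "DWORD": "I", "LWORD": "Q",   # bit strings
    "SINT": "b", "INT": "h", "DINT": "i", "LINT": "q",      # signed integers
    "USINT": "B", "UINT": "H", "UDINT": "I", "ULINT": "Q",  # unsigned integers
    "REAL": "f", "LREAL": "d",                              # IEC 60559 floats
}

for name, fmt in IEC_TYPES.items():
    print(f"{name:5s} -> {struct.calcsize('=' + fmt)} byte(s)")
```

Note that the bit-string types (BYTE, WORD, ...) and the unsigned integers (USINT, UINT, ...) share storage sizes but differ in intent: the former are raw groups of bits, the latter carry arithmetic meaning.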
https://en.wikipedia.org/wiki/Linear%20grammar
In computer science, a linear grammar is a context-free grammar that has at most one nonterminal in the right-hand side of each of its productions. A linear language is a language generated by some linear grammar. Example An example of a linear grammar is G with N = {S}, Σ = {a, b}, P with start symbol S and rules S → aSb S → ε It generates the language { a^n b^n : n ≥ 0 }. Relationship with regular grammars Two special types of linear grammars are the following: the left-linear or left-regular grammars, in which all rules are of the form A → αw where α is either empty or a single nonterminal and w is a string of terminals; the right-linear or right-regular grammars, in which all rules are of the form A → wα where w is a string of terminals and α is either empty or a single nonterminal. Each of these can describe exactly the regular languages. A regular grammar is a grammar that is left-linear or right-linear. Observe that by inserting new nonterminals, any linear grammar can be replaced by an equivalent one where some of the rules are left-linear and some are right-linear. For instance, the rules of G above can be replaced with S → aA A → Sb S → ε However, the requirement that all rules be left-linear (or all rules be right-linear) leads to a strict decrease in the expressive power of linear grammars. Expressive power All regular languages are linear; conversely, an example of a linear, non-regular language is { a^n b^n : n ≥ 0 }, as explained above. All linear languages are context-free; conversely, an example of a context-free, non-linear language is the Dyck language of well-balanced bracket pairs. Hence, the regular languages are a proper subset of the linear languages, which in turn are a proper subset of the context-free languages. While regular languages are deterministic, there exist linear languages that are nondeterministic. For example, the language of even-length palindromes on the alphabet of 0 and 1 has the linear grammar S → 0S0 | 1S1 | ε. An arbitrary string of this
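The grammar G above (S → aSb | ε) and its language { a^n b^n : n ≥ 0 } can be sketched in Python; the helper names `generate` and `in_language` are my own, chosen for illustration:

```python
def generate(n):
    """Derive a^n b^n from G: apply S -> aSb n times, then S -> epsilon."""
    s = "S"
    for _ in range(n):
        s = s.replace("S", "aSb")   # the single nonterminal keeps derivations linear
    return s.replace("S", "")       # final step: S -> empty string

def in_language(w):
    """Membership test for { a^n b^n : n >= 0 }, the language generated by G."""
    n = len(w) // 2
    return len(w) % 2 == 0 and w == "a" * n + "b" * n

print(generate(3))          # -> aaabbb
print(in_language("aabb"))  # -> True
print(in_language("abab"))  # -> False
```

Because every sentential form contains exactly one S, each derivation step touches a single nonterminal, which is precisely the linearity restriction; the equivalent mixed left/right-linear form S → aA, A → Sb, S → ε produces the same strings two steps at a time.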