https://en.wikipedia.org/wiki/Simple%20Ocean%20Data%20Assimilation
The Simple Ocean Data Assimilation (SODA) analysis is an oceanic reanalysis data set consisting of gridded state variables for the global ocean, as well as several derived fields. SODA was developed in the 1990s as a collaborative project between the Department of Atmospheric and Oceanic Science at the University of Maryland and the Department of Oceanography at Texas A&M University, with the goal of providing an estimate of ocean state improved over those based solely on observations or numerical simulations. Since its first release there have been several updates, the most recent of which extends from 1958 to 2008, as well as a “beta release” of a long-term reanalysis for 1871–2008. Initial release Initially released in 2000, SODA included data from the 1994 World Ocean Atlas (WOA-94) such as MBT, XBT, CTD and station data; hydrography, SST, and altimeter-measured sea level; as well as data from NODC, NCEP, and TOGA-TAO. The spatial extent of the first release was more limited than that of subsequent releases, extending from 62°S to 62°N, and covering the time period from January 1950 through December 1995. Current release The latest release of SODA (SODA 2.1.6) covers the time period from January 1958 to December 2008. As part of the data assimilation scheme, the system ingests a wide variety of observations including hydrographic profiles, ocean station data, moored temperature and salinity measurements, surface temperature and salinity observations from a variety of instruments (e.g., MBT, XBT, CTD), sea surface temperature (SST) from nighttime infrared satellite observations, and satellite-based sea level altimetry. Additionally, the numerical model used for forecasts is driven by surface winds from the European Centre for Medium-Range Weather Forecasts ERA-40 reanalysis data set during the period between 1958 and 2001, and from the QuikSCAT scatterometer for 2002–2008. The state variable forecasts used in the assimilation are calculated using a general circul
https://en.wikipedia.org/wiki/Gromov%E2%80%93Witten%20invariant
In mathematics, specifically in symplectic topology and algebraic geometry, Gromov–Witten (GW) invariants are rational numbers that, in certain situations, count pseudoholomorphic curves meeting prescribed conditions in a given symplectic manifold. The GW invariants may be packaged as a homology or cohomology class in an appropriate space, or as the deformed cup product of quantum cohomology. These invariants have been used to distinguish symplectic manifolds that were previously indistinguishable. They also play a crucial role in closed type IIA string theory. They are named after Mikhail Gromov and Edward Witten. The rigorous mathematical definition of Gromov–Witten invariants is lengthy and difficult, so it is treated separately in the stable map article. This article attempts a more intuitive explanation of what the invariants mean, how they are computed, and why they are important. Definition Consider the following: X: a closed symplectic manifold of dimension 2k, A: a 2-dimensional homology class in X, g: a non-negative integer, n: a non-negative integer. Now we define the Gromov–Witten invariants associated to the 4-tuple (X, A, g, n). Let $\overline{\mathcal{M}}_{g,n}$ be the Deligne–Mumford moduli space of curves of genus g with n marked points and let $\overline{\mathcal{M}}_{g,n}(X, A)$ denote the moduli space of stable maps into X of class A, for some chosen almost complex structure J on X compatible with its symplectic form. The elements of $\overline{\mathcal{M}}_{g,n}(X, A)$ are of the form $(C, x_1, \ldots, x_n, f)$, where C is a (not necessarily stable) curve with n marked points $x_1, \ldots, x_n$ and $f : C \to X$ is pseudoholomorphic. The moduli space has real dimension $d = 2 c_1^X(A) + (2k - 6)(1 - g) + 2n$. Let $\mathrm{st}(C, x_1, \ldots, x_n)$ denote the stabilization of the curve. Let $Y = \overline{\mathcal{M}}_{g,n} \times X^n$, which has real dimension $6g - 6 + 2(k + 1)n$. There is an evaluation map $\mathrm{ev} : \overline{\mathcal{M}}_{g,n}(X, A) \to Y$, $\mathrm{ev} = (\mathrm{st}, \mathrm{ev}_1, \ldots, \mathrm{ev}_n)$. The evaluation map sends the fundamental class of $\overline{\mathcal{M}}_{g,n}(X, A)$ to a d-dimensional rational homology class in Y, denoted $GW^{X,A}_{g,n} \in H_d(Y, \mathbb{Q})$. In a sense, this homology class is the Gromov–Witten invariant of X for the data g, n, and A. It is an invariant of the symplectic isotopy class of the symplectic manifold X. To interpret the Gromov–W
https://en.wikipedia.org/wiki/Residuated%20Boolean%20algebra
In mathematics, a residuated Boolean algebra is a residuated lattice whose lattice structure is that of a Boolean algebra. Examples include Boolean algebras with the monoid taken to be conjunction, the set of all formal languages over a given alphabet Σ under concatenation, the set of all binary relations on a given set X under relational composition, and more generally the power set of any equivalence relation, again under relational composition. The original application was to relation algebras as a finitely axiomatized generalization of the binary relation example, but there exist interesting examples of residuated Boolean algebras that are not relation algebras, such as the language example. Definition A residuated Boolean algebra is an algebraic structure (L, ∧, ∨, ¬, 0, 1, •, I, \, /) such that (L, ∧, ∨, •, I, \, /) is a residuated lattice and (L, ∧, ∨, ¬, 0, 1) is a Boolean algebra. An equivalent signature better suited to the relation algebra application is (L, ∧, ∨, ¬, 0, 1, •, I, ▷, ◁), where the unary operations x\ and x▷ are intertranslatable in the manner of De Morgan's laws via x\y = ¬(x▷¬y),   x▷y = ¬(x\¬y), and dually /y and ◁y as x/y = ¬(¬x◁y),   x◁y = ¬(¬x/y), with the residuation axioms in the residuated lattice article reorganized accordingly (replacing z by ¬z) to read (x▷z)∧y = 0   ⇔   (x•y)∧z = 0   ⇔   (z◁y)∧x = 0. This De Morgan dual reformulation is motivated and discussed in more detail in the section below on conjugacy. Since residuated lattices and Boolean algebras are each definable with finitely many equations, so are residuated Boolean algebras, whence they form a finitely axiomatizable variety. Examples Any Boolean algebra, with the monoid multiplication • taken to be conjunction and both residuals taken to be material implication x→y. Of the remaining 15 binary Boolean operations that might be considered in place of conjunction for the monoid multiplication, only five meet the monotonicity requirement, namely 0, 1, x, y, and x∨y. Setting y = z = 0 in the residuation axiom y ≤ x\z   ⇔   x•y ≤ z, we have 0 ≤ x\0   ⇔   x•0 ≤ 0, which is falsified by taking x = 1 when x•y = 1, x, or x∨y. The dual argument for z/y r
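The first example above (conjunction as the monoid product, material implication as both residuals) can be checked mechanically. Below is a minimal sketch, not from the article, that brute-forces the residuation law y ≤ x\z ⇔ x•y ≤ z on the two-element Boolean algebra; the helper name residual is ours.

```python
# Brute-force check of residuation on the Boolean algebra {0, 1},
# with monoid product x*y = x AND y and residual x\z = x -> z.
B = (0, 1)

def residual(x, z):
    # Material implication: x -> z, i.e. (not x) or z.
    return int((not x) or z)

assert all(
    (y <= residual(x, z)) == (min(x, y) <= z)  # min(x, y) is x AND y
    for x in B for y in B for z in B
)
print("residuation law holds on {0, 1}")
```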
https://en.wikipedia.org/wiki/List%20of%20ancient%20oceans
This is a list of former oceans that disappeared due to tectonic movements and other geographical and climatic changes. In alphabetic order: List Bridge River Ocean, the ocean between the ancient Insular Islands (that is, Stikinia) and North America Cache Creek Ocean, a Paleozoic ocean between the Wrangellia Superterrane and Yukon-Tanana Terrane Iapetus Ocean, the Southern hemisphere ocean between Baltica and Avalonia Kahiltna-Nutotzin Ocean, Mesozoic Khanty Ocean, the Precambrian to Silurian ocean between Baltica and the Siberian continent Medicine Hat Ocean Mezcalera Ocean, the ocean between the Guerrero Terrane and Laurentia Mirovia, the ocean that surrounded the Rodinia supercontinent Mongol-Okhotsk Ocean, the early Mesozoic ocean between the North China and Siberia cratons Oimyakon Ocean, the northernmost part of the Mesozoic Panthalassa Ocean Paleo-Tethys Ocean, the ocean between Gondwana and the Hunic terranes Pan-African Ocean, the ocean that surrounded the Pannotia supercontinent Panthalassa, the vast world ocean that surrounded the Pangaea supercontinent, also referred to as the Paleo-Pacific Ocean Pharusian Ocean, Neoproterozoic Poseidon Ocean, Mesoproterozoic Pontus Ocean, the western part of the early Mesozoic Panthalassa Ocean Proto-Tethys Ocean, Neoproterozoic Rheic Ocean, the Paleozoic ocean between Gondwana and Laurussia Slide Mountain Ocean, the Mesozoic ocean between the ancient Intermontane Islands (that is, Wrangellia) and North America South Anuyi Ocean, Mesozoic ocean related to the formation of the Arctic Ocean Tethys Ocean, the ocean between the ancient continents of Gondwana and Laurasia Thalassa Ocean, the eastern part of the early Mesozoic Panthalassa Ocean Ural Ocean, the Paleozoic ocean between Siberia and Baltica See also :Category:Historical oceans Superocean, an ocean that surrounds a global supercontinent
https://en.wikipedia.org/wiki/Optimal%20kidney%20exchange
Optimal kidney exchange (OKE) is an optimization problem faced by programs for kidney paired donations (also called Kidney Exchange Programs). Such programs have large databases of patient-donor pairs, where the donor is willing to donate a kidney in order to help the patient, but cannot do so due to medical incompatibility. The centers try to arrange exchanges between such pairs. For example, the donor in pair A donates to the patient in pair B, the donor in pair B donates to the patient in pair C, and the donor in pair C donates to the patient in pair A. The objective of the OKE problem is to find an optimal arrangement of such exchanges. "Optimal" usually means that the number of transplants is as large as possible, but there may be other objectives. A crucial constraint in this optimization problem is that a donor gives a kidney only if his patient receives a compatible kidney, so that no pair loses a kidney from participating. This requirement is sometimes called individual rationality. The OKE problem has many variants, which differ in the allowed size of each exchange, the objective function, and other factors. Definitions Input An instance of OKE is usually described as a directed graph. Every node represents a patient-donor pair. A directed arc from pair A to pair B means that the donor in pair A is medically compatible with the patient in pair B (compatibility is determined based on the blood types of the donor and patient, as well as other factors such as particular antigens in their blood). A directed cycle in the compatibility graph represents a possible exchange. A directed cycle of size 2 (e.g. A -> B -> A) represents a possible pairwise exchange - an exchange between a pair of pairs. A more general variant of OKE considers also nodes of a second type, that represent altruistic donors - donors who are not paired to a patient, and are willing to donate a kidney to any compatible patient. Altruistic donor nodes have only outgoing arcs. With altrui
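To make the cycle formulation concrete, here is a minimal sketch, not from the article, that enumerates candidate exchanges of size at most 3 in a toy compatibility graph using the NetworkX library; the pair names and arcs are invented for illustration.

```python
# Enumerate feasible exchanges (directed cycles of length <= 3)
# in a toy kidney-exchange compatibility graph.
import networkx as nx

# An arc u -> v means "the donor of pair u is compatible with
# the patient of pair v" (toy data).
G = nx.DiGraph([("A", "B"), ("B", "C"), ("C", "A"), ("B", "A")])

# Every directed cycle is a possible exchange in which each
# participating pair both gives and receives a kidney.
exchanges = [c for c in nx.simple_cycles(G) if len(c) <= 3]
print(exchanges)  # e.g. [['A', 'B'], ['A', 'B', 'C']]
```

Choosing a maximum-cardinality set of disjoint cycles from such a list is the hard part of OKE; this sketch only produces the candidate cycles.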
https://en.wikipedia.org/wiki/Norton%20360
Norton 360 is an "all-in-one" security suite developed by Gen Digital, formerly Symantec and later NortonLifeLock. The current suite was released in 2019 as a replacement for Norton Security, which had itself replaced an earlier suite named Norton 360. See also Norton AntiVirus
https://en.wikipedia.org/wiki/Bart%20syndrome
Bart syndrome, also known as aplasia cutis congenita type VI, is a rare genetic disorder characterized by the association of congenital localized absence of skin, mucocutaneous blistering and absent and dystrophic nails. History The condition was first described by Bruce J. Bart in 1966, who reported a large family with 26 affected members. Clinical 1. Absence of skin at birth, involving the lower legs and feet, healing within a few months, leaving scarring and fragile skin. 2. Widespread blistering of the skin and mucous membranes. 3. Variable absence and dystrophy of nails. Genetics The syndrome is inherited by autosomal dominant transmission with complete penetrance but variable expression. This means that children of an affected parent who carries the gene have a 50% chance of developing the disorder, although the extent to which they are affected is variable. Blistering in Bart syndrome represents a form of epidermolysis bullosa caused by ultrastructural abnormalities in the anchoring fibrils. Genetic linkage of the inheritance of the disease points to the region of chromosome 3 near the collagen, type VII, alpha 1 gene (COL7A1). See also List of cutaneous conditions Bart-Pumphrey syndrome
https://en.wikipedia.org/wiki/Ocean%20optics
Ocean optics is the study of how light interacts with water and the materials in water. Although research often focuses on the sea, the field broadly includes rivers, lakes, inland waters, coastal waters, and large ocean basins. How light acts in water is critical to how ecosystems function underwater. Knowledge of ocean optics is needed in aquatic remote sensing research in order to understand what information can be extracted from the color of the water as it appears from satellite sensors in space. The color of the water as seen by satellites is known as ocean color. While ocean color is a key theme of ocean optics, optics is a broader term that also includes the development of underwater sensors using optical methods to study much more than just color, including ocean chemistry, particle size, imaging of microscopic plants and animals, and more. Key terminology Optically deep Where waters are “optically deep,” the bottom does not reflect incoming sunlight, and the seafloor cannot be seen by humans or satellites. The vast majority of the world’s oceans by area are optically deep. Optically deep water can still be relatively shallow water in terms of total physical depth, as long as the water is very turbid, such as in estuaries. Optically shallow Where waters are “optically shallow,” the bottom reflects light and often can be seen by humans and satellites. Here, ocean optics can also be used to study what is under the water. Based on what color they appear to sensors, researchers can map habitat types, including macroalgae, corals, seagrass beds, and more. Mapping shallow-water environments requires knowledge of ocean optics because the color of the water must be accounted for when looking at the color of the seabed environment below. Inherent optical properties (IOPs) Inherent optical properties (IOPs) depend on what is in the water. These properties stay the same no matter what the incoming light is doing (daytime or nighttime, low sun angle or high sun an
https://en.wikipedia.org/wiki/Transverse%20colon
In human anatomy, the transverse colon is the longest and most movable part of the colon. Anatomical position It crosses the abdomen from the ascending colon at the right colic flexure (hepatic flexure) with a downward convexity to the descending colon where it curves sharply on itself beneath the lower end of the spleen forming the left colic flexure (splenic flexure). In its course, it describes an arch, the concavity of which is directed backward and a little upward. Toward its splenic end there is often an abrupt U-shaped curve which may descend lower than the main curve. It is almost completely invested by the peritoneum, and is connected to the inferior border of the pancreas by a large and wide duplicature of that membrane, the transverse mesocolon. It is in relation, by its upper surface, with the liver and gall-bladder, the greater curvature of the stomach, and the lower end of the spleen; by its under surface, with the small intestine; by its anterior surface, with the posterior layer of the greater omentum and the abdominal wall; its posterior surface is in relation from right to left with the descending portion of the duodenum, the head of the pancreas, and some of the convolutions of the jejunum and ileum. Function The transverse colon absorbs water and salts. Additional images See also Colon
https://en.wikipedia.org/wiki/Pelagodinium%20b%C3%A9ii
Pelagodinium béii is a photosynthetic dinoflagellate that forms a symbiotic relationship with planktonic foraminifera. Discovery and classification P. béii was originally described as Gymnodinium béii by marine isotope geochemist Howard Spero in 1987, after being discovered in the eastern Pacific Ocean. It was redefined as P. béii in 2010 after its ribosomal RNA was characterized, revealing it to be a relative of the genus Symbiodinium. Symbiodinium is a well-studied endosymbiont of deep water invertebrates, protists and foraminifera, found especially alongside reef-dwelling organisms. Ecology P. béii contains a single straight elongated apical vesicle with a row of small knobs, eight latitudinal series of amphiesmal vesicles, and a Type E eyespot. When not living as a symbiont the species is able to enter a motile stage. Like Symbiodinium, P. béii is a member of the order Suessiales, whose members lack thecal armored plates. P. béii is hosted by at least four foraminifera: G. ruber, G. conglobatus, G. sacculifer and Orbulina universa. See also Globigerina bulloides
https://en.wikipedia.org/wiki/Arkansas%20Delta
The Arkansas Delta is one of the six natural regions of the state of Arkansas. Willard B. Gatewood Jr., author of The Arkansas Delta: Land of Paradox, says that rich cotton lands of the Arkansas Delta make that area "The Deepest of the Deep South." The region runs along the Mississippi River from Eudora north to Blytheville and as far west as Little Rock. It is part of the Mississippi embayment, itself part of the Mississippi River Alluvial Plain. The flat plain is bisected by Crowley's Ridge, a narrow band of rolling hills rising above the flat delta plains. Several towns and cities have been developed along Crowley's Ridge, including Jonesboro. The region's lower western border follows the Arkansas River just outside Little Rock down through Pine Bluff. There the border shifts to Bayou Bartholomew, stretching south to the Arkansas-Louisiana state line. While the Arkansas Delta shares many geographic similarities with the Mississippi Delta, it is distinguished by its five unique sub-regions: the St. Francis Basin, Crowley's Ridge, the White River Lowlands, the Grand Prairie and the Arkansas River Lowlands (also called "the Delta Lowlands"). Much of the region is within the Mississippi lowland forests ecoregion. The Arkansas Delta includes the entire territories of 15 counties: Arkansas, Chicot, Clay, Craighead, Crittenden, Cross, Desha, Drew, Greene, Lee, Mississippi, Monroe, Phillips, Poinsett, and St. Francis. It also includes portions of another 10 counties: Jackson, Lawrence, Prairie, Randolph, White, Pulaski, Lincoln, Jefferson, Lonoke and Woodruff counties. Geology The Delta is subdivided into five unique sub-regions, including the St. Francis River Basin, Crowley's Ridge, the White River Lowlands, the Grand Prairie, and the Arkansas River Lowlands (also called "the Delta Lowlands"). Grand Prairie The underlying impermeable clay layer in the Stuttgart soil series that allowed the region to be a flat grassland plain initially appeared to stunt the regi
https://en.wikipedia.org/wiki/Universality%E2%80%93diversity%20paradigm
The universality–diversity paradigm is the analysis of biological materials based on the universality and diversity of their fundamental structural elements and functional mechanisms. The analysis of biological systems based on this classification has been a cornerstone of modern biology. For example, proteins constitute the elementary building blocks of a vast variety of biological materials such as cells, spider silk or bone, where they create extremely robust, multi-functional materials by self-organization of structures over many length- and time scales, from nano to macro. Some of the structural features are commonly found in many different tissues, that is, they are highly conserved. Examples of such universal building blocks include alpha-helices, beta-sheets or tropocollagen molecules. In contrast, other features are highly specific to tissue types, such as particular filament assemblies, beta-sheet nanocrystals in spider silk or tendon fascicles. This coexistence of universality and diversity—referred to as the universality–diversity paradigm (UDP)—is an overarching feature in biological materials and a crucial component of materiomics. It might provide guidelines for bioinspired and biomimetic material development, where this concept is translated into the use of inorganic or hybrid organic-inorganic building blocks. See also Materiomics Phylogenetics
https://en.wikipedia.org/wiki/Kripke%E2%80%93Platek%20set%20theory
The Kripke–Platek set theory (KP) is an axiomatic set theory developed by Saul Kripke and Richard Platek. The theory can be thought of as roughly the predicative part of ZFC and is considerably weaker than it. Axioms In its formulation, a Δ0 formula is one all of whose quantifiers are bounded. This means any quantification is of the form ∀u ∈ v or ∃u ∈ v. (See the Lévy hierarchy.) Axiom of extensionality: Two sets are the same if and only if they have the same elements. Axiom of induction: φ(a) being a formula, if for all sets x the assumption that φ(y) holds for all elements y of x entails that φ(x) holds, then φ(x) holds for all sets x. Axiom of empty set: There exists a set with no members, called the empty set and denoted {}. Axiom of pairing: If x, y are sets, then so is {x, y}, a set containing x and y as its only elements. Axiom of union: For any set x, there is a set y such that the elements of y are precisely the elements of the elements of x. Axiom of Δ0-separation: Given any set and any Δ0 formula φ(x), there is a subset of the original set containing precisely those elements x for which φ(x) holds. (This is an axiom schema.) Axiom of Δ0-collection: Given any Δ0 formula φ(x, y), if for every set x there exists a set y such that φ(x, y) holds, then for all sets X there exists a set Y such that for every x in X there is a y in Y such that φ(x, y) holds. Some but not all authors include an Axiom of infinity. KP with infinity is denoted by KPω. These axioms lead to close connections between KP, generalized recursion theory, and the theory of admissible ordinals. KP can be studied as a constructive set theory by dropping the law of excluded middle, without changing any axioms. Empty set If any set z is postulated to exist, such as in the axiom of infinity, then the axiom of empty set is redundant, because the empty set equals the subset {x ∈ z : x ≠ x} given by Δ0-separation. Furthermore, the existence of a member in the universe of discourse, i.e., ∃x(x=x), is implied in certain formulation
https://en.wikipedia.org/wiki/Firefly%20%28key%20exchange%20protocol%29
Firefly is a U.S. National Security Agency public-key key exchange protocol, used in EKMS, the STU-III secure telephone, and several other U.S. cryptographic systems.
https://en.wikipedia.org/wiki/International%20Code%20of%20Nomenclature%20for%20Cultivated%20Plants
The International Code of Nomenclature for Cultivated Plants (ICNCP) is a guide to the rules and regulations for naming cultigens, plants whose origin or selection is primarily due to intentional human activity. It is also known as Cultivated Plant Code. Cultigens under the purview of the ICNCP include cultivars, Groups (cultivar groups), and grexes. All organisms traditionally considered to be plants (including algae and fungi) are included. Taxa that receive a name under the ICNCP will also be included within taxa named under the International Code of Nomenclature for algae, fungi, and plants, for example, a cultivar is a member of a species. Brief history The first edition of the ICNCP, which was agreed in 1952 in Wageningen and published in 1953, has been followed by seven subsequent editions – in 1958 (Utrecht), 1961 (update of 1958), 1969 (Edinburgh), 1980 (Seattle), 1995 (Edinburgh), 2004 (Toronto) and 2009 (Wageningen). The ninth (most recent) edition was published in 2016 (Beijing). William Stearn has outlined the origins of ICNCP, tracing it back to the International Horticultural Congress of Brussels in 1864, when a letter from Alphonse de Candolle to Edouard Morren was tabled. This set out de Candolle's view that Latin names should be reserved for species and varieties found in the wild, with non-Latin or "fancy" names used for garden forms. Karl Koch supported this position at the 1865 International Botanical and Horticultural Congress and at the 1866 International Botanical Congress, where he suggested that future congresses should deal with nomenclatural matters. De Candolle, who had a legal background, drew up the Lois de la Nomenclature botanique (rules of botanical nomenclature). When adopted by the International Botanical Congress of Paris in 1867, this became the first version of today's International Code of Nomenclature for algae, fungi, and plants (ICN). Article 40 of the Lois de la Nomenclature botanique dealt with the names of plants o
https://en.wikipedia.org/wiki/Wolfram%20code
Wolfram code is a widely used numbering system for one-dimensional cellular automaton rules, introduced by Stephen Wolfram in a 1983 paper and popularized in his book A New Kind of Science. The code is based on the observation that a table specifying the new state of each cell in the automaton, as a function of the states in its neighborhood, may be interpreted as a k-digit number in the S-ary positional number system, where S is the number of states that each cell in the automaton may have, k = S^(2n+1) is the number of neighborhood configurations, and n is the radius of the neighborhood. Thus, the Wolfram code for a particular rule is a number in the range from 0 to S^(S^(2n+1)) − 1, converted from S-ary to decimal notation. It may be calculated as follows: List all the S^(2n+1) possible state configurations of the neighbourhood of a given cell. Interpreting each configuration as a number as described above, sort them in descending numerical order. For each configuration, list the state which the given cell will have, according to this rule, on the next iteration. Interpret the resulting list of states again as an S-ary number, and convert this number to decimal. The resulting decimal number is the Wolfram code. The Wolfram code does not specify the size (nor shape) of the neighbourhood, nor the number of states — these are assumed to be known from context. When used on their own without such context, the codes are often assumed to refer to the class of elementary cellular automata, two-state one-dimensional cellular automata with a (contiguous) three-cell neighbourhood, which Wolfram extensively investigates in his book. Notable rules in this class include rule 30, rule 110, and rule 184. Rule 90 is also interesting because it creates Pascal's triangle modulo 2. A code of this type suffixed by an R, such as "Rule 37R", indicates a second-order cellular automaton with the same neighborhood structure. While in a strict sense every Wolfram code in the valid range def
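The four-step procedure above translates directly into a few lines of code. The following is a minimal sketch, not from the article, that computes a Wolfram code given a rule supplied as a function from a neighbourhood tuple to a new state; the function name wolfram_code is ours.

```python
from itertools import product

def wolfram_code(rule, S=2, n=1):
    # Step 1 + 2: all S**(2n+1) neighbourhood configurations,
    # sorted in descending order when read as S-ary numbers.
    configs = sorted(product(range(S), repeat=2 * n + 1), reverse=True)
    # Step 3 + 4: read off the new states and interpret the list
    # of states as an S-ary number, accumulated in decimal.
    code = 0
    for config in configs:
        code = code * S + rule(config)
    return code

# Example: the elementary CA whose new state is the XOR of the two
# outer neighbours is rule 90 (binary 01011010).
print(wolfram_code(lambda c: c[0] ^ c[2]))  # -> 90
```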
https://en.wikipedia.org/wiki/Crossing%20Numbers%20of%20Graphs
Crossing Numbers of Graphs is a book in mathematics, on the minimum number of edge crossings needed in graph drawings. It was written by Marcus Schaefer, a professor of computer science at DePaul University, and published in 2018 by the CRC Press in their book series Discrete Mathematics and its Applications. Topics The main text of the book has two parts, on the crossing number as traditionally defined and on variations of the crossing number, followed by two appendices providing background material on topological graph theory and computational complexity theory. After introducing the problem, the first chapter studies the crossing numbers of complete graphs (including Hill's conjectured formula for these numbers) and complete bipartite graphs (Turán's brick factory problem and the Zarankiewicz crossing number conjecture, again giving a conjectured formula). It also includes the crossing number inequality, and the Hanani–Tutte theorem on the parity of crossings. The second chapter concerns other special classes of graphs including graph products (especially products of cycle graphs) and hypercube graphs. After a third chapter relating the crossing number to graph parameters including skewness, bisection width, thickness, and (via the Albertson conjecture) the chromatic number, the final chapter of part I concerns the computational complexity of finding minimum-crossing graph drawings, including the results that the problem is both NP-complete and fixed-parameter tractable. In the second part of the book, two chapters concern the rectilinear crossing number, describing graph drawings in which the edges must be represented as straight line segments rather than arbitrary curves, and Fáry's theorem that every planar graph can be drawn without crossings in this way. Another chapter concerns 1-planar graphs and the associated local crossing number, the smallest number k such that the graph can be drawn with at most k crossings per edge. Two chapters concern book embedd
https://en.wikipedia.org/wiki/Zbus
Z Matrix or bus impedance matrix is an important tool in power system analysis. Though it is not frequently used in power flow studies, unlike the Ybus matrix, it is an important tool in other power system studies such as short circuit analysis or fault studies. The Zbus matrix can be computed by matrix inversion of the Ybus matrix. Since the Ybus matrix is usually sparse, the explicit Zbus matrix would be dense and very memory intensive to handle directly. Context Electric power transmission needs optimization, and only computer simulation allows the complex handling required; the Zbus matrix is an important tool in that toolbox. Formulation The Z matrix can be formed by either inverting the Ybus matrix or by using the Zbus building algorithm. The latter method is harder to implement but more practical and faster (in terms of computer run time and number of floating-point operations) for a relatively large system. Because the Zbus is the inverse of the Ybus, it is symmetrical like the Ybus. The diagonal elements of the Zbus are referred to as driving-point impedances of the buses, and the off-diagonal elements are called transfer impedances. One reason the Ybus is so much more popular in calculation is that the matrix becomes sparse for large systems; that is, many elements go to zero as the admittance between two far-away buses is very small. In the Zbus, however, the impedance between two far-away buses becomes very large, so there are no zero elements, making computation much harder. The operations to modify an existing Zbus are straightforward, and outlined in Table 1. To create a Zbus matrix from scratch, we start by listing the equation for one branch: Then we add additional branches according to Table 1 until each bus is expressed in the matrix:
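As a concrete illustration of the inversion route, here is a minimal sketch, not from the article, that forms Zbus from an invented 3-bus Ybus with NumPy; real Ybus entries are complex admittances, but real numbers keep the example short.

```python
import numpy as np

# Toy 3-bus Ybus (siemens, illustrative): diagonal = sum of admittances
# connected to the bus (including a small shunt at bus 3, which keeps
# the matrix non-singular), off-diagonal = negative branch admittance.
Ybus = np.array([[ 3.0, -2.0, -1.0],
                 [-2.0,  5.0, -3.0],
                 [-1.0, -3.0,  4.5]])

# Zbus by direct inversion; note the result is dense even though
# Ybus is sparse for large systems.
Zbus = np.linalg.inv(Ybus)

# Diagonal entries: driving-point impedances of the buses;
# off-diagonal entries: transfer impedances.
print(np.diag(Zbus))
```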
https://en.wikipedia.org/wiki/Belyi%27s%20theorem
In mathematics, Belyi's theorem on algebraic curves states that any non-singular algebraic curve C, defined by algebraic number coefficients, represents a compact Riemann surface which is a ramified covering of the Riemann sphere, ramified at three points only. This is a result of G. V. Belyi from 1979. At the time it was considered surprising, and it spurred Grothendieck to develop his theory of dessins d'enfants, which describes non-singular algebraic curves over the algebraic numbers using combinatorial data. Quotients of the upper half-plane It follows that the Riemann surface in question can be taken to be the quotient H/Γ (where H is the upper half-plane and Γ is a subgroup of finite index in the modular group) compactified by cusps. Since the modular group has non-congruence subgroups, it is not the conclusion that any such curve is a modular curve. Belyi functions A Belyi function is a holomorphic map from a compact Riemann surface S to the complex projective line P1(C) ramified only over three points, which after a Möbius transformation may be taken to be {0, 1, ∞}. Belyi functions may be described combinatorially by dessins d'enfants. Belyi functions and dessins d'enfants – but not Belyi's theorem – date at least to the work of Felix Klein; he used them in his article to study an 11-fold cover of the complex projective line with monodromy group PSL(2,11). Applications Belyi's theorem is an existence theorem for Belyi functions, and has subsequently been much used in the inverse Galois problem.
https://en.wikipedia.org/wiki/EMBRACE%20%28telescope%29
EMBRACE (Electronic MultiBeam Radio Astronomy ConcEpt) is a prototype radio telescope for phase two of the Square Kilometre Array (SKA) project. It is the first dense phased array for radio astronomy in the GHz frequency range (initially planned to cover the 0.5–1.5 GHz mid-frequency band of SKA). It is composed of two sites, one at the Nançay radio telescope station in France, and one near the Westerbork Synthesis Radio Telescope antennas in the Netherlands.
https://en.wikipedia.org/wiki/Chandrasekhar%E2%80%93Fermi%20method
Chandrasekhar–Fermi method or CF method or Davis–Chandrasekhar–Fermi method is a method used to calculate the mean strength of the interstellar magnetic field projected on the plane of the sky. The method was described by Leverett Davis Jr. in 1951 and independently by Subrahmanyan Chandrasekhar and Enrico Fermi in 1953. According to this method, the magnetic field in the plane of the sky is given by B = ξ √(4πρ) σ_v/σ_θ, where ρ is the mass density, σ_v is the line-of-sight velocity dispersion, σ_θ is the dispersion of polarization angles, and ξ is an order-unity factor, typically taken to be ξ ≈ 0.5. The method is also employed for prestellar molecular clouds.
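A back-of-the-envelope evaluation of the formula in CGS units, as a minimal sketch with invented, merely plausible input values rather than measured data:

```python
import math

xi = 0.5                        # order-unity correction factor
rho = 2.0e-20                   # mass density [g / cm^3] (illustrative)
sigma_v = 1.0e5                 # line-of-sight velocity dispersion [cm / s]
sigma_theta = math.radians(10)  # polarization-angle dispersion [rad]

# B = xi * sqrt(4*pi*rho) * sigma_v / sigma_theta, in gauss
B = xi * math.sqrt(4 * math.pi * rho) * sigma_v / sigma_theta
print(f"plane-of-sky field: {B * 1e6:.0f} microgauss")  # ~144 uG here
```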
https://en.wikipedia.org/wiki/Q%20fever
Q fever or query fever is a disease caused by infection with Coxiella burnetii, a bacterium that affects humans and other animals. This organism is uncommon, but may be found in cattle, sheep, goats, and other domestic mammals, including cats and dogs. The infection results from inhalation of a spore-like small-cell variant, and from contact with the milk, urine, feces, vaginal mucus, or semen of infected animals. Rarely, the disease is tick-borne. The incubation period can range from . Humans are vulnerable to Q fever, and infection can result from even a few organisms. The bacterium is an obligate intracellular pathogenic parasite. Signs and symptoms The incubation period is usually two to three weeks. The most common manifestation is flu-like symptoms: abrupt onset of fever, malaise, profuse perspiration, severe headache, muscle pain, joint pain, loss of appetite, upper respiratory problems, dry cough, pleuritic pain, chills, confusion, and gastrointestinal symptoms, such as nausea, vomiting, and diarrhea. About half of infected individuals exhibit no symptoms. During its course, the disease can progress to an atypical pneumonia, which can result in a life-threatening acute respiratory distress syndrome, usually occurring during the first four to five days of infection. Less often, Q fever causes (granulomatous) hepatitis, which may be asymptomatic or become symptomatic with malaise, fever, liver enlargement, and pain in the right upper quadrant of the abdomen. This hepatitis often results in the elevation of transaminase values, although jaundice is uncommon. Q fever can also rarely result in retinal vasculitis. The chronic form of Q fever is virtually identical to endocarditis (i.e. inflammation of the inner lining of the heart), which can occur months or decades following the infection. It is usually fatal if untreated. However, with appropriate treatment, the mortality falls to around 10%. A minority of Q fever survivors develops Q fever fatigue syndrom
https://en.wikipedia.org/wiki/Network%20homophily
Network homophily refers to the theory in network science which states that, based on node attributes, similar nodes may be more likely to attach to each other than dissimilar ones. The hypothesis is linked to the model of preferential attachment and it draws from the phenomenon of homophily in social sciences and much of the scientific analysis of the creation of social ties based on similarity comes from network science. In fact, empirical research seems to indicate the frequent occurrence of homophily in real networks. Homophily in social relations may lead to a commensurate distance in networks leading to the creation of clusters that have been observed in social networking services. Homophily is a key topic in network science as it can determine the speed of the diffusion of information and ideas. Node attributes and homophily The existence of network homophily may necessitate a closer examination of node attributes as opposed to other theories on network evolution which focus on network properties. It is often assumed that nodes are identical and the evolution of networks is determined by the characteristics of the broader network such as the degree. Degree heterogeneity is also observed as a prevalent phenomenon (with a large number of nodes having a small number of links and a few of them having many). It may be linked to homophily as the two seem to show similar characteristics in networks. A large number of excess links caused by degree heterogeneity might be confused with homophily. Influence on network evolution Kim and Altmann (2017) find that homophily may affect the evolution of the degree distribution of scale-free networks. More specifically, homophily may cause a bias towards convexity instead of the often hypothesised concave shape of networks. Thus, homophily can significantly (and uniformly) affect the emergence of scale-free networks influenced by preferential attachment, regardless of the type of seed networks observed (e.g. whether it i
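One common way to quantify homophily on a node attribute is the attribute assortativity coefficient, where values near +1 indicate that like attaches to like. Below is a minimal sketch, not from the article, using NetworkX on an invented toy graph with a made-up "group" attribute.

```python
import networkx as nx

# Toy graph: two tightly knit groups ("a" and "b") joined by one edge.
G = nx.Graph([(1, 2), (2, 3), (3, 1), (4, 5), (5, 6), (6, 4), (3, 4)])
groups = {1: "a", 2: "a", 3: "a", 4: "b", 5: "b", 6: "b"}
nx.set_node_attributes(G, groups, "group")

# Strongly positive here (~0.71), since 6 of 7 edges stay within a group.
print(nx.attribute_assortativity_coefficient(G, "group"))
```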
https://en.wikipedia.org/wiki/International%20Comprehensive%20Ocean-Atmosphere%20Data%20Set
The International Comprehensive Ocean-Atmosphere Data Set (ICOADS) is a digital database of 261 million weather observations made by ships, weather ships, and weather buoys spanning the years 1662 to 2007. The database was initially constructed in 1985 and continues to be expanded upon and updated on a regular basis. From the original data, gridded datasets were created. ICOADS information has been useful in determining the reliability of ship and buoy wind measurements, helping to determine temperature trends in the sea surface temperature field, and updating the Atlantic hurricane database. History Construction of the Comprehensive Ocean-Atmosphere Data Set (COADS) began in 1981 within the United States. In April 1985, the first version of this database was created, including 70 million reports covering the years 1854 through 1979. Each year thereafter, recent data was added to the dataset to extend its length towards the present. In November 1996, the first gridded datasets were created, using a one degree latitude by one degree longitude grid for the years 1960 through 1993. The following November, they were revised and extended to cover years through 1995. In November 1999, they were extended into 1997. In March 2002, two degree grids were created on a monthly basis for the years 1800 through 1949, and in recognition of input from other countries such as the United Kingdom and Germany, the database was renamed ICOADS. In November 2000, the database was extended backward to 1784. In September 2002, this data became available through the internet. In late 2005, data from weather buoys were added into the database. In July 2009, the database was extended back to 1662. Data updates to include recent observations were begun on a monthly basis. The total number of observations in the database is now 261 million. Available information ICOADS has a variety of marine meteorological information within it, such as air temperature, sea surface t
https://en.wikipedia.org/wiki/Membrane%20emulsification
Membrane emulsification (ME) is a relatively novel technique for producing all types of single and multiple emulsions for DDS (drug delivery systems), solid microcarriers for encapsulation of drugs or nutrients, solder particles for surface-mount technology, and monodisperse polymer microspheres (for analytical column packing, enzyme carriers, liquid crystal display spacers, toner core particles). Membrane emulsification was introduced by Nakashima and Shimizu in the late 1980s in Japan. Description In this process, the dispersed phase is forced through the pores of a microporous membrane directly into the continuous phase. Emulsified droplets are formed and detached at the end of the pores with a drop-by-drop mechanism. The advantages of membrane emulsification over conventional emulsification processes are that it enables very fine emulsions of controlled droplet sizes and narrow droplet size distributions to be obtained. Successful emulsification can be carried out with much less consumption of emulsifier and energy, and because of the lowered shear stress effect, membrane emulsification allows the use of shear-sensitive ingredients, such as starch and proteins. The membrane emulsification process is generally carried out in cross-flow (continuous or batch) mode or in a stirred cell (batch). A major limiting factor of ME was the low dispersed phase flux. In order to expand the industrial applications, the productivity of this method had to be increased. Some research has been aimed at solving this problem and others, such as membrane fouling. High dispersed phase flux has now been shown to be possible using single-pass annular gap crossflow membranes.
https://en.wikipedia.org/wiki/Bruno%20Degazio
Bruno Degazio (born March 31, 1958) is a composer, researcher and film sound designer based in Ontario, Canada, where he is also a professor at Sheridan College. Degazio is an expert on computer music. Education Degazio received bachelor's and master's degrees in music from the University of Toronto, where he studied music composition with Gustav Ciamaga, as well as Schenkerian analysis and sound synthesis. He helped establish a contemporary music ensemble, and finished his studies there in 1981. Career Degazio is notable for, among other things, implementing computer music algorithms devised by the music theorist Joseph Schillinger, and for designing systems that reverse-engineer music production from music-theoretic models. He has also studied musical aspects of fractal geometry for the automated composition of music. Degazio was one of the first people in the world to apply fractal techniques to algorithmic composition with some degree of depth. Degazio is proficient with wind controllers, also known as wind synthesizers. His arrangements for this instrument include works by Johann Sebastian Bach and others. Degazio's work on films led to a Genie Award nomination for the film Bye Bye Blues, plus prizes from the Baltimore Film Festival and the Toronto Advertising Awards. He also developed soundtracks for two 3-D IMAX films at the 1990 World's Fair in Osaka, Japan. Arrangements of the Goldberg Variations Degazio has arranged a number of pieces from the Goldberg Variations for other instruments. Following are several examples, performed on electronic wind instrument: Writings "Musical aspects of fractal geometry", Proceedings of the International Computer Music Conference, (San Francisco 1986) "The MIDIFORTH computer music system", Proceedings of Printemps Electroacoustique (Montreal 1987) "The development of context sensitivity in the MIDIFORTH computer music system", Proceedings of the International Computer Music Conference, (Cologne 1988) "
https://en.wikipedia.org/wiki/ISO/IEC%209126
ISO/IEC 9126 Software engineering — Product quality was an international standard for the evaluation of software quality. It has been replaced by ISO/IEC 25010:2011. The fundamental objective of the ISO/IEC 9126 standard is to address some of the well-known human biases that can adversely affect the delivery and perception of a software development project. These biases include changing priorities after the start of a project or not having any clear definitions of "success". By clarifying, then agreeing on the project priorities and subsequently converting abstract priorities (compliance) to measurable values (output data can be validated against schema X with zero intervention), ISO/IEC 9126 tries to develop a common understanding of the project's objectives and goals. The standard is divided into four parts: quality model external metrics internal metrics quality in use metrics. Quality The quality model presented in the first part of the standard, ISO/IEC 9126-1, classifies software quality in a structured set of characteristics and sub-characteristics as follows: Functionality - "A set of attributes that bear on the existence of a set of functions and their specified properties. The functions are those that satisfy stated or implied needs." Suitability Accuracy Interoperability Security Functionality compliance Reliability - "A set of attributes that bear on the capability of software to maintain its level of performance under stated conditions for a stated period of time." Maturity Fault tolerance Recoverability Reliability compliance Usability - "A set of attributes that bear on the effort needed for use, and on the individual assessment of such use, by a stated or implied set of users." Understandability Learnability Operability Attractiveness Usability compliance Efficiency - "A set of attributes that bear on the relationship between the level of performance of the software and the amount of resources used, under stated conditions."
https://en.wikipedia.org/wiki/Undocumented%20feature
An undocumented feature is an unintended or undocumented hardware operation (for example an undocumented instruction) or software feature found in computer hardware and software that is considered beneficial or useful. Sometimes the documentation is omitted through oversight; in other cases, undocumented features are not intended for use by end users, but are left available for use by the vendor for software support and development. Other unintended operations of hardware or software that end up being of utility to users are simply bugs, flaws or quirks. Since the suppliers of the software usually consider the software documentation to constitute a contract for the behavior of the software, undocumented features are generally left unsupported and may be removed or changed at will and without notice to the users. Undocumented or unsupported features are sometimes also called "not manufacturer supported" (NOMAS), a term coined by PPC Journal in the early 1980s. Some user-reported defects are viewed by software developers as working as expected, leading to the catchphrase "it's not a bug, it's a feature" (INABIAF) and its variations. Hardware Undocumented instructions, known as illegal opcodes, on the MOS Technology 6502 and its variants are sometimes used by programmers. These were removed in the WDC 65C02. Video game and demoscene programmers for the Amiga have taken advantage of the unintended operation of its coprocessors to produce new effects or optimizations. In 2019, researchers discovered that a manufacturer debugging mode, known as VISA, had an undocumented feature on Intel Platform Controller Hubs (PCHs), chipsets included on most Intel-based motherboards, which makes the mode accessible from a normal motherboard. Since the chipset has direct memory access, this is problematic for security reasons. Software Undocumented features (for example, the ability to change the switch character in MS-DOS, usually to a hyphen) can be included for compatibility purp
https://en.wikipedia.org/wiki/Windows%20Glyph%20List%204
Windows Glyph List 4, or more commonly WGL4 for short, also known as the Pan-European character set, is a character repertoire on Microsoft operating systems comprising 657 Unicode characters, two of them private use. Its purpose is to provide an implementation guideline for producers of fonts for the representation of European natural languages; fonts that provide glyphs for the entire set of characters can claim WGL4 compliance and thus can expect to be compatible with a wide range of software. For a long time, WGL4 characters were the only ones guaranteed to display correctly on Microsoft Windows. More recent versions of Windows display far more glyphs. Because many fonts are designed to fulfill the WGL4 set, this set of characters is likely to work (display as other than replacement glyphs) on many computer systems. For example, all the non-private-use characters in the table below are likely to display properly, compared to the many missing characters that may be seen in other articles about Unicode. Repertoire The repertoire, defined by Microsoft, encompasses all the characters found in Microsoft's code pages 1252 (Windows Western), 1250 (Windows Central European), 1251 (Windows Cyrillic), 1253 (Windows Greek), 1254 (Windows Turkish), and 1257 (Windows Baltic), as well as characters from DOS code page 437. It does not cover the combining diacritics used by Vietnamese-related code page 1258, the Thai letters used in code page 874, Hebrew and Arabic letters covered by code pages 1255 and 1256, or the ideographic characters used by code pages 932, 936, 949 and 950. It also does not cover the Romanian letters Ș, ș, Ț, and ț (U+0218–B), which were added to several of Microsoft's fonts for Windows Vista (long after the WGL4 repertoire was originally defined). In version 1.5 of the OpenType Specification (May 2008) four Cyrillic characters were added to the WGL4 character set: Ѐ (U+0400), Ѝ (U+040D), ѐ (U+0450) and ѝ (U+045D). Character table Legend See also Adobe Glyph L
https://en.wikipedia.org/wiki/Langley%20%28unit%29
The langley (Ly) is a unit of heat transmission, especially used to express the rate of solar radiation (or insolation) received by the earth. The unit was proposed by Franz Linke in 1942 and named after Samuel Langley (1834–1906) in 1947. Definition One langley is 1 thermochemical calorie per square centimetre, i.e. 41 840 J/m2 (joules per square metre). See also Solar constant Radiant exposure
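The SI value follows directly from the definition; a one-line arithmetic check, assuming only that 1 thermochemical calorie = 4.184 J and 1 m² = 10⁴ cm²:

```python
cal_th = 4.184                 # joules per thermochemical calorie
langley_in_SI = cal_th / 1e-4  # J/cm^2 -> J/m^2
print(langley_in_SI)           # -> 41840.0 J/m^2
```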
https://en.wikipedia.org/wiki/Bateman%20Manuscript%20Project
The Bateman Manuscript Project was a major effort at collation and encyclopedic compilation of the mathematical theory of special functions. It resulted in the eventual publication of five important reference volumes, under the editorship of Arthur Erdélyi. Overview The theory of special functions was a core activity of the field of applied mathematics, from the middle of the nineteenth century to the advent of high-speed electronic computing. The intricate properties of spherical harmonics, elliptic functions and other staples of problem-solving in mathematical physics, astronomy and right across the physical sciences are not easy to document completely, absent a theory explaining the inter-relationships. Mathematical tables to perform actual calculations needed to mesh with an adequate theory of how functions could be transformed into those already tabulated. Harry Bateman, a distinguished applied mathematician, undertook the somewhat quixotic task of trying to collate the content of the very large literature. On his death in 1946, his papers on this project were still in a uniformly rough state. The publication of the edited version provided special functions texts more up-to-date than, for example, the classic Whittaker & Watson. The volumes were out of print for many years, and copyright in the works reverted to the California Institute of Technology, which renewed them in the early 1980s. Dover planned to reprint them for publication in 2007, but this never occurred. In 2011, the California Institute of Technology gave permission for scans of the volumes to be made publicly available. Other mathematicians involved in the project include Wilhelm Magnus, Fritz Oberhettinger and Francesco Tricomi. Askey–Bateman project In 2007, the Askey–Bateman project was announced by Mourad Ismail as a five- or six-volume encyclopedic book series on special functions, based on the works of Harry Bateman and Richard Askey. Starting in 2020, Cambridge University Press bega
https://en.wikipedia.org/wiki/Fibla%20carpenteri
Fibla carpenteri is an extinct species of snakefly in the Inocelliidae genus Fibla. F. carpenteri is named in honor of the paleoentomologist Dr Frank Carpenter, for his vast knowledge and interest in Raphidioptera. The species is known from a single specimen, the holotype, deposited in the Harvard University Museum of Comparative Zoology as specimen #9999. Dr. Michael S. Engel first studied and described the species after finding the specimen in the Harvard collections. He published his type description in the journal Psyche, volume 102, published in 1995. Fairly well preserved in Eocene Baltic amber, the female individual has a torn forewing missing the distal portion, partial antennae, and an ovipositor that is severed and missing the tip. There are also a number of small areas with "schimmel", a type of white mold sometimes present on arthropods in amber. With a total length, not including ovipositor or antennae, of just over , Fibla carpenteri is the largest species of snakefly from amber and the largest species of the genus. As a whole the female shows no light color markings and has a fairly uniform dark brown to black coloration. The wings are hyaline with brown coloration of the vein structure and are slightly fuscous at the base. The pterostigma is also colored brown. F. carpenteri is one of only four extinct Fibla species known from the fossil record. Along with F. erigena, F. carpenteri is one of two known from the Baltic amber deposits, while F. cerdanica is from the Miocene of Spain and F. exusta is from the Eocene of the Florissant Formation, Colorado.
https://en.wikipedia.org/wiki/Stem%20cell%20therapy%20for%20macular%20degeneration
Stem cell therapy for macular degeneration is the use of stem cells to heal or replace dead or damaged cells of the macula in the retina. Stem cell based therapies using bone marrow stem cells as well as retinal pigment epithelial transplantation are being studied. A number of trials have occurred in humans with encouraging results. Historical background In 1959, the first fetal retinal transplant into the anterior chamber of the eyes of animals was reported. Cell culture experiments on RPE were carried out in 1980. Cultured human RPE cells were transplanted into the eyes of animals, first with open techniques and methods and later with closed cavity vitrectomy techniques. In 1991, Gholam Peyman transplanted RPE (retinal pigment epithelium) in humans, but with a limited success rate. Later, allogenic fetal RPE cell transplantation was tried, in which immune rejection of the graft was a major problem. It has also been observed that the rejection rates were lower in dry AMD than in wet AMD. Autologous RPE transplantation is conventionally done employing two techniques, namely RPE suspension and autologous full-thickness RPE-choroid transplantation. Encouraging clinical outcomes have already been reported with the transplantation of the autologous RPE-choroid from the periphery of the eye to a disease-affected portion. Since 2003, researchers have successfully transplanted corneal stem cells into damaged eyes to restore vision. "Sheets of retinal cells used by the team are harvested from aborted fetuses, which some people find objectionable." When these sheets are transplanted over the damaged cornea, the stem cells stimulate renewed repair, eventually restoring vision. One such development came in June 2005, when researchers at the Queen Victoria Hospital of Sussex, England were able to restore the sight of forty people using the same technique. The group, led by Sheraz Daya, was able to successfully use adult stem cells obtained from the patient, a relative, or even
https://en.wikipedia.org/wiki/Levich%20constant
A Levich constant (B) is often used in order to simplify the Levich equation. Furthermore, B is readily extracted from rotating disk electrode experimental data. Writing the Levich equation for the limiting current as I_L = B ω^(1/2), B can be defined as: B = 0.620 n F A D^(2/3) ν^(−1/6) C, where n is the number of moles of electrons transferred in the half reaction (number), F is the Faraday constant (C/mol), A is the electrode area (cm2), D is the diffusion coefficient (see Fick's law of diffusion) (cm2/s), ν is the kinematic viscosity (cm2/s), C is the analyte concentration (mol/cm3), and ω is the angular rotation rate of the electrode (rad/s).
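A minimal sketch evaluating B numerically; the parameter values below are invented but typical of aqueous rotating-disk experiments.

```python
# Levich constant B = 0.620 * n * F * A * D**(2/3) * nu**(-1/6) * C
F = 96485.0   # Faraday constant [C/mol]
n = 1         # electrons transferred in the half reaction
A = 0.196     # electrode area [cm^2] (illustrative)
D = 7.6e-6    # diffusion coefficient [cm^2/s] (illustrative)
nu = 0.010    # kinematic viscosity of water [cm^2/s]
C = 1.0e-6    # analyte concentration [mol/cm^3] (i.e. 1 mM)

B = 0.620 * n * F * A * D**(2.0 / 3.0) * nu**(-1.0 / 6.0) * C
print(B)  # limiting current then follows as I_L = B * omega**0.5
```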
https://en.wikipedia.org/wiki/Red%20pulp
The red pulp of the spleen is composed of connective tissue known also as the cords of Billroth and many splenic sinusoids that are engorged with blood, giving it a red color. Its primary function is to filter the blood of antigens, microorganisms, and defective or worn-out red blood cells. The spleen is made of red pulp and white pulp, separated by the marginal zone; 76–79% of a normal spleen is red pulp. Unlike white pulp, which mainly contains lymphocytes such as T cells, red pulp is made up of several different types of blood cells, including platelets, granulocytes, red blood cells, and plasma. The red pulp also acts as a large reservoir for monocytes. These monocytes are found in clusters in the Billroth's cords (red pulp cords). The population of monocytes in this reservoir is greater than the total number of monocytes present in circulation. They can be rapidly mobilised to leave the spleen and assist in tackling ongoing infections. Sinusoids The splenic sinusoids are wide vessels that drain into pulp veins, which themselves drain into trabecular veins. Gaps in the endothelium lining the sinusoids mechanically filter blood cells as they enter the spleen. Worn-out or abnormal red cells attempting to squeeze through the narrow intercellular spaces become badly damaged, and are subsequently devoured by macrophages in the red pulp. In addition to clearing aged red blood cells, the sinusoids also filter out cellular debris, particles that could clutter up the bloodstream. Cells found in red pulp Red pulp consists of a dense network of fine reticular fibers, continuous with those of the splenic trabeculae, to which are applied flat, branching cells. The meshes of the reticulum are filled with blood: White blood cells are found to be in larger proportion than they are in ordinary blood. Large rounded cells, termed splenic cells, are also seen; these are capable of ameboid movement, and often contain pigment and red-blood corpuscles in their interior. The cell
https://en.wikipedia.org/wiki/Fluazinam
Fluazinam is a broad-spectrum fungicide used in agriculture. It is classed as a diarylamine and more specifically an arylaminopyridine. Its chemical name is 3-chloro-N-(3-chloro-2,6-dinitro-4-trifluoromethylphenyl)-5-trifluoromethyl-2-pyridinamine. Its mode of action involves being an extremely potent uncoupler of oxidative phosphorylation in mitochondria and also being highly reactive with thiols. It is unique amongst uncouplers in displaying broad-spectrum activity against fungi together with very low toxicity to mammals, because it is rapidly metabolised to a compound without uncoupling activity. It was first described in 1992 and was developed by researchers at the Japanese company Ishihara Sangyo Kaisha.

Uses
Fluazinam is a protectant fungicide, but is neither systemic nor curative. It acts by inhibiting the germination of spores and the development of infection structures. Although it has activity against many fungi, it is less potent against rusts and powdery mildew and as such has not been commercialised for use in cereal crops. It is widely used to control late blight (P. infestans) in potato due to its activity against the zoospores of the pathogen, which makes it particularly effective at controlling infection of the potato tubers. Because of its extensive usage against late blight in Europe, there are confirmed reports of resistance to fluazinam appearing in P. infestans genotypes EU_33_A2 and EU_37_A2. Fluazinam is also used to control Sclerotinia on peanuts and turf, Botrytis on grapes and beans, and clubroot in brassicas.

Toxicity
The acute oral toxicity is very low, with a median lethal dose in rats above 5000 mg/kg, owing to the compound's rapid reaction with thiols. This thiol reactivity can have negative consequences, however, since repeated exposure can cause skin sensitization and dermatitis to develop in some individuals. Fluazinam also displays low toxicity to birds, bees and worms, but has a high toxicity to fish. The toxicity towards fish is considered to be relat
https://en.wikipedia.org/wiki/SciPy
SciPy (pronounced "sigh pie") is a free and open-source Python library used for scientific computing and technical computing. SciPy contains modules for optimization, linear algebra, integration, interpolation, special functions, FFT, signal and image processing, ODE solvers and other tasks common in science and engineering. SciPy is also a family of conferences for users and developers of these tools: SciPy (in the United States), EuroSciPy (in Europe) and SciPy.in (in India). Enthought originated the SciPy conference in the United States and continues to sponsor many of the international conferences as well as host the SciPy website. The SciPy library is currently distributed under the BSD license, and its development is sponsored and supported by an open community of developers. It is also supported by NumFOCUS, a community foundation for supporting reproducible and accessible science.

Components
The SciPy package is at the core of Python's scientific computing capabilities. Available sub-packages include:
cluster: hierarchical clustering, vector quantization, K-means
constants: physical constants and conversion factors
fft: discrete Fourier transform algorithms
fftpack: legacy interface for discrete Fourier transforms
integrate: numerical integration routines
interpolate: interpolation tools
io: data input and output
linalg: linear algebra routines
misc: miscellaneous utilities (e.g. example images)
ndimage: various functions for multi-dimensional image processing
odr: orthogonal distance regression classes and algorithms
optimize: optimization algorithms including linear programming
signal: signal processing tools
sparse: sparse matrices and related algorithms
spatial: algorithms for spatial structures such as k-d trees, nearest neighbors, convex hulls, etc.
special: special functions
stats: statistical functions
weave: tool for writing C/C++ code as Python multiline strings (now deprecated in favor of Cython)

Data structures
The basic dat
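A minimal usage sketch showing two of the sub-packages listed above in action (the inputs are arbitrary toy problems chosen for illustration):

import numpy as np
from scipy import integrate, optimize

# Numerically integrate sin(x) over [0, pi]; quad returns (value, error estimate).
value, err = integrate.quad(np.sin, 0, np.pi)            # expect ~2.0

# Find a root of cos(x) - x with Brent's method on a bracketing interval.
root = optimize.brentq(lambda x: np.cos(x) - x, 0, 1)    # expect ~0.739

print(value, root)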
https://en.wikipedia.org/wiki/Martin%27s%20axiom
In the mathematical field of set theory, Martin's axiom, introduced by Donald A. Martin and Robert M. Solovay, is a statement that is independent of the usual axioms of ZFC set theory. It is implied by the continuum hypothesis, but it is consistent with ZFC and the negation of the continuum hypothesis. Informally, it says that all cardinals less than the cardinality of the continuum, 2^ℵ0, behave roughly like ℵ0. The intuition behind this can be understood by studying the proof of the Rasiowa–Sikorski lemma. It is a principle that is used to control certain forcing arguments.

Statement
For any cardinal 𝛋, consider the following statement:

MA(𝛋) For any partial order P satisfying the countable chain condition (hereafter ccc) and any family D of dense subsets of P such that |D| ≤ 𝛋, there is a filter F on P such that F ∩ d is non-empty for every d in D.

In this context (for the application of ccc), an antichain is a subset A of P such that any two distinct members of A are incompatible (two elements are said to be compatible if there exists a common element below both of them in the partial order). This differs from, for example, the notion of antichain in the context of trees.

MA(ℵ0) is simply true: this is the Rasiowa–Sikorski lemma. MA(2^ℵ0) is false: [0, 1] is a separable compact Hausdorff space, and so (P, the poset of open subsets under inclusion, is) ccc. But now consider the following two size-2^ℵ0 families of dense sets in P: no x ∈ [0, 1] is isolated, so each x defines the dense subset {S : x ∉ S}; and each r ∈ (0, 1] defines the dense subset {S : diam(S) < r}. The two families combined are also of size 2^ℵ0, and a filter meeting both must simultaneously avoid all points of [0, 1] while containing sets of arbitrarily small diameter. But a filter F containing sets of arbitrarily small diameter must contain a point in ⋂F by compactness, a contradiction.

Martin's axiom is then that MA(𝛋) holds "as long as possible":

Martin's axiom (MA) For every 𝛋 < 2^ℵ0, MA(𝛋) holds.
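In symbols, the statement MA(𝛋) defined above may be typeset as follows (a direct restatement of the definition, with 𝒟 denoting the family of dense sets):

\mathrm{MA}(\kappa):\quad \text{for every ccc partial order } P
\text{ and every family } \mathcal{D} \text{ of dense subsets of } P
\text{ with } |\mathcal{D}| \le \kappa,\
\exists F \subseteq P \text{ a filter with } F \cap D \neq \varnothing
\text{ for all } D \in \mathcal{D}.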
https://en.wikipedia.org/wiki/Alpha%20oxidation
Alpha oxidation (α-oxidation) is a process by which certain branched-chain fatty acids are broken down by removal of a single carbon from the carboxyl end. In humans, alpha-oxidation is used in peroxisomes to break down dietary phytanic acid, which cannot undergo beta-oxidation due to its β-methyl branch, into pristanic acid. Pristanic acid can then acquire CoA and subsequently be beta-oxidized, yielding propionyl-CoA.

Pathway
Alpha-oxidation of phytanic acid is believed to take place entirely within peroxisomes.
Phytanic acid is first attached to CoA to form phytanoyl-CoA.
Phytanoyl-CoA is oxidized by phytanoyl-CoA dioxygenase, in a process using Fe2+ and O2, to yield 2-hydroxyphytanoyl-CoA.
2-hydroxyphytanoyl-CoA is cleaved by 2-hydroxyphytanoyl-CoA lyase in a TPP-dependent reaction to form pristanal and formyl-CoA (the latter in turn broken down into formate and eventually CO2).
Pristanal is oxidized by aldehyde dehydrogenase to form pristanic acid, which can then undergo beta-oxidation; propionyl-CoA is released during beta-oxidation when the beta carbon is substituted.

Deficiency
Enzymatic deficiency in alpha-oxidation (most frequently in phytanoyl-CoA dioxygenase) leads to Refsum's disease, in which the accumulation of phytanic acid and its derivatives leads to neurological damage. Other disorders of peroxisome biogenesis also prevent alpha-oxidation from occurring.
https://en.wikipedia.org/wiki/Oportuzumab%20monatox
Oportuzumab monatox is an experimental anti-cancer medication. Chemically, oportuzumab is a single chain variable fragment of a monoclonal antibody which binds to the epithelial cell adhesion molecule (EpCAM, the tumor-associated calcium signal transducer 1). Oportuzumab is fused with Pseudomonas aeruginosa exotoxin A (which is reflected by the "monatox" in the medication's name). The drug was developed by the Canadian company Viventia Bio Inc. The company was acquired by Cambridge, Massachusetts-based Eleven Biotherapeutics in 2016, which then changed its name to Sesen Bio. In 2019, Sesen Bio reported updated, preliminary primary and secondary endpoint data from the company's Phase 3 VISTA trial that, in the company's view, further supported the benefit-risk profile of Vicineum for the potential treatment of patients with high-risk, bacillus Calmette-Guérin (BCG) unresponsive, non-muscle invasive bladder cancer (NMIBC). The company applied for approval of Vicineum by the United States Food and Drug Administration and the European Medicines Agency.
https://en.wikipedia.org/wiki/Bromethalin
Bromethalin is a neurotoxic rodenticide that damages the central nervous system.

History
Bromethalin was discovered in the early 1980s during an effort to find replacements for first-generation anticoagulant rodenticides, especially ones that would be useful against rodents that had become resistant to warfarin-type anticoagulant poisons. A structured study was undertaken to develop a substance that would be both poisonous to rodents and readily eaten by them. Bromethalin, N-methyl-2,4-dinitro-N-(2,4,6-tribromophenyl)-6-(trifluoromethyl)benzeneamine, was the outcome of that study, as it had both of the desired rodenticidal properties.

Mechanism of action
Bromethalin works by being metabolised to N-desmethyl-bromethalin, which uncouples mitochondrial oxidative phosphorylation and so decreases adenosine triphosphate (ATP) synthesis. The decreased ATP inhibits the activity of the Na/K ATPase enzyme, leading to a subsequent buildup of cerebrospinal fluid (CSF) and vacuolization of myelin. The excess CSF results in increased intracranial pressure, which in turn permanently damages neuronal axons. This damage to the central nervous system can cause paralysis, convulsions, and death.

Risk of poisoning to humans and pets
Despite the risk of severe symptoms and death, most unintentional pediatric exploratory exposures (licking or tasting a pellet) have not shown serious effects, and no deaths have been reported at this time in children, though toxicity is possible if significant amounts are ingested. Because the active metabolite must be generated to produce toxicity, fatal toxicity may be delayed by hours to days. All cases should be managed in consultation with a local poison control center. All intentional ingestions for self-harm carry significant risk of death or severe neurologic effects and require monitoring in a hospital setting. In humans the most common initial effects of unintentional exposure are nausea, vomiting, abdomin
https://en.wikipedia.org/wiki/Xylotheque
A xylotheque or xylothek (from the Greek xylon, "wood", and theke, "repository") is a special form of herbarium that consists of a collection of authenticated wood specimens. It is also known as a xylarium (from the Greek for "wood" and the Latin suffix -arium, denoting a place set apart). Traditionally, xylotheque specimens were in the form of book-shaped volumes, each made of a particular kind of wood and holding samples of the different parts of the corresponding plant. While the terms are often used interchangeably, some use xylotheque to refer to these older collections of wooden 'books' and xylarium for modern collections in which some or all of the specimens are in simpler shapes, such as blocks or plaques with information engraved on their surfaces. Many countries have at least one xylotheque with native flora, and some also house flora from other parts of the world. They are valuable to specialists in forestry, botany, conservation, forensics, art restoration, paleontology, archaeology, and other fields.

History
Xylotheques date back to the later 17th century, when wood specimens began to appear in cabinets of curiosity. Over time, collections grew larger and more systematic, with hundreds of individual volumes in a single collection. The oldest extant collection was established in 1823 at the University of Leningrad, and by the middle of the century xylotheques had been established in many European countries. Australia now houses 12 xylaria holding 11% of the world's wood specimens, while the Oxford Forestry Institute's xylarium holds about 13%. In older xylotheques, the wooden volumes were typically made out of the same wood as the specimens inside and sometimes decorated with tree bark and associated lichens and mosses. Each volume housed seeds, flowers, twigs, and leaves from the corresponding tree or bush, along with a written description hidden in a small compartment set into the inner spine. An alternative form of xylotheque found in Japan and elsewhere featured paintings of the plan
https://en.wikipedia.org/wiki/Cloud%20storage%20gateway
A cloud storage gateway is a hybrid cloud storage device, implemented in hardware or software, which resides at the customer premises and translates cloud storage APIs such as SOAP or REST to block-based storage protocols such as iSCSI or Fibre Channel, or to file-based interfaces such as NFS or SMB. According to a 2011 report by Gartner Group, cloud gateways were expected to increase the use of cloud storage by lowering monthly charges and easing concerns about data security.

Technology
Features
Modern applications (also known as "cloud native" applications) use network attached storage by means of REST and SOAP over the Hypertext Transfer Protocol, with the related storage provided by arrays that offer these interfaces as object storage. Classic applications use network attached storage by means of the Network File System (NFS), iSCSI or Server Message Block (SMB). To take advantage of object storage, existing applications would need to be rewritten, and new applications would need to be written object storage aware, which is not the case by default. This problem is addressed by cloud storage gateways: they offer object storage via classic storage protocols like NFS or SMB (and a very few offer iSCSI as well). As a rule of thumb, cloud storage gateways let classic applications use cloud-native object storage without modification.

Functionality
In enterprise infrastructures, NFS is mainly used by Linux systems whereas Windows systems use SMB. Object storage stores data in the form of objects rather than files, so all cloud storage gateways must cache incoming files and destage them to object storage at a later step. The time of destaging is subject to the gateway, and a policy engine allows functions such as the following (content-based destaging is sketched below):
pinning: bind specific files to the cache and destage them only for mirroring purposes
content-based destaging: move only files with specific characteristics to object storage, e.g. all MP3 files
multi-
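The destaging step can be pictured with a small sketch. It is illustrative only: the gateway products themselves are proprietary, and the cache path, bucket name, and policy below are invented. It walks a local cache directory and moves files matching a content policy to an S3-compatible object store:

import os
import boto3  # any S3-compatible endpoint works

s3 = boto3.client("s3")
CACHE_DIR = "/var/cache/gateway"        # hypothetical local cache
BUCKET = "gateway-destage-demo"         # hypothetical bucket
PINNED = {"/var/cache/gateway/db.img"}  # pinned files stay in the cache

def destage(policy=lambda path: path.endswith(".mp3")):
    # Upload cache files matching `policy` to object storage, then delete
    # the local copy -- a toy version of content-based destaging.
    for root, _dirs, files in os.walk(CACHE_DIR):
        for name in files:
            path = os.path.join(root, name)
            if path in PINNED or not policy(path):
                continue
            key = os.path.relpath(path, CACHE_DIR)  # object key mirrors the path
            s3.upload_file(path, BUCKET, key)
            os.remove(path)

destage()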
https://en.wikipedia.org/wiki/Osteophagy
Osteophagy is the practice in which animals, usually herbivores, consume bones. Most vegetation around the world lacks sufficient amounts of phosphate. Phosphorus is an essential mineral for all animals, as it plays a major role in the formation of the skeletal system and is necessary for many biological processes including energy metabolism, protein synthesis, cell signaling, and lactation. Phosphate deficiency can cause physiological side effects, especially pertaining to the reproductive system, as well as delayed growth and failure to regenerate new bone. The importance of sufficient phosphorus further resides in the physiological importance of maintaining a proper phosphorus to calcium ratio. A Ca:P ratio of 2:1 is important for the absorption of these minerals, as deviations from this optimal ratio can inhibit their absorption. The dietary calcium and phosphorus ratio, along with vitamin D, regulates bone mineralization and turnover by affecting calcium and phosphorus transport and absorption in the intestine. It has been suggested that osteophagy is an innate behavior that allows animals to supplement their phosphorus and calcium uptake in order to avoid the costly effects of deficiencies in these minerals. Osteophagic behavior has been observed in pastoral and wild animals, most notably ungulates and other herbivores, for over two hundred years. Osteophagy has been inferred from archaeological studies of dental wear in Pleistocene fossils dating back 780,000 years. It has been seen in domestic animals, as well as red deer, camels, giraffes, wildebeest, antelopes, tortoises, and grizzly bears. Due to differences in tooth structure, herbivores tend to chew old dry bones that are easier to break, while carnivores prefer to chew softer fresh bones. Variations of the behavior have also been observed in humans. While osteophagy has been regarded as a beneficial behavior to combat mineral deficiencies in animals, osteopha
https://en.wikipedia.org/wiki/Carl%20Griffith%27s%20sourdough%20starter
Carl Griffith's sourdough starter, also known as the Oregon Trail Sourdough or Carl's starter, is a sourdough culture, a colony of wild yeast and bacteria cultivated in a mixture of flour and water for use as leavening. Carl's starter has a long history, dating back at least to 1847, when it was carried along the Oregon Trail by settlers from Missouri to Oregon. It was then passed down as an heirloom within the family of Carl Griffith, who shared it via Usenet in the 1990s. Since the year 2000, it has been maintained and shared by a dedicated historical preservation society; its volunteers keep the starter alive, feeding the organisms flour and water, and mail free samples worldwide on request for use by bakers in seeding their own cultures. As with any other sourdough starter, the yeasts in Carl's starter generate carbon dioxide by fermentation when added to bread dough, causing it to rise. Bacteria of the genus Lactobacillus produce lactic acid, giving the bread a sour flavor. Carl's starter is especially robust, quick-rising, and tolerant of mistreatment, producing a consistent, reliable rise and good flavor.

History
According to Carl T. Griffith, his family's sourdough culture was originally created by his great-grandmother, who traveled west with her sourdough from Missouri along the Oregon Trail in 1847, settling near Salem, Oregon. The sourdough starter was passed down to 10-year-old Carl Griffith in about 1930 in a Basque-American sheep camp. His family was building a homestead in the Steens Mountains at the time, and he baked bread in a Dutch oven in a campfire-heated pit. Griffith took his starter on cattle drives in southeastern Oregon, during which he baked in chuck wagons. Griffith went on to become a lawyer, a World War II veteran, and a retired lieutenant colonel of the United States Air Force Reserve.

Usenet
The starter became publicly known in the 1990s through the early Internet via Griffith's presence on Usenet's rec.food.sourdough group. Though on
https://en.wikipedia.org/wiki/Medical%20consensus
Medical consensus is a public statement on a particular aspect of medical knowledge at the time the statement is made that a representative group of experts agree to be evidence-based and state-of-the-art (state-of-the-science) knowledge. Its main objective is to counsel physicians on the best possible and acceptable way to diagnose and treat certain diseases, or how to address a particular decision-making area. It is therefore usually considered an authoritative, community-based expression of a consensus decision-making and publication process.

Methods
There are many ways of producing medical consensus, but the most usual is to convene an independent panel of experts, either through a medical association or a governmental authority. Since consensus statements provide a "snapshot in time" of the state of knowledge on a particular topic, they must periodically be re-evaluated and published again, replacing the previous consensus statement. Consensus statements differ from medical guidelines, another form of state-of-the-science public statement. According to the NIH, "Consensus statements synthesize new information, largely from recent or ongoing medical research, that has implications for reevaluation of routine medical practices. They do not give specific algorithms or guidelines for practice."

History
From 1977 to 2013, the National Institutes of Health (United States) convened about five to six consensus panels per year, and organized this knowledge by means of a special Consensus Development Program, managed by the NIH's Office of Disease Prevention (ODP). The program was retired in 2013 in deference to other agencies and organizations that had picked up the lead, such as the U.S. Preventive Services Task Force, the Community Preventive Services Task Force, the Institute of Medicine, and Cochrane. Its archive is available in printed form as well as for downloading from the Internet.

See also
Medical decision making
Evidence-based medicine
Medical literature
Me
https://en.wikipedia.org/wiki/Upload%20components
Upload components are software products that are designed to be embedded into a web site to add upload functionality to it. Upload components are designed to replace the standard HTML4 upload mechanism, offering a more user-friendly interface and supporting a wider range of features.

HTML file uploads
The HTML4 standard supports requesting files from a client computer and uploading them to a server. The standard mechanism for this type of data transmission is HTML forms. With HTML forms, a user's files can be uploaded by employing the input tag with different attributes. This method allows web site developers to implement basic upload functionality. However, it has the following disadvantages:
Multiple file upload is not available – a user can upload only one file at a time.
Limited upload size – files of more than a few dozen megabytes (MB) often cannot be sent reliably via HTTP.
No optimization before uploading files is available.
Poor visualization – a user cannot see any information about the upload progress and estimated upload time.
Preview of selected files is not supported.
Awkward look and feel – the way a user selects files for upload is inconvenient.

HTML upload alternatives
Upload components allow for bypassing the HTML upload restrictions and disadvantages noted previously. An upload component is a plug-in which allows uploading files from a client to a server. Usually upload components are developed by third-party companies and can be integrated with any website on any platform. The user's web browser displays the embedded upload component as a part of the web page. Upload components can be built with various technologies: Flash, Silverlight, Java, ActiveX, and HTML5. The W3C community is in the process of developing an HTML5 standard, the full specification of which is expected by 2014. HTML5 is supposed to support multimedia content without any plug-ins or components. For upload functionality, new HTML5 APIs offer a wide
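The underlying mechanism in both HTML forms and most upload components is a multipart/form-data POST. A minimal sketch of such a request from a script (the URL, field name, and file names are placeholders, not a real service):

import requests

# Two files sent in one multipart/form-data POST -- the same wire format
# an HTML form with a file input produces.
with open("photo1.jpg", "rb") as f1, open("photo2.jpg", "rb") as f2:
    response = requests.post(
        "https://example.com/upload",           # placeholder endpoint
        files=[("files", ("photo1.jpg", f1)),   # (field name, (filename, handle))
               ("files", ("photo2.jpg", f2))],
    )
print(response.status_code)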
https://en.wikipedia.org/wiki/Herbert%20Booth
Herbert Henry Howard Booth (26 August 1862 – 25 September 1926) was a Salvation Army officer, the fifth child and third son of William and Catherine Booth (née Mumford), who later went on to serve as an independent evangelist. He oversaw the Limelight Department's development and was the writer and director of Soldiers of the Cross.

Early life
Herbert, who was born in Penzance, Cornwall, received little formal elementary education but became a student at Allesly Park College and the Congregational Institute at Nottingham. At the age of twenty, Herbert began helping his sister Kate Booth in building up The Salvation Army in France. Two years later, he was given charge of England's cadet officer training. He wrote many songs for The Salvation Army and became a bandmaster and a songster leader. He was the first Salvation Army officer to use the magic lantern for presentations in England. In 1886, Herbert Booth took ill and went to Australia to rest and heal. While staying in a mining town there, he found a gold nugget. He eventually forged a ring out of it for his future wife, the Dutch Salvationist Cornelie Schoch.

Salvation Army
Herbert Booth took command of all Salvation Army operations in the British Isles when he was 26. Then, from 1892 to 1896, he was the Commandant for the Salvation Army in Canada. Next, he was appointed to the Australasian Territory, where his health continued to deteriorate. He struggled with depression, but was still very active in his position. In Australia, Herbert took considerable interest in the Salvation Army's Limelight Department. He soon authorized extensive expansion, allowing Limelight to make Australia's first fictional narrative film in 1897. The following year, he and early cinematographer Joe Perry produced Social Salvation, a multimedia presentation that portrayed the work of The Salvation Army in its Australasian Territory. Whilst appointed to the territory Herbert also founded the Hamodava Tea Company, which pion
https://en.wikipedia.org/wiki/Northern%20resident%20orcas
Northern resident orcas, also known as northern resident killer whales (NRKW), are one of four separate, non-interbreeding communities of the exclusively fish-eating ecotype of orca in the northeast portion of the North Pacific Ocean. They live primarily off the coast of British Columbia (BC), Canada, and also travel to southeastern Alaska and northern Washington state in the United States. The northern resident population consists of three clans (A, G, R), each made up of several pods with one or more matrilines within each pod. The Northern residents are genetically distinct from the southern resident orcas, and their calls are also quite distinct.

Social structure
Like the Southern residents, the Northern residents live in groups of matrilines. A typical Northern resident matriline consists of an elder female, her offspring, and the offspring of her daughters. Both male and female orcas remain within their natal matriline for life. Matrilines have a tendency to split apart over time. Pods consist of related matrilines that tend to travel, forage, socialize, and rest together. Each pod has a unique dialect of acoustic calls, and pods that share one or more calls belong to a common clan.

Behaviours
In the summer months the Northern residents can often be observed swimming close to the shores of Johnstone Strait and positioning their stomachs to rub themselves on beach pebbles. More than 90% of the Northern resident population observed in Johnstone Strait visits these rubbing beaches, and the whales emit specific calls more frequently while engaging in this activity. Although it is not clear why they do so, beach rubbing has been identified as an important activity in the culture of the entire Northern resident community. This behaviour was originally thought to be unique to the Northern resident community; however, the Southern Alaska resident killer whales have also been observed beach rubbing.

Location
The Northern residents
https://en.wikipedia.org/wiki/Quadrupole%20magnet
Quadrupole magnets, abbreviated as Q-magnets, consist of groups of four magnets laid out so that in the planar multipole expansion of the field the dipole terms cancel, and the lowest significant terms in the field equations are quadrupole. Quadrupole magnets are useful because they create a magnetic field whose magnitude grows rapidly with the radial distance from the longitudinal axis. This is used in particle beam focusing. The simplest magnetic quadrupole is two identical bar magnets parallel to each other such that the north pole of one is next to the south of the other and vice versa. Such a configuration will have no dipole moment, and its field will decrease at large distances faster than that of a dipole. A stronger version with very little external field involves using a k=3 Halbach cylinder. In some designs of quadrupoles using electromagnets, there are four steel pole tips: two opposing magnetic north poles and two opposing magnetic south poles. The steel is magnetized by a large electric current in the coils of tubing wrapped around the poles. Another design is a Helmholtz coil layout but with the current in one of the coils reversed.

Quadrupoles in particle accelerators
At the particle speeds reached in high energy particle accelerators, the magnetic force term qv × B is larger than the electric term qE in the Lorentz force

F = q(E + v × B),

and thus magnetic deflection is more effective than electrostatic deflection. Therefore, a 'lattice' of electromagnets is used to bend, steer and focus a charged particle beam. The quadrupoles in the lattice are of two types: 'F quadrupoles' (which are horizontally focusing but vertically defocusing) and 'D quadrupoles' (which are vertically focusing but horizontally defocusing). This situation is due to the laws of electromagnetism (the Maxwell equations), which show that it is impossible for a quadrupole to focus in both planes at the same time. An example is a quadrupole focusing in the vertical directi
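Near the axis, an ideal (normal) quadrupole field can be written B = (g·y, g·x, 0) for a field gradient g, so its magnitude g·r grows linearly with radial distance r. A small numerical sketch of this (the gradient value is arbitrary, chosen only for illustration):

import math

g = 25.0  # field gradient in T/m -- an arbitrary illustrative value

def quad_field(x, y):
    # Ideal (normal) quadrupole field near the axis: Bx = g*y, By = g*x.
    return g * y, g * x

for r in (0.001, 0.005, 0.010):                      # radial offsets in metres
    bx, by = quad_field(r / math.sqrt(2), r / math.sqrt(2))
    assert abs(math.hypot(bx, by) - g * r) < 1e-12   # |B| = g * r
    print(f"r = {r:.3f} m -> |B| = {math.hypot(bx, by):.4f} T")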
https://en.wikipedia.org/wiki/Callosobruchus%20maculatus
Callosobruchus maculatus is a species of beetles known commonly as the cowpea weevil or cowpea seed beetle. It is a member of the leaf beetle family, Chrysomelidae, and not a true weevil. This common pest of stored legumes has a cosmopolitan distribution, occurring on every continent except Antarctica. The beetle most likely originated in West Africa and moved around the globe with the trade of legumes and other crops. As only a small number of individuals were likely present in legumes carried by people to distant places, the populations that have invaded various parts of the globe have likely gone through multiple bottlenecks. Despite these bottlenecks and the subsequent rounds of inbreeding, these populations persist. This ability to withstand a high degree of inbreeding has likely contributed to this species’ prevalence as a pest. It is used as a model organism for both research and education due to its quick generation time, sexual dimorphism, and ease of maintenance.

Description
The cowpea weevil lacks the "snout" of a true weevil. It is more elongated in shape than other members of the leaf beetle family. It is reddish-brown overall, with black and gray elytra marked with two central black spots. The last segment of the abdomen extends out from under the short elytra, and also has two black spots. The beetle is sexually dimorphic and males are easily distinguished from females. The females are sometimes larger than males, but this is not true of all strains. Females are darker overall, while males are brown. The plate covering the end of the abdomen is large and dark in color along the sides in females, and smaller without the dark areas in males. There are two morphs of C. maculatus, a flightless form and a flying form. The flying form is more common in beetles that developed in conditions of high larval density and high temperatures. The flying form has a longer lifespan and lower fecundity, and the sexes are less dimorphic and can be more difficult to
https://en.wikipedia.org/wiki/Lyapunov%20time
In mathematics, the Lyapunov time is the characteristic timescale on which a dynamical system is chaotic. It is named after the Russian mathematician Aleksandr Lyapunov. It is defined as the inverse of the system's largest Lyapunov exponent.

Use
The Lyapunov time reflects the limits of the predictability of the system. By convention, it is defined as the time for the distance between nearby trajectories of the system to increase by a factor of e. However, measures in terms of 2-foldings and 10-foldings are sometimes used, since they correspond to the loss of one bit of information or one digit of precision respectively. While it is used in many applications of dynamical systems theory, it has been particularly used in celestial mechanics, where it is important for the problem of the stability of the Solar System. However, empirical estimation of the Lyapunov time is often associated with computational or inherent uncertainties.

See also
Belousov–Zhabotinsky reaction
Molecular chaos
Three-body problem
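As an illustrative sketch using the logistic map x_{n+1} = r·x_n(1 − x_n) as a toy system, the largest Lyapunov exponent can be estimated as the orbit average of ln|f′(x_n)|, and the Lyapunov time is its inverse. For r = 4 the exact exponent is ln 2, giving a Lyapunov time of 1/ln 2 ≈ 1.44 iterations:

import math

r = 4.0          # fully chaotic logistic map
x = 0.123        # arbitrary initial condition in (0, 1)
total, n = 0.0, 100_000

for _ in range(n):
    total += math.log(abs(r * (1.0 - 2.0 * x)))  # ln|f'(x_n)|, f'(x) = r(1 - 2x)
    x = r * x * (1.0 - x)

lyap_exponent = total / n           # converges to ln 2 = 0.693...
lyap_time = 1.0 / lyap_exponent     # ~1.44 iterations
print(lyap_exponent, lyap_time)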
https://en.wikipedia.org/wiki/Beevor%27s%20axiom
Beevor's Axiom is the idea that the brain does not know muscles, only movements. In other words, the brain registers the movements that muscles combine to make, not the individual muscles that make them. This is why one can sign one's name (albeit poorly) with one's foot. Beevor's Axiom was coined by Dr. Charles Edward Beevor, an English neurologist, who presented it in a series of four lectures delivered from June 3, 1903 to July 4, 1903 before the Royal College of Physicians of London as part of the Croonian Lectures. His experiments showed that when an area of the cortex was stimulated, the body responded with a movement, not with the contraction of a single muscle. Beevor concluded that "only co-ordinated movements are represented in the excitable cortex". In relation to Beevor's Axiom, it has been found that the brain encodes sequences, such as playing the piano, signing one's name, wiping off a counter, and chopping vegetables, and that once these are encoded and practiced, it takes less brain activity to perform them. This supports Beevor's Axiom, because the brain can recall movements more easily than it can learn them. Beevor's Axiom is only partially true, however. Most muscle behavior is encoded in the primary motor cortex (M1) and separated by muscle group. In an effort to understand the encoding in the M1, researchers recorded motor commands in monkeys. M1 neurons changed their firing rates according to the direction of the arm movements, and each neuron has one direction that elicits the greatest response. Some M1 neurons encode muscle contractions, while others react to particular movements, regardless of the muscles used to perform them. The key characteristic of the primary motor cortex is its dynamic nature; the M1 changes based on experience. The supplementary motor area (SMA) plays a key role in initiating motion sequences, and the premotor cortex (PMA) plays a key role when motor sequences are guided by external events. They map behaviors, as opposed to the M1 whi
https://en.wikipedia.org/wiki/Train%20track%20map
In the mathematical subject of geometric group theory, a train track map is a continuous map f from a finite connected graph to itself which is a homotopy equivalence and which has particularly nice cancellation properties with respect to iteration. This map sends vertices to vertices and edges to nontrivial edge-paths with the property that for every edge e of the graph and for every positive integer n the path f^n(e) is immersed, that is, f^n(e) is locally injective on e. Train-track maps are a key tool in analyzing the dynamics of automorphisms of finitely generated free groups and in the study of the Culler–Vogtmann Outer space.

History
Train track maps for free group automorphisms were introduced in a 1992 paper of Bestvina and Handel. The notion was motivated by Thurston's train tracks on surfaces, but the free group case is substantially different and more complicated. In their 1992 paper Bestvina and Handel proved that every irreducible automorphism of Fn has a train-track representative. In the same paper they introduced the notion of a relative train track and applied train track methods to solve the Scott conjecture, which says that for every automorphism α of a finitely generated free group Fn the fixed subgroup of α is free of rank at most n. In a subsequent paper Bestvina and Handel applied the train track techniques to obtain an effective proof of Thurston's classification of homeomorphisms of compact surfaces (with or without boundary), which says that every such homeomorphism is, up to isotopy, either reducible, of finite order, or pseudo-Anosov. Since then train tracks have become a standard tool in the study of algebraic, geometric and dynamical properties of automorphisms of free groups and of subgroups of Out(Fn). Train tracks are particularly useful since they allow one to understand long-term growth (in terms of length) and cancellation behavior for large iterates of an automorphism of Fn applied to a particular conjugacy class in Fn. This information
https://en.wikipedia.org/wiki/List%20of%20mechanical%20keyboards
Mechanical keyboards (or mechanical-switch keyboards) are computer keyboards which have an individual switch for each key. The following table is a compilation of mechanical keyboard models, brands, and series:
https://en.wikipedia.org/wiki/Telogis
Telogis was a privately held US-based company that developed location-based software to manage mobile resources. Telogis sold software as a service (SaaS) which incorporated location information into applications for fleet owners, as well as geospatial software development toolkits. In 2016, Telogis was acquired by Verizon.

History
Telogis was founded in 2001 by Howard Jelinek, Newth Morris and Ralph Mason as a trunked radio hardware and software provider. Former Novell Inc. Chairman and CEO Jack Messman was chairman of the board of directors. The company was started with investments from its founders, and by 2012 the privately held company had approximately $69 million in revenue. Wall Street Journal reporter Don Clark said of the company's technology that "Telogis exploited the evolution of software-as-a-service–placing data from vehicles in the cloud ... so that companies that own vehicle fleets can track their cars and trucks without the need to set up their own servers for the purpose." In July 2010 Telogis acquired the assets of Remote Dynamics; then, in February 2011, Telogis acquired the assets of Intergis, a provider of routing, mobile resource and fleet management software for approximately 2,000 small to mid-sized business fleets. In July 2012 Telogis acquired Navtrak. The acquisition resulted in Telogis Navigation, commercial navigation software that gives professional drivers built-in feedback systems providing up-to-date road network information. The company's first outside venture capital was announced in October 2013, with $93 million from Kleiner Perkins Caufield & Byers. In February 2011 private investors provided $2.9 million, and additional funding was received from GM Ventures in 2014. On 21 June 2016, it was announced that Telogis would merge with Verizon via an acquisition. On 6 March 2018, a press release announced that Telogis would be rebranded, along with its sister companies Fleetmatics and Verizon Telematics, as Verizon Connect.

Software
Te
https://en.wikipedia.org/wiki/Electric%20power%20quality
Electric power quality is the degree to which the voltage, frequency, and waveform of a power supply system conform to established specifications. Good power quality can be defined as a steady supply voltage that stays within the prescribed range, a steady AC frequency close to the rated value, and a smooth voltage waveform (one that resembles a sine wave). In general, it is useful to consider power quality as the compatibility between what comes out of an electric outlet and the load that is plugged into it. The term is used to describe electric power that drives an electrical load and the load's ability to function properly. Without the proper power, an electrical device (or load) may malfunction, fail prematurely or not operate at all. There are many ways in which electric power can be of poor quality, and many more causes of such poor quality power. The electric power industry comprises electricity generation (AC power), electric power transmission and ultimately electric power distribution to an electricity meter located at the premises of the end user of the electric power. The electricity then moves through the wiring system of the end user until it reaches the load. The complexity of the system required to move electric energy from the point of production to the point of consumption, combined with variations in weather, generation, demand and other factors, provides many opportunities for the quality of supply to be compromised. While "power quality" is a convenient term for many, it is the quality of the voltage, rather than power or electric current, that the term actually describes. Power is simply the flow of energy, and the current demanded by a load is largely uncontrollable.

Introduction
The quality of electrical power may be described as a set of values of parameters, such as:
Continuity of service (whether the electrical power is subject to voltage drops or overages below or above a threshold level, thereby causing blackouts or brownouts)
Variation i
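To make "steady voltage" and "smooth waveform" concrete, here is a small numerical sketch: it computes the RMS voltage deviation and the total harmonic distortion (THD) of a sampled signal. The waveform and its distortion are synthetic, invented for the demonstration.

import numpy as np

fs = 10_000                       # sampling rate, Hz
t = np.arange(0, 0.2, 1 / fs)     # 0.2 s analysis window
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)  # 230 V RMS, 50 Hz fundamental
v += 10 * np.sin(2 * np.pi * 250 * t)              # added 5th-harmonic pollution

rms = np.sqrt(np.mean(v ** 2))
spectrum = np.abs(np.fft.rfft(v)) / len(v) * 2     # per-sinusoid amplitudes
freqs = np.fft.rfftfreq(len(v), 1 / fs)
fund = spectrum[np.argmin(np.abs(freqs - 50))]
harmonics = [spectrum[np.argmin(np.abs(freqs - 50 * k))] for k in range(2, 11)]
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fund

print(f"RMS voltage: {rms:.1f} V (deviation {100 * (rms - 230) / 230:+.2f}%)")
print(f"THD: {100 * thd:.2f}%")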
https://en.wikipedia.org/wiki/The%20Art%20of%20Unix%20Programming
The Art of Unix Programming by Eric S. Raymond is a book about the history and culture of Unix programming from its earliest days in 1969 to 2003, when it was published, covering both genetic derivations such as BSD and conceptual ones such as Linux. The author utilizes a comparative approach to explaining Unix, contrasting it with other operating systems, including desktop-oriented ones such as Microsoft Windows and the classic Mac OS as well as ones with research roots such as EROS and Plan 9 from Bell Labs. The book was published by Addison-Wesley on September 17, 2003, and is also available online under a Creative Commons license with additional clauses.

Contributors
The book contains many contributions, quotations and comments from UNIX gurus past and present. These include:
Ken Arnold (author of curses and co-author of Rogue)
Steve Bellovin
Stuart Feldman
Jim Gettys
Stephen C. Johnson
Brian Kernighan
David Korn
Mike Lesk
Doug McIlroy
Marshall Kirk McKusick
Keith Packard
Henry Spencer
Ken Thompson

See also
Unix philosophy
The Hacker Ethic and the Spirit of the Information Age
https://en.wikipedia.org/wiki/Xerox%20Sigma%209
The Xerox Sigma 9, also known as the XDS Sigma 9, was a high-speed, general purpose computer. Xerox first became interested in office automation through computers in 1969 and purchased Scientific Data Systems (SDS), renaming the division Xerox Data Systems (XDS). The division saw limited success and was ultimately sold to Honeywell at a significant loss. The Sigma 9 was announced in 1970 and the first delivery was made in 1971. Three models were built: the Sigma 9, the Sigma 9 Model 2 and the Sigma 9 Model 3. The original was the most powerful and was universally applicable to all data processing applications of the time. The Model 2 was able to process in multi-programmed batch, remote batch, conversational time-sharing, real-time, and transaction processing modes. The Model 3 was designed for the scientific real-time community.

Features of the Basic Systems
All models featured a CPU with at least a floating-point arithmetic unit, a memory map with access protection, memory write protection, two real-time clocks, a power fail-safe, an external interface, and ten internal interrupt levels, as well as a multiplexor input/output processor (MIOP) featuring Channel A with eight sub-channels. The individual specifications are listed below.

Sigma 9 CPU featuring:
Decimal arithmetic unit
Two 16-register general purpose register blocks
Interrupt control chassis with eight external interrupt levels
Memory reconfiguration control unit
Main memory of 64K words
Motor generator set

Model 2 CPU featuring:
Decimal arithmetic unit
Two 16-register general purpose register blocks
Interrupt control chassis with two external interrupt levels
Main memory of 32K words

Model 3 CPU featuring:
One 16-register general purpose register block
Interrupt control chassis with two external interrupt levels
Main memory of 32K words

Interesting facts
The Sigma 9 had a very long run, about 10 years, and around 1980 other companies started building computers that coul
https://en.wikipedia.org/wiki/Eli%20Upfal
Eli Upfal is a computer science researcher, currently the Rush C. Hawkins Professor of Computer Science at Brown University. He completed his undergraduate studies in mathematics and statistics at the Hebrew University, Israel, in 1978, received an M.Sc. in computer science from the Feinberg Graduate School of the Weizmann Institute of Science, Israel, in 1980, and completed his PhD in computer science at the Hebrew University in 1983 under Eli Shamir. He has made contributions in a variety of areas. Most of his work involves randomized and/or online algorithms, stochastic processes, or the probabilistic analysis of deterministic algorithms. Particular applications include routing and communications networks, computational biology, and computational finance. He is responsible for a large body of work, including, as of May 2012, more than 150 publications in journals and conferences, as well as many patents. He has won several prizes, including the IBM Outstanding Innovation Award and the Levinson Prize in Mathematical Sciences. In 2002, Upfal was inducted as a Fellow of the Institute of Electrical and Electronics Engineers, and in 2005 he was inducted as a Fellow of the Association for Computing Machinery. He received, together with Yossi Azar, Andrei Broder, Anna Karlin, and Michael Mitzenmacher, the 2020 ACM Paris Kanellakis Award. Upfal is a coauthor of the book
https://en.wikipedia.org/wiki/Rhodothermaceae
The Rhodothermaceae are a family of bacteria.
https://en.wikipedia.org/wiki/Xorshift
Xorshift random number generators, also called shift-register generators, are a class of pseudorandom number generators that were invented by George Marsaglia. They are a subset of linear-feedback shift registers (LFSRs) which allow a particularly efficient implementation in software without the excessive use of sparse polynomials. They generate the next number in their sequence by repeatedly taking the exclusive or of a number with a bit-shifted version of itself. This makes execution extremely efficient on modern computer architectures, but it does not benefit efficiency in a hardware implementation. Like all LFSRs, the parameters have to be chosen very carefully in order to achieve a long period. For execution in software, xorshift generators are among the fastest PRNGs, requiring very small code and state. However, they do not pass every statistical test without further refinement. This weakness is remedied by combining them with a non-linear function, as described in the original paper. Because plain xorshift generators (without a non-linear step) fail some statistical tests, they have been accused of being unreliable.

Example implementation
A C version of three xorshift algorithms is given here. The first has one 32-bit word of state and period 2^32−1. The second has one 64-bit word of state and period 2^64−1. The last one has four 32-bit words of state and period 2^128−1. The 128-bit algorithm passes the diehard tests. However, it fails the MatrixRank and LinearComp tests of the BigCrush test suite from the TestU01 framework. All use three shifts and three or four exclusive-or operations:

#include <stdint.h>

struct xorshift32_state {
    uint32_t a;
};

/* The state must be initialized to non-zero */
uint32_t xorshift32(struct xorshift32_state *state)
{
    /* Algorithm "xor" from p. 4 of Marsaglia, "Xorshift RNGs" */
    uint32_t x = state->a;
    x ^= x << 13;
    x ^= x >> 17;
    x ^= x << 5;
    return state->a = x;
}

struct xorshift64_state {
    uint64_t a;
};

/* The state must be initialized to non-zero */
uint64_t xorshift64(struct xorshift64_state *state)
{
    /* 64-bit shift triple (13, 7, 17) from Marsaglia, "Xorshift RNGs" */
    uint64_t x = state->a;
    x ^= x << 13;
    x ^= x >> 7;
    x ^= x << 17;
    return state->a = x;
}
https://en.wikipedia.org/wiki/Cubic%20harmonic
In fields like computational chemistry and solid-state and condensed matter physics the so-called atomic orbitals, or spin-orbitals, as they appear in textbooks on quantum physics, are often partially replaced by cubic harmonics for a number of reasons. These harmonics are usually named tesseral harmonics in the field of condensed matter physics, in which the name kubic harmonics rather refers to the irreducible representations in the cubic point group.

Introduction
The hydrogen-like atomic orbitals with principal quantum number n and angular momentum quantum number l are often expressed as

ψ_nlm(r, θ, φ) = R_nl(r) Y_l^m(θ, φ)

in which R_nl(r) is the radial part of the wave function and Y_l^m(θ, φ) is the angular-dependent part. The Y_l^m are the spherical harmonics, which are eigenfunctions of the angular momentum operator. The spherical harmonics are representations of functions of the full rotation group SO(3) with rotational symmetry. In many fields of physics and chemistry these spherical harmonics are replaced by cubic harmonics because the rotational symmetry of the atom and its environment are distorted or because cubic harmonics offer computational benefits.

Symmetry and coordinate system
In many cases, especially in chemistry and solid-state and condensed-matter physics, the system under investigation doesn't have rotational symmetry. Often it has some kind of lower symmetry, with a special point group representation, or it has no spatial symmetry at all. Biological and biochemical systems, like amino acids and enzymes, often belong to low molecular symmetry point groups. The solid crystals of the elements often belong to the space groups and point groups with high symmetry. (Cubic harmonics representations are often listed and referenced in point group tables.) The system has at least a fixed orientation in three-dimensional Euclidean space. Therefore, the coordinate system that is used in such cases is most often a Cartesian coordinate system instead of a spherical coordinate system. In a Cartesian coordinate sys
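As a sketch of the construction for the l = 1 case, the real p-type harmonics can be built from the complex spherical harmonics via the standard unitary combinations p_z ∝ Y_1^0, p_x ∝ (Y_1^{−1} − Y_1^{1})/√2, p_y ∝ i(Y_1^{−1} + Y_1^{1})/√2, here evaluated with scipy.special.sph_harm (whose argument order is m, l, azimuthal θ, polar φ); the sample direction is arbitrary:

import numpy as np
from scipy.special import sph_harm

theta, phi = 0.7, 1.1   # arbitrary azimuthal and polar angles

y_m1 = sph_harm(-1, 1, theta, phi)
y_0 = sph_harm(0, 1, theta, phi)
y_p1 = sph_harm(1, 1, theta, phi)

p_z = y_0.real
p_x = ((y_m1 - y_p1) / np.sqrt(2)).real
p_y = (1j * (y_m1 + y_p1) / np.sqrt(2)).real

# These equal the familiar Cartesian forms sqrt(3/4pi) * (x, y, z)/r:
c = np.sqrt(3 / (4 * np.pi))
x, y, z = (np.sin(phi) * np.cos(theta), np.sin(phi) * np.sin(theta), np.cos(phi))
print(np.allclose([p_x, p_y, p_z], [c * x, c * y, c * z]))  # True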
https://en.wikipedia.org/wiki/List%20of%20antibiotic-resistant%20bacteria
A list of antibiotic resistant bacteria is provided below. These bacteria have shown antibiotic resistance (or antimicrobial resistance).

Enzyme NDM-1 (New Delhi metallo-beta-lactamase-1)
NDM-1 is an enzyme that makes bacteria resistant to a broad range of beta-lactam antibiotics. NDM-1 originated in India. In Indian hospitals, hospital-acquired infections are common, and with new superbugs on the rise in India, these can be dangerous. Mapping of sewage and water supply samples that were NDM-1-positive indicated widespread infection in New Delhi as early as 2011. NDM-1 was first detected in a Klebsiella pneumoniae isolate from a Swedish patient of Indian origin in 2008. It was later detected in bacteria in India, Pakistan, the United Kingdom, the United States, Canada, and Japan.

Gram positive
Clostridium difficile
Clostridium difficile is a nosocomial pathogen that causes diarrheal disease worldwide. Diarrhea caused by C. difficile can be life-threatening. Infections are most frequent in people who have had recent medical and/or antibiotic treatment, and commonly occur during hospitalization. According to a 2015 CDC report, C. difficile caused almost 500,000 infections in the United States over a one-year period, with an estimated 15,000 associated deaths. The CDC estimates that C. difficile infection costs could amount to $3.8 billion over a 5-year span. C. difficile colitis is most strongly associated with fluoroquinolones, cephalosporins, carbapenems, and clindamycin. Some research suggests the overuse of antibiotics in the raising of livestock is contributing to outbreaks of bacterial infections such as C. difficile.[16] Antibiotics, especially those with a broad activity spectrum (such as clindamycin), disrupt normal intestinal flora. This can lead to an overgrowth of C. difficile, which flourishes under these conditions. Pseudomembranous colitis can follow, creatin
https://en.wikipedia.org/wiki/Delta-v
Delta-v (also known as "change in velocity"), symbolized as Δv and pronounced delta-vee, as used in spacecraft flight dynamics, is a measure of the impulse per unit of spacecraft mass that is needed to perform a maneuver such as launching from or landing on a planet or moon, or an in-space orbital maneuver. It is a scalar that has the units of speed. As used in this context, it is not the same as the physical change in velocity of the spacecraft. A simple example is the case of a conventional rocket-propelled spacecraft, which achieves thrust by burning fuel. Such a spacecraft's delta-v is the change in velocity the spacecraft can achieve by burning its entire fuel load. Delta-v is produced by reaction engines, such as rocket engines, and is proportional to the thrust per unit mass and the burn time. It is used to determine the mass of propellant required for a given maneuver through the Tsiolkovsky rocket equation. For multiple maneuvers, delta-v sums linearly. For interplanetary missions, delta-v is often plotted on a porkchop plot, which displays the required mission delta-v as a function of launch date.

Definition
Δv = ∫ from t0 to t1 of (|T(t)| / m(t)) dt
where
T(t) is the instantaneous thrust at time t,
m(t) is the instantaneous mass at time t.

Specific cases
In the absence of external forces:
Δv = ∫ from t0 to t1 of |a(t)| dt
where a is the coordinate acceleration. When thrust is applied in a constant direction (v̇/|v̇| constant) this simplifies to:
Δv = |v1 − v0|
which is simply the magnitude of the change in velocity. However, this relation does not hold in the general case: if, for instance, a constant, unidirectional acceleration is reversed after (t1 − t0)/2, then the velocity difference is 0, but delta-v is the same as for the non-reversed thrust. For rockets, "absence of external forces" is taken to mean the absence of gravity and atmospheric drag, as well as the absence of aerostatic back pressure on the nozzle, and hence the vacuum specific impulse Isp is used for calculating the vehicle's delta-v capacity via the rocket equation. In addition, the cos
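The Tsiolkovsky rocket equation mentioned above, Δv = v_e ln(m0/mf) with effective exhaust velocity v_e = Isp·g0, makes the propellant calculation concrete. A small sketch with purely illustrative stage numbers:

import math

def delta_v(isp_s, m0, mf, g0=9.80665):
    # Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf),
    # with m0 the initial (wet) mass and mf the final (dry) mass.
    return isp_s * g0 * math.log(m0 / mf)

# Illustrative stage: 450 s vacuum Isp, 120 t wet mass, 30 t dry mass.
dv = delta_v(isp_s=450, m0=120_000, mf=30_000)
print(f"delta-v = {dv:.0f} m/s")   # ~6119 m/s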
https://en.wikipedia.org/wiki/Benzaldehyde
Benzaldehyde (C6H5CHO) is an organic compound consisting of a benzene ring with a formyl substituent. It is the simplest aromatic aldehyde and one of the most industrially useful. It is a colorless liquid with a characteristic almond-like odor, and is commonly used in cherry soda. A component of bitter almond oil, benzaldehyde can be extracted from a number of other natural sources. Synthetic benzaldehyde is the flavoring agent in imitation almond extract, which is used to flavor cakes and other baked goods.

History
Benzaldehyde was first extracted in 1803 by the French pharmacist Martrès. His experiments focused on elucidating the nature of amygdalin, the poisonous compound found in bitter almonds, the fruit of Prunus dulcis. Further work on the oil by Pierre Robiquet and Antoine Boutron Charlard, two French chemists, produced benzaldehyde. In 1832, Friedrich Wöhler and Justus von Liebig first synthesized benzaldehyde.

Production
As of 1999, 7,000 tonnes of synthetic and 100 tonnes of natural benzaldehyde were produced annually. Liquid-phase chlorination and oxidation of toluene are the main routes. Numerous other methods have been developed, such as the partial oxidation of benzyl alcohol, alkali hydrolysis of benzal chloride, and the carbonylation of benzene (the Gattermann–Koch reaction). A significant quantity of natural benzaldehyde is produced from cinnamaldehyde obtained from cassia oil by the retro-aldol reaction: the cinnamaldehyde is heated in an aqueous/alcoholic solution between 90 °C and 150 °C with a base (most commonly sodium carbonate or bicarbonate) for 5 to 80 hours, followed by distillation of the formed benzaldehyde. This reaction also yields acetaldehyde. The natural status of benzaldehyde obtained in this way is controversial.

Occurrence
Benzaldehyde and similar chemicals occur naturally in many foods. Most of the benzaldehyde that people eat comes from natural plant foods, such as almonds. Almonds, apricots, apples, and cherry seed cont
https://en.wikipedia.org/wiki/List%20of%20algebras
This is a list of possibly nonassociative algebras. An algebra is a module in which one can also multiply two module elements. (The multiplication in the module is compatible with multiplication by scalars from the base ring.)

*-algebra
Akivis algebra
Algebra for a monad
Albert algebra
Alternative algebra
Azumaya algebra
Banach algebra
Birman–Wenzl algebra
Boolean algebra
Borcherds algebra
Brauer algebra
C*-algebra
Central simple algebra
Clifford algebra
Cluster algebra
Dendriform algebra
Differential graded algebra
Differential graded Lie algebra
Exterior algebra
F-algebra
Filtered algebra
Flexible algebra
Freudenthal algebra
Genetic algebra
Geometric algebra
Gerstenhaber algebra
Graded algebra
Griess algebra
Group algebra
Group algebra of a locally compact group
Hall algebra
Hecke algebra of a locally compact group
Heyting algebra
Hopf algebra
Hurwitz algebra
Hypercomplex algebra
Incidence algebra
Iwahori–Hecke algebra
Jordan algebra
Kac–Moody algebra
Kleene algebra
Leibniz algebra
Lie algebra
Lie superalgebra
Malcev algebra
Matrix algebra
Non-associative algebra
Octonion algebra
Pre-Lie algebra
Poisson algebra
Process algebra
Quadratic algebra
Quaternion algebra
Rees algebra
Relation algebra
Relational algebra
Schur algebra
Semisimple algebra
Separable algebra
Shuffle algebra
Sigma-algebra
Simple algebra
Structurable algebra
Supercommutative algebra
Symmetric algebra
Tensor algebra
Universal enveloping algebra
Vertex operator algebra
von Neumann algebra
Weyl algebra
Zinbiel algebra

This is a list of fields of algebra.
Linear algebra
Homological algebra
Universal algebra
https://en.wikipedia.org/wiki/Evelyn%20Silvia
Evelyn Marie Silvia (February 8, 1948 – January 21, 2006) was an American mathematician specializing in functional analysis and particularly in starlike functions. She was a professor at the University of California, Davis, and as well as teaching mathematics at the undergraduate and graduate levels there, was active in the improvement of secondary-school mathematics education. Education Silvia was born in Fall River, Massachusetts. A seventh-grade mathematics teacher told her she was the best student he had ever seen, a moment that built her confidence in the subject and led her to become a mathematician. She graduated from Southeastern Massachusetts University in 1969, with a bachelor's degree in mathematics. Continuing her graduate education at Clark University, she earned a master's degree in 1973 and completed her Ph.D. in 1973. Her doctoral dissertation, Classes Related to Alpha-Starlike Functions, was supervised by Herb Silverman. Career Silvia joined the UC Davis faculty in 1973, and remained there until retiring and becoming a professor emeritus shortly before her death. Beyond her work as a mathematics professor, Silvia worked as a volunteer teacher of mathematics in local elementary and secondary schools. In the 1970s, she directed a master's-level teaching program at UC Davis, and in the 1990s, she headed the Northern California Math Project, an effort to improve mathematics education throughout Northern California. Recognition The UC Davis Academic Senate gave Silvia their Distinguished Teaching Award in 1990. In 2001, she was one of the winners of the Deborah and Franklin Haimo Award for Distinguished College or University Teaching of Mathematics, given by the Mathematical Association of America to recognize college or university teachers "who have been widely recognized as extraordinarily successful and whose teaching effectiveness has been shown to have had influence beyond their own institutions".
https://en.wikipedia.org/wiki/Hypouricemia
Hypouricemia or hypouricaemia is a level of uric acid in blood serum that is below normal. In humans, the normal range of this blood component has a lower threshold set variously in the range of 2 mg/dL to 4 mg/dL, while the upper threshold is approximately 360 μmol/L (6 mg/dL) for women and 420 μmol/L (7 mg/dL) for men. Hypouricemia usually is benign and sometimes is a sign of a medical condition. Presentation Complications Although normally benign, idiopathic renal hypouricemia may increase the risk of exercise-induced acute kidney failure. There is also evidence that hypouricemia can worsen conditions such as rheumatoid arthritis, especially when combined with low Vitamin C uptake, due to free radical damage. Causes Hypouricemia is benign and not a medical condition, but it is a useful medical sign. It is usually due to drugs and toxic agents, sometimes to diet or genetics, and, rarely, suggests an underlying medical condition. Medication The majority of drugs that contribute to hypouricemia are uricosuric drugs that increase the excretion of uric acid from the blood into the urine. Others include drugs that reduce the production of uric acid: xanthine oxidase inhibitors, urate oxidase (rasburicase), and sevelamer. Diet Hypouricemia is common in vegetarians and vegans due to the low purine content of most vegetarian diets. A vegetarian diet has been found to result in mean serum uric acid values as low as 239 μmol/L (4.0 mg/dL). While a vegetarian diet is typically seen as beneficial with respect to conditions such as gout, it may be associated with some other health conditions. Transient hypouricemia sometimes is produced by total parenteral nutrition. Paradoxically, total parenteral nutrition may produce hypouricemia followed shortly by acute gout, a condition normally associated with hyperuricemia. The reasons for this are unclear. Genetics Two kinds of genetic mutations are known to cause hypouricemia: mutations causing xanthine oxidase deficiency, which r
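The unit conversions above can be sanity-checked with a few lines of Python (a minimal sketch; the only assumption is the molar mass of uric acid, about 168.11 g/mol):

    MOLAR_MASS = 168.11  # g/mol, uric acid (C5H4N4O3)

    def mg_dl_to_umol_l(mg_dl):
        # mg/dL -> g/L is /100; dividing by g/mol gives mol/L; *1e6 gives umol/L
        return mg_dl / 100 / MOLAR_MASS * 1e6

    def umol_l_to_mg_dl(umol_l):
        return umol_l * 1e-6 * MOLAR_MASS * 100

    print(round(mg_dl_to_umol_l(6.0)))     # ~357 umol/L, i.e. roughly 360
    print(round(umol_l_to_mg_dl(239), 1))  # ~4.0 mg/dL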
https://en.wikipedia.org/wiki/Nautilia%20profundicola
Nautilia profundicola is a Gram-negative chemolithoautotrophic bacterium found around hydrothermal vents in the deep ocean. It was first discovered in 1999 on the East Pacific Rise at depth of , on the surface of the polychaete worm Alvinella pompejana. Nautilia profundicola lives symbiotically on the dorsal hairs of A. pompejana but it may also form biofilms and live independently on the walls of hydrothermal vents. The ability of N. profundicola to survive in an anaerobic environment rich in sulfur, H2 and CO2 of varying temperature makes it a useful organism to study, as these are the conditions that are theorized to have prevailed around the time of the earliest life on earth. Morphology Nautilia profundicola is a motile, rod-shaped bacterium, around 0.4 μm long and 0.3 μm wide. Like most Campylobacterota (syn. Epsilonproteobacteria), it has an unsheathed polar flagellum. Physiology Nautilia profundicola lives among the hydrothermal vents and can grow at temperatures of . It uses anaerobic respiration and is a chemolithoautotroph. Nautilia profundicola uses hydrogen or formate as an electron donor and sulfur as an electron acceptor to produce hydrogen sulfide. Nautilia profundicola contains the protein reverse gyrase, which has been found amongst thermophilic bacteria, and which helps it survive the large temperature variation associated with its environment. Reverse gyrase is theorized to keep the genome stable and prevent damage by the extreme heat. Along with the ability to fix carbon and sulfur, analysis of the genome of Nautilia profundicola points to a novel pathway of nitrogen fixation. Taxonomy Analysis of the 16S rRNA gene sequence of Nautilia profundicola allowed it to be placed in the family Nautiliaceae of the order Nautiliales in the phylum Campylobacterota. Analysis of this gene demonstrated that this organism shared 97.8 percent of its DNA for this gene with the related bacterium Nautilia lithotrophica. Using DNA–DNA hybridization, the total
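The respiration described above, with hydrogen as the electron donor and elemental sulfur as the acceptor, corresponds to the textbook net reaction (a standard summary of sulfur respiration, not a stoichiometry quoted from the source):
\[ \mathrm{H_2 + S^0 \longrightarrow H_2S} \]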
https://en.wikipedia.org/wiki/Jet%20Set%20Willy%20II
Jet Set Willy II: The Final Frontier is a platform game released in 1985 by Software Projects as the Amstrad CPC port of Jet Set Willy. It was then rebranded as the sequel and ported to other home computers. Jet Set Willy II was developed by Derrick P. Rowson and Steve Wetherill rather than Jet Set Willy programmer Matthew Smith and is an expansion of the original game, rather than an entirely new one. Gameplay The map is primarily an expanded version of the original mansion from Jet Set Willy, with only a few new elements over its predecessor, several of which are based on rumoured events in JSW that were in fact never programmed (such as being able to launch the titular ship in the screen called "The Yacht" and explore an island). In the ZX Spectrum, Amstrad CPC and MSX versions, Willy is blasted from the Rocket Room into space, and for these 33 rooms he dons a spacesuit. Due to the proliferation of hacking and cheating in the original game, Jet Set Willy II pays homage to this and includes a screen called Cheat that can only be accessed by cheating. Control of Willy also differs from the original: The player can jump in the opposite direction immediately upon landing, without releasing the jump button. Willy now takes a step forward before jumping from a standstill. Some previous "safe spots" in Jet Set Willy are now hazardous to the player in Jet Set Willy II - the tall candle in "The Chapel", for example. The ending of the game is also different. Development Jet Set Willy II was originally created as the Amstrad conversion of Jet Set Willy by Derrick P. Rowson and Steve Wetherill, but Rowson's use of an algorithm to compress much of the screen data meant there was enough memory available to create new rooms. It came with a form of enhanced copy protection called Padlock II. To discourage felt tip copying, it had seven pages, rather than the single page used in Jet Set Willy. Software Projects later had Rowson remove all of the enhancements from the Amstrad ve
https://en.wikipedia.org/wiki/Perfect%20power
In mathematics, a perfect power is a natural number that is a product of equal natural factors, or, in other words, an integer that can be expressed as a square or a higher integer power of another integer greater than one. More formally, n is a perfect power if there exist natural numbers m > 1 and k > 1 such that m^k = n. In this case, n may be called a perfect kth power. If k = 2 or k = 3, then n is called a perfect square or perfect cube, respectively. Sometimes 0 and 1 are also considered perfect powers (0^k = 0 for any k > 0, 1^k = 1 for any k). Examples and sums A sequence of perfect powers can be generated by iterating through the possible values for m and k. The first few ascending perfect powers in numerical order (showing duplicate powers) are: 4, 8, 9, 16, 16, 25, 27, 32, 36, 49, 64, 64, 64, 81, 81, 100, ... The sum of the reciprocals of the perfect powers (including duplicates such as 3^4 and 9^2, both of which equal 81) is 1:
\[ \sum_{m=2}^{\infty} \sum_{k=2}^{\infty} \frac{1}{m^k} = 1, \]
which can be proved as follows:
\[ \sum_{m=2}^{\infty} \sum_{k=2}^{\infty} \frac{1}{m^k} = \sum_{m=2}^{\infty} \frac{1}{m^2} \cdot \frac{1}{1 - 1/m} = \sum_{m=2}^{\infty} \frac{1}{m(m-1)} = \sum_{m=2}^{\infty} \left( \frac{1}{m-1} - \frac{1}{m} \right) = 1. \]
The first perfect powers without duplicates are: (sometimes 0 and 1), 4, 8, 9, 16, 25, 27, 32, 36, 49, 64, 81, 100, 121, 125, 128, 144, 169, 196, 216, 225, 243, 256, 289, 324, 343, 361, 400, 441, 484, 512, 529, 576, 625, 676, 729, 784, 841, 900, 961, 1000, 1024, ... The sum of the reciprocals of the perfect powers p without duplicates is:
\[ \sum_{p} \frac{1}{p} = \sum_{k=2}^{\infty} \mu(k) \left( 1 - \zeta(k) \right) \approx 0.8744, \]
where μ(k) is the Möbius function and ζ(k) is the Riemann zeta function. According to Euler, Goldbach showed (in a now-lost letter) that the sum of 1/(p − 1) over the set of perfect powers p, excluding 1 and excluding duplicates, is 1:
\[ \sum_{p} \frac{1}{p-1} = \frac{1}{3} + \frac{1}{7} + \frac{1}{8} + \frac{1}{15} + \frac{1}{24} + \frac{1}{26} + \cdots = 1. \]
This is sometimes known as the Goldbach–Euler theorem. Detecting perfect powers Detecting whether or not a given natural number n is a perfect power may be accomplished in many different ways, with varying levels of complexity. One of the simplest such methods is to consider all possible values for k across each of the divisors of n, up to k ≤ log_2 n. So if the divisors of n are n_1, n_2, ..., n_j, then one of the values n_1^2, n_2^2, ..., n_j^2, n_1^3, n_2^3, ... must be equal to n if n is indeed a perfect power. This method can immediately be simplified by instead
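The idea can be sketched in a few lines of Python (a minimal sketch using integer k-th roots rather than the divisor enumeration just described; the float-based root is only a starting guess, so neighboring candidates are checked exactly):

    def is_perfect_power(n):
        """Return (m, k) with m**k == n and m, k > 1, or None otherwise."""
        if n < 4:
            return None
        # Since m >= 2, the exponent k can be at most log2(n).
        for k in range(2, n.bit_length() + 1):
            m = round(n ** (1 / k))
            # Guard against floating-point rounding by testing neighbors exactly.
            for candidate in (m - 1, m, m + 1):
                if candidate > 1 and candidate ** k == n:
                    return candidate, k
        return None

    print(is_perfect_power(81))   # (9, 2)
    print(is_perfect_power(97))   # None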
https://en.wikipedia.org/wiki/Predictive%20state%20representation
In computer science, a predictive state representation (PSR) is a way to model the state of a controlled dynamical system from a history of actions taken and resulting observations. A PSR captures the state of a system as a vector of predictions for future tests (experiments) that can be done on the system. A test is a sequence of action-observation pairs, and its prediction is the probability of the test's observation-sequence happening if the test's action-sequence were to be executed on the system. One of the advantages of using PSRs is that the predictions are directly related to observable quantities. This is in contrast to other models of dynamical systems, such as partially observable Markov decision processes (POMDPs), where the state of the system is represented as a probability distribution over unobserved nominal states.
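As a concrete illustration, here is a minimal sketch of a single state update in a linear PSR (following the standard linear-PSR update rule; the state vector b and the parameters M_ao and m_ao below are toy values invented for illustration, not learned from any real system):

    import numpy as np

    # State: predicted success probabilities for two core tests.
    b = np.array([0.6, 0.3])

    # Hypothetical learned parameters for one action-observation pair (a, o).
    M_ao = np.array([[0.3, 0.1],
                     [0.2, 0.4]])
    m_ao = np.array([0.5, 0.2])

    # Predicted probability of observing o after executing a in this state.
    p_o = b @ m_ao                 # 0.36

    # Bayes-rule update of the core-test predictions after seeing (a, o).
    b_next = (b @ M_ao) / p_o      # [0.667, 0.5]
    print(p_o, b_next)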
https://en.wikipedia.org/wiki/Laser%20voltage%20prober
The laser voltage probe (LVP) is a laser-based voltage and timing waveform acquisition system which is used to perform failure analysis on flip-chip integrated circuits. The device to be analyzed is de-encapsulated in order to expose the silicon surface. The silicon substrate is thinned mechanically using a back side mechanical thinning tool. The thinned device is then mounted on a movable stage and connected to an electrical stimulus source. Signal measurements are performed through the back side of the device after substrate thinning has been performed. The device being probed must be electrically stimulated using a repeating test pattern, with a trigger pulse provided to the LVP as reference. The operation of the LVP is similar to that of a sampling oscilloscope. Theory of operation The LVP instrument measures voltage waveform signals in the device diffusion regions. Device imaging is accomplished through the use of a laser scanning microscope (LSM). The LVP uses dual infrared (IR) lasers to perform both device imaging and waveform acquisition. One laser is used to acquire images or waveforms from the device, while the second laser provides a reference which may be used to subtract unwanted noise from the signal data being acquired. On an electrically active device, the instrument monitors the changes in the phase of the electromagnetic field surrounding a signal being applied to a junction. The instrument obtains voltage waveform and timing information by monitoring the interaction of laser light with the changes in the electric field across a p-n junction. As the laser reaches the silicon surface, a certain amount of that light is reflected back. The amount of reflected laser light from the junction is sampled at various points in time. The changing electromagnetic field at the junction affects the amount of laser light that is reflected back. By plotting the variations in reflected laser light versus time, it is possible to construct a timing waveform of the
https://en.wikipedia.org/wiki/N-terminal%20telopeptide
The N-terminal telopeptide (NTX), also known as amino-terminal collagen crosslinks, is the N-terminal telopeptide of fibrillar collagens such as collagen type I and type II. It is used as a biomarker to measure the rate of bone turnover. NTX can be measured in the urine (uNTX) or serum (serum NTX). The peptide consists of eight amino acids with the sequence YDEKSTGG. Usefulness of NTX as a biomarker Directly evaluating an individual's rate of bone turnover, termed bone remodeling, may be important in assessing his or her potential nonsurgical treatment response as well as evaluating his or her risk of developing complications during healing following surgical intervention. To determine an individual's rate of bone turnover, numerous biomarkers are available in the body fluids that can be correlated to this rate, and one such biomarker is NTX. However, while NTX does fluctuate in a very sensitive manner in line with bone resorption patterns, it is not very specific, in that levels may vary spontaneously without physiologic intervention. For example, NTX levels may drop by 50% from day to day with no treatment, thus making NTX levels unconvincing evidence of treatment effect. Conversely, the serum CTX biomarker, described in 2000 by Rosen, appears to be a much more effective and valuable indicator of bone resorption rate. See also
C-terminal telopeptide
https://en.wikipedia.org/wiki/Xbox%20system%20software
The Xbox system software is the operating system developed exclusively for Microsoft's Xbox home video game consoles. Across the four generations of Xbox consoles, the software has been based on a version of Microsoft Windows and incorporates DirectX features optimized for the home consoles. The user interface, the Xbox Dashboard, provides access to games, media players, and applications, and integrates with the Xbox network for online functionality. Though initial iterations of the software for the original Xbox and Xbox 360 were based on heavily modified versions of Windows, the newer consoles feature operating systems that are highly compatible with Microsoft's desktop operating systems, allowing for shared applications and ease-of-development between personal computers and the Xbox line. Common features Across all four generations of the Xbox platform, the user interface of the system software has been called the Xbox Dashboard. While its appearance and detailed functions have varied between console generations, the Dashboard has provided the user the means to start a game from the optical media loaded into the console or off the console's storage, launch audio and video players to play optical media discs, or start special applications for the Xbox such as streaming media services from third parties. The Dashboard also provides a menu of settings and configuration pages for the console that the user can adjust. The Dashboard has supported integration with the Xbox Live service since November 2002. Xbox Live provides online functionality to the Xbox, including friends list, game achievement tracking, matchmaking support for online games, in-game communications, and a digital game storefront. While some portions of the Xbox Live service are free, a subscription-tier Xbox Live Gold is generally required to play most multiplayer games on the console. Starting with the Xbox 360 and continuing through its current consoles, Microsoft has offered a means for users
https://en.wikipedia.org/wiki/Kurt%20Mehlhorn
Kurt Mehlhorn (born 29 August 1949) is a German theoretical computer scientist. He has been a vice president of the Max Planck Society and is director of the Max Planck Institute for Computer Science. Education and career Mehlhorn graduated in 1971 from the Technical University of Munich, where he studied computer science and mathematics, and earned his Ph.D. in 1974 from Cornell University under the supervision of Robert Constable. Since 1975 he has been on the faculty of Saarland University in Saarbrücken, Germany, where he was chair of the computer science department from 1976 to 1978 and again from 1987 to 1989. Since 1990 he has been the director of the Max Planck Institute for Computer Science, also in Saarbrücken. He has been on the editorial boards of ten journals, a trustee of the International Computer Science Institute in Berkeley, California, and a member of the board of governors of Jacobs University Bremen. He also served on the Engineering and Computer Science jury for the Infosys Prize from 2009 to 2011. Awards and honors He won the Gottfried Wilhelm Leibniz Prize in 1986, the Gay-Lussac-Humboldt-Prize in 1989, the Karl Heinz Beckurts Award in 1994, the Konrad Zuse Medal in 1995, the EATCS Award in 2010, and the Paris Kanellakis Award in 2010. He was named a member of the Academia Europaea in 1995, a Fellow of the Association for Computing Machinery in 1999, a member of the Berlin-Brandenburg Academy of Sciences in 2001, a member of the German Academy of Sciences Leopoldina in 2004, a foreign member of the National Academy of Engineering in 2014, and a foreign member of the National Academy of Sciences in 2014. He has received honorary doctorates from the Otto von Guericke University of Magdeburg in 2002 and the University of Waterloo in 2006. He is the 2014 winner of the Erasmus Medal of the Academia Europaea. Research Mehlhorn is the author of several books and over 250 scientific publications, which include fundamental contributions to data struct
https://en.wikipedia.org/wiki/Joost-Pieter%20Katoen
Joost-Pieter Katoen (born October 6, 1964) is a Dutch theoretical computer scientist based in Germany. He is a distinguished professor in Computer Science and head of the Software Modeling and Verification Group at RWTH Aachen University. Furthermore, he is associated part-time with the Formal Methods & Tools group at the University of Twente. Education Katoen received his master's degree with distinction in Computer Science from the University of Twente in 1987. In 1990, he was awarded a Professional Doctorate in Engineering from the Eindhoven University of Technology, and in 1996, he received his Ph.D. in computer science from the University of Twente. Research Katoen's main research interests are formal methods, computer aided verification, in particular model checking, concurrency theory, and semantics, in particular the semantics of probabilistic programming languages. His research is largely tool and application oriented. Together with Christel Baier he wrote and published the book Principles of Model Checking. Career From 1997 to 1999, Katoen was a postdoctoral researcher at the University of Erlangen-Nuremberg. In 1999, he became an associate professor at the University of Twente, where he still holds a part-time position. In 2004, he was appointed a full professor at RWTH Aachen University. In 2013, Katoen became Theodore von Kármán Fellow and Distinguished Professor at RWTH Aachen University. Also in 2013, he was elected a member of the Academia Europaea. In 2017, he received an honorary doctorate from Aalborg University. In 2018, Katoen was awarded the highly remunerated ERC Advanced Grant. In 2020, Katoen became an ACM Fellow and in 2021, he was elected as a member of the Royal Holland Society of Sciences and Humanities (KHMW). In 2022, he was elected as a member of the North Rhine-Westphalian Academy of Sciences, Humanities and the Arts. Katoen is a founding member of the IFIP Working Group (WG) 1.8 on Concurrency Theory and a member of the WG 2.2 Formal Descr
https://en.wikipedia.org/wiki/Process%20design%20kit
A process design kit (PDK) is a set of files used within the semiconductor industry to model a fabrication process for the design tools used to design an integrated circuit. The PDK is created by the foundry to define a certain technology variation for their processes. It is then passed to their customers to use in the design process. The customers may enhance the PDK, tailoring it to their specific design styles and markets. The designers use the PDK to design, simulate, draw and verify the design before handing the design back to the foundry to produce chips. The data in the PDK is specific to the foundry's process variation and is chosen early in the design process, influenced by the market requirements for the chip. An accurate PDK will increase the chances of first-pass successful silicon. Description Different tools in the design flow have different input formats for the PDK data. The PDK engineers have to decide which tools they will support in the design flows and create the libraries and rule sets which support those flows. A typical PDK contains:
A primitive device library
  Symbols
  Device parameters
  PCells
Verification checks
  Design Rule Checking
  Layout Versus Schematic
  Antenna and Electrical rule check
  Physical Extraction
Technology data
  Layers, layer names, layer/purpose pairs
  Colors, fills and display attributes
  Process constraints
  Electrical rules
Rule files
  LEF
  Tool dependent rule formats
Simulation models of primitive devices (SPICE or SPICE derivatives)
  Transistors (typically SPICE)
  Capacitors
  Resistors
  Inductors
Design Rule Manual
  A user-friendly representation of the process requirements
A PDK may also include standard cell libraries from the foundry, a library vendor, or developed internally:
  LEF format of abstracted layout data
  Symbols
  Library (.lib) files
  GDSII layout data
https://en.wikipedia.org/wiki/Maintenance%20engineering
Maintenance Engineering is the discipline and profession of applying engineering concepts to the optimization of equipment, procedures, and departmental budgets to achieve better maintainability, reliability, and availability of equipment. Maintenance, and hence maintenance engineering, is increasing in importance due to rising amounts of equipment, systems, machinery and infrastructure. Since the Industrial Revolution, devices, equipment, machinery and structures have grown increasingly complex, requiring a host of personnel, vocations and related systems to maintain them. Prior to 2006, the United States spent approximately US$300 billion annually on plant maintenance and operations alone. The aim of maintenance is to ensure a unit is fit for purpose, with maximum availability at minimum cost. A person practicing maintenance engineering is known as a maintenance engineer. Maintenance engineer's description A maintenance engineer should possess significant knowledge of statistics, probability, and logistics, and of the fundamentals of the operation of the equipment and machinery he or she is responsible for. A maintenance engineer should also possess high interpersonal, communication, and management skills, as well as the ability to make decisions quickly. Typical responsibilities include:
Assuring optimization of the maintenance organization structure
Analysis of repetitive equipment failures
Estimation of maintenance costs and evaluation of alternatives
Forecasting of spare parts
Assessing the needs for equipment replacements and establishing replacement programs when due
Application of scheduling and project management principles to replacement programs
Assessing required maintenance tools and skills for efficient maintenance of equipment
Assessing required skills for maintenance personnel
Reviewing personnel transfers to and from maintenance organizations
Assessing and reporting safety hazards associated with maintenance of equipment
Mainte
https://en.wikipedia.org/wiki/Plasma%20diffusion
Plasma diffusion across a magnetic field is an important topic in magnetic confinement of fusion plasma. It especially concerns how plasma transport is related to the strength of an external magnetic field, B. Classical diffusion predicts 1/B^2 scaling, while Bohm diffusion, borne out of experimental observations from the early confinement machines, was conjectured to follow 1/B scaling. Hsu diffusion predicts 1/B^{3/2} scaling, which is presumably the best confinement scenario in magnetized plasma. The three scalings are written out after this entry. See also
Bohm diffusion
Classical diffusion
Hsu diffusion
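Written out, the three scalings are as follows (the Bohm coefficient is the conventional semi-empirical form; the numerical prefactor should be treated as indicative):
\[
D_{\text{classical}} \propto \frac{1}{B^{2}}, \qquad
D_{\text{Bohm}} \simeq \frac{1}{16} \frac{k_B T}{e B} \propto \frac{1}{B}, \qquad
D_{\text{Hsu}} \propto \frac{1}{B^{3/2}}.
\]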
https://en.wikipedia.org/wiki/No-win%20situation
A no-win situation, also called a lose-lose situation, is one where a person has choices, but no choice leads to a net gain. For example, if a company finds a dangerous fault in one of its products, they can either issue a product recall and take some reputational damage, or allow the product to harm people: the company is in a no-win situation. In game theory In game theory, a "no-win" situation is a circumstance in which no player benefits from any outcome, hence ultimately losing the match. This may be because of any or all of the following:
Unavoidable or unforeseeable circumstances causing the situation to change after decisions have been made. This is common in text adventures.
Zugzwang, as in chess, when any move a player chooses makes them worse off than before, such as losing a piece or being checkmated.
A situation in which the player has to accomplish two mutually dependent tasks, each of which must be completed before the other, or that are mutually exclusive (a Catch-22).
Ignorance of other players' actions, meaning the best decision for all differs from that for any one player (as in the prisoner's dilemma).
In history Carl von Clausewitz's advice never to launch a war that one has not already won characterizes war as a no-win situation. A similar example is the Pyrrhic victory in which a military victory is so costly that the winning side actually ends up worse off than before it started. Looking at the victory as a part of a larger situation, the situation could either be no-win, or more of a win for the other side than the one that won the "victory", or victory at such cost that the gains are outweighed by the cost and are no longer a source of joy. For example, the "victorious" side may have accomplished their objective, which may have been worthless; it may also lose a strategic advantage in manpower or positioning. For example, the British Empire was one of the victorious powers of the Second World War but was so weakened that it could no lon
https://en.wikipedia.org/wiki/Blown%20Away%20%28Carrie%20Underwood%20song%29
"Blown Away" is a song by American recording artist Carrie Underwood, taken from her fourth studio album of the same name (2012). The song served as the album's second single on July 9, 2012, through Arista Nashville. Written by Chris Tompkins and Josh Kear, who previously wrote Underwood's single "Before He Cheats" (2007), "Blown Away" is a country pop song with lyrics addressing the story of a daughter locking herself in a storm cellar while her alcoholic father is passed out on the couch in the path of a tornado. Producer Mark Bright drew inspiration from 1980s music. Upon its release, "Blown Away" was met with positive reviews from music critics, who considered it to be the musical highlight of the album. The song's content and production received particular praise, as critics felt it confirmed the album's darker mood which Underwood had mentioned prior to its release. Commercially, "Blown Away" was successful. In the United States, it became her 13th number one hit on the Billboard Country Airplay chart, and also reached number 20 on the Hot 100. It also charted in Canada and the United Kingdom. It won several awards, including two Grammy Awards, for Best Country Song and Best Country Solo Performance. The song has since been certified 4× platinum by the RIAA. The accompanying music video was directed by Randee St. Nicholas. Underwood said that when she heard first the song, she already had ideas of a possible video for it. She wanted it to be a dark Wizard of Oz in 2012. It earned her an award for Video of the Year at the 2013 CMT Music Awards. The video was nominated for Music Video of the Year at the 2013 Country Music Association Awards. Underwood has performed "Blown Away" in a number of live appearances, including at the 2012 Billboard Music Awards, the 55th Annual Grammy Awards, and two years in a row at the Country Music Association Awards. It was also performed as the encore of the Blown Away Tour (2012–13). Writing and composition After Underwood's
https://en.wikipedia.org/wiki/Memory-bound%20function
Memory bound refers to a situation in which the time to complete a given computational problem is decided primarily by the amount of free memory required to hold the working data. This is in contrast to algorithms that are compute-bound, where the number of elementary computation steps is the deciding factor. Memory and computation boundaries can sometimes be traded against each other, e.g. by saving and reusing preliminary results or using lookup tables. Memory-bound functions and memory functions Memory-bound functions and memory functions are related in that both involve extensive memory access, but a distinction exists between the two. Memory functions use a dynamic programming technique called memoization in order to relieve the inefficiency of recursion that might occur. It is based on the simple idea of calculating and storing solutions to subproblems so that the solutions can be reused later without recalculating the subproblems again. The best known example that takes advantage of memoization is an algorithm that computes the Fibonacci numbers. The following pseudocode uses recursion and memoization, and runs in linear CPU time (note that the results array must hold n + 1 entries, indices 0 through n):

Fibonacci (n) {
    for i = 0 to n
        results[i] = -1        // -1 means undefined
    return Fibonacci_Results (results, n)
}

Fibonacci_Results (results, n) {
    if (results[n] != -1)      // If it has been solved before,
        return results[n]      // look it up.
    if (n == 0)
        val = 0
    else if (n == 1)
        val = 1
    else
        val = Fibonacci_Results(results, n-2) + Fibonacci_Results(results, n-1)
    results[n] = val           // Save this result for re-use.
    return val
}

Compare the above to an algorithm that uses only recursion, and runs in exponential CPU time:

Recursive_Fibonacci (n) {
    if (n == 0)
        return 0
    if (n == 1)
        return 1
    return Recursive_Fibonacci (n-1) + Recursive_Fibonacci (n-2)
}

While the recursive-only algorithm is simpler and more elegant
https://en.wikipedia.org/wiki/Conifer%20release
Conifer release is a term used in forest management circles to denote selective silvicide and herbicide use, in order to promote conifers at the expense of alternate species. An Oregon State University Extension Service specialist wrote in 2014 that: History Weyerhaeuser used glyphosate as early as 1979 for its conifer release programme. Scientists noted in 1997 that the below- and near-ground microclimates were affected by conifer release treatments. Scientists at Lakehead University used Vision (a trade-mark of Monsanto) in 1998 to suppress Vaccinium blueberry production and hence to promote conifer release in a jack pine plantation. List of herbicides A partial list of common pesticides employed as early as 1981 for conifer release is found below.
Amitrole
Aminocarb (Matacil)
Atrazine
Dalapon
Dicamba
2,4-D
Fosamine ammonium
Glyphosate (Vision)
Hexazinone
Mexacarbate (Zectran)
MSMA
Picloram
Simazine
Triclopyr
https://en.wikipedia.org/wiki/Skewness%20risk
Skewness risk in financial modeling is the risk that results when observations are not spread symmetrically around an average value, but instead have a skewed distribution. As a result, the mean and the median can be different. Skewness risk can arise in any quantitative model that assumes a symmetric distribution (such as the normal distribution) but is applied to skewed data. Ignoring skewness risk, by assuming that variables are symmetrically distributed when they are not, will cause any model to understate the risk of variables with high skewness. Skewness risk plays an important role in hypothesis testing. The analysis of variance, one of the most common tests used in hypothesis testing, assumes that the data is normally distributed. If the variables tested are not normally distributed because they are too skewed, the test cannot be used. Instead, nonparametric tests can be used, such as the Mann–Whitney test for the unpaired situation or the sign test for the paired situation. Skewness risk and kurtosis risk also have technical implications in the calculation of value at risk. If either is ignored, the value at risk calculations will be flawed. Benoît Mandelbrot, a French mathematician, extensively researched this issue. He argued that the extensive reliance on the normal distribution for much of the body of modern finance and investment theory is a serious flaw of any related models (including the Black–Scholes model and CAPM). He explained his views and an alternative finance theory in a book: The (Mis)Behavior of Markets: A Fractal View of Risk, Ruin and Reward. In options markets, the difference in implied volatility at different strike prices represents the market's view of skew, and is called volatility skew. (In pure Black–Scholes, implied volatility is constant with respect to strike and time to maturity.) Skewness for bonds Bonds have a skewed return. A bond will either pay the full amount on time (very likely to much less likely depending on quality)
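A minimal Python illustration of how skew separates the mean from the median (toy numbers, not data from any model discussed here; the formula is the standard moment-based sample skewness without bias correction):

    import numpy as np

    # Toy right-skewed sample: mostly small values plus one large outlier.
    x = np.array([1.0, 1.1, 0.9, 1.2, 0.8, 5.0])

    mu = x.mean()
    median = np.median(x)
    sigma = x.std()
    skewness = np.mean((x - mu) ** 3) / sigma ** 3

    print(mu, median)  # mean (~1.67) exceeds the median (1.05) under right skew
    print(skewness)    # positive (~1.8): asymmetry a symmetric model would miss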
https://en.wikipedia.org/wiki/Gulf%20of%20Maine%20Closed%20Areas
The Gulf of Maine has been fished since the 1700s and has been a historic fishing area ever since. Climate change is having significant impacts on this ecosystem; between 2004 and 2013, the Gulf of Maine warmed faster than 99.9% of the global oceans, increasing average temperature by . Having seen the depletion of groundfish stocks starting in the early 1990s, managers took care to create five closure areas in the Gulf of Maine. These closed areas do not prohibit all fishing; rather, they prevent the further degradation of benthic habitat and groundfish species. Closed areas are different from Marine Protected Areas (MPAs) because they allow some forms of fishing and other activities to occur that would normally not be allowed in an MPA. Western Gulf of Maine Closure Area Background The Western Gulf of Maine Closure Area (WGoMAC) was established in 1998 in response to the decreasing Atlantic cod (Gadus morhua) stock. The WGoMAC is located off the shores of Massachusetts, New Hampshire, and Maine and is approximately . It is rectangular and runs in straight lines from 42° 15' to 43° 15' North, and 69° 55' to 70° 15' West. It is a year-round closed area that prevents groundfish fishing. Fishing vessels that are using exempted gear (pelagic trawls and nets) are allowed. Charter and recreational vessels are allowed and may fish with a Letter of Authorization (LOA) from a Regional Administrator. Cashes Ledge Closure Area Background Cashes Ledge Closure Area (CL) was permanently closed in 1998, in response to overfished groundfish resources. The area boundaries of the five-sided polygon that is CL run from 42° 42.5' to 43° 0.7' North, and 68° 46' to 69° 26' West. Cashes Ledge is a shallow habitat known for its rocky bottom, located off the northeastern US shore. It is a year-round closed area that prevents groundfish fishing. Fishing vessels that are using exempted gear (pelagic trawls and nets) are allowed. Charter and recreational vessels are allowed and may
https://en.wikipedia.org/wiki/Lennard-Jones%20potential
In computational chemistry, the Lennard-Jones potential (also termed the LJ potential or 12-6 potential; named for John Lennard-Jones) is an intermolecular pair potential. Out of all the intermolecular potentials, the Lennard-Jones potential is probably the one that has been the most extensively studied. It is considered an archetype model for simple yet realistic intermolecular interactions. The Lennard-Jones potential models soft repulsive and attractive (van der Waals) interactions. Hence, the Lennard-Jones potential describes electronically neutral atoms or molecules. The commonly used expression for the Lennard-Jones potential is
\[ V_{\mathrm{LJ}}(r) = 4\varepsilon \left[ \left( \frac{\sigma}{r} \right)^{12} - \left( \frac{\sigma}{r} \right)^{6} \right], \]
where r is the distance between two interacting particles, \varepsilon is the depth of the potential well (usually referred to as 'dispersion energy'), and \sigma is the distance at which the particle-particle potential energy is zero (often referred to as 'size of the particle'). The Lennard-Jones potential has its minimum at a distance of r = 2^{1/6}\sigma, where the potential energy has the value V = -\varepsilon. The Lennard-Jones potential is a simplified model that yet describes the essential features of interactions between simple atoms and molecules: Two interacting particles repel each other at very close distance, attract each other at moderate distance, and do not interact at infinite distance, as shown in Figure 1. The Lennard-Jones potential is a pair potential, i.e. no three- or multi-body interactions are covered by the potential. Statistical mechanics and computer simulations can be used to study the Lennard-Jones potential and to obtain thermophysical properties of the 'Lennard-Jones substance'. The Lennard-Jones substance is often referred to as 'Lennard-Jonesium,' suggesting that it is viewed as a (fictive) chemical element. Moreover, its energy and length parameters can be adjusted to fit many different real substances. Both the Lennard-Jones potential and, accordingly, the Lennard-Jones substance are simplified yet realistic models, such that they accurately capt
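A direct transcription of the formula into Python (a minimal sketch in reduced units, with epsilon = sigma = 1 as illustrative defaults):

    def lennard_jones(r, epsilon=1.0, sigma=1.0):
        # 12-6 Lennard-Jones pair potential.
        sr6 = (sigma / r) ** 6
        return 4.0 * epsilon * (sr6 ** 2 - sr6)

    r_min = 2 ** (1 / 6)            # distance of the potential minimum for sigma = 1
    print(lennard_jones(1.0))       # 0.0: the potential is zero at r = sigma
    print(lennard_jones(r_min))     # -1.0: the well depth -epsilon at the minimum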
https://en.wikipedia.org/wiki/All%20Species%20Foundation
The All Species Foundation (stylized as ALL Species Foundation) was an organization aiming to catalog all species on Earth by 2025 through its All Species Inventory initiative. The project was launched in 2000 by Kevin Kelly, Stewart Brand and Ryan Phelan. Along with other similar efforts, the All Species Foundation was promoted as an important step forward in expanding, modernizing and digitizing the field of taxonomy. The Foundation started with a large grant from the Schlinger Foundation but had difficulty finding continued funding. The project is no longer active and "hands off [its] mission to the Encyclopedia of Life". The All Species Foundation received some critique for its approach to defining and identifying species. An open letter expressed concern over the species problem, a fundamental issue in taxonomy of what exactly defines a species. The letter argued that failing to acknowledge and account for this fundamental issue could undermine the use of the database for conservation and biodiversity preservation. See also
Catalogue of Life
Encyclopedia of Life
Earth BioGenome Project
Open Tree of Life
Tree of Life Web Project
Wikispecies
https://en.wikipedia.org/wiki/Matthew%20Meselson
Matthew Stanley Meselson (born May 24, 1930) is a geneticist and molecular biologist currently at Harvard University, known for his demonstration, with Franklin Stahl, of semi-conservative DNA replication. After completing his Ph.D. under Linus Pauling at the California Institute of Technology, Meselson became a professor at Harvard University in 1960, where he has remained; today he is Thomas Dudley Cabot Professor of the Natural Sciences. In the famous Meselson–Stahl experiment of 1958, he and Frank Stahl demonstrated through nitrogen isotope labeling that DNA is replicated semi-conservatively. In addition, Meselson, François Jacob, and Sydney Brenner discovered the existence of messenger RNA in 1961. Meselson has investigated DNA repair in cells and how cells recognize and destroy foreign DNA, and, with Werner Arber, was responsible for the discovery of restriction enzymes. Since 1963 he has been interested in chemical and biological defense and arms control, and has served as a consultant on this subject to various government agencies. Meselson worked with Henry Kissinger under the Nixon administration to convince President Richard Nixon to renounce biological weapons, suspend chemical weapons production, and support an international treaty prohibiting the acquisition of biological agents for hostile purposes, which in 1972 became known as the Biological Weapons Convention. Meselson has received the Award in Molecular Biology from the National Academy of Sciences, the Public Service Award of the Federation of American Scientists, the Presidential Award of the New York Academy of Sciences, the 1995 Thomas Hunt Morgan Medal of the Genetics Society of America, as well as the Lasker Award for Special Achievement in Medical Science. His laboratory at Harvard currently investigates the biological and evolutionary nature of sexual reproduction, genetic recombination, and aging. Many of his past students are notable biologists, including Nobel Laureate Sidney Altman, as wel
https://en.wikipedia.org/wiki/VMware%20Server
VMware Server (formerly VMware GSX Server) is a discontinued free-of-charge virtualization-software server suite developed and supplied by VMware, Inc. VMware Server has fewer features than VMware ESX, software available for purchase, but can create, edit, and play virtual machines. It uses a client–server model, allowing remote access to virtual machines, at the cost of some graphical performance (and 3D support). It can run virtual machines created by other VMware products and by Microsoft Virtual PC. VMware Server can preserve and revert to a single snapshot copy of each separate virtual machine within the VMware Server environment. The software does not have a specific interface for cloning virtual machines, unlike VMware Workstation. VMware Server has largely been replaced by the "Shared Virtual Machines" feature, introduced in VMware Workstation 8.0 and onwards. Naming The former name GSX Server allegedly stands for Ground Storm X, an early code name for the project. Versions VMware Server 1.0 VMware released version 1.0 of Server on July 12, 2006, replacing the discontinued VMware GSX Server product-line. VMware Inc continued to develop the VMware Server 1.0.x series, issuing a maintenance release (version 1.0.10) on 26 October 2009. VMware Server 2.0 VMware Server 2 runs on several server-class host operating systems, including different versions of Microsoft Windows Server 2000, 2003, and 2008, and mainly enterprise-class Linux distributions. The manual explicitly states: "you must use a Windows server operating system". The product also runs on Windows 7 Enterprise Edition. Server 2 uses a web-based user-interface, the "VMware Infrastructure Web Access", instead of a GUI. For web interfaces, VMware Server 2 and VMware vCenter 4 use the Tomcat 6 web server, while VMware vCenter 2.5 is based on Tomcat 5.5. As part of the product, the VMware Host Agent service (also carried over to VMware Workstation Server until today) allows remote access to VMware Server fun
https://en.wikipedia.org/wiki/Casein%20kinase%201
The Casein kinase 1 family () of protein kinases are serine/threonine-selective enzymes that function as regulators of signal transduction pathways in most eukaryotic cell types. CK1 isoforms are involved in Wnt signaling, circadian rhythms, nucleo-cytoplasmic shuttling of transcription factors, DNA repair, and DNA transcription. Discovery By the early 1950s it was known from metabolic labeling studies using radioactive phosphate that phosphate groups attached to phosphoproteins inside cells can sometimes undergo rapid exchange of new phosphate for old. In order to perform experiments that would allow isolation and characterization of the enzymes involved in attaching and removing phosphate from proteins, there was a need for convenient substrates for protein kinases and protein phosphatases. Casein has been used as a substrate since the earliest days of research on protein phosphorylation. By the late 1960s, cyclic AMP-dependent protein kinase had been purified, and most attention was centered on kinases and phosphatases that could regulate the activity of important enzymes. Casein kinase activity associated with the endoplasmic reticulum of mammary glands was first characterized in 1974, and its activity was shown to not depend on cyclic AMP. CK1 family The CK1 family of monomeric serine–threonine protein kinases is found in eukaryotic organisms from yeast to humans. Mammals have seven family members (sometimes referred to as isoforms, but encoded by distinct genes): alpha, beta 1, gamma 1, gamma 2, gamma 3, delta, and epsilon. Isoforms range from 22 to 55 kDa and have been identified in the membranes, nucleus, and cytoplasm of eukaryotes and additionally in the mitotic spindle in mammalian cells. The family members have the highest homology in their kinase domains (53%–98% identical) and differ from most other protein kinases by the presence of the sequence S-I-N instead of A-P-E in kinase domain VIII. The family members appear to have similar substrate specif
https://en.wikipedia.org/wiki/List%20of%20personal%20finance%20software
Personal finance software can be used to track spending, create budgets, and plan for future expenses. Offerings differ by feature support, software code and development transparency, mobile app features, import methods, monetization model, and privacy and data storage practices. Risks The use of expense tracking, budgeting, and other personal finance software carries some risk, most notably due to the disclosure of a username, password, or other account credentials used to automatically synchronize banking information with an expense tracking application. Another significant area of risk is the sensitive personal information that is stored anytime data is digitized. This risk may be compounded based on the security the software vendor has implemented, as well as the availability of the data and where specifically it is stored (online or in a local application). An often overlooked form of risk is due to the monetization model and privacy practices of the vendor or software provider, whether the application is "free" or fee based. Open source software is one way of potentially minimizing the privacy- and monetization-related risks of data exposure. The following is a list of personal financial management software. The first section is devoted to free and open-source software, and the second is for proprietary software. Free and open-source personal financial management software Proprietary personal financial management vendors and software (columns: Name; Spending Tracking; Budgeting; Investment Tracking; Third-Party Bill Paying; Operating Systems; Mobile Support; Software Type; Direct Cost; Other Monetization Models; Description)
Banktivity – spending tracking, budgeting, investment tracking, and third-party bill paying; macOS; iOS; stand-alone; yearly fee. Personal finance software for Mac OS.
Mint – spending tracking, budgeting, and investment tracking; no third-party bill paying; any operating system; iOS and Android; web-based; free, monetized through financial product referrals.
Moneydance – spending tracking and investment tracking; any operating system (JVM based); stand-alone.
Moneyspire –
https://en.wikipedia.org/wiki/Polarization%20gradient%20cooling
Polarization gradient cooling (PG cooling) is a technique in laser cooling of atoms. It was proposed to explain the experimental observation of cooling below the Doppler limit. Shortly after the theory was introduced, experiments were performed that verified the theoretical predictions. While Doppler cooling allows atoms to be cooled to hundreds of microkelvin, PG cooling allows atoms to be cooled to a few microkelvin or less. The superposition of two counterpropagating beams of light with orthogonal polarizations creates a gradient where the polarization varies in space. The gradient depends on which type of polarization is used. Orthogonal linear polarizations (the lin⊥lin configuration) result in the polarization varying between linear and circular polarization over the range of half a wavelength. However, if orthogonal circular polarizations (the σ+σ− configuration) are used, the result is a linear polarization that rotates along the axis of propagation. Both configurations can be used for cooling and yield similar results; however, the physical mechanisms involved are very different. For the lin⊥lin case, the polarization gradient causes periodic light shifts in the Zeeman sublevels of the atomic ground state that allow a Sisyphus effect to occur. In the σ+σ− configuration, the rotating polarization creates a motion-induced population imbalance in the Zeeman sublevels of the atomic ground state, resulting in an imbalance in the radiation pressure that opposes the motion of the atom. Both configurations achieve sub-Doppler cooling and instead approach the recoil limit. While the limit of PG cooling is lower than that of Doppler cooling, the capture range of PG cooling is lower and thus an atomic gas must be pre-cooled before PG cooling. Observation of Cooling Below the Doppler Limit When laser cooling of atoms was first proposed in 1975, the only cooling mechanism considered was Doppler cooling. As such the limit on the temperature was predicted to be the Dopple
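For orientation, the two temperature scales mentioned above are conventionally written as (standard textbook expressions, not taken from this article; Γ is the natural linewidth of the cooling transition, k the photon wavenumber, m the atomic mass, and factor-of-two conventions for the recoil temperature vary):
\[
T_{\text{Doppler}} = \frac{\hbar \Gamma}{2 k_B}, \qquad
T_{\text{recoil}} = \frac{\hbar^2 k^2}{m k_B}.
\]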
https://en.wikipedia.org/wiki/Elephantomyia%20bozenae
Elephantomyia (Elephantomyia) bozenae is an extinct species of crane fly in the family Limoniidae. The species is solely known from the Middle Eocene Baltic amber deposits in the Baltic Sea region of Europe. The species is one of six described from Baltic amber. History and classification Elephantomyia (Elephantomyia) bozenae is known only from the holotype specimen, collection number MP/3338, which is preserved as an inclusion in transparent Baltic amber. As of 2015, the amber specimen was included in the collections of the Polish Academy of Sciences. Baltic amber is recovered from fossil-bearing rocks in the Baltic Sea region of Europe. Estimates of the amber's age range from 37 million years old, for the youngest sediments, to 48 million years old. This age range straddles the middle Eocene, ranging from near the beginning of the Lutetian to the beginning of the Priabonian. E. bozenae is one of six crane fly species in the genus Elephantomyia described from Baltic amber, the others being E. baltica, E. brevipalpa, E. irinae, E. longirostris, and E. pulchella. All six species are placed into the Elephantomyia subgenus Elephantomyia based on the lack of tibial spurs and on several aspects of the wing morphology. The type specimen was first studied by paleoentomologist Iwona Kania, of the University of Rzeszów, whose 2015 type description for the species was published in the journal PLoS ONE. The specific epithet bozenae was coined to honor the biologist Bożena Szala. Description The E. bozenae type specimen is a well-preserved male that is approximately long, not including the rostrum. The head has a rostrum that is long, just over half the length of the fore-wing and longer than the abdomen. The rostrum bears an elongate palpus at its tip. Each palpus is composed of four segments, with the basal three segments long and the apical segment short. All four segments host a system of microtrichia. The antennae are small, composed of fifteen segments. The
https://en.wikipedia.org/wiki/%C5%81o%C5%9B%E2%80%93Tarski%20preservation%20theorem
The Łoś–Tarski theorem is a theorem in model theory, a branch of mathematics, that states that the set of formulas preserved under taking substructures is exactly the set of universal formulas. The theorem was discovered by Jerzy Łoś and Alfred Tarski. Statement Let $T$ be a theory in a first-order logic language $L$ and $\Phi(\bar{x})$ a set of formulas of $L$. (The sequence of variables $\bar{x}$ need not be finite.) Then the following are equivalent:
If $A$ and $B$ are models of $T$, $A \subseteq B$, and $\bar{a}$ is a sequence of elements of $A$ such that $B \models \Phi(\bar{a})$, then $A \models \Phi(\bar{a})$. ($\Phi$ is preserved in substructures for models of $T$.)
$\Phi$ is equivalent modulo $T$ to a set $\Psi(\bar{x})$ of $\forall_1$ formulas of $L$.
A formula is $\forall_1$ if and only if it is of the form $\forall \bar{y}\, \psi(\bar{x}, \bar{y})$ where $\psi$ is quantifier-free. In more common terms, this states that every first-order formula is preserved under induced substructures if and only if it is $\forall_1$, i.e. logically equivalent to a first-order universal formula. As substructures and embeddings are dual notions, this theorem is sometimes stated in its dual form: every first-order formula is preserved under embeddings on all structures if and only if it is $\exists_1$, i.e. logically equivalent to a first-order existential formula. Note that this property fails for finite models.
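A worked illustration (our own example, not one from the article): the commutativity axiom
\[
\forall x\, \forall y\; (x \cdot y = y \cdot x)
\]
is universal, so it holds in every substructure of a structure in which it holds. By contrast, the existential sentence $\exists x\; (x \cdot x = x)$ (the existence of an idempotent) is preserved in the dual direction, under embeddings into extensions, but can fail when passing to a substructure that contains no idempotent.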
https://en.wikipedia.org/wiki/Microsoft%20HoloLens
Microsoft HoloLens is an augmented reality (AR)/mixed reality (MR) headset developed and manufactured by Microsoft. HoloLens runs the Windows Mixed Reality platform under the Windows 10 operating system. Some of the positional tracking technology used in HoloLens can trace its lineage to the Microsoft Kinect, an accessory for Microsoft's Xbox 360 and Xbox One game consoles that was introduced in 2010. The pre-production version of HoloLens, the Development Edition, shipped on March 30, 2016, and was targeted at developers in the United States and Canada for a list price of $3,000, which allowed hobbyists, professionals, and corporations to participate in the pre-production version of HoloLens. Samsung and Asus have extended an offer to Microsoft to help produce their own mixed-reality products, in collaboration with Microsoft, based around the concept and hardware of HoloLens. On October 12, 2016, Microsoft announced global expansion of HoloLens and publicized that HoloLens would be available for preorder in Australia, Ireland, France, Germany, New Zealand and the United Kingdom. There is also a commercial suite (similar to a pro edition of Windows), with enterprise features such as BitLocker security. As of May 2017, the suite sold for US$5,000. Microsoft has also decided to rent out the HoloLens, so that clients need not make the full investment; it partners with a company called Abcomrents to provide HoloLens rental services. HoloLens 2 was announced at the Mobile World Congress (MWC) in Barcelona, Spain, on February 24, 2019, and was available on preorder at $3,500. Description The HoloLens is a head-mounted display unit connected to an adjustable, cushioned inner headband, which can tilt HoloLens up and down, as well as forward and backward. To wear the unit, the user fits the HoloLens on their head, using an adjustment wheel at the back of the headband to secure it around the crown, supporting and distributing the weight of the unit equally for comfort, before tiltin
https://en.wikipedia.org/wiki/Bojan%20Mohar
Bojan Mohar (born September 21, 1956) is a Slovenian and Canadian mathematician, working in graph theory. He is a professor of mathematics at the University of Ljubljana and the holder of a Canada Research Chair in graph theory at Simon Fraser University in Vancouver, British Columbia, Canada. Education Mohar received his PhD from the University of Ljubljana in 1986, under the supervision of Tomo Pisanski. Research Mohar's research concerns topological graph theory, algebraic graph theory, graph minors, and graph coloring. With Carsten Thomassen he is the co-author of the book Graphs on Surfaces (Johns Hopkins University Press, 2001). Awards and honors Mohar was a Fulbright visiting scholar at Ohio State University in 1988, and won the Boris Kidrič prize of the Socialist Republic of Slovenia in 1990. He has been a member of the Slovenian Academy of Engineering since 1999. He was named a SIAM Fellow in 2018. He was elected as a Fellow of the American Mathematical Society in the 2020 Class, for "contributions to topological graph theory, including the theory of graph embedding algorithms, graph coloring and crossing numbers, and for service to the profession".
https://en.wikipedia.org/wiki/Ryukyu%20kingfisher
The Ryukyu kingfisher (Todiramphus cinnamominus miyakoensis) is an enigmatic taxon of tree kingfisher. It is extinct and is only known from a single specimen. Its taxonomic status is doubtful; it is most likely a subspecies of the Guam kingfisher, which would make its scientific name Todiramphus cinnamominus miyakoensis. As the specimen is at the Yamashina Institute for Ornithology, the question could be resolved using DNA sequence analysis; at any rate, the Guam kingfisher is almost certainly the closest relative of the Ryukyu bird. The IUCN considers this bird a subspecies and has hence struck it from its Red List. The one known bird, probably a male, was according to its label collected on Miyako-jima, the main island of the Miyako group, Ryūkyū Shotō, on February 5, 1887. While it is often and correctly stated that specimen labels may be incorrect or misleading, the locality, to the northwest of the extant populations of Todiramphus cinnamominus, seems sound in a biogeographical sense. At least the specimen labels of Ryukyu collections by later Japanese collectors are usually very reliable; whether this is true for earlier collections too is not known. The only differences between the Miyako-jima bird and males of the Guam kingfisher (the nominate subspecies of the Micronesian kingfisher; presently only surviving in captivity) are the former's lack of a black nape band and the red feet (black in Guam birds). The bill color is unknown due to damage to the specimen, and supposed differences in the proportion of the remiges are almost certainly an artifact of specimen preparation. Indeed, the specimen was not recognized as distinct until some 30 years after its collection. If the bird was indeed a resident of the Miyako group (and as there was better habitat on neighboring Irabu-jima, it is probable that it would have been found there too), it became extinct in the late 19th century. While this seems early, the population must have always been small as there never
https://en.wikipedia.org/wiki/Web%20audience%20measurement
Web Audience Measurement (WAM) is an audience measurement and website analytics tool that measures Internet usage in India. The system, a joint effort of IMRB International and the Internet and Mobile Association of India, surveys over 6000 individuals across 8 metropolitan centers in India and tracks a variety of metrics such as time-on-site, exposure, reach and frequency of Internet usage. WAM is a continuous tracking panel study that provides cross-sectional data on Internet usage segmented by gender, SEC and location. This panel-based approach uses metering technology, designed for an Indian context, that tracks computers. Web Rating Points factor in multiple measures of Internet usage to provide a more comprehensive picture to web advertisers, and the system attempts to standardize web analytics in India. The web analytics market in India is currently fragmented, with Comscore and Vizisense being IMRB's key competitors. Several discussions revolve around the differences between the numbers provided by the competitors in the digital audience measurement space, which creates rifts between users of different audience measurement tools. Therefore, choosing the right measurement partner is imperative for media stakeholders.