https://en.wikipedia.org/wiki/Fractional%20crystallization%20%28chemistry%29
In chemistry, fractional crystallization is a method of refining substances based on differences in their solubility. It fractionates via differences in crystallization (the forming of crystals). If a mixture of two or more substances in solution is allowed to crystallize, for example by allowing the temperature of the solution to decrease or increase, the precipitate will contain more of the least soluble substance. The proportion of components in the precipitate will depend on their solubility products. If the solubility products are very similar, a cascade process will be needed to effect a complete separation. This technique is often used in chemical engineering to obtain pure substances, or to recover saleable products from waste solutions. Fractional crystallization can also be used to separate solid–solid mixtures; an example of this is separating KNO3 and KClO3. See also Cold water extraction Fractional crystallization (geology) Fractional freezing Laser-heated pedestal growth Pumpable ice technology Recrystallization (chemistry) Seed crystal Single crystal
https://en.wikipedia.org/wiki/AMD%20XGP
AMD XGP (eXternal Graphics Platform) is a brand for an external graphics solution for laptops and notebooks by AMD. The technology was announced on June 4, 2008 at the Computex 2008 trade show, following the announcement of the codenamed Puma notebook platform. Development Originally reported by Hexus.net as a side project to the R600 series graphics cards. Codenamed Lasso, the project was an external graphics solution using desktop video cards, with data sent via two cables as defined in the PCI Express external cabling specification (version 1.0). The project would later fall into development hell with unknown development status. In June 2008, near Computex 2008, rumours surfaced on the Internet that AMD was preparing an external graphics solution for notebook computers, but using a proprietary connectivity solution instead. The ATI XGP was officially announced on June 4, 2008 during the course of the Computex 2008 exhibition. Technology The XGP platform consists of several parts, including a Mobility Radeon HD graphics card, an external case and a proprietary connectivity solution. Graphics Single GPU configuration Mobility Radeon HD 3870 (M88 core) AVIVO HD with UVD Integrated HDCP Mobility Radeon HD 5870 (Broadway core) ATI CrossFireX technology (dual GPU configuration) Mobility Radeon HD 3850 X2 Mobility Radeon HD 3870 X2 Connectivity Data Proprietary connectivity solution designed in collaboration with JAE Electronics Transfers PCI Express 2.0 signals between the XGP and the notebook computer PCI Express 2.0 compliant 8-lane and 16-lane options available Hot plug detection AMD has one year of exclusivity on the use of the connector 2 USB 2.0 ports For connecting external disc players Connected via the signal pairs between the southbridge and the USB hub via the cable Visual Supports up to four displays via the following visual outputs: DVI-I HDMI Optional DisplayPort Consumer products Laptop Fujitsu-Siemens Amilo Sa 3650 Acer Ferrari One 200 Enclosure/doc
https://en.wikipedia.org/wiki/R-algebroid
In mathematics, R-algebroids are constructed starting from groupoids. These are more abstract concepts than the Lie algebroids that play a similar role in the theory of Lie groupoids to that of Lie algebras in the theory of Lie groups. (Thus, a Lie algebroid can be thought of as 'a Lie algebra with many objects'.) Definition An R-algebroid, RG, is constructed from a groupoid G as follows. The object set of RG is the same as that of G, and RG(b,c) is the free R-module on the set G(b,c), with composition given by the usual bilinear rule, extending the composition of G. R-category A groupoid G can be regarded as a category with invertible morphisms. Then an R-category is defined as an extension of the R-algebroid concept by replacing the groupoid in this construction with a general category C that does not have all morphisms invertible. R-algebroids via convolution products One can also define the R-algebroid, RG(b,c), to be the set of functions G(b,c) → R with finite support, with the convolution product defined by (f ∗ g)(z) = Σ f(x)g(y), the sum running over all factorizations z = xy in G. Only this second construction is natural for the topological case, when one needs to replace 'function' by 'continuous function with compact support', and in this case . Examples Every Lie algebra is a Lie algebroid over the one-point manifold. The Lie algebroid associated to a Lie groupoid.
https://en.wikipedia.org/wiki/Iterative%20refinement
Iterative refinement is an iterative method proposed by James H. Wilkinson to improve the accuracy of numerical solutions to systems of linear equations. When solving a linear system Ax = b, the computed solution x̂ may, due to the compounded accumulation of rounding errors, deviate from the exact solution x*. Starting with x_1 = x̂, iterative refinement computes a sequence {x_1, x_2, x_3, …} which converges to x* when certain assumptions are met. Description The mth iteration of iterative refinement consists of three steps: (i) compute the residual r_m = b − A x_m; (ii) solve the correction equation A d_m = r_m; (iii) add the correction, x_{m+1} = x_m + d_m. The crucial reasoning for the refinement algorithm is that although the solution for d_m in step (ii) may indeed be troubled by similar errors as the first solution, x̂, the calculation of the residual r_m in step (i), in comparison, is numerically nearly exact: you may not know the right answer very well, but you know quite accurately just how far the solution you have in hand is from producing the correct outcome (b). If the residual r_m is small in some sense, then the correction d_m must also be small, and should at the very least steer the current estimate of the answer, x_m, closer to the desired one, x*. The iterations will stop on their own when the residual is zero, or close enough to zero that the corresponding correction is too small to change the solution which produced it; alternatively, the algorithm stops when r_m is too small to convince the linear algebraist monitoring the progress that it is worth continuing with any further refinements. Note that the matrix equation solved in step (ii) uses the same matrix A for each iteration. If the matrix equation is solved using a direct method, such as Cholesky or LU decomposition, the numerically expensive factorization of A is done once and is reused for the relatively inexpensive forward and back substitution to solve for d_m at each iteration. Error analysis As a rule of thumb, iterative refinement for Gaussian elimination produces a solution correct to working precision if double the working precision is used in the computation of
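A minimal sketch of the scheme in Python, assuming NumPy and SciPy are available (names follow the description above): the expensive LU factorization is computed once, after which each refinement pass costs only a residual evaluation and two triangular solves.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def iterative_refinement(A, b, tol=1e-14, max_iter=10):
    """Refine the solution of A x = b, reusing one LU factorization of A."""
    lu, piv = lu_factor(A)          # expensive factorization, done once
    x = lu_solve((lu, piv), b)      # initial (possibly inaccurate) solution
    for _ in range(max_iter):
        # Residual: ideally computed in higher precision than the solves.
        r = b - A @ x
        d = lu_solve((lu, piv), r)  # cheap forward/back substitution
        x = x + d
        if np.linalg.norm(d) <= tol * np.linalg.norm(x):
            break                   # correction too small to matter
    return x
```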
https://en.wikipedia.org/wiki/List%20of%20impossible%20puzzles
This is a list of puzzles that cannot be solved. An impossible puzzle is a puzzle that cannot be solved, either due to a lack of sufficient information or due to a logical impossibility. 15 puzzle – Slide fifteen numbered tiles into numerical order. Impossible for half of the starting positions (see the parity-check sketch below). Five room puzzle – Cross each wall of a diagram exactly once with a continuous line. MU puzzle – Transform the string MI to MU according to a set of rules. Mutilated chessboard problem – Place 31 dominoes of size 2×1 on a chessboard with two opposite corners removed. Coloring the edges of the Petersen graph with three colors. Seven Bridges of Königsberg – Walk through a city while crossing each of seven bridges exactly once. Three cups problem – Turn three cups right-side up after starting with one wrong and turning two at a time. Three utilities problem – Connect three cottages to gas, water, and electricity without crossing lines. Thirty-six officers problem – Arrange six regiments consisting of six officers each of different ranks in a 6 × 6 square so that no rank or regiment is repeated in any row or column. See also Impossible Puzzle, or "Sum and Product Puzzle", which is not impossible -gry, a word puzzle List of undecidable problems, no algorithm can exist to answer a yes–no question about the input
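For the 15 puzzle, the impossibility of half of the starting positions follows from a conserved parity: every legal move swaps the blank with one adjacent tile, flipping the permutation's parity, and simultaneously changes the blank's taxicab distance from its home square by one. A small illustrative Python check of this invariant (not from the source article):

```python
from itertools import combinations

def is_solvable(board):
    """board: a tuple of 16 ints, 0..15, read row by row; 0 is the blank.

    A position is solvable iff the permutation parity of all 16 pieces
    equals the parity of the blank's taxicab distance from its home
    (bottom-right) square, since each move flips both parities together.
    This is why exactly half of all arrangements are impossible.
    """
    seq = [v if v else 16 for v in board]          # treat the blank as piece 16
    inversions = sum(1 for i, j in combinations(range(16), 2)
                     if seq[i] > seq[j])
    blank = board.index(0)
    dist = (3 - blank // 4) + (3 - blank % 4)      # taxicab distance to home
    return inversions % 2 == dist % 2

solved = tuple(list(range(1, 16)) + [0])
swapped = tuple(list(range(1, 14)) + [15, 14, 0])  # tiles 14 and 15 exchanged
print(is_solvable(solved))   # True
print(is_solvable(swapped))  # False — the classic impossible position
```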
https://en.wikipedia.org/wiki/Mericitabine
Mericitabine (RG-7128) is an antiviral drug, a deoxycytidine analog (a type of nucleoside analog). It was developed as a treatment for hepatitis C, acting as an NS5B RNA polymerase inhibitor, but while it showed a good safety profile in clinical trials, it was not sufficiently effective to be used as a stand-alone agent. However, mericitabine has been shown to boost the efficacy of other antiviral drugs when used alongside them, and as most modern treatment regimens for hepatitis C use a combination therapy of several antiviral drugs, clinical trials have continued to see if it can form part of a clinically useful drug treatment program. See also BCX4430 MK-608 NITD008 Valopicitabine
https://en.wikipedia.org/wiki/Rollo%20Davidson%20Prize
The Rollo Davidson Prize is a prize awarded annually to early-career probabilists by the Rollo Davidson trustees. It is named after the English mathematician Rollo Davidson (1944–1970). Rollo Davidson Trust In 1970, Rollo Davidson, a Fellow-elect of Churchill College, Cambridge, died on Piz Bernina, a mountain in Switzerland. In 1975, a trust fund was established at Churchill College in his memory, endowed initially through the publication in his honour of two volumes of papers, edited by E. F. Harding and D. G. Kendall. The Rollo Davidson Trust has awarded an annual prize to young probabilists since 1976, and has organized occasional lectures in honour of Davidson. Since 2012 the Trust has also awarded an annual Thomas Bond Sprague Prize. List of recipients of the Rollo Davidson Prize 1976 – Brian D. Ripley 1977 – Olav Kallenberg 1978 – Zhen-ting Hou 1979 – Frank Kelly 1980 – David Aldous and Erik Jørgensen 1981 – John C. Gittins 1982 – Rouben V. Ambartzumian and Persi Diaconis 1983 – Ed Perkins 1984 – Martin Thomas Barlow and Chris Rogers 1985 – Piet Groeneboom and Terence John Lyons 1986 – Peter Hall and Jean-François Le Gall 1987 – Yao-chi Yu, Jie-zhong Zou, and Andrew Carverhill 1988 – Peter Baxendale, Imre Z. Ruzsa, and Gábor J. Székely 1989 – Geoffrey Grimmett and Rémi Léandre 1990 – Steven Evans 1991 – Alain-Sol Sznitman 1992 – Krzysztof Burdzy 1993 – Gérard Ben Arous and Robin Pemantle 1994 – Thomas Mountford and Laurent Saloff-Coste 1995 – Philippe Biane and Yuval Peres 1996 – Bruce Driver and Jean Bertoin 1997 – James Norris and 1998 – Davar Khoshnevisan and Wendelin Werner 1999 – Raphaël Cerf and Gareth Roberts 2000 – Kurt Johansson and David Wilson 2001 – Richard Kenyon 2002 – Stanislav Smirnov and Balaji Prabhakar 2003 – Alice Guionnet 2004 – Alexander Holroyd and Itai Benjamini 2005 – Olle Häggström and Neil O'Connell 2006 – Scott Sheffield 2007 – Remco van der Hofstad 2008 – Brian Rider and Bálint Virág 2009 – Grég
https://en.wikipedia.org/wiki/GNU%20Linear%20Programming%20Kit
The GNU Linear Programming Kit (GLPK) is a software package intended for solving large-scale linear programming (LP), mixed integer programming (MIP), and other related problems. It is a set of routines written in ANSI C and organized in the form of a callable library. The package is part of the GNU Project and is released under the GNU General Public License. GLPK uses the revised simplex method and the primal-dual interior point method for non-integer problems, and the branch-and-bound algorithm together with Gomory's mixed integer cuts for (mixed) integer problems. History GLPK was developed by Andrew O. Makhorin (Андрей Олегович Махорин) of the Moscow Aviation Institute. The first public release was in October 2000. Version 1.1.1 contained a library for a revised primal and dual simplex algorithm. Version 2.0 introduced an implementation of the primal-dual interior point method. Version 2.2 added branch-and-bound solving of mixed integer problems. Version 2.4 added a first implementation of the GLPK/L modeling language. Version 4.0 replaced GLPK/L with the GNU MathProg modeling language, which is a subset of the AMPL modeling language. Interfaces and wrappers Since version 4.0, GLPK problems can be modeled using GNU MathProg (GMPL), a subset of the AMPL modeling language used only by GLPK. However, GLPK is most commonly called from other programming languages. Wrappers exist for: Julia and the JuMP modeling package Java (using OptimJ)
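As an illustration of calling GLPK from another language, here is a minimal sketch in Python using the third-party PuLP wrapper; it assumes PuLP and the glpsol executable are installed, and the tiny LP itself is made up for the example.

```python
from pulp import LpMaximize, LpProblem, LpVariable, GLPK_CMD, value

# Maximize 3x + 2y subject to x + y <= 4 and x + 3y <= 6, with x, y >= 0.
prob = LpProblem("small_lp", LpMaximize)
x = LpVariable("x", lowBound=0)
y = LpVariable("y", lowBound=0)
prob += 3 * x + 2 * y            # first expression added is the objective
prob += x + y <= 4               # constraint 1
prob += x + 3 * y <= 6           # constraint 2

prob.solve(GLPK_CMD(msg=False))  # hand the model to GLPK's simplex solver
print(value(x), value(y), value(prob.objective))  # 4.0 0.0 12.0
```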
https://en.wikipedia.org/wiki/Protein-S-isoprenylcysteine%20O-methyltransferase
The isoprenylcysteine O-methyltransferase carries out carboxyl methylation of cleaved eukaryotic proteins that terminate in a CaaX motif. In Saccharomyces cerevisiae (baker's yeast) this methylation is carried out by Ste14p, an integral endoplasmic reticulum membrane protein. Ste14p is the founding member of the isoprenylcysteine carboxyl methyltransferase (ICMT) family, whose members share significant sequence homology. The enzyme catalyzes the chemical reaction S-adenosyl-L-methionine + protein C-terminal S-farnesyl-L-cysteine ⇌ S-adenosyl-L-homocysteine + protein C-terminal S-farnesyl-L-cysteine methyl ester. Thus, the two substrates of this enzyme are S-adenosyl methionine and protein C-terminal S-farnesyl-L-cysteine, whereas its two products are S-adenosylhomocysteine and protein C-terminal S-farnesyl-L-cysteine methyl ester.
https://en.wikipedia.org/wiki/Content-addressable%20storage
Content-addressable storage (CAS), also referred to as content-addressed storage or fixed-content storage, is a way to store information so it can be retrieved based on its content, not its name or location. It has been used for high-speed storage and retrieval of fixed content, such as documents stored for compliance with government regulations. Content-addressable storage is similar to content-addressable memory. CAS systems work by passing the content of the file through a cryptographic hash function to generate a unique key, the "content address". The file system's directory stores these addresses and a pointer to the physical storage of the content. Because an attempt to store the same file will generate the same key, CAS systems ensure that the files within them are unique, and because changing the file will result in a new key, CAS systems provide assurance that the file is unchanged. CAS became a significant market during the 2000s, especially after the introduction of the 2002 Sarbanes–Oxley Act which required the storage of enormous numbers of documents for long periods and retrieved only rarely. Ever-increasing performance of traditional file systems and new software systems have eroded the value of legacy CAS systems, which have become increasingly rare after roughly 2018. However, the principles of content addressability continue to be of great interest to computer scientists, and form the core of numerous emerging technologies, such as peer-to-peer file sharing, cryptocurrencies, and distributed computing. Description Location-based approaches Traditional file systems generally track files based on their filename. On random-access media like a floppy disk, this is accomplished using a directory that consists of some sort of list of filenames and pointers to the data. The pointers refer to a physical location on the disk, normally using disk sectors. On more modern systems and larger formats like hard drives, the directory is itself split into many
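A toy Python sketch of the core idea just described: the address is a cryptographic hash of the content itself, so identical files collapse into one stored copy and any modification produces a new address. The in-memory dict stands in for the physical storage layer; class and method names are illustrative.

```python
import hashlib

class ContentAddressableStore:
    def __init__(self):
        self._blobs = {}                      # content address -> bytes

    def put(self, data: bytes) -> str:
        address = hashlib.sha256(data).hexdigest()
        self._blobs[address] = data           # duplicate puts are no-ops
        return address

    def get(self, address: str) -> bytes:
        return self._blobs[address]

cas = ContentAddressableStore()
addr = cas.put(b"compliance document, v1")
assert cas.get(addr) == b"compliance document, v1"
# Storing the same bytes again yields the same address (deduplication) ...
assert cas.put(b"compliance document, v1") == addr
# ... while any change yields a different address (tamper evidence).
assert cas.put(b"compliance document, v2") != addr
```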
https://en.wikipedia.org/wiki/Largest%20organisms
This article lists the largest organisms for various types of life, and mostly considers extant species. The largest organism found on Earth can be determined according to various aspects of an organism's size, such as: mass, volume, area, length, height, or even genome size. Some organisms group together to form a superorganism (such as ants or bees), but such groups are not classed as single large organisms. The Great Barrier Reef is the world's largest structure composed of living entities, stretching , but it contains many organisms of many types of species. If considered individual entities, the largest organisms are clonal colonies which can spread over large areas. Pando, a clonal colony of the quaking aspen tree, is widely considered to be the largest such organism by mass. Even if such colonies are excluded, trees retain their dominance of this listing, with the giant sequoia being the most massive tree. In 2006 a huge clonal colony of the seagrass Posidonia oceanica was discovered south of the island of Ibiza. At across, and estimated at 100,000 years old, it may be one of the largest and oldest clonal colonies on Earth. Among animals, the largest species are all marine mammals, specifically whales. The blue whale is believed to be the largest animal to have ever lived. The largest living land animals are also mammals, with the African bush elephant being the largest of these. Plants The largest single-stem tree by wood volume and mass is the giant sequoia (Sequoiadendron giganteum), native to the Sierra Nevada of California; it typically grows to a height of and in diameter. The largest organism in the world, by mass, is the quaking aspen, whose colonies of clones can grow up to long. The largest such colony is Pando, in the Fishlake National Forest in Utah. A form of flowering plant that far exceeds Pando as the largest organism on Earth in area, and probably also mass, is the giant marine plant Posidonia australis, living in Shark Bay, Australia. I
https://en.wikipedia.org/wiki/Angular%20momentum%20operator
In quantum mechanics, the angular momentum operator is one of several related operators analogous to classical angular momentum. The angular momentum operator plays a central role in the theory of atomic and molecular physics and other quantum problems involving rotational symmetry. Such an operator is applied to a mathematical representation of the physical state of a system and yields an angular momentum value if the state has a definite value for it. In both classical and quantum mechanical systems, angular momentum (together with linear momentum and energy) is one of the three fundamental properties of motion. There are several angular momentum operators: total angular momentum (usually denoted J), orbital angular momentum (usually denoted L), and spin angular momentum (spin for short, usually denoted S). The term angular momentum operator can (confusingly) refer to either the total or the orbital angular momentum. Total angular momentum is always conserved; see Noether's theorem. Overview In quantum mechanics, angular momentum can refer to one of three different, but related things. Orbital angular momentum The classical definition of angular momentum is L = r × p. The quantum-mechanical counterparts of these objects share the same relationship: L = r × p, where r is the quantum position operator, p is the quantum momentum operator, × is the cross product, and L is the orbital angular momentum operator. L (just like p and r) is a vector operator (a vector whose components are operators), i.e. L = (Lx, Ly, Lz), where Lx, Ly, Lz are three different quantum-mechanical operators. In the special case of a single particle with no electric charge and no spin, the orbital angular momentum operator can be written in the position basis as L = −iħ (r × ∇), where ∇ is the vector differential operator, del. Spin angular momentum There is another type of angular momentum, called spin angular momentum (more often shortened to spin), represented by the spin operator S. Spin is often depicted as a particle literally spinning a
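Written out by Cartesian components, the position-basis expression above gives the standard operators (reproduced here for reference):

```latex
L_x = -i\hbar\left(y\frac{\partial}{\partial z} - z\frac{\partial}{\partial y}\right),\qquad
L_y = -i\hbar\left(z\frac{\partial}{\partial x} - x\frac{\partial}{\partial z}\right),\qquad
L_z = -i\hbar\left(x\frac{\partial}{\partial y} - y\frac{\partial}{\partial x}\right).
```

These satisfy the commutation relations [Lx, Ly] = iħLz and cyclic permutations, which is the algebraic origin of angular momentum quantization.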
https://en.wikipedia.org/wiki/Joconde
Joconde is the central database, created in 1975 and now available online, maintained by the French Ministry of Culture for objects in the collections of the main French public and private museums listed as Musées de France according to article L. 441-1 of the Code du patrimoine, amounting to more than 1,200 institutions. "La Joconde" is the French name of the Mona Lisa, which, like about half of the collections of the Louvre, is included in the database, as one of 295 items by, after, or connected with Leonardo da Vinci; of these, only 42 works are by Leonardo da Vinci himself, including 6 paintings. By November 2012, Joconde contained over 475,000 objects online, over 290,000 of them with images, from 366 collections in France, including 209,350 drawings, 63,547 paintings, 34,561 prints, 34,102 sculptures and 16,631 costumes and their accessories, and it is still expanding. By June 2022 it counted 636,405 objects. The database is dedicated not only to informing the public but also to the needs of the administrators and curators of the museums, thanks to the online presentation of professional tools that notably facilitate the cataloguing and state inventory (récolement) of museum collections. This explains the great precision of the listings. Since the museums participate in the regular enrichment of the database on a voluntary basis, some can present a large part of their collection, while others appear only because of permanent deposits made by the former. Live on the French Minitel system from 1992, the database went online in 1995. Originally just for objects from the fine arts and decorative arts, in 2004 Joconde was united with what had been separate databases for objects from archeology and ethnology. It comes under the "Direction des Musées de France" (DMF) section of the Ministry. A small number of the best known objects have a prose commentary. Not all images are in colour, especially for the archaeological collections. When an object created a
https://en.wikipedia.org/wiki/Q-PACE
The CubeSat Particle Aggregation and Collision Experiment (Q-PACE, or Cu-PACE) is an orbital spacecraft mission that would have studied the early stages of proto-planetary accretion by observing particle aggregation dynamics for several years. Current hypotheses have trouble explaining how particles can grow larger than a few centimeters; this is called the meter-size barrier. The mission was selected in 2015 as part of NASA's ELaNa program, and it was launched on 17 January 2021. As of March 2021, however, contact had yet to be established with the satellite, and the mission is feared to be lost. Overview Q-PACE is led by Joshua Colwell at the University of Central Florida and was selected by NASA's CubeSat Launch Initiative (CSLI), which placed it on Educational Launch of Nanosatellites (ELaNa) XX. The development of the mission was funded through NASA's Small Innovative Missions for Planetary Exploration (SIMPLEx) program. Observations of the collisional evolution and accretion of particles in a microgravity environment are necessary to elucidate the processes that lead to the formation of planetesimals (the building blocks of planets), km-size and larger bodies, within the protoplanetary disk. The current hypotheses of planetesimal formation have difficulties in explaining how particles grow beyond one centimeter in size, so repeated experimentation in relevant conditions is necessary. Q-PACE was designed to explore the fundamental properties of low-velocity (< ) particle collisions in a microgravity environment in an effort to better understand accretion in the protoplanetary disk. Several precursor tests and flight missions were performed in suborbital flights as well as in the International Space Station. The small spacecraft does not need accurate pointing or propulsion, which simplified the design. On 17 January 2021, Q-PACE launched on a Virgin Orbit LauncherOne, an air-launch-to-orbit rocket that was dropped from the Cosmic Girl airplane over the Pacific Ocean. As
https://en.wikipedia.org/wiki/Pareto%20efficiency
Pareto efficiency or Pareto optimality is a situation where no action or allocation is available that makes one individual better off without making another worse off. The concept is named after Vilfredo Pareto (1848–1923), the Italian civil engineer and economist who used the concept in his studies of economic efficiency and income distribution. The following three concepts are closely related: Given an initial situation, a Pareto improvement is a new situation where some agents will gain, and no agents will lose. A situation is called Pareto-dominated if there exists a possible Pareto improvement. A situation is called Pareto-optimal or Pareto-efficient if no change could lead to improved satisfaction for some agent without some other agent losing or, equivalently, if there is no scope for further Pareto improvement (in other words, the situation is not Pareto-dominated). The Pareto front (also called the Pareto frontier or Pareto set) is the set of all Pareto-efficient situations, as illustrated in the sketch below. Pareto originally used the word "optimal" for the concept, but as it describes a situation where a limited number of people will be made better off under finite resources, and it does not take equality or social well-being into account, it is in effect a definition of, and better captured by, "efficiency". In addition to the context of efficiency in allocation, the concept of Pareto efficiency also arises in the context of efficiency in production vs. x-inefficiency: a set of outputs of goods is Pareto-efficient if there is no feasible re-allocation of productive inputs such that output of one product increases while the outputs of all other goods either increase or remain the same. Pareto efficiency is measured along the production possibility frontier (PPF), which is a graphical representation of all the possible options of output for two products that can be produced using all factors of production. Besides economics, the notion of Pareto efficiency has been applied to the selection
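A small Python sketch of the set-of-situations view described above: given candidate outcomes as utility vectors (the numbers are made up, and higher is better for each agent), it filters out every Pareto-dominated point, leaving the Pareto front.

```python
def dominates(a, b):
    """a Pareto-dominates b: no agent is worse off, and some agent is better off."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Keep exactly the points that no other point Pareto-dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Utility vectors for two agents (illustrative example data).
outcomes = [(3, 4), (2, 2), (4, 3), (1, 5), (3, 3)]
print(pareto_front(outcomes))   # [(3, 4), (4, 3), (1, 5)]
```

Note that (2, 2) and (3, 3) are dropped because moving to (3, 4) would make at least one agent better off and nobody worse off, which is precisely a Pareto improvement.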
https://en.wikipedia.org/wiki/Sortimo
Sortimo is a German company that develops and manufactures in-vehicle equipment and storage solutions for various trades and industries and for the automotive industry. It also builds EV charging stations. The name Sortimo stands for "SORTIment MObil" (meaning mobile assortment). The company has around 500 employees in Germany; its headquarters are in Zusmarshausen, Bavaria. The modular structure of the equipment makes it possible to equip almost any commercial vehicle. Products are exported from the Zusmarshausen production site throughout the world. History Herbert Dischinger founded Sortimo International GmbH, a subsidiary of Sortimo Grundstücks- und Beteiligungs-GmbH (Property and Participation GmbH), in 1973. Dischinger developed a metal case for the safe transport of parts and tools – its dimensions soon became the industry standard. Following the building-block principle, one product fits the other: boxes fit into the cases, and the cases into the in-vehicle equipment. The first production site in Biburg near Diedorf saw products such as the Rolliboy, tool cases, components cases and case cabinets. In 1980 Sortimo began to produce its own products instead of having them made under license by sub-contract manufacturers. Ten years later Sortimo International Ausrüstungssysteme für Servicefahrzeuge GmbH (equipment systems for service vehicles) was founded. In 1992 all production sites were centralised at the new plant in Zusmarshausen and a wide variety of new products were included in the production programme. Zusmarshausen continues to be the central production site. In 1994 Sortimo was the first in the industry to have its quality management certified in accordance with DIN EN ISO 9001. All products carry the TÜV seal for proven quality. Construction of the new logistics centre in Zusmarshausen started in the spring of 2008. Sortimo offers a range of in-vehicle equipment including the T-BOXX, L-BOXX, and LS-BOXX. It introduced the Sortimo Globelyst in 2004, which consists of over 1400 c
https://en.wikipedia.org/wiki/Bounded%20deformation
In mathematics, a function of bounded deformation is a function whose distributional derivatives are not quite well-behaved enough to qualify as functions of bounded variation, although the symmetric part of the derivative matrix does meet that condition. Thought of as deformations of elasto-plastic bodies, functions of bounded deformation play a major role in the mathematical study of materials, e.g. the Francfort–Marigo model of brittle crack evolution. More precisely, given an open subset Ω of Rn, a function u : Ω → Rn is said to be of bounded deformation if the symmetrized gradient ε(u) of u, the distribution ε(u) = (Du + Duᵀ)/2, is a bounded, symmetric n × n matrix-valued Radon measure. The collection of all functions of bounded deformation is denoted BD(Ω; Rn), or simply BD; the space was introduced essentially by P.-M. Suquet in 1978. BD is a strictly larger space than the space BV of functions of bounded variation. One can show that if u is of bounded deformation then the measure ε(u) can be decomposed into three parts: one absolutely continuous with respect to Lebesgue measure, denoted e(u) dx; a jump part, supported on a rectifiable (n − 1)-dimensional set Ju of points where u has two different approximate limits u+ and u−, together with a normal vector νu; and a "Cantor part", which vanishes on Borel sets of finite Hn−1-measure (where Hk denotes k-dimensional Hausdorff measure). A function u is said to be of special bounded deformation if the Cantor part of ε(u) vanishes, so that the measure can be written as ε(u) = e(u) dx + (u+ − u−) ⊙ νu Hn−1|Ju, where Hn−1|Ju denotes Hn−1 restricted to the jump set Ju and ⊙ denotes the symmetrized dyadic product, a ⊙ b = (a ⊗ b + b ⊗ a)/2. The collection of all functions of special bounded deformation is denoted SBD(Ω; Rn), or simply SBD.
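In display form, the three-part decomposition sketched above reads as follows (standard BD notation from the literature, reproduced here for reference; the middle term is the jump part and ε^c(u) denotes the Cantor part):

```latex
\varepsilon(u) \;=\; e(u)\,\mathcal{L}^n
  \;+\; (u^+ - u^-)\odot\nu_u\,\mathcal{H}^{n-1}\llcorner J_u
  \;+\; \varepsilon^{c}(u),
\qquad
a \odot b \;=\; \tfrac{1}{2}\,(a\otimes b + b\otimes a).
```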
https://en.wikipedia.org/wiki/Scott%20Wilson%20%28composer%29
Scott Wilson (born November 26, 1969, in Vancouver) is a Canadian composer. He studied music and composition in Canada, the U.S., and Germany, and his teachers include Barry Truax, Wolfgang Rihm, Christos Hatzis, Gary Kulesha, Ron Kuivila, Alvin Lucier, Owen Underhill, Neely Bruce and David Gordon Duke. Since 2004 he has lived in Birmingham, UK, where he is Reader in Electronic Music and Director of the Birmingham ElectroAcoustic Sound Theatre and the Electroacoustic Studios at the University of Birmingham. His works include pieces for both instrumental and electroacoustic forces. As Director of the Birmingham ElectroAcoustic Sound Theatre he has developed custom software, and he is an active developer of the SuperCollider computer music language. He was the lead editor of The SuperCollider Book published by MIT Press, was a co-author of Electronic Music with Nick Collins (composer) and Margaret Schedel published by Cambridge University Press, and has published in journals including Organised Sound and the Computer Music Journal. His music has been performed internationally, with performances at the Huddersfield Festival, the Mouvement Festival, the Trash Festival, Open Ears, the Inventionen Festival in Berlin, and the Cool Drummings Festival, and broadcast on CBC Radio 2, Radio France, Netherlands Concertzender, and BBC Radio 3. His works have been performed by Darragh Morgan, Esprit Orchestra, the Birmingham Contemporary Music Group, and others, and have been recorded on labels including Edition RZ, Diatribe Records, and 326 music. Notable former students include Sergio Luque, Norah Lorway, Richard Bullen, Shelly Knotts, and Sam Pluta. External links Official site University of Birmingham Staff Page Canadian Music Centre Page University of Birmingham Electroacoustic Studios
https://en.wikipedia.org/wiki/Mir-572%20microRNA%20precursor%20family
In molecular biology, mir-572 is a microRNA, a short RNA molecule. MicroRNAs regulate the expression levels of other genes by several mechanisms. See also MicroRNA
https://en.wikipedia.org/wiki/Nature%20Metabolism
Nature Metabolism is a monthly peer-reviewed academic journal published by Nature Portfolio. It was established in 2019. The editor-in-chief is Christoph Schmitt. Abstracting and indexing The journal is abstracted and indexed in: PubMed/MEDLINE Science Citation Index Expanded Scopus According to the Journal Citation Reports, the journal has a 2021 impact factor of 19.950, ranking it 5th out of 146 journals in the category "Endocrinology & Metabolism".
https://en.wikipedia.org/wiki/Moxi%20%28DVR%29
Moxi was a line of high-definition digital video recorders produced by Moxi Digital, Digeo, and Arris International. Moxi was originally released only to cable operators, but in December 2008 it was released as a retail product. Moxi was removed from the market in November 2011. The former retail product, the Moxi HD DVR, provided a high-definition user interface with support for either two or three CableCARD TV tuners. Arris also offered a companion appliance, the Moxi Mate, which could stream live or recorded TV from a Moxi HD DVR. History Digeo was founded in 1999 (originally under the name Broadband Partners, Inc.) by Microsoft co-founder Paul Allen, with headquarters in Kirkland, Washington. In the same year, Rearden Steel was started by Steve Perlman, founder of WebTV, under a veil of secrecy. In 2000, Rearden Steel was renamed Moxi Digital while unveiling a line of media centers designed to bridge the gap between personal computers and televisions. Digeo, Inc. purchased Moxi Digital in 2002. Digeo kept its own name but adopted Moxi as its product family name. Its Palo Alto offices and most of Moxi Digital's staff were kept. Digeo also adopted most of the Moxi hardware (originally focused on satellite consumer electronics), as well as some of the Linux extensions, which were merged into Digeo's own Linux-based infrastructure and cable-specific hardware with Digeo's Emmy award-winning user interface, known as the Moxi Menu. On September 22, 2009, the assets of Digeo, Inc. were purchased by Arris International. Arris announced it would continue to develop and market the Moxi product line to both retail customers and cable operators. Retail DVR products The Moxi HD DVR was a high-definition digital video recorder (DVR) with both three-tuner and two-tuner models available, though the two-tuner model was produced only briefly before being updated. It was designed for use with cable television and supported multi-stream CableCARDs, as well as channel scanning f
https://en.wikipedia.org/wiki/Eastern%20blot
The eastern blot, or eastern blotting, is a biochemical technique used to analyze protein post-translational modifications, including the addition of lipids, phosphates, and glycoconjugates. It is most often used to detect carbohydrate epitopes. Thus, the eastern blot can be considered an extension of the biochemical technique of the western blot. Multiple techniques have been described by the term "eastern blot(ting)"; most use phosphoprotein blotted from a sodium dodecyl sulfate–polyacrylamide gel electrophoresis (SDS-PAGE) gel onto a polyvinylidene fluoride or nitrocellulose membrane. Transferred proteins are analyzed for post-translational modifications using probes that may detect lipids, carbohydrates, phosphorylation, or any other protein modification. Eastern blotting should be used to refer to methods that detect their targets through specific interaction of the post-translational modifications and the probe, distinguishing them from a standard far-western blot. In principle, eastern blotting is similar to lectin blotting (i.e., detection of carbohydrate epitopes on proteins or lipids). History and multiple definitions The definition of the term eastern blot is somewhat confused due to multiple sets of authors dubbing a new method as eastern blot, or a derivative thereof. All of the definitions are derivatives of the technique of the western blot developed by Towbin in 1979. The current definitions are summarized below in order of the first use of the name; however, all are based on some earlier works. In some cases, the technique had been in practice for some time before the introduction of the term. (1982) The term eastern blotting was specifically rejected by two separate groups: Reinhart and Malamud referred to a protein blot of a native gel as a native blot; Peferoen et al. opted to refer to their method of drawing sodium dodecyl sulfate-gel separated proteins onto nitrocellulose using a vacuum as vacuum blotting. (1984) Middle-eastern blotting has been described as a
https://en.wikipedia.org/wiki/Physics%20of%20financial%20markets
Physics of financial markets is a discipline that studies financial markets as physical systems. It seeks to understand the nature of financial processes and phenomena by employing the scientific method and avoiding beliefs, unverifiable assumptions, and immeasurable notions that are not uncommon in economic disciplines. Physics of financial markets addresses issues such as the theory of price formation, price dynamics, market ergodicity, collective phenomena, market self-action, and market instabilities. Physics of financial markets should not be confused with mathematical finance, which is only concerned with descriptive mathematical modeling of financial instruments without seeking to understand the nature of the underlying processes. See also Econophysics Social physics Quantum economics Thermoeconomics Quantum finance Kinetic exchange models of markets Brownian model of financial markets Ergodicity economics
https://en.wikipedia.org/wiki/Blahut%E2%80%93Arimoto%20algorithm
The term Blahut–Arimoto algorithm is often used to refer to a class of algorithms for computing numerically either the information theoretic capacity of a channel, the rate-distortion function of a source, or a source encoding (i.e. compression to remove redundancy). They are iterative algorithms that eventually converge to one of the maxima of the optimization problem that is associated with these information theoretic concepts. History and application For the case of channel capacity, the algorithm was independently invented by Suguru Arimoto and Richard Blahut. In addition, Blahut's treatment gives algorithms for computing rate distortion and generalized capacity with input constraints (i.e. the capacity-cost function, analogous to rate-distortion). These algorithms are most applicable to the case of arbitrary finite alphabet sources. Much work has been done to extend it to more general problem instances. Recently, a version of the algorithm that accounts for continuous and multivariate outputs was proposed with applications in cellular signaling. There exists also a version of the Blahut–Arimoto algorithm for directed information. Algorithm for Channel Capacity A discrete memoryless channel (DMC) can be specified using two random variables X, Y with alphabets 𝒳, 𝒴, and a channel law as a conditional probability distribution p(y|x). The channel capacity, defined as C = max_{p(x)} I(X; Y), indicates the maximum efficiency that a channel can communicate, in the unit of bits per use. Now if we denote the cardinalities n = |𝒳|, m = |𝒴|, then p(y|x) is an n × m matrix, whose (i, j) entry we denote p_{ij} = p(y_j | x_i). Arimoto and Blahut found the following expression for the capacity of a DMC with this channel law: C = max_p max_Q Σ_{i,j} p_i p_{ij} log(Q_{ji} / p_i), where p and Q are maximized over the following requirements: p is a probability distribution on 𝒳, and Q is an m × n matrix that behaves like a transition matrix from 𝒴 to 𝒳 with respect to the channel law.
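A compact Python sketch of the capacity iteration under the finite-alphabet setting above. W is the channel matrix with W[i, j] = p(y_j | x_i); the loop alternates the two maximizations (posterior update, then input-distribution reweighting). This is an illustrative implementation of the standard iteration, not production code.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-12, max_iter=10_000):
    """Capacity (bits per channel use) and optimal input law of a DMC.

    W is the n x m channel matrix, each row summing to 1. Sketch only;
    assumes every channel output is reachable from some input.
    """
    n = W.shape[0]
    p = np.full(n, 1.0 / n)                  # start from the uniform input
    for _ in range(max_iter):
        q = p[:, None] * W                   # joint law p(x) p(y|x)
        q /= q.sum(axis=0, keepdims=True)    # posterior Q(x|y): inner max
        r = np.prod(q ** W, axis=1)          # exp(sum_j W_ij log Q(x_i|y_j))
        p_new = r / r.sum()                  # outer max over input laws
        done = np.max(np.abs(p_new - p)) < tol
        p = p_new
        if done:
            break
    py = p @ W                               # induced output distribution
    mask = W > 0
    I = np.sum((p[:, None] * W)[mask] * np.log2((W / py)[mask]))
    return I, p

# Binary symmetric channel, crossover 0.1: capacity = 1 - H2(0.1) ≈ 0.5310.
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto(W))
```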
https://en.wikipedia.org/wiki/Object%20code%20optimizer
An object code optimizer, sometimes also known as a post-pass optimizer or, for small sections of code, a peephole optimizer, forms part of a software compiler. It takes the output from the source language compile step – the object code or binary file – and tries to replace identifiable sections of the code with replacement code that is more algorithmically efficient (usually improved speed). Examples The earliest "COBOL Optimizer" was developed by Capex Corporation in the mid-1970s for COBOL. This type of optimizer depended, in this case, upon knowledge of 'weaknesses' in the standard IBM COBOL compiler, and actually replaced (or patched) sections of the object code with more efficient code. The replacement code might replace a linear table lookup with a binary search, for example, or sometimes simply replace a relatively slow instruction with a known faster one that was otherwise functionally equivalent within its context. This technique is now known as strength reduction. For example, on IBM System/360 hardware the CLI instruction was, depending on the particular model, between two and five times as fast as a CLC instruction for single-byte comparisons. Advantages The main advantage of re-optimizing existing programs was that the stock of already compiled customer programs (object code) could be improved almost instantly with minimal effort, reducing CPU resources at a fixed cost (the price of the proprietary software). A disadvantage was that new releases of COBOL, for example, would require (charged) maintenance to the optimizer to cater for possibly changed internal COBOL algorithms. However, since new releases of COBOL compilers frequently coincided with hardware upgrades, the faster hardware would usually more than compensate for the application programs reverting to their pre-optimized versions (until a supporting optimizer was released). Other optimizers Some binary optimizers do executable compression, which reduces the size of binary files using generic
https://en.wikipedia.org/wiki/Astroparticle%20physics
Astroparticle physics, also called particle astrophysics, is a branch of particle physics that studies elementary particles of astronomical origin and their relation to astrophysics and cosmology. It is a relatively new field of research emerging at the intersection of particle physics, astronomy, astrophysics, detector physics, relativity, solid state physics, and cosmology. Partly motivated by the discovery of neutrino oscillation, the field has undergone rapid development, both theoretically and experimentally, since the early 2000s. History The field of astroparticle physics evolved out of optical astronomy. With the growth of detector technology came the more mature field of astrophysics, which involved multiple physics subtopics, such as mechanics, electrodynamics, thermodynamics, plasma physics, nuclear physics, relativity, and particle physics. Particle physicists found astrophysics necessary due to the difficulty of producing particles with energy comparable to that of particles found in space. For example, the cosmic ray spectrum contains particles with energies as high as 10²⁰ eV, whereas a proton–proton collision at the Large Hadron Collider occurs at an energy of ~10¹² eV. The field can be said to have begun in 1910, when a German physicist named Theodor Wulf measured the ionization in the air, an indicator of gamma radiation, at the bottom and top of the Eiffel Tower. He found that there was far more ionization at the top than could be expected if only terrestrial sources accounted for this radiation. The Austrian physicist Victor Francis Hess hypothesized that some of the ionization was caused by radiation from the sky. In order to defend this hypothesis, Hess designed instruments capable of operating at high altitudes and performed observations on ionization up to an altitude of 5.3 km. From 1911 to 1913, Hess made ten flights to meticulously measure ionization levels. Through prior calculations, he did not expect there to be any ionization above an altitude of 50
https://en.wikipedia.org/wiki/Danoprevir
Danoprevir (INN) is an orally available 15-membered macrocyclic peptidomimetic inhibitor of the NS3/4A HCV protease. It contains acylsulfonamide, fluoroisoindole and tert-butyl carbamate moieties. Danoprevir is a clinical candidate based on its favorable potency profile against multiple HCV genotypes 1–6 and key mutants (GT1b, IC50 = 0.2–0.4 nM; replicon GT1b, EC50 = 1.6 nM). Danoprevir has been evaluated in an open-label, single-arm clinical trial in combination with ritonavir for treating COVID-19, and compared favourably to lopinavir/ritonavir in a second trial. History Danoprevir was initially developed by Array BioPharma and then licensed to Roche for further development and commercialization. In 2013, danoprevir was licensed by Roche to Ascletis for development and production in China under the trade name Ganovo.
https://en.wikipedia.org/wiki/Midpoint
In geometry, the midpoint is the middle point of a line segment. It is equidistant from both endpoints, and it is the centroid both of the segment and of the endpoints. It bisects the segment. Formula The midpoint of a segment in n-dimensional space whose endpoints are A = (a_1, a_2, …, a_n) and B = (b_1, b_2, …, b_n) is given by (A + B)/2. That is, the ith coordinate of the midpoint (i = 1, 2, …, n) is (a_i + b_i)/2. Construction Given two points of interest, finding the midpoint of the line segment they determine can be accomplished by a compass and straightedge construction. The midpoint of a line segment, embedded in a plane, can be located by first constructing a lens using circular arcs of equal (and large enough) radii centered at the two endpoints, then connecting the cusps of the lens (the two points where the arcs intersect). The point where the line connecting the cusps intersects the segment is then the midpoint of the segment. It is more challenging to locate the midpoint using only a compass, but it is still possible according to the Mohr–Mascheroni theorem. Geometric properties involving midpoints Circle The midpoint of any diameter of a circle is the center of the circle. Any line perpendicular to any chord of a circle and passing through its midpoint also passes through the circle's center. The butterfly theorem states that, if M is the midpoint of a chord PQ of a circle, through which two other chords AB and CD are drawn, then AD and BC intersect chord PQ at X and Y respectively, such that M is the midpoint of XY. Ellipse The midpoint of any segment which is an area bisector or perimeter bisector of an ellipse is the ellipse's center. The ellipse's center is also the midpoint of a segment connecting the two foci of the ellipse. Hyperbola The midpoint of a segment connecting a hyperbola's vertices is the center of the hyperbola. Triangle The perpendicular bisector of a side of a triangle is the line that is perpendicular to that side and passes through its midpoint. The three perpendicular bisectors of a triangle's th
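A quick worked instance of the coordinate formula, with made-up endpoints:

```latex
M = \frac{A + B}{2}, \qquad
A = (1, 2),\; B = (5, 8)
\;\Longrightarrow\;
M = \left(\frac{1+5}{2},\, \frac{2+8}{2}\right) = (3, 5).
```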
https://en.wikipedia.org/wiki/Size%20theory
In mathematics, size theory studies the properties of topological spaces endowed with ℝᵏ-valued functions, with respect to the change of these functions. More formally, the subject of size theory is the study of the natural pseudodistance between size pairs. A survey of size theory can be found in the literature. History and applications The beginning of size theory is rooted in the concept of the size function, introduced by Frosini. Size functions were initially used as a mathematical tool for shape comparison in computer vision and pattern recognition. An extension of the concept of size function to algebraic topology was made in the 1999 paper of Frosini and Mulazzani, where size homotopy groups were introduced, together with the natural pseudodistance for ℝᵏ-valued functions. An extension to homology theory (the size functor) was introduced in 2001. The size homotopy group and the size functor are strictly related to the concept of the persistent homology group studied in persistent homology. It is worth pointing out that the size function is the rank of the 0-th persistent homology group, while the relation between the persistent homology group and the size homotopy group is analogous to the one existing between homology groups and homotopy groups. In size theory, size functions and size homotopy groups are seen as tools to compute lower bounds for the natural pseudodistance: a link exists between the values taken by the size functions and the natural pseudodistance between the size pairs, and an analogous result holds for size homotopy groups. The attempt to generalize size theory and the concept of natural pseudodistance to norms that are different from the supremum norm has led to the study of other reparametrization-invariant norms. See also Size function Natural pseudodistance Size functor Size homotopy group Size pair Matching distance
https://en.wikipedia.org/wiki/Lipopeptide
A lipopeptide is a molecule consisting of a lipid connected to a peptide. Lipopeptides are able to self-assemble into different structures. Many bacteria produce these molecules as a part of their metabolism, especially those of the genera Bacillus, Pseudomonas and Streptomyces. Certain lipopeptides are used as antibiotics. Other lipopeptides are toll-like receptor agonists. Certain lipopeptides can have strong antifungal and hemolytic activities. It has been demonstrated that their activity is generally linked to interactions with the plasma membrane, and sterol components of the plasma membrane could play a major role in this interaction. It is a general trend that adding a lipid group of a certain length (typically C10–C12) to a peptide will increase its bactericidal activity. Lipopeptides with a higher number of carbon atoms in their lipid tails, for example 14 or 16, will typically have antibacterial as well as antifungal activity. Lipopeptide detergents (LPDs) are amphiphiles composed of a peptide backbone with two alkyl chains located on its last part. They were designed to mimic the architecture of native membranes, in which the two alkyl chains of a lipid molecule facially interact with the hydrophobic segment of membrane proteins (MPs). See also Pepducins, lipopeptides aimed at GPCRs.
https://en.wikipedia.org/wiki/Chemical%20Physics%20%28journal%29
Chemical Physics is a peer-reviewed scientific journal of physical chemistry. The journal was established in 1973 and is published by Elsevier on a monthly basis. The current editors are Mischa Bonn, Tianquan Lian, and Yi Luo. Indexing and abstracting The journal is indexed and abstracted in the following bibliographic databases:
https://en.wikipedia.org/wiki/Mod%20perl
mod_perl is an optional module for the Apache HTTP server. It embeds a Perl interpreter into the Apache server. In addition to allowing Apache modules to be written in the Perl programming language, it allows the Apache web server to be dynamically configured by Perl programs. However, its most common use is so that dynamic content produced by Perl scripts can be served in response to incoming requests, without the significant overhead of re-launching the Perl interpreter for each request. Slash, which runs the web site Slashdot, is written using mod_perl. Early versions of PHP were implemented in Perl using mod_perl. mod_perl can emulate a Common Gateway Interface (CGI) environment, so that existing Perl CGI scripts can benefit from the performance boost without having to be re-written. Unlike CGI (and most other web application environments), mod_perl provides complete access to the Apache API, allowing programmers to write handlers for all phases in the Apache request cycle, manipulate Apache's internal tables and state mechanisms, share data between Apache processes or threads, alter or extend the Apache configuration file parser, and add Perl code to the configuration file itself, among other things. See also CGI.pm FastCGI
https://en.wikipedia.org/wiki/Besifovir
Besifovir (INN) is an investigational medication to treat hepatitis B virus (HBV) infection. It is a novel and potent acyclic nucleotide phosphonate with a similar chemical structure to adefovir and tenofovir.
https://en.wikipedia.org/wiki/Sussman%20anomaly
The Sussman anomaly is a problem in artificial intelligence, first described by Gerald Sussman, that illustrates a weakness of noninterleaved planning algorithms, which were prominent in the early 1970s. Most modern planning systems are not restricted to noninterleaved planning and thus can handle this anomaly. While the problem's significance is now chiefly historical, it is still useful for explaining why planning is non-trivial. In the problem, three blocks (labeled A, B, and C) rest on a table. The agent must stack the blocks such that A is atop B, which in turn is atop C. However, it may only move one block at a time. The problem starts with B on the table, C atop A, and A on the table. Noninterleaved planners typically separate the goal (stack A atop B atop C) into subgoals, such as: get A atop B (Goal 1) get B atop C (Goal 2) Suppose the planner starts by pursuing Goal 1. The straightforward solution is to move C out of the way, then move A atop B. But while this sequence accomplishes Goal 1, the agent cannot now pursue Goal 2 without undoing Goal 1, since both A and B must be moved atop C. If instead the planner starts with Goal 2, the most efficient solution is to move B. But again, the planner cannot pursue Goal 1 without undoing Goal 2. (Both orderings are traced in the sketch below.) The problem was first identified by Sussman as a part of his PhD research. Sussman (and his supervisor, Marvin Minsky) believed that intelligence requires a list of exceptions or tricks, and developed a modular planning system for "debugging" plans. See also STRIPS Automated planning Greedy algorithm Sources G.J. Sussman (1975), A Computer Model of Skill Acquisition, Elsevier Science Inc., New York, NY, USA. Book version of his PhD thesis.
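A minimal Python sketch of the two subgoal orderings. Block names follow the text; the state representation and helper functions are illustrative assumptions, not Sussman's original system.

```python
# Blocks-world model: state maps each block to what it rests on
# ('table' or another block).

def clear(state, block):
    """A block is clear if nothing rests on it."""
    return all(below != block for below in state.values())

def move(state, block, dest):
    """Move `block` onto `dest` if legal (tops only, one block at a time)."""
    assert clear(state, block), f"{block} has something on it"
    assert dest == "table" or clear(state, dest), f"{dest} is occupied"
    new = dict(state)
    new[block] = dest
    return new

start = {"A": "table", "B": "table", "C": "A"}   # C sits atop A

# Pursue Goal 1 first (A atop B): clear A, then stack it.
s = move(start, "C", "table")
s = move(s, "A", "B")
print(s["A"] == "B")   # True  -> Goal 1 achieved
print(s["B"] == "C")   # False -> Goal 2 now requires undoing Goal 1

# Pursue Goal 2 first (B atop C): stack B directly.
s = move(start, "B", "C")
print(s["B"] == "C")   # True  -> Goal 2 achieved
print(clear(s, "A"))   # False -> A is buried under C, so Goal 1
                       #          requires undoing Goal 2
```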
https://en.wikipedia.org/wiki/Tucows
Tucows Inc. is an American-Canadian publicly traded Internet services and telecommunications company headquartered in Toronto, Ontario, Canada, and incorporated in Pennsylvania, United States. The company is composed of three independent businesses: Tucows Domains, Ting Internet, and Wavelo. Originally founded in 1993 as a shareware and freeware software download site, Tucows shuttered its downloads business in 2021. Tucows Domains is the second-largest domain registrar worldwide and operates OpenSRS, Ascio, and Hover. In 2012, Tucows launched Ting Mobile, a wireless service provider and used the same brand to launch its fiber Internet provider business Ting Internet in 2015. In 2020, Tucows sold its wireless business to Dish Network, while they continued to operate Ting Internet. The billing platform Tucows built for Ting Mobile was spun off into an independent OSS/BSS SaaS business, Wavelo. The company was formed in Flint, Michigan, United States, in 1993. The Tucows logo was two cow heads, a play on the homophone "two cows". Origins Scott Swedorski, a Flint native, started working as a computer lab manager at Flint's Mott Community College in 1991. By late 1992, Swedorski left Mott College to work at the Genesee County Library System as a system administrator for FALCON (Flint Area Library Cooperative Online Network) and saw a need to bring shareware reviews to the public. In 1993 he formed TUCOWS (The Ultimate Collection Of Winsock Software) leading all editorial, reviews, HTML programming and scripting. Company history In the early 1990s, Tucows was hosted on university and public servers (much like Yahoo! and Google were in their early stages). TUCOWS' mission was to provide users with downloads of both freeware and trial versions of shareware. Internet Direct, owned and operated by John Nemanic, Bill Campbell, and Colin Campbell, acquired Tucows in 1996. STI Ventures acquired Tucows in 1999. The company employed roughly 30 employees in Flint, Michi
https://en.wikipedia.org/wiki/Administrative%20domain
An administrative domain is a service provider that holds a security repository permitting clients with credentials to be easily authenticated and authorized. This particularly applies to computer network security. The concept is captured by the 'AdminDomain' class of the GLUE information model. An administrative domain is mainly used in intranet environments. Implementation It may be implemented as a collection of hosts and routers, and the interconnecting network(s), managed by a single administrative authority. Interoperation between different administrative domains having different security repositories, different security software, or different security policies is notoriously difficult. Therefore, administrative domains wishing ad hoc interoperation or full interoperability have to build a federation.
https://en.wikipedia.org/wiki/Amazon%20Virtual%20Private%20Cloud
Amazon Virtual Private Cloud (VPC) is a commercial cloud computing service that provides a virtual private cloud by provisioning a logically isolated section of the Amazon Web Services (AWS) Cloud. Enterprise customers can access the Amazon Elastic Compute Cloud (EC2) over an IPsec-based virtual private network. Unlike traditional EC2 instances, which are allocated internal and external IP addresses by Amazon, the customer can assign IP addresses of their choosing from one or more subnets. Comparison to private clouds Amazon Virtual Private Cloud aims to provide a service similar to private clouds built using technology such as OpenStack or HPE Helion Eucalyptus. However, private clouds typically also use technology such as OpenShift application hosting and various database systems. Cloud security experts warn that there can be compliance risks, such as a loss of control or service cancellation, in using public resources, which do not exist with in-house systems. If transaction records are requested from Amazon about a VPC using a national security letter, Amazon may not be legally allowed to inform the customer of the breach of the security of their system. This would be true even if the actual VPC resources were in another country. The API used by AWS is only partly compatible with that of HPE Helion Eucalyptus and is not compatible with other private cloud systems, so migration from AWS may be difficult. This has led to warnings of the possibility of a lock-in to a specific technology. IP Addressing IP addressing in Amazon Virtual Private Cloud (VPC) refers to the assignment of IP addresses to the resources within a VPC. VPC is the Amazon Web Services (AWS) solution for providing isolated network environments for AWS resources. IP addresses in a VPC are used for communication between resources within the VPC, as well as for communication between the VPC and the Internet. There are two types of IP addresses used in a VPC: private IP addresses and public IP addresses. Private IP ad
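A short Python sketch using the boto3 AWS SDK of the addressing scheme described above: the VPC receives a CIDR block of the customer's choosing, and subnets carve it into non-overlapping ranges. The region and CIDR values are illustrative assumptions, and running this requires AWS credentials.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # example region

# The VPC's private address space: one CIDR block chosen by the customer.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Subnets take non-overlapping slices of the VPC's CIDR block.
subnet_a = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
subnet_b = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.2.0/24")

# Instances launched into these subnets receive private IPs from their
# subnet's range; a public IP is attached only if requested.
print(vpc_id, subnet_a["Subnet"]["SubnetId"], subnet_b["Subnet"]["SubnetId"])
```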
https://en.wikipedia.org/wiki/Congenital%20nephrotic%20syndrome
Congenital nephrotic syndrome is a rare kidney disease which manifests in infants during the first 3 months of life, and is characterized by high levels of protein in the urine (proteinuria), low levels of protein in the blood, and swelling. This disease is primarily caused by genetic mutations which result in damage to components of the glomerular filtration barrier and allow for leakage of plasma proteins into the urinary space. Signs and symptoms Urine protein loss leads to total body swelling (generalized edema) and abdominal distension in the first several weeks to months of life. Fluid retention may lead to cough (from pulmonary edema), ascites, and widened cranial sutures and fontanelles. High urine protein loss can lead to a foamy appearance of the urine. Infants may be born prematurely with low birth weight, and have meconium-stained amniotic fluid or a large placenta. Complications Frequent, severe infections: urinary loss of immunoglobulins Malnutrition and poor growth Blood clots (hypercoagulability): imbalance of plasma coagulation factors from urine protein loss Hypothyroidism: urinary loss of thyroid-binding protein Poor bone health associated with vitamin D deficiency: urinary loss of vitamin D binding protein Acute kidney injury Chronic kidney disease and ultimately end-stage kidney disease Causes Primary (genetic) causes Mutations in the following five genes account for greater than 80% of the genetic causes of congenital nephrotic syndrome: NPHS1 (Finnish Type): The gene NPHS1 encodes the protein nephrin. This genetic variant is characterized by severe protein loss in the first several days to weeks of life. Fin-major and Fin-minor were the first two main genetic mutations identified in Finnish newborns; however, numerous mutations have now been identified in patients all over the world from various ethnic groups. NPHS1 mutations are the most common cause of primary congenital nephrotic syndrome, accounting for 40-80% of cases. NPHS2
https://en.wikipedia.org/wiki/Universal%20quantification
In mathematical logic, a universal quantification is a type of quantifier, a logical constant which is interpreted as "given any", "for all", or "for any". It expresses that a predicate can be satisfied by every member of a domain of discourse. In other words, it is the predication of a property or relation to every member of the domain. It asserts that a predicate within the scope of a universal quantifier is true of every value of a predicate variable. It is usually denoted by the turned A (∀) logical operator symbol, which, when used together with a predicate variable, is called a universal quantifier ("∀x", "∀(x)", or sometimes by "(x)" alone). Universal quantification is distinct from existential quantification ("there exists"), which only asserts that the property or relation holds for at least one member of the domain. Quantification in general is covered in the article on quantification (logic). The universal quantifier is encoded as U+2200 ∀ FOR ALL in Unicode, and as \forall in LaTeX and related formula editors. Basics Suppose it is given that 2·0 = 0 + 0, and 2·1 = 1 + 1, and 2·2 = 2 + 2, etc. This would seem to be a logical conjunction because of the repeated use of "and". However, the "etc." cannot be interpreted as a conjunction in formal logic. Instead, the statement must be rephrased: For all natural numbers n, one has 2·n = n + n. This is a single statement using universal quantification. This statement can be said to be more precise than the original one. While the "etc." informally includes natural numbers, and nothing more, this was not rigorously given. In the universal quantification, on the other hand, the natural numbers are mentioned explicitly. This particular example is true, because any natural number could be substituted for n and the statement "2·n = n + n" would be true. In contrast, For all natural numbers n, one has 2·n > 2 + n is false, because if n is substituted with, for instance, 1, the statement "2·1 > 2 + 1" is false. It is immaterial that "2·n > 2 + n" is
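The difference between checking instances and asserting a universal statement can be seen in a short Python sketch: the built-in all() only verifies a predicate over a finite domain, whereas the quantified statement ranges over all natural numbers, so the check below is evidence, not a proof.

# Finite check of the predicate 2*n == n + n; this verifies instances,
# it does not prove the universally quantified statement over all naturals.
assert all(2 * n == n + n for n in range(10_000))

# The false claim "for all n, 2*n > 2 + n" already fails at the witness n = 1.
assert not all(2 * n > 2 + n for n in range(10_000))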
https://en.wikipedia.org/wiki/Ganoderma%20microsporum%20immunomodulatory%20protein
Ganoderma microsporum immunomodulatory protein, or GMI, is a protein discovered in the mushroom species Ganoderma microsporum. GMI is a pure protein composed of 111 amino acids and exists in nature as a tetramer. Discovery GMI is found in the mycelium of Ganoderma microsporum. During the life cycle of G. microsporum, GMI acts as an important signaling factor in the transition from the fungus's mycelium phase to the fruiting body phase. However, the levels of GMI found in both the mycelium and the fruiting body are very low. In 2005, researchers used genetic and bio-engineering methods to obtain purified GMI, and proved that the protein is structurally similar to LZ-8, the first fungal immunomodulatory protein, discovered in 1989. GMI is designated immunomodulatory because, when cultured with immune cells, it was found not only to increase the cells' hormone production, but also to induce higher levels of cellular activity.
https://en.wikipedia.org/wiki/Sedanolide
Sedanolide is a tetrahydrophthalide compound with the molecular formula C12H18O2. Sedanolide is reported to be one of the flavor constituents of celery oil from fresh celery. Isomers There are four stereoisomers. (3R,3aR)-sedanolide (3R,3aS)-sedanolide (3S,3aR)-sedanolide - Also called neocnidilide or trans-sedanolide. (3S,3aS)-sedanolide Similar compounds cnidilide 3-butylhexahydrophthalide External links The Sedanolides, John C. Leffingwell, Ph.D., Chirality & Odour Perception
https://en.wikipedia.org/wiki/Phyloscan
Phyloscan is a web service for DNA sequence analysis that is free and open to all users (without login requirement). For locating matches to a user-specified sequence motif for a regulatory binding site, Phyloscan provides a statistically sensitive scan of user-supplied mixed aligned and unaligned DNA sequence data. Phyloscan's strength is that it brings together the Staden method for computing statistical significance, the "phylogenetic motif model" scanning functionality of the MONKEY software that models evolutionary relationships among aligned sequences, the use of the Bailey & Gribskov method for combining statistics across non-aligned sequence data, and the Neuwald & Green technique for combining statistics across multiple binding sites found within a single gene promoter region.
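Phyloscan's actual implementation is not shown here; as a rough sketch of one ingredient, the product-of-independent-p-values combination (in the spirit of the Bailey & Gribskov step for non-aligned data) has a closed-form tail probability, computed below. The function name and the example p-values are illustrative assumptions.

import math

def combined_p_value(p_values):
    # Probability that the product of k independent Uniform(0,1) p-values
    # falls at or below the observed product -- the closed form behind
    # product-of-p-values combination schemes. Assumes each p is in (0, 1].
    k = len(p_values)
    product = math.prod(p_values)
    log_p = math.log(product)
    return product * sum((-log_p) ** i / math.factorial(i) for i in range(k))

print(combined_p_value([0.01, 0.20, 0.50]))  # combined significance of three scan scores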
https://en.wikipedia.org/wiki/Perfect%20matrix
In mathematics, a perfect matrix is an m-by-n binary matrix that has no possible k-by-k submatrix K satisfying all of the following conditions: k > 3; the row and column sums of K are each equal to b, where b ≥ 2; and there exists no row of the (m − k)-by-k submatrix formed by the rows not included in K with a row sum greater than b. The following is an example of a K submatrix where k = 5 and b = 2:
1 1 0 0 0
0 1 1 0 0
0 0 1 1 0
0 0 0 1 1
1 0 0 0 1
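A small sketch of checking the first two conditions on a candidate submatrix; the helper name is hypothetical, not from any library.

import numpy as np

def common_row_col_sum(K):
    # K is a k-by-k 0/1 matrix; returns the common row/column sum b when
    # k > 3 and every row and column of K sums to the same b >= 2, else None.
    k = K.shape[0]
    if k <= 3:
        return None
    row_sums, col_sums = K.sum(axis=1), K.sum(axis=0)
    b = row_sums[0]
    if b >= 2 and (row_sums == b).all() and (col_sums == b).all():
        return int(b)
    return None

K = np.array([[1, 1, 0, 0, 0],
              [0, 1, 1, 0, 0],
              [0, 0, 1, 1, 0],
              [0, 0, 0, 1, 1],
              [1, 0, 0, 0, 1]])
print(common_row_col_sum(K))  # 2, matching the example above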
https://en.wikipedia.org/wiki/Minichromosome
A minichromosome is a small chromatin-like structure resembling a chromosome and consisting of centromeres, telomeres and replication origins but little additional genetic material. They replicate autonomously in the cell during cellular division. Minichromosomes may be created by natural processes, as chromosomal aberrations, or by genetic engineering. Structure Minichromosomes can be either linear or circular pieces of DNA. By minimizing the amount of unnecessary genetic information on the chromosome and including the basic components necessary for DNA replication (centromere, telomeres, and replication sequences), molecular biologists aim to construct a chromosomal platform which can be used to insert or present new genes in a host cell. Production Producing minichromosomes by genetic engineering techniques involves two primary methods, the de novo (bottom-up) and the top-down approach. De novo The minimum constituent parts of a chromosome (centromere, telomeres, and DNA replication sequences) are assembled by using molecular cloning techniques to construct the desired chromosomal contents in vitro. Next, the desired contents of the minichromosome must be transformed into a host which is capable of assembling the components (typically yeast or mammalian cells) into a functional chromosome. This approach has been attempted for introducing minichromosomes into maize for genetic engineering, but success has been limited and questionable. In general, the de novo approach is more difficult than the top-down method due to species incompatibility issues and the heterochromatic nature of centromeric regions. Top-down This method utilizes the mechanism of telomere-mediated chromosomal truncation (TMCT): truncation is generated by the selective transformation of telomeric sequences into a host genome. This insertion causes the generation of more telomeric sequences and eventual truncation. The newly synthesized trunca
https://en.wikipedia.org/wiki/Institute%20of%20Space%20and%20Astronautical%20Science
The Institute of Space and Astronautical Science, or ISAS, is a Japanese national research organization of astrophysics using rockets, astronomical satellites and interplanetary probes which played a major role in Japan's space development. Since 2003, it has been a division of the Japan Aerospace Exploration Agency (JAXA). History The ISAS originated as part of the Institute of Industrial Science of the University of Tokyo (東京大学生産技術研究所), where Hideo Itokawa experimented with miniature solid-fuel rockets (the Pencil Rocket and the Baby Rocket) in the 1950s. This experimentation eventually led to the development of the Κ (Kappa) sounding rocket, which was used for observations during the International Geophysical Year (IGY). By 1960, the Κ-8 rocket had reached an altitude of 200 km. In 1964, the rocket group and the Institute of Aeronautics, along with the scientific ballooning team, were merged to form the Institute of Space and Aeronautical Science within the University of Tokyo. The rocket evolved into the L (Lambda) series, and, in 1970, L-4S-5 launched Japan's first artificial satellite, Ohsumi. Although Lambda rockets were only sounding rockets, the next generation of M (Mu) rockets was intended to be satellite launch vehicles from the start. Beginning in 1971, ISAS launched a series of scientific satellites to observe the ionosphere and magnetosphere. Since the launch of Hakucho in 1979, ISAS had X-ray astronomy satellites consecutively in orbit, until the run was briefly interrupted by the launch failure of ASTRO-E. In 1981, as a part of university system reform, and for mission expansion, ISAS was spun out from the University of Tokyo as an inter-university national research organization, the Institute of Space and Astronautical Science. ISAS was responsible for launching Japan's first interplanetary probes, Sakigake and Suisei, to Halley's Comet in 1985. It also launched Hiten, Japan's first lunar probe, in 1990. The Nozomi probe was launched in 1998 in an attempt to orbit Mars, but the spacecraft suffered system failures and was unable to enter orbit. In 2003, ISAS laun
https://en.wikipedia.org/wiki/Journey%20into%20Geometries
Journey into Geometries is a book on non-Euclidean geometry. It was written by Hungarian-Australian mathematician Márta Svéd, and published in 1991 by the Mathematical Association of America in their MAA Spectrum book series. Topics Journey into Geometries is written as a conversation between three characters: Alice, from Alice's Adventures in Wonderland (but older and familiar with Euclidean geometry), Lewis Carroll, the author of Alice's adventures, and a modern mathematician named "Dr. Whatif". Its topics include hyperbolic geometry, inversive geometry, and projective geometry, following an arrangement of these topics credited to Australian mathematician Carl Moppert, and possibly based on an earlier German-language textbook on similar topics by F. Gonseth and P. Marti. As in Alice's original adventures, the first part of the book is arranged as a travelogue. This part of the book has six chapters, each ending with a set of exercises. Following these chapters, more conventionally written material covers geometric axiom systems and provides solutions to the exercises. Audience and reception Reviewer William E. Fenton is unsure of the audience of the book, writing that it is not suitable as a textbook and would scare most undergraduates, but is too unserious for graduate students. David A. Thomas identifies the audience as "people who like to play with mathematical ideas". Fenton criticizes the book's style as a little too glib and lead-footed, and its illustrations as amateurish. H. W. Guggenheimer faults the treatment of projective geometry as "rather sketchy". Nevertheless Fenton writes that he found the book engrossing and well-organized, particularly praising its exercises. Both Fenton and Guggenheimer recommend the book to talented students of mathematics, and both Fenton and David A. Thomas suggest it as auxiliary reading for geometry courses.
https://en.wikipedia.org/wiki/Trilobal
In fibers, trilobal is a cross-section shape with three distinct sides. The shape is advantageous for optical reflective properties and is used in textile fibers. Silk fibers' rounded edges and triangular cross section contribute to their luster; in some cases, synthetic fibers are manufactured to mimic this trilobal shape to give them a silk-like appearance. Filaments with a round cross section have less brilliance than trilobal filaments. Etymology The term is a combination of "tri" for three and "lobal" for sides. Objective The trilobal shape helps in altering hand and increasing luster. Many synthetic fibres, such as polyester and nylon, are manufactured with a trilobal cross-sectional shape for the purpose of enhancing brilliance and changing the handle. Luster adds aesthetic value to fabrics and contributes to their attractiveness; occasionally, it also adds to their quality assessment. Use Synthetic fibers are particularly suitable for specific effects such as crimping and texturizing due to their adaptability during production. A trilobal cross section helps alter texture and several physical attributes, such as strength and static properties, in addition to providing brightness to the fibres. The trilobal cross-sectional shape also helps to reduce manufacturing defects in filaments. See also Fiber
https://en.wikipedia.org/wiki/IGHE
Ig epsilon chain C region is a protein that in humans is encoded by the IGHE gene. IGHE (immunoglobulin heavy constant epsilon), located on chromosome 14 in humans, is predicted to enable antigen binding activity and immunoglobulin receptor binding activity. It is predicted to be involved in several processes, including activation of the immune response, defense response to other organisms, and phagocytosis. The protein is predicted to be located in the extracellular region, to be part of the circulating immunoglobulin complex, and to be active on the external side of the plasma membrane. IGHE encodes the ε heavy chain constant region of the IgE antibody, and is therefore critical for the production and function of IgE in the body. Immunoglobulin E (IgE) antibodies are produced by the immune system. In a person with an allergy, the immune system overreacts to an allergen by producing IgE antibodies; these antibodies travel to cells that release chemicals, causing an allergic reaction. This reaction usually causes symptoms in the nose, lungs, throat, or on the skin. Each type of IgE has specific "radar" for one type of allergen, which is why some people are allergic only to cat dander (they carry only the IgE antibodies specific to cat dander), while others have allergic reactions to multiple allergens because they carry many more types of IgE antibodies.
https://en.wikipedia.org/wiki/Quasi-Frobenius%20ring
In mathematics, especially ring theory, the class of Frobenius rings and their generalizations are extensions of work done on Frobenius algebras. Perhaps the most important generalization is that of quasi-Frobenius rings (QF rings), which are in turn generalized by right pseudo-Frobenius rings (PF rings) and right finitely pseudo-Frobenius rings (FPF rings). Other diverse generalizations of quasi-Frobenius rings include QF-1, QF-2 and QF-3 rings. These types of rings can be viewed as descendants of algebras examined by Georg Frobenius. A partial list of pioneers in quasi-Frobenius rings includes R. Brauer, K. Morita, T. Nakayama, C. J. Nesbitt, and R. M. Thrall. Definitions A ring R is quasi-Frobenius if and only if R satisfies any of the following equivalent conditions: R is Noetherian on one side and self-injective on one side. R is Artinian on a side and self-injective on a side. All right (or all left) R modules which are projective are also injective. All right (or all left) R modules which are injective are also projective. A Frobenius ring R is one satisfying any of the following equivalent conditions. Let J = J(R) be the Jacobson radical of R. R is quasi-Frobenius and the right socle satisfies soc(RR) ≅ R/J as right R modules. R is quasi-Frobenius and the left socle satisfies soc(RR) ≅ R/J as left R modules. Both: soc(RR) ≅ R/J as right R modules, and soc(RR) ≅ R/J as left R modules. For a commutative ring R, the following are equivalent: R is Frobenius R is quasi-Frobenius R is a finite direct sum of local artinian rings which have unique minimal ideals. (Such rings are examples of "zero-dimensional Gorenstein local rings".) A ring R is right pseudo-Frobenius if any of the following equivalent conditions are met: Every faithful right R module is a generator for the category of right R modules. R is right self-injective and is a cogenerator of Mod-R. R is right self-injective and is finitely cogenerated as a right R module. R is right self-injective and a right Kasch ring. R is right self-injective, semilocal and the socle soc(RR)
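For a concrete instance of the commutative criterion (an added worked example, not from the text above): the dual numbers k[x]/(x^2) over a field k form a local artinian ring with a unique minimal ideal, hence a Frobenius ring. In LaTeX:

\[
R = k[x]/(x^{2}), \qquad J(R) = (x), \qquad \operatorname{soc}(R) = (x) \cong R/J(R) \cong k,
\]
so $R$ is quasi-Frobenius with $\operatorname{soc}(R) \cong R/J(R)$, making $R$ Frobenius; its unique minimal ideal is $(x)$, and it is a zero-dimensional Gorenstein local ring.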
https://en.wikipedia.org/wiki/Apprenticeship%20learning
In artificial intelligence, apprenticeship learning (or learning from demonstration or imitation learning) is the process of learning by observing an expert. It can be viewed as a form of supervised learning, where the training dataset consists of task executions by a demonstration teacher. Mapping function approach Mapping methods try to mimic the expert by forming a direct mapping either from states to actions, or from states to reward values. For example, in 2002 researchers used such an approach to teach an AIBO robot basic soccer skills. Inverse reinforcement learning approach Inverse reinforcement learning (IRL) is the process of deriving a reward function from observed behavior. While ordinary "reinforcement learning" involves using rewards and punishments to learn behavior, in IRL the direction is reversed, and a robot observes a person's behavior to figure out what goal that behavior seems to be trying to achieve. The IRL problem can be defined as: Given 1) measurements of an agent's behaviour over time, in a variety of circumstances; 2) measurements of the sensory inputs to that agent; 3) a model of the physical environment (including the agent's body): Determine the reward function that the agent is optimizing. IRL researcher Stuart J. Russell proposes that IRL might be used to observe humans and attempt to codify their complex "ethical values", in an effort to create "ethical robots" that might someday know "not to cook your cat" without needing to be explicitly told. The scenario can be modeled as a "cooperative inverse reinforcement learning game", where a "person" player and a "robot" player cooperate to secure the person's implicit goals, despite these goals not being explicitly known by either the person or the robot. In 2017, OpenAI and DeepMind applied deep learning to cooperative inverse reinforcement learning in simple domains such as Atari games and straightforward robot tasks such as backflips. The human role was limited to answering
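A minimal behavior-cloning sketch of the mapping-function approach, assuming expert demonstrations are available as (state, action) pairs; the synthetic data and the classifier choice are illustrative, not from the text above.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical demonstration data: each row is a state observation,
# each label is the action the expert took in that state.
rng = np.random.default_rng(0)
states = rng.normal(size=(500, 4))
actions = (states[:, 0] + states[:, 1] > 0).astype(int)  # stand-in expert policy

# Fit a direct state -> action mapping by ordinary supervised learning.
policy = LogisticRegression().fit(states, actions)

# The learned policy imitates the expert on previously unseen states.
print(policy.predict(rng.normal(size=(3, 4))))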
https://en.wikipedia.org/wiki/Causal%20inference
Causal inference is the process of determining the independent, actual effect of a particular phenomenon that is a component of a larger system. The main difference between causal inference and inference of association is that causal inference analyzes the response of an effect variable when a cause of the effect variable is changed. The study of why things occur is called etiology, and can be described using the language of scientific causal notation. Causal inference is said to provide the evidence of causality theorized by causal reasoning. Causal inference is widely studied across all sciences. Several innovations in the development and implementation of methodology designed to determine causality have proliferated in recent decades. Causal inference remains especially difficult where experimentation is difficult or impossible, which is common throughout most sciences. The approaches to causal inference are broadly applicable across all types of scientific disciplines, and many methods of causal inference that were designed for certain disciplines have found use in other disciplines. This article outlines the basic process behind causal inference and details some of the more conventional tests used across different disciplines; however, this should not be mistaken as a suggestion that these methods apply only to those disciplines, merely that they are the most commonly used in that discipline. Causal inference is difficult to perform and there is significant debate amongst scientists about the proper way to determine causality. Despite other innovations, there remain concerns of misattribution by scientists of correlative results as causal, of the usage of incorrect methodologies by scientists, and of deliberate manipulation by scientists of analytical results in order to obtain statistically significant estimates. Particular concern is raised in the use of regression models, especially linear regression models. Definition Inferring the cause of something ha
https://en.wikipedia.org/wiki/Durbin%E2%80%93Wu%E2%80%93Hausman%20test
The Durbin–Wu–Hausman test (also called Hausman specification test) is a statistical hypothesis test in econometrics named after James Durbin, De-Min Wu, and Jerry A. Hausman. The test evaluates the consistency of an estimator when compared to an alternative, less efficient estimator which is already known to be consistent. It helps one evaluate if a statistical model corresponds to the data. Details Consider the linear model y = Xb + e, where y is the dependent variable, X is the vector of regressors, b is a vector of coefficients and e is the error term. We have two estimators for b: b0 and b1. Under the null hypothesis, both of these estimators are consistent, but b1 is efficient (has the smallest asymptotic variance), at least in the class of estimators containing b0. Under the alternative hypothesis, b0 is consistent, whereas b1 is not. Then the Wu–Hausman statistic is: H = (b1 − b0)′ (Var(b0) − Var(b1))† (b1 − b0), where † denotes the Moore–Penrose pseudoinverse. Under the null hypothesis, this statistic has asymptotically the chi-squared distribution with the number of degrees of freedom equal to the rank of the matrix Var(b0) − Var(b1). If we reject the null hypothesis, it means that b1 is inconsistent. This test can be used to check for the endogeneity of a variable (by comparing instrumental variable (IV) estimates to ordinary least squares (OLS) estimates). It can also be used to check the validity of extra instruments by comparing IV estimates using a full set of instruments Z to IV estimates that use a proper subset of Z. Note that in order for the test to work in the latter case, we must be certain of the validity of the subset of Z and that subset must have enough instruments to identify the parameters of the equation. Hausman also showed that the covariance between an efficient estimator and the difference of an efficient and inefficient estimator is zero. Derivation Assuming joint normality of the estimators. Consider the function : By the delta method Using the commonly used result, showed by Haus
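A sketch of computing the statistic from two fitted estimators, assuming their coefficient vectors and covariance estimates are already available; the function name and numbers are placeholders for illustration.

import numpy as np
from scipy.stats import chi2

def hausman(b1, b0, V1, V0):
    # b1: efficient-under-H0 estimate (e.g. OLS); b0: consistent estimate (e.g. IV);
    # V1, V0: their estimated covariance matrices (V0 - V1 should be PSD under H0).
    diff = b1 - b0
    V = V0 - V1
    stat = float(diff @ np.linalg.pinv(V) @ diff)  # Moore-Penrose pseudoinverse
    dof = np.linalg.matrix_rank(V)
    return stat, dof, chi2.sf(stat, dof)

# Hypothetical numbers purely for illustration.
stat, dof, p = hausman(np.array([1.0, 0.5]), np.array([1.2, 0.4]),
                       np.diag([0.01, 0.02]), np.diag([0.03, 0.05]))
print(stat, dof, p)  # reject H0 for small p: the efficient estimator is inconsistent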
https://en.wikipedia.org/wiki/William%20Farquhar%20Collection%20of%20Natural%20History%20Drawings
The William Farquhar Collection of Natural History Drawings consists of 477 watercolour botanical drawings of plants and animals of Malacca and Singapore by unknown Chinese (probably Cantonese) artists that were commissioned between 1819 and 1823 by William Farquhar (26 February 1774 – 13 May 1839). The paintings were meant to be of scientific value, with very detailed drawings, except for those of the birds, which have text going beyond their original purpose. For each drawing, the scientific and/or common name of the specimen in Malay, and, occasionally, in English, was written in pencil. A translator also penned the Malay names in Jawi using ink. The paper used was normally European paper framed by a blue frame, while some drawings have no frames at all, suggesting there were two artists. History Most of the drawings were made in Malacca, before Farquhar's tenure as Resident and Commandant of Singapore. On the basis that Raffles hired a Chinese artist from Macao to do natural history drawings (as described by Munshi Abdullah), John Bastin conjectured that this artist and perhaps others were hired by Farquhar to paint watercolours of the flora and fauna of the Malay Peninsula. Kwa Chong Guan extended this conjecture via stylistic comparison to identify the artists of Farquhar's drawings as being from Guangdong. Farquhar donated the drawings in eight volumes to the Museum of the Royal Asiatic Society of Great Britain and Ireland on 17 June 1826. In 1937, the Society lent six of the volumes to the Library of the British Museum (Natural History) (now the Library of the Natural History Museum), retaining the two volumes of botanical drawings in its own library. In 1991 the Natural History Museum returned the works to the Society for valuation, and on 20 October 1993 the Society offered them for sale by auction at Sotheby's in London, where they were acquired by Goh Geok Khim, founder of the brokerage firm GK Goh, for S$3 million. Goh donated the drawings to the National Museum of Singapore in 1995.
https://en.wikipedia.org/wiki/Needle%20Tower
Needle Tower is a public artwork by American sculptor Kenneth Snelson, located outside the Hirshhorn Museum and Sculpture Garden in Washington, D.C., United States. Description This 26.5-meter-tall abstract sculpture is a tapering tower made of aluminum and stainless steel. The aluminum tubes act in compression and are held in tension by the stainless steel cables threaded through the ends of the tubes. Acquisition The piece was a gift of Joseph Hirshhorn in 1974. Tensegrity Snelson's unique sculptural style is well articulated in Needle Tower. The structural style displayed is known as "tensegrity", a description given by Snelson's former professor Buckminster Fuller to the melding of tension and structural integrity. According to Snelson: Tensegrity describes a closed structural system composed of a set of three or more elongate compression struts within a network of tension tendons, the combined parts mutually supportive in such a way that the struts do not touch one another, but press outwardly against nodal points in the tension network to form a firm, triangulated, prestressed, tension and compression unit. Symbolism Much has been said about the geometric shapes seen in Snelson's works. Looking up from the inside of Needle Tower, one may see the Star of David. According to Snelson, his works are not symbolic, and it is common to see six-pointed stars in his work. In Needle Tower the six-pointedness comes from the natural geometry of the three compression struts that make up each layer. Sets of three alternate with left and right helical modules, adding up to six when viewed upward from the base of the tower. Conservation In April 2010 conservation work on the sculpture was completed by the Hirshhorn Museum. It took 15 staff members to stand the tower upright after the conservation was completed. Needle Tower II A second Needle Tower, Needle Tower II, was completed in 1968 and was acquired by the Kröller-Müller Museum in 1971. The piece resides in the museum's
https://en.wikipedia.org/wiki/Transplant%20rejection
Transplant rejection occurs when transplanted tissue is rejected by the recipient's immune system, which destroys the transplanted tissue. Transplant rejection can be lessened by determining the molecular similitude between donor and recipient and by the use of immunosuppressant drugs after transplant. Types of transplant rejection Transplant rejection can be classified into three types: hyperacute, acute, and chronic. These types are differentiated by how quickly the recipient's immune system is activated and the specific aspect or aspects of immunity involved. Hyperacute rejection Hyperacute rejection is a form of rejection that manifests itself in the minutes to hours following transplantation. It is caused by the presence of pre-existing antibodies in the recipient that recognize antigens in the donor organ. These antigens are located on the endothelial lining of blood vessels within the transplanted organ and, once antibodies bind, will lead to the rapid activation of the complement system. Irreversible damage via thrombosis and subsequent graft necrosis is to be expected. Tissue left implanted will fail to work and could lead to high fever and malaise as the immune system acts against the foreign tissue. Graft failure secondary to hyperacute rejection has significantly decreased in incidence as a result of improved pre-transplant screening for antibodies to donor tissues. While these preformed antibodies may result from prior transplants, prior blood transfusions, or pregnancy, hyperacute rejection is most commonly caused by antibodies to ABO blood group antigens. Consequently, transplants between individuals with differing ABO blood types are generally avoided, though they may be pursued in very young children (generally under 12 months, but often as old as 24 months) who do not have fully developed immune systems. Shortages of organs and the morbidity and mortality associated with being on transplant waitlists have also increased interest in ABO-incompatible transplantation in o
https://en.wikipedia.org/wiki/Banach%E2%80%93Mazur%20compactum
In the mathematical study of functional analysis, the Banach–Mazur distance is a way to define a distance on the set of n-dimensional normed spaces. With this distance, the set of isometry classes of n-dimensional normed spaces becomes a compact metric space, called the Banach–Mazur compactum. Definitions If X and Y are two finite-dimensional normed spaces with the same dimension, let GL(X, Y) denote the collection of all linear isomorphisms T : X → Y. Denote by ‖T‖ the operator norm of such a linear map, i.e. the maximum factor by which it "lengthens" vectors. The Banach–Mazur distance between X and Y is defined by δ(X, Y) = log( inf { ‖T‖·‖T⁻¹‖ : T ∈ GL(X, Y) } ). We have δ(X, Y) = 0 if and only if the spaces X and Y are isometrically isomorphic. Equipped with the metric δ, the space of isometry classes of n-dimensional normed spaces becomes a compact metric space, called the Banach–Mazur compactum. Many authors prefer to work with the multiplicative Banach–Mazur distance d(X, Y) = inf { ‖T‖·‖T⁻¹‖ : T ∈ GL(X, Y) }, for which d(X, Z) ≤ d(X, Y)·d(Y, Z) and d(X, X) = 1. Properties F. John's theorem on the maximal ellipsoid contained in a convex body gives the estimate d(X, ℓ₂ⁿ) ≤ √n, where ℓ₂ⁿ denotes Rⁿ with the Euclidean norm (see the article on ℓp spaces). From this it follows that d(X, Y) ≤ n for all n-dimensional X and Y. However, for the classical spaces, this upper bound for the diameter of the compactum is far from being approached. For example, the distance between ℓ₁ⁿ and ℓ∞ⁿ is (only) of order √n (up to a multiplicative constant independent from the dimension n). A major achievement in the direction of estimating the diameter of the compactum is due to E. Gluskin, who proved in 1981 that the (multiplicative) diameter of the Banach–Mazur compactum is bounded below by c·n for some universal constant c > 0. Gluskin's method introduces a class of random symmetric polytopes P(ω) in Rⁿ and the normed spaces X(ω) having P(ω) as unit ball (the vector space is Rⁿ and the norm is the gauge of P(ω)). The proof consists in showing that the required estimate is true with large probability for two independent copies of the normed space X(ω). The Banach–Mazur compactum is an absolute extensor. On the other hand, it is not homeomorphic to a Hilbert cube. See also Notes
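A small worked case (an added example, not from the text above): in dimension n = 2 the spaces ℓ₁² and ℓ∞² are in fact isometric, so their multiplicative distance is exactly 1; the linear map below witnesses the isometry. In LaTeX:

\[
T(x,y) = (x+y,\; x-y), \qquad
\|T(x,y)\|_{\infty} = \max(|x+y|,\,|x-y|) = |x| + |y| = \|(x,y)\|_{1},
\]
so $T$ maps $\ell_1^2$ isometrically onto $\ell_\infty^2$ and $d(\ell_1^2, \ell_\infty^2) = 1$; the growth of order $\sqrt{n}$ between $\ell_1^n$ and $\ell_\infty^n$ only appears in higher dimensions.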
https://en.wikipedia.org/wiki/Type%20species
In zoological nomenclature, a type species (species typica) is the species name with which the name of a genus or subgenus is considered to be permanently taxonomically associated, i.e., the species that contains the biological type specimen (or specimens). A similar concept is used for suprageneric groups and called a type genus. In botanical nomenclature, these terms have no formal standing under the code of nomenclature, but are sometimes borrowed from zoological nomenclature. In botany, the type of a genus name is a specimen (or, rarely, an illustration) which is also the type of a species name. The species name with that type can also be referred to as the type of the genus name. Names of genus and family ranks, the various subdivisions of those ranks, and some higher-rank names based on genus names, have such types. In bacteriology, a type species is assigned for each genus. Whether or not currently recognized as valid, every named genus or subgenus in zoology is theoretically associated with a type species. In practice, however, there is a backlog of untypified names defined in older publications when it was not required to specify a type. Use in zoology A type species is both a concept and a practical system that is used in the classification and nomenclature (naming) of animals. The "type species" represents the reference species and thus the "definition" for a particular genus name. Whenever a taxon containing multiple species must be divided into more than one genus, the type species automatically assigns the name of the original taxon to one of the resulting new taxa, the one that includes the type species. The term "type species" is regulated in zoological nomenclature by article 42.3 of the International Code of Zoological Nomenclature, which defines a type species as the name-bearing type of the name of a genus or subgenus (a "genus-group name"). In the Glossary, a type species is defined as the nominal species that is the name-bearing type of a nominal genus or subgenus. The type species permanently attaches a formal name (the ge
https://en.wikipedia.org/wiki/Orbital%20Insight
Orbital Insight is a Palo Alto, California-based geospatial analytics company. The company analyzes satellite, drone, balloon and other unmanned aerial vehicle images, including cell phone geolocation data, to study a range of human activity, and provides business and other strategic insights from the data. James Crawford is the company's founder and CEO. History Orbital Insight was founded in 2013 by James Crawford, who earlier worked with artificial intelligence systems at Bell Labs, with Google Books and with NASA's Mars rover project. Crawford saw an opportunity to combine commercial and government satellite images with government image sets. The company's first project was analyzing the health of corn crops. In 2015, the company partnered with the World Bank to improve its poverty data, using building height and rooftop material analysis to approximate wealth. In 2016, the US intelligence community's research arm, In-Q-Tel, and Google Ventures (GV), along with CME Group's investment arm CME Ventures, invested in the company, joining previous investors Sequoia Capital, Lux Capital and Bloomberg Beta. In May 2017, the company closed a $50 million series C round from Sequoia Capital, making it reportedly one of the most capitalized companies in the geospatial analytics industry. In October, it was reported that Orbital Insight was working with the US Department of Defense's Defense Innovation Unit (DIU) group to develop and apply advanced algorithms to extract insights from images obtained using prototype commercial Synthetic Aperture Radar (SAR) microsatellites. One goal was to improve imagery applications in poor weather or lighting conditions, for better identification of natural and manmade threats. In October, Orbital Insight partnered with commercial space imagery company DigitalGlobe to extract insights from DigitalGlobe's satellite imagery. In June 2018, Orbital Insight partnered with e-GEOS, S.p.A., a joint venture between European spaceflight s
https://en.wikipedia.org/wiki/Gaston%20Tarry
Gaston Tarry (27 September 1843 – 21 June 1913) was a French mathematician. Born in Villefranche de Rouergue, Aveyron, he studied mathematics at high school before joining the civil service in Algeria. He pursued mathematics as an amateur. In 1901 Tarry confirmed Leonhard Euler's conjecture that no 6×6 Graeco-Latin square was possible (the 36 officers problem). See also List of amateur mathematicians Prouhet-Tarry-Escott problem Tarry point Tetramagic square
https://en.wikipedia.org/wiki/Nearly%20free%20electron%20model
In solid-state physics, the nearly free electron model (or NFE model and quasi-free electron model) is a quantum mechanical model of the physical properties of electrons that can move almost freely through the crystal lattice of a solid. The model is closely related to the more conceptual empty lattice approximation. The model enables understanding and calculation of electronic band structures, especially of metals. This model is an immediate improvement of the free electron model, in which the metal was considered as a non-interacting electron gas and the ions were neglected completely. Mathematical formulation The nearly free electron model is a modification of the free-electron gas model which includes a weak periodic perturbation meant to model the interaction between the conduction electrons and the ions in a crystalline solid. This model, like the free-electron model, does not take into account electron–electron interactions; that is, the independent electron approximation is still in effect. As shown by Bloch's theorem, introducing a periodic potential into the Schrödinger equation results in a wave function of the form ψ_k(r) = e^(ik·r) u_k(r), where the function u_k has the same periodicity as the lattice: u_k(r) = u_k(r + T) (where T is a lattice translation vector). Because it is a nearly free electron approximation we can assume that u_k(r) ≈ 1/√Ω, where Ω denotes the volume of states of fixed radius (as described in Gibbs paradox). A solution of this form can be plugged into the Schrödinger equation, resulting in the central equation: (λ_k − ε) C_k + Σ_G U_G C_(k−G) = 0, where the kinetic energy λ_k is given by λ_k = ħ²k²/2m, which, after dividing by ħ²/2m, reduces to λ_k = k² if we assume that u_k is almost constant and ħ²/2m = 1. The reciprocal parameters C_k and U_G are the Fourier coefficients of the wave function ψ(r) and the screened potential energy U(r), respectively: U(r) = Σ_G U_G e^(iG·r), ψ(r) = Σ_k C_k e^(ik·r). The vectors G are the reciprocal lattice vectors, and the discrete values of k are determined by the boundary conditions of the lattice under consideration. In any perturbation analysis, one must consider the base case to whi
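A sketch of the standard two-plane-wave consequence of the central equation near a zone boundary (a textbook result assumed here, not derived in the text above): retaining only the coefficients C_k and C_(k−G) gives a 2x2 problem whose diagonalization lifts the degeneracy at k = G/2 and opens a gap of 2|U_G|.

import numpy as np

# Two-plane-wave nearly-free-electron bands in 1D, with hbar^2/2m set to 1.
G = 2 * np.pi          # reciprocal lattice vector for lattice constant a = 1
U_G = 0.5              # strength of the single retained Fourier component (assumed value)

k = np.linspace(0.0, G, 400)
lam1, lam2 = k**2, (k - G) ** 2                    # free-electron kinetic energies
mean, half_diff = (lam1 + lam2) / 2, (lam1 - lam2) / 2
lower = mean - np.sqrt(half_diff**2 + U_G**2)      # lower band
upper = mean + np.sqrt(half_diff**2 + U_G**2)      # upper band

# At the zone boundary k = G/2 the two bands are separated by 2|U_G|.
i = np.argmin(np.abs(k - G / 2))
print(upper[i] - lower[i], 2 * abs(U_G))           # approximately equal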
https://en.wikipedia.org/wiki/Andrey%20Yershov
Andrey Petrovich Yershov (Андрей Петрович Ершов; 19 April 1931, Moscow – 8 December 1988, Moscow) was a Soviet computer scientist, notable as a pioneer in systems programming and programming language research. Donald Knuth considers him to have independently co-discovered the idea of hashing with linear probing. He also created one of the first algorithms for compiling arithmetic expressions. He was responsible for the languages ALPHA and Rapira, the first Soviet time-sharing system AIST-0, the electronic publishing system RUBIN, and the multiprocessing workstation MRAMOR. He also initiated the development of the Computer Bank of the Russian Language (Машинный Фонд Русского Языка), the Soviet project for creating a large representative Russian corpus, comparable in the 1980s to the Bank of English and the British National Corpus. The Russian National Corpus created by the Russian Academy of Sciences in the 2000s is a successor of Yershov's project. From 1959, he worked at the Siberian Division of the Academy of Sciences of the Soviet Union, and helped found both the Novosibirsk Computer Center and the Siberian School of Computer Science. He received the Academician A. N. Krylov Prize from the Academy of Sciences, the first programmer to be so recognized. In 1974, he was made a Distinguished Fellow of the British Computer Society. He was involved with developing international standards in programming and informatics, as a member of the International Federation for Information Processing (IFIP) Working Group 2.1 on Algorithmic Languages and Calculi, which specified, maintains, and supports the languages ALGOL 60 and ALGOL 68. In 1981, he received the IFIP's Silver Core Award. To the computer science community, he is mostly known for his speech Aesthetics and the Human Factor in Programming, presented at the dinner of the AFIPS Spring Joint Computer Conference in 1972 and, due to its importance, republished as an article by the Communications of the ACM. See also List of Ru
https://en.wikipedia.org/wiki/Anti-Life%20Equation
The Anti-Life Equation is a fictional concept appearing in American comic books published by DC Comics. In Jack Kirby's Fourth World setting, the Anti-Life Equation is a formula for total control over the minds of sentient beings that is sought by Darkseid, who, for this reason, sends his forces to Earth, as he believes part of the equation exists in the subconsciousness of humanity. Various comics have defined the equation in different ways, but a common interpretation is that the equation may be seen as a mathematical proof of the futility of living, or of life as incarceration of spirit, per predominant religious and modern cultural suppositions. History Jack Kirby's original comics established the Anti-Life Equation as giving the being who learns it power to dominate the will of all sentient and sapient races. It is called the Anti-Life Equation because "if someone possesses absolute control over you — you're not really alive". Most stories featuring the Equation use this concept. The Forever People's Mother Box found the Anti-Life Equation in Sonny Sumo, but Darkseid, unaware of this, stranded him in ancient Japan. A man known as Billion-Dollar Bates had control over the Equation's power even without the Mother Box's aid, but was accidentally killed by one of his own guards. When Metron and Swamp Thing attempt to breach the Source, which drives Swamp Thing temporarily mad, Darkseid discovers that part of the formula is love. Upon being told by the Dominators of their planned invasion of Earth, Darkseid promises not to interfere on the condition that the planet is not destroyed so his quest for the equation is not thwarted. It is later revealed in Martian Manhunter (vol. 2) #33 that Darkseid first became aware of the equation approximately 300 years ago when he made contact with the people of Mars. Upon learning of the Martian philosophy that free will and spiritual purpose could be defined by a Life Equation, Darkseid postulated that there must exist a negat
https://en.wikipedia.org/wiki/Cunningham%27s%20rule
In mathematical optimization, Cunningham's rule (also known as the least recently considered rule or round-robin rule) is an algorithmic refinement of the simplex method for linear optimization. The rule was proposed in 1979 by W. H. Cunningham to defeat the deformed hypercube constructions of Klee, Minty, and others (see, e.g., the Klee–Minty cube). Cunningham's rule assigns a cyclic order to the variables and remembers the last variable to enter the basis. The next entering variable is chosen to be the first allowable candidate starting from the last chosen variable and following the given circular order, as sketched below. History-based rules defeat the deformed hypercube constructions because they tend to average out how many times a variable pivots. It has recently been shown by David Avis and Oliver Friedmann that there is a family of linear programs on which the simplex algorithm equipped with Cunningham's rule requires exponential time.
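A sketch of the selection step only (the surrounding simplex machinery is assumed, and the variable names are illustrative): scan the cyclically ordered variables starting just after the last one that entered, and pick the first allowable candidate.

def cunningham_entering(n_vars, last_entered, allowable):
    # n_vars: number of variables in the fixed cyclic order 0..n_vars-1.
    # last_entered: index of the variable that entered the basis most recently.
    # allowable: set of indices currently eligible to enter (e.g. negative reduced cost).
    for step in range(1, n_vars + 1):
        candidate = (last_entered + step) % n_vars
        if candidate in allowable:
            return candidate
    return None  # no allowable variable: the current solution is optimal

print(cunningham_entering(6, last_entered=4, allowable={1, 2}))  # 1: wraps past 5 and 0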
https://en.wikipedia.org/wiki/Kaye%20A.%20de%20Ruiz
Kaye A. de Ruiz is a mathematician and educator who has spent the majority of her career teaching calculus and statistics at the United States Air Force Academy. Education De Ruiz received her Bachelor of Science degree from Southern Oregon College and a Master of Science degree from Oregon State University. While serving on the Air Force Academy faculty as an instructor, she completed a Ph.D. in applied statistics at the University of California, Riverside in 1990; her dissertation, A Mathematical Model for a Paired Comparison Experiment on a Continuum of Response, was jointly supervised by Robert J. Beaver and Barry Arnold. Teaching career De Ruiz held her first teaching position at Roseburg High School in Roseburg, Oregon, where she focused on connecting mathematical concepts to their practical uses by inviting local professionals into her classroom. She also taught adult classes at Misawa Air Force Base in Japan and, beginning in 1982, taught at the United States Air Force Academy, serving as the course director for the differential calculus course there. She also taught several courses in statistics while at the Academy, maintaining her emphasis on how students can apply mathematics to solve problems encountered outside of the classroom. De Ruiz also spent some time as the chief of the statistics division at the Academy. Louise Hay Award De Ruiz was the 1994 recipient of the Louise Hay Award of the Association for Women in Mathematics for her contributions to mathematics education.
https://en.wikipedia.org/wiki/Load-loss%20factor
Load-loss factor (also loss load factor, LLF, or simply loss factor) is a dimensionless ratio between the average and peak values of load loss (loss of electric power between the generator and the consumer in electricity distribution). Since the losses in the wires are proportional to the square of the current (and thus the square of the power), the LLF can be calculated by measuring the square of the delivered power over a short interval of time (typically half an hour), calculating the average of these values over a long period (a year), and dividing by the square of the peak power exhibited during the same long period: LLF = (1/N · Σ_(i=1..N) P_i²) / P_peak², where N is the total number of short intervals (there are 8760 hours, or 17,520 half-hours, in a year); P_i is the load experienced during the short interval i; P_peak is the peak load within the long interval (typically a year). The LLF value naturally depends on the load profile. For electricity utilities, numbers of about 0.2–0.3 are typical (cf. 0.22 for Toronto Hydro, 0.33 for New Zealand). Multiple empirical formulae exist that relate the loss factor to the load factor (Dickert et al. in 2009 listed nine). Similarly, the ratio between the average and the peak current is called the form coefficient k, or peak responsibility factor k; its typical value is between 0.2 and 0.8 for distribution networks and between 0.8 and 0.95 for transmission networks. The coefficient k describes the losses as an additional load carried by the system, and is named the loss equivalent load factor in Japan. See also Line Loss Factor, a regulatory definition of the line loss in the UK
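A direct sketch of the formula on a half-hourly load profile; the synthetic profile below is an assumption purely for illustration.

import numpy as np

rng = np.random.default_rng(1)
load = rng.uniform(40.0, 100.0, size=17_520)   # synthetic half-hourly loads for one year, in MW

# Loss factor: mean of the squared loads divided by the squared peak load.
llf = np.mean(load**2) / np.max(load) ** 2
# Load factor, for comparison: mean load over peak load.
lf = np.mean(load) / np.max(load)
print(round(llf, 3), round(lf, 3))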
https://en.wikipedia.org/wiki/McCarthy%20Formalism
In computer science and recursion theory the McCarthy Formalism (1963) of computer scientist John McCarthy clarifies the notion of recursive functions by use of the IF-THEN-ELSE construction common to computer science, together with four of the operators of primitive recursive functions: zero, successor, equality of numbers and composition. The conditional operator replaces both primitive recursion and the mu-operator. Introduction McCarthy's notion of conditional expression McCarthy (1960) described his formalism this way: "In this article, we first describe a formalism for defining functions recursively. We believe this formalism has advantages both as a programming language and as a vehicle for developing a theory of computation.... We shall need a number of mathematical ideas and notations concerning functions in general. Most of the ideas are well known, but the notion of conditional expression is believed to be new, and the use of conditional expressions permits functions to be defined recursively in a new and convenient way." Minsky's explanation of the "formalism" In his 1967 Computation: Finite and Infinite Machines, Marvin Minsky in his § 10.6 Conditional Expressions: The McCarthy Formalism describes the "formalism" as follows: "Practical computer languages do not lend themselves to formal mathematical treatment--they are not designed to make it easy to prove theorems about the procedures they describe. In a paper by McCarthy [1963] we find a formalism that enhances the practical aspect of the recursive-function concept, while preserving and improving its mathematical clarity. ¶ McCarthy introduces "conditional expressions" of the form f = (if p1 then e1 else e2) where the ei are expressions and p1 is a statement (or equation) that may be true or false. ¶ This expression means See if p1 is true; if so the value of f is given by e1. IF p1 is false, the value of f is given by e2. This conditional expression . . . has also the power of the minimizat
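Python's conditional expression has exactly McCarthy's (if p then e1 else e2) shape, so the flavor of the formalism can be sketched directly; note that the arithmetic operators below are conveniences for brevity, not McCarthy's primitive operator set of zero, successor, equality and composition.

# Recursive definitions in McCarthy's style: a conditional expression
# plus self-reference, with no explicit primitive-recursion scheme.
def add(a, b):
    # add(a, b) = (if b = 0 then a else add(successor(a), predecessor(b)))
    return a if b == 0 else add(a + 1, b - 1)

def fact(n):
    # fact(n) = (if n = 0 then 1 else n * fact(n - 1))
    return 1 if n == 0 else n * fact(n - 1)

print(add(3, 4), fact(5))  # 7 120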
https://en.wikipedia.org/wiki/Computational%20Complexity%20Conference
The Computational Complexity Conference (CCC) is an academic conference in the field of theoretical computer science whose roots date to 1986. It fosters research in computational complexity theory, and is typically held annually between mid-May and mid-July in North America or Europe. As of 2015, CCC is organized independently by the Computational Complexity Foundation (CCF). History CCC was first organized in 1986 under the name "Structure in Complexity Theory Conference" (Structures) with support from the US National Science Foundation. The conference was sponsored by the IEEE Computer Society Technical Committee on Mathematical Foundations of Computing from 1987 to 2014. In 1996, the conference was renamed the "Annual IEEE Conference on Computational Complexity", hence establishing the current acronym "CCC". In 2014, a movement towards independence and open access proceedings led to the establishment of the Computational Complexity Foundation (CCF). Since 2015, CCF has organized the conference independently under the name Computational Complexity Conference (CCC), and publishes open access proceedings via LIPIcs. Future and past conference websites, as well as past programs and calls for papers, are archived online. Scope CCC broadly targets research in computational complexity theory. This currently includes (but is not limited to) the study of models of computation ranging from deterministic to quantum to algebraic, as well as resource constraints such as time, randomness, input queries, etc. Logistics CCC is held annually between mid-May and mid-July, with a scientific program running approximately three days. The conference is single-track. Activities in addition to the scientific program typically include an opening reception, a rump session, and a business meeting. Awards CCC annually confers up to two awards: a "Best Student Paper Award", aimed at papers authored solely by students, and (since 2001) a "Best Paper Award", given to the most
https://en.wikipedia.org/wiki/International%20Conference%20on%20Automated%20Reasoning%20with%20Analytic%20Tableaux%20and%20Related%20Methods
The International Conference on Automated Reasoning with Analytic Tableaux and Related Methods (TABLEAUX) is an annual international academic conference that deals with all aspects of automated reasoning with analytic tableaux. Periodically, it joins with CADE and TPHOLs into the International Joint Conference on Automated Reasoning (IJCAR). The first TABLEAUX convened in 1992. Since 1995, the proceedings of this conference have been published in Springer's LNAI series. In August 2006 TABLEAUX was part of the Federated Logic Conference in Seattle, USA. Subsequent meetings were held in Aix-en-Provence, France, in 2007 (TABLEAUX 2007); Sydney, Australia (as part of IJCAR 2008); Oslo, Norway (TABLEAUX 2009); Edinburgh, UK (as part of IJCAR 2010); Bern, Switzerland, 4–8 July 2011 (TABLEAUX 2011); Manchester, United Kingdom (as part of IJCAR 2012); Nancy, France, 16–19 September 2013 (TABLEAUX 2013); and Vienna, Austria, 19–22 July 2014 (as part of IJCAR 2014). External links TABLEAUX home page
https://en.wikipedia.org/wiki/CopperLicht
CopperLicht is an open-source JavaScript library for creating games and interactive 3D applications using WebGL, developed by Ambiera. The aim of the library is to provide an API that makes it easier to develop 3D content for the web. It is intended to be used together with its commercial 3D world editor CopperCube, but it can also be used without it. History In February 2010, Ambiera introduced CopperLicht and showcased it with a demo website showing a Quake III Arena level rendered in real time in the browser window. The library was originally intended to be used as a WebGL backend for the CopperCube editor, but the developers then decided to make the library freely available to the public. In November 2014, CopperLicht was made free and open source under a zlib-style license. Features CopperLicht includes the following features: 3D rendering based on a hierarchical scene graph Pre-created materials and shaders, including pre-calculated lightmap support Skeletal animation Built-in collision detection and a simple physics engine Dynamic light support System for creating and using custom shaders and materials based on the OpenGL Shading Language (GLSL) Impostors such as billboards and skyboxes Paths and splines Behavior and action system Texture animation Vertex color support Integrated 2D font and 2D primitives rendering system Automatic redraw reduction system See also List of WebGL frameworks WebGL Canvas element CopperCube
https://en.wikipedia.org/wiki/Microbial%20food%20web
The microbial food web refers to the combined trophic interactions among microbes in aquatic environments. These microbes include viruses, bacteria, algae, and heterotrophic protists (such as ciliates and flagellates). In aquatic environments, microbes constitute the base of the food web. Single-celled photosynthetic organisms such as diatoms and cyanobacteria are generally the most important primary producers in the open ocean. Many of these cells, especially cyanobacteria, are too small to be captured and consumed by small crustaceans and planktonic larvae. Instead, these cells are consumed by phagotrophic protists, which are readily consumed by larger organisms. Viruses can infect and break open bacterial cells and, to a lesser extent, planktonic algae (a.k.a. phytoplankton). Therefore, viruses in the microbial food web act to reduce the population of bacteria and, by lysing bacterial cells, release particulate and dissolved organic carbon (DOC). DOC may also be released into the environment by algal cells. One reason phytoplankton release DOC, termed "unbalanced growth", arises when essential nutrients (e.g. nitrogen and phosphorus) are limiting. In that case, the carbon produced during photosynthesis is not used for the synthesis of proteins (and subsequent cell growth), because the nutrients necessary for these macromolecules are lacking. The excess photosynthate, or DOC, is then released, or exuded. The microbial loop describes a pathway in the microbial food web in which DOC is returned to higher trophic levels via incorporation into bacterial biomass. See also Microbial cooperation Microbial intelligence Microbial population biology
https://en.wikipedia.org/wiki/Matrix%20metalloproteinase%20inhibitor
A matrix metalloproteinase inhibitor (MMPI, INN stem -mastat) inhibits matrix metalloproteinases. Because they inhibit cell migration, they have antiangiogenic effects. They may be either endogenous or exogenous. The best-known endogenous inhibitors are the tissue inhibitors of metalloproteinases (TIMPs). There are also cartilage-derived angiogenesis inhibitors. Exogenous matrix metalloproteinase inhibitors were developed as anticancer drugs. Examples include: Batimastat Cipemastat Ilomastat Marimastat MMI270 Prinomastat Rebimastat Ro 28-2653 Tanomastat Metalloproteinase inhibitors are found in numerous marine organisms, including fish, cephalopods, mollusks, algae, and bacteria. See also Drug discovery and development of MMP inhibitors
https://en.wikipedia.org/wiki/Rational%20design
In chemical biology and biomolecular engineering, rational design (RD) is an umbrella term for the strategy of creating new molecules with a certain functionality, based upon the ability to predict, through physical models, how the molecule's structure (specifically, structure derived from motifs) will affect its behavior. This can be done either from scratch or by making calculated variations on a known structure, and it usually complements directed evolution. Applications As examples, rational design is used to decipher collagen stability, map ligand-receptor interactions, unveil protein folding and dynamics, and create extra-biological structures using fluorinated amino acids. To treat cancer, rational design is used for targeted therapies in which proteins are engineered to modify the communication of cells with their environment. There is also the rational design of alpha-alkyl auxin molecules, which are auxin analogs capable of binding and blocking the formation of the hormone-receptor complex. Other applications of rational design include: Protein design Nucleic acid design Drug design Pathway design
https://en.wikipedia.org/wiki/Shifting%20baseline
A shifting baseline (also known as a sliding baseline) is a type of change to how a system is measured, usually against previous reference points (baselines), which themselves may represent significant changes from an even earlier state of the system. The concept arose in landscape architect Ian McHarg's 1969 manifesto Design With Nature, in which the modern landscape is compared to that on which ancient people once lived. The concept was then considered by the fisheries scientist Daniel Pauly in his paper "Anecdotes and the shifting baseline syndrome of fisheries". Pauly developed the concept in reference to fisheries management, where fisheries scientists sometimes fail to identify the correct "baseline" population size (e.g. how abundant a fish species population was before human exploitation) and thus work with a shifted baseline. He describes the way that radically depleted fisheries were evaluated by experts who used the state of the fishery at the start of their careers as the baseline, rather than the fishery in its untouched state. Areas that swarmed with a particular species hundreds of years ago may have experienced long-term decline, but it is the level of a few decades previously that is considered the appropriate reference point for current populations. In this way, large declines in ecosystems or species over long periods of time were, and are, masked. There is a loss of perception of change that occurs when each generation redefines what is "natural". Most modern fisheries stock assessments do not ignore historical fishing, accounting for it either by including the historical catch or by using other techniques to reconstruct the depletion level of the population at the start of the period for which adequate data are available. Anecdotes about historical population levels can be highly unreliable and can result in severe mismanagement of the fishery. The concept was further refined and applied to the ecology of kelp forests by Paul Dayton and others from the Scri
https://en.wikipedia.org/wiki/Pseudospectrum
In mathematics, the pseudospectrum of an operator is a set containing the spectrum of the operator and the numbers that are "almost" eigenvalues. Knowledge of the pseudospectrum can be particularly useful for understanding non-normal operators and their eigenfunctions. The ε-pseudospectrum of a matrix A consists of all eigenvalues of matrices which are ε-close to A: Λ_ε(A) = {λ ∈ C : λ ∈ Λ(A + E) for some E with ‖E‖ ≤ ε}. Numerical algorithms which calculate the eigenvalues of a matrix give only approximate results due to rounding and other errors. These errors can be described with the matrix E. More generally, for Banach spaces X, Y and operators A : X → Y, one can define the ε-pseudospectrum of A (typically denoted by σ_ε(A)) in the following way: σ_ε(A) = {λ ∈ C : ‖(A − λ)⁻¹‖ ≥ 1/ε}, where we use the convention that ‖(A − λ)⁻¹‖ = ∞ if A − λ is not invertible. Notes Bibliography Lloyd N. Trefethen and Mark Embree: "Spectra And Pseudospectra: The Behavior of Nonnormal Matrices And Operators", Princeton Univ. Press, (2005). External links Pseudospectra Gateway by Embree and Trefethen
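In practice the ε-pseudospectrum is often visualized by evaluating the smallest singular value of zI − A over a mesh of complex points z, since σ_min(zI − A) ≤ ε exactly when z lies in the ε-pseudospectrum. The Python sketch below illustrates this (the function name and the example matrix are our own, not from any particular library):

import numpy as np

def pseudospectrum_grid(A, re, im):
    # Smallest singular value of (z I - A) at each grid point; the
    # eps-pseudospectrum is the region where this value is <= eps,
    # since sigma_min(z I - A) = 1 / ||(z I - A)^{-1}||_2.
    n = A.shape[0]
    I = np.eye(n)
    smin = np.empty((len(im), len(re)))
    for j, y in enumerate(im):
        for i, x in enumerate(re):
            smin[j, i] = np.linalg.svd((x + 1j * y) * I - A,
                                       compute_uv=False)[-1]
    return smin

# A non-normal example: both eigenvalues are 0, yet tiny perturbations
# move them far away, so the pseudospectra are much larger than a small
# disc around the spectrum.
A = np.array([[0.0, 100.0], [0.0, 0.0]])
re = np.linspace(-2.0, 2.0, 101)
im = np.linspace(-2.0, 2.0, 101)
smin = pseudospectrum_grid(A, re, im)
print("fraction of grid in the 0.1-pseudospectrum:", (smin < 0.1).mean())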
https://en.wikipedia.org/wiki/Star%20Force
Star Force, also released in arcades outside of Japan as Mega Force, is a vertical-scrolling shooter game released in 1984 by Tehkan. Gameplay In the game, the player pilots a starship called the Final Star, while shooting various enemies and destroying enemy structures for points. Unlike later vertical-scrolling shooters, such as Toaplan's Twin Cobra, the Final Star has only two levels of weapon power and no secondary weapons such as missiles or bombs. Each stage in the game is named after a letter of the Greek alphabet. In certain versions of the game, there is an additional level called "Infinity" (represented by the infinity symbol) which occurs after Omega, after which the game repeats indefinitely. In the NES version, after defeating the Omega target, the player sees a black screen with Tecmo's logo, announcing the future release of the sequel Super Star Force. After that, the Infinity target becomes available and the game repeats the same level and boss without increasing the difficulty. Reception In Japan, Game Machine listed Star Force in its December 1, 1984, issue as the fourteenth most successful table arcade unit at the time. Legacy Sequels Super Star Force: Jikūreki no Himitsu, released in 1986 for the Japanese Nintendo Famicom. Final Star Force, released for arcades in 1992. Ports and related releases Star Force was ported and published in 1985 by Hudson Soft for both the MSX home computer and the Family Computer (Famicom) in Japan. Sales of the game were promoted through the first nationwide video game competition to be called "a caravan", although it was not the first event of its kind organized by Hudson (it had previously promoted Lode Runner with a similar event). The North American and European versions for the Nintendo Entertainment System (NES) were published two years later, in 1987, with significant revisions, and with Tecmo credited rather than Hudson on the title screen and box art. Although the NES version is immediately rec
https://en.wikipedia.org/wiki/Keriorrhea
Keriorrhea is the production of greasy, orange-colored stools which results from the consumption of indigestible wax esters found in oilfish and escolar. See also Steatorrhea Rectal discharge
https://en.wikipedia.org/wiki/Them%27s%20Fightin%27%20Herds
Them's Fightin' Herds is an indie fighting game developed by Mane6 and published by Modus Games. It features a cast of ungulate characters fighting each other to find a champion worthy of gaining a magical key that will protect their world from predators. The game first entered early access in February 2018; the full release for Microsoft Windows followed on April 30, 2020, a Linux version on March 25, 2021, and a beta macOS version on October 27, 2021. The game was released for Nintendo Switch, PlayStation 4, PlayStation 5, Xbox One, and Xbox Series X/S on October 18, 2022. The project is a spiritual successor to Mane6's earlier, unreleased fighting game Fighting Is Magic, based on the animated television show My Little Pony: Friendship Is Magic. Mane6 was a nine-man game development team and part of the adult fandom of the show. Fighting Is Magic featured the six main pony characters from that show. Early versions of this game were released in 2012, drawing attention both from players in the Evolution Championship Series, owing to the unique moves associated with non-bipedal characters in fighting games, and from Hasbro, which owned the intellectual property to My Little Pony. After Hasbro sent Mane6 a cease and desist letter, Mane6 discarded the assets tied to the show, while keeping some of the fundamental gameplay factors to create the new title Them's Fightin' Herds. The creator of My Little Pony: Friendship Is Magic, Lauren Faust, offered to help with designing the new characters for the game. Development of the game was completed with crowdfunding through Indiegogo. A separate fan effort, unassociated with the Mane6 team, released Fighting Is Magic: Tribute Edition of the original Mane6 My Little Pony-inspired game in early 2014. This version was made from various beta assets of the original game which Mane6 had developed in the first two years and which were later leaked by other parties. Gameplay The game uses a four-button fighting system: a b
https://en.wikipedia.org/wiki/Synaptic%20tagging
Synaptic tagging, or the synaptic tagging hypothesis, was first proposed in 1997 by Julietta U. Frey (who published as Uwe Frey or J. U. Frey before 2000; see https://scholar.google.com/citations?user=sghJBzMAAAAJ&hl=en&citsig=AD-1fHaN4xLB44tNjIiILbQWZcfn) and Richard G. Morris; it seeks to explain how neural signaling at a particular synapse creates a target for subsequent plasticity-related product (PRP) trafficking, which is essential for sustained long-term potentiation (LTP) and long-term depression (LTD). Although the molecular identity of the tags remains unknown, it has been established that they form as a result of high- or low-frequency stimulation, interact with incoming PRPs, and have a limited lifespan. Further investigations have suggested that plasticity-related products include mRNA and proteins from both the soma and the dendritic shaft that must be captured by molecules within the dendritic spine to achieve persistent LTP and LTD. This idea was articulated in the synaptic tag-and-capture hypothesis. Overall, synaptic tagging elaborates on the molecular underpinnings of how late LTP (L-LTP) is generated and leads to memory formation. History Frey, a researcher at the Leibniz Institute for Neurobiology (later at the Medical College of Georgia and Lund University), and Morris, a researcher at the University of Edinburgh, laid the groundwork for the synaptic tagging hypothesis, stating: "We propose that LTP initiates the creation of a short-lasting protein-synthesis-independent 'synaptic tag' at the potentiated synapse which sequesters the relevant protein(s) to establish late LTP. In support of this idea, we now show that weak tetanic stimulation, which ordinarily leads only to early LTP, or repeated tetanization in the presence of protein-synthesis inhibitors, each results in protein-synthesis-dependent late LTP, provided repeated tetanization has already been applied at another input to the same population of neurons. The synaptic tag decays in less than three hours. These findings indicate that th
https://en.wikipedia.org/wiki/Mir-569%20microRNA%20precursor%20family
In molecular biology, mir-569 is a microRNA, a short RNA molecule. MicroRNAs function to regulate the expression levels of other genes by several mechanisms. See also MicroRNA
https://en.wikipedia.org/wiki/Permabit
Permabit Technology Corporation was a private supplier of data reduction solutions to the computer data storage industry. On 31 July 2017 it was announced that Red Hat had acquired the assets and technology of Permabit Technology Corporation. Permabit Albireo The Permabit Albireo family of products is designed around data reduction features. The common component among these products is the Albireo index, a hash datastore. The three products in the Albireo family range from an embedded SDK (offering integration with existing storage) to a ready-to-deploy appliance. Albireo SDK – a software development kit designed to add data deduplication to hardware devices or software applications that benefit from sharing duplicate chunks. Albireo VDO – a drop-in data efficiency solution for Linux architectures. VDO provides fine-grained (4 KB chunk), inline deduplication, thin provisioning, compression and replication. Albireo SANblox – a ready-to-run data efficiency appliance that integrates data deduplication and data compression transparently into Fibre Channel SAN environments. History Permabit was founded as Permabit Inc. in 2000 by a technical and business team from the Massachusetts Institute of Technology. The company went through a management buyout in 2007, and a new business entity, Permabit Technology Corporation, was formed at that time. Permabit’s first product, Permabit Enterprise Archive (originally known as Permeon), was a multi-PB scalable, content-addressable, scale-out storage product, first launched in 2004. Enterprise Archive utilized in-house developed technologies in the areas of capacity optimization, WORM, storage management and data protection. In 2010, Permabit launched the Albireo family of products, which focus on licensing Permabit data efficiency and management innovations to original equipment manufacturers, software vendors and online service providers. Publicly acknowledged companies that offer Albireo-based solutions include Dell EMC, Hitachi Dat
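The core idea behind chunk-level deduplication of the kind described above can be sketched compactly: split incoming data into fixed-size chunks, hash each chunk, and consult a hash index to detect chunks already stored. The Python sketch below is purely illustrative of that general pattern; it is not Albireo's actual design, and the class and method names are invented:

import hashlib

CHUNK = 4096  # 4 KB chunks, matching the granularity cited for VDO

class DedupStore:
    def __init__(self):
        self.index = {}    # chunk hash -> physical chunk id
        self.chunks = []   # physical chunk storage

    def write(self, data: bytes):
        # Store data, returning a list of chunk ids; duplicate chunks
        # are detected via the hash index and stored only once.
        ids = []
        for off in range(0, len(data), CHUNK):
            chunk = data[off:off + CHUNK]
            h = hashlib.sha256(chunk).digest()
            if h not in self.index:
                self.index[h] = len(self.chunks)
                self.chunks.append(chunk)
            ids.append(self.index[h])
        return ids

store = DedupStore()
ids = store.write(b"A" * 8192 + b"B" * 4096 + b"A" * 4096)
print(ids, "physical chunks:", len(store.chunks))  # [0, 0, 1, 0] physical chunks: 2

A production system would additionally need a persistent, crash-safe index and an explicit hash-collision policy, both of which this sketch ignores.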
https://en.wikipedia.org/wiki/Immunization%20registry
An immunization registry or immunization information system (IIS) is an information system that collects vaccination data about all persons within a geographic area. It consolidates the immunization records from multiple sources for each person living in its jurisdiction. Introduction Immunization information systems (IIS) are an important tool for increasing and sustaining high vaccination coverage: they consolidate vaccination records of children and adults from multiple providers; forecast which doses are past due, due, or coming due, supporting the generation of reminder and recall vaccination notices for each individual; and provide official vaccination forms and vaccination coverage assessments. One of the national health objectives is to increase to 95% the proportion of children aged <6 years who participate in fully operational population-based IIS. A "fully operational" IIS enrolls 95% or more of all catchment-area children less than 6 years of age with 2 or more immunization encounters administered according to ACIP recommendations. In a population-based IIS, children are entered into the IIS at birth, often through a linkage with electronic birth records. An IIS record can also be initiated by a health care provider at the time of a child's first immunization. If an IIS includes all children in a given geographical area and all providers are reporting immunization information, it can provide a single data source for all community immunization partners. Such a population-based IIS can make it easier to carry out the demonstrably effective immunization strategies (e.g., reminder/recall, AFIX, and WIC linkages) and thereby decrease the resources needed to achieve and maintain high levels of coverage. IIS can also be used to enhance adult immunization services and coverage. Pharmacy immunizations are reported to state IIS, allowing for a complete lifetime immunization history. The concept of IIS is not new. Many individual practices and health plans admin
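The consolidate-and-forecast function described above can be illustrated with a small sketch. The records, vaccine name, and the two-dose schedule below are hypothetical, chosen only to show how duplicate reports from multiple providers collapse into one history from which a next-due date can be forecast:

from datetime import date, timedelta

# Reports for one child from two providers, possibly overlapping.
reports = [
    {"vaccine": "measles", "dose": 1, "given": date(2023, 1, 10)},
    {"vaccine": "measles", "dose": 1, "given": date(2023, 1, 10)},  # duplicate report
    {"vaccine": "measles", "dose": 2, "given": date(2023, 7, 15)},
]

# Consolidate: keep one record per (vaccine, dose number).
history = {(r["vaccine"], r["dose"]): r["given"] for r in reports}

# Hypothetical schedule: dose 2 is due 6 months after dose 1.
def next_due(history, vaccine="measles"):
    if (vaccine, 2) in history:
        return None  # series complete, nothing to recall
    if (vaccine, 1) in history:
        return history[(vaccine, 1)] + timedelta(days=182)
    return date.today()  # dose 1 due now

print(next_due(history))  # None: both doses are on record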
https://en.wikipedia.org/wiki/Agent%20Communications%20Language
Agent Communication Language (ACL), proposed by the Foundation for Intelligent Physical Agents (FIPA), is a proposed standard language for agent communications. Knowledge Query and Manipulation Language (KQML) is another proposed standard. The most popular ACLs are: FIPA-ACL (by the Foundation for Intelligent Physical Agents, a standardization consortium) KQML (Knowledge Query and Manipulation Language) Both rely on speech act theory, developed by Searle in the 1960s and elaborated by Winograd and Flores in the 1980s. They define a set of performatives, also called communicative acts, and their meaning (e.g. ask-one). The content of the performative is not standardized, but varies from system to system. For agents to understand each other they must not only speak the same language, but also share a common ontology. An ontology is a part of the agent's knowledge base that describes what kinds of things an agent can deal with and how they are related to each other. Examples of frameworks that implement a standard agent communication language (FIPA-ACL) include FIPA-OS and Jade.
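For illustration, a FIPA-ACL performative is conventionally written as a parenthesized list: the communicative act followed by keyword parameters. The small Python helper below renders one such message; the agent names, ontology, and content are invented for the example:

def acl_message(performative, **params):
    # Render (performative :key value ...) in FIPA-ACL's s-expression style.
    fields = " ".join(f":{k} {v}" for k, v in params.items())
    return f"({performative} {fields})"

print(acl_message("inform",
                  sender="agent1",
                  receiver="agent2",
                  content='"(price widget 50)"',
                  language="fipa-sl",
                  ontology="commerce"))
# -> (inform :sender agent1 :receiver agent2 :content "(price widget 50)"
#    :language fipa-sl :ontology commerce)

Note how the performative (inform) is standardized while the :content payload is opaque to the standard; both agents must share the commerce ontology to interpret it.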
https://en.wikipedia.org/wiki/Iterated%20conditional%20modes
In statistics, iterated conditional modes is a deterministic algorithm for finding a configuration that is a local maximum of the joint probability of a Markov random field. It does so by iteratively maximizing the probability of each variable conditioned on the rest. See also Belief propagation Graph cuts in computer vision Optimization problem
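A classic illustration is Besag-style binary image denoising: each pixel is repeatedly set to whichever of ±1 maximizes its conditional probability given its four neighbours and the noisy observation. The Python sketch below uses made-up weights (lam for data agreement, beta for neighbour agreement); it is an illustration of the technique, not a reference implementation:

import numpy as np

def icm_denoise(noisy, beta=1.5, lam=2.0, sweeps=10):
    # noisy: array of +/-1 pixels. The conditional log-probability of
    # setting pixel (i, j) to s is proportional to
    # lam * s * noisy[i, j] + beta * s * (sum of neighbours),
    # so the maximizing choice is simply the sign of that score.
    x = noisy.copy()
    H, W = x.shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                nbr_sum = sum(x[a, b]
                              for a, b in ((i - 1, j), (i + 1, j),
                                           (i, j - 1), (i, j + 1))
                              if 0 <= a < H and 0 <= b < W)
                score = lam * noisy[i, j] + beta * nbr_sum
                x[i, j] = 1 if score >= 0 else -1
    return x

rng = np.random.default_rng(0)
clean = np.ones((32, 32), dtype=int)
clean[8:24, 8:24] = -1                     # a square on a background
noisy = np.where(rng.random(clean.shape) < 0.15, -clean, clean)  # flip 15%
restored = icm_denoise(noisy)
print("errors before:", (noisy != clean).sum(), "after ICM:", (restored != clean).sum())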
https://en.wikipedia.org/wiki/Klaus%20Sutner
Klaus Sutner is a Teaching Professor of Computer Science at Carnegie Mellon University and a former Associate Dean of Undergraduate Programs for the Carnegie Mellon School of Computer Science. His research interests include cellular automata, discrete mathematics as it pertains to computation, and computational complexity theory. He developed a hybrid Mathematica/C++ application named Automata that manipulates finite-state machines and their syntactic semigroups. By his own description, he "has survived five decades in the martial arts. Barely." He is the head instructor at the Three-Rivers Aikikai.
https://en.wikipedia.org/wiki/Petru%20Mocanu
Petru T. Mocanu (1 June 1931 – 28 March 2016) was a Romanian mathematician who was elected in 2009 as a titular member of the Romanian Academy. Mocanu was born in Brăila. He studied at the Nicolae Bălcescu High School in Brăila, graduating in 1950. He then studied mathematics at Babeș-Bolyai University in Cluj-Napoca, completing his B.Sc. in 1953 and his Ph.D. in 1959. His dissertation, written under the supervision of Gheorghe Călugăreanu, was titled Variational methods in the theory of univalent functions. He continued as faculty at Babeș-Bolyai University, rising to the rank of professor in 1970. Mocanu was an invited professor at the University of Conakry in 1966–1967 and at Ohio State University in 1992. Publications Notes External links math.ubbcluj.ro/~pmocanu/
https://en.wikipedia.org/wiki/Mineral%20ascorbates
Mineral ascorbates are a group of salts of ascorbic acid (vitamin C). They are composed of a mineral cation bonded to ascorbate (the anion of ascorbic acid). Production Mineral ascorbates are powders manufactured by reacting ascorbic acid with mineral carbonates in aqueous solution, venting the carbon dioxide, drying the reaction product, and then milling the dried product to the desired particle size. The mineral carbonate used may be calcium carbonate, potassium carbonate, sodium bicarbonate, magnesium carbonate, or one of many other mineral forms. Uses Mineral ascorbates are used as dietary supplements and food additives. Ascorbate salts may be better tolerated by the human body than the corresponding, weakly acidic ascorbic acid. Ascorbates are highly reactive antioxidants used as food preservatives. Examples of mineral ascorbates are: Sodium ascorbate (E301) Calcium ascorbate (E302) Potassium ascorbate (E303) Magnesium ascorbate See also Ascorbyl palmitate Ascorbyl stearate
https://en.wikipedia.org/wiki/Programmable%20metallization%20cell
The programmable metallization cell, or PMC, is a non-volatile computer memory developed at Arizona State University. PMC is a technology developed to replace the widely used flash memory, providing a combination of longer lifetimes, lower power, and better memory density. Infineon Technologies, who licensed the technology in 2004, refers to it as conductive-bridging RAM, or CBRAM. CBRAM became a registered trademark of Adesto Technologies in 2011. NEC has a variant called "Nanobridge" and Sony calls their version "electrolytic memory". Description PMC is a two-terminal resistive memory technology developed at Arizona State University. PMC is an electrochemical metallization memory that relies on redox reactions to form and dissolve a conductive filament. The state of the device is determined by the resistance across the two terminals. The existence of a filament between the terminals produces a low resistance state (LRS) while the absence of a filament results in a high resistance state (HRS). A PMC device is made of two solid metal electrodes, one relatively inert (e.g., tungsten or nickel), the other electrochemically active (e.g., silver or copper), with a thin film of solid electrolyte between them. Device operation The resistance state of a PMC is controlled by the formation (programming) or dissolution (erasing) of a metallic conductive filament between the two terminals of the cell. A formed filament is a fractal, tree-like structure. Filament formation PMCs rely on the formation of a metallic conductive filament to transition to a low resistance state (LRS). The filament is created by applying a positive voltage bias (V) to the anode contact (active metal) while grounding the cathode contact (inert metal). The positive bias oxidizes the active metal (M): M → M+ + e− The applied bias generates an electric field between the two metal contacts. The ionized (oxidized) metal ions migrate along the electric field toward the cathode contact. At the ca
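The program/erase behavior just described can be summarized as a two-state model: sufficient positive bias on the active electrode forms the filament (low resistance), sufficient negative bias dissolves it (high resistance). The toy Python model below is only a behavioral illustration; the threshold voltages and resistance values are invented, not measured device parameters:

class PMCCell:
    V_PROGRAM = 0.5   # volts needed to form the filament (hypothetical)
    V_ERASE = -0.5    # volts needed to dissolve it (hypothetical)
    R_LRS, R_HRS = 1e3, 1e7  # illustrative low/high resistances, in ohms

    def __init__(self):
        self.filament = False  # no filament: high resistance state

    def apply_bias(self, v):
        if v >= self.V_PROGRAM:
            self.filament = True   # M -> M+ + e-; ions migrate, filament forms
        elif v <= self.V_ERASE:
            self.filament = False  # filament dissolves back into the electrolyte
        return self.resistance()

    def resistance(self):
        return self.R_LRS if self.filament else self.R_HRS

cell = PMCCell()
print(cell.apply_bias(0.7))   # 1000.0     -> programmed, LRS
print(cell.apply_bias(-0.7))  # 10000000.0 -> erased, HRS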
https://en.wikipedia.org/wiki/Prevertebral%20plexus
A prevertebral plexus is a nerve plexus which branches from a prevertebral ganglion.
https://en.wikipedia.org/wiki/Monocarboxylate%20transporter%209
Monocarboxylate transporter 9 (MCT9, solute carrier family 16, member 9, SLC16A9) is a protein that in humans is encoded by the SLC16A9 gene. Clinical relevance Mutations in the SLC16A9 gene have been associated with carnitine levels in blood.
https://en.wikipedia.org/wiki/Broadcast%20%28parallel%20pattern%29
Broadcast is a collective communication primitive in parallel programming to distribute programming instructions or data to nodes in a cluster. It is the reverse operation of reduction. The broadcast operation is widely used in parallel algorithms, such as matrix-vector multiplication, Gaussian elimination and shortest paths. The Message Passing Interface implements broadcast in MPI_Bcast. Definition A message M of length n should be distributed from one node to all other nodes. t_byte is the time it takes to send one byte. t_start is the time it takes for a message to travel to another node, independent of its length. Therefore, the time to send a message of length n from one node to another is t_start + n · t_byte. p is the number of nodes and processors. Binomial Tree Broadcast With Binomial Tree Broadcast the whole message is sent at once. Each node that has already received the message sends it on further. The number of sending nodes doubles with each time step, so the set of informed nodes grows exponentially. The algorithm is ideal for short messages but falls short with longer ones, as during the first transfer only one node is busy. Sending a message to all nodes takes ⌈log₂ p⌉ time steps, which results in a runtime of ⌈log₂ p⌉ · (t_start + n · t_byte).

Message M
id := node number
p := number of nodes
if id > 0
    blocking_receive M
for (i := ceil(log_2(p)) - 1; i >= 0; i--)
    if (id % 2^(i+1) == 0 && id + 2^i < p)
        send M to node id + 2^i

Linear Pipeline Broadcast The message is split up into k packages and sent piecewise from node i to node i + 1. The time needed to distribute the first message piece is (p − 1) · (t_start + (n/k) · t_byte), whereby t_start + (n/k) · t_byte is the time needed to send one package from one processor to another. Sending the whole message takes (k + p − 2) · (t_start + (n/k) · t_byte). The optimal choice is k = sqrt(n · t_byte · (p − 2) / t_start), resulting in a runtime of approximately n · t_byte + (p − 2) · t_start + 2 · sqrt(n · t_byte · (p − 2) · t_start). The runtime depends not only on the message length but also on the number of processors. This approach shines when the length of the message is much larger than the number of processors.

Message M := [m_1, m_2, ..., m_k]
id = node
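As noted in the opening paragraph, the Message Passing Interface exposes this primitive as MPI_Bcast. In Python it is available through the mpi4py package; a minimal usage example, assuming mpi4py and an MPI runtime are installed:

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Only the root rank holds the data; bcast distributes it to every rank.
data = {"msg": "hello", "step": 42} if rank == 0 else None
data = comm.bcast(data, root=0)
print(f"rank {rank} received {data}")

# Run with, e.g.:  mpiexec -n 4 python bcast_demo.py

Which of the algorithms above the library actually uses is up to the MPI implementation; good implementations switch strategies based on message size and processor count, along the lines of the trade-off just described.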
https://en.wikipedia.org/wiki/Pseudobunaea%20alinda
Pseudobunaea alinda is a species of very large moth in the family Saturniidae. The species was first described by Dru Drury in 1782, and is found in Angola, Cameroon, Congo, DR Congo, Gabon, Guinea, Ivory Coast, Sierra Leone, and Tanzania. Description Upper Side. Antennae pectinated. Neck buff-coloured. Thorax and abdomen brownish red, the centre of the former being grey. Anterior wings brown-red, darkest along the external edges, with two faint dark indented lines crossing them from the anterior to the posterior edges. A transparent spot is placed near the middle of the wings, about a quarter of an inch from the anterior edges, without any iris of a different colour. Posterior wings brown-red, and darkest along the external edges, having a few faint waved lines. Near the middle is a small transparent spot, edged with buff at the bottom, surrounded by a dark brown border, and which is also encircled by another quite black. Under Side. Breast red-brown. Legs, abdomen, and wings entirely of a dark buff. All the faint waved lines, hardly discernible on the other side, are here very conspicuous. Close to the transparent spots, on the anterior wings, are two of a dark brown, and two larger are also placed close to the transparent ones in the posterior wings, without any of the circular ones which are on the upper side. Margins of the wings entire. Wing-span 7¾ inches (195 mm).
https://en.wikipedia.org/wiki/Deep%20time
Deep time is a term introduced and applied by John McPhee to the concept of geologic time in his book Basin and Range (1981), parts of which originally appeared in The New Yorker magazine. The philosophical concept of geological time was developed in the 18th century by the Scottish geologist James Hutton (1726–1797); his "system of the habitable Earth" was a deistic mechanism keeping the world eternally suitable for humans. The modern concept entails huge changes over the age of the Earth, which has been determined to be, after a long and complex history of developments, around 4.55 billion years. Scientific concept Hutton based his view of deep time on a form of geochemistry that had developed in Scotland and Scandinavia from the 1750s onward. As the mathematician John Playfair, one of Hutton's friends and colleagues in the Scottish Enlightenment, remarked upon seeing the strata of the angular unconformity at Siccar Point with Hutton and James Hall in June 1788, "the mind seemed to grow giddy by looking so far into the abyss of time". Early geologists such as Nicolas Steno (1638–1686) and Horace-Bénédict de Saussure (1740–1799) had developed ideas of geological strata forming from water through chemical processes, which Abraham Gottlob Werner (1749–1817) developed into a theory known as Neptunism, envisaging the slow crystallisation of minerals in the ancient oceans of the Earth to form rock. Hutton's innovative 1785 theory, based on Plutonism, visualised an endless cyclical process of rocks forming under the sea, being uplifted and tilted, then eroded to form new strata under the sea. In 1788, the sight of Hutton's Unconformity at Siccar Point convinced Playfair and Hall of this extremely slow cycle, and in that same year Hutton memorably wrote "we find no vestige of a beginning, no prospect of an end". Other scientists such as Georges Cuvier (1769–1832) put forward ideas of past ages, and geologists such as Adam Sedgwick (1785–1873) incorporated Werner's ideas in
https://en.wikipedia.org/wiki/Superhelix
A superhelix is a molecular structure in which a helix is itself coiled into a helix. This is significant to both proteins and genetic material, such as overwound circular DNA. The earliest significant reference in molecular biology is from 1971, by F. B. Fuller: "A geometric invariant of a space curve, the writhing number, is defined and studied. For the central curve of a twisted cord the writhing number measures the extent to which coiling of the central curve has relieved local twisting of the cord. This study originated in response to questions that arise in the study of supercoiled double-stranded DNA rings." About the writhing number, mathematician W. F. Pohl says: "It is well known that the writhing number is a standard measure of the global geometry of a closed space curve." Contrary to intuition, a topological property, the linking number, arises from the geometric properties twist and writhe according to the following relationship: Lk = T + W, where Lk is the linking number, W is the writhe and T is the twist of the coil. The linking number refers to the number of times that one strand wraps around the other. In DNA this property does not change and can only be modified by specialized enzymes called topoisomerases. See also DNA supercoil (superhelical DNA) Knot theory
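As a worked example of Lk = T + W, expressed here as a few lines of Python (assuming the standard figure of about 10.5 base pairs per helical turn for relaxed B-DNA; the plasmid size is an arbitrary example):

bp = 2100               # plasmid length in base pairs (arbitrary example)
Lk0 = bp / 10.5         # relaxed linking number: 200 turns
Lk = Lk0 - 10           # suppose a topoisomerase removes 10 links (underwinding)

# Lk is fixed for a closed circle, so any twist T forces writhe W = Lk - T.
T = Lk0                 # if local twist stays at its relaxed value...
W = Lk - T              # ...the deficit must appear as writhe
print(f"Lk = {Lk:.0f}, T = {T:.0f}, W = {W:.0f}")  # Lk = 190, T = 200, W = -10

The negative writhe here corresponds to negative supercoiling, the state in which circular DNA is typically maintained in cells.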
https://en.wikipedia.org/wiki/Gordonia%20sp.%20nov.%20Q8
Gordonia sp. nov. Q8 is a bacterium in the phylum Actinomycetota. It was discovered in 2017 as one of eighteen new species isolated from the Jiangsu Wei5 oilfield in East China with the potential for bioremediation. Strain Q8 is rod-shaped and gram-positive, with dimensions 1.0–4.0 μm × 0.5–1.2 μm and an optimal growth temperature of 40 °C. Phylogenetically, it is most closely related to Gordonia paraffinivorans and Gordonia alkaliphila, both of which are known bioremediators. Q8 was assigned as a novel species based on a <70% ratio of DNA homology with other Gordonia bacteria. Bioremediation is the process by which polluted soil, water, and other natural materials are treated to encourage the growth of microorganisms that can degrade contaminants. It is generally considered more cost-effective and sustainable than other methods of ecosystem restoration. Q8 was chosen for study as a bioremediator due to its ability to grow on media containing the polycyclic aromatic hydrocarbons (PAHs) naphthalene and pyrene. PAHs are products of the incomplete combustion of fossil fuels and are considered toxic and carcinogenic, in particular to aquatic organisms. Sixteen PAHs are listed as priority pollutants by the U.S. Environmental Protection Agency because of their association with cancer in aquatic animals and increased mutagenicity of sediments. PAH-degrading microorganisms are commonly found in polluted areas such as oil wells, where they utilize PAHs as their sole carbon and energy source. The study with Q8 demonstrated that the bacterium could degrade nearly all naphthalene and pyrene within 1–2 weeks, indicating that Q8 can grow in the presence of and rapidly degrade PAHs. Q8 can also significantly reduce the viscosity of oil, making it more soluble in water and easier for other bacteria to utilize, in a process known as petroleum bioremediation. Other Gordonia have been used to remove boat lubricants from water using a similar mechanism. The process of PAH de
https://en.wikipedia.org/wiki/Macrofungi%20of%20Guatemala
Guatemala is one of the richest biodiversity hotspots in the world. This is due to the variety of its territory and ecosystems, which occur from sea level up to more than 4,000 meters above sea level. Ecological niches include (but are not limited to) subtropical and tropical rain forests, wetlands, dry forests, scrublands, cloud forests, and pine-fir forests in the highlands. Despite this wealth, however, knowledge of the country's mycobiota is very poor. There are several reasons for this, primarily the prolonged Guatemalan civil war (1960–1996) and the related political and social instability, which severely hampered field work in the country. The lack of trained local mycologists has certainly also delayed the detailed investigation of the rich mycota inhabiting the highly diversified Guatemalan biotopes. Diversity of Guatemala macrofungi Larger fungi (usually referred to as macrofungi or macromycetes) are of particular interest because of their importance as food resources and as a component of traditional culture in many places in the world. Moreover, many basidiomycetes and ascomycetes with conspicuous sporocarps often play an important role as ectomycorrhizal mycobionts of trees and shrubs of boreal forests in the northern hemisphere and are important elements in many areas of the southern hemisphere. Although Guatemalan macrofungi have not yet been extensively surveyed, a preliminary checklist encompasses some 350 species of macromycetes (31 ascomycetes and 319 basidiomycetes) occurring in 163 genera and 20 ascomycetous and basidiomycetous orders. Recently, 12 further species of ascomycetes were recorded; with these new records, there are now 44 ascomycete species known from Guatemala. Most available observations pertain to the highlands, in the departments of Alta Verapaz, Baja Verapaz, Chimaltenango, Guatemala, El Quiché, Huehuetenango, and Quetzaltenango, while the wide lowland Petén region has been scantly explored, despite the fact that it accounts for abou
https://en.wikipedia.org/wiki/Stickman%20Soccer
Stickman Soccer is a video game series for Android and iOS by the Austrian studio Djinnworks, first released in 2013. Games Stickman Soccer Classic (2013) Stickman Soccer 2014 or Stickman Soccer 14 (2014) Stickman Soccer 2016 or Stickman Soccer 16 (2016) Stickman Soccer 2018 or Stickman Soccer 18 (2018)
https://en.wikipedia.org/wiki/Random%20vibration
In mechanical engineering, random vibration is motion which is non-deterministic, meaning that future behavior cannot be precisely predicted. The randomness is a characteristic of the excitation or input, not the mode shapes or natural frequencies. Some common examples include an automobile riding on a rough road, wave height on the water, or the load induced on an airplane wing during flight. Structural response to random vibration is usually treated using statistical or probabilistic approaches. Mathematically, random vibration is characterized as an ergodic and stationary process. A measurement of the acceleration spectral density (ASD) is the usual way to specify random vibration. The root mean square acceleration (Grms) is the square root of the area under the ASD curve in the frequency domain. The Grms value is typically used to express the overall energy of a particular random vibration event and is a statistical value used in mechanical engineering for structural design and analysis purposes. While the term power spectral density (PSD) is commonly used to specify a random vibration event, ASD is more appropriate when acceleration is being measured and used in structural analysis and testing. Stephen H. Crandall is widely considered the father of random vibration (see also the books by Bolotin, Elishakoff, and others). The dramatic effect of often-neglected cross-correlations is elucidated in the monograph by Elishakoff. Random vibration testing Test specifications can be established from real environment measurements using an ASD envelope or a fatigue damage equivalence criterion (Extreme response spectrum and Fatigue damage spectrum). Random vibration testing is one of the more common types of vibration testing services performed by vibration test labs. Some of the more common random vibration test standards are MIL-STD-810, RTCA DO-160, and IEC 60068-2-64. See also Random noise
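As a small numerical illustration of the Grms definition above: integrate the ASD over frequency and take the square root. The breakpoint values below are invented for the example; real profiles come from the test standards mentioned above and are normally interpolated on log-log axes rather than linearly, which gives a slightly different result:

import numpy as np

freq = np.array([20.0, 80.0, 350.0, 2000.0])  # Hz
asd = np.array([0.01, 0.04, 0.04, 0.007])     # acceleration spectral density, g^2/Hz

# Trapezoidal area under the ASD curve; its square root is Grms.
area = np.sum(0.5 * (asd[1:] + asd[:-1]) * np.diff(freq))
print(f"Grms ~ {np.sqrt(area):.2f} g")  # ~7.15 g for these example breakpoints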
https://en.wikipedia.org/wiki/Temperament%20test
Temperament tests assess dogs for certain behaviors or suitability for dog sports or adoption from an animal shelter by observing the animal for unwanted or potentially dangerous behavioral traits, such as aggressiveness towards other dogs or humans, shyness, or extreme fear. AKC Temperament Test In 2019, the American Kennel Club launched its AKC Temperament Test (ATT), a pass-fail evaluation by AKC licensed or member clubs. Evaluators are specially trained AKC Obedience judges, Rally judges and AKC Approved Canine Good Citizen Evaluators. American Temperament Test Society American Temperament Test Society, Inc. was started by Alfons Ertel in 1977. Ertel created a test for dogs that checks a dog's reaction to strangers, to auditory and visual stimuli (such as the gunshot test), and to unusual situations in an outdoor setting; it does not test indoor or home scenarios. It favors a bold, confident dog. To date, the top three dog breeds that have tested with the ATTS are Rottweiler (17% of all tests conducted), German Shepherd Dog (10%), and Doberman (5%). The test itself is copyrighted and prospective testers must apply to become official. The test is conducted pass-fail by majority rule of three testers, and each individual dog is graded according to its own breed's native aptitudes, taking into account the individual dog's age, health and training. Though the ATTS is the only organization which posts pass rates "by breed", the breeds cannot be compared against each other because the grades are based on each breed's own characteristics. Despite that, attorneys have been encouraged to use the ATTS published "results by breed" to defend their clients in dangerous dog cases by comparing pass rates of the breed of their client's dog against the pass rates of other well-known non-aggressive pet dog breeds. To date, 34,686 tests have been completed, fewer than 1,000 per year. BH-VT test by FCI BH-VT, an abbreviation of a German term which roughly translates to "companion do
https://en.wikipedia.org/wiki/2%2C4-Diaminopyrimidine
2,4-Diaminopyrimidine is a pyrimidine bearing two amino groups, at the 2- and 4-positions of the ring. See also 4,5-Diaminopyrimidine
https://en.wikipedia.org/wiki/Bamlanivimab
Bamlanivimab is a monoclonal antibody developed by AbCellera Biologics and Eli Lilly as a treatment for COVID-19. The medication was granted an emergency use authorization (EUA) by the US Food and Drug Administration (FDA) in November 2020, and the EUA was revoked in April 2021. Bamlanivimab is an IgG1 monoclonal antibody (mAb) directed against the spike protein of SARS-CoV-2. The aim is to block viral attachment and entry into human cells, thus neutralizing the virus and helping to prevent and treat COVID-19. Bamlanivimab emerged from the collaboration between Lilly and AbCellera to create antibody therapies for the prevention and treatment of COVID-19. Bamlanivimab is also used as part of the bamlanivimab/etesevimab combination that was granted an EUA by the FDA. In June 2021, the US Office of the Assistant Secretary for Preparedness and Response (ASPR) paused distribution of bamlanivimab and etesevimab together, and of etesevimab alone (to pair with existing supply of bamlanivimab), due to the increase in circulating variants. Studies Bamlanivimab has been studied in several trials. Some initial results on bamlanivimab seemed promising, with one review saying that it "decrease[s] viral load when given early on in the course of SARS-CoV-2 infection and favourably impact[s] clinical outcomes for patients with mild-to-moderate COVID-19". However, further results have not shown any clinically relevant benefit. Animal trials An initial trial tested bamlanivimab in rhesus monkeys. Administration of the drug reduced SARS-CoV-2 replication in the upper and lower respiratory tracts of the monkeys. Following these results in a non-human primate model, several human studies were initiated. Human trials BLAZE-1 Trial The Blocking Viral Attachment and Cell Entry with SARS-CoV-2 Neutralizing Antibodies (BLAZE-1) trial was sponsored by the drug's developer Eli Lilly. The drug was tested in SARS-CoV-2 patients who did not require hospitalization. While an interim analysis