https://en.wikipedia.org/wiki/Voltage%20source
A voltage source is a two-terminal device which can maintain a fixed voltage. An ideal voltage source can maintain the fixed voltage independent of the load resistance or the output current. However, a real-world voltage source cannot supply unlimited current. A voltage source is the dual of a current source. Real-world sources of electrical energy, such as batteries and generators, can be modeled for analysis purposes as a combination of an ideal voltage source and additional combinations of impedance elements. Ideal voltage sources An ideal voltage source is a two-terminal device that maintains a fixed voltage drop across its terminals. It is often used as a mathematical abstraction that simplifies the analysis of real electric circuits. If the voltage across an ideal voltage source can be specified independently of any other variable in a circuit, it is called an independent voltage source. Conversely, if the voltage across an ideal voltage source is determined by some other voltage or current in a circuit, it is called a dependent or controlled voltage source. A mathematical model of an amplifier will include dependent voltage sources whose magnitude is governed by some fixed relation to an input signal, for example. In the analysis of faults on electrical power systems, the whole network of interconnected sources and transmission lines can be usefully replaced by an ideal (AC) voltage source and a single equivalent impedance. [Table of circuit symbols: ideal voltage source and ideal current source; controlled voltage source and controlled current source; battery of cells and single cell.] The internal resistance of an ideal voltage source is zero; it is able to supply or absorb any amount of
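The source model described above can be sketched numerically. The snippet below treats a real source as an ideal EMF in series with an internal resistance (a Thévenin equivalent); the function name and the example values are illustrative assumptions, not taken from the article.

```python
def terminal_voltage(emf, r_internal, r_load):
    """Terminal voltage of a real source modeled as an ideal EMF
    in series with an internal resistance, driving a load r_load."""
    current = emf / (r_internal + r_load)
    return emf - current * r_internal

# An ideal source has zero internal resistance: the voltage is fixed.
print(terminal_voltage(9.0, 0.0, 10.0))  # 9.0 regardless of load
# A real source sags under load.
print(terminal_voltage(9.0, 1.0, 10.0))
```

Shrinking the load resistance in the second call shows the terminal voltage sagging further, which is the sense in which a real source cannot supply unlimited current.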
https://en.wikipedia.org/wiki/Exact%20solutions%20in%20general%20relativity
In general relativity, an exact solution is a solution of the Einstein field equations whose derivation does not invoke simplifying assumptions, though the starting point for that derivation may be an idealized case like a perfectly spherical shape of matter. Mathematically, finding an exact solution means finding a Lorentzian manifold equipped with tensor fields modeling states of ordinary matter, such as a fluid, or classical non-gravitational fields such as the electromagnetic field. Background and definition These tensor fields should obey any relevant physical laws (for example, any electromagnetic field must satisfy Maxwell's equations). Following a standard recipe which is widely used in mathematical physics, these tensor fields should also give rise to specific contributions to the stress–energy tensor T^{ab}. (A field is described by a Lagrangian; varying with respect to the field should give the field equations, and varying with respect to the metric should give the stress–energy contribution due to the field.) Finally, when all the contributions to the stress–energy tensor are added up, the result must be a solution of the Einstein field equations G^{ab} = 8π T^{ab} (in geometrized units). In the above field equations, G^{ab} is the Einstein tensor, computed uniquely from the metric tensor, which is part of the definition of a Lorentzian manifold. Since giving the Einstein tensor does not fully determine the Riemann tensor, but leaves the Weyl tensor unspecified (see the Ricci decomposition), the Einstein equation may be considered a kind of compatibility condition: the spacetime geometry must be consistent with the amount and motion of any matter or non-gravitational fields, in the sense that the immediate presence "here and now" of non-gravitational energy–momentum causes a proportional amount of Ricci curvature "here and now". Moreover, taking covariant derivatives of the field equations and applying the Bianchi identities, it is found that a suitably varying amount/motion of non-gravitational energy–mom
https://en.wikipedia.org/wiki/GEANT-3
GEANT is the name of a series of simulation software designed to describe the passage of elementary particles through matter, using Monte Carlo methods. The name is an acronym formed from "GEometry ANd Tracking". Originally developed at CERN for high energy physics experiments, GEANT-3 has been used in many other fields. History The very first version of GEANT dates back to 1974, while the first version of GEANT-3 dates back to 1982. Versions of GEANT through 3.21 were written in FORTRAN and eventually maintained as part of CERNLIB. Since about 2000, the last FORTRAN release has been essentially in stasis, receiving only occasional bug fixes. GEANT-3 was, however, still in use by some experiments for some time thereafter. Most of GEANT-3 is available under the GNU General Public License, with the exception of some hadronic interaction code contributed by the FLUKA collaboration. GEANT-3 was used by a majority of high energy physics experiments from the late 1980s to the early 2000s. The largest experiments using it were three of the experiments at the Large Electron-Positron collider: ALEPH, L3 and OPAL. It was also a key tool in the design and optimization of the detectors of all experiments at the Large Hadron Collider (LHC) – see e.g. the ATLAS Technical Design Report. GEANT-3.21 based programs remained the main simulation engine of ATLAS, CMS and LHCb at the LHC until 2004, when these experiments moved to Geant4-based simulations. Even in 2019 it remained the primary simulation tool for the ALICE experiment at the LHC. A related (but separate) product is Geant4 (when referring to this version, the name is typically no longer fully capitalized). It is a complete rewrite in C++ with a modern object-oriented design. Geant4 was developed by the RD44 collaboration in 1994–1998 and is maintained and improved by the Geant4 international collaboration. For quite some time Geant4 did not have a clearly defined software license. As of version 8.1 (release
https://en.wikipedia.org/wiki/Infinity%20plus%20one
In mathematics, infinity plus one is a concept which has a well-defined formal meaning in some number systems, and may refer to: Transfinite numbers, numbers that are larger than all the finite numbers. Cardinal numbers, representations of sizes (cardinalities) of abstract sets, which may be infinite. Ordinal numbers, representations of order types of well-ordered sets, which may also be infinite. Hyperreal numbers, an extension of the real number system that contains infinite and infinitesimal numbers. Surreal numbers, another extension of the real numbers, contain the hyperreal and all the transfinite ordinal numbers. English phrases Infinity
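In the ordinal number system in particular, "infinity plus one" has a precise meaning that differs from plain infinity, because ordinal addition is not commutative. A standard worked example, with ω denoting the first infinite ordinal:

```latex
1 + \omega \;=\; \sup_{n<\omega}\,(1+n) \;=\; \omega,
\qquad\text{but}\qquad
\omega + 1 \;=\; \omega \cup \{\omega\} \;>\; \omega .
```

Prepending one element to an infinite ascending sequence leaves its order type unchanged, while appending one element after the whole sequence produces a genuinely larger order type; in the ordinal sense, infinity plus one really is larger than infinity.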
https://en.wikipedia.org/wiki/Bidirectional%20reflectance%20distribution%20function
The bidirectional reflectance distribution function (BRDF), symbol f_r(ω_i, ω_r), is a function of four real variables that defines how light is reflected at an opaque surface. It is employed in the optics of real-world light, in computer graphics algorithms, and in computer vision algorithms. The function takes an incoming light direction, ω_i, and outgoing direction, ω_r (taken in a coordinate system where the surface normal n lies along the z-axis), and returns the ratio of reflected radiance exiting along ω_r to the irradiance incident on the surface from direction ω_i. Each direction ω is itself parameterized by azimuth angle φ and zenith angle θ, therefore the BRDF as a whole is a function of 4 variables. The BRDF has units sr^−1, with steradians (sr) being a unit of solid angle. Definition The BRDF was first defined by Fred Nicodemus around 1965. The definition is: f_r(ω_i, ω_r) = dL_r(ω_r) / dE_i(ω_i) = dL_r(ω_r) / (L_i(ω_i) cos θ_i dω_i), where L is radiance, or power per unit solid-angle-in-the-direction-of-a-ray per unit projected-area-perpendicular-to-the-ray, E is irradiance, or power per unit surface area, and θ_i is the angle between ω_i and the surface normal, n. The index i indicates incident light, whereas the index r indicates reflected light. The reason the function is defined as a quotient of two differentials and not directly as a quotient between the undifferentiated quantities is because irradiating light other than dE_i(ω_i), which is of no interest for f_r(ω_i, ω_r), might illuminate the surface, which would unintentionally affect L_r(ω_r), whereas dL_r(ω_r) is only affected by dE_i(ω_i). Related functions The Spatially Varying Bidirectional Reflectance Distribution Function (SVBRDF) is a 6-dimensional function, f_r(x, ω_i, ω_r), where x describes a 2D location over an object's surface. The Bidirectional Texture Function (BTF) is appropriate for modeling non-flat surfaces, and has the same parameterization as the SVBRDF; however in contrast, the BTF includes non-local scattering effects like shadowing, masking, interreflections or subsurface scattering. The functions defined by the BTF at each point on the surfa
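As a minimal illustration of the definition, the sketch below evaluates the simplest physically based BRDF, the Lambertian (perfectly diffuse) model, for which f_r is the constant albedo/π for every direction pair. The function names and numeric values are illustrative assumptions, not part of the article.

```python
import math

def lambertian_brdf(albedo):
    """Lambertian BRDF: constant albedo / pi, independent of the
    incoming and outgoing directions."""
    return albedo / math.pi

def reflected_radiance(albedo, irradiance):
    """L_r = f_r * E_i: radiance leaving the surface when a beam
    delivers irradiance E_i onto the surface (the differential
    definition collapses to this for a single collimated source)."""
    return lambertian_brdf(albedo) * irradiance

print(reflected_radiance(0.5, 100.0))
```

The 1/π factor is what makes the model energy-conserving: integrating f_r cos θ over the outgoing hemisphere yields exactly the albedo.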
https://en.wikipedia.org/wiki/Primary%20succession
Primary succession is the beginning step of ecological succession after an extreme disturbance, which usually occurs in an environment devoid of vegetation and other organisms. These environments are typically lacking in soil, as disturbances like lava flow or retreating glaciers scour the environment clear of nutrients. In contrast, secondary succession occurs on substrates that previously supported vegetation before an ecological disturbance. This occurs when smaller disturbances like floods, hurricanes, tornadoes, and fires destroy only the local plant life and leave soil nutrients for immediate establishment by intermediate community species. Occurrence In primary succession, pioneer species like lichen, algae and fungi, as well as abiotic factors like wind and water, start to "normalise" the habitat, or in other words start to develop soil and other important mechanisms for greater diversity to flourish. Primary succession begins on rock formations, such as volcanoes or mountains, or in a place with no organisms or soil. Primary succession leads to conditions nearer optimum for vascular plant growth; pedogenesis, or the formation of soil, and the increased amount of shade are the most important processes. These pioneer lichen, algae, and fungi are then dominated and often replaced by plants that are better adapted to less harsh conditions; these include vascular plants like grasses and some shrubs that are able to live in thin soils that are often mineral-based. Water and nutrient levels increase with the amount of succession exhibited. The early stages of primary succession are dominated by species with small propagules (seed and spores) which can be dispersed long distances. The early colonizers—often algae, fungi, and lichens—stabilize the substrate. Nitrogen supplies are limited in new soils, and nitrogen-fixing species tend to play an important role early in primary succession. Unlike in primary succession, the species that dominate secondary success
https://en.wikipedia.org/wiki/Porism
A porism is a mathematical proposition or corollary. It has been used to refer to a direct consequence of a proof, analogous to how a corollary refers to a direct consequence of a theorem. In modern usage, it is a relationship that holds for an infinite range of values but only if a certain condition is assumed, such as Steiner's porism. The term originates from three books of Euclid that have been lost. A proposition may not have been proven, so a porism may not be a theorem, or it may not be true. Origins The first book to discuss porisms is Euclid's lost Porisms. What is known of it comes from Pappus of Alexandria's Collection, which mentions it along with other geometrical treatises and gives several lemmas necessary for understanding it. Pappus states: The porisms of all classes are neither theorems nor problems, but occupy a position intermediate between the two, so that their enunciations can be stated either as theorems or problems, and consequently some geometers think that they are theorems, and others that they are problems, being guided solely by the form of the enunciation. But it is clear from the definitions that the old geometers understood better the difference between the three classes. The older geometers regarded a theorem as directed to proving what is proposed, a problem as directed to constructing what is proposed, and finally a porism as directed to finding what is proposed. Pappus said that the last definition was changed by certain later geometers, who defined a porism as an accidental characteristic, to leîpon hypothései topikoû theōrḗmatos: that which falls short of a locus-theorem by a (or in its) hypothesis. Proclus pointed out that the word porism was used in two senses: one sense is that of "corollary", as a result unsought but seen to follow from a theorem. In the other sense, he added nothing to the definition of "the older geometers", except to say that the finding of the center of a circle and the finding of the greatest common measure are
https://en.wikipedia.org/wiki/Hotline%20Communications
Hotline Communications Limited (HCL) was a software company founded in 1997, based in Toronto, Canada, with employees also in the United States and Australia. Hotline Communications' main activity was the publishing and distribution of a multi-purpose client/server communication software product named Hotline Connect, informally called, simply, Hotline. Initially, Hotline Communications sought a wide audience for its products, and organizations as diverse as Avid Technology, Apple Computer Australia, and public high schools used Hotline. At its peak, Hotline received millions of dollars in venture capital funding, grew to employ more than fifty people, served millions of users, and won accolades at trade shows and in newspapers and computer magazines around the world. Hotline eventually attracted more of an "underground" community, which saw it as an easier-to-use successor to the Internet Relay Chat (IRC) community. In 2001 Hotline Communications lost the bulk of its VC funding, and went out of business later that year. All of its assets were acquired in 2002 by Hotsprings, Inc., a new company formed by some ex-employees and shareholders. Hotsprings Inc. has since also abandoned development of the Hotline Connect software suite; the last iteration of Hotline Connect was released in December 2003. Currently, only a few servers and trackers remain but the Hotline community is still alive. History Hotline was designed in 1996 and known as "hotwire" by Australian programmer Adam Hinkley (known online by his username, "Hinks"), then 17 years old, as a classic Mac OS application. The source code for the Hotline applications was based on a class library, "AppWarrior" (AW), which Hinkley wrote. AppWarrior would later become the subject of litigation, as Hinkley wrote parts of it while he was employed by an Australian company, Redrock Holdings. Six other fans of Hotline, David Murphy, Alex King, Phil Hilton, Jason Roks, David Bordin, and Terrance Gregory, joined Adam Hinkley's efforts to
https://en.wikipedia.org/wiki/Morphogen
A morphogen is a substance whose non-uniform distribution governs the pattern of tissue development in the process of morphogenesis or pattern formation, one of the core processes of developmental biology, establishing positions of the various specialized cell types within a tissue. More specifically, a morphogen is a signaling molecule that acts directly on cells to produce specific cellular responses depending on its local concentration. Typically, morphogens are produced by source cells and diffuse through surrounding tissues in an embryo during early development, such that concentration gradients are set up. These gradients drive the process of differentiation of unspecialised stem cells into different cell types, ultimately forming all the tissues and organs of the body. The control of morphogenesis is a central element in evolutionary developmental biology (evo-devo). History The term was coined by Alan Turing in the paper "The Chemical Basis of Morphogenesis", where he predicted a chemical mechanism for biological pattern formation, decades before the formation of such patterns was demonstrated. The concept of the morphogen has a long history in developmental biology, dating back to the work of the pioneering Drosophila (fruit fly) geneticist, Thomas Hunt Morgan, in the early 20th century. Lewis Wolpert refined the morphogen concept in the 1960s with the French flag model, which described how a morphogen could subdivide a tissue into domains of different target gene expression (corresponding to the colours of the French flag). This model was championed by the leading Drosophila biologist, Peter Lawrence. Christiane Nüsslein-Volhard was the first to identify a morphogen, Bicoid, one of the transcription factors present in a gradient in the Drosophila syncytial embryo. She was awarded the 1995 Nobel Prize in Physiology or Medicine for her work explaining the morphogenic embryology of the common fruit fly. Groups led by Gary Struhl and Stephen Cohen then de
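Wolpert's French flag model described above can be sketched as a simple threshold rule: each cell reads the local morphogen concentration and adopts a fate according to which thresholds the concentration exceeds. The exponential gradient shape and the threshold values below are illustrative assumptions, not measurements from the article.

```python
import math

def morphogen_concentration(x, source_level=1.0, decay_length=0.2):
    """Exponentially decaying gradient from a source at x = 0."""
    return source_level * math.exp(-x / decay_length)

def cell_fate(concentration, high=0.5, low=0.1):
    """French flag rule: two thresholds divide the tissue into
    three domains of target gene expression."""
    if concentration >= high:
        return "blue"
    if concentration >= low:
        return "white"
    return "red"

# Positions along the tissue (0 = source end, 1 = far end).
tissue = [cell_fate(morphogen_concentration(x / 10)) for x in range(11)]
print(tissue)  # contiguous blue, then white, then red domains
```

Because concentration falls monotonically with distance from the source, the thresholds necessarily carve the tissue into contiguous domains, which is the essential point of the model.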
https://en.wikipedia.org/wiki/Ord%27s%20thyroiditis
Ord's thyroiditis is a common form of thyroiditis, an autoimmune disease where the body's own antibodies fight the cells of the thyroid. It is named after the physician, William Miller Ord, who first described it in 1877 and again in 1888. It is more common among women than men. It has historically been separated from Hashimoto's thyroiditis, which presents with goiters; however, some argue they each represent extremes of the same disease and should be classified together as a combined "Ord-Hashimoto's disease". Signs and symptoms Symptoms of Ord's thyroiditis include symptoms of hypothyroidism and atrophy of the thyroid gland. Pathophysiology Physiologically, antibodies to thyroid peroxidase and/or thyroglobulin cause gradual destruction of follicles in the thyroid gland. Accordingly, the disease can be detected clinically by looking for these antibodies in the blood. It is also characterised by invasion of the thyroid tissue by leukocytes, chiefly T-lymphocytes. Ord's thyroiditis usually results in hypothyroidism. Transient hyperthyroid states in the acute phase (a common observation in Hashimoto's thyroiditis) are rare in Ord's disease. Diagnosis Treatment Treatment is as with hypothyroidism, daily thyroxine (T4) and/or triiodothyronine (T3). Epidemiology Outside Europe a goitrous form of autoimmune thyroiditis (Hashimoto's thyroiditis) is more common than Ord's disease. See also Thyroiditis Hypothyroidism Hashimoto's thyroiditis
https://en.wikipedia.org/wiki/Facts%20and%20Arguments%20for%20Darwin
Facts and Arguments for Darwin is an 1864 book on evolutionary biology by the German biologist Fritz Müller, originally published in German under the title Für Darwin ("For Darwin"), and translated into English by William Sweetland Dallas in 1869. Müller argued that Charles Darwin's theory of evolution by natural selection, which Darwin had advanced in his book The Origin of Species only five years earlier, was correct, citing evidence that he had come across in Brazil. Müller states in the 'Author's Preface': It is not the purpose of the following pages to discuss once more the arguments deduced for and against Darwin's theory of the origin of species, or to weigh them one against the other. Their object is simply to indicate a few facts favourable to this theory, collected upon the same South American ground, on which, as Darwin tells us, the idea first occurred to him of devoting his attention to 'the origin of species — that mystery of mysteries'. It is only by the accumulation of new and valuable material that the controversy will gradually be brought into a state fit for final decision, and this appears to be for the present of more importance than a repeated analysis of what is already before us. Moreover, it is but fair to leave it to Darwin himself at first to beat off the attacks of his opponents from the splendid structure which he has raised with such a master-hand.
https://en.wikipedia.org/wiki/History%20of%20materials%20science
Materials science has shaped the development of civilizations since the dawn of mankind. Better materials for tools and weapons have allowed mankind to spread and conquer, and advancements in material processing like steel and aluminum production continue to impact society today. Historians have regarded materials as so important an aspect of civilizations that entire periods of time have been defined by the predominant material used (Stone Age, Bronze Age, Iron Age). For most of recorded history, control of materials had been through alchemy or empirical means at best. The study and development of chemistry and physics assisted the study of materials, and eventually the interdisciplinary study of materials science emerged from the fusion of these studies. The history of materials science is the study of how different materials were used and developed through the history of Earth and how those materials affected the culture of the peoples of the Earth. The term "Silicon Age" is sometimes used to refer to the modern period of history during the late 20th to early 21st centuries. Prehistory In many cases, different cultures leave their materials as the only records, which anthropologists can use to define the existence of such cultures. The progressive use of more sophisticated materials allows archeologists to characterize and distinguish between peoples. This is partially due to the major material of use in a culture and to its associated benefits and drawbacks. Stone-Age cultures were limited by which rocks they could find locally and by which they could acquire by trading. The use of flint around 300,000 BCE is sometimes considered the beginning of the use of ceramics. The use of polished stone axes marks a significant advance, because a much wider variety of rocks could serve as tools. The innovation of smelting and casting metals in the Bronze Age started to change the way that cultures developed and interacted with each other. Starting around 5,500 BCE,
https://en.wikipedia.org/wiki/Hyperrectangle
In geometry, a hyperrectangle (also called a box, hyperbox, or orthotope) is the generalization of a rectangle (a plane figure) and the rectangular cuboid (a solid figure) to higher dimensions. A necessary and sufficient condition is that it is congruent to the Cartesian product of finite intervals. If all of the edges are equal length, it is a hypercube. A hyperrectangle is a special case of a parallelotope. Types A four-dimensional orthotope is also called a hypercuboid. The special case of an n-dimensional orthotope where all edges have equal length is the n-cube or hypercube. By analogy, the term "hyperrectangle" can refer to Cartesian products of orthogonal intervals of other kinds, such as ranges of keys in database theory or ranges of integers, rather than real numbers. Dual polytope The dual polytope of an n-orthotope has been variously called a rectangular n-orthoplex, rhombic n-fusil, or n-lozenge. It is constructed from the 2n points located at the centers of the orthotope's rectangular faces. An n-fusil's Schläfli symbol can be represented by a sum of n orthogonal line segments: { } + { } + ... + { } or n{ }. A 1-fusil is a line segment. A 2-fusil is a rhombus. Its plane cross sections in all pairs of axes are rhombi. See also Minimum bounding box Cuboid Notes
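The Cartesian-product definition above translates directly into code: a hyperrectangle can be stored as a list of per-axis intervals, and membership and edge-length tests decompose axis by axis. A minimal sketch (the class name and helpers are invented for illustration):

```python
class Hyperrectangle:
    """Axis-aligned box: the Cartesian product of closed intervals."""

    def __init__(self, intervals):
        self.intervals = [(lo, hi) for lo, hi in intervals]

    def contains(self, point):
        """A point lies in the box iff each coordinate lies in
        the corresponding interval."""
        return all(lo <= x <= hi
                   for (lo, hi), x in zip(self.intervals, point))

    def is_hypercube(self):
        """All edges of equal length: the special case of the n-cube."""
        lengths = [hi - lo for lo, hi in self.intervals]
        return len(set(lengths)) == 1

box = Hyperrectangle([(0, 2), (0, 3), (0, 2)])  # a 3D rectangular cuboid
print(box.contains((1, 2.5, 0)), box.is_hypercube())  # True False
```

The same representation works unchanged for the integer or database-key "intervals" mentioned above, since only comparisons are used.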
https://en.wikipedia.org/wiki/Specific%20language%20impairment
Specific language impairment (SLI) (the term developmental language disorder is preferred by some) is diagnosed when a child's language does not develop normally and the difficulties cannot be accounted for by generally slow development, physical abnormality of the speech apparatus, autism spectrum disorder, apraxia, acquired brain damage or hearing loss. Twin studies have shown that it is under genetic influence. Although language impairment can result from a single-gene mutation, this is unusual. More commonly SLI results from the combined influence of multiple genetic variants, each of which is found in the general population, as well as environmental influences. Classification Specific language impairment (SLI) is diagnosed when a child has delayed or disordered language development for no apparent reason. Usually the first indication of SLI is that the child is later than usual in starting to speak and subsequently is delayed in putting words together to form sentences. Spoken language may be immature. In many children with SLI, understanding of language, or receptive language, is also impaired, though this may not be obvious unless the child is given a formal assessment. Although difficulties with use and understanding of complex sentences are a common feature of SLI, the diagnostic criteria encompass a wide range of problems, and for some children other aspects of language are problematic (see below). In general, the term SLI is reserved for children whose language difficulties persist into school age, and so it would not be applied to toddlers who are late to start talking, most of whom catch up with their peer group after a late start. Terminology The terminology for children's language disorders is extremely wide-ranging and confusing, with many labels that have overlapping but not necessarily identical meanings. In part this confusion reflects uncertainty about the boundaries of SLI, and the existence of different subtypes. Historically, the terms "dev
https://en.wikipedia.org/wiki/Synanthrope
A synanthrope (from ancient Greek σύν sýn "together, with" and ἄνθρωπος ánthrōpos "man") is an organism that lives near and benefits from humans and their environmental modifications (see also anthropophilia for animals who live close to humans as parasites). The term synanthrope includes many species regarded as pests or weeds, but does not include domesticated animals. Common synanthrope habitats include houses, gardens, farms, parks, roadsides and rubbish dumps. Zoology Examples of synanthropes are various insect species (ants, lice, silverfish, cockroaches, etc.), house sparrows, rock doves (pigeons), crows, various rodent species, Virginia opossums, raccoons, certain monkey species, coyotes, deer, passerines, and other urban wildlife. The brown rat is counted as one of the most prominent synanthropic animals and can be found in almost every place there are people. Botany Synanthropic plants include pineapple weed, dandelion, chicory, and plantain. Plant synanthropes are classified into two main types - apophytes and anthropophytes. Apophytes are synanthropic species that are native in origin. They can be subdivided into the following: Cultigen apophytes – spread by cultivation methods Ruderal apophytes – spread by development of marginal areas Pyrophyte apophytes – spread by fires Zoogen apophytes – spread by grazing animals Substitution apophytes – spread by logging or voluntary extension Anthropophytes are synanthropic species of foreign origin, whether introduced voluntarily or involuntarily. They can be subdivided into the following: Archaeophytes – introduced before the end of the 15th century Kenophytes – introduced after the 15th century Ephemerophytes – anthropophytic plants that appear episodically Subspontaneous – voluntarily introduced plants that have escaped cultivation and survived in the wild without further human intervention for a certain period. Adventive – involuntarily introduced plants that have escaped cultivation and survived in t
https://en.wikipedia.org/wiki/Quaternium-15
Quaternium-15 (systematic name: hexamethylenetetramine chloroallyl chloride) is a quaternary ammonium salt that has been used as a surfactant and preservative. It acts as an antimicrobial agent because it slowly releases formaldehyde, which is a preservative with biocidal properties. Both quaternium-15 and formaldehyde-releasing agents have been the subjects of controversy, and both have faced bans or restrictions in the US and Europe. It can be found under a variety of names, including Dow Chemical Company: Dowicil 200 (cis isomer only), Dowicil 75 and Dowicil 100 (both a mix of cis and trans isomers). Synthesis Quaternium-15 can be prepared by treating hexamethylenetetramine with 1,3-dichloropropene. A mixture of cis and trans isomers is produced. Applications The isolated cis-compound is used primarily in cosmetic applications, with a maximum permitted concentration in the EU of 0.2%. The mixed product (cis- and trans-) is used in a wider range of formulations such as: emulsifiable metal-cutting fluids, latex and emulsion paints, liquid floor polishes and floor waxes, and glues and adhesives. Safety concerns Quaternium-15 has been banned in the EU since 2017, and a bill was introduced in the US in 2017 to require the FDA to investigate its safety. Allergic reaction Quaternium-15 is an allergen and can cause dermatitis. Many of those with an allergy to quaternium-15 are also allergic to formaldehyde. At low pHs, it would be expected to release significant amounts of formaldehyde due to acid hydrolysis via the Delépine reaction. Allergic sensitivity to quaternium-15 can be detected using a patch test. It is the single most often found cause of allergic contact dermatitis of the hands (16.5% in 959 cases). In 2005–06, it was the fourth-most-prevalent allergen in patch tests (10.3%). Quaternium-15 releases only low amounts of formaldehyde; even so, Johnson & Johnson announced plans to phase out its use of quaternium-15 in cosmetic products by 2015 in response to consumer pr
https://en.wikipedia.org/wiki/Software%20deployment
Software deployment is all of the activities that make a software system available for use. The general deployment process consists of several interrelated activities with possible transitions between them. These activities can occur on the producer side or on the consumer side or both. Because every software system is unique, the precise processes or procedures within each activity can hardly be defined. Therefore, "deployment" should be interpreted as a general process that has to be customized according to specific requirements or characteristics. History When computers were extremely large, expensive, and bulky (mainframes and minicomputers), the software was often bundled together with the hardware by manufacturers. If business software needed to be installed on an existing computer, this might require an expensive, time-consuming visit by a systems architect or a consultant. For complex, on-premises installation of enterprise software today, this can still sometimes be the case. However, with the development of mass-market software for the new age of microcomputers in the 1980s came new forms of software distribution: first cartridges, then Compact Cassettes, then floppy disks, then (in the 1990s and later) optical media, the internet and flash drives. This meant that software deployment could be left to the customer. However, it was also increasingly recognized over time that configuration of the software by the customer was important and that this should ideally have a user-friendly interface (rather than, for example, requiring the customer to edit registry entries on Windows). In pre-internet software deployments, deployments (and their closely related cousin, new software releases) were of necessity expensive, infrequent, bulky affairs. It is arguable therefore that the spread of the internet made end-to-end agile software development possible. Indeed, the advent of cloud computing and software as a service meant that software could be deployed to a
https://en.wikipedia.org/wiki/Lysochrome
A lysochrome is a soluble dye used for histochemical staining of lipids, which include triglycerides, fatty acids, and lipoproteins. Lysochromes such as Sudan IV dissolve in the lipid and show up as colored regions. The dye does not stick to any other substrates, so a quantification or qualification of lipid presence can be obtained. The name was coined by John Baker (biologist) in his book "Principles of Biological Microtechnique", published in 1958, from the Greek words lysis (solution) and chroma (colour).
https://en.wikipedia.org/wiki/Alloimmunity
Alloimmunity (sometimes called isoimmunity) is an immune response to nonself antigens from members of the same species, which are called alloantigens or isoantigens. Two major types of alloantigens are blood group antigens and histocompatibility antigens. In alloimmunity, the body creates antibodies (called alloantibodies) against the alloantigens, attacking transfused blood, allotransplanted tissue, and even the fetus in some cases. Alloimmune (isoimmune) response results in graft rejection, which is manifested as deterioration or complete loss of graft function. In contrast, autoimmunity is an immune response to the self's own antigens. (The allo- prefix means "other", whereas the auto- prefix means "self".) Alloimmunization (isoimmunization) is the process of becoming alloimmune, that is, developing the relevant antibodies for the first time. Alloimmunity is caused by the difference between products of highly polymorphic genes, primarily genes of the major histocompatibility complex, of the donor and graft recipient. These products are recognized by T-lymphocytes and other mononuclear leukocytes which infiltrate the graft and damage it. Types of the rejection Transfusion reaction Blood transfusion can result in alloantibodies reacting towards the transfused cells, resulting in a transfusion reaction. Even with standard blood compatibility testing, there is a risk of reaction against human blood group systems other than ABO and Rh. Hemolytic disease of the fetus and newborn Hemolytic disease of the fetus and newborn is similar to a transfusion reaction in that the mother's antibodies cannot tolerate the fetus's antigens, which happens when the immune tolerance of pregnancy is impaired. In many instances the maternal immune system attacks the fetal blood cells, resulting in fetal anemia. HDN ranges from mild to severe. Severe cases require intrauterine transfusions or early delivery to survive, while mild cases may only require phototherapy at birth. Transplan
https://en.wikipedia.org/wiki/Anthrax%20hoaxes
Anthrax hoaxes involving the use of white powder or labels to falsely suggest the use of anthrax are frequently reported in the United States and globally. Hoaxes have increased following the 2001 anthrax attacks, after which no genuine anthrax attacks have occurred. The FBI and U.S. postal inspectors have responded to thousands of "white powder events" and targets have included government offices, US embassies, banks and news organizations. History Anthrax hoaxes were sporadically reported in the 1990s, including a petri dish in an envelope labeled "anthrachs"[sic] sent to B'nai B'rith in Washington in 1997 that contained harmless Bacillus cereus, but a spate of anthrax threats followed the 1998 arrest of Larry Wayne Harris, a microbiologist and white supremacist. Harris released what he said was military-grade anthrax but was actually a harmless vaccine strain, but news coverage popularized the idea of anthrax among hoaxers. In response to these hoaxes, the CDC released guidance for public health authorities for handling bioterrorism threats. Post-2001 In the month following the 2001 anthrax attacks, hundreds of hoaxes were reported worldwide. Legislation was enacted in the UK in October 2001 so that anyone convicted of a hoax involving threats of biological, chemical, nuclear or radioactive contamination would face a seven-year prison sentence. The Anti-Hoax Terrorism Act 2001 was passed by the US House of Representatives but never enacted, and legislation making terrorism hoaxes a federal offence was finally passed as part of the Intelligence Reform and Terrorism Prevention Act of 2004. Cases One of the most prolific hoaxers was Clayton Waagner, an anti-abortion activist who mailed hundreds of anthrax hoax letters to abortion clinics in late 2001 and who was convicted in December 2003. A Sacramento man, Marc M. Keyser, admitted to sending around 120 packages marked as containing anthrax in October 2008, which he says was to highlight the lack of preparedness
https://en.wikipedia.org/wiki/Electronic%20game
An electronic game is a game that uses electronics to create an interactive system with which a player can play. Video games are the most common form today, and for this reason the two terms are often used interchangeably. There are other common forms of electronic game including handheld electronic games, standalone systems (e.g. pinball, slot machines, or electro-mechanical arcade games), and exclusively non-visual products (e.g. audio games). Teletype games The earliest form of computer game to achieve any degree of mainstream use was the text-based Teletype game. Teletype games lack video display screens and instead present the game to the player by printing a series of characters on paper which the player reads as it emerges from the platen. Practically, this means that each action taken requires a line of paper, and thus a hard-copy record of the game remains after it has been played. This naturally tends to reduce the size of the gaming universe or, alternatively, to require a great amount of paper. As computer screens became standard during the rise of the third-generation computer, text-based command line-driven language parsing Teletype games transitioned into visual interactive fiction, allowing for greater depth of gameplay and reduced paper requirements. This transition was accompanied by a simultaneous shift from the mainframe environment to the personal computer. Several of these subsequently were ported to systems with video displays, eliminating the need for a teletype printer. Examples of text-based Teletype games include: The Oregon Trail (1971) Trek73 (1973) Dungeon (1975) Super Star Trek (1975) Colossal Cave Adventure (1976) Zork (1977) Electronic handhelds Handheld electronic games, the earliest form of dedicated console, are characterized by their size and portability. Used to play interactive games, handheld electronic games are often miniaturized versions of video games. The controls, display and speakers are all part of a single unit, an
https://en.wikipedia.org/wiki/Donald%20B.%20Gillies
Donald Bruce Gillies (October 15, 1928 – July 17, 1975) was a Canadian computer scientist and mathematician who worked in the fields of computer design, game theory, and minicomputer programming environments. Early life and education Donald B. Gillies was born in Toronto, Ontario, Canada, to John Zachariah Gillies (a Canadian) and Anne Isabelle Douglas MacQueen (an American). He attended the University of Toronto Schools, a laboratory school originally affiliated with the university. Gillies attended the University of Toronto from 1946 to 1950, majoring in mathematics. He began his graduate education at the University of Illinois and helped with the checkout of the ORDVAC computer in the summer of 1951. After one year he transferred to Princeton to work for John von Neumann and developed the first theorems of the core (game theory) in his PhD thesis. Gillies ranked among the top ten participants in the William Lowell Putnam Mathematical Competition held in 1950. Career Gillies moved to England for two years to work for the National Research Development Corporation. He returned to the US in 1956, married Alice E. Dunkle, and began a job as a professor at the University of Illinois at Urbana-Champaign. Starting in 1957, Gillies designed the three-stage pipeline control of the ILLIAC II supercomputer at the University of Illinois. The pipelined stages were named "advanced control", "delayed control", and "interplay". This work competed with the IBM 7030 Stretch computer and was in the public domain. Gillies presented a talk on ILLIAC II at the University of Michigan Engineering Summer Conference in 1962. During checkout of ILLIAC II, Gillies found three new Mersenne primes, one of which was the largest prime number known at the time. Death and legacy Gillies died unexpectedly at age 46 on July 17, 1975, of a rare viral myocarditis. In 1975, the Donald B. Gillies Memorial lecture was established at the University of Illinois, with one leading researcher from compu
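The Mersenne primes Gillies found are numbers of the form 2^p − 1, and the standard way such candidates are tested in computer searches is the Lucas–Lehmer test. The following Python sketch illustrates the method only; it is not ILLIAC II's actual program, which is not reproduced in this article:

```python
def is_mersenne_prime(p: int) -> bool:
    """Lucas-Lehmer test: for odd prime p, M = 2**p - 1 is prime
    iff s_(p-2) == 0, where s_0 = 4 and s_(k+1) = s_k**2 - 2 (mod M)."""
    if p == 2:
        return True  # M_2 = 3 is prime; the iteration below needs p > 2
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0
```

Each candidate costs about p modular squarings of p-bit numbers, which is why finding primes such as 2^11213 − 1 was a serious stress test for a new machine.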
https://en.wikipedia.org/wiki/Smoked%20salt
Smoked salt is an aromatic salt smoked with any number of select bark free woods for up to 14 days. The kind of wood used for smoking impacts the flavor, which can range from subtle to bold or even sweet. The most common choices are alder, apple wood, hickory, mesquite, and oak. Infused smoked salts like smoked bacon chipotle sea salt are very popular because of their dynamic flavor profiles. Smoked salt is used to enhance the inherent flavors of a dish, while also imparting a smoky taste. It is suitable for vegetarians, often acting as a replacement for bacon crumbles. Smoked salt differs from smoke-flavored salt, as the latter contains a smoke-flavored additive, and is not classified as a natural salt product. See also List of smoked foods Edible salt Smoked food
https://en.wikipedia.org/wiki/Legion%20%28demons%29
"Legion" means a large group; in another parlance it may simply mean "many". In the Christian Bible, it is used to refer to a group of demons, particularly those in two of three versions of the exorcism of the Gerasene demoniac, an account in the New Testament of an incident in which Jesus performs an exorcism. Development of the story The earliest version of this story exists in the Gospel of Mark, described as taking place in "the country of the Gerasenes". Jesus encounters a possessed man and calls on the demon to emerge, demanding to know its name – an important element of traditional exorcism practice. He finds the man is possessed by a multitude of demons who give the collective name of "Legion". Fearing that Jesus will drive them out of the world and into the abyss, they beg him instead to cast them into a herd of pigs on a nearby hill, which he does. The pigs then rush into the sea and are drowned. This story is also in the other two Synoptic Gospels. The Gospel of Luke shortens the story but retains most of the details, including the name. The Gospel of Matthew shortens it more dramatically, changes the possessed man to two men (a particular stylistic device of this writer) and changes the location to "the country of the Gadarenes". This is probably because the author was aware that Gerasa is actually around 50 km away from the Sea of Galilee (although Gadara is still 10 km distant). In this version, the demons are unnamed. Cultural background According to Michael Willett Newheart, professor of New Testament Language and Literature at the Howard University School of Divinity (2004), the author of the Gospel of Mark could well have expected readers to associate the name Legion with the Roman military formation, active in the area at the time (around 70 AD). The intention may be to show that Jesus is stronger than the occupying force of the Romans. The Biblical scholar Seyoon Kim, however, points out that the Latin legio was commonly used as a loanwor
https://en.wikipedia.org/wiki/Fluorescence%20in%20situ%20hybridization
Fluorescence in situ hybridization (FISH) is a molecular cytogenetic technique that uses fluorescent probes that bind only to particular parts of a nucleic acid sequence with a high degree of sequence complementarity. It was developed by biomedical researchers in the early 1980s to detect and localize the presence or absence of specific DNA sequences on chromosomes. Fluorescence microscopy can be used to find out where the fluorescent probe is bound to the chromosomes. FISH is often used for finding specific features in DNA for use in genetic counseling, medicine, and species identification. FISH can also be used to detect and localize specific RNA targets (mRNA, lncRNA and miRNA) in cells, circulating tumor cells, and tissue samples. In this context, it can help define the spatial-temporal patterns of gene expression within cells and tissues. Probes – RNA and DNA In biology, a probe is a single strand of DNA or RNA that is complementary to a nucleotide sequence of interest. RNA probes can be designed for any gene or any sequence within a gene for visualization of mRNA, lncRNA and miRNA in tissues and cells. FISH is applied by examining cells in interphase of the cell cycle, inspecting their nuclei for chromosomal abnormalities. FISH makes it much easier to analyze large series of archival cases and to identify the targeted chromosome, by creating a probe with an artificial chromosomal foundation that will attract similar chromosomes. A hybridization signal appears for each probe when a nucleic acid abnormality is detected. Each probe for the detection of mRNA and lncRNA is composed of ~20–50 oligonucleotide pairs, each pair covering a space of 40–50 bp. The specifics depend on the specific FISH technique used. For miRNA detection, the probes use proprietary chemistry for specific detection of miRNA and cover the entire miRNA sequence. Probes are often derived from fragments of DNA that were isolated, purified, and amplified for use in the Human Genome Projec
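The Watson–Crick complementarity that makes a probe specific can be illustrated with a toy calculation. This is not part of any FISH protocol; the sequences and function names below are invented for illustration:

```python
# Map each DNA base to its Watson-Crick complement.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Complement every base, then reverse (strands anneal antiparallel)."""
    return seq.translate(COMPLEMENT)[::-1]

def hybridizes(probe: str, target: str) -> bool:
    """A probe binds where the target contains its reverse complement."""
    return reverse_complement(probe) in target
```

For example, a probe "ATGC" anneals to any target strand containing "GCAT"; real probes are of course far longer (the article notes ~20–50 oligonucleotide pairs per mRNA probe), which is what gives them high specificity.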
https://en.wikipedia.org/wiki/Pericycle
The pericycle is a cylinder of parenchyma or sclerenchyma cells that lies just inside the endodermis and is the outermost part of the stele of plants. Although it is composed of non-vascular parenchyma cells, it is still considered part of the vascular cylinder because it arises from the procambium, as do the vascular tissues it surrounds. In eudicots, it also has the capacity to produce lateral roots. Branch roots arise from this primary meristem tissue. In plants undergoing secondary growth, the pericycle contributes to the vascular cambium, often diverging into a cork cambium. In angiosperms, certain molecules within the endodermis and the surrounding vasculature are sent to the pericycle, promoting the growth of the root meristems. Location The pericycle is located between the endodermis and phloem in plant roots. In dicot stems, it is situated around the ring of vascular bundles in the stele. Function In dicot roots, the pericycle strengthens the roots and provides protection for the vascular bundles. In dicot roots, the vascular cambium is completely secondary in origin, and it originates from a portion of pericycle tissue. The pericycle regulates the formation of lateral roots by rapidly dividing near the xylem elements of the root. It is often confused with other parts of the plant; however, its unique ring structure allows it to be more easily identified. Past efforts to isolate such tissue have been successful. Monocot roots rarely branch, but can, and such branches originate from the pericycle.
https://en.wikipedia.org/wiki/Lingual%20tonsils
The lingual tonsils are a collection of lymphatic tissue located in the lamina propria of the root of the tongue. This lymphatic tissue consists of lymphatic nodules rich in cells of the immune system (immunocytes). The immunocytes initiate the immune response when the lingual tonsils come into contact with invading microorganisms (pathogenic bacteria, viruses or parasites). Structure Microanatomy Lingual tonsils are covered externally by stratified squamous nonkeratinized epithelium that invaginates inward, forming crypts. Beneath the epithelium is a layer of lymphoid nodules containing lymphocytes. Mucous glands located at the root of the tongue are drained through several ducts into the crypts of the lingual tonsils. Secretions of these mucous glands keep the crypts clean and free of any debris. Blood supply Lingual tonsils are located on the posterior aspect of the tongue, which is supplied through: Lingual artery, branch of external carotid artery Tonsillar branch of facial artery Ascending and descending palatine arteries Ascending pharyngeal branch of external carotid artery Nerve supply Lingual tonsils are innervated by tonsillar nerves from the tonsillar plexus, formed by the glossopharyngeal and vagus nerves. Function Like other lymphatic tissues, the function of lingual tonsils is to prevent infections. These tonsils contain B and T lymphocytes which get activated when harmful bacteria and viruses come in contact with the tonsils. B lymphocytes kill pathogens by producing antibodies against them, while T lymphocytes kill them directly by releasing cytotoxic substances, or indirectly by stimulating other cells of the immune system. Clinical significance Cancer Squamous cell carcinoma is a type of neoplasm that can affect lingual tonsils. Sleep apnea Enlarged or hypertrophic lingual tonsils have the potential to cause or exacerbate sleep apnea. External links Pictures at usc.edu (labeled as 'lymphoid tissue')
https://en.wikipedia.org/wiki/List%20of%20Brazilian%20flags
This article is a list of Brazilian flags. National Flags Government flags Ministries Imperial standards of Brazil Diplomatic services flags Military flags Brazilian Army Brazilian Navy Police flags First-level administrative divisions This list shows the flags of the 26 Brazilian States and the Federal District. History Municipalities Political flags Separatist movements flags Ethnic groups flags Historical flags Proposed flags House flags of Brazilian freight companies Yacht clubs of Brazil See also Flag of Brazil Hino Nacional Brasileiro
https://en.wikipedia.org/wiki/Jasper%20Ridge%20Biological%20Preserve
The Jasper Ridge Biological Preserve is a nature preserve and biological field station formally established as a reserve in 1973. The biological preserve is owned by Stanford University, and is part of the Stanford School of Humanities and Sciences. It is located south of Sand Hill Road and west of Interstate 280 in Portola Valley, San Mateo County, California. It is used by students, researchers, and docents to conduct biology research, and teach the community about the importance of that research. The preserve encompasses Jasper Ridge and Searsville Lake (actually a reservoir) and the upper reaches of San Francisquito Creek, along with the latter's Corte Madera Creek and Bear Creek tributaries. Geology Jasper Ridge is part of the foothills northeast of the Santa Cruz Mountains and is bounded by San Francisquito Creek, Corte Madera Creek and Los Trancos Creek, although the preserve occupies only the northwestern half of the ridge. The hilly mass runs about ten kilometers from northwest to southeast and about half that in width. Serpentine (serpentinite) is the California state rock. It was formed from deep-sea or mantle rocks; this rock was squeezed toward the surface by tectonic plate movement, and it feels greasy, as it has been polished over millions of years. Graywacke sandstone is exposed after crossing Leonard's Bridge; this sandstone was part of the Franciscan formation 138 million years ago. Some rocks found at the preserve include greenstone, chert, serpentinite, and sandstone. Ecology In 1922, Cooper asserted that Jasper Ridge was historically chaparral, cleared in the nineteenth century to open grasslands of primarily Eurasian wild oats (Avena fatua and Avena barbata). However, much of the grassland has been replaced by various oaks, especially Coast Live Oak (Quercus agrifolia), and Pacific Madrone (Arbutus menziesii). More recently, the oak/madrone forest is being succeeded by specimens of large Douglas fir (Pseudotsuga menziesii) as in the image above.
https://en.wikipedia.org/wiki/Steve%20Chen%20%28computer%20engineer%29
Steve Chen (pinyin: Chén Shìqīng) (born 1944 in Taiwan) is a Taiwanese computer engineer and internet entrepreneur. Chen was elected to the US National Academy of Engineering in 1991 for leadership in the development of supercomputer architectures and their realization. Life Chen earned a BS from National Taiwan University in 1966, an MS from Villanova University in 1971, and a PhD under David Kuck from the University of Illinois at Urbana-Champaign in 1975. From 1975 through 1978 he worked for Burroughs Corporation on the design of the Burroughs large systems line of supercomputers. He is best known as the principal designer of the Cray X-MP and Cray Y-MP multiprocessor supercomputers. Chen left Cray Research in September 1987 after it dropped the MP line. With IBM's financial support, Chen founded Supercomputer Systems Incorporated (SSI) in January 1988. SSI was devoted to development of the SS-1 supercomputer, which was nearly completed before the estimated $150 million investment ran out. The Eau Claire, Wisconsin-based company went bankrupt in early 1993, leaving more than 300 employees jobless. An attempt to salvage the work was made by forming a new company, SuperComputer International (SCI), later that year. SCI was renamed Chen Systems in 1995. It was acquired by Sequent Computer Systems the following year. John Markoff, a technology journalist, wrote in the New York Times that Chen was considered "one of the nation's most brilliant supercomputer designers while working in this country for the technology pioneer Seymour Cray in the 1980s." In 1999, Chen became founder and CEO of Galactic Computing, a developer of supercomputing blade systems, based in Shenzhen, China. At Tonbu, Inc., his team designed and implemented the world's first fully scalable cloud computing system, a fully scalable dynamic process and application engine. By 2005 he started to focus on grid computing to model a human brain instead. By 2010, he was reported to be working on
https://en.wikipedia.org/wiki/A%20Mathematical%20Theory%20of%20Natural%20and%20Artificial%20Selection
A Mathematical Theory of Natural and Artificial Selection is the title of a series of scientific papers by the British population geneticist J.B.S. Haldane, published between 1924 and 1934. Haldane outlines the first mathematical models for many cases of evolution due to selection, an important concept in the modern synthesis of Darwin's theory with Mendelian genetics. Overview The papers were published in ten parts over ten years in three different journals.
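To give a flavor of the kind of model these papers introduced, consider selection at a haploid locus where one allele has a fitness advantage s. The sketch below uses modern notation as an illustration; it is not drawn from Haldane's papers themselves:

```python
def next_freq(p: float, s: float) -> float:
    """One generation of haploid selection: the favored allele has
    relative fitness 1 + s, so its frequency becomes p(1+s) / (1 + s*p),
    where 1 + s*p is the population's mean fitness."""
    return p * (1 + s) / (1 + s * p)

def generations_to_reach(p0: float, target: float, s: float) -> int:
    """Iterate the recursion until the favored allele reaches `target`."""
    p, n = p0, 0
    while p < target:
        p = next_freq(p, s)
        n += 1
    return n
```

Iterating such recursions lets one ask how fast selection of a given strength changes gene frequencies, the quantitative question these papers made precise for many modes of inheritance.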
https://en.wikipedia.org/wiki/National%20Centre%20for%20Radio%20Astrophysics
The National Centre for Radio Astrophysics (NCRA; Hindi: राष्ट्रीय रेडियो खगोल भौतिकी केन्द्र) is a research institution in India in the field of radio astronomy. It is located on the Pune University campus (just beside IUCAA) and is part of the Tata Institute of Fundamental Research, Mumbai, India. NCRA has an active research program in many areas of astronomy and astrophysics, including studies of the Sun, interplanetary scintillations, pulsars, the interstellar medium, active galaxies and cosmology, and particularly the specialized fields of radio astronomy and radio instrumentation. NCRA also provides opportunities and challenges in engineering fields such as analog and digital electronics, signal processing, antenna design, telecommunication and software development. NCRA has set up the Giant Metrewave Radio Telescope (GMRT), the world's largest telescope operating at metre wavelengths, located at Khodad, 80 km from Pune. NCRA also operates the Ooty Radio Telescope (ORT), a large cylindrical telescope located near Udhagamandalam, India. History The Centre has its roots in the Radio Astronomy Group of TIFR, set up in the early 1960s under the leadership of Govind Swarup. The group designed and built the Ooty Radio Telescope. In the early 1980s an ambitious plan for a new telescope was proposed: the Giant Metrewave Radio Telescope. Since the site chosen for this new telescope was close to Pune, a new home for the group was built in the scenic campus of Pune University. The radio astronomy group morphed into the National Centre for Radio Astrophysics around this time. Research The National Centre for Radio Astrophysics of the Tata Institute of Fundamental Research (NCRA-TIFR) is an institute for radio astronomy in India. Research activities at NCRA-TIFR are centered on low frequency radio astronomy, with research in a wide range of areas, including solar physics, pulsars, active galactic nuclei, the interstellar medium, supernova remnants, the G
https://en.wikipedia.org/wiki/Amazon%20Web%20Services
Amazon Web Services, Inc. (AWS) is a subsidiary of Amazon that provides on-demand cloud computing platforms and APIs to individuals, companies, and governments, on a metered, pay-as-you-go basis. Clients will often use this in combination with autoscaling (a process that allows a client to use more computing in times of high application usage, and then scale down to reduce costs when there is less traffic). These cloud computing web services provide various services related to networking, compute, storage, middleware, IoT and other processing capacity, as well as software tools via AWS server farms. This frees clients from managing, scaling, and patching hardware and operating systems. One of the foundational services is Amazon Elastic Compute Cloud (EC2), which allows users to have at their disposal a virtual cluster of computers, with extremely high availability, which can be interacted with over the internet via REST APIs, a CLI or the AWS console. AWS's virtual computers emulate most of the attributes of a real computer, including hardware central processing units (CPUs) and graphics processing units (GPUs) for processing; local/RAM memory; hard-disk/SSD storage; a choice of operating systems; networking; and pre-loaded application software such as web servers, databases, and customer relationship management (CRM). AWS services are delivered to customers via a network of AWS server farms located throughout the world. Fees are based on a combination of usage (known as a "pay-as-you-go" model) and the hardware, operating system, software, and networking features chosen by the subscriber, together with required availability, redundancy, security, and service options. Subscribers can pay for a single virtual AWS computer, a dedicated physical computer, or clusters of either. Amazon provides select portions of security for subscribers (e.g. physical security of the data centers) while other aspects of security are the responsibility of the subscriber (e.g. account management, vulnerabil
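The autoscaling idea described above reduces to a small rule: choose an instance count that brings average utilization back toward a target. The sketch below is purely illustrative; it is not the AWS API, and the function name, target value, and bounds are all invented for this example:

```python
import math

def desired_instances(current: int, utilization: float,
                      target: float = 0.5,
                      min_n: int = 1, max_n: int = 20) -> int:
    """Proportional scaling: if instances run hotter than `target`
    average utilization, add capacity; if cooler, shed it.
    The result is clamped to fixed bounds to avoid runaway scaling."""
    wanted = math.ceil(current * utilization / target)
    return max(min_n, min(max_n, wanted))
```

For example, 4 instances at 75% utilization scale out to 6, while 4 instances at 25% scale in to 2; under metered billing, scaling in directly reduces cost, which is the point of combining pay-as-you-go pricing with autoscaling.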
https://en.wikipedia.org/wiki/Zoo%20%28file%20format%29
zoo is a data compression program and format developed by Rahul Dhesi in the mid-1980s. The format is based on the LZW compression algorithm and compressed files are identified by the .zoo file extension. It is no longer widely used. Program source code was originally published on the comp.sources.misc Usenet newsgroup, and was compatible with a variety of Unix-like operating systems. Binaries were also published for the MS-DOS and AmigaOS user communities. Zoo features Zoo archives can store multiple "generations" of a file: if generations are enabled for the archive and a file is added with the same pathname but a more recent date, the older version(s) will be retained (with a semicolon and version number, similar to version numbers in the VMS and RT-11 operating systems) as the new file is added. This allows files that are frequently modified to be backed up in such a way as to allow access to previous versions (up to the version limit chosen) from one archive. External links zoo 2.10 source unzoo - zoo archive extractor, source included
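The generations scheme described above can be modeled in a few lines. This is a toy in-memory model of the naming convention only (bare name for the newest version, "name;N" for older generations), not the actual zoo on-disk format:

```python
def add_file(archive: dict, path: str, data: bytes, generations: int = 3) -> None:
    """Add a new version of `path`, keeping at most `generations` versions
    (the chosen version limit), oldest discarded first."""
    versions = archive.setdefault(path, [])
    versions.append(data)
    del versions[:-generations]  # enforce the version limit

def list_entries(archive: dict) -> list:
    """List entries: the newest version under the bare name, older
    generations suffixed ';N' in the style of VMS/RT-11 version numbers."""
    names = []
    for path, versions in archive.items():
        names.append(path)
        for n, _ in enumerate(reversed(versions[:-1]), start=1):
            names.append(f"{path};{n}")
    return names
```

Adding "a.txt" twice, for instance, yields the entries "a.txt" and "a.txt;1", so a frequently modified file can be backed up repeatedly into one archive while earlier versions remain retrievable.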
https://en.wikipedia.org/wiki/HijackThis
HijackThis (also HiJackThis or HJT) is a free and open-source tool to detect malware and adware on Microsoft Windows. It was originally created by Merijn Bellekom, and later sold to Trend Micro. The program is notable for quickly scanning a user's computer to display the most common locations of malware, rather than relying on a database of known spyware. HijackThis is used primarily for diagnosis of malware, not to remove or detect spyware—as uninformed use of its removal facilities can cause significant software damage to a computer. Browser hijacking can cause malware to be installed on a computer. On February 16, 2012, Trend Micro released the HijackThis source code as open source and it is now available on the SourceForge site. Use HijackThis can generate a plain-text logfile detailing all entries it finds, and some entries can be fixed by HijackThis. Inexperienced users are advised to exercise caution or seek help when using the latter option. Except for a small whitelist of known safe entries, HijackThis does not discriminate between legitimate and unwanted items. HijackThis attempts to create backups of the files and registry entries that it fixes, which can be used to restore the system in the event of a mistake. A common use is to post the logfile to a forum where more experienced users can help decipher which entries need to be removed. Automated tools also exist that analyze saved logs and attempt to provide recommendations to the user, or to clean entries automatically. Use of such tools, however, is generally discouraged by those who specialize in manually dealing with HijackThis logs: they consider the tools dangerous for inexperienced users, and neither accurate nor reliable enough to substitute for consulting with a trained human analyst. Later versions of HijackThis include such additional tools as a task manager, a hosts-file editor, and an alternate-data-stream scanner. HijackThis reached end-of-life in 2013 and is no longer developed. Howe
https://en.wikipedia.org/wiki/Motoko%20Kusanagi
Major , or just "Major", is the main protagonist in Masamune Shirow's Ghost in the Shell manga and anime series. She is a synthetic "full-body prosthesis" augmented-cybernetic human employed as the field commander of Public Security Section 9, a fictional anti-cybercrime law-enforcement division of the Japanese National Public Safety Commission. A strong-willed, physically powerful, and highly intelligent cyberhero, she is well known for her skills in deduction, hacking and military tactics. Conception and creation Motoko Kusanagi's body was designed by the manga author and artist Masamune Shirow to be a mass production model so she would not be conspicuous. Her electrical and mechanical system within is special and features parts unavailable on the civilian market. Shirow intentionally chose this appearance so Motoko would not be harvested for those parts. Character In the 1995 anime film adaptation, character designer and key animator supervisor Hiroyuki Okiura made her different from her original manga counterpart, stating, "Motoko Kusanagi is a cyborg. Therefore, her body is strong and youthful. However her human mentality is considerably older than she looks. I tried to depict this maturity in her character instead of the original girl created by Masamune Shirow." In nearly all portrayals, Kusanagi is depicted as a self-made woman. She is a fiercely independent and capable leader who has proven herself under fire countless times. Kenji Kamiyama had a difficult time identifying her and could not understand her motives during the first season of the anime series Ghost in the Shell: Stand Alone Complex. Due to this, he created an episode in the second season where he recounted her past. He was then able to describe her as a human who was chosen to gain this superhuman power; she probably believes that she has an obligation to use that ability for the benefit of others. English voice actor and director Mary Elizabeth McGlynn states she loved playing the role of
https://en.wikipedia.org/wiki/Yuri%20Manin
Yuri Ivanovich Manin (; 16 February 1937 – 7 January 2023) was a Russian mathematician, known for work in algebraic geometry and diophantine geometry, and many expository works ranging from mathematical logic to theoretical physics. Life and career Manin was born on 16 February 1937 in Simferopol, Crimean ASSR, Soviet Union. He received a doctorate in 1960 at the Steklov Mathematics Institute as a student of Igor Shafarevich. He became a professor at the Max-Planck-Institut für Mathematik in Bonn, where he was director from 1992 to 2005 and then director emeritus. He was also a professor emeritus at Northwestern University. He had over the years more than 40 doctoral students, including Vladimir Berkovich, Mariusz Wodzicki, Alexander Beilinson, Ivan Cherednik, Alexei Skorobogatov, Vladimir Drinfeld, Mikhail Kapranov, Vyacheslav Shokurov, Ralph Kaufmann, Arend Bayer, Victor Kolyvagin and Hà Huy Khoái. Manin died on 7 January 2023. Research Manin's early work included papers on the arithmetic and formal groups of abelian varieties, the Mordell conjecture in the function field case, and algebraic differential equations. The Gauss–Manin connection is a basic ingredient of the study of cohomology in families of algebraic varieties. He developed the Manin obstruction, indicating the role of the Brauer group in accounting for obstructions to the Hasse principle via Grothendieck's theory of global Azumaya algebras, setting off a generation of further work. Manin pioneered the field of arithmetic topology (along with John Tate, David Mumford, Michael Artin, and Barry Mazur). He also formulated the Manin conjecture, which predicts the asymptotic behaviour of the number of rational points of bounded height on algebraic varieties. In mathematical physics, Manin wrote on Yang–Mills theory, quantum information, and mirror symmetry. He was one of the first to propose the idea of a quantum computer in 1980 with his book Computable and Uncomputable. He wrote a book on cu
https://en.wikipedia.org/wiki/Tom%20Pepper
Tom Pepper (born August 24, 1975 in Des Moines, Iowa) is a computer programmer best known for his collaboration with Justin Frankel on the Gnutella peer-to-peer system. He and Frankel co-founded Nullsoft, whose most popular program is Winamp, which was sold to AOL in May 1999. He subsequently worked for AOL developing SHOUTcast, an Internet streaming audio service, with Frankel and Stephen "Tag" Loomis. After leaving AOL in 2004, he worked at RAZZ, Inc. He continues to collaborate with Frankel on independent projects like Ninjam. See also WASTE Friend-to-friend (F2F) File sharing Peer-to-peer (P2P) Gnutella Nullsoft Justin Frankel
https://en.wikipedia.org/wiki/DVD%20region%20code
DVD region codes are a digital rights management technique introduced in 1997. The system is designed to allow rights holders to control the international distribution of a DVD release, including its content, release date, and price, all according to the appropriate region. This is achieved by way of region-locked DVD players, which will play back only DVDs encoded to their region (plus those without any region code). The American DVD Copy Control Association also requires that DVD player manufacturers incorporate the regional-playback control (RPC) system. However, region-free DVD players, which ignore region coding, are also commercially available, and many DVD players can be modified to be region-free, allowing playback of all discs. DVDs may use one code, multiple codes (multi-region), or all codes (region free). Region codes and countries Any combination of regions can be applied to a single disc. For example, a DVD designated Region 2/4 is suitable for playback in Europe, Latin America, Oceania, and any other Region 2 or Region 4 area. So-called "Region 0" and "ALL" discs are meant to be playable worldwide. The term "Region 0" also describes the DVD players designed or modified to incorporate Regions 1–8, thereby providing compatibility with most discs, regardless of region. This apparent solution was popular in the early days of the DVD format, but studios quickly responded by adjusting discs to refuse to play in such machines by implementing a system known as "Regional Coding Enhancement" (RCE). DVDs sold in the Baltic states use both region 2 and 5 codes, having previously been in region 5 (because of their history as part of the USSR), but EU single market law concerning the free movement of goods caused a switch to region 2. European region 2 DVDs may be sub-coded "D1" to "D4". "D1" are the UK-only releases; "D2" and "D3" are not sold in the UK and Ireland; "D4" are distributed throughout Europe. Overseas territories of the United Kingdom and France (both in
https://en.wikipedia.org/wiki/SUPER-UX
SUPER-UX was a version of the Unix operating system from NEC that is used on its SX series of supercomputers. History The initial version of SUPER-UX was based on UNIX System V version 3.1 with features from BSD 4.3. The version for the NEC SX-9 was based on SVR4.2MP with BSD enhancements. Features SUPER-UX is a 64-bit UNIX operating system. It supports the Supercomputer File System (SFS). Earth Simulator The Earth Simulator uses a custom OS called "ESOS" (Earth Simulator Operating System) based on SUPER-UX. It has many enhanced features custom designed for the Earth Simulator which are not in the regular SUPER-UX OS. See also EWS-UX
https://en.wikipedia.org/wiki/History%20of%20Animals
History of Animals (, Ton peri ta zoia historion, "Inquiries on Animals"; , "History of Animals") is one of the major texts on biology by the ancient Greek philosopher Aristotle, who had studied at Plato's Academy in Athens. It was written in the fourth century BC; Aristotle died in 322 BC. Generally seen as a pioneering work of zoology, Aristotle frames his text by explaining that he is investigating the what (the existing facts about animals) prior to establishing the why (the causes of these characteristics). The book is thus an attempt to apply philosophy to part of the natural world. Throughout the work, Aristotle seeks to identify differences, both between individuals and between groups. A group is established when it is seen that all members have the same set of distinguishing features; for example, that all birds have feathers, wings, and beaks. This relationship between the birds and their features is recognized as a universal. The History of Animals contains many accurate eye-witness observations, in particular of the marine biology around the island of Lesbos, such as that the octopus had colour-changing abilities and a sperm-transferring tentacle, that the young of a dogfish grow inside their mother's body, or that the male of a river catfish guards the eggs after the female has left. Some of these were long considered fanciful before being rediscovered in the nineteenth century. Aristotle has been accused of making errors, but some are due to misinterpretation of his text, and others may have been based on genuine observation. He did however make somewhat uncritical use of evidence from other people, such as travellers and beekeepers. The History of Animals had a powerful influence on zoology for some two thousand years. It continued to be a primary source of knowledge until zoologists in the sixteenth century, such as Conrad Gessner, all influenced by Aristotle, wrote their own studies of the subject. Context Aristotle (384–322 BC) studied at Plat
https://en.wikipedia.org/wiki/Generation%20of%20Animals
The Generation of Animals (or On the Generation of Animals; Greek: Περὶ ζῴων γενέσεως (Peri Zoion Geneseos); Latin: De Generatione Animalium) is one of the biological works of the Corpus Aristotelicum, the collection of texts traditionally attributed to Aristotle (384–322 BC). The work provides an account of animal reproduction, gestation and heredity. Content Generation of Animals consists of five books, which are themselves split into varying numbers of chapters. Most editions of this work categorise it with Bekker numbers. In general, each book covers a range of related topics; however, there is also a significant amount of overlap in the content of the books. For example, while one of the two principal topics covered in book I is the function of semen (gone, sperma), this account is not finalised until partway through book II. Book I (715a – 731b) Chapter 1 begins with Aristotle claiming to have already addressed the parts of animals, referencing the author's work of the same name. While this, and possibly his other biological works, have addressed three of the four causes pertaining to animals, the final, formal, and material, the efficient cause has yet to be spoken of. He argues that the efficient cause, or "that from which the source of movement comes", can be addressed with an inquiry into the generation of animals. Aristotle then provides a general overview of the processes of reproduction adopted by the various genera; for instance, most 'blooded' animals reproduce by coition of a male and female of the same species, but cases vary for 'bloodless' animals. The reproductive organs of males and females are also investigated. Through chapters 2–5 Aristotle successively describes the general reproductive features common to each sex, the differences in reproductive parts among blooded animals, the causes of differences of testes in particular, and why some animals do not have external reproductive organs. The latter provides clear examples of Aristotle's te
https://en.wikipedia.org/wiki/Physics%20%28Aristotle%29
The Physics (Greek: Φυσικὴ ἀκρόασις Phusike akroasis; Latin: Physica, or Naturales Auscultationes, possibly meaning "lectures on nature") is a named text, written in ancient Greek, collated from a collection of surviving manuscripts known as the Corpus Aristotelicum, attributed to the 4th-century BC philosopher Aristotle. The meaning of physics in Aristotle It is a collection of treatises or lessons that deals with the most general (philosophical) principles of natural or moving things, both living and non-living, rather than physical theories (in the modern sense) or investigations of the particular contents of the universe. The chief purpose of the work is to discover the principles and causes of (and not merely to describe) change, or movement, or motion (κίνησις kinesis), especially that of natural wholes (mostly living things, but also inanimate wholes like the cosmos). In the conventional Andronicean ordering of Aristotle's works, it stands at the head of, as well as being foundational to, the long series of physical, cosmological and biological treatises, whose ancient Greek title, τὰ φυσικά, means "the [writings] on nature" or "natural philosophy". Description of the content The Physics is composed of eight books, which are further divided into chapters. This system is of ancient origin, now obscure. In modern languages, books are referenced with Roman numerals, standing for ancient Greek capital letters (the Greeks represented numbers with letters, e.g. A for 1). Chapters are identified by Arabic numerals, but the use of the English word "chapter" is strictly conventional. Ancient "chapters" (capita) are generally very short, often less than a page. Additionally, the Bekker numbers give the page and column (a or b) used in the Prussian Academy of Sciences' edition of Aristotle's works, instigated and managed by Bekker himself. These are evident in the 1831 2-volume edition. Bekker's line numbers may be given. These are often given, but unless the editi
https://en.wikipedia.org/wiki/Advance%20market%20commitment
An advance market commitment (AMC) is a binding contract, typically offered by a government or other financial entity, used to guarantee a viable market for a product once it is successfully developed. Generally AMCs are used in circumstances where the cost of developing a new product is too high to be worthwhile for the private sector without a guarantee of a certain quantity of purchases in advance. As such, AMCs have been used in the creation of vaccines or other medicines with high up-front development costs. As a result of such a commitment, the market for vaccines or drugs for neglected diseases would be comparable in size and certainty to the market for medicines for rich countries. This would encourage biotech and pharmaceutical companies to invest in the development of new vaccines to tackle the world's most pressing health problems, such as pneumonia, diarrheal disease, HIV/AIDS, and malaria, in the normal course of their business decisions. An AMC has also been launched for carbon removal that meets certain technical specifications. AMCs were proposed by Michael Kremer and Rachel Glennerster in 2004 to encourage research. First AMC announced in February 2007 In February 2007, five countries (Canada, Italy, Norway, Russia, the United Kingdom) and the Bill & Melinda Gates Foundation committed US$1.5 billion to launch the first advance market commitment to speed the development and availability of a new vaccine which is expected to save the lives of 7 million children by 2030. The pilot provides 7 to 10 years of funding to support the development of future vaccines against pneumococcal disease. Pneumococcal disease is a major cause of pneumonia and meningitis and it kills 1.6 million people every year. The pilot includes provisions to assure the long-term sustainable supply and price for the poorest countries. The AMC for pneumococcal disease offers an improved market for vaccines now in development. Vaccines are bought only if they meet pre-determined standa
https://en.wikipedia.org/wiki/Privilege%20%28computing%29
In computing, privilege is defined as the delegation of authority to perform security-relevant functions on a computer system. A privilege allows a user to perform an action with security consequences. Examples of various privileges include the ability to create a new user, install software, or change kernel functions. Users who have been delegated extra levels of control are called privileged. Users who lack most privileges are defined as unprivileged, regular, or normal users. Theory Privileges can either be automatic, granted, or applied for. An automatic privilege exists when there is no requirement to have permission to perform an action. For example, on systems where people are required to log into a system to use it, logging out will not require a privilege. Systems that do not implement file protection - such as MS-DOS - essentially give unlimited privilege to perform any action on a file. A granted privilege exists as a result of presenting some credential to the privilege-granting authority. This is usually accomplished by logging on to a system with a username and password, and if the username and password supplied are correct, the user is granted additional privileges. A privilege is applied for by either an executed program issuing a request for advanced privileges, or by running some program to apply for the additional privileges. An example of a user applying for additional privileges is provided by the sudo command to run a command as the superuser (root), or by the Kerberos authentication system. Modern processor architectures have multiple CPU modes that allow the OS to run at different privilege levels. Some processors have two levels (such as user and supervisor); i386+ processors have four levels (#0 with the most, #3 with the least privileges). Tasks are tagged with a privilege level. Resources (segments, pages, ports, etc.) and the privileged instructions are tagged with a demanded privilege level. When a task tries to use a re
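The level check described for i386-style processors can be sketched in a few lines. This is an illustrative model only, not any real operating system's interface; the function name and the direct comparison of ring numbers are assumptions made for the example:

```python
# Illustrative model of privilege-level checking, i386 style:
# ring 0 is the most privileged, ring 3 the least. A task may
# use a resource only if its own ring number does not exceed
# the ring number the resource demands.

def can_access(task_ring: int, demanded_ring: int) -> bool:
    """Return True if a task at task_ring may use a resource demanding demanded_ring."""
    return task_ring <= demanded_ring

print(can_access(0, 3))  # kernel-level task, user-level resource: True
print(can_access(3, 0))  # user-level task, kernel-only resource: False
```

Real processors add further mechanisms (call gates, segment descriptor checks), but the numeric comparison above captures the basic tagging scheme the text describes.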
https://en.wikipedia.org/wiki/Magnetosheath
The magnetosheath is the region of space between the magnetopause and the bow shock of a planet's magnetosphere. The regularly organized magnetic field generated by the planet becomes weak and irregular in the magnetosheath due to interaction with the incoming solar wind, and is incapable of fully deflecting the highly charged particles. The density of the particles in this region is considerably lower than what is found beyond the bow shock, but greater than within the magnetopause, and can be considered a transitory state. Scientific research into the exact nature of the magnetosheath has been limited due to a longstanding misconception that it was a simple byproduct of the bow shock/magnetopause interaction and had no inherently important properties of its own. Recent studies indicate, however, that the magnetosheath is a dynamic region of turbulent plasma flow that may play an important role in the structure of the bow shock and the magnetopause, and may help to dictate the flow of energetic particles across those boundaries. Kinetic plasma instabilities may cause further complexity by generating plasma waves and energetic particle beams in the magnetosheath and foreshock regions. The Earth's magnetosheath typically occupies the region of space approximately 10 Earth radii on the upwind (Sun-facing) side of the planet, extending significantly farther out on the downwind side due to the pressure of the solar wind. The exact location and width of the magnetosheath depends on variables such as solar activity. See also Earth's magnetic field Interplanetary magnetic field (IMF) Magnetotail Van Allen radiation belt Plasmasphere Ionosphere Space weather and heliophysics
https://en.wikipedia.org/wiki/Triple%20bar
The triple bar or tribar, ≡, is a symbol with multiple, context-dependent meanings indicating equivalence of two different things. Its main uses are in mathematics and logic. It has the appearance of an equals sign = with a third line. Encoding The triple bar character in Unicode is code point U+2261. The closely related code point U+2262 is the same symbol with a slash through it, indicating the negation of its mathematical meaning. In LaTeX mathematical formulas, the code \equiv produces the triple bar symbol ≡ and \not\equiv produces the negated triple bar symbol ≢ as output. Uses Mathematics and philosophy In logic, it is used with two different but related meanings. It can refer to the if and only if connective, also called material equivalence. This is a binary operation whose value is true when its two arguments have the same value as each other. Alternatively, in some texts ⇔ is used with this meaning, while ≡ is used for the higher-level metalogical notion of logical equivalence, according to which two formulas are logically equivalent when all models give them the same value. Gottlob Frege used a triple bar for a more philosophical notion of identity, in which two statements (not necessarily in mathematics or formal logic) are identical if they can be freely substituted for each other without change of meaning. In mathematics, the triple bar is sometimes used as a symbol of identity or an equivalence relation (although not the only one; other common choices include ~ and ≈). Particularly, in geometry, it may be used either to show that two figures are congruent or that they are identical. In number theory, it has been used beginning with Carl Friedrich Gauss (who first used it with this meaning in 1801) to mean modular congruence: a ≡ b (mod N) if N divides a − b. In category theory, triple bars may be used to connect objects in a commutative diagram, indicating that they are actually the same object rather than being connected by an arrow of the category. This symbol is als
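The number-theoretic meaning is easy to check mechanically: a ≡ b (mod N) holds exactly when N divides a − b. A minimal sketch (the function name is illustrative):

```python
def congruent_mod(a: int, b: int, n: int) -> bool:
    """a ≡ b (mod n): true exactly when n divides a - b."""
    return (a - b) % n == 0

print(congruent_mod(17, 5, 12))  # True: 12 divides 17 - 5 = 12
print(congruent_mod(17, 6, 12))  # False: 12 does not divide 11
```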
https://en.wikipedia.org/wiki/Genetic%20viability
Genetic viability is the ability of the genes present to allow a cell, organism or population to survive and reproduce. The term is generally used to mean the chance or ability of a population to avoid the problems of inbreeding. Less commonly, genetic viability can also be used in respect to a single cell or on an individual level. Inbreeding depletes heterozygosity of the genome, meaning there is a greater chance of identical alleles at a locus. When these alleles are non-beneficial, homozygosity could cause problems for genetic viability. These problems could include effects on individual fitness (higher mortality, slower growth, more frequent developmental defects, reduced mating ability, lower fecundity, greater susceptibility to disease, lowered ability to withstand stress, reduced intra- and inter-specific competitive ability) or effects on the entire population's fitness (depressed population growth rate, reduced regrowth ability, reduced ability to adapt to environmental change). See Inbreeding depression. When a population of plants or animals loses its genetic viability, its chance of going extinct increases. Necessary conditions To be genetically viable, a population of plants or animals requires a certain amount of genetic diversity and a certain population size. For long-term genetic viability, the population size should consist of enough breeding pairs to maintain genetic diversity. The precise effective population size can be calculated using a minimum viable population analysis. Higher genetic diversity and a larger population size will decrease the negative effects of genetic drift and inbreeding in a population. When adequate measures have been met, the genetic viability of a population will increase. Causes for decrease The main cause of a decrease in genetic viability is loss of habitat. This loss can occur because of, for example, urbanization or deforestation causing habitat fragmentation. Natural events like earthquakes, floods
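One concrete illustration of why the number of breeding pairs matters: a classic estimator for populations with unequal numbers of breeding males and females is Wright's formula Ne = 4·Nm·Nf/(Nm + Nf). The sketch below assumes this simple estimator and uses illustrative names; real minimum-viable-population analyses incorporate many more factors:

```python
def effective_population_size(n_males: int, n_females: int) -> float:
    """Wright's estimator for unequal sex ratios:
    Ne = 4 * Nm * Nf / (Nm + Nf)."""
    return 4 * n_males * n_females / (n_males + n_females)

# 10 breeding males and 90 breeding females: the effective size (36)
# is far below the census size (100), so genetic drift and inbreeding
# act as they would in a much smaller population.
print(effective_population_size(10, 90))  # 36.0
```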
https://en.wikipedia.org/wiki/Three-center%20two-electron%20bond
A three-center two-electron (3c–2e) bond is an electron-deficient chemical bond where three atoms share two electrons. The combination of three atomic orbitals form three molecular orbitals: one bonding, one non-bonding, and one anti-bonding. The two electrons go into the bonding orbital, resulting in a net bonding effect and constituting a chemical bond among all three atoms. In many common bonds of this type, the bonding orbital is shifted towards two of the three atoms instead of being spread equally among all three. Example molecules with 3c–2e bonds are the trihydrogen cation () and diborane (). In these two structures, the three atoms in each 3c-2e bond form an angular geometry, leading to a bent bond. Boranes and carboranes An extended version of the 3c–2e bond model features heavily in cluster compounds described by the polyhedral skeletal electron pair theory, such as boranes and carboranes. These molecules derive their stability from having a completely filled set of bonding molecular orbitals as outlined by Wade's rules. The monomer BH3 is unstable since the boron atom has an empty p-orbital. A B−H−B 3-center-2-electron bond is formed when a boron atom shares electrons with a B−H bond on another boron atom. The two electrons (corresponding to one bond) in a B−H−B bonding molecular orbital are spread out across three internuclear spaces. In diborane (B2H6), there are two such 3c-2e bonds: two H atoms bridge the two B atoms, leaving two additional H atoms in ordinary B−H bonds on each B. As a result, the molecule achieves stability since each B participates in a total of four bonds and all bonding molecular orbitals are filled, although two of the four bonds are 3-center B−H−B bonds. The reported bond order for each B−H interaction in a bridge is 0.5, so that the bridging B−H−B bonds are weaker and longer than the terminal B−H bonds, as shown by the bond lengths in the structural diagram. Transition metal complexes Three-center, two-electron bonding
https://en.wikipedia.org/wiki/Pseudaconitine
Pseudaconitine, also known as nepaline (C36H51NO12), is an extremely toxic alkaloid found in high quantities in the roots of Aconitum ferox, also known as Indian Monkshood, which belongs to the family Ranunculaceae. The plant is found in East Asia, including the Himalayas. History Pseudaconitine was discovered in 1878 by Wright and Luff. They isolated a highly toxic alkaloid from the roots of the plant Aconitum ferox and called it pseudaconitine. The poison is also called bikh, bish, or nabee. Toxicity and mechanism Pseudaconitine is a moderate inhibitor of the enzyme acetylcholinesterase. This enzyme breaks down the neurotransmitter acetylcholine through hydrolysis. Inhibition of this enzyme leaves acetylcholine in the synapse, causing constant stimulation of the postsynaptic membrane that cannot be terminated. This accumulation of acetylcholine may thus lead to the constant stimulation of the muscles, glands and central nervous system. Furthermore, in small quantities the substance also appears to cause a tingling effect on the tongue, lips and skin. Structure and reactivity Pseudaconitine is a diterpene alkaloid, with the chemical formula C36H51NO12. The crystal melts at 202 °C and is moderately soluble in water, but more so in alcohol, showing that it is a lipophilic substance. When heated in the dry state, it undergoes pyrolysis and pyropseudaconitine (C34H47O10N) is formed, which does not have the same tingling effect as pseudaconitine. See also Aconitine
https://en.wikipedia.org/wiki/List%20of%20botanists%20by%20author%20abbreviation%20%28A%29
A Aa – Hubertus Antonius van der Aa (1935–2017) A.A.Burb. – Andrew A. Burbidge (fl. 2016) A.A.Cocucci – (born 1959) A.A.Eaton – Alvah Augustus Eaton (1865–1908) A.A.Fisch.Waldh. – Alexandr Alexandrovich Fischer von Waldheim (1839–1920) A.Agostini – Angela Agostini (born 1880) A.A.Ham. – Arthur Andrew Hamilton (1855–1929) A.A.Hend. – Andrew Augustus Henderson (1816–1876) A.Ames – Adeline Ames (1879–1976) A.Anderson – Alexander Anderson (1748–1811) A.Arber – Agnes Arber (1879–1960) Aarons. – Aaron Aaronsohn (1876–1919) Aase – Hannah Caroline Aase (1883–1980) A.Barbero – Andrés Barbero (1877–1951) A.Bassi – Agostino Bassi (1773–1856) A.Baytop – Asuman Baytop (1920–2015) Abbayes – Henry Nicollon des Abbayes (1898–1974) Abbiatti – Delia Abbiatti (born 1918) Abbot – John Abbot (1751–c. 1840) Abedin – (fl. 1986) Aberc. – Henry McLaren, 2nd Baron Aberconway (1879–1953) A.Berger – Alwin Berger (1871–1931) A.B.Frank – Albert Bernhard Frank (1839–1900) A.B.Graf – Alfred Byrd Graf (1901–2001) A.B.Jacks. – Albert Bruce Jackson (1876–1947) A.B.Nickels – Anna Buck Nickels (1832–1917) A.Bloxam – Andrew Bloxam (1801–1878) A.Blytt – Axel Gudbrand Blytt (1843–1898) A.Br. – Addison Brown (1830–1913) Abramov – Ivan N. Abramov (1884–1953) Abrams – LeRoy Abrams (1874–1956) A.Braun – Alexander Karl Heinrich Braun (1805–1877) Abrom. – Johannes Abromeit (1857–1946) A.Bruggen – Adolph Cornelis van Bruggen (born 1929) A.Butler – (born 1946) A.Cabrera – Ángel Cabrera (1879–1960) (not to be confused with botanist Ángel Lulio Cabrera (1908–1999)) A.Camus – Aimée Antoinette Camus (1879–1965) A.Cast. – (1896–1968) A.Cels – Auguste Cels (1809-1898) Acerbi – Giuseppe Acerbi (1773–1846) Ach. – Erik Acharius (1757–1819) A.C.H.Blinks – Anne Catherine Hof Blinks (1903–1995) A.Chev. – Auguste Jean Baptiste Chevalier (1873–1956) Achv. – (1907–1999) Ackerman – James David Ackerman (born 1950) Acloque – Alexandre Noël Charles Acloque (1871–1908) A.C.Sch
https://en.wikipedia.org/wiki/Bikont
A bikont ("two flagella") is any of the eukaryotic organisms classified in the group Bikonta. Many single-celled and multi-celled organisms are members of the group, and these, as well as the presumed ancestor, have two flagella. Enzymes Another shared trait of bikonts is the fusion of two genes into a single unit: the genes for thymidylate synthase (TS) and dihydrofolate reductase (DHFR) encode a single protein with two functions. The genes are separately translated in unikonts. Relationships Some research suggests that a unikont (a eukaryotic cell with a single flagellum) was the ancestor of opisthokonts (Animals, Fungi, and related forms) and Amoebozoa, and a bikont was the ancestor of Archaeplastida (Plants and relatives), Excavata, Rhizaria, and Chromalveolata. Cavalier-Smith has suggested that Apusozoa, which are typically considered incertae sedis, are in fact bikonts. Relationships within the bikonts are not yet clear. Cavalier-Smith has grouped the Excavata and Rhizaria into the Cabozoa and the Archaeplastida and Chromalveolata into the Corticata, but at least one other study has suggested that the Rhizaria and Chromalveolata form a clade. An alternative to the Unikont–Bikont division was suggested by Derelle et al. in 2015, where they proposed the acronyms Opimoda–Diphoda respectively, as substitutes to the older terms. The name Diphoda is formed from the letters of DIscoba and diaPHOretickes (shown in capitals). [suggested singular forms are Opneme-Dipheme respectively] Cladogram A "classical" cladogram (data from 2012, 2015) is: However, a cladogram (data from 2015, 2016) with the root in Excavata is The corticates correspond roughly to the bikonts. While Haptophyta, Cryptophyta, Glaucophyta, Rhodophyta, the SAR supergroup and viridiplantae are usually considered monophyletic, Archaeplastida may be paraphyletic, and the mutual relationships between these phyla are still to be fully resolved. Recent reconstructions placed Archaeplastida and Ha
https://en.wikipedia.org/wiki/Taenia%20saginata
Taenia saginata (synonym Taeniarhynchus saginatus), commonly known as the beef tapeworm, is a zoonotic tapeworm belonging to the order Cyclophyllidea and genus Taenia. It is an intestinal parasite in humans causing taeniasis (a type of helminthiasis) and cysticercosis in cattle. Cattle are the intermediate hosts, where larval development occurs, while humans are definitive hosts harbouring the adult worms. It is found globally and most prevalently where cattle are raised and beef is consumed. It is relatively common in Africa, Europe, Southeast Asia, South Asia, and Latin America. Humans are generally infected as a result of eating raw or undercooked beef which contains the infective larvae, called cysticerci. As the worm is hermaphroditic, each body segment, called a proglottid, has complete sets of both male and female reproductive systems. Thus, reproduction is by self-fertilisation. From humans, embryonated eggs, called oncospheres, are released with faeces and are transmitted to cattle through contaminated fodder. Oncospheres develop inside the muscle, liver, and lungs of cattle into infective cysticerci. T. saginata has a strong resemblance to the other human tapeworms, such as Taenia asiatica and Taenia solium, in structure and biology, except for a few details. It is typically larger and longer, with more proglottids, more testes, and higher branching of the uteri. It also lacks an armed scolex, unlike other Taenia. Like the other tapeworms, it causes taeniasis inside the human intestine, but does not cause cysticercosis. Its infection is relatively harmless and clinically asymptomatic. Description T. saginata is the largest of species in the genus Taenia. An adult worm is normally 4 to 10 m in length, but can become very large; specimens over 22 m long are reported. Typical of cestodes, its body is flattened dorsoventrally and heavily segmented. It is entirely covered by a tegument. The body is white in colour and consists of three portions: scolex, neck, and strobila. The sco
https://en.wikipedia.org/wiki/Copper%20fist
Copper fist is an N-terminal domain involved in copper-dependent DNA binding. It is named for its resemblance to a fist. It can be found in some fungal transcription factors. These proteins activate the transcription of the metallothionein gene in response to copper. Metallothionein maintains copper levels in yeast. The copper fist domain is similar in structure to metallothionein itself, and on copper binding undergoes a large conformational change, which allows DNA binding. External links Copper fist definition
https://en.wikipedia.org/wiki/Turmite
In computer science, a turmite is a Turing machine which has an orientation in addition to a current state and a "tape" that consists of an infinite two-dimensional grid of cells. The terms ant and vant are also used. Langton's ant is a well-known type of turmite defined on the cells of a square grid. Paterson's worms are a type of turmite defined on the edges of an isometric grid. It has been shown that turmites in general are exactly equivalent in power to one-dimensional Turing machines with an infinite tape, as either can simulate the other. History Langton's ants were invented in 1986 and declared "equivalent to Turing machines". Independently, in 1988, Allen H. Brady considered the idea of two-dimensional Turing machines with an orientation and called them "TurNing machines". Apparently independently of both of these, Greg Turk investigated the same kind of system and wrote to A. K. Dewdney about them. A. K. Dewdney named them "tur-mites" in his "Computer Recreations" column in Scientific American in 1989. Rudy Rucker relates the story as follows: Relative vs. absolute turmites Turmites can be categorised as being either relative or absolute. Relative turmites, alternatively known as "Turning machines", have an internal orientation. Langton's Ant is such an example. Relative turmites are, by definition, isotropic; rotating the turmite does not affect its outcome. Relative turmites are so named because the directions are encoded relative to the current orientation, equivalent to using the words "left" or "backwards". Absolute turmites, by comparison, encode their directions in absolute terms: a particular instruction may direct the turmite to move "North". Absolute turmites are two-dimensional analogues of conventional Turing machines, so are occasionally referred to as simply "Two-dimensional Turing machines". The remainder of this article is concerned with the relative case. Specification The following specification is specific to turmites on a two-
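Langton's ant, the relative turmite named above, can be simulated in a few lines. This is a minimal sketch; the coordinate conventions (y increasing downward) and the function name are choices made for the example:

```python
# A minimal relative turmite: Langton's ant on a square grid.
# On a white cell: turn right, flip the cell to black, step forward.
# On a black cell: turn left, flip the cell to white, step forward.

def langtons_ant(steps: int) -> set:
    black = set()          # cells currently black; all others are white
    x, y = 0, 0
    dx, dy = 0, -1         # facing "north" (y grows downward)
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = dy, -dx    # turn left
            black.discard((x, y))
        else:
            dx, dy = -dy, dx    # turn right
            black.add((x, y))
        x, y = x + dx, y + dy
    return black

# Directions are encoded only relative to the current heading, so the
# rule never mentions "north" or "south": the turmite is isotropic.
print(len(langtons_ant(11000)))
```

Because the turn is computed from the current (dx, dy), rotating the initial heading rotates the whole trajectory without changing its structure, which is exactly the isotropy property the text ascribes to relative turmites.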
https://en.wikipedia.org/wiki/Kiki%20Kaikai
Kiki Kaikai is a shoot 'em up video game developed and published by Taito for arcades in 1986. Set in Feudal Japan, the player assumes the role of a Shinto shrine maiden who must use her o-fuda scrolls and gohei wand to defeat renegade spirits and monsters from Japanese mythology. The game is noteworthy for using a traditional fantasy setting in a genre otherwise filled with science fiction motifs. The game received a number of home ports, both as a stand-alone title and as part of compilations. The original arcade game was only ever released in Japan, but a bootleg version called Knight Boy was released outside Japan. Kiki Kaikai was followed by a sequel for the Super NES in 1992, known as Pocky & Rocky outside Japan. The series, known as Kiki Kaikai in Japan and Pocky & Rocky outside Japan, has continued since then and includes several games. Plot The game follows the adventures of "Sayo-chan", a young Shinto shrine maiden living in Feudal Japan. One night, while Sayo-chan is fanning a ceremonial fire, she is visited by the Seven Lucky Gods, who warn her of a great, impending danger. Suddenly, a band of mischievous youkai appear and kidnap the gods, quickly retreating to a faraway mountain range. Sayo-chan, determined to help the gods, sets off on a journey across the countryside, where she confronts a number of strange creatures from Japanese mythology, including obake and yurei. Gameplay Kiki Kaikai is an overhead multi-directional shooter game that requires the player to move in four directions through various levels while attacking harmful enemies as they approach from off screen. As Sayo-chan, the player can attack by either throwing her special o-fuda scrolls in eight separate directions, or by swinging her purification rod directly in front of her. These techniques can be upgraded by finding special paper slips left by defeated enemies that will either enhance their power or improve their range. Sayo can be damaged by coming in contact with an enemy, and can only
https://en.wikipedia.org/wiki/Feature%20integration%20theory
Feature integration theory is a theory of attention developed in 1980 by Anne Treisman and Garry Gelade that suggests that when perceiving a stimulus, features are "registered early, automatically, and in parallel, while objects are identified separately" and at a later stage in processing. The theory has been one of the most influential psychological models of human visual attention.

Stages

According to Treisman, the first stage of the feature integration theory is the preattentive stage. During this stage, different parts of the brain automatically gather information about basic features (colors, shape, movement) that are found in the visual field. The idea that features are automatically separated appears counterintuitive. However, we are not aware of this process because it occurs early in perceptual processing, before we become conscious of the object. The second stage of feature integration theory is the focused attention stage, where a subject combines individual features of an object to perceive the whole object. Combining individual features of an object requires attention, and selecting that object occurs within a "master map" of locations. The master map of locations contains all the locations in which features have been detected, with each location in the master map having access to the multiple feature maps. When attention is focused at a particular location on the map, the features currently in that position are attended to and are stored in "object files". If the object is familiar, associations are made between the object and prior knowledge, which results in identification of that object. In support of this stage, researchers often refer to patients with Balint's syndrome. Due to damage in the parietal lobe, these people are unable to focus attention on individual objects. Given a stimulus that requires combining features, people with Balint's syndrome are unable to focus attention long enough to combine the features, providing support for this stage
https://en.wikipedia.org/wiki/Hilbert%27s%20ninth%20problem
Hilbert's ninth problem, from the list of 23 Hilbert's problems (1900), asked to find the most general reciprocity law for the norm residues of k-th order in a general algebraic number field, where k is a power of a prime.

Progress made

The problem was partially solved by Emil Artin, who established the Artin reciprocity law, which deals with abelian extensions of algebraic number fields. Together with the work of Teiji Takagi and Helmut Hasse (who established the more general Hasse reciprocity law), this led to the development of class field theory, realizing Hilbert's program in an abstract fashion. Certain explicit formulas for norm residues were later found by Igor Shafarevich (1948; 1949; 1950). The non-abelian generalization, also connected with Hilbert's twelfth problem, is one of the long-standing challenges in number theory and is far from complete.

See also

List of unsolved problems in mathematics
https://en.wikipedia.org/wiki/Hilbert%27s%20fourteenth%20problem
In mathematics, Hilbert's fourteenth problem, that is, number 14 of Hilbert's problems proposed in 1900, asks whether certain algebras are finitely generated. The setting is as follows: Assume that k is a field and let K be a subfield of the field of rational functions in n variables, k(x1, ..., xn), over k. Consider now the k-algebra R defined as the intersection

    R = K ∩ k[x1, ..., xn].

Hilbert conjectured that all such algebras are finitely generated over k. Some results were obtained confirming Hilbert's conjecture in special cases and for certain classes of rings (in particular the conjecture was proved unconditionally for n = 1 and n = 2 by Zariski in 1954). Then in 1959 Masayoshi Nagata found a counterexample to Hilbert's conjecture. The counterexample of Nagata is a suitably constructed ring of invariants for the action of a linear algebraic group.

History

The problem originally arose in algebraic invariant theory. Here the ring R is given as a (suitably defined) ring of polynomial invariants of a linear algebraic group over a field k acting algebraically on a polynomial ring k[x1, ..., xn] (or more generally, on a finitely generated algebra defined over a field). In this situation the field K is the field of rational functions (quotients of polynomials) in the variables xi which are invariant under the given action of the algebraic group, and the ring R is the ring of polynomials which are invariant under the action. A classical example in the nineteenth century was the extensive study (in particular by Cayley, Sylvester, Clebsch, Paul Gordan and also Hilbert) of invariants of binary forms in two variables with the natural action of the special linear group SL2(k) on it. Hilbert himself proved the finite generation of invariant rings in the case of the field of complex numbers for some classical semi-simple Lie groups (in particular the general linear group over the complex numbers) and specific linear actions on polynomial rings, i.e. actions coming from finite-dimensional representations.
https://en.wikipedia.org/wiki/Gene%20conversion
Gene conversion is the process by which one DNA sequence replaces a homologous sequence such that the sequences become identical after the conversion event. Gene conversion can be either allelic, meaning that one allele of the same gene replaces another allele, or ectopic, meaning that one paralogous DNA sequence converts another.

Allelic gene conversion

Allelic gene conversion occurs during meiosis when homologous recombination between heterozygous sites results in a mismatch in base pairing. This mismatch is then recognized and corrected by the cellular machinery, causing one of the alleles to be converted to the other. This can cause non-Mendelian segregation of alleles in germ cells.

Nonallelic/ectopic gene conversion

Recombination occurs not only during meiosis, but also as a mechanism for repair of double-strand breaks (DSBs) caused by DNA damage. These DSBs are usually repaired using the sister chromatid of the broken duplex and not the homologous chromosome, so they would not result in allelic conversion. Recombination also occurs between homologous sequences present at different genomic loci (paralogous sequences) which have resulted from previous gene duplications. Gene conversion occurring between paralogous sequences (ectopic gene conversion) is conjectured to be responsible for concerted evolution of gene families.

Mechanism

Conversion of one allele to the other is often due to base mismatch repair during homologous recombination: if one of the four chromatids during meiosis pairs up with another chromatid, as can occur because of sequence homology, DNA strand transfer can occur followed by mismatch repair. This can alter the sequence of one of the chromosomes, so that it is identical to the other. Meiotic recombination is initiated through formation of a double-strand break (DSB). The 5' ends of the break are then degraded, leaving long 3' overhangs of several hundred nucleotides. One of these 3' single-stranded DNA segments then invades a homologous sequence.
https://en.wikipedia.org/wiki/Intrinsic%20function
In computer software, in compiler theory, an intrinsic function (or built-in function) is a function (subroutine) available for use in a given programming language whose implementation is handled specially by the compiler. Typically, it may substitute a sequence of automatically generated instructions for the original function call, similar to an inline function. Unlike an inline function, the compiler has intimate knowledge of an intrinsic function and can thus better integrate and optimize it for a given situation. Compilers that implement intrinsic functions generally enable them only when a program requests optimization, otherwise falling back to a default implementation provided by the language runtime system (environment). Intrinsic functions are often used to explicitly implement vectorization and parallelization in languages which do not address such constructs. Some application programming interfaces (API), for example AltiVec and OpenMP, use intrinsic functions to declare, respectively, vectorizable and multiprocessing-aware operations during compiling. The compiler parses the intrinsic functions and converts them into vector math or multiprocessing object code appropriate for the target platform. Some intrinsics are used to provide additional constraints to the optimizer, such as values a variable cannot assume.

C and C++

Compilers for C and C++ from Microsoft, Intel, and the GNU Compiler Collection (GCC) implement intrinsics that map directly to the x86 single instruction, multiple data (SIMD) instructions (MMX, Streaming SIMD Extensions (SSE), SSE2, SSE3, SSSE3, SSE4, AVX, AVX2, AVX512, FMA, ...). The Microsoft Visual C++ compiler of Microsoft Visual Studio does not support inline assembly for x86-64. To compensate for this, new intrinsics have been added that map to standard assembly instructions that are not normally accessible through C/C++, e.g., bit scan. Some C and C++ compilers provide non-portable platform-specific intrinsics. Other intrinsics
https://en.wikipedia.org/wiki/Garmin%20Forerunner
The Garmin Forerunner series is a selection of sports watches produced by Garmin. Most models use the Global Positioning System (GPS), and are targeted at road runners and triathletes. Forerunner series watches are designed to measure distance, speed, heart rate (optional), time, altitude, and pace.

Models

The Forerunner series consists of the 101, 201, 301, 205, 305, 50, 405, 60, 405CX, 310XT, 110, 210, 410, 610, 910XT, 70, 10, 220, 620, 15, 920XT, 225, 25, 230, 235, 630, 735XT, 35, 935, 30, 645, 645 Music, 45, 45S, 245, 245 Music, 945, 745, 55, 945 LTE, 255, 255 Music, 955, 955 Solar, 265, 965 (listed in chronological order by release date). All models except the 101 include a way to upload training data to a personal computer and training software. Garmin registered the name "Forerunner" with the United States Patent and Trademark Office in August 2001 but released the first watches—the 101, 201, and 301—in 2003. In 2006, the improved 205 and 305 appeared. These models are smaller than the first generation and feature a more sensitive SiRFstarIII GPS receiver chip. In late 2007, the Forerunner 50 was introduced. As opposed to GPS, this model paired with a foot pod to measure displacement. The Forerunner 50 also came packaged with a USB stick that allowed training data to be transferred wirelessly to one's PC. This feature has since become a staple of Garmin's more full-featured sport watches. The Forerunner 405 was introduced in 2008 and is significantly smaller than its predecessors, only slightly outsizing a typical wristwatch. The 405 also featured improved satellite discovery and connection. In 2009, Garmin produced three new models: the Forerunner 60 (an evolution of the Forerunner 50), the Forerunner 405CX (405 chassis), and the Forerunner 310XT (an evolution of the 305 chassis). New features included additional battery life and vibration alerts on the 310XT and advanced calorie consumption modelling on all watches. The new calorie consumption modelling
https://en.wikipedia.org/wiki/Spring%20%28operating%20system%29
Spring is a discontinued project in building an experimental microkernel-based object-oriented operating system (OS) developed at Sun Microsystems in the early 1990s. Using technology substantially similar to concepts developed in the Mach kernel, Spring concentrated on providing a richer programming environment supporting multiple inheritance and other features. Spring was also more cleanly separated from the operating systems it would host, divorcing it from its Unix roots and even allowing several OSes to be run at the same time. Development faded out in the mid-1990s, but several ideas and some code from the project were later re-used in the Java programming language libraries and the Solaris operating system.

History

Spring started in a roundabout fashion in 1987, as part of Sun and AT&T's collaboration to create a merged UNIX. Both companies decided it was also a good opportunity to "reimplement UNIX in an object-oriented fashion". However, after only a few meetings, this part of the project died. Sun decided to keep their team together and instead explore a system on the leading edge. Along with combining Unix flavours, the new system would also be able to run almost any other system, and in a distributed fashion. The system was first running in a "complete" fashion in 1993, and produced a series of research papers. In 1994, a "research quality" release was made under a non-commercial license, but it is unclear how widely this was used. Described as a "clean slate" intended to help Sun improve its existing Unix products, the software was made available at a cost of $75, with Sun targeting universities and computer scientists. Commercial research institutions could obtain the software at a cost of $750. The team broke up and moved to other projects within Sun, using some of the Spring concepts on a variety of other projects.

Background

The Spring project began soon after the release of Mach 3. In earlier versions Mach was simply a modified version of existing
https://en.wikipedia.org/wiki/Authentication%20and%20Key%20Agreement
Authentication and Key Agreement (AKA) is a security protocol used in 3G networks. AKA is also used as a one-time password generation mechanism for digest access authentication. AKA is a challenge–response based mechanism that uses symmetric cryptography.

AKA in CDMA

AKA (Authentication and Key Agreement), also known as 3G Authentication or Enhanced Subscriber Authorization (ESA), is the basis for the 3G authentication mechanism. Defined as a successor to CAVE-based authentication, AKA provides procedures for mutual authentication of the Mobile Station (MS) and the serving system. The successful execution of AKA results in the establishment of a security association (i.e., a set of security data) between the MS and the serving system that enables a set of security services to be provided. Major advantages of AKA over CAVE-based authentication include:

- Larger authentication keys (128-bit)
- Stronger hash function (SHA-1)
- Support for mutual authentication
- Support for signaling message data integrity
- Support for signaling information encryption
- Support for user data encryption
- Protection from rogue MS when dealing with R-UIM

AKA is not yet implemented in CDMA2000 networks, although it is expected to be used for IMS. To ensure interoperability with current devices and partner networks, support for AKA in CDMA networks and handsets will likely be in addition to CAVE-based authentication. Air interface support for AKA is included in all releases following CDMA2000 Rev C. TIA-41 MAP support for AKA was defined in TIA-945 (3GPP2 X.S0006), which has been integrated into TIA-41 (3GPP2 X.S0004). For information on AKA in roaming, see CDG Reference Document #138.

AKA in UMTS

AKA is a mechanism that performs authentication and session key distribution in Universal Mobile Telecommunications System (UMTS) networks. AKA is a challenge–response based mechanism that uses symmetric cryptography. AKA is typically run in a UMTS IP Multimedia Services Identity Module (ISIM), which is an application on a
https://en.wikipedia.org/wiki/Dot%20crawl
Dot crawl (also known as chroma crawl or cross-luma) is a visual defect of color analog video standards when signals are transmitted as composite video, as in terrestrial broadcast television. It consists of moving checkerboard patterns which appear along horizontal color transitions (vertical edges). It results from intermodulation or crosstalk between the chrominance and luminance components of the signal, which are imperfectly multiplexed in the frequency domain. The term is more associated with the NTSC analog color TV system, but is also present in PAL (see Chroma dots). Although the interference patterns are slightly different depending on the system used, they have the same cause and the same general principles apply. A related effect, color bleed or rainbow artifacts, is discussed below.

Description

Intermodulation or crosstalk problems take two forms: chrominance interference in luminance (chrominance being interpreted as luminance), and luminance interference in chrominance. Dot crawl is most visible when the chrominance is transmitted with a high bandwidth, so that its spectrum reaches well into the band of frequencies used by the luminance signal in the composite video signal. This causes high-frequency chrominance detail at color transitions to be interpreted as luminance detail. Some (mostly older) video-game consoles and home computers use nonstandard colorburst phases, thereby producing dot crawl that appears quite different from that seen in broadcast NTSC or PAL. The effect is more noticeable in these cases due to the saturated colors and small pixel-scale details normally present in computer graphics. The opposite problem, luminance interference in chroma, is the appearance of colored noise in image areas with high levels of detail. This results from high-frequency luminance detail crossing into the frequencies used by the chrominance channel and producing false coloration, known as color bleed or rainbow artifacts. Bleed can also make narrow
https://en.wikipedia.org/wiki/The%20Naturalist%20on%20the%20River%20Amazons
The Naturalist on the River Amazons, subtitled A Record of the Adventures, Habits of Animals, Sketches of Brazilian and Indian Life, and Aspects of Nature under the Equator, during Eleven Years of Travel, is an 1863 book by the British naturalist Henry Walter Bates about his expedition to the Amazon basin. Bates and his friend Alfred Russel Wallace set out to obtain new species and new evidence for evolution by natural selection, as well as exotic specimens to sell. He explored thousands of miles of the Amazon and its tributaries, and collected over 14,000 species, of which 8,000 were new to science. His observations of the coloration of butterflies led him to discover Batesian mimicry. The book contains an evenly distributed mixture of natural history, travel, and observation of human societies, including the towns with their Catholic processions. Only the most remarkable discoveries of animals and plants are described, and theories such as evolution and mimicry are barely mentioned. Bates remarks that finding a new species is only the start; he also describes animal behaviour, sometimes in detail, as for the army ants. He constantly relates the wildlife to the people, explaining how the people hunt, what they eat and what they use as medicines. The book is illustrated with drawings by leading artists including E. W. Robinson, Josiah Wood Whymper, Joseph Wolf and Johann Baptist Zwecker. On Bates's return to England, he was encouraged by Charles Darwin to write up his eleven-year stay in the Amazon as a book. The result was widely admired, not least by Darwin himself. Other reviewers sometimes disagreed with the book's support for evolution, but generally enjoyed his account of the journey, scenery, people, and natural history. The book has been reprinted many times, mostly in Bates's own effective abridgement for the second edition, which omitted the more technical descriptions.

Publication history

The first edition, in 1863, was long and full of technical descriptions
https://en.wikipedia.org/wiki/Optical%20burst%20switching
Optical burst switching (OBS) is an optical networking technique that allows dynamic sub-wavelength switching of data. OBS is viewed as a compromise between the as-yet unfeasible full optical packet switching (OPS) and the mostly static optical circuit switching (OCS). It differs from these paradigms because OBS control information is sent separately, in a reserved optical channel and in advance of the data payload. These control signals can then be processed electronically to allow the timely setup of an optical light path to transport the soon-to-arrive payload. This is known as delayed reservation.

Purpose

The purpose of optical burst switching (OBS) is to dynamically provision sub-wavelength granularity by optimally combining electronics and optics. OBS considers sets of packets with similar properties, called bursts. Therefore, OBS granularity is finer than that of optical circuit switching (OCS). OBS provides more bandwidth flexibility than wavelength routing but requires faster switching and control technology. OBS can be used for realizing dynamic end-to-end all-optical communications.

Method

In OBS, packets are aggregated into data bursts at the edge of the network to form the data payload. Various assembling schemes based on time and/or size exist (see burst switching). Edge router architectures have also been proposed. OBS features the separation between the control plane and the data plane. A control signal (also termed burst header or control packet) is associated with each data burst. The control signal is transmitted in optical form in a separate wavelength termed the control channel, but signaled out of band and processed electronically at each OBS router, whereas the data burst is transmitted in all-optical form from one end of the network to the other. The data burst can cut through intermediate nodes, and data buffers such as fiber delay lines may be used. In OBS data is transmitted with full transparency to the intermediate nodes in the network. After
https://en.wikipedia.org/wiki/Organic%20horticulture
Organic horticulture is the science and art of growing fruits, vegetables, flowers, or ornamental plants by following the essential principles of organic agriculture in soil building and conservation, pest management, and heirloom variety preservation. The Latin words hortus (garden plant) and cultura (culture) together form horticulture, classically defined as the culture or growing of garden plants. Horticulture is also sometimes defined simply as "agriculture minus the plough". Instead of the plough, horticulture makes use of human labour and gardener's hand tools, although some small machine tools like rotary tillers are commonly employed now.

General

Mulches, cover crops, compost, manures, vermicompost, and mineral supplements are soil-building mainstays that distinguish this type of farming from its conventional counterpart. Through attention to good healthy soil condition, it is expected that insect, fungal, or other problems that sometimes plague plants can be minimized. However, pheromone traps, insecticidal soap sprays, and other pest-control methods available to organic farmers are also utilized by organic horticulturists. Horticulture involves five areas of study: floriculture (includes production and marketing of floral crops), landscape horticulture (includes production, marketing and maintenance of landscape plants), olericulture (includes production and marketing of vegetables), pomology (includes production and marketing of fruits), and postharvest physiology (involves maintaining quality and preventing spoilage of horticultural crops). All of these can be, and sometimes are, pursued according to the principles of organic cultivation. Organic horticulture (or organic gardening) is based on knowledge and techniques gathered over thousands of years. In general terms, organic horticulture involves natural processes, often taking place over extended periods of time, and a sustainable, holistic approach – while chemical-based horticulture focuses on
https://en.wikipedia.org/wiki/ThinkCentre
The ThinkCentre is a line of business-oriented desktop computers designed, developed and marketed by Lenovo, and formerly by IBM from 2003 to 2005. ThinkCentre computers typically include mid-range to high-end processors, options for discrete graphics cards, and multi-monitor support.

History

Launch

The ThinkCentre line of desktop computers was introduced by IBM in 2003. The first three models in this line were the S50, the M50, and the A50p. All three desktops were equipped with Intel Pentium 4 processors. The chassis was made of steel and designed for easy component access without the use of tools. The hard disk was fixed in place by a 'caddy' without the use of screws. The caddy had rubber bumpers to reduce vibration and operational noise. Additional updates to the desktops included greater use of ThinkVantage technologies. All desktop models were made available with ImageUltra. The three desktop models also included an 'Access IBM' button, allowing access to onboard resources, diagnostic tools, automated software, and links to online updates and services. Select models featured IBM's Embedded Security Subsystem, with an integrated security chip and IBM Client Security Software.

Acquisition by Lenovo

In 2005, after completing the acquisition of IBM's personal computing business that led to the IBM/Lenovo partnership, Lenovo announced the ThinkCentre E Series desktops, designed specifically for small businesses. The ThinkCentre E50 was made available in tower and small form factor, with a silver and black design. In 2005, Technology Business Research (TBR) observed an increase in the customer satisfaction rate for ThinkCentre desktops. According to TBR's "Corporate IT Buying Behavior and Customer Satisfaction Study" published in the second quarter of 2005, Lenovo was the only one of the four surveyed companies that displayed a substantial increase in ratings. In May 2005, the ThinkCentre M52 and A52 desktops were announced by Lenovo. These desktops marked the
https://en.wikipedia.org/wiki/DVB-H
DVB-H (digital video broadcasting - handheld) is one of three prevalent mobile TV formats. It is a technical specification for bringing broadcast services to mobile handsets. DVB-H was formally adopted as ETSI standard EN 302 304 in November 2004. The DVB-H specification (EN 302 304) can be downloaded from the official DVB-H website. In March 2008, DVB-H was officially endorsed by the European Union as the "preferred technology for terrestrial mobile broadcasting". The major competitors of this technology are Qualcomm's MediaFLO system, the 3G cellular system based MBMS mobile-TV standard, and the ATSC-M/H format in the U.S. DVB-SH (Satellite to Handhelds) now and DVB-NGH (Next Generation Handheld) in the future are possible enhancements to DVB-H, providing improved spectral efficiency and better modulation flexibility. DVB-H has been a commercial failure, and the service is no longer on-air. Ukraine was the last country with a nationwide broadcast in DVB-H, which began transitioning to DVB-T2 during 2019.

Technical explanation

DVB-H technology is a superset of the successful DVB-T (Digital Video Broadcasting - Terrestrial) system for digital terrestrial television, with additional features to meet the specific requirements of handheld, battery-powered receivers. In 2002 four main requirements of the DVB-H system were agreed: broadcast services for portable and mobile usage with 'acceptable quality'; a typical user environment, and so geographical coverage, as mobile radio; access to service while moving in a vehicle at high speed (as well as imperceptible handover when moving from one cell to another); and as much compatibility with existing digital terrestrial television (DVB-T) as possible, to allow sharing of network and transmission equipment. DVB-H can offer a downstream channel at high data rates which can be used standalone or as an enhancement of mobile telecommunication networks which many typical handheld terminals are able to access anyway. Time slicing technology
https://en.wikipedia.org/wiki/Center%20of%20population
In demographics, the center of population (or population center) of a region is a geographical point that describes a centerpoint of the region's population. There are several ways of defining such a "center point", leading to different geographical locations; these are often confused.

Definitions

Three commonly used (but different) center points are:

- the mean center, also known as the centroid or center of gravity;
- the median center, which is the intersection of the median longitude and median latitude;
- the geometric median, also known as the Weber point, Fermat–Weber point, or point of minimum aggregate travel.

A further complication is caused by the curved shape of the Earth. Different center points are obtained depending on whether the center is computed in three-dimensional space, or restricted to the curved surface, or computed using a flat map projection.

Mean center

The mean center, or centroid, is the point on which a rigid, weightless map would balance perfectly, if the population members are represented as points of equal mass. Mathematically, the centroid is the point to which the population has the smallest possible sum of squared distances. It is easily found by taking the arithmetic mean of each coordinate. If defined in three-dimensional space, the centroid of points on the Earth's surface is actually inside the Earth. This point could then be projected back to the surface. Alternatively, one could define the centroid directly on a flat map projection; this is, for example, the definition that the US Census Bureau uses. Contrary to a common misconception, the centroid does not minimize the average distance to the population. That property belongs to the geometric median.

Median center

The median center is the intersection of two perpendicular lines, each of which divides the population into two equal halves. Typically these two lines are chosen to be a parallel (a line of latitude) and a meridian (a line of longitude). In that case, the
https://en.wikipedia.org/wiki/Nyquist%20stability%20criterion
In control theory and stability theory, the Nyquist stability criterion or Strecker–Nyquist stability criterion, independently discovered by the German electrical engineer Felix Strecker at Siemens in 1930 and the Swedish-American electrical engineer Harry Nyquist at Bell Telephone Laboratories in 1932, is a graphical technique for determining the stability of a dynamical system. Because it only looks at the Nyquist plot of the open-loop system, it can be applied without explicitly computing the poles and zeros of either the closed-loop or open-loop system (although the number of each type of right-half-plane singularities must be known). As a result, it can be applied to systems defined by non-rational functions, such as systems with delays. In contrast to Bode plots, it can handle transfer functions with right half-plane singularities. In addition, there is a natural generalization to more complex systems with multiple inputs and multiple outputs, such as control systems for airplanes. The Nyquist stability criterion is widely used in electronics and control system engineering, as well as other fields, for designing and analyzing systems with feedback. While Nyquist is one of the most general stability tests, it is still restricted to linear time-invariant (LTI) systems. Nevertheless, there are generalizations of the Nyquist criterion (and plot) for non-linear systems, such as the circle criterion and the scaled relative graph of a nonlinear operator. Additionally, other stability criteria like Lyapunov methods can also be applied for non-linear systems. Although Nyquist is a graphical technique, it only provides a limited amount of intuition for why a system is stable or unstable, or how to modify an unstable system to be stable. Techniques like Bode plots, while less general, are sometimes a more useful design tool.

Nyquist plot

A Nyquist plot is a parametric plot of a frequency response used in automatic control and signal processing. The most common use of Nyquist plots
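The criterion the text describes reduces to a single counting formula, shown here in standard textbook notation (the symbols below are this sketch's conventions, not drawn from the excerpt):

```latex
Z = N + P
```

where $P$ is the number of poles of the open-loop transfer function $L(s)$ in the right half-plane, $N$ is the net number of clockwise encirclements of the point $-1 + j0$ by the Nyquist plot of $L(s)$, and $Z$ is the resulting number of closed-loop poles in the right half-plane. The closed-loop system is stable exactly when $Z = 0$, i.e. when the plot encircles $-1$ counterclockwise $P$ times; this is why only the counts of right-half-plane singularities, not their exact locations, must be known.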
https://en.wikipedia.org/wiki/Bromethalin
Bromethalin is a neurotoxic rodenticide that damages the central nervous system.

History

Bromethalin was discovered in the early 1980s through an approach to find replacement rodenticides for first-generation anticoagulants, especially to be useful against rodents that had become resistant to warfarin-type anticoagulant poisons. A structured study was undertaken to develop a substance that would be both poisonous to rodents and readily eaten by them. Bromethalin—N-methyl-2,4-dinitro-N-(2,4,6-tribromophenyl)-6-(trifluoromethyl)benzenamine—was the outcome of that study, as this specific formulation had both desired rodenticidal properties.

Mechanism of action

Bromethalin works by being metabolised to n-desmethyl-bromethalin and uncoupling mitochondrial oxidative phosphorylation, which causes a decrease in adenosine triphosphate (ATP) synthesis. The decreased ATP inhibits the activity of the Na/K-ATPase enzyme, thereby leading to a subsequent buildup of cerebrospinal fluid (CSF) and vacuolization of myelin. The excess CSF results in increased intracranial pressure, which in turn permanently damages neuronal axons. This damage to the central nervous system can cause paralysis, convulsions, and death.

Risk of poisoning to humans and pets

Despite the risk of severe symptoms and death, most unintentional pediatric exploratory exposures (licking or tasting a pellet) have not shown serious effects, and no deaths have been reported at this time in children, though toxicity is possible if significant amounts are ingested. Because an active metabolite must be generated to produce toxicity, fatal toxicity may be delayed by hours to days. All cases should be managed in consultation with a local poison control center. All intentional ingestions for self-harm carry significant risk of death or severe neurologic effects and require monitoring in a hospital setting. In humans the most common initial effects of unintentional exposure are nausea, vomiting, and abdominal pain.
https://en.wikipedia.org/wiki/Dynamic%20web%20page
A dynamic web page is a web page constructed at runtime (during software execution), as opposed to a static web page, which is delivered exactly as it is stored. A server-side dynamic web page is a web page whose construction is controlled by an application server processing server-side scripts. In server-side scripting, parameters determine how the assembly of every new web page proceeds, including the setting up of more client-side processing. A client-side dynamic web page processes the web page using JavaScript running in the browser as it loads. JavaScript can interact with the page via the Document Object Model (DOM), to query page state and modify it. Even though a web page can be dynamic on the client-side, it can still be hosted on a static hosting service such as GitHub Pages or Amazon S3 as long as there is not any server-side code included. A dynamic web page is then reloaded by the user or by a computer program to change some variable content. The updating information could come from the server, or from changes made to that page's DOM. This may or may not truncate the browsing history or create a saved version to go back to, but a dynamic web page update using AJAX technologies will neither create a page to go back to, nor truncate the web browsing history forward of the displayed page. Using AJAX, the end user gets one dynamic page managed as a single page in the web browser while the actual web content rendered on that page can vary. The AJAX engine sits in the browser, requesting parts of its DOM from an application server on behalf of its client. A particular application server could offer a standardized REST style interface to offer services to the web application. DHTML is the umbrella term for technologies and methods used to create web pages that are not static web pages, though it has fallen out of common use since the popularization of AJAX, a term which is now itself rarely used. Client-side-scripting, server-side scripting, or a combination of these
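The server-side assembly described above can be sketched in a few lines: a hypothetical handler fills a page template from per-request parameters before the response is sent. The template text and parameter names here are invented for illustration; a real application server would use a template engine and an HTTP framework.

```python
from string import Template

# A hypothetical page template; in a real application server this
# might live on disk or inside a template engine.
PAGE = Template(
    "<!DOCTYPE html><html><body>"
    "<h1>Hello, $name</h1>"
    "<p>You have $count new messages.</p>"
    "</body></html>"
)

def render_page(params):
    """Assemble the page at request time from per-request parameters,
    as a server-side script would before sending the response."""
    return PAGE.substitute(name=params.get("name", "guest"),
                           count=params.get("count", 0))

# Each request with different parameters yields a different page.
print(render_page({"name": "Ada", "count": 3}))
print(render_page({}))  # falls back to defaults
```

The same template served twice with different parameters produces two different pages, which is exactly what distinguishes this from a statically stored page.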
https://en.wikipedia.org/wiki/Racemic%20acid
Racemic acid is an old name for an optically inactive or racemic form of tartaric acid. It is an equal mixture of two mirror-image isomers (enantiomers), optically active in opposing directions. It occurs naturally in grape juice. Tartaric acid's sodium-ammonium salt is unusual among racemic mixtures in that during crystallization it can separate out into two kinds of crystals, each composed of one isomer, and whose macroscopic crystalline shapes are mirror images of each other. Thus, Louis Pasteur was able in 1848 to isolate each of the two enantiomers by laboriously separating the two kinds of crystals using delicate tweezers and a hand lens. Pasteur announced his intention to resolve racemic acid in: Pasteur, Louis (1848) "Sur les relations qui peuvent exister entre la forme cristalline, la composition chimique et le sens de la polarisation rotatoire", while he presented his resolution of racemic acid into separate optical isomers in: Pasteur, Louis (1850) "Recherches sur les propriétés spécifiques des deux acides qui composent l'acide racémique". In the latter paper, Pasteur sketched chiral polytopes from concrete natural objects, quite possibly for the first time. The optical property of tartaric acid was first observed in 1832 by Jean Baptiste Biot, who observed its ability to rotate polarized light. It remains unknown whether Arthur Cayley or Ludwig Schläfli, or other contemporary mathematicians who studied polytopes, knew of the French work. In two modern-day re-enactments of the Pasteur experiment performed in Japan, it was established that the preparation of the crystals was not very reproducible. The crystals were deformed, but they were large enough to inspect with the naked eye (no microscope was required). See also Tartaric acid Uvitic acid Uvitonic acid
https://en.wikipedia.org/wiki/Abuse%20of%20notation
In mathematics, abuse of notation occurs when an author uses a mathematical notation in a way that is not entirely formally correct, but which might help simplify the exposition or suggest the correct intuition (while possibly minimizing errors and confusion at the same time). However, since the concept of formal/syntactical correctness depends on both time and context, certain notations in mathematics that are flagged as abuse in one context could be formally correct in one or more other contexts. Time-dependent abuses of notation may occur when novel notations are introduced to a theory some time before the theory is first formalized; these may be formally corrected by solidifying and/or otherwise improving the theory. Abuse of notation should be contrasted with misuse of notation, which does not have the presentational benefits of the former and should be avoided (such as the misuse of constants of integration). A related concept is abuse of language or abuse of terminology, where a term — rather than a notation — is misused. Abuse of language is an almost synonymous expression for abuses that are non-notational by nature. For example, while the word representation properly designates a group homomorphism from a group G to GL(V), where V is a vector space, it is common to call V "a representation of G". Another common abuse of language consists in identifying two mathematical objects that are different, but canonically isomorphic. Other examples include identifying a constant function with its value, identifying a group equipped with a binary operation with the name of its underlying set, or identifying R^3 with the Euclidean space of dimension three equipped with a Cartesian coordinate system. Examples Structured mathematical objects Many mathematical objects consist of a set, often called the underlying set, equipped with some additional structure, such as a mathematical operation or a topology. It is a common abuse of notation to use the same notation for the underlying
https://en.wikipedia.org/wiki/Base%20address
In computing, a base address is an address serving as a reference point ("base") for other addresses. Related addresses can be accessed using an addressing scheme. Under the relative addressing scheme, to obtain an absolute address, the relevant base address is taken and an offset (also known as a displacement) is added to it. Under this type of scheme, the base address is the lowest numbered address within a prescribed range, to facilitate adding related positive-valued offsets. In IBM System/360 architecture, the base address is a 24-bit value in a general register (extended in steps to 64 bits in z/Architecture), and the offset is a 12-bit value in the instruction (extended to 20 bits in z/Architecture). See also Index register Rebasing Computer memory
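The base-plus-displacement calculation can be sketched as follows. The 12-bit default mirrors the System/360 instruction format described above; the function name and range check are ours, for illustration only.

```python
def absolute_address(base, offset, offset_bits=12):
    """Combine a base address with a positive displacement.

    The 12-bit default mirrors the System/360 instruction field;
    larger displacements would not fit in the instruction.
    """
    if not 0 <= offset < (1 << offset_bits):
        raise ValueError("displacement does not fit the instruction field")
    return base + offset

# A block of related data at base 0x4000: fields at offsets 0x10 and 0xFFF
print(hex(absolute_address(0x4000, 0x10)))   # 0x4010
print(hex(absolute_address(0x4000, 0xFFF)))  # 0x4fff
```

Because the base is the lowest address of the range, every valid displacement is non-negative, which is why the check rejects negative offsets as well as oversized ones.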
https://en.wikipedia.org/wiki/Composite%20monitor
A composite monitor or composite video monitor is any analog video display that receives input in the form of an analog composite video signal to a defined specification. A composite video signal encodes all video information on a single conductor; a composite cable has a single live conductor plus earth. Other equipment with display functionality includes monitors with more advanced interfaces and connectors giving a better picture, including analog VGA, and digital DVI, HDMI, and DisplayPort; and television (TV) receivers, which are self-contained, receiving and displaying RF video broadcasts with an internal tuner. Video monitors are used for displaying computer output, closed-circuit television (e.g. security cameras) and other applications requiring a two-dimensional monochrome or colour image. Inputs Composite monitors usually use RCA jacks or BNC connectors for video input. Earlier equipment (1970s) often used UHF connectors. Typically simple composite monitors give a picture inferior to that of other interfaces. In principle a monitor can have one or several of multiple types of input, including composite—in addition to composite monitors as such, many monitors accept composite input among other standards. In practice computer monitors ceased to support composite input as other interfaces became predominant. A composite monitor must have a two-dimensional approximately flat display device with circuitry to accept a composite signal carrying picture and synchronisation information, separate it into luminance and chrominance, recovering either a monochrome picture or the red, green, and blue of RGB, plus synchronisation pulses, and display it on a screen, which was predominantly a CRT until the 21st century, and then a thin panel using LCD or other technology. A critical factor in the quality of this display is the type of encoding used in the TV camera to combine the signal together and the decoding used in the monitor to separate the signals back to RGB for display. Composite monitors can b
https://en.wikipedia.org/wiki/Fuchs%27%20dystrophy
Fuchs dystrophy, also referred to as Fuchs endothelial corneal dystrophy (FECD) and Fuchs endothelial dystrophy (FED), is a slowly progressing corneal dystrophy that usually affects both eyes and is slightly more common in women than in men. Although early signs of Fuchs dystrophy are sometimes seen in people in their 30s and 40s, the disease rarely affects vision until people reach their 50s and 60s. Signs and symptoms As a progressive, chronic condition, signs and symptoms of Fuchs dystrophy gradually progress over decades of life, starting in middle age. Early symptoms include blurry vision upon wakening which improves during the morning, as fluid retained in the cornea is unable to evaporate through the surface of the eye when the lids are closed overnight. As the disease worsens, the interval of blurry morning vision extends from minutes to hours. In moderate stages of the disease, an increase in guttae and swelling in the cornea can contribute to changes in vision and decreased sharpness throughout the day. Contrast sensitivity may be affected. The change in the refractive index of the cornea may result in subtle refractive shifts, which affected individuals may experience as a small change in their eyeglass prescription. In the late stages of the disease, the cornea is unable to maintain its fluid content and blisters, known as bullae, form on the surface of the cornea. These cause foreign body sensations and can be painful. The cornea may not heal from such epithelial defects, until corneal transplantation is able to restore the endothelial pump function. Cause FECD is a degenerative disease of the corneal endothelium with accumulation of focal outgrowths called guttae (drops) and thickening of Descemet's membrane, leading to corneal edema and loss of vision. The corneal endothelial cell layer and its basement membrane (Descemet's membrane) act as a barrier to hydration of the corneal stroma by aqueous humor and are "pump" cells of the cornea that functi
https://en.wikipedia.org/wiki/Homogeneous%20polynomial
In mathematics, a homogeneous polynomial, sometimes called quantic in older texts, is a polynomial whose nonzero terms all have the same degree. For example, x^5 + 2x^3y^2 + 9xy^4 is a homogeneous polynomial of degree 5, in two variables; the sum of the exponents in each term is always 5. The polynomial x^3 + 3x^2y + z^7 is not homogeneous, because the sum of exponents does not match from term to term. The function defined by a homogeneous polynomial is always a homogeneous function. An algebraic form, or simply form, is a function defined by a homogeneous polynomial. A binary form is a form in two variables. A form is also a function defined on a vector space, which may be expressed as a homogeneous function of the coordinates over any basis. A polynomial of degree 0 is always homogeneous; it is simply an element of the field or ring of the coefficients, usually called a constant or a scalar. A form of degree 1 is a linear form. A form of degree 2 is a quadratic form. In geometry, the Euclidean distance is the square root of a quadratic form. Homogeneous polynomials are ubiquitous in mathematics and physics. They play a fundamental role in algebraic geometry, as a projective algebraic variety is defined as the set of the common zeros of a set of homogeneous polynomials. Properties A homogeneous polynomial defines a homogeneous function. This means that, if a multivariate polynomial P is homogeneous of degree d, then P(λx_1, ..., λx_n) = λ^d P(x_1, ..., x_n) for every λ in any field containing the coefficients of P. Conversely, if the above relation is true for infinitely many values of λ, then the polynomial is homogeneous of degree d. In particular, if P is homogeneous, then P(x_1, ..., x_n) = 0 implies P(λx_1, ..., λx_n) = 0 for every λ. This property is fundamental in the definition of a projective variety. Any nonzero polynomial may be decomposed, in a unique way, as a sum of homogeneous polynomials of different degrees, which are called the homogeneous components of the polynomial. Given a polynomial ring over a field (or, more generally, a ring) K, the homogeneous polynomials of degree d
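The decomposition into homogeneous components can be illustrated with a small sketch. It represents a polynomial as a mapping from exponent tuples to coefficients — a representation chosen here for convenience, not anything from the article.

```python
def homogeneous_components(poly):
    """Split a polynomial, given as {exponent tuple: coefficient},
    into its homogeneous components, keyed by degree."""
    components = {}
    for exponents, coeff in poly.items():
        degree = sum(exponents)  # total degree of this term
        components.setdefault(degree, {})[exponents] = coeff
    return components

def is_homogeneous(poly):
    """A nonzero polynomial is homogeneous iff all terms share one degree."""
    return len(homogeneous_components(poly)) <= 1

# x^5 + 2x^3y^2 + 9xy^4: every term has degree 5
p = {(5, 0): 1, (3, 2): 2, (1, 4): 9}
# x^3 + 3x^2y + y^7: term degrees are 3, 3 and 7
q = {(3, 0): 1, (2, 1): 3, (0, 7): 1}

print(is_homogeneous(p))                  # True
print(sorted(homogeneous_components(q)))  # [3, 7]
```

The second polynomial splits into a degree-3 component (x^3 + 3x^2y) and a degree-7 component (y^7), matching the unique decomposition described above.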
https://en.wikipedia.org/wiki/Networking%20hardware
Networking hardware, also known as network equipment or computer networking devices, are electronic devices that are required for communication and interaction between devices on a computer network. Specifically, they mediate data transmission in a computer network. Units which are the last receiver or generate data are called hosts, end systems or data terminal equipment. Range Networking devices include a broad range of equipment which can be classified as core network components which interconnect other network components, hybrid components which can be found in the core or border of a network, and hardware or software components which typically sit on the connection point of different networks. The most common kind of networking hardware today is a copper-based Ethernet adapter, which is a standard inclusion on most modern computer systems. Wireless networking has become increasingly popular, especially for portable and handheld devices. Other networking hardware used in computers includes data center equipment (such as file servers, database servers and storage areas), network services (such as DNS, DHCP, email, etc.) as well as devices which assure content delivery. Taking a wider view, mobile phones, tablet computers and devices associated with the internet of things may also be considered networking hardware. As technology advances and IP-based networks are integrated into building infrastructure and household utilities, network hardware will become an ambiguous term owing to the vastly increasing number of network-capable endpoints. Specific devices Network hardware can be classified by its location and role in the network. Core Core network components interconnect other network components. Gateway: an interface providing compatibility between networks by converting transmission speeds, protocols, codes, or security measures. Router: a networking device that forwards data packets between computer networks. Routers perform the "traffic directing" fun
https://en.wikipedia.org/wiki/Z-order%20curve
In mathematical analysis and computer science, functions which are Z-order, Lebesgue curve, Morton space-filling curve, Morton order or Morton code map multidimensional data to one dimension while preserving locality of the data points. It is named in France after Henri Lebesgue, who studied it in 1904, and named in the United States after Guy Macdonald Morton, who first applied the order to file sequencing in 1966. The z-value of a point in multidimensions is simply calculated by interleaving the binary representations of its coordinate values. Once the data are sorted into this ordering, any one-dimensional data structure can be used, such as simple one dimensional arrays, binary search trees, B-trees, skip lists or (with low significant bits truncated) hash tables. The resulting ordering can equivalently be described as the order one would get from a depth-first traversal of a quadtree or octree. Coordinate values The figure below shows the Z-values for the two dimensional case with integer coordinates 0 ≤ x ≤ 7, 0 ≤ y ≤ 7 (shown both in decimal and binary). Interleaving the binary coordinate values (starting to the right with the x-bit (in blue) and alternating to the left with the y-bit (in red)) yields the binary z-values (tilted by 45° as shown). Connecting the z-values in their numerical order produces the recursively Z-shaped curve. Two-dimensional Z-values are also known as quadkey values. The Z-values of the x coordinates are described as binary numbers from the Moser–de Bruijn sequence, having nonzero bits only in their even positions: x[] = {0b000000, 0b000001, 0b000100, 0b000101, 0b010000, 0b010001, 0b010100, 0b010101} The sum and difference of two x values are calculated by using bitwise operations: x[i+j] = ((x[i] | 0b10101010) + x[j]) & 0b01010101 x[i−j] = ((x[i] & 0b01010101) − x[j]) & 0b01010101 if i ≥ j This property can be used to offset a Z-value, for example in two dimensions the coordinates to the top (decreasing y), bottom (increasi
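The bit interleaving and the dilated addition described above can be sketched as follows (the function names are ours; the bit masks are the two-dimensional, 8-bit masks from the text):

```python
def z_value(x, y, bits=3):
    """Interleave the bits of x and y: x-bits in the even (low)
    positions, y-bits in the odd positions, as in the figure."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)      # x-bit
        z |= ((y >> i) & 1) << (2 * i + 1)  # y-bit
    return z

# Moser-de Bruijn sequence: z-values of the points on the x axis (y = 0)
x_vals = [z_value(x, 0) for x in range(8)]
print(x_vals)  # [0, 1, 4, 5, 16, 17, 20, 21]

def x_add(a, b):
    """Add two dilated x values using the identity from the text:
    filling the unused (y) bit positions makes carries propagate
    correctly across them."""
    return ((a | 0b10101010) + b) & 0b01010101

# x[2] + x[3] yields x[5]
print(x_add(x_vals[2], x_vals[3]) == x_vals[5])  # True
```

Sorting points by `z_value` gives exactly the recursively Z-shaped order, so any one-dimensional structure keyed on it inherits the curve's locality.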
https://en.wikipedia.org/wiki/SOAP%20note
The SOAP note (an acronym for subjective, objective, assessment, and plan) is a method of documentation employed by healthcare providers to write out notes in a patient's chart, along with other common formats, such as the admission note. Documenting patient encounters in the medical record is an integral part of practice workflow starting with appointment scheduling, patient check-in and exam, documentation of notes, check-out, rescheduling, and medical billing. Additionally, it serves as a general cognitive framework for physicians to follow as they assess their patients. The SOAP note originated from the problem-oriented medical record (POMR), developed nearly 50 years ago by Lawrence Weed, MD. It was initially developed for physicians to allow them to approach complex patients with multiple problems in a highly organized way. Today, it is widely adopted as a communication tool between inter-disciplinary healthcare providers as a way to document a patient's progress. SOAP notes are commonly found in electronic medical records (EMR) and are used by providers of various backgrounds. Generally, SOAP notes are used as a template to guide the information that physicians add to a patient's EMR. Prehospital care providers such as emergency medical technicians may use the same format to communicate patient information to emergency department clinicians. Due to its clear objectives, the SOAP note provides physicians a way to standardize the organization of a patient's information to reduce confusion when patients are seen by various members of healthcare professions. Many healthcare providers, ranging from physicians to behavioral healthcare professionals to veterinarians, use the SOAP note format for their patient's initial visit and to monitor progress during follow-up care. Components The four components of a SOAP note are Subjective, Objective, Assessment, and Plan. The length and focus of each component of a SOAP note vary depending on the specialty; for instance,
https://en.wikipedia.org/wiki/Power-on%20self-test
A power-on self-test (POST) is a process performed by firmware or software routines immediately after a computer or other digital electronic device is powered on. This article mainly deals with POSTs on personal computers, but many other embedded systems such as those in major appliances, avionics, communications, or medical equipment also have self-test routines which are automatically invoked at power-on. The results of the POST may be displayed on a panel that is part of the device, output to an external device, or stored for future retrieval by a diagnostic tool. Since a self-test might detect that the system's usual human-readable display is non-functional, an indicator lamp or a speaker may be provided to show error codes as a sequence of flashes or beeps. In addition to running tests, the POST process may also set the initial state of the device from firmware. In the case of a computer, the POST routines are part of a device's pre-boot sequence; if they complete successfully, the bootstrap loader code is invoked to load an operating system. IBM-compatible PC POST In IBM PC compatible computers, the main duties of POST are handled by the BIOS/UEFI, which may hand some of these duties to other programs designed to initialize very specific peripheral devices, notably for video and SCSI initialization. These other duty-specific programs are generally known collectively as option ROMs or individually as the video BIOS, SCSI BIOS, etc. The principal duties of the main BIOS during POST are as follows:
verify CPU registers
verify the integrity of the BIOS code itself
verify some basic components like DMA, timer, interrupt controller
initialize, size, and verify system main memory
initialize BIOS
pass control to other specialized extension BIOSes (if installed)
identify, organize, and select which devices are available for booting
The functions above are served by the POST in all BIOS versions back to the very first. In later BIOS versions, POST will also
https://en.wikipedia.org/wiki/DNA%20glycosylase
DNA glycosylases are a family of enzymes involved in base excision repair, classified under EC number EC 3.2.2. Base excision repair is the mechanism by which damaged bases in DNA are removed and replaced. DNA glycosylases catalyze the first step of this process. They remove the damaged nitrogenous base while leaving the sugar-phosphate backbone intact, creating an apurinic/apyrimidinic site, commonly referred to as an AP site. This is accomplished by flipping the damaged base out of the double helix followed by cleavage of the N-glycosidic bond. Glycosylases were first discovered in bacteria, and have since been found in all kingdoms of life. In addition to their role in base excision repair, DNA glycosylase enzymes have been implicated in the repression of gene silencing in A. thaliana, N. tabacum and other plants by active demethylation. 5-methylcytosine residues are excised and replaced with unmethylated cytosines allowing access to the chromatin structure of the enzymes and proteins necessary for transcription and subsequent translation. Monofunctional vs. bifunctional glycosylases There are two main classes of glycosylases: monofunctional and bifunctional. Monofunctional glycosylases have only glycosylase activity, whereas bifunctional glycosylases also possess AP lyase activity that permits them to cut the phosphodiester bond of DNA, creating a single-strand break without the need for an AP endonuclease. β-Elimination of an AP site by a glycosylase-lyase yields a 3' α,β-unsaturated aldehyde adjacent to a 5' phosphate, which differs from the AP endonuclease cleavage product. Some glycosylase-lyases can further perform δ-elimination, which converts the 3' aldehyde to a 3' phosphate. Biochemical mechanism The first crystal structure of a DNA glycosylase was obtained for E. coli Nth. This structure revealed that the enzyme flips the damaged base out of the double helix into an active site pocket in order to excise it. Other glycosylases have since been
https://en.wikipedia.org/wiki/Mandorla
A mandorla is an almond-shaped aureola, i.e. a frame that surrounds the totality of an iconographic figure. It is usually synonymous with vesica, a lens shape. Mandorlas often surround the figures of Jesus Christ and the Virgin Mary in traditional Christian iconography. It is distinguished from a halo in that it encircles the entire body and not just the head. It is commonly used to frame the figure of Christ in Majesty in early medieval and Romanesque art, as well as Byzantine art of the same periods. It is the shape generally used for mediaeval ecclesiastical seals, secular seals generally being round. Depictions Mandorla is Italian for the almond nut, to which shape it refers. It may be elliptical or depicted as a vesica, a lens shape formed by the intersection of two circles. Rhombic mandorlas are also sometimes depicted. In icons of the Eastern Orthodox Church, the mandorla is used to depict sacred moments that "transcend time and space", such as the Resurrection and the Transfiguration of Jesus Christ and the Dormition of the Theotokos. These mandorlas are often painted in several concentric bands of different colour, which become darker as they progress towards the centre of the mandorla. This accords with the church's use of apophatic theology, as described by Dionysius the Areopagite and others: as holiness increases, only increasing darkness can depict the luminance and brightness thereof. In architectural iconography, the frame of the mandorla is often marked with decorative mouldings. The interior of the mandorla is usually undecorated, but may contain the symbols for Alpha and Omega (Α and Ω) or, less frequently, depictions of a starry sky or clouds. In a famous Catholic Romanesque fresco of Jesus Christ in Glory in Sant Climent de Taüll, the scriptural inscription Ego Sum Lux Mundi ("I Am the Light of the World") is incorporated in the mandorla design. The tympanum at Conques has Christ, with a gesture carved in Romanesque sculpture, indicating the angels at his fe
https://en.wikipedia.org/wiki/List%20of%20application%20servers
This list compares the features and functionality of application servers, grouped by the hosting environment that is offered by that particular application server.
BASIC
Run BASIC: An all-in-one BASIC scriptable application server; can automatically manage session and state.
C
Enduro/X: A middleware platform for distributed transaction processing, based on the XATMI and XA standards; open source, with a C API.
C++
Tuxedo: Based on the ATMI standard; one of the original application servers.
Wt: A web toolkit similar to Qt, permitting GUI-application-like web development with built-in Ajax abilities.
POCO C++ Libraries: A set of open source class libraries, including an HTTP server (Poco.Net.HTTPServer).
CppCMS
Enduro/X: A middleware platform for distributed transaction processing, based on the XATMI and XA standards; open source.
Go
Enduro/X ASG: Application server for Go. Provides XATMI and XA facilities for Golang. Go applications can be built as normal Go executable files which in turn provide stateless services; these can be load balanced, clustered, and reloaded on the fly without service interruption by means of administrative work only. The framework provides a distributed transaction processing facility for Go.
Java
Apache MINA: An abstract event-driven asynchronous API over various transports such as TCP/IP and UDP/IP via Java NIO.
Netty: A non-blocking I/O client-server framework for the development of Java network applications, similar in spirit to Node.js.
JavaScript
Broadvision: Server-side JavaScript AS. One of the early entrants in the market during the eCommerce dot-com bubble, with vertical solution packages catering to the eCommerce industry.
Wakanda Server: Server-side JavaScript application server integrating a NoSQL database engine (WakandaDB), a dedicated HTTP server, user and group management, and an optional client-side JavaScript framework.
Node.js: Implements Google's V8 engine as a standalone (outside the browser) asynchronous Javascript inter
https://en.wikipedia.org/wiki/Exon%20trapping
Exon trapping is a molecular biology technique to identify potential exons in a fragment of eukaryote DNA of unknown intron-exon structure. This is done to determine if the fragment is part of an expressed gene. The genomic fragment is inserted into the intron of a 'splicing vector' consisting of a known exon–intron–exon sequence of DNA, and the vector is then inserted into a eukaryotic cell. If the fragment does not contain exons (i.e., consists solely of intron DNA), it will be spliced out together with the vector's original intron. On the other hand, if exons are contained, they will be part of the mature mRNA after transcription (with all intron material removed). The presence of 'trapped exons' can be detected by an increase in size of the mRNA, or through RT-PCR to amplify the DNA of interest. The technique has largely been supplanted by the approach of sequencing cDNA generated from mRNA and then using bioinformatics tools such as NCBI's BLAST server to determine the source of the sequence, thereby identifying the appropriate exon-intron splice sites.
https://en.wikipedia.org/wiki/Mature%20messenger%20RNA
Mature messenger RNA, often abbreviated as mature mRNA, is a eukaryotic RNA transcript that has been spliced and processed and is ready for translation in the course of protein synthesis. Unlike the eukaryotic RNA immediately after transcription, known as precursor messenger RNA, mature mRNA consists exclusively of exons and has all introns removed. Mature mRNA is also called "mature transcript", "mature RNA" or "mRNA". The production of a mature mRNA molecule occurs in 3 steps: Capping of the 5' end Polyadenylation of the 3' end RNA splicing of the introns Capping the 5' End During capping, a 7-methylguanosine residue is attached to the 5'-terminal end of the primary transcript. This is otherwise known as the 5' cap. The 5' cap increases mRNA stability. Further, the 5' cap serves as an attachment point for ribosomes. Beyond this, the 5' cap has also been shown to have a role in exporting the mature mRNA from the nucleus and into the cytoplasm. Polyadenylation In polyadenylation, a poly-adenosine tail of about 200 adenylate residues is added post-transcriptionally by a nuclear polymerase. This is known as a poly-A tail and is used for stability and guidance, so that the mRNA can exit the nucleus and find the ribosome. It is added at a polyadenylation site in the 3' untranslated region of the mRNA, cleaving the mRNA in the process. When there are multiple polyadenylation sites on the same mRNA molecule, alternative polyadenylation can occur. See polyadenylation for further details. RNA Splicing Pre-mRNA has both introns and exons. As a part of the maturation process, RNA splicing removes the non-coding introns, leaving behind the exons, which are joined together to form the mature mRNA. Splicing is conducted by the spliceosome, a large ribonucleoprotein which cleaves the RNA at the splicing sites and recombines the exons of the RNA. Similar to polyadenylation, alternative splicing can occur, resultin
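The three processing steps can be caricatured in a short toy model. The sequences, intron strings, and symbolic "m7G-" cap below are invented for illustration; real splicing is performed by the spliceosome at defined splice sites, not by string matching.

```python
def mature_mrna(pre_mrna, introns, tail_length=200):
    """Toy model of the three steps above: splice out introns,
    add a symbolic 5' cap, and append a poly-A tail
    (about 200 adenylate residues in real transcripts)."""
    spliced = pre_mrna
    for intron in introns:
        # RNA splicing: remove the intron, joining the flanking exons
        spliced = spliced.replace(intron, "", 1)
    # 5' capping (7-methylguanosine, shown symbolically) and polyadenylation
    return "m7G-" + spliced + "A" * tail_length

# Invented pre-mRNA with one invented intron; short tail for readability
transcript = mature_mrna("AUGGUAAGUUAA", introns=["GUAAGU"], tail_length=5)
print(transcript)  # m7G-AUGUAAAAAAA
```

The output keeps only the exon material, flanked by the cap and the tail, mirroring the order in which the text presents the three maturation steps.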
https://en.wikipedia.org/wiki/Pseudoscalar%20meson
In high-energy physics, a pseudoscalar meson is a meson with total spin 0 and odd parity (usually notated as J^P = 0^−). Pseudoscalar mesons are commonly seen in proton-proton scattering and proton-antiproton annihilation, and include the pion (π), kaon (K), eta (η), and eta prime (η′) particles, whose masses are known with great precision. Among all of the mesons known to exist, in some sense, the pseudoscalars are the most well studied and understood. History The pion (π) was first proposed to exist by Yukawa in the 1930s as the primary force-carrying boson of the Yukawa potential in nuclear interactions, and was later observed at nearly the same mass that he originally predicted for it. In the 1950s and 1960s, the pseudoscalar mesons began to proliferate, and were eventually organized into a multiplet according to Murray Gell-Mann's so-called "Eightfold Way". Gell-Mann further predicted the existence of a ninth resonance in the pseudoscalar multiplet. Indeed, this particle was later found and is now known as the eta prime meson (η′). The structure of the pseudoscalar meson multiplet, and also the ground state baryon multiplets, led Gell-Mann (and Zweig, independently) to create the well known quark model. The η–η′ puzzle Despite the pseudoscalar mesons' masses being known to high precision, and being the most well studied and understood mesons, the decay properties of the pseudoscalar mesons, particularly of eta (η) and eta prime (η′), are somewhat contradictory to their mass hierarchy: while the η′ meson is much more massive than the η meson, the η meson is thought to contain a larger component of the relatively heavy strange and anti-strange quarks than the η′ meson does, which appears contradictory. This failure of the quark model to explain this mass difference is called the "η–η′ puzzle". The presence of an η(1405) state also brings glueball mixing into the discussion. It is possible that the η and η′ mesons mix with the pseudoscalar glueball which should occur
https://en.wikipedia.org/wiki/Vector%20meson
In high-energy physics, a vector meson is a meson with total spin 1 and odd parity (usually noted as J^P = 1⁻). Vector mesons have been seen in experiments since the 1960s, and are well known for their spectroscopic pattern of masses. The vector mesons contrast with the pseudovector mesons, which also have total spin 1 but instead have even parity. The vector and pseudovector mesons are also dissimilar in that the spectroscopy of vector mesons tends to show nearly pure states of constituent quark flavors, whereas pseudovector mesons and scalar mesons tend to be expressed as composites of mixed states.

Uniquely pure flavor states
Since the development of the quark model by Murray Gell-Mann (and also independently by George Zweig), the vector mesons have demonstrated the spectroscopy of pure states. The fact that the rho meson (ρ) and omega meson (ω) have nearly equal masses centered on 770–780 MeV, while the phi meson (φ) has a higher mass around 1020 MeV, indicates that the light-quark vector mesons appear in nearly pure states, with the φ meson having a nearly 100 percent amplitude of hidden strangeness. These nearly pure states, characteristic of the vector mesons, are not at all evident in the pseudoscalar meson or scalar meson multiplets, and may be only slightly realized among the tensor meson and pseudovector meson multiplets. This fact makes the vector mesons an excellent probe of the quark flavor content of other types of mesons, measured through the respective decay rates of non-vector mesons into the different types of vector mesons. Such experiments are very revealing for theorists who seek to determine the flavor content of mixed-state mesons.

Backbone of meson spectroscopy
At higher masses, the vector mesons include charm and bottom quarks in their structure. In this realm, the radiative processes tend to stand out, with heavy tensor and scalar mesons decaying dominantly into vector mesons by photon emission. Pseudovector mesons transition by a similar process into pseudo
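The near-purity of the light vector states described above corresponds to so-called "ideal" mixing, which can be sketched as:

```latex
\omega \;\simeq\; \tfrac{1}{\sqrt{2}}\,\bigl(u\bar{u} + d\bar{d}\bigr), \qquad
\phi \;\simeq\; s\bar{s}, \qquad
\theta_V^{\text{ideal}} = \arctan\tfrac{1}{\sqrt{2}} \approx 35.3^\circ .
```

The ρ⁰ is the isovector partner, (uū − dd̄)/√2, while the nearly pure ss̄ content of the φ is what the text calls "hidden strangeness".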
https://en.wikipedia.org/wiki/Pseudovector%20meson
In high-energy physics, a pseudovector meson or axial vector meson is a meson with total spin 1 and even parity (usually noted as J^P = 1⁺). Compare the vector meson, which has total spin 1 and odd parity (J^P = 1⁻).

Charge parity (C) in addition to spatial parity (P)
The known pseudovector mesons fall into two different classes. All have even spatial parity (P = "+"), but they differ in another kind of parity called charge parity (C), which can be either even (+) or odd (−). The two types of pseudovector meson are:
those with odd charge parity (J^PC = 1⁺⁻)
those with even charge parity (J^PC = 1⁺⁺)
The 1⁺⁻ group has no intrinsic spin excitation (the quark spins are anti-aligned), but gains its spin from the orbital angular momentum of the two constituent quarks around their mutual center. The 1⁺⁺ group has both intrinsic spin and orbital angular momentum, with the two coupling to a total spin of 1. Pseudovector, or axial vector, mesons of one category are most readily seen in proton-antiproton annihilation and pion-nucleon scattering, while those of the other category are normally seen in proton-proton and pion-nucleon scattering.

Discrepant mass estimates
The difference between the two groups gives them slightly different masses, following the spin-orbit coupling rule. Theoretically, the mesons of one group should have heavier masses according to the spin-orbit mass splitting; however, the measured masses do not appear to follow the rule. There are considerable uncertainties in the experimental measurement of pseudovector mesons; more experimental data will be needed to confirm and accurately determine the discrepancy between theory and measurement. One of the two multiplets of light mesons may show behavior similar to that of the vector mesons, in that the mixing of light quarks with strange quarks appears to be small for its quantum numbers. The other multiplet may instead be affected by other factors that generally reduce meson masses. Again, further experimentation is re
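In the quark model, the two charge-parity classes above follow directly from the standard quantum-number rules for a quark-antiquark pair, sketched here in terms of the orbital angular momentum L and the total quark spin S:

```latex
P = (-1)^{L+1}, \qquad C = (-1)^{L+S}:
\quad
L=1,\ S=0 \;\Rightarrow\; J^{PC}=1^{+-}, \qquad
L=1,\ S=1 \;\Rightarrow\; J^{PC}=1^{++}.
```

Both assignments give even parity because L = 1; only the relative alignment of the quark spins, and hence C, distinguishes the two multiplets.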
https://en.wikipedia.org/wiki/Interactive%20programming
Interactive programming is the practice of writing parts of a program while it is already running. It treats the program text as the main interface to a running process, in contrast to a conventional interactive application, where the program is designed in development cycles and used thereafter (usually by a so-called "user", as distinct from the "developer"). Here, the activity of writing a program becomes part of the program itself. It thus forms a specific instance of interactive computation, at the extreme opposite of batch processing, where neither writing the program nor its use happens in an interactive way. The principle of rapid feedback in extreme programming is radicalized and made more explicit.

Synonyms: on-the-fly programming, just-in-time programming, conversational programming

Application fields
Interactive programming techniques are especially useful in cases where no clear specification of the problem to be solved can be given in advance. In such situations (which are not unusual in research), the formal language provides the environment in which an appropriate question or problem formulation can be developed. Interactive programming has also been used in applications that need to be rewritten without stopping them, a feature for which the computer language Smalltalk is famous. Generally, dynamic programming languages provide the environment for such interaction, so that prototyping and iterative and incremental development typically take place while other parts of the program are running. Because this capability is an apparent need in sound design and algorithmic composition, it has evolved significantly there. More recently, researchers have been using this method to develop sonification algorithms. Using dynamic programming languages for sound and graphics, interactive programming is also used in the improvisational performance style known as live coding, mainly in algorithmic music and video.

Example code
Live coding of 3D graphi
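The idea of modifying a program while it keeps running can be sketched in plain Python. The sketch below (function names are illustrative) relies on one design choice common to live-coding environments and Smalltalk images: the running loop looks its behavior up through a mutable binding rather than holding a frozen reference, so a redefinition typed at a REPL takes effect on the very next iteration, with no restart.

```python
# Minimal sketch of interactive ("on-the-fly") programming.

def greet():
    return "hello"

# Indirection: the running process calls through this mutable table,
# so rebinding an entry changes behavior immediately.
handlers = {"greet": greet}

def run_once():
    # Stand-in for one iteration of the live process's main loop.
    return handlers["greet"]()

print(run_once())  # prints: hello

# "Live edit": the developer redefines the handler while the process
# is still active -- no restart, no edit-compile-run cycle.
def greet():
    return "bonjour"

handlers["greet"] = greet

print(run_once())  # prints: bonjour
```

In a real live-coding system the rebinding is done by the language runtime when new code is evaluated; the dictionary here merely makes that indirection explicit.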
https://en.wikipedia.org/wiki/Cognitive%20architecture
A cognitive architecture refers both to a theory about the structure of the human mind and to a computational instantiation of such a theory used in the fields of artificial intelligence (AI) and computational cognitive science. The formalized models can be used to further refine a comprehensive theory of cognition and as a useful artificial intelligence program. Successful cognitive architectures include ACT-R (Adaptive Control of Thought - Rational) and SOAR. Research on cognitive architectures as software instantiations of cognitive theories was initiated by Allen Newell in 1990. The Institute for Creative Technologies defines cognitive architecture as: "hypothesis about the fixed structures that provide a mind, whether in natural or artificial systems, and how they work together – in conjunction with knowledge and skills embodied within the architecture – to yield intelligent behavior in a diversity of complex environments."

History
Herbert A. Simon, one of the founders of the field of artificial intelligence, stated that the 1960 thesis by his student Ed Feigenbaum, EPAM, provided a possible "architecture for cognition" because it included some commitments for how more than one fundamental aspect of the human mind worked (in EPAM's case, human memory and human learning). John R. Anderson started research on human memory in the early 1970s, and his 1973 thesis with Gordon H. Bower provided a theory of human associative memory. He incorporated more aspects of his research on long-term memory and thinking processes into this work and eventually designed a cognitive architecture he called ACT. He and his students were influenced by Allen Newell's use of the term "cognitive architecture"; Anderson's lab used the term to refer to the ACT theory as embodied in a collection of papers and designs (there was not a complete implementation of ACT at the time). In 1983 John R. Anderson published the seminal work in this area, entitled The Architecture o
https://en.wikipedia.org/wiki/Nitrosomonas%20europaea
Nitrosomonas europaea is a Gram-negative obligate chemolithoautotroph that derives all of its energy and reductant for growth from the oxidation of ammonia to nitrite. It lives in several habitats, such as soil, sewage, freshwater, the walls of buildings, and the surfaces of monuments, especially in polluted areas where the air contains high levels of nitrogen compounds. Because of the large amounts of ammonia this bacterium must consume to obtain the energy to divide, cell division can take up to several days; this is perhaps one reason the microbe is not studied very much. The microbe has been shown to be an ammonia-oxidizing soil bacterium, and it is known to have a range of substrates that might be useful in bioremediation. Several studies are still being done with the bacterium, but they will take some time due to the slow cell-division rate and the high amounts of nitrogen the organism needs to live. While not using photosynthesis for energy is not unique, "burning" ammonia with oxygen is; both are characteristics of Nitrosomonas europaea. The microbe tolerates a pH of 6.0–9.0, the optimal conditions being slightly basic; has an aerobic metabolism; and prefers temperatures between 20 and 30 degrees Celsius. Most cells are motile, with flagella located in the polar regions, although some species are nonmotile. The reaction catalysed by these bacteria is the first step in the oxidation of ammonia to nitrate. Nitrosomonas europaea is also important in the treatment of industrial and sewage waste, performing this first step in the oxidation of ammonia to nitrate. Evidence suggests that ammonia-oxidizing bacteria (AOB) contribute significantly to the global production of nitrous oxide (produced by the reduction of nitrite). Other evidence reveals that AOB are a possible source of nitric oxide via the oxidation of ammonia. Nitrosomonas europaea is also capable of degrading benzene as well as a variety of halogenated organic compounds, including trichloroethylene and vinyl chloride. The ability of nitrifying orga
https://en.wikipedia.org/wiki/Trigger%20%28horse%29
Trigger (July 4, 1934 – July 3, 1965) was a palomino horse made famous in American Western films with his owner and rider, cowboy star Roy Rogers.

Pedigree
The original Trigger, named Golden Cloud, was born in San Diego, California. Though often mistaken for a Tennessee Walking Horse, his sire was a Thoroughbred and his dam a grade (unregistered) mare that, like Trigger, was a palomino. Movie director William Witney, who directed Roy and Trigger in many of their movies, claimed a slightly different lineage: that his sire was a "registered" palomino stallion (though no known palomino registry existed at the time of Trigger's birth) and his dam was by a Thoroughbred and out of a "cold-blood" mare. Horses other than Golden Cloud also portrayed "Trigger" over the years, none of them related to Golden Cloud; the two most prominent were palominos known as "Little Trigger" and "Trigger Jr." (a Tennessee Walking Horse listed as "Allen's Gold Zephyr" in the Tennessee Walking Horse registry). Though Trigger remained a stallion his entire life, he was never bred and has no descendants. Rogers did, however, use "Trigger Jr."/"Allen's Gold Zephyr" at stud for many years, and the horse named "Triggerson" that actor Val Kilmer led on stage as a tribute to Rogers and his cowboy peers during the Academy Awards show in March 1999 was reportedly a grandson of Trigger Jr.

Film career
Golden Cloud made an early appearance as the mount of Maid Marian, played by Olivia de Havilland in The Adventures of Robin Hood (1938). A short while later, when Roy Rogers was preparing to make his first movie in a starring role, he was offered a choice of five rented "movie" horses to ride and chose Golden Cloud. Rogers eventually bought him in 1943 and renamed him Trigger for his quickness of both foot and mind. Trigger learned 150 trick cues and could walk 50 ft (15 m) on his hind legs (according to sources close to Rogers), who were said to have run out of places to cue him. Trigger bec
https://en.wikipedia.org/wiki/School%20Mathematics%20Study%20Group
The School Mathematics Study Group (SMSG) was an American academic think tank focused on reform in mathematics education. Directed by Edward G. Begle and financed by the National Science Foundation, the group was created in the wake of the Sputnik crisis in 1958 and tasked with creating and implementing mathematics curricula for primary and secondary education, which it did until its termination in 1977. The efforts of the SMSG yielded a reform in mathematics education known as New Math, which was promulgated in a series of reports, culminating in a series published by Random House called the New Mathematical Library (Vol. 1 is Ivan Niven's Numbers: Rational and Irrational). In its early years, SMSG also produced a set of draft textbooks in typewritten paperback format for elementary, middle, and high school students. Perhaps the most authoritative collection of materials from the School Mathematics Study Group is now housed in the Archives of American Mathematics at the University of Texas at Austin's Center for American History.

See also
Foundations of geometry

Further reading
1958 Letter from Ralph A. Raimi to Fred Quigley concerning the New Math
Whatever Happened to the New Math by Ralph A. Raimi
Some Technical Commentaries on Mathematics Education and History by Ralph A. Raimi

External links
The SMSG Collection at The Center for American History at UT
Archives of American Mathematics at the Center for American History at UT

Mathematics education
Curricula
https://en.wikipedia.org/wiki/Sewall%20Wright
Sewall Green Wright FRS(For) Honorary FRSE (December 21, 1889 – March 3, 1988) was an American geneticist known for his influential work on evolutionary theory and for his work on path analysis. He was a founder of population genetics alongside Ronald Fisher and J. B. S. Haldane, which was a major step in the development of the modern synthesis combining genetics with evolution. He discovered the inbreeding coefficient and methods of computing it in pedigree animals. He extended this work to populations, computing the amount of inbreeding between members of populations as a result of random genetic drift, and along with Fisher he pioneered methods for computing the distribution of gene frequencies among populations as a result of the interaction of natural selection, mutation, migration, and genetic drift. Wright also made major contributions to mammalian and biochemical genetics.

Biography
Sewall Wright was born in Melrose, Massachusetts, to Philip Green Wright and Elizabeth Quincy Sewall Wright. His parents were first cousins, an interesting fact in light of Wright's later research on inbreeding. The family moved three years later after Philip accepted a teaching job at Lombard College, a Universalist college in Galesburg, Illinois. As a child, Wright helped his father and brother print and publish an early book of poems by his father's student Carl Sandburg. At the age of seven, in 1897, he wrote his first "book", entitled Wonders of Nature, and he published his last paper in 1988: he can therefore be claimed to be the scientist with the longest career of science writing. Wright's astonishing maturity at the age of seven may be judged from the following excerpt quoted in the obituary: Have you ever examined the gizzard of a fowl? The gizzard of a fowl is a deep red colar with blu at the top. First on the outside is a very thick muscle. Under this is a white and fleecy layer. Holding very tight to the other. I expect you know that chickens eat sand. The nex
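Wright's inbreeding coefficient of an individual equals the kinship (coancestry) of its parents, which can be computed recursively from a pedigree. The sketch below is a minimal illustration, not Wright's original path-counting procedure; the pedigree, names, and founder assumptions (unknown parents treated as unrelated and non-inbred) are all hypothetical.

```python
# Sketch: inbreeding coefficient via recursive kinship on a toy pedigree.

pedigree = {
    # name: (sire, dam); None marks an unknown (founder) parent
    "A": (None, None),
    "B": (None, None),
    "C": ("A", "B"),   # C and D are full siblings...
    "D": ("A", "B"),
    "X": ("C", "D"),   # ...mated to produce X
}

# Birth order lets the recursion always descend through the younger animal.
order = {name: i for i, name in enumerate(pedigree)}

def kinship(a, b):
    """Coancestry f(a, b); founders are assumed unrelated and non-inbred."""
    if a is None or b is None:
        return 0.0
    if a == b:
        sire, dam = pedigree[a]
        return 0.5 * (1.0 + kinship(sire, dam))
    if order[a] > order[b]:        # ensure b is the younger individual
        a, b = b, a
    sire, dam = pedigree[b]
    return 0.5 * (kinship(a, sire) + kinship(a, dam))

def inbreeding(x):
    """Wright's F: the kinship of the individual's two parents."""
    sire, dam = pedigree[x]
    return kinship(sire, dam)

print(inbreeding("X"))   # 0.25: the classic value for full-sib mating
```

The recursion reproduces Wright's classical results, e.g. F = 1/4 for the offspring of full siblings and F = 1/16 for the offspring of first cousins.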
https://en.wikipedia.org/wiki/Monoboard
A monoboard is a device or product that consists of a single printed circuit board (PCB).

Benefits
The primary benefit of a monoboard solution is cost savings, which incorporating all parts on a single board can achieve in a number of ways. The main one is that solutions with multiple boards require connections between the boards via edge connectors, and sometimes a ribbon cable; by connecting devices directly together on the same PCB, there is no need for these additional connectors and cables. Additionally, PCB space may be optimized using electronic design automation (EDA) tools, resulting in a smaller device as well as further cost savings. Aesthetically enhanced micro branding is used as well.

Disadvantages
The main disadvantage of a monoboard solution is that it is inflexible to upgrades. Take a personal computer, for example: in most personal computers, the video card is an external device that plugs into a dedicated slot on the motherboard and may be swapped out for a higher-performing card. In small form-factor designs, however, a graphics processing unit (GPU) may be placed directly on the board. The typical reason is to reduce size and cost, but this removes the ability to later upgrade the device. Another common example is data acquisition hardware. Many DAQ solutions involve the use of DAQ modules, which are cards that plug into a backplane; different DAQ modules can be purchased with different functionalities depending on the speed and resolution of the signals being acquired. Examples of this can be seen in products ranging from Fluke DAQs to Tektronix Logic Analyzer modules (such as the TLA7000 series frames, which support more than four different series of acquisition cards).

Examples
Personal computers, specifically small form-factor versions such as the UMPC or MacBook Air
Data acquisition hardware
Cameras
Mobile phones

Single-board computer
Motherboard
Printed circuit board manufacturing
https://en.wikipedia.org/wiki/Provitamin
A provitamin is a substance that may be converted within the body into a vitamin. The term previtamin is a synonym. The term "provitamin" is used when it is desirable to label a substance that has little or no vitamin activity itself but can be converted to an active form by normal metabolic processes.

Examples
Some provitamins are:
"Provitamin A" is a name for β-carotene, which has only about 1/6 the biological activity of retinol (vitamin A); the body uses an enzyme to convert β-carotene to retinol. In other contexts, both β-carotene and retinol are simply considered to be different forms (vitamers) of vitamin A.
"Provitamin B5" is a name for panthenol, which may be converted in the body to vitamin B5 (pantothenic acid).
Menadione is a synthetic provitamin of vitamin K.
Provitamin D2 is ergosterol, and provitamin D3 is 7-dehydrocholesterol. They are converted into vitamin D by UV light. The human body produces provitamin D3 naturally; deficiency is usually caused by a lack of sun exposure, not a lack of the provitamin.
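The "about 1/6 the biological activity" figure for β-carotene can be turned into a tiny worked calculation. The factor below is taken directly from the text above; real dietary conversions (e.g. retinol activity equivalents) use different, context-dependent factors, so treat this purely as an illustration of the claim.

```python
# Illustrative only: relative vitamin A activity of beta-carotene,
# using the ~1/6 activity factor quoted in the article.
BETA_CAROTENE_ACTIVITY = 1 / 6   # activity relative to retinol

def retinol_equivalent(beta_carotene_amount):
    """Approximate vitamin A activity of a given amount of beta-carotene."""
    return beta_carotene_amount * BETA_CAROTENE_ACTIVITY

print(retinol_equivalent(6.0))   # 1.0 -- six units of beta-carotene ~ one unit of retinol
```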