https://en.wikipedia.org/wiki/Complex%20line
In mathematics, a complex line is a one-dimensional affine subspace of a vector space over the complex numbers. A common point of confusion is that while a complex line has dimension one over C (hence the term "line"), it has dimension two over the real numbers R, and is topologically equivalent to a real plane, not a real line. The "complex plane" commonly refers to the graphical representation of the complex line on the real plane, and is thus generally synonymous with the complex line, and not a two-dimensional space over the complex numbers. See also Algebraic geometry Complex vector Riemann sphere
https://en.wikipedia.org/wiki/Response%20time%20%28technology%29
In technology, response time is the time a system or functional unit takes to react to a given input. Computing Response time is the total amount of time it takes to respond to a request for service. That service can be anything from a memory fetch, to a disk IO, to a complex database query, or loading a full web page. Ignoring transmission time for a moment, the response time is the sum of the service time and wait time. The service time is the time it takes to do the work you requested. For a given request the service time varies little as the workload increases – to do X amount of work it always takes X amount of time. The wait time is how long the request had to wait in a queue before being serviced and it varies from zero, when no waiting is required, to a large multiple of the service time, as many requests are already in the queue and have to be serviced first. With basic queueing theory math you can calculate how the average wait time increases as the device providing the service goes from 0-100% busy. As the device becomes busier, the average wait time increases in a non-linear fashion. The busier the device is, the more dramatic the response time increases will seem as you approach 100% busy; all of that increase is caused by increases in wait time, which is the result of all the requests waiting in queue that have to run first. Transmission time gets added to response time when your request and the resulting response has to travel over a network and it can be very significant. Transmission time can include propagation delays due to distance (the speed of light is finite), delays due to transmission errors, and data communication bandwidth limits (especially at the last mile) slowing the transmission speed of the request or the reply. Real-time systems In real-time systems the response time of a task or thread is defined as the time elapsed between the dispatch (time when task is ready to execute) to the time when it finishes its job (one dispatch).
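The non-linear growth of wait time with utilization described above can be made concrete with the textbook M/M/1 queueing model (an assumption on my part; the article says only "basic queueing theory math" without fixing a model). In M/M/1, average response time is R = S / (1 − ρ), where S is the service time and ρ the utilization:

```python
# Sketch: how average response time grows with utilization under the
# M/M/1 queueing model (an assumed model -- the text does not name one).
def response_time(service_time: float, utilization: float) -> float:
    """Average response time = service time + wait time.

    For M/M/1: R = S / (1 - rho), i.e. wait W = S * rho / (1 - rho).
    """
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return service_time / (1.0 - utilization)

S = 10.0  # ms of pure service time
for rho in (0.0, 0.5, 0.9, 0.99):
    r = response_time(S, rho)
    print(f"utilization {rho:4.0%}: response {r:8.1f} ms (wait {r - S:8.1f} ms)")
```

At 50% busy the wait already equals the service time; at 99% busy the response time is a hundred times the service time, all of the increase coming from queueing, which matches the text's description.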
https://en.wikipedia.org/wiki/%CE%92-Thromboglobulin
β-Thromboglobulin (β-TG), or beta-thromboglobulin, is a chemokine protein secreted by platelets. It is a type of chemokine (C-X-C motif) ligand 7. Along with platelet factor 4 (PF4), β-TG is one of the best-characterized platelet-specific proteins. β-TG and PF4 are stored in platelet alpha granules and are released during platelet activation. As a result, they are useful markers of platelet activation. β-TG also has multiple biological activities, for instance being involved in the maturation of megakaryocytes. Biological actions β-TG is a chemoattractant, strongly for fibroblasts and weakly for neutrophils. It is a stimulator of mitogenesis, extracellular matrix synthesis, glucose metabolism, and plasminogen activator synthesis in human fibroblasts. β-TG also affects megakaryocyte maturation, and thus helps in regulating platelet production. Clinical uses Levels of β-TG are used to index platelet activation. β-TG is measured by ELISA in blood plasma or urine, often in conjunction with PF4. Influences β-TG levels may increase with age. They are elevated in diabetes mellitus. β-TG levels have been found to be increased by treatment with the synthetic estrogen ethinylestradiol, though they were not significantly increased by the natural estrogen estradiol valerate. Levels of β-TG have also been found to be increased or unchanged during normal pregnancy.
https://en.wikipedia.org/wiki/Incomplete%20Bessel%20functions
In mathematics, the incomplete Bessel functions are special functions that extend the complete Bessel functions. Definition The incomplete Bessel functions are defined by the same delay differential equations as the complete-type Bessel functions: And by the following suitable extensions of those delay differential equations: Where the new parameter defines the integral bound of the upper-incomplete form and lower-incomplete form of the modified Bessel function of the second kind: Properties for integer for non-integer for non-integer for non-integer Differential equations satisfies the inhomogeneous Bessel's differential equation Both , , and satisfy the partial differential equation Both and satisfy the partial differential equation Integral representations Based on the preliminary definitions above, one can directly derive the following integral forms of , : With the Mehler–Sonine integral expressions of and mentioned in the Digital Library of Mathematical Functions, we can further simplify to and , but this is of limited use, since the range of convergence is greatly reduced.
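The defining integrals did not survive extraction. As a reference point only (conventions for incomplete Bessel functions vary, and the article's exact normalization is unknown), the complete modified Bessel function of the second kind has a standard integral representation, and splitting its integration range at a bound w produces lower- and upper-incomplete forms:

```latex
% One common convention (assumed, not necessarily the article's):
K_\nu(z) = \int_{0}^{\infty} e^{-z\cosh t}\cosh(\nu t)\,dt,
% splitting the range at the new parameter w gives
\underbrace{\int_{0}^{w} e^{-z\cosh t}\cosh(\nu t)\,dt}_{\text{lower-incomplete}}
\;+\;
\underbrace{\int_{w}^{\infty} e^{-z\cosh t}\cosh(\nu t)\,dt}_{\text{upper-incomplete}}
\;=\; K_\nu(z) \quad \text{for every } w > 0.
```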
https://en.wikipedia.org/wiki/Fedora%20Media%20Writer
Fedora Media Writer is an open source tool designed to create live media for Fedora Linux. Features Cross-platform (available for Linux, macOS, and Windows) Destructive installer - "overwrites the drive's partition layout though so it also provides a way to restore a single-partition layout with a FAT32 partition" Supports various Fedora Linux releases Automatically detects all removable devices Persistent storage creation, to save all documents created and modifications made to the system SHA-1 checksum verification of known releases, to ensure there is no corruption when downloading Not limited to Fedora Linux releases, supports custom images See also Fedora Linux Fedora Project List of tools to create live media systems
https://en.wikipedia.org/wiki/Pyrolobus
Pyrolobus is a genus of archaea in the family Pyrodictiaceae.
https://en.wikipedia.org/wiki/Jacobi%20transform
In mathematics, the Jacobi transform is an integral transform named after the mathematician Carl Gustav Jacob Jacobi, which uses Jacobi polynomials as the kernels of the transform. The Jacobi transform of a function is The inverse Jacobi transform is given by Some Jacobi transform pairs
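The displayed formulas were lost in extraction. For reference, under the usual convention with Jacobi polynomials P_n^{(α,β)} and weight (1−x)^α(1+x)^β on [−1, 1] (which may differ from the article's normalization), the transform pair reads:

```latex
% Standard convention (normalization assumed):
J\{f\}(n) = \int_{-1}^{1} (1-x)^{\alpha}(1+x)^{\beta}\,
            P_n^{(\alpha,\beta)}(x)\, f(x)\, dx, \qquad n = 0, 1, 2, \ldots
% with inverse given by the orthogonal expansion
f(x) = \sum_{n=0}^{\infty} \frac{J\{f\}(n)}{\delta_n}\, P_n^{(\alpha,\beta)}(x),
\qquad
\delta_n = \frac{2^{\alpha+\beta+1}\,\Gamma(n+\alpha+1)\,\Gamma(n+\beta+1)}
                {(2n+\alpha+\beta+1)\, n!\,\Gamma(n+\alpha+\beta+1)},
```

where δ_n is the squared norm of P_n^{(α,β)} with respect to the Jacobi weight.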
https://en.wikipedia.org/wiki/Distributed%20manufacturing
Distributed manufacturing, also known as distributed production, cloud producing, distributed digital manufacturing, and local manufacturing, is a form of decentralized manufacturing practiced by enterprises using a network of geographically dispersed manufacturing facilities that are coordinated using information technology. It can also refer to local manufacture via the historic cottage industry model, or manufacturing that takes place in the homes of consumers. Enterprise In enterprise environments, the primary attribute of distributed manufacturing is the ability to create value at geographically dispersed locations. For example, shipping costs could be minimized when products are built geographically close to their intended markets. Also, products manufactured in a number of small facilities distributed over a wide area can be customized with details adapted to individual or regional tastes. Manufacturing components in different physical locations and then managing the supply chain to bring them together for final assembly of a product is also considered a form of distributed manufacturing. Digital networks combined with additive manufacturing allow companies a decentralized and geographically independent distributed production (cloud manufacturing). Consumer Within the maker movement and DIY culture, small-scale production by consumers, often using peer-to-peer resources, is referred to as distributed manufacturing. Consumers download digital designs from an open design repository website like Youmagine or Thingiverse and produce a product at low cost through a distributed network of 3D printing services such as 3D Hubs or Geomiq. In the most distributed form of distributed manufacturing, the consumer becomes a prosumer and manufactures products at home with an open-source 3-D printer such as the RepRap. In 2013 a desktop 3-D printer could be economically justified as a personal product fabricator, and the number of free and open hardware designs was growing
https://en.wikipedia.org/wiki/List%20of%20network%20protocol%20stacks
This is a list of protocol stack architectures. A protocol stack is a suite of complementary communications protocols in a computer network or a computer bus system. See also Lists of network protocols IEEE 802
https://en.wikipedia.org/wiki/International%20Conference%20on%20Distributed%20Computing%20Systems
The International Conference on Distributed Computing Systems (ICDCS) is the oldest conference in the field of distributed computing systems in the world. It was launched by the IEEE Computer Society Technical Committee on Distributed Processing (TCDP) in October 1979 and is still sponsored by that committee. It was held every 18 months until 1983 and has been an annual conference since 1984. The ICDCS has a long history of significant achievements and worldwide visibility, and has recently celebrated its 37th year. Location history 2019: Dallas, Texas, United States 2018: Vienna, Austria 2017: Atlanta, GA, United States 2016: Nara, Japan 2015: Columbus, Ohio, United States 2014: Madrid, Spain 2013: Philadelphia, Pennsylvania, United States 2012: Macau, China 2011: Minneapolis, Minnesota, United States 2010: Genoa, Italy 2009: Montreal, Quebec, Canada 2008: Beijing, China 2007: Toronto, Ontario, Canada 2006: Lisbon, Portugal 2005: Columbus, Ohio, United States 2004: Keio University, Japan 2003: Providence, RI, United States 2002: Vienna, Austria 2001: Phoenix, AZ, United States 2000: Taipei, Taiwan 1999: Austin, TX, United States 1998: Amsterdam, The Netherlands 1997: Baltimore, MD, United States 1996: Hong Kong 1995: Vancouver, Canada 1994: Poznań, Poland 1993: Pittsburgh, PA, United States 1992: Yokohama, Japan 1991: Arlington, TX, United States 1990: Paris, France 1989: Newport Beach, CA, United States 1988: San Jose, CA, United States 1987: Berlin, Germany 1986: Cambridge, MA, United States 1985: Denver, CO, United States 1984: San Francisco, CA, United States 1983: Hollywood, FL, United States 1981: Versailles, France 1979: Huntsville, AL, United States See also List of distributed computing conferences External links ICDCS 2018 - July 2–July 5, 2018, Vienna, Austria ICDCS 2007 - June 25–June 29, 2007, Toronto, Canada. ICDCS 2006 - July 4–July 7, 2006, Lisbon, Portugal. ICDCS 2005 - July 6–July 10, 2005, Co
https://en.wikipedia.org/wiki/Mission%20specialist
Mission specialist (MS) was a specific position held by certain NASA astronauts who were tasked with conducting a range of scientific, medical, or engineering experiments during a spaceflight mission. These specialists were usually assigned to a specific field of expertise that was related to the goals of the particular mission they were assigned to. Mission specialists were highly trained individuals who underwent extensive training in preparation for their missions. They were required to have a broad range of skills, including knowledge of science and engineering, as well as experience in operating complex equipment in a zero-gravity environment. During a mission, mission specialists were responsible for conducting experiments, operating equipment, and performing spacewalks to repair or maintain equipment outside the spacecraft. They also played a critical role in ensuring the safety of the crew by monitoring the spacecraft's systems and responding to emergencies as needed. The role of mission specialist was an important one in the Space Shuttle program, as they were instrumental in the success of the program's many scientific and engineering missions. Many of the advances in science and technology that were made during this period were made possible by the hard work and dedication of the mission specialists who worked tirelessly to push the boundaries of what was possible in space.
https://en.wikipedia.org/wiki/Constantin%20Philipsen
Lauritz Carl Constantin Philipsen (1 December 1859, Copenhagen – 23 August 1925, Copenhagen) is credited as one of the founders of the Cinema of Denmark. Biography Philipsen, a photographer, toured the Scandinavian nations from 1898 with his magic lantern. He eventually sold his photography business to enter the emerging world of cinema on a full-time basis. Philipsen opened Denmark's first viable cinema, the 158-seat Kosmorama, in Copenhagen in 1904. He opened 26 more Kosmorama cinemas in Denmark between 1905 and 1906. Though the majority of cinemas seated at most 300–400 people, Philipsen opened the large Palace Cinema, seating 2,500 and using a 30-piece orchestra, on the former site of Copenhagen's Grand Central Railway station. In addition to owning cinemas, Philipsen began producing his own films from 1909. Legacy His son Preben Philipsen (1910–2005) named his Constantin Film company after his father. Philipsen is buried at the Garrison Cemetery, Copenhagen.
https://en.wikipedia.org/wiki/Tuber%20of%20vermis
The tuber of vermis, the most posterior division of the inferior vermis, is of small size, and laterally spreads out into the large inferior semilunar lobules, which comprise at least two-thirds of the inferior surface of the hemisphere.
https://en.wikipedia.org/wiki/China%20Cables
The China Cables are a collection of secret Chinese government documents from 2017 which were leaked by exiled Uyghurs to the International Consortium of Investigative Journalists, and published on 24 November 2019. The documents include a telegram which details the first known operations manual for running the Xinjiang internment camps, and bulletins which illustrate how China's centralized data collection system and mass surveillance tool, known as the Integrated Joint Operations Platform, uses artificial intelligence to identify people for interrogation and potential detention. The Chinese government has called the cables "pure fabrication" and "fake news", further stating that the West was "slandering and smearing" China. The documents' release sparked renewed attention to the Uyghur internment camps and the Uyghur genocide. Description and contents On November 24, 2019, the International Consortium of Investigative Journalists published secret Chinese government documents from 2017, dubbed the "China Cables", which exiled Uyghurs had leaked to them. The documents consisted of a classified telegram called "New Secret 5656" from 2017, four bulletins/security briefings, and one court document. The classified telegram detailed the first known operations manual for running "between 1,300 and 1,400" internment camps of Muslim Uyghurs in Xinjiang. It was signed by Zhu Hailun, head of Xinjiang's Political and Legal Commission and, at the time, deputy secretary of Xinjiang's Party Committee of the Chinese Communist Party. According to the American delegate to the UN committee on the elimination of racial discrimination, China is holding one million Uyghurs in these camps. The four bulletins are secret government intelligence briefings from China's centralized data collection system, the "Integrated Joint Operations Platform" (IJOP), which uses artificial intelligence to identify people for questioning and potential detention. They illustrated a connection between mass surveillance in China
https://en.wikipedia.org/wiki/Rubber%20toughening
Rubber toughening is a process in which rubber nanoparticles are interspersed within a polymer matrix to increase the mechanical robustness, or toughness, of the material. By "toughening" a polymer it is meant that the ability of the polymeric substance to absorb energy and plastically deform without fracture is increased. Considering the significant advantages in mechanical properties that rubber toughening offers, most major thermoplastics are available in rubber-toughened versions; for many engineering applications, material toughness is a deciding factor in final material selection. The effects of disperse rubber nanoparticles are complex and differ across amorphous and partly crystalline polymeric systems. Rubber particles toughen a system by a variety of mechanisms such as when particulates concentrate stress causing cavitation or initiation of dissipating crazes. However the effects are not one-sided; excess rubber content or debonding between the rubber and polymer can reduce toughness. It is difficult to state the specific effects of a given particle size or interfacial adhesion parameter due to numerous other confounding variables. The presence of a given failure mechanism is determined by many factors: those intrinsic to the continuous polymer phase, and those that are extrinsic, pertaining to the stress, loading speed, and ambient conditions. The action of a given mechanism in a toughened polymer can be studied with microscopy. The addition of rubbery domains occurs via processes such as melt blending in a Rheomix mixer and atom-transfer radical-polymerization. Current research focuses on how optimizing the secondary phase composition and dispersion affects mechanical properties of the blend. Questions of interest include those to do with fracture toughness, tensile strength, and glass transition temperature. Toughening mechanisms Different theories describe how a dispersed rubber phase toughens a polymeric substance; most employ methods of dissipa
https://en.wikipedia.org/wiki/Joint%20Dark%20Energy%20Mission
The Joint Dark Energy Mission (JDEM) was a proposed Einstein probe mission focused on investigating dark energy. JDEM was a partnership between NASA and the U.S. Department of Energy (DOE). In August 2010, the Board on Physics and Astronomy of the National Science Foundation (NSF) recommended the Wide Field Infrared Survey Telescope (WFIRST) mission, a renamed JDEM-Omega proposal which superseded SNAP, Destiny, and the Advanced Dark Energy Physics Telescope (ADEPT), as the highest priority for development in the decade around 2020. This would be a 1.5-meter telescope with a 144-megapixel HgCdTe focal plane array, located at the Sun-Earth L2 Lagrange point. The expected cost was around US$1.6 billion. Earlier proposals Dark Energy Space Telescope (Destiny) The Dark Energy Space Telescope (Destiny) was a project planned by NASA and DOE, designed to perform precision measurements of the universe to provide an understanding of dark energy. The space telescope would have derived the expansion of the universe by measuring up to 3,000 distant supernovae each year of its three-year mission lifetime, and would additionally have studied the structure of matter in the universe by measuring millions of galaxies in a weak gravitational lensing survey. The Destiny spacecraft featured an optical telescope with a 1.8-metre primary mirror, imaging infrared light onto an array of solid-state detectors. The mission was designed to be deployed in a halo orbit about the Sun-Earth Lagrange point. The Destiny proposal has been superseded by the Wide Field Infrared Survey Telescope (WFIRST). SuperNova Acceleration Probe (SNAP) The SuperNova Acceleration Probe (SNAP) mission was proposed to provide an understanding of the mechanism driving the acceleration of the universe and determine the nature of dark energy. To achieve these goals, the spacecraft needed to be able to detect these supernovae when they are at their brightest moment. The mission was proposed as an experiment for the JDE
https://en.wikipedia.org/wiki/Marquee%20%28structure%29
A marquee is most commonly a structure placed over the entrance to a hotel, theatre, casino, train station, or similar building. It often has signage stating either the name of the establishment or, in the case of theatres, the play or movie and the artist(s) appearing at that venue. The marquee is sometimes identifiable by a surrounding cache of light bulbs, usually yellow or white, that flash intermittently or as chasing lights. Etymology The current usage of the modern English word marquee, that in US English refers specifically to a canopy projecting over the main entrance of a theater, which displays details of the entertainment or performers, was documented in the academic journal American Speech in 1926: "Marquee, the front door or main entrance of the big top." In British English "marquee" refers more generally to a large tent, usually for social uses. The English word marquee is derived from the Middle French word marquise (the final /z/ probably being mistaken as -s plural), the feminine form corresponding to marquis ('nobleman'). The word marquise was also used to refer to various objects and fashions regarded as elegant or pleasing, hence: a kind of pear (1690), a canopy placed over a tent (1718), a type of settee (1770), a canopy in front of a building (1835), a ring with an elongated stone or setting, a diamond cut as a navette (late 19th century), and a style of woman's hat (1889). The oldest form of the word's root *merg- meant "boundary, border." Other words that descended from this Proto-Indo-European root include margin, margrave, and mark. Early examples of the modern use of marquee include 1931, The American Mercury: "Marquee, the canopy at the main entrance [of a circus]." 1933, Billboard, The marquee of the Rivoli, where Samarang is playing, reads: 'One of the most exciting films ever shown.' 1967, The Boston Globe: "British actors mean little on an American movie marquee and Sherlock Holmes always seems old-fashioned." History Movie m
https://en.wikipedia.org/wiki/Ultrapure%20water
Ultrapure water (UPW), high-purity water or highly purified water (HPW) is water that has been purified to uncommonly stringent specifications. Ultrapure water is a term commonly used in manufacturing to emphasize the fact that the water is treated to the highest levels of purity for all contaminant types, including: organic and inorganic compounds; dissolved and particulate matter; volatile and non-volatile; reactive, and inert; hydrophilic and hydrophobic; and dissolved gases. UPW and the commonly used term deionized (DI) water are not the same. In addition to the fact that UPW has organic particles and dissolved gases removed, a typical UPW system has three stages: a pretreatment stage to produce purified water, a primary stage to further purify the water, and a polishing stage, the most expensive part of the treatment process. A number of organizations and groups develop and publish standards associated with the production of UPW. For microelectronics and power, they include Semiconductor Equipment and Materials International (SEMI) (microelectronics and photovoltaic), American Society for Testing and Materials International (ASTM International) (semiconductor, power), Electric Power Research Institute (EPRI) (power), American Society of Mechanical Engineers (ASME) (power), and International Association for the Properties of Water and Steam (IAPWS) (power). Pharmaceutical plants follow water quality standards as developed by pharmacopeias, of which three examples are the United States Pharmacopeia, European Pharmacopeia, and Japanese Pharmacopeia. The most widely used requirements for UPW quality are documented by ASTM D5127 "Standard Guide for Ultra-Pure Water Used in the Electronics and Semiconductor Industries" and SEMI F63 "Guide for ultrapure water used in semiconductor processing". Ultra pure water is also used as boiler feedwater in the UK AGR fleet. Sources and control Bacteria, particles, organic, and inorganic sources of contamination vary depend
https://en.wikipedia.org/wiki/P%20wave
A P wave (primary wave or pressure wave) is one of the two main types of elastic body waves, called seismic waves in seismology. P waves travel faster than other seismic waves and hence are the first signal from an earthquake to arrive at any affected location or at a seismograph. P waves may be transmitted through gases, liquids, or solids. Nomenclature The name P wave can stand for either pressure wave (as it is formed from alternating compressions and rarefactions) or primary wave (as it has high velocity and is therefore the first wave to be recorded by a seismograph). The name S wave represents another seismic wave propagation mode, standing for secondary or shear wave, a usually more destructive wave than the primary wave. Seismic waves in the Earth Primary and secondary waves are body waves that travel within the Earth. The motion and behavior of both P and S waves in the Earth are monitored to probe the interior structure of the Earth. Discontinuities in velocity as a function of depth are indicative of changes in phase or composition. Differences in arrival times of waves originating in a seismic event like an earthquake as a result of waves taking different paths allow mapping of the Earth's inner structure. P-wave shadow zone Almost all the information available on the structure of the Earth's deep interior is derived from observations of the travel times, reflections, refractions and phase transitions of seismic body waves, or normal modes. P waves travel through the fluid layers of the Earth's interior, and yet they are refracted slightly when they pass through the transition between the semisolid mantle and the liquid outer core. As a result, there is a P wave "shadow zone" between 103° and 142° from the earthquake's focus, where the initial P waves are not registered on seismometers. In contrast, S waves do not travel through liquids. As an earthquake warning Advance earthquake warning is possible by detecting the nondestructive primary waves t
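The warning window described above is the gap between P-wave and S-wave arrivals at a given distance. A back-of-envelope sketch, using typical crustal wave speeds as illustrative assumptions (the article does not give numeric velocities):

```python
# S-P warning time: the P wave arrives first, so the gap between P and S
# arrivals is the usable warning window before the more destructive S wave.
# Wave speeds below are typical crustal values (assumed, not from the text).
def warning_time(distance_km: float, vp: float = 6.0, vs: float = 3.5) -> float:
    """Seconds between P-wave and S-wave arrival at distance_km from the focus."""
    return distance_km / vs - distance_km / vp

for d in (50, 100, 200):
    print(f"{d:3d} km from focus: ~{warning_time(d):.1f} s of warning")
```

The window grows linearly with distance, which is why warning systems favor dense sensor networks close to likely epicenters: nearby sites get the least notice.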
https://en.wikipedia.org/wiki/Carrion%20insects
Carrion insects are insects associated with decomposing remains. The processes of decomposition begin within a few minutes of death. Decomposing remains offer a temporary, changing site of concentrated resources which are exploited by a wide range of organisms, of which arthropods are often the first to arrive and the predominant exploitive group. However, not all arthropods found on or near decomposing remains have an active role in the decay process. Ecological roles Carrion insects are commonly described based on their ecological role. Four commonly described roles are: Necrophagous species Predators and parasites of necrophagous species Omnivorous species Adventive species Necrophagous species Necrophagous species are insects/arthropods that feed directly on remains, or on the fluids released from remains during the decomposition process. This ecological classification includes many species of the order Diptera (true flies) from the families Calliphoridae (blowflies) and Sarcophagidae (flesh flies), and some species of the order Coleoptera (beetles). Although the specific arthropod species present at remains will vary by geographic location, some examples of common blowflies are Calliphora vicina, Phormia regina, Protophormia terraenovae and Lucilia sericata. Necrophagous blowfly species are often the first to arrive and colonize at a site of decomposing remains. These species develop from eggs laid directly on the carcass and complete their life cycle on or near the remains. Because of this, necrophagous species are considered to be the most important for post-mortem interval estimations. The initial colonizers of greatest importance are those of the families Calliphoridae, Sarcophagidae and Muscidae (house flies), as these are typically the first insects to lay eggs at remains. The fresh stage of decomposition is characterized by the arrival of necrophagous blowflies and flesh flies. These blowflies are also strongly attracted during the bloat s
https://en.wikipedia.org/wiki/Cotton%20effect
In physics, the Cotton effect is the characteristic change in optical rotatory dispersion and/or circular dichroism in the vicinity of an absorption band of a substance. In a wavelength region where the light is absorbed, the absolute magnitude of the optical rotation at first varies rapidly with wavelength, crosses zero at absorption maxima, and then again varies rapidly with wavelength but in the opposite direction. This phenomenon was discovered in 1895 by the French physicist Aimé Cotton (1869–1951). The Cotton effect is called positive if the optical rotation first increases as the wavelength decreases (as first observed by Cotton), and negative if the rotation first decreases. A protein structure such as a beta sheet shows a negative Cotton effect. See also Cotton–Mouton effect
https://en.wikipedia.org/wiki/Immerman%E2%80%93Szelepcs%C3%A9nyi%20theorem
In computational complexity theory, the Immerman–Szelepcsényi theorem states that nondeterministic space complexity classes are closed under complementation. It was proven independently by Neil Immerman and Róbert Szelepcsényi in 1987, for which they shared the 1995 Gödel Prize. In its general form the theorem states that NSPACE(s(n)) = co-NSPACE(s(n)) for any function s(n) ≥ log n. The result is equivalently stated as NL = co-NL; although this is the special case when s(n) = log n, it implies the general theorem by a standard padding argument. The result solved the second LBA problem. In other words, if a nondeterministic machine can solve a problem, another machine with the same resource bounds can solve its complement problem (with the yes and no answers reversed) in the same asymptotic amount of space. No similar result is known for the time complexity classes, and indeed it is conjectured that NP is not equal to co-NP. The principle used to prove the theorem has become known as inductive counting. It has also been used to prove other theorems in computational complexity, including the closure of LOGCFL under complementation and the existence of error-free randomized logspace algorithms for USTCON. Proof sketch The theorem can be proven by showing how to translate any nondeterministic Turing machine M into another nondeterministic Turing machine that solves the complementary decision problem under the same (asymptotic) space complexity, plus a constant number of pointers and counters, which needs only a logarithmic amount of space. The idea is to simulate all the configurations of M on input w, and to check if any configuration is accepting. This can be done within the same space plus a constant number of pointers and counters to keep track of the configurations. If no configuration is accepting, the simulating Turing machine accepts the input w. This idea is elaborated below for logarithmic NSPACE class (NL), which generalizes to larger NSPACE classes via a
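The inductive-counting idea can be illustrated deterministically: compute c_i, the number of configurations (here, graph vertices) reachable in at most i steps, from c_{i−1}, and use the final count to certify non-reachability. This is a clarity-first sketch using sets, not the space-efficient nondeterministic procedure of the actual proof, which re-enumerates configurations with only counters and pointers in O(log n) space:

```python
# Deterministic illustration of inductive counting (Immerman-Szelepcsenyi):
# c_i = |{v reachable from s in <= i steps}| is computed inductively, and the
# exact final count is what lets one *certify* "t is unreachable".
def inductive_counts(edges, n, s):
    reach = {s}
    counts = [1]                      # c_0 = |{s}|
    for _ in range(n - 1):            # n-1 rounds suffice for n vertices
        reach = reach | {v for (u, v) in edges if u in reach}
        counts.append(len(reach))
    return counts, reach

def certified_unreachable(edges, n, s, t):
    counts, reach = inductive_counts(edges, n, s)
    # t is unreachable iff it is absent from the full reachable set, whose
    # size c_{n-1} the counting phase has pinned down exactly.
    return t not in reach

edges = [(0, 1), (1, 2), (3, 0)]
print(inductive_counts(edges, 4, 0)[0])       # → [1, 2, 3, 3]
print(certified_unreachable(edges, 4, 0, 3))  # → True: 3 unreachable from 0
```

The counts stabilizing at 3 (not 4) is exactly the evidence that some vertex is unreachable; in the real theorem that evidence is verified nondeterministically without ever storing the set `reach`.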
https://en.wikipedia.org/wiki/N-topological%20space
In mathematics, an N-topological space is a set equipped with N arbitrary topologies. If τ1, τ2, ..., τN are N topologies defined on a nonempty set X, then the N-topological space is denoted by (X, τ1, τ2, ..., τN). For N = 1, the structure is simply a topological space. For N = 2, the structure becomes a bitopological space, introduced by J. C. Kelly. Example Let X = {x1, x2, ..., xn} be any finite set. Suppose Ar = {x1, x2, ..., xr}. Then the collection τ1 = {φ, A1, A2, ..., An = X} is a topology on X. If τ1, τ2, ..., τm are m such topologies (chain topologies) defined on X, then the structure (X, τ1, τ2, ..., τm) is an m-topological space.
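The chain-topology example above can be checked mechanically: on a finite set, arbitrary unions reduce to finite ones, so verifying closure over pairs suffices. A small sketch (set names X, A, tau are mine, mirroring the article's example with n = 4):

```python
# Build the chain topology {empty, A1, A2, ..., An = X} from the example and
# verify the topology axioms directly (finite case: pairwise closure suffices).
from itertools import combinations

X = frozenset({1, 2, 3, 4})
A = [frozenset(range(1, r + 1)) for r in range(1, 5)]  # A_r = {x1, ..., xr}
tau = [frozenset()] + A                                # {empty, A1, ..., A4 = X}

def is_topology(tau, X):
    fam = set(tau)
    if frozenset() not in fam or X not in fam:
        return False
    for U, V in combinations(fam, 2):
        # closed under union and intersection?
        if U | V not in fam or U & V not in fam:
            return False
    return True

print(is_topology(tau, X))  # → True: any chain of sets containing {} and X works
```

The check passes for any chain because the union of two nested sets is the larger one and the intersection is the smaller, both already in the collection.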
https://en.wikipedia.org/wiki/Microsomal%20triglyceride%20transfer%20protein
Microsomal triglyceride transfer protein large subunit is a protein that in humans is encoded by the MTTP gene. MTP encodes the large subunit of the heterodimeric microsomal triglyceride transfer protein. Protein disulfide isomerase (PDI) completes the heterodimeric microsomal triglyceride transfer protein, which has been shown to play a central role in lipoprotein assembly. Mutations in MTP can cause abetalipoproteinemia. Apolipoprotein B48 on chylomicra and Apolipoprotein B100 on LDL, IDL, and VLDL are important for MTP binding. Pharmacology Drugs that inhibit MTTP prevent the assembly of apo B-containing lipoproteins, thus inhibiting the synthesis of chylomicrons and VLDL and leading to a decrease in plasma levels of LDL-C. Lomitapide (Juxtapid) was approved by the US FDA for adjunctive treatment of homozygous familial hypercholesterolemia. Dirlotapide (Slentrol) and mitratapide (Yarvitan) are veterinary drugs for the management of obesity in dogs.
https://en.wikipedia.org/wiki/Jacques%20Hadamard
Jacques Salomon Hadamard (8 December 1865 – 17 October 1963) was a French mathematician who made major contributions in number theory, complex analysis, differential geometry and partial differential equations. Biography The son of a teacher, Amédée Hadamard, of Jewish descent, and Claire Marie Jeanne Picard, Hadamard was born in Versailles, France and attended the Lycée Charlemagne and Lycée Louis-le-Grand, where his father taught. In 1884 Hadamard entered the École Normale Supérieure, having placed first in the entrance examinations both there and at the École Polytechnique. His teachers included Tannery, Hermite, Darboux, Appell, Goursat and Picard. He obtained his doctorate in 1892 and in the same year was awarded the Grand Prix des Sciences Mathématiques for his essay on the Riemann zeta function. In 1892 Hadamard married Louise-Anna Trénel, also of Jewish descent, with whom he had three sons and two daughters. The following year he took up a lectureship at the University of Bordeaux, where he proved his celebrated inequality on determinants, which led to the discovery of Hadamard matrices when equality holds. In 1896 he made two important contributions: he proved the prime number theorem, using complex function theory (also proved independently by Charles Jean de la Vallée-Poussin); and he was awarded the Bordin Prize of the French Academy of Sciences for his work on geodesics in the differential geometry of surfaces and dynamical systems. In the same year he was appointed Professor of Astronomy and Rational Mechanics in Bordeaux. His foundational work on geometry and symbolic dynamics continued in 1898 with the study of geodesics on surfaces of negative curvature. For his cumulative work, he was awarded the Prix Poncelet in 1898. After the Dreyfus affair, which involved him personally because his second cousin Lucie was the wife of Dreyfus, Hadamard became politically active and a staunch supporter of Jewish causes, though he professed to be an atheist. In 1897 he moved bac
https://en.wikipedia.org/wiki/Pesticides%20Safety%20Directorate
The Pesticides Safety Directorate was an agency of the Department for Environment, Food and Rural Affairs (Defra). It was based in York, England, with about 200 scientific, policy and support staff and was responsible for the authorisation of plant protection products and, from 2005, detergents, in the United Kingdom. In April 2008, it joined the Health and Safety Executive (HSE), and in April 2009, became part of a newly formed Chemicals Regulation Directorate (CRD) at the HSE. Aims of the Pesticides Safety Directorate To ensure the safe use of pesticides and detergents for people and the environment. To harmonise pesticide regulation within the European Community and provide a level playing field for crop protection. As part of the strategy for sustainable food and farming, to reduce negative impacts of pesticides on the environment.
https://en.wikipedia.org/wiki/Ax%E2%80%93Kochen%20theorem
The Ax–Kochen theorem, named for James Ax and Simon B. Kochen, states that for each positive integer d there is a finite set Yd of prime numbers, such that if p is any prime not in Yd then every homogeneous polynomial of degree d over the p-adic numbers in at least d² + 1 variables has a nontrivial zero. The proof of the theorem The proof of the theorem makes extensive use of methods from mathematical logic, such as model theory. One first proves Serge Lang's theorem, stating that the analogous theorem is true for the field Fp((t)) of formal Laurent series over a finite field Fp. In other words, every homogeneous polynomial of degree d with more than d² variables has a non-trivial zero (so Fp((t)) is a C2 field). Then one shows that if two Henselian valued fields have equivalent valuation groups and residue fields, and the residue fields have characteristic 0, then they are elementarily equivalent (which means that a first-order sentence is true for one if and only if it is true for the other). Next one applies this to two fields, one given by an ultraproduct over all primes of the fields Fp((t)) and the other given by an ultraproduct over all primes of the p-adic fields Qp. Both residue fields are given by an ultraproduct over the fields Fp, so are isomorphic and have characteristic 0, and both value groups are the same, so the ultraproducts are elementarily equivalent. (Taking ultraproducts is used to force the residue field to have characteristic 0; the residue fields of Fp((t)) and Qp both have non-zero characteristic p.) The elementary equivalence of these ultraproducts implies that for any sentence in the language of valued fields, there is a finite set Y of exceptional primes, such that for any p not in this set the sentence is true for Fp((t)) if and only if it is true for the field of p-adic numbers. Applying this to the sentence stating that every non-constant homogeneous polynomial of degree d in at least d² + 1 variables represents 0, and usin
https://en.wikipedia.org/wiki/Cave%20of%20Swimmers
The Cave of Swimmers is a cave with ancient rock art in the mountainous Gilf Kebir plateau of the Libyan Desert section of the Sahara. It is located in the New Valley Governorate of southwest Egypt, near the border with Libya. History The cave and its rock art were discovered in October 1933 by the Hungarian explorer László Almásy. It contains Neolithic pictographs (rock painting images) and is named for its depictions of people with their limbs bent as if they were swimming. The drawings include those of giraffe and hippopotamus. They are estimated to have been created as early as 10,000 years ago with the beginning of the African Humid Period, when the Sahara was significantly greener and wetter than it is today. The climate change 10,000 years ago was driven by changes in summer solar insolation, together with vegetation and dust feedbacks. Almásy devoted a chapter to the cave in his 1934 book, The Unknown Sahara. In it he postulates that the swimming scenes are real depictions of life at the time of painting, that the artists had realistically drawn their surroundings, and that there had been a climatic change from temperate to xeric desert since that time, making it drier. This theory was so new at the time that his first editor added several footnotes to make it clear that he did not share this opinion. In 2007, Eman Ghoneim discovered an ancient mega-lake (30,750 km²) buried beneath the sand of the Great Sahara in the Northern Darfur region, Sudan. The cave is mentioned in Michael Ondaatje's novel The English Patient. In the film adaptation, a guide describes the location to Almásy, who appears as a character in both the novel and the film; Almásy then sketches the site and adds notes to the book he keeps for himself. The cave shown in the film is not the original but a film set created by a contemporary artist. Present day Physical scientists who have been conducting re
https://en.wikipedia.org/wiki/Kew%20Rule
The Kew Rule was used by some authors to determine the application of synonymous names in botanical nomenclature up to about 1906, but was and still is contrary to codes of botanical nomenclature including the International Code of Nomenclature for algae, fungi, and plants. Index Kewensis, a publication that aimed to list all botanical names for seed plants at the ranks of species and genus, used the Kew Rule until its Supplement IV was published in 1913 (prepared 1906–1910). The Kew Rule applied rules of priority in a more flexible way, so that when transferring a species to a new genus, there was no requirement to retain the epithet of the original species name, and future priority of the new name was counted from the time the species was transferred to the new genus. The effect has been summarized as "nomenclature used by an established monographer or in a major publication should be adopted". This is contrary to the modern article 11.4 of the Code of Nomenclature. History Beginnings The first discussion in print of what was to become known as the Kew Rule appears to have occurred in 1877 between Henry Trimen and Alphonse Pyramus de Candolle. Trimen did not think it was reasonable for older names discovered in the literature to destabilize the nomenclature that had been well accepted:Probably all botanists are agreed that it is very desirable to retain when possible old specific names, but some of the best authors do not certainly consider themselves bound by any generally accepted rule in this matter. Still less will they be inclined to allow that a writer is at liberty, as M. de Candolle thinks, to reject the specific appellations made by an author whose genera are accepted, in favour of older ones in other genera. It will appear to such that to do this is to needlessly create in each case another synonym. The end The first botanical code of nomenclature that declared itself to be binding was the 1906 publication that followed from the 1905 International
https://en.wikipedia.org/wiki/Repeat%20instruction
In computer instruction set architectures (ISA), a repeat instruction is a machine language instruction which repeatedly executes another instruction a fixed number of times, or until some condition is met. Since it is an instruction that operates on other instructions like the execute instruction, it has been classified as a meta-instruction. Computer models The Univac 1103 (1953) includes a repeat instruction (op code mnemonic: ) which executes the following instruction a fixed number of times, possibly incrementing one or both of the address fields of that instruction. This compensates for the architecture's lack of index registers. The GE-600/Honeywell 6000 series (1964) supports a single-instruction repeat (), a double-instruction repeat (), and a linked-list repeat (). The x86 ISA, starting with the 8086, includes a series of special-purpose repeat instructions () which are called "repeat string operation prefixes" and may only be applied to a small number of string instructions (). These instructions repeat an operation and decrement a counter until it is zero, or may also stop when a certain condition is met. The Texas Instruments TMS320 digital signal processor (1983) includes an instruction for repeating a single-cycle instruction or two single-cycle instructions in parallel () and an instruction for repeating a block of instructions (). These use special block-repeat counter registers (). Semantics The instruction or instruction pair to be executed follows the repeat instruction. Fields in the instruction determine the loop termination condition. In the case of the TMS320, a block of up to 64 Kbytes can be repeated.
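The counter-and-condition semantics described above can be sketched in a few lines. This is a toy model, not the exact behavior of any one ISA; the `rep` helper and the byte-copy driver are illustrative assumptions.

```python
def rep(operation, count, stop_when=None):
    """Toy model of an x86-style repeat prefix: execute `operation`
    once per iteration, decrementing the counter until it hits zero,
    with an optional early-exit predicate (as REPE/REPNE stop on a
    flag).  Illustrative only -- not any one real ISA's semantics."""
    while count > 0:
        result = operation()
        count -= 1
        if stop_when is not None and stop_when(result):
            break
    return count             # remaining count, like CX after the loop

# REP MOVSB-style block copy: one byte moved per repetition
src = bytearray(b"hello")
dst = bytearray(len(src))
pos = iter(range(len(src)))

def move_byte():
    i = next(pos)
    dst[i] = src[i]
    return dst[i]

left = rep(move_byte, len(src))
print(bytes(dst), left)      # b'hello' 0
```

The `stop_when` hook models conditional variants: a REPE-style comparison loop would pass a predicate that fires on the first mismatching element, leaving the residual count behind for the program to inspect.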
https://en.wikipedia.org/wiki/Papilio%20appalachiensis
Papilio appalachiensis, the Appalachian tiger swallowtail, is a species of swallowtail butterfly found in eastern North America, particularly in the Appalachian Mountains. It is a hybrid of two other Papilio species, Papilio canadensis and Papilio glaucus, with which it shares many characteristics. The butterflies are normally yellow with black patterns on their wings. Their wingspans range from 86 to 115 mm. The caterpillars range in color from green and yellow to orange and are ornamented with black specks that give them the appearance of a bird dropping, which is useful for camouflage, or a large eye, a form of mimicry that is also efficient for protection. This species is univoltine. Females lay their eggs in May. Taxonomy Papilio appalachiensis is a member of the genus Papilio in the order Lepidoptera. It is closely related to Papilio canadensis and Papilio glaucus. Distribution The butterfly is found in the eastern United States, specifically in the Appalachian Mountains, ranging from Pennsylvania to Georgia. It is notably larger than both the eastern tiger swallowtail and the Canadian tiger swallowtail. Evolution P. appalachiensis is thought to have evolved as a hybrid species of two other Papilio butterflies: P. canadensis and P. glaucus. Originally, researchers believed that P. canadensis and P. glaucus were distributed in distinct regions separated by a hybrid zone stretching east from Minnesota to southern New England and south along the Appalachian Mountains. In 2005, researchers suggested these two species interacted at some point and produced the new hybrid P. appalachiensis in the hybrid zone. Although each species of tiger swallowtail butterflies is usually confined to a specific "thermal landscape", based on specific X-linked markers and various behavioral traits, laboratory and observational studies have shown the hybrid Appalachian tiger swallowtail butterflies have emerged. Data collected about oviposition preferences and larval m
https://en.wikipedia.org/wiki/Green%27s%20function
In mathematics, a Green's function is the impulse response of an inhomogeneous linear differential operator defined on a domain with specified initial conditions or boundary conditions. This means that if L is the linear differential operator, then the Green's function G is the solution of the equation LG = δ, where δ is Dirac's delta function; the solution of the initial-value problem Ly = f is the convolution (G ∗ f). Through the superposition principle, given a linear ordinary differential equation (ODE), Ly(x) = f(x), one can first solve LG(x, s) = δ(x − s) for each s, and, realizing that, since the source is a sum of delta functions, the solution is a sum of Green's functions as well, by linearity of L. Green's functions are named after the British mathematician George Green, who first developed the concept in the 1820s. In the modern study of linear partial differential equations, Green's functions are studied largely from the point of view of fundamental solutions instead. Under many-body theory, the term is also used in physics, specifically in quantum field theory, aerodynamics, aeroacoustics, electrodynamics, seismology and statistical field theory, to refer to various types of correlation functions, even those that do not fit the mathematical definition. In quantum field theory, Green's functions take the roles of propagators. Definition and uses A Green's function, G(x, s), of a linear differential operator L acting on distributions over a subset of the Euclidean space Rⁿ, at a point s, is any solution of LG(x, s) = δ(s − x), where δ is the Dirac delta function. This property of a Green's function can be exploited to solve differential equations of the form Lu(x) = f(x). If the kernel of L is non-trivial, then the Green's function is not unique. However, in practice, some combination of symmetry, boundary conditions and/or other externally imposed criteria will give a unique Green's function. Green's functions may be categorized, by the type of boundary conditions satisfied, by a Green's function number. Also, Green's functions in general are
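The superposition principle can be checked numerically for a standard textbook case. The sketch below assumes the operator L = −d²/dx² on [0, 1] with u(0) = u(1) = 0, whose Green's function is the well-known piecewise-linear kernel; the grid size, source term, and helper names are illustrative choices.

```python
import math

# Green's function for L = -d^2/dx^2 on [0, 1] with u(0) = u(1) = 0:
#   G(x, s) = x(1 - s) if x <= s, else s(1 - x)
def G(x, s):
    return x * (1 - s) if x <= s else s * (1 - x)

n = 1000                                   # grid intervals
h = 1.0 / n
xs = [i * h for i in range(n + 1)]
f = [math.pi**2 * math.sin(math.pi * s) for s in xs]   # chosen source term

def u(xi):
    # superposition: u(x) = integral of G(x, s) f(s) ds  (trapezoid rule)
    ys = [G(xi, s) * fs for s, fs in zip(xs, f)]
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

# exact solution of -u'' = pi^2 sin(pi x), u(0) = u(1) = 0, is sin(pi x)
err = max(abs(u(xi) - math.sin(math.pi * xi)) for xi in xs)
print(err < 1e-4)   # only quadrature error remains
```

Each value G(x, s) is the response at x to a unit point source at s, so integrating G against f assembles the solution source point by source point, exactly as the superposition argument above describes.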
https://en.wikipedia.org/wiki/Cardiac%20marker
Cardiac markers are biomarkers measured to evaluate heart function. They can be useful in the early prediction or diagnosis of disease. Although they are often discussed in the context of myocardial infarction, other conditions can lead to an elevation in cardiac marker level. Most of the early markers identified were enzymes, and as a result, the term "cardiac enzymes" is sometimes used. However, not all of the markers currently used are enzymes. For example, in formal usage, troponin would not be listed as a cardiac enzyme. Applications of measurement Measuring cardiac biomarkers can be a step toward making a diagnosis for a condition. Whereas cardiac imaging often confirms a diagnosis, simpler and less expensive cardiac biomarker measurements can advise a physician whether more complicated or invasive procedures are warranted. In many cases medical societies advise doctors to make biomarker measurements an initial testing strategy especially for patients at low risk of cardiac death. Many acute cardiac marker IVD products are targeted at nontraditional markets, e.g., the hospital ER instead of traditional hospital or clinical laboratory environments. Competition in the development of cardiac marker diagnostic products and their expansion into new markets is intense. Recently, the intentional destruction of myocardium by alcohol septal ablation has led to the identification of additional potential markers. Types Types of cardiac markers include the following: Limitations Depending on the marker, it can take between 2 and 24 hours for the level to increase in the blood. Additionally, determining the levels of cardiac markers in the laboratory - like many other lab measurements - takes substantial time. Cardiac markers are therefore not useful in diagnosing a myocardial infarction in the acute phase. The clinical presentation and results from an ECG are more appropriate in the acute situation. However, in 2010, research at the Baylor College of Medicine reve
https://en.wikipedia.org/wiki/Sidney%20Redner
Sidney Redner (born 1951) is a Canadian-born physicist, professor, and a resident faculty member at the Santa Fe Institute. He was formerly department chair of physics at Boston University. Redner has published over 200 journal articles, authored a book titled A Guide to First-Passage Processes (2001, ), and coauthored a book titled A Kinetic View of Statistical Physics (2010, ) with Pavel L. Krapivsky and Eli Ben-Naim. His research focuses mainly on non-equilibrium statistical mechanics and network structure. He received his Ph.D. in physics from the Massachusetts Institute of Technology in 1977 under Gene Stanley, also on faculty at Boston University. He was awarded the American Physical Society's Leo P. Kadanoff Prize for 2021.
https://en.wikipedia.org/wiki/Lisa%20Schulte%20Moore
Lisa Schulte Moore is an American landscape ecologist. Schulte Moore is a professor of natural resource ecology and management at Iowa State University. In 2020 she received a $10 million USD grant to study anaerobic digestion and its application to turning manure into usable energy. In 2021 she was named a MacArthur fellow. Work Schulte Moore has worked with farmers to develop resilient and sustainable agricultural practices and systems that take into consideration climate change, water quality and loss of biodiversity. She has written on various ecological topics, including the ecological effects of fire on landscapes, soil carbon storage, biodiversity improvement, and the effects of wind and fire on forests, among others. Awards and honors John D. and Catherine T. MacArthur Foundation Fellowship (2021) Citation for Leadership and Achievement, Council for Scientific Society Presidents (2022)
https://en.wikipedia.org/wiki/Futurebus
Futurebus, or IEEE 896, is a computer bus standard, intended to replace all local bus connections in a computer, including the CPU, memory, plug-in cards and even, to some extent, LAN links between machines. The effort started in 1979 and didn't complete until 1987, and then immediately went into a redesign that lasted until 1994. By this point, implementation of a chip-set based on the standard lacked industry leadership. It has seen little real-world use, although custom implementations continue to be designed and used throughout industry. History In the late 1970s, VMEbus was faster than the parts plugged into it. It was quite reasonable to connect a CPU and RAM to VME on separate cards to build a computer. However, as the speed of the CPUs and RAM rapidly increased, VME was quickly overwhelmed. Increasing the speed of VME was not easy, because all of the parts plugged into it would have to be able to support these faster speeds as well. Futurebus looked to fix these problems and create a successor to systems like VMEbus with a system that could grow in speed without affecting existing devices. In order to do this the primary technology of Futurebus was built using asynchronous links, allowing the devices plugged into it to talk at whatever speed they wished. Another problem that needed to be addressed was the ability to have several cards in the system as "masters", allowing Futurebus to build multiprocessor machines. This required some form of "distributed arbitration" to allow the various cards to gain access to the bus at any point, as opposed to VME, which put a single master in slot 0 with overall control. In order to have a clear performance benefit, Futurebus was designed to have the performance needed ten years in the future. Typical IEEE standards start with a company building a device, and then submitting it to the IEEE for the standardization effort. In the case of Futurebus this was reversed, the whole system was being designed during the standar
https://en.wikipedia.org/wiki/HistoAtlas
HistoAtlas is a free collection of historic geographic information about human culture all over the world. This is achieved as a time-enabled geographic information system (GIS) on the web. All information can be used and edited freely and is intended to be a resource for education, archaeologists, historians and others. Introduction HistoAtlas provides a system to actively maintain historical information derived from historical records and check whether it is consistent. It is not meant for the discovery of new historical facts but to put everything together so it can be presented as one whole story. It is an open project, making sure everyone benefits from it. Anyone can use the information and collaborate on the project. Its main audience is the general public, but it should also contain enough historical detail that historians can enjoy it. HistoAtlas is able to visualize not only the changes in extent of different countries, but also the events that caused those changes, because these are historically more important than just the change of a border. The atlas has information about different aspects of history. A few examples: Political boundaries based on antique historical records or archeological research. Historical events like wars, disasters, discoveries, treaties and journeys that shaped the course of time. Facts about historical figures and their families that played an important role in history. The evolution of cultural aspects like languages and religions. Strategy HistoAtlas aims to be a free, multilingual historical encyclopedia, intended as the most precise open content global historical reference. In concept there are some similarities to what online encyclopedias like Wikipedia are doing, with differences in how data is stored, presented and used. It is a history-oriented geographic information system. Information is structured more transparently so it can be searched more efficiently and presented in differen
https://en.wikipedia.org/wiki/Juan%20Bimba
Juan Bimba is a fictitious character used in the past as the national personification of Venezuela, but is now regarded as obsolete. According to the local folklore of the region of Cumaná the name comes from a mentally ill local inhabitant of the 1850s; but this version is doubtful. It was first used by Juan Vicente González, a Venezuelan columnist of the 19th century as an example of the average Venezuelan peasant, the prototype of the common people. The cartoon was drawn by cartoonist Mariano Medina Febres in the 1930s Use in politics The name was used and preserved by Andrés Eloy Blanco in several poems and in the Fantoches magazine. A sociopolitical essay by the poet, in 1936, revolving particularly on socialism and communism in Venezuelan history, was entitled Carta a Juan Bimba («A Letter to Juan Bimba»). Acción Democrática, one of the two leading political parties of Venezuela in the 20th century, used and further popularized the name and created an image to accompany the symbolism of their 1963 electoral campaign's motto: El Partido del Pueblo («The People's Party»), especially since the country's supreme court banned the use of their official flag. Recently, former Venezuelan president Hugo Chávez was seen performing a humoristic version of himself as Juan Bimba, particularly during political campaigning tactics, portraying the image of a humble llanero. See also Brother Jonathan Doña Juanita Huaso
https://en.wikipedia.org/wiki/CpG%20site
The CpG sites or CG sites are regions of DNA where a cytosine nucleotide is followed by a guanine nucleotide in the linear sequence of bases along its 5' → 3' direction. CpG sites occur with high frequency in genomic regions called CpG islands (or CG islands). Cytosines in CpG dinucleotides can be methylated to form 5-methylcytosines. Enzymes that add a methyl group are called DNA methyltransferases. In mammals, 70% to 80% of CpG cytosines are methylated. Methylating the cytosine within a gene can change its expression, a mechanism that is part of a larger field of science studying gene regulation that is called epigenetics. Methylated cytosines often mutate to thymines. In humans, about 70% of promoters located near the transcription start site of a gene (proximal promoters) contain a CpG island. CpG characteristics Definition CpG is shorthand for 5'—C—phosphate—G—3' , that is, cytosine and guanine separated by only one phosphate group; phosphate links any two nucleosides together in DNA. The CpG notation is used to distinguish this single-stranded linear sequence from the CG base-pairing of cytosine and guanine for double-stranded sequences. The CpG notation is therefore to be interpreted as the cytosine being 5 prime to the guanine base. CpG should not be confused with GpC, the latter meaning that a guanine is followed by a cytosine in the 5' → 3' direction of a single-stranded sequence. Under-representation caused by high mutation rate CpG dinucleotides have long been observed to occur with a much lower frequency in the sequence of vertebrate genomes than would be expected due to random chance. For example, in the human genome, which has a 42% GC content, a pair of nucleotides consisting of cytosine followed by guanine would be expected to occur 0.21 × 0.21 = 4.41% of the time. The frequency of CpG dinucleotides in human genomes is less than one-fifth of the expected frequency. This underrepresentation is a consequence of the high mutation rate of methylated CpG sites: t
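The expected-frequency calculation above is a one-liner worth making explicit; the only input is the 42% GC content from the text, plus the assumption that C and G are equally frequent.

```python
gc_content = 0.42                 # human genome GC fraction (from the text)
p_c = p_g = gc_content / 2        # assume C and G equally frequent
expected_cpg = p_c * p_g          # chance a random dinucleotide is C-then-G
print(round(expected_cpg, 4))     # 0.0441, i.e. ~4.41% of dinucleotides

# The text reports the observed CpG frequency at under one-fifth of this:
observed_cap = expected_cpg / 5
print(round(observed_cap, 4))     # 0.0088 -- upper bound implied by the text
```

The gap between 4.41% expected and under 0.9% observed is the under-representation that the high mutation rate of methylated CpG sites explains.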
https://en.wikipedia.org/wiki/Fieller%27s%20theorem
In statistics, Fieller's theorem allows the calculation of a confidence interval for the ratio of two means. Approximate confidence interval Variables a and b may be measured in different units, so there is no way to directly combine the standard errors as they may also be in different units. The most complete discussion of this is given by Fieller (1954). Fieller showed that if a and b are (possibly correlated) means of two samples with expectations μa and μb, and variances ν11σ² and ν22σ², and covariance ν12σ², and if ν11, ν12, ν22 are all known, then a (1 − α) confidence interval (mL, mU) for μa/μb is given by mL, mU = (1/(1 − g)) [ m − (g ν12/ν22) ∓ (t s / b) √( ν11 − 2 m ν12 + m² ν22 − g (ν11 − ν12²/ν22) ) ] where m = a/b and g = t² s² ν22 / b². Here s² is an unbiased estimator of σ² based on r degrees of freedom, and t is the α-level deviate from the Student's t-distribution based on r degrees of freedom. Three features of this formula are important in this context: a) The expression inside the square root has to be positive, or else the resulting interval will be imaginary. b) When g is very close to 1, the confidence interval is infinite. c) When g is greater than 1, the overall divisor outside the square brackets is negative and the confidence interval is exclusive. Other methods One problem is that, when g is not small, the confidence interval can blow up when using Fieller's theorem. Andy Grieve has provided a Bayesian solution where the CIs are still sensible, albeit wide. Bootstrapping provides another alternative that does not require the assumption of normality. History Edgar C. Fieller (1907–1960) first started working on this problem while in Karl Pearson's group at University College London, where he was employed for five years after graduating in Mathematics from King's College, Cambridge. He then worked for the Boots Pure Drug Company as a statistician and operational researcher before becoming deputy head of operational research at RAF Fighter Command during the Second World War, after which he was appointed the first head of the Statistics Section at the National Physical Laboratory. See also Gaussian
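The interval can be computed directly from the quantities in the formula. The sketch below is a minimal implementation under the stated notation; the function name, the synthetic inputs, and the fixed t deviate are illustrative assumptions.

```python
import math

def fieller_ci(a, b, v11, v12, v22, s2, t):
    """Fieller confidence interval for the ratio of two means.
    a, b: sample means; v11*s2, v22*s2: their variances; v12*s2: the
    covariance; t: Student-t deviate for the chosen level and degrees
    of freedom.  Returns (mL, mU); raises if the discriminant is
    negative (the 'imaginary interval' case noted in the text)."""
    m = a / b
    g = t**2 * s2 * v22 / b**2
    disc = v11 - 2*m*v12 + m**2*v22 - g*(v11 - v12**2/v22)
    if disc < 0:
        raise ValueError("interval is imaginary for these inputs")
    half = (t * math.sqrt(s2) / b) * math.sqrt(disc)
    lo = (m - g*v12/v22 - half) / (1 - g)
    hi = (m - g*v12/v22 + half) / (1 - g)
    return lo, hi

# Uncorrelated means with small variance: interval brackets a/b = 2.0
lo, hi = fieller_ci(a=10.0, b=5.0, v11=1.0, v12=0.0, v22=1.0,
                    s2=0.04, t=1.96)
print(lo, hi)
```

With precise data g is small and the interval is a narrow bracket around a/b; pushing s2 or t up drives g toward 1, reproducing the blow-up behavior described in features b) and c).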
https://en.wikipedia.org/wiki/ISCB%20Innovator%20Award
The ISCB Innovator Award is a computational biology prize awarded annually to leading scientists who are within two decades post-degree, who consistently make outstanding contributions to the field, and who continue to forge new directions. The prize was established by the International Society for Computational Biology (ISCB) in 2016 and is awarded at the Intelligent Systems for Molecular Biology (ISMB) conference. The inaugural recipient was Serafim Batzoglou. Laureates 2021 - Benjamin J. Raphael 2020 - Xiaole Shirley Liu 2019 - 2018 - M. Madan Babu 2017 - Aviv Regev 2016 - Serafim Batzoglou Other ISCB prizes Overton Prize - "for outstanding accomplishment to a scientist in the early to mid stage of his or her career" ISCB Senior Scientist Award - "members of the computational biology community who are more than 12 to 15 years post-degree and have made major contributions to the field of computational biology through research, education, service, or a combination of the three" See also List of biology awards
https://en.wikipedia.org/wiki/Pappus%27s%20centroid%20theorem
In mathematics, Pappus's centroid theorem (also known as the Guldinus theorem, Pappus–Guldinus theorem or Pappus's theorem) is either of two related theorems dealing with the surface areas and volumes of surfaces and solids of revolution. The theorems are attributed to Pappus of Alexandria and Paul Guldin. Pappus's statement of this theorem appears in print for the first time in 1659, but it was known before, by Kepler in 1615 and by Guldin in 1640. The first theorem The first theorem states that the surface area A of a surface of revolution generated by rotating a plane curve C about an axis external to C and on the same plane is equal to the product of the arc length s of C and the distance d traveled by the geometric centroid of C: A = sd. For example, the surface area of the torus with minor radius r and major radius R is A = (2πr)(2πR) = 4π²Rr. Proof A curve given by the positive function f(x) is bounded by two points given by: x = a and x = b. If ds is an infinitesimal line element tangent to the curve, the length of the curve is given by: s = ∫_a^b ds = ∫_a^b √(1 + f′(x)²) dx. The y component of the centroid of this curve is: ȳ = (1/s) ∫_a^b f(x) ds. The area of the surface generated by rotating the curve around the x-axis is given by: A = 2π ∫_a^b f(x) ds. Using the last two equations to eliminate the integral we have: A = (2πȳ)s = sd, since the centroid travels the distance d = 2πȳ. The second theorem The second theorem states that the volume V of a solid of revolution generated by rotating a plane figure F about an external axis is equal to the product of the area A of F and the distance d traveled by the geometric centroid of F. (The centroid of F is usually different from the centroid of its boundary curve C.) That is: V = Ad. For example, the volume of the torus with minor radius r and major radius R is V = (πr²)(2πR) = 2π²Rr². This special case was derived by Johannes Kepler using infinitesimals. Proof 1 The area bounded by the two functions: f(x) ≥ g(x), and bounded by the two lines: x = a and x = b, is given by: A = ∫_a^b [f(x) − g(x)] dx. The x component of the centroid of this area is given by: x̄ = (1/A) ∫_a^b x [f(x) − g(x)] dx. If this area is rotated about the y-axis, the volume generated can be calculated using the shell method. It is given
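The torus example of the first theorem can be verified numerically by integrating the surface of revolution directly; the helper name and the sample radii are illustrative.

```python
import math

def torus_area_numeric(R, r, n=100000):
    """Numerically integrate the surface of revolution of a circle of
    radius r centred at distance R from the axis: each arc element
    r*dt sweeps out a band of circumference 2*pi*(R + r*cos(t))."""
    total = 0.0
    dt = 2 * math.pi / n
    for i in range(n):
        t = (i + 0.5) * dt                  # midpoint rule on [0, 2*pi)
        rho = R + r * math.cos(t)           # distance of the point from the axis
        total += 2 * math.pi * rho * r * dt
    return total

R, r = 3.0, 1.0
numeric = torus_area_numeric(R, r)
pappus = (2 * math.pi * r) * (2 * math.pi * R)   # A = s*d = 4*pi^2*R*r
print(abs(numeric - pappus) < 1e-6 * pappus)     # True
```

The cos(t) term integrates to zero over a full period, so the numeric sum collapses to exactly (2πr)(2πR): the off-centroid contributions cancel, which is the content of the theorem.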
https://en.wikipedia.org/wiki/Commutator%20%28electric%29
A commutator is a rotary electrical switch in certain types of electric motors and electrical generators that periodically reverses the current direction between the rotor and the external circuit. It consists of a cylinder composed of multiple metal contact segments on the rotating armature of the machine. Two or more electrical contacts called "brushes" made of a soft conductive material like carbon press against the commutator, making sliding contact with successive segments of the commutator as it rotates. The windings (coils of wire) on the armature are connected to the commutator segments. Commutators are used in direct current (DC) machines: dynamos (DC generators) and many DC motors as well as universal motors. In a motor the commutator applies electric current to the windings. By reversing the current direction in the rotating windings each half turn, a steady rotating force (torque) is produced. In a generator the commutator picks off the current generated in the windings, reversing the direction of the current with each half turn, serving as a mechanical rectifier to convert the alternating current from the windings to unidirectional direct current in the external load circuit. The first direct current commutator-type machine, the dynamo, was built by Hippolyte Pixii in 1832, based on a suggestion by André-Marie Ampère. Commutators are relatively inefficient, and also require periodic maintenance such as brush replacement. Therefore, commutated machines are declining in use, being replaced by alternating current (AC) machines, and in recent years by brushless DC motors which use semiconductor switches. Principle of operation A commutator consists of a set of contact bars fixed to the rotating shaft of a machine, and connected to the armature windings. As the shaft rotates, the commutator reverses the flow of current in a winding. For a single armature winding, when the shaft has made one-half complete turn, the winding is now connected so t
https://en.wikipedia.org/wiki/Liquid%20crystal%20on%20silicon
Liquid crystal on silicon (LCoS or LCOS) is a miniaturized reflective active-matrix liquid-crystal display or "microdisplay" using a liquid crystal layer on top of a silicon backplane. It is also referred to as a spatial light modulator. LCoS was initially developed for projection televisions but is now used for wavelength selective switching, structured illumination, near-eye displays and optical pulse shaping. By way of comparison, some LCD projectors use transmissive LCD, allowing light to pass through the liquid crystal. In an LCoS display, a complementary metal–oxide–semiconductor (CMOS) chip controls the voltage on square reflective aluminium electrodes buried just below the chip surface, each controlling one pixel. For example, a chip with XGA resolution will have 1024×768 plates, each with an independently addressable voltage. Typical cells are about 1–3 centimeters square and about 2 mm thick, with pixel pitch as small as 2.79 μm. A common voltage for all the pixels is supplied by a transparent conductive layer made of indium tin oxide on the cover glass. Displays History The history of LCoS projectors dates back to the late 1980s when the technology was first developed. At the time, the primary use of LCoS projectors was in the military and scientific fields due to their large and bulky size. However, in the late 1990s, companies like JVC and Hughes Electronics began developing smaller and more affordable LCoS projectors for commercial use. The early LCoS projectors had their challenges. They suffered from a phenomenon called "image sticking," where the image would remain on the screen after it was supposed to be gone. This was due to the mirrors sticking in their positions, which resulted in ghosting on the screen. However, manufacturers continued to refine the technology, and today's LCoS projectors have largely overcome this issue. One of the biggest milestones in the history of LCoS projectors came in 2004 when Sony introduced its SXRD (Silicon X
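The resolution and pixel-pitch figures quoted above fix the size of the mirror array. A quick arithmetic check — pairing the XGA resolution with the 2.79 μm pitch is an illustrative assumption, since the two numbers come from different sentences:

```python
# Quick arithmetic on the figures quoted above: an XGA (1024 x 768) LCoS
# backplane with a 2.79 micrometre pixel pitch. Pairing these two specific
# numbers is an illustrative assumption; real panels vary.
cols, rows = 1024, 768
pitch_um = 2.79

n_pixels = cols * rows
width_mm = cols * pitch_um / 1000.0
height_mm = rows * pitch_um / 1000.0

print(n_pixels)                                  # 786432 addressable mirrors
print(round(width_mm, 2), round(height_mm, 2))   # ~2.86 x 2.14 mm active area
```

This shows why such microdisplays need magnifying projection optics: the entire active area is only a few millimetres across.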
https://en.wikipedia.org/wiki/Jan%20Gullberg
Jan Gullberg (1936 – 21 May 1998) was a Swedish surgeon and anaesthesiologist who became known as a writer on popular science and medical topics. He is best known outside Sweden as the author of Mathematics: From the Birth of Numbers, published by W. W. Norton in 1997 (). Life Gullberg grew up and was trained as a surgeon in Sweden. He qualified in medicine at the University of Lund in 1964. He practised as a surgeon in Saudi Arabia, Norway and Virginia Mason Hospital, Seattle in the United States, as well as in Sweden. Gullberg saw himself as a doctor rather than a writer. His first book, on science, won the Swedish Medical Society's Jubilee Prize in 1980, and saw him promoted to honorary doctor at the University of Lund the same year. He was twice married: first to Anne-Marie Hallin (d. 1983), with whom he had three children; and second to Ann Richardson (b. 1951), with whom he adopted twin sons, Kamen and Kalin. He died of a stroke in Nordfjordeid, Norway at the hospital where he was working. Mathematics: From the Birth of Numbers Gullberg's second (and last) book, Mathematics: From the Birth of Numbers, took ten years to write, consuming all of his spare time. It proved a major success; its first edition of 17,000 copies was virtually sold out within six months. Contents The book's 1093 pages address the following topics:
Numbers and Language
Systems of Numeration
Types of Numbers
Cornerstones of Mathematics
Combinatorics
Symbolic Logic
Set Theory
Introduction to Sequences and Series
Theory of Equations
Introduction to Functions
Overture to the Geometries
Elementary Geometry
Trigonometry
Hyperbolic Functions
Analytic Geometry
Vector Analysis
Fractals
Matrices and Determinants
Embarking on Calculus
Introduction to Differential Calculus
Introduction to Integral Calculus
Power Series
Indeterminate Limits
Complex Numbers Revisited
Extrema and Critical Points
Arc Length
Centroids
Area
Volume
Motion
Harmonic Analysis
Methods of Approxim
https://en.wikipedia.org/wiki/Probabilistic%20metric%20space
In mathematics, probabilistic metric spaces are a generalization of metric spaces where the distance no longer takes values in the non-negative real numbers, but in distribution functions. Let D+ be the set of all probability distribution functions F such that F(0) = 0 (F is a nondecreasing, left-continuous mapping from R into [0, 1] such that max(F) = 1). Then given a non-empty set S and a function F: S × S → D+, where we denote F(p, q) by Fp,q for every (p, q) ∈ S × S, the ordered pair (S, F) is said to be a probabilistic metric space if: For all u and v in S, u = v if and only if Fu,v(x) = 1 for all x > 0. For all u and v in S, Fu,v = Fv,u. For all u, v and w in S, Fu,v(x) = 1 and Fv,w(y) = 1 implies Fu,w(x + y) = 1, for all x, y > 0. Probability metric of random variables A probability metric D between two random variables X and Y may be defined, for example, as D(X, Y) = ∫∫ |x − y| F(x, y) dx dy, where F(x, y) denotes the joint probability density function of the random variables X and Y. If X and Y are independent from each other then the equation above transforms into D(X, Y) = ∫∫ |x − y| f(x) g(y) dx dy, where f(x) and g(y) are probability density functions of X and Y respectively. One may easily show that such probability metrics do not satisfy the first metric axiom, or satisfy it if, and only if, both arguments X and Y are certain events described by Dirac delta probability density functions. In this case, the probability metric simply transforms into the metric between the expected values μX, μY of the variables X and Y. For all other random variables X, Y the probability metric does not satisfy the identity of indiscernibles condition required to be satisfied by the metric of the metric space; that is, D(X, X) need not equal zero. Example For example, if both probability distribution functions of random variables X and Y are normal distributions (N) having the same standard deviation σ, integrating yields D(X, Y) = μ + (2σ/√π) exp(−μ²/4σ²) − μ erfc(μ/2σ), where μ = |μX − μY| and erfc is the complementary error function. In this case, D(X, X) = 2σ/√π. Probability metric of random vectors The probability metric of random variables may be extended into metric D(X, Y) of random vectors X, Y by sub
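The probability metric D(X, Y) = E|X − Y| for independent variables can be checked numerically. The sketch below uses illustrative parameters (X ~ N(0, 1), Y ~ N(1, 1), which are not from the text) and compares a Monte Carlo estimate against the closed form for two normals with equal standard deviation:

```python
import math
import numpy as np

# Monte Carlo sketch of the probability metric D(X, Y) = E|X - Y| for
# independent X ~ N(0, 1) and Y ~ N(1, 1) (illustrative values), compared
# with the closed form D = mu + (2*sigma/sqrt(pi)) exp(-mu^2/4sigma^2)
#                          - mu * erfc(mu/2sigma), where mu = |E X - E Y|.
rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0, 1_000_000)
y = rng.normal(1.0, 1.0, 1_000_000)
d_mc = np.abs(x - y).mean()

mu, sigma = 1.0, 1.0
d_exact = (mu
           + (2 * sigma / math.sqrt(math.pi)) * math.exp(-mu**2 / (4 * sigma**2))
           - mu * math.erfc(mu / (2 * sigma)))

print(d_mc, d_exact)  # both close to 1.399
```

Note that even for X sampled against an independent copy of itself the metric is positive (2σ/√π here), which is exactly the failure of the identity of indiscernibles discussed above.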
https://en.wikipedia.org/wiki/Chloragogen%20cell
Chloragogen cells, also called y cells, are star-shaped cells in annelids involved with excretory functions and intermediary metabolism. These cells function similarly to the liver found in vertebrates. Chloragogen tissue is most extensively studied in earthworms. Structure and location These cells are derived from the inner coelomic epithelium and are present in the coelomic fluid of some annelids. They have characteristic vesicular bulging due to their function in storing and transporting substances, and are yellow due to the presence of cytosolic granules known as chloragosomes. Function The most understood function of chloragogen tissue is its function in the excretory system. The cells accumulate and excrete nitrogenous wastes and silicates. They are involved in the deamination of amino acids, synthesis of urea, storage of glycogen and toxin neutralization.
https://en.wikipedia.org/wiki/GIS%20and%20Ichthyology
A Geographic Information System (GIS) is a tool for mapping and analyzing data. The ability to layer many features onto the same map and select or deselect them as needed allows for a multitude of views and ease of interpreting data. More importantly, this allows for in-depth scientific analysis and problem solving. Ichthyology involves many areas of study related to fishes and their habitat. The natural habitat is water, but fish are dependent upon many other factors. Water quality, water type, food, cover, and sediment are essential for the life cycle of any given fish. Being able to map the presence of certain species with layers of these features provides invaluable insight into species requirements. GIS is an essential tool that allows immediate visualization of all data present, making it possible to accurately interpret impacts of habitat degradation or species success. GIS GIS is useful when data is specific to a location. It is used to classify, analyze and understand data relationships based on location, and then to draw conclusions from the data. Data capture can occur in the field on small, handheld GPS devices, and the data can then be imported and compared to an existing map. This freedom of movement between field and computer is critical to streamlining data collection in field endeavors and generating more accurate data sets. Ichthyology Ichthyology requires an understanding of species' geographic requirements. Depending on the species, fish require different abiotic environments or sediments to successfully complete their biological life cycles. Serious examinations of species should always include habitat, because habitat differences create changes in population. Sediment could thereby be mapped, and changes in sediment could easily be verified using previous records while simultaneously showing changes in resident fish populations. Various factors relating to the fish life cycle, such as food sources, migration patterns, changes in spawning grounds, could all be more accurately explained and documente
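The layering idea — joining field-collected GPS observations against a habitat layer — can be sketched in a few lines. All records, coordinates, and the rectangular "polygons" below are hypothetical; a real GIS would use true polygon geometry and spatial indexing.

```python
# Minimal sketch of a spatial join: assigning each field-collected fish
# observation the sediment type of the habitat cell containing it.
# All data here are hypothetical illustrations.
observations = [
    {"species": "darter", "lon": -83.12, "lat": 35.41},
    {"species": "sculpin", "lon": -83.30, "lat": 35.55},
]
# Habitat layer: (min_lon, min_lat, max_lon, max_lat) -> sediment type
sediment_layer = {
    (-83.20, 35.35, -83.00, 35.50): "gravel",
    (-83.40, 35.50, -83.20, 35.60): "silt",
}

def sediment_at(lon, lat):
    """Look up the sediment cell containing a GPS point (None if unmapped)."""
    for (x0, y0, x1, y1), sediment in sediment_layer.items():
        if x0 <= lon <= x1 and y0 <= lat <= y1:
            return sediment
    return None

for obs in observations:
    obs["sediment"] = sediment_at(obs["lon"], obs["lat"])

print(observations)
```

Once species records carry habitat attributes like this, changes in sediment and changes in resident populations can be compared directly across survey years.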
https://en.wikipedia.org/wiki/Grawlix
Grawlix (), also known as obscenicon, is a combination of various typographical symbols or other unpronounceable characters that replaces a profanity. It is mainly used in cartoons and comics. It is used to get around language restrictions or censorship in publishing. At signs (@), dollar signs ($), hashtag sign (#), ampersands (&), percent signs (%), and asterisks (*) are symbols that are often included in a grawlix. History The usage of grawlix can be seen as far back as November 1, 1901, where it appeared in a Lady Bountiful comic. In Lady Bountiful, grawlixes expanded in usage in 1902 to 1903. However, most of the other cartoons were yet to use this new feature. Cartoons such as The Katzenjammer Kids and Lady Bountiful helped to spread grawlix across other comics and media. In 1964, American cartoonist Mort Walker coined the term "grawlix" in his article Let's Get Down to Grawlixes. He elaborated on this further in his book The Lexicon of Comicana. The emoji represents a face with grawlixes over the mouth. It was proposed in 2016 and accepted into Unicode 10.0 in 2017. In November 2022, Merriam-Webster and Hasbro added the word to the seventh edition of The Official Scrabble Players Dictionary, citing familiarity among younger players. Etymology According to the Merriam-Webster post, the word grawlix may have come from the word growl, which is a sound a person makes when they are angry. Example "Come this fall, CBS will debut a 7:30 p.m. sitcom starring 79-year-old William Shatner. The title is $#*! My Dad Says. The opening profanity symbols (called grawlixes) will be pronounced "bleep," but we all know what it stands for." — Michael Storey, The Arkansas Democrat-Gazette, 20 July 2010
https://en.wikipedia.org/wiki/Randomness%20extractor
A randomness extractor, often simply called an "extractor", is a function which, when applied to output from a weak entropy source together with a short, uniformly random seed, generates a highly random output that appears independent from the source and uniformly distributed. Examples of weakly random sources include radioactive decay or thermal noise; the only restriction on possible sources is that there is no way they can be fully controlled, calculated or predicted, and that a lower bound on their entropy rate can be established. For a given source, a randomness extractor can even be considered to be a true random number generator (TRNG); but there is no single extractor that has been proven to produce truly random output from any type of weakly random source. Sometimes the term "bias" is used to denote a weakly random source's departure from uniformity, and in older literature, some extractors are called unbiasing algorithms, as they take the randomness from a so-called "biased" source and output a distribution that appears unbiased. The weakly random source will always be longer than the extractor's output, but an efficient extractor is one that lowers this ratio of lengths as much as possible, while simultaneously keeping the seed length low. Intuitively, this means that as much randomness as possible has been "extracted" from the source. Note that an extractor has some conceptual similarities with a pseudorandom generator (PRG), but the two concepts are not identical. Both are functions that take as input a small, uniformly random seed and produce a longer output that "looks" uniformly random. Some pseudorandom generators are, in fact, also extractors. (When a PRG is based on the existence of hard-core predicates, one can think of the weakly random source as a set of truth tables of such predicates and prove that the output is statistically close to uniform.) However, the general PRG definition does not specify that a weakly random source must be u
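The classic example of an unbiasing algorithm — not described in this excerpt, but the standard illustration — is von Neumann's procedure: read a biased coin in pairs, emit 0 for the pair 01, 1 for 10, and discard 00 and 11. Strictly speaking it assumes independent flips and needs no seed, so it is simpler than the seeded extractors defined above.

```python
import random

def von_neumann_extract(bits):
    """Von Neumann unbiasing: map pair 01 -> 0, 10 -> 1, drop 00 and 11.
    Assumes the input bits are independent with a fixed (unknown) bias."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)  # pair 01 yields 0, pair 10 yields 1
    return out

random.seed(1)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]  # 80% ones
unbiased = von_neumann_extract(biased)
print(sum(unbiased) / len(unbiased))  # close to 0.5
```

The output is much shorter than the input (here roughly a third of the pairs survive), illustrating the length ratio mentioned above: unbiasing costs bits.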
https://en.wikipedia.org/wiki/Geometric%20quotient
In algebraic geometry, a geometric quotient of an algebraic variety X with the action of an algebraic group G is a morphism of varieties π: X → Y such that (i) For each y in Y, the fiber π^{-1}(y) is an orbit of G. (ii) The topology of Y is the quotient topology: a subset U ⊂ Y is open if and only if π^{-1}(U) is open. (iii) For any open subset U ⊂ Y, π#: k[U] → k[π^{-1}(U)]^G is an isomorphism. (Here, k is the base field.) The notion appears in geometric invariant theory. (i), (ii) say that Y is an orbit space of X in topology. (iii) may also be phrased as an isomorphism of sheaves O_Y ≅ (π_* O_X)^G. In particular, if X is irreducible, then so is Y and k(Y) = k(X)^G: rational functions on Y may be viewed as invariant rational functions on X (i.e., rational-invariants of X). For example, if H is a closed subgroup of G, then G → G/H is a geometric quotient. A GIT quotient may or may not be a geometric quotient; but both are categorical quotients, and a categorical quotient is unique. In other words, when both types of quotients exist, they must coincide. Relation to other quotients A geometric quotient is a categorical quotient. This is proved in Mumford's geometric invariant theory. A geometric quotient is precisely a good quotient whose fibers are orbits of the group. Examples The canonical map A^{n+1} ∖ {0} → P^n is a geometric quotient. If L is a linearized line bundle on an algebraic G-variety X, then, writing X^s(L) for the set of stable points with respect to L, the quotient X^s(L) → X^s(L)/G is a geometric quotient.
https://en.wikipedia.org/wiki/X-ray%20astronomy
X-ray astronomy is an observational branch of astronomy which deals with the study of X-ray observation and detection from astronomical objects. X-radiation is absorbed by the Earth's atmosphere, so instruments to detect X-rays must be taken to high altitude by balloons, sounding rockets, and satellites. X-ray astronomy uses a type of space telescope that can see x-ray radiation which standard optical telescopes, such as the Mauna Kea Observatories, cannot. X-ray emission is expected from astronomical objects that contain extremely hot gases at temperatures from about a million kelvin (K) to hundreds of millions of kelvin (MK). Moreover, the maintenance of the E-layer of ionized gas high in the Earth's thermosphere also suggested a strong extraterrestrial source of X-rays. Although theory predicted that the Sun and the stars would be prominent X-ray sources, there was no way to verify this because Earth's atmosphere blocks most extraterrestrial X-rays. It was not until ways of sending instrument packages to high altitudes were developed that these X-ray sources could be studied. The existence of solar X-rays was confirmed early in the mid-twentieth century by V-2s converted to sounding rockets, and the detection of extra-terrestrial X-rays has been the primary or secondary mission of multiple satellites since 1958. The first cosmic (beyond the Solar System) X-ray source was discovered by a sounding rocket in 1962. Called Scorpius X-1 (Sco X-1) (the first X-ray source found in the constellation Scorpius), the X-ray emission of Scorpius X-1 is 10,000 times greater than its visual emission, whereas that of the Sun is about a million times less. In addition, the energy output in X-rays is 100,000 times greater than the total emission of the Sun in all wavelengths. Many thousands of X-ray sources have since been discovered. In addition, the intergalactic space in galaxy clusters is filled with a hot, but very dilute gas at a temperature between 100 and 1000 megakelv
https://en.wikipedia.org/wiki/Thermal%20pressure
In thermodynamics, thermal pressure (also known as the thermal pressure coefficient) is a measure of the relative pressure change of a fluid or a solid as a response to a temperature change at constant volume. The concept is related to the Pressure-Temperature Law, also known as Amontons's law or Gay-Lussac's law. In general, pressure P(V, T) can be written as the following sum: P(V, T) = P(V, T0) + ΔP_th(V, T). P(V, T0) is the pressure required to compress the material from its volume V0 to volume V at a constant temperature T0. The second term expresses the change in thermal pressure ΔP_th. This is the pressure change at constant volume due to the temperature difference between T and T0. Thus, it is the pressure change along an isochore of the material. The thermal pressure coefficient is customarily expressed in its simple form as γ_v = (∂P/∂T)_V. Thermodynamic definition Because of the equivalences between many properties and derivatives within thermodynamics (e.g., see Maxwell Relations), there are many formulations of the thermal pressure coefficient, which are equally valid, leading to distinct yet correct interpretations of its meaning. Some formulations for the thermal pressure coefficient include: γ_v = (∂P/∂T)_V = α_V K_T = α_V/β_T = γ C_V/V, where α_V is the volume thermal expansion, K_T the isothermal bulk modulus, γ the Grüneisen parameter, β_T = 1/K_T the isothermal compressibility and C_V the constant-volume heat capacity. The utility of the thermal pressure The thermal pressure coefficient can be considered as a fundamental property; it is closely related to various properties such as internal pressure, sonic velocity, the entropy of melting, isothermal compressibility, isobaric expansibility, phase transition, etc. Thus, the study of the thermal pressure coefficient provides a useful basis for understanding the nature of liquid and solid. Since it is normally difficult to obtain the properties by thermodynamic and statistical mechanics methods due to complex interactions among molecules, experimental methods attract much attention. The thermal pressure coefficient is used
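A short numerical sketch of the relation (∂P/∂T)_V = α_V K_T. The material values below are rough, MgO-like illustrative numbers chosen for the example, not taken from the text, and the coefficient is assumed constant over the temperature interval:

```python
# Sketch: thermal pressure from the coefficient (dP/dT)_V = alpha_V * K_T,
# assuming alpha_V and K_T are constant over the temperature interval.
# The material values are rough, MgO-like illustrative numbers.
alpha_v = 3.0e-5   # volume thermal expansion, 1/K
k_t = 160e9        # isothermal bulk modulus, Pa
dT = 500.0         # temperature increase at constant volume, K

gamma_v = alpha_v * k_t          # thermal pressure coefficient, Pa/K
delta_p_thermal = gamma_v * dT   # pressure change along the isochore, Pa

print(gamma_v / 1e6, "MPa/K")        # 4.8 MPa/K
print(delta_p_thermal / 1e9, "GPa")  # 2.4 GPa
```

The GPa-scale result shows why thermal pressure matters for solids: heating a stiff material that cannot expand generates very large pressures.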
https://en.wikipedia.org/wiki/Nephelescope
A nephelescope is a device invented by James Pollard Espy to measure the drop in temperature of a gas from a reduction in pressure; originally used to explore the formation of clouds. Original design The original design consisted of an air compression pump (a), a vessel (b), and a barometer (c). Air is pumped into the vessel until a desired pressure is reached, the stopcock is then closed and the temperature allowed to equilibrate. The stopcock is then opened, allowing the pressure of the container to equilibrate with the atmosphere, and then closed again. The air inside of the container would now be colder. As it warms up, pressure inside the container once again increases above atmospheric pressure. This increase in pressure can be used to work out the number of degrees by which the container had been cooled. Later developments A later design consisted of an air pump receiver (a) connected to a flask (c) by an intervening stopcock (b). Air was pumped out of the receiver, then the stopcock was opened. One advantage of using negative pressure was that a glass vessel could be used, which allowed the observation of condensation and droplets resulting from the drop in temperature. To observe this in a dry atmosphere, air would have needed to first be moistened by exposure to water. Historical significance The nephelescope enabled Espy to predict the change in heat of air as water vapor became cloud. He showed that when dry air was used instead of moist air, temperature was reduced by about twice as much as moist air. In other words, latent heat released from the condensation of water mitigated some of the cooling from expansion of moist air. Since moist air is already lighter than dry air, the warmer and lighter moist air in clouds would continue to rise and cool, forcing more vapor to condense, which had consequences for meteorological theories at that time. The nephelescope has been described as an "early cloud-chamber".
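The inference step — working out the cooling from the pressure rebound — follows from Amontons's law at constant volume: once the stopcock is closed again, the trapped air warms back to room temperature without changing volume, so P/T stays constant. The numbers below are illustrative, and the treatment assumes an ideal gas:

```python
# Sketch of the nephelescope inference: after the stopcock is closed again,
# the trapped air re-warms at constant volume, so Amontons's law applies:
# P_cold / T_cold = P_warm / T_warm. All numbers are illustrative.
p_atm = 101_325.0     # pressure just after closing the stopcock, Pa
t_room = 293.15       # room temperature the air warms back to, K
dp_rebound = 3_000.0  # observed pressure rise after re-warming, Pa

t_cold = t_room * p_atm / (p_atm + dp_rebound)
cooling = t_room - t_cold
print(round(cooling, 2), "K of cooling")  # about 8.4 K
```

So a rebound of a few kilopascals on the barometer translates into a cooling of several kelvin, which Espy could read off without any thermometer inside the vessel.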
https://en.wikipedia.org/wiki/Paramormyrops
Paramormyrops is a genus of elephantfish in the family Mormyridae from Africa. Species There are currently 11 recognized species in this genus:
Paramormyrops batesii (Boulenger 1906) (Kribi mormyrid)
Paramormyrops curvifrons (Taverne, Thys van den Audenaerde, Heymer & Géry, 1977) (Ivindo mormyrid)
Paramormyrops gabonensis Taverne, Thys van den Audenaerde & Heymer 1977 (Makokou mormyrid)
Paramormyrops hopkinsi (Taverne & Thys van den Audenaerde 1985) (Ivindo electric fish)
Paramormyrops jacksoni (Poll 1967) (Ghost stonebasher)
Paramormyrops kingsleyae (Günther 1896) (Old Calabar mormyrid)
Paramormyrops longicaudatus (Taverne, Thys van den Audenaerde, Heymer & Géry 1977) (longtailed mormyrid)
Paramormyrops ntotom Rich, Sullivan & Hopkins 2017 (Doume elephantfish)
Paramormyrops retrodorsalis (Nichols & Griscom, 1917) (Bima elephantfish)
Paramormyrops sphekodes (Sauvage 1879)
Paramormyrops tavernei (Poll 1972) (Kipepe elephantfish)
https://en.wikipedia.org/wiki/Raymond%20McLenaghan
Raymond George McLenaghan (born 14 April 1939) is a Canadian theoretical physicist and mathematician. With Carminati, he is known for Carminati–McLenaghan invariants.
https://en.wikipedia.org/wiki/Kappa%20calculus
In mathematical logic, category theory, and computer science, kappa calculus is a formal system for defining first-order functions. Unlike lambda calculus, kappa calculus has no higher-order functions; its functions are not first-class objects. Kappa-calculus can be regarded as "a reformulation of the first-order fragment of typed lambda calculus". Because its functions are not first-class objects, evaluation of kappa calculus expressions does not require closures. Definition The definition below has been adapted from the diagrams on pages 205 and 207 of Hasegawa. Grammar Kappa calculus consists of types and expressions, built as follows: 1 is a type. If τ1 and τ2 are types then τ1 × τ2 is a type. Every variable is an expression. If τ is a type then id_τ is an expression. If τ is a type then !_τ is an expression. If τ is a type and e is an expression then lift_τ(e) is an expression. If e1 and e2 are expressions then e2 ∘ e1 is an expression. If x is a variable, τ is a type, and e is an expression, then κx:τ. e is an expression. The subscripts of id, !, and lift are sometimes omitted when they can be unambiguously determined from the context. Juxtaposition is often used as an abbreviation for a combination of lift and composition: e1 e2 := e1 ∘ lift_τ(e2). Typing rules The presentation here uses sequents (Γ ⊢ e : τ1 → τ2) rather than hypothetical judgments in order to ease comparison with the simply typed lambda calculus. This requires the additional Var rule, which does not appear in Hasegawa. In kappa calculus an expression has two types: the type of its source and the type of its target. The notation e : τ1 → τ2 is used to indicate that expression e has source type τ1 and target type τ2. Expressions in kappa calculus are assigned types according to the following rules:
Γ, x:τ ⊢ x : 1 → τ (Var)
Γ ⊢ id_τ : τ → τ (Id)
Γ ⊢ !_τ : τ → 1 (Bang)
if Γ ⊢ e1 : τ1 → τ2 and Γ ⊢ e2 : τ2 → τ3 then Γ ⊢ e2 ∘ e1 : τ1 → τ3 (Comp)
if Γ ⊢ e : 1 → τ1 then Γ ⊢ lift_{τ2}(e) : τ2 → τ1 × τ2 (Lift)
if Γ, x:τ1 ⊢ e : τ2 → τ3 then Γ ⊢ κx:τ1. e : τ1 × τ2 → τ3 (Kappa)
In other words, Var: assuming x:τ lets you conclude that x : 1 → τ; Id: for any type τ, id_τ : τ → τ; Bang: for any type τ, !_τ : τ → 1; Comp: if the target type
https://en.wikipedia.org/wiki/Occipital%20emissary%20vein
The occipital emissary vein is a small emissary vein which passes through the condylar canal.
https://en.wikipedia.org/wiki/Holozoic%20nutrition
Holozoic nutrition (Greek: holo- whole; zoikos- of animals) is a type of heterotrophic nutrition that is characterized by the internalization (ingestion) and internal processing of liquid or solid food particles. Protozoa, such as amoebas, and most free-living animals, such as humans, exhibit this type of nutrition, in which food is taken into the body as a liquid or solid and then further broken down. Most animals exhibit this kind of nutrition. In holozoic nutrition, the energy and organic building blocks are obtained by ingesting and then digesting other organisms or pieces of other organisms, including blood and decaying organic matter. This contrasts with holophytic nutrition, in which energy and organic building blocks are obtained through photosynthesis or chemosynthesis, and with saprozoic nutrition, in which digestive enzymes are released externally and the resulting monomers (small organic molecules) are absorbed directly from the environment. There are several stages of holozoic nutrition, which often occur in separate compartments within an organism (such as the stomach and intestines):
Ingestion: In animals, this is merely taking food in through the mouth. In protozoa, this most commonly occurs through phagocytosis.
Digestion: The physical breakdown of large, complex food particles and the enzymatic breakdown of complex organic compounds into small, simple molecules.
Absorption: The active and passive transport of the chemical products of digestion out of the food-containing compartment and into the body.
Assimilation: The use of the chemical products of digestion for various metabolic processes.
https://en.wikipedia.org/wiki/Communication%20Theory%20of%20Secrecy%20Systems
"Communication Theory of Secrecy Systems" is a paper published in 1949 by Claude Shannon discussing cryptography from the viewpoint of information theory. It is one of the foundational treatments (arguably the foundational treatment) of modern cryptography. It is also a proof that all theoretically unbreakable ciphers must have the same requirements as the one-time pad. Shannon published an earlier version of this research in the formerly classified report A Mathematical Theory of Cryptography, Memorandum MM 45-110-02, Sept. 1, 1945, Bell Laboratories. This report also precedes the publication of his "A Mathematical Theory of Communication", which appeared in 1948. See also Confusion and diffusion Product cipher One-time pad Unicity distance
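Shannon's requirement — a truly random key at least as long as the message, never reused — is exactly the one-time pad. A minimal sketch (the message is illustrative; the pad is XOR over bytes):

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte with a key byte. Perfect secrecy requires
    the key to be truly random, at least as long as the message, and never
    reused -- the conditions Shannon proved necessary for any unbreakable
    cipher."""
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # fresh random key, used once
ciphertext = otp(message, key)
assert otp(ciphertext, key) == message   # XOR is its own inverse
print(ciphertext.hex())
```

Reusing the key breaks the scheme entirely: XOR of two ciphertexts under the same key equals the XOR of the two plaintexts, leaking structure.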
https://en.wikipedia.org/wiki/CLNS1B
Chloride channel, nucleotide-sensitive, 1A, also known as CLNS1A, is a human gene. The protein encoded by this gene is a chloride channel regulator.
https://en.wikipedia.org/wiki/Veterinary%20public%20health
Veterinary public health (VPH) is a component of public health that focuses on the application of veterinary science to protect and improve the physical, mental and social well-being of humans. In several countries, activities related to VPH are organized by the chief veterinary officer. Conventionally, veterinary public health as a topic covers the following areas: Food production and safety It is desirable to consider food production as a chain, with animals reared on the farm (pre-harvest) then going for primary processing (harvest), secondary processing and distribution followed by final preparation (all post-harvest). This "farm to fork" concept can be easily described by considering a beef animal on a farm going to slaughter at the abattoir, then the hamburger production plant, then being distributed to a supermarket. The hamburger is then sold, taken home, stored, cooked and eaten. Veterinary public health concerns all aspects of the food production chain, from controlling epidemic diseases that may impact agriculture, to ensuring slaughter is conducted safely and humanely, to informing the public on safe ways to store and cook hamburgers. Zoonosis control A zoonosis may be defined as any disease and/or infection which is naturally transmissible between animals and man. Zoonoses are a major public health concern. Issues like avian influenza, BSE (mad cow disease) and salmonella in eggs have dominated UK newspaper headlines for the last thirty years. The picture in developed and developing countries may be quite different as far as zoonoses are concerned. In developed countries the consumer has very little contact with live animals, limiting transmission from live animals to the general public. In addition, food safety is tightly regulated. Despite this, foodborne disease is still a big problem in developed countries. In the EU in 2006, a total of 175,561 confirmed cases of campylobacteriosis were reported from 21 member states and reported cases
https://en.wikipedia.org/wiki/Mott%E2%80%93Bethe%20formula
The Mott–Bethe formula is an approximation used to calculate atomic electron scattering form factors, f_e(q), from atomic X-ray scattering form factors, f_x(q). The formula was derived independently by Hans Bethe and Neville Mott, both in 1930, and simply follows from applying the first Born approximation for the scattering of electrons via the Coulomb interaction together with the Poisson equation for the charge density of an atom (including both the nucleus and electron cloud) in the Fourier domain. Following the first Born approximation, f_e(q) = (m_e e²/(2π ε0 ħ² q²)) (Z − f_x(q)) = (2/(a0 q²)) (Z − f_x(q)), where q = 4π sin(θ)/λ (other conventions rescale the prefactor). Here, q is the magnitude of the scattering vector (momentum transfer) in reciprocal space (in units of inverse distance), Z the atomic number of the atom, ħ is the reduced Planck constant, ε0 is the vacuum permittivity, m_e is the electron rest mass, a0 = 4π ε0 ħ²/(m_e e²) is the Bohr radius, and f_x(q) is the dimensionless X-ray scattering form factor for the electron density. The electron scattering factor has units of length, as is typical for the scattering factor, unlike the X-ray form factor, which is usually presented in dimensionless units. To perform a one-to-one comparison between the electron and X-ray form factors in the same units, the X-ray form factor should be multiplied by the square root of the Thomson cross section, where the relevant length is the classical electron radius, to convert it back to a unit of length. The Mott–Bethe formula was originally derived for free atoms, and is rigorously true provided the X-ray scattering form factor is known exactly. However, in solids, the accuracy of the Mott–Bethe formula is best for large values of q, because the distribution of the charge density at small q (i.e. long distances) can deviate from the atomic distribution of electrons due to the chemical bonds between atoms in a solid. For smaller values of q, f_x(q) can be determined from tabulated values, such as those in the International Tables for Crystallography using (non)relativistic Hartree–Fock calculations, or other numerical parameterizations of the calculated charge
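The conversion is a one-liner once a convention is fixed. The sketch below uses q = 4π sin(θ)/λ, so f_e = 2(Z − f_x)/(a0 q²); the carbon-like input values are made-up illustrations, not tabulated form factors:

```python
# Sketch of the Mott-Bethe conversion, using the convention
# q = 4*pi*sin(theta)/lambda, giving f_e = 2 * (Z - f_x) / (a0 * q^2).
# The input values below are illustrative, not tabulated data.
A0 = 0.529177  # Bohr radius in angstroms

def mott_bethe(z, f_x, q):
    """Electron form factor (in angstroms) from the X-ray form factor."""
    return 2.0 * (z - f_x) / (A0 * q * q)

f_e = mott_bethe(z=6, f_x=3.0, q=2.0)  # q in inverse angstroms
print(round(f_e, 3))  # electron form factor in angstroms
```

Note the 1/q² factor: as q → 0 the computed f_e diverges unless Z − f_x(q) → 0 fast enough, which is one reason small-q values are taken from tabulated parameterizations instead.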
https://en.wikipedia.org/wiki/SECIOP
In distributed computing, SECIOP (SECure Inter-ORB Protocol) is a protocol for secure inter-ORB communication.
https://en.wikipedia.org/wiki/Glossary%20of%20mycology
This glossary of mycology is a list of definitions of terms and concepts relevant to mycology, the study of fungi. Terms in common with other fields, if repeated here, generally focus on their mycology-specific meaning. Related terms can be found in glossary of biology and glossary of botany, among others. List of Latin and Greek words commonly used in systematic names and Botanical Latin may also be relevant, although some prefixes and suffixes very common in mycology are repeated here for clarity. See also List of mycologists Outline of fungi Outline of lichens Glossary of lichen terms
https://en.wikipedia.org/wiki/BESS%20%28experiment%29
BESS is a particle physics experiment carried by a balloon. BESS stands for Balloon-borne Experiment with Superconducting Spectrometer. BESS is a series of experiments that started in 1993, and a later incarnation, BESS-Polar, circled the Antarctic from December 13 to December 21, 2004, for a total of 8 days 17 hours and 2 minutes. This joint Japanese and American project is supported by the Laboratory for High Energy Astrophysics (LHEA) at NASA's GSFC and the KEK. Overview The mission of this experiment is to detect antiparticles in the cosmic radiation at high altitudes. It is therefore designed to be carried aloft by balloon. The central detection device is a magnetic spectrometer, that is used to identify all electrically charged particles crossing its main detection aperture. Mission members are working on improving the sensitivity and precision of this system with each new launch. Scientific goals Theories of the beginning of the Universe are based on the currently-known laws of particle physics, where matter is created from energy in such a way that equal amounts of particles and antiparticles are produced. If this is so, then an amount of antimatter equal to the amount of currently visible matter must exist—though there is an equal possibility the bulk of the antimatter may have been annihilated due to the mechanism of CP violation. The aim of BESS therefore is to quantify the amount of antiparticles in the local cosmos and so help to decide between these alternatives. Up to this point, only antiprotons have been detected, which can be produced via collisions of the cosmic radiation with atoms in the thin atmosphere above the balloon. Therefore, two strategies are employed to obtain the value for the flux of antiparticles from outer space: Measure the properties of the antiproton flux precisely and look for deviations from the expected behavior of antiprotons created in the atmosphere. Look for larger antinuclei, for instance antihelium, that cannot be
https://en.wikipedia.org/wiki/Map%20of%20the%20Duke%20of%20Noja
The Map of the Duke of Noja is a topographic map of the city of Naples and its environs, created in 1775. It was the primary topographic and urban planning tool for Naples between the 18th and 19th centuries. The map was an important cartographic and urban planning tool for the city, an embellished art piece, and has been used even recently, for example, to document the genesis and original layout of a group of over one hundred twenty 18th and 19th century villas, partly located in Herculaneum and Torre del Greco, collectively known as the Vesuvian Villas of the Golden Mile. History The commissioning of the cartographic study depicting Naples and its environs dates to April 29, 1750, when the Tribunal of Electors of San Lorenzo entrusted its execution to Giovanni Carafa, Duke of Noja. The work relied technically on the skilled land surveyor Vanti. Originally projected to take two and a half years, in reality the work took much longer. The use of the plane table (tavoletta pretoriana) made the Noja Map the first true map of Naples; its underlying data was based on rigorous survey and topographical accuracy, and it was much closer to reality than were the bird's-eye views of previous centuries. When Carafa died in 1768, the project was not complete and passed to the direction of Giovanni Pignatelli, prince of Monteroduni, who in turn enlisted the aid of the architect Gaetano Brunzuoli as technical superintendent; Brunzuoli was at the time also completing the construction of the Duke of Noja's house. Brunzuoli's work updated the mapping to depict the urban changes that had occurred over the course of the work. The cartography was completed in 1775 and was accompanied by a topographical index created by Nicola Carletti, professor of architecture and mathematics at the University of Naples and architect of the city of Naples. The first hundred copies were made by the royal printer Vittorio Barbacci, while subsequent copies were published by the Roman printer, Antonio Cenci. The
https://en.wikipedia.org/wiki/Sorptivity
In 1957 John Philip introduced the term sorptivity and defined it as a measure of the capacity of the medium to absorb or desorb liquid by capillarity. According to C Hall and W D Hoff, the sorptivity expresses the tendency of a material to absorb and transmit water and other liquids by capillarity. The sorptivity is widely used in characterizing soils and porous construction materials such as brick, stone and concrete. Calculation of the true sorptivity required numerical iterative procedures dependent on soil water content and diffusivity. John R. Philip (1969) showed that sorptivity can be determined from horizontal infiltration, where water flow is mostly controlled by capillary absorption: I = S t^(1/2), where S is sorptivity and I is the cumulative infiltration (i.e. distance) at time t. Its associated SI unit is m⋅s^(−1/2). For vertical infiltration, Philip's solution is adapted using a parameter A1. This results in the following equations, which are valid for short times: cumulative: I = S t^(1/2) + A1 t; rate: i = S / (2 t^(1/2)) + A1; where the sorptivity S is defined (when a sharp wetting front Lf exists) as: S = Lf (θs − θi) / t^(1/2), with θs and θi the saturated and initial volumetric water contents.
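The square-root-of-time relation lends itself to a quick numerical check. The sketch below estimates S by least squares from Philip's horizontal-infiltration relation I = S·t^(1/2); the function name and the infiltration data are illustrative assumptions, not taken from the source.

```python
import math

def estimate_sorptivity(times, cumulative_infiltration):
    """Least-squares fit of S in Philip's relation I = S * sqrt(t):
    minimizing sum((I_i - S*sqrt(t_i))^2) over S gives
    S = sum(I_i * sqrt(t_i)) / sum(t_i)."""
    numerator = sum(I * math.sqrt(t) for t, I in zip(times, cumulative_infiltration))
    return numerator / sum(times)

# Hypothetical horizontal-infiltration measurements: t in seconds, I in mm
t = [60, 240, 540, 960]
I = [1.55, 3.10, 4.65, 6.20]

S = estimate_sorptivity(t, I)   # approx 0.20 mm per sqrt(second)
```

The closed-form estimator follows directly from differentiating the sum of squared residuals with respect to S, which is why no iterative fitting is needed for this one-parameter model.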
https://en.wikipedia.org/wiki/Routing%20in%20delay-tolerant%20networking
Routing in delay-tolerant networking concerns itself with the ability to transport, or route, data from a source to a destination, which is a fundamental ability all communication networks must have. Delay- and disruption-tolerant networks (DTNs) are characterized by their lack of connectivity, resulting in a lack of instantaneous end-to-end paths. In these challenging environments, popular ad hoc routing protocols such as AODV and DSR fail to establish routes. This is because these protocols try to first establish a complete route and then, after the route has been established, forward the actual data. However, when instantaneous end-to-end paths are difficult or impossible to establish, routing protocols must take a "store and forward" approach, where data is incrementally moved and stored throughout the network in the hope that it will eventually reach its destination. A common technique used to maximize the probability of a message being successfully transferred is to replicate many copies of the message in the hope that one will succeed in reaching its destination. Routing considerations There are many characteristics DTN protocols, including routing, must take into consideration. A first consideration is whether information about future contacts is readily available. For example, in interplanetary communications, a planet or moon is often the cause of contact disruption, and large distance is the cause of communication delay. However, because such motion obeys the laws of physics, it is possible to predict when contacts will be available and how long they will last. These types of contacts are known as scheduled or predictable contacts. On the contrary, in disaster recovery networks the future location of communicating entities, such as emergency responders, may not be known. These types of contacts are known as intermittent or opportunistic contacts. A second consideration is whether mobility can be exploited and, if so, which nodes are mobi
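The "replicate many copies" idea can be illustrated with a toy simulation. This is a sketch of epidemic-style flooding under random opportunistic contacts, not an implementation of any specific DTN protocol; the function name, the single-contact-per-step model, and all parameters are illustrative assumptions.

```python
import random

def epidemic_delivery(n_nodes, source, dest, max_steps, seed=0):
    """Toy epidemic routing: every carrier stores the message and
    replicates it to any node it contacts; delivery happens when a
    contact brings the message to the destination node."""
    rng = random.Random(seed)
    carriers = {source}                        # nodes currently holding a copy
    for step in range(1, max_steps + 1):
        a, b = rng.sample(range(n_nodes), 2)   # one random opportunistic contact
        if a in carriers or b in carriers:
            carriers.update((a, b))            # store-and-forward replication
        if dest in carriers:
            return step                        # delivered at this contact
    return None                                # not delivered within max_steps

steps = epidemic_delivery(n_nodes=20, source=0, dest=19, max_steps=500)
```

Because copies only ever accumulate, the carrier set grows monotonically, which is exactly the trade-off the article describes: replication maximizes delivery probability at the cost of storage and transmissions.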
https://en.wikipedia.org/wiki/Johnson%27s%20figure%20of%20merit
Johnson's figure of merit is a measure of suitability of a semiconductor material for high-frequency power transistor applications and requirements. More specifically, it is the product of the charge carrier saturation velocity in the material and the electric breakdown field under the same conditions, first proposed by Edward O. Johnson of RCA in 1965. Note that this figure of merit (FoM) is applicable to both field-effect transistors (FETs) and, with proper interpretation of the parameters, also to bipolar junction transistors (BJTs). Example materials Reported JFM values vary widely between sources. External links Gallium Nitride as an Electromechanical Material. R-Z. IEEE 2014 Table IV (p 5) lists JFM (relative to Si): Si = 1, GaAs = 2.7, SiC = 20, InP = 0.33, GaN = 27.5; it also shows Vsat and Ebreakdown. "Why diamond?" gives very different figures (but no references): Si = 1, GaAs = 11, GaN = 790, SiC = 410, diamond = 5800.
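As a rough illustration, relative figures of merit can be computed by taking the plain product v_sat·E_br and normalizing to silicon. The material parameters below are representative literature values only; as noted above, they differ between sources, so the resulting ratios are indicative, not authoritative.

```python
# Representative literature values (they differ between sources):
# saturation velocity v_sat in cm/s, breakdown field E_br in V/cm.
MATERIALS = {
    "Si":     (1.0e7, 3.0e5),
    "GaAs":   (1.2e7, 4.0e5),
    "4H-SiC": (2.0e7, 3.0e6),
    "GaN":    (2.5e7, 3.3e6),
}

def relative_jfm(material, reference="Si"):
    """Relative Johnson figure of merit, taken here as the plain
    product v_sat * E_br normalized to the reference material."""
    v, e = MATERIALS[material]
    v0, e0 = MATERIALS[reference]
    return (v * e) / (v0 * e0)

# With these inputs GaN comes out near 27.5 and SiC near 20, close to the
# IEEE table quoted above; GaAs is more sensitive to the assumed values.
for name in MATERIALS:
    print(f"{name}: {relative_jfm(name):.1f}")
```

Because the FoM is a product, any uncertainty in either parameter propagates multiplicatively, which helps explain why published JFM tables disagree so widely.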
https://en.wikipedia.org/wiki/Antonella%20Cupillari
Antonella Cupillari (born 1955) is an Italian-American mathematician interested in the history of mathematics and mathematics education. She is an associate professor of mathematics at Penn State Erie, The Behrend College. Education and career Cupillari earned a laurea at the University of L'Aquila in 1978, and completed her Ph.D. at the University at Albany, SUNY in 1984. Her dissertation, A Small Boundary for on a Strictly Pseudoconvex Domain, concerned functional analysis, and was supervised by R. Michael (Rolf) Range; she also published it in the Proceedings of the American Mathematical Society. Cupillari joined the faculty at Penn State Erie in 1984 and was promoted to associate professor in 1992. Books Cupillari is the author of books on mathematics and the history of mathematics including: The Nuts and Bolts of Proofs (Wadsworth, 1989; 2nd & 3rd eds., Harcourt/Academic Press, 2000 & 2005; 4th ed., Academic Press, 2011) Intermediate Algebra in Action (PWS Publishing, 1995) A Biography of Maria Gaetana Agnesi, an Eighteenth-Century Woman Mathematician: With Translations of Some of Her Work from Italian into English (Edwin Mellen Press, 2007) Recognition Cupillari was the 2008 winner of the Award for Distinguished College or University Teaching of Mathematics of the Allegheny Mountain Section of the Mathematical Association of America.
https://en.wikipedia.org/wiki/RadiumOne
RadiumOne is a marketing company that provides online display, mobile, video, and social advertising services through programmatic marketing campaigns. It was first launched in 2009 by Gurbaksh Chahal. The company buys advertising space from websites and mobile applications and resells it in targeted packages to advertisers and agencies. It also creates software that automates the process of media buying for digital marketers. Headquartered in San Francisco, the firm has offices across North America, Europe and Asia. History The company was launched in 2009 by Gurbaksh Chahal as gWallet, a loyalty and rewards program. After CEO Gurbaksh Chahal pleaded guilty to domestic violence battery on April 23, 2014, there were calls for him to step down as CEO. He was fired and immediately replaced by COO Bill Lonergan on April 27, 2014. RadiumOne utilizes data taken from social networking sites such as Twitter and Facebook and uses it to tailor online, video, social and mobile consumer advertising using real-time bidding. The startup had initially raised $33.5 million from venture capital funds. In June 2017, RhythmOne acquired the assets of data-driven marketing platform RadiumOne for US$22m.
https://en.wikipedia.org/wiki/Pantry
A pantry is a room or cupboard where beverages, food, (sometimes) dishes, household cleaning products, linens or provisions are stored within a home or office. Food and beverage pantries serve in an ancillary capacity to the kitchen. Etymology The word "pantry" derives from the same source as the Old French term paneterie; that is from pain, the French form of the Latin panis, "bread". History in Europe and United States Late Middle Ages In a late medieval hall, there were separate rooms for the various service functions and food storage. The pantry was where bread was kept and food preparation was done. The head of the office responsible for this room was referred to as the pantler. There were similar rooms for storage of bacon and other meats (larder), alcoholic beverages (buttery, known for the "butts", or barrels, stored there), and cooking (kitchen). Colonial Era In the United States, pantries evolved from early Colonial American "butteries", built in a cold north corner of a colonial home (more commonly referred to and spelled as "butt'ry"), into a variety of pantries in self-sufficient farmsteads. Butler's pantries, or China pantries, were built between the dining room and kitchen of a middle-class English or American home, especially in the latter part of the 19th into the early 20th centuries. Great estates, such as the Biltmore Estate in Asheville, North Carolina or Stan Hywet Hall in Akron, Ohio, had many pantries and other domestic "offices", echoing their British "great house" counterparts. Victorian Era By the Victorian era, large houses and estates in Britain maintained the use of separate rooms, each one dedicated to distinct stages of food preparation and cleanup. The kitchen was for cooking, while food was stored in a storeroom, pantry or cellar. Meat preparation was done in a larder as game would come in undressed, fish unfilleted, and meat in half or quarter carcasses. Vegetable cleaning and preparation would be done in the scullery. Dishwashing was done
https://en.wikipedia.org/wiki/Entropy%20%28classical%20thermodynamics%29
In classical thermodynamics, entropy (usually denoted S) is a property of a thermodynamic system that expresses the direction or outcome of spontaneous changes in the system. The term was introduced by Rudolf Clausius in the mid-19th century to explain the relationship of the internal energy that is available or unavailable for transformations in the form of heat and work. Entropy predicts that certain processes are irreversible or impossible, despite not violating the conservation of energy. The definition of entropy is central to the establishment of the second law of thermodynamics, which states that the entropy of isolated systems cannot decrease with time, as they always tend to arrive at a state of thermodynamic equilibrium, where the entropy is highest. Entropy is therefore also considered to be a measure of disorder in the system. Ludwig Boltzmann explained the entropy as a measure of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which correspond to the macroscopic state (macrostate) of the system. He showed that the thermodynamic entropy is S = k ln W, where W is the number of such microstates and the factor k has since been known as the Boltzmann constant. Concept Differences in pressure, density, and temperature of a thermodynamic system tend to equalize over time. For example, in a room containing a glass of melting ice, the difference in temperature between the warm room and the cold glass of ice and water is equalized by energy flowing as heat from the room to the cooler ice and water mixture. Over time, the temperature of the glass and its contents and the temperature of the room achieve a balance. The entropy of the room has decreased. However, the entropy of the glass of ice and water has increased more than the entropy of the room has decreased. In an isolated system, such as the room and ice water taken together, the dispersal of energy from warmer to cooler regions always results in a net increase in entropy. Thus, when the system of the room
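The ice-water example can be made quantitative with the Clausius relation ΔS = Q/T for heat exchanged with a reservoir at constant temperature. The numbers below are illustrative, treating both the room and the ice water as constant-temperature reservoirs.

```python
# 1000 J of heat flows from a room at 298 K into a glass of ice water
# held at 273 K; both are treated as constant-temperature reservoirs.
Q = 1000.0                      # heat transferred, in joules
T_room, T_ice = 298.0, 273.0    # temperatures in kelvin

dS_room = -Q / T_room           # the room loses entropy
dS_glass = Q / T_ice            # the colder ice water gains more entropy
dS_net = dS_room + dS_glass     # net change is positive, about +0.31 J/K

print(f"room {dS_room:+.3f} J/K, glass {dS_glass:+.3f} J/K, net {dS_net:+.3f} J/K")
```

The net change is positive precisely because the same heat Q is divided by a smaller temperature on the receiving side, which is the second law in miniature: heat flowing from warm to cold always increases the total entropy.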
https://en.wikipedia.org/wiki/Reticular%20membrane
The reticular membrane (RM, also called reticular lamina or apical cuticular plate) is a thin, stiff lamina that extends from the outer hair cells to the Hensen's cells. The RM is composed of "minute fiddle-shaped cuticular structures" called the phalangeal extensions of the outer hair cells, interspaced with extensions coming from the outer phalangeal cells. The RM separates endolymph in the cochlear duct from underlying corticolymph and perilymph of the scala tympani. The hair processes of the outer hair cells emerge through and above the RM, thus immobilizing the apical pole of the outer hair cells. At the opposite, basilar pole, the outer hair cells are firmly held by the phalangeal cells. The inner phalangeal cells that surround the inner hair cells reach the surface of the organ of Corti but, even in their innermost row, are not included in the reticular membrane. Thus, the RM extends only up to the outer edge of the tectorial membrane and does not cover the entire surface of the organ of Corti. External links Diagram at une.edu Animation at bioanim.com
https://en.wikipedia.org/wiki/Segal%20category
In mathematics, a Segal category is a model of an infinity category introduced by , based on work of Graeme Segal in 1974.
https://en.wikipedia.org/wiki/Jan%20Maciejowski
Jan M. Maciejowski FIEEE is a British electrical engineer. He is Professor Emeritus of Control Engineering at the University of Cambridge. He is notable for his contributions to system identification and control. Career Maciejowski studied at the University of Sussex (BSc 1971) and the University of Cambridge (PhD 1978). As an academic, he joined the University of Cambridge where he became Professor of Control Engineering and a fellow of Pembroke College. From 2009 to 2014, Maciejowski was head of the Information Engineering Division within the University of Cambridge's Department of Engineering. From 2008 to 2018, he was President of Pembroke College. Maciejowski's research has dealt with various aspects of control engineering, notably fault-tolerant control, autonomous systems, model predictive control and system identification. When awarding him a fellowship, the IEEE cited Maciejowski's "contributions to system identification and control". He has played an active role in the activities of several learned societies, and was president of the European Control Association from 2003 to 2005. He retired in 2018. Awards Fellowship of the IEEE, awarded in 2011. See also Control reconfiguration Autonomous robot Model predictive control System identification
https://en.wikipedia.org/wiki/Historical%20colours%2C%20standards%20and%20guidons
The following is a list of historical military colours, standards and guidons in different countries that do not exist today. France Middle Ages During the Middle Ages, units did not have specific colours attached to them; rather, they often wore the heraldry of their lord. The armies of France often used the fleur-de-lis, a symbol of the Capetian dynasty. The King of France also had an official battle standard, the Oriflamme: a special flag, red with gold, bearing the motto "Montjoie Saint-Denis". When the Oriflamme was taken into battle, it signified that no quarter would be given to the enemy. English soldiers during the same time period sometimes wore Saint George's Cross as a symbol of identification. Ancien Régime (15th-18th centuries) The French colours of the Ancien Régime shared the same basic design: a white cross, the Cross of France (usually an upright cross, though sometimes a St Andrew's cross, as on the flag of the Royal Deux-Ponts Régiment). The rest of the standard depended on the regiment. Often, the Cross of France divided the flag into four equal quarters. The quarters could all have the same colour (especially on the flags of the Marine troops). Sometimes there were two colours: the top-left and bottom-right quarters in one colour, the top-right and bottom-left in another. Most of the time, fleurs-de-lis were placed on the Cross of France, along with an inscription or a motto. Regiments often took the name of their province: Picardie (the oldest regiment in Europe), Normandie, Piémont, Provence... but some were known by special names, like the "Régiment de la Reine" (Queen's regiment), which had a dark green and black quartered flag with the Cross of France. Each regiment had its own flag. The colonel, at the head of the regiment, had a special flag: it also bore the white Cross of France, but its four quarters were white as well (white being the colour of the French monarchy). The current flag of Quebec has exactly the design of the French colours of this period. The s
https://en.wikipedia.org/wiki/Desensitization%20%28medicine%29
In medicine, desensitization is a method to reduce or eliminate an organism's negative reaction to a substance or stimulus. In pharmacology, drug desensitization refers to two related concepts. First, desensitization may be equivalent to drug tolerance and refers to subjects' reactions (positive or negative) to a drug reducing following its repeated use. This is a macroscopic, organism-level effect and differs from the second meaning of desensitization, which refers to a biochemical effect where individual receptors become less responsive after repeated application of an agonist. This may be mediated by phosphorylation, for instance by beta adrenoceptor kinase at the beta adrenoceptor. Application to allergies For example, if a person with diabetes mellitus has a bad allergic reaction to taking a full dose of beef insulin, the person is given a very small amount of the insulin at first, so small that the person has no adverse reaction or very limited symptoms as a result. Over a period of time, larger doses are given until the person is taking the full dose. This is one way to help the body get used to the full dose, and to avoid having the allergic reaction to beef-origin insulin. A temporary desensitization method involves the administration of small doses of an allergen that produce an IgE-mediated response, in a setting where the individual can be resuscitated in the event of anaphylaxis; this approach, through uncharacterized mechanisms, eventually overrides the hypersensitive IgE response. Desensitization approaches for food allergies are generally at the research stage. They include: oral immunotherapy, which involves building up tolerance by eating a small amount of (usually baked) food; sublingual immunotherapy, which involves placing a small drop of milk or egg white under the tongue; epicutaneous immunotherapy, which delivers the allergenic food through a patch applied to the skin; monoclonal anti-IgE antibodies, which non-specifically reduce the body's capacity to produce
https://en.wikipedia.org/wiki/Glue%20code
In computer programming, glue code is executable code (often source code) that serves solely to "adapt" different parts of code that would otherwise be incompatible. Glue code does not contribute any functionality towards meeting program requirements. Instead, it often appears in code that lets existing libraries or programs interoperate, as in language bindings or foreign function interfaces such as the Java Native Interface, when mapping objects to a database using object-relational mapping, or when integrating two or more commercial off-the-shelf programs. Glue code may be written in the same language as the code it is gluing together, or in a separate glue language. Glue code is particularly useful in rapid prototyping environments, where several components can be quickly combined within a single language or framework. Consequences Because each component is independent (i.e. it is unaware of its relations and is only connected to another component through glue code), the behavior of a component and its interactions can change during the execution of the script. In addition, a different version of one of the components may behave differently, breaking the glue code. High-level programming languages can suffer from performance penalties because glue code must run through the language interpreter, even when connecting high-performance subsystems. If performance is crucial, directly connecting the components' binary interfaces is often preferred to routing calls through interpreted glue code. In object-oriented scripting languages, glue code often eliminates the need for class hierarchies and large numbers of classes. See also Adapter pattern Scripting language Shell script SWIG Lua (programming language) Glue logic WinGlue Wrapper function Wrapper library Method stub
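A minimal sketch of glue code in Python, in the style of the adapter pattern: a hypothetical legacy logger with a positional-argument interface is wrapped so that code written against a newer keyword-style interface can use it unchanged. All class and method names here are invented for illustration; nothing adapts a real library.

```python
class LegacyLogger:
    """Existing component we cannot change: expects (level, text)."""
    def write(self, level, text):
        print(f"[{level}] {text}")

class LoggerAdapter:
    """The glue code: translates keyword-style log(message=..., severity=...)
    calls into the legacy positional write(level, text) calls."""
    def __init__(self, legacy):
        self._legacy = legacy

    def log(self, message, severity="INFO"):
        self._legacy.write(severity, message)

logger = LoggerAdapter(LegacyLogger())
logger.log("disk almost full", severity="WARN")   # prints "[WARN] disk almost full"
```

Note that the adapter adds no functionality of its own, which is the defining property of glue code: it exists only so two otherwise incompatible interfaces can meet.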
https://en.wikipedia.org/wiki/Energy%20homeostasis
In biology, energy homeostasis, or the homeostatic control of energy balance, is a biological process that involves the coordinated homeostatic regulation of food intake (energy inflow) and energy expenditure (energy outflow). The human brain, particularly the hypothalamus, plays a central role in regulating energy homeostasis and generating the sense of hunger by integrating a number of biochemical signals that transmit information about energy balance. Fifty percent of the energy from glucose metabolism is immediately converted to heat. Energy homeostasis is an important aspect of bioenergetics. Definition In the US, biological energy is expressed using the energy unit Calorie with a capital C (i.e. a kilocalorie), which equals the energy needed to increase the temperature of 1 kilogram of water by 1 °C (about 4.18 kJ). Energy balance, through biosynthetic reactions, can be measured with the following equation: Energy intake (from food and fluids) = Energy expended (through work and heat generated) + Change in stored energy (body fat and glycogen storage) The first law of thermodynamics states that energy can be neither created nor destroyed. But energy can be converted from one form to another. So, when a calorie of food energy is consumed, one of three particular effects occurs within the body: a portion of that calorie may be stored as body fat, triglycerides, or glycogen, transferred to cells and converted to chemical energy in the form of adenosine triphosphate (ATP – a coenzyme) or related compounds, or dissipated as heat. Energy Intake Energy intake is measured by the amount of calories consumed from food and fluids. Energy intake is modulated by hunger, which is primarily regulated by the hypothalamus, and choice, which is determined by the sets of brain structures that are responsible for stimulus control (i.e., operant conditioning and classical conditioning) and cognitive control of eating behavior. Hunger is regulated in part by the act
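The balance equation is a simple identity and can be rearranged to give the change in stored energy. A minimal sketch, with illustrative numbers (not physiological advice):

```python
def stored_energy_change(intake_kcal, expended_kcal):
    """Rearranged energy balance: intake = expenditure + change in storage,
    so the change in stored energy is intake minus expenditure."""
    return intake_kcal - expended_kcal

surplus = stored_energy_change(2500, 2300)   # +200 kcal goes to storage
deficit = stored_energy_change(1800, 2100)   # -300 kcal drawn from storage
```

A positive result corresponds to net storage (e.g. as fat or glycogen), a negative one to drawing stored energy down, in line with the first-law accounting described above.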
https://en.wikipedia.org/wiki/Nuclear%20orientation
Nuclear orientation, in nuclear physics, is the directional ordering of an assembly of nuclear spins with respect to some axis in space. It is one of the nuclear spectroscopy methods. A nuclear level with spin in a magnetic field will divide into magnetic sub-levels with an energy spacing. The populations of these levels are determined by the Boltzmann distribution; at ordinary temperatures the sub-level spacing is tiny compared with the thermal energy, so the populations are essentially equal. To obtain unequal populations, the exponent in the Boltzmann factor must not be negligible, which requires cooling to a temperature of around 10 millikelvin. Typically, this is achieved by implanting the nuclei of interest into ferromagnetic hosts. In the mid-1940s, Yevgeny Zavoisky developed electron paramagnetic resonance, eventually leading to the concept of nuclear orientation. In the early 1950s, Neville Robinson, Jim Daniels, and Michael Grace produced an example of nuclear orientation for the first time at the Clarendon Laboratory, University of Oxford. There is now a Nuclear Orientation Group at Oxford. Bibliography K. S. Krane, Nuclear orientation and nuclear structure. Hyperfine Interactions, Volume 43, Numbers 1–4, pages 3–14, December 1988. B. Bleaney, Cross-relaxation and nuclear orientation in ytterbium vanadate. Proceedings: Mathematical, Physical and Engineering Sciences, Volume 455, Number 1988, pages 2835–2839, 8 August 1999. Published by The Royal Society. B. Bleaney, Dynamic nuclear polarization and nuclear orientation in terbium vanadate. Applied Magnetic Resonance, Volume 21, Number 1, pages 35–38. See also Nuclear magnetic resonance Mössbauer spectroscopy Perturbed angular correlation
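The need for millikelvin temperatures can be seen by evaluating the Boltzmann populations directly. The sketch below assumes a simple Zeeman energy E(m) = −g·μN·B·m for a spin-I nucleus; the 100 T field stands in for a typical hyperfine field in a ferromagnetic host, and the g-factor and field value are illustrative, not from the source.

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
mu_N = 5.0507837e-27   # nuclear magneton, J/T

def sublevel_populations(I, B, T, g=1.0):
    """Boltzmann populations of the 2I+1 magnetic sub-levels of a
    spin-I nucleus, assuming Zeeman energies E(m) = -g * mu_N * B * m."""
    ms = [-I + i for i in range(int(2 * I) + 1)]
    weights = [math.exp(g * mu_N * B * m / (k_B * T)) for m in ms]
    Z = sum(weights)                      # partition function
    return {m: w / Z for m, w in zip(ms, weights)}

# Spin-1/2 nucleus in an assumed 100 T hyperfine field:
hot = sublevel_populations(0.5, 100.0, 1.0)     # at 1 K: nearly equal
cold = sublevel_populations(0.5, 100.0, 0.010)  # at 10 mK: strongly polarized
```

Even at 1 K the splitting is a small fraction of kT and the two populations differ by under a percent; at 10 mK the exponent is of order unity and the orientation becomes substantial, which is why such experiments rely on dilution-refrigerator temperatures.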
https://en.wikipedia.org/wiki/Flip%20%28mathematics%29
In algebraic geometry, flips and flops are codimension-2 surgery operations arising in the minimal model program, given by blowing up along a relative canonical ring. In dimension 3 flips are used to construct minimal models, and any two birationally equivalent minimal models are connected by a sequence of flops. It is conjectured that the same is true in higher dimensions. The minimal model program The minimal model program can be summarised very briefly as follows: given a variety , we construct a sequence of contractions , each of which contracts some curves on which the canonical divisor is negative. Eventually, should become nef (at least in the case of nonnegative Kodaira dimension), which is the desired result. The major technical problem is that, at some stage, the variety may become 'too singular', in the sense that the canonical divisor is no longer a Cartier divisor, so the intersection number with a curve is not even defined. The (conjectural) solution to this problem is the flip. Given a problematic as above, the flip of is a birational map (in fact an isomorphism in codimension 1) to a variety whose singularities are 'better' than those of . So we can put , and continue the process. Two major problems concerning flips are to show that they exist and to show that one cannot have an infinite sequence of flips. If both of these problems can be solved, then the minimal model program can be carried out. The existence of flips for 3-folds was proved by . The existence of log flips, a more general kind of flip, in dimensions three and four was proved by , whose work was fundamental to the solution of the existence of log flips and other problems in higher dimension. The existence of log flips in higher dimensions has been settled by . On the other hand, the problem of termination—proving that there can be no infinite sequence of flips—is still open in dimensions greater than 3. Definition If is a morphism, and K is the canonical bundle o
https://en.wikipedia.org/wiki/Cytokine
Cytokines are a broad and loose category of small proteins (~5–25 kDa) important in cell signaling. Due to their size, cytokines cannot cross the lipid bilayer of cells to enter the cytoplasm and therefore typically exert their functions by interacting with specific cytokine receptors on the target cell surface. Cytokines have been shown to be involved in autocrine, paracrine and endocrine signaling as immunomodulating agents. Cytokines include chemokines, interferons, interleukins, lymphokines, and tumour necrosis factors, but generally not hormones or growth factors (despite some overlap in the terminology). Cytokines are produced by a broad range of cells, including immune cells like macrophages, B lymphocytes, T lymphocytes and mast cells, as well as endothelial cells, fibroblasts, and various stromal cells; a given cytokine may be produced by more than one type of cell. They act through cell surface receptors and are especially important in the immune system; cytokines modulate the balance between humoral and cell-based immune responses, and they regulate the maturation, growth, and responsiveness of particular cell populations. Some cytokines enhance or inhibit the action of other cytokines in complex ways. They are different from hormones, which are also important cell signaling molecules. Hormones circulate in higher concentrations, and tend to be made by specific kinds of cells. Cytokines are important in health and disease, specifically in host immune responses to infection, inflammation, trauma, sepsis, cancer, and reproduction. The word comes from the ancient Greek language: cyto, from Greek κύτος, kytos, 'cavity, cell' + kines, from Greek κίνησις, kinēsis, 'movement'. Discovery Interferon-alpha, an interferon type I, was identified in 1957 as a protein that interfered with viral replication. The activity of interferon-gamma (the sole member of the interferon type II class) was described in 1965; this was the first identified lymphocyte-derived med
https://en.wikipedia.org/wiki/Gravitino
In supergravity theories combining general relativity and supersymmetry, the gravitino () is the gauge fermion supersymmetric partner of the hypothesized graviton. It has been suggested as a candidate for dark matter. If it exists, it is a fermion of spin and therefore obeys the Rarita–Schwinger equation. The gravitino field is conventionally written as ψμα with a four-vector index and a spinor index. For one would get negative norm modes, as with every massless particle of spin 1 or higher. These modes are unphysical, and for consistency there must be a gauge symmetry which cancels these modes: , where εα(x) is a spinor function of spacetime. This gauge symmetry is a local supersymmetry transformation, and the resulting theory is supergravity. Thus the gravitino is the fermion mediating supergravity interactions, just as the photon mediates electromagnetism and the graviton presumably mediates gravitation. Whenever supersymmetry is broken in supergravity theories, the gravitino acquires a mass which is determined by the scale at which supersymmetry is broken. This varies greatly between different models of supersymmetry breaking, but if supersymmetry is to solve the hierarchy problem of the Standard Model, the gravitino cannot be more massive than about 1 TeV/c2. History Murray Gell-Mann and Peter van Nieuwenhuizen intended the spin-3/2 particle associated with supergravity to be called the 'hemitrion', meaning 'half-3'; however, the editors of Physical Review were not keen on the name and instead suggested 'massless Rarita–Schwinger particle' for their 1977 publication. The current name of gravitino was instead suggested by Sidney Coleman and Heinz Pagels, although this term was originally coined in 1954 by Felix Pirani to describe a class of negative energy excitations with zero rest mass. Gravitino cosmological problem If the gravitino indeed has a mass of the order of TeV, then it creates a problem in the standard model of cosmology, at least naïvely.
https://en.wikipedia.org/wiki/Penny%20graph
In geometric graph theory, a penny graph is a contact graph of unit circles. It is formed from a collection of unit circles that do not cross each other, by creating a vertex for each circle and an edge for every pair of tangent circles. The circles can be represented physically by pennies, arranged without overlapping on a flat surface, with a vertex for each penny and an edge for each two pennies that touch. Penny graphs have also been called unit coin graphs, because they are the coin graphs formed from unit circles. If each vertex is represented by a point at the center of its circle, then two vertices will be adjacent if and only if their distance is the minimum distance among all pairs of vertices. Therefore, penny graphs have also been called minimum-distance graphs, smallest-distance graphs, or closest-pairs graphs. Similarly, in a mutual nearest neighbor graph that links pairs of points in the plane that are each other's nearest neighbors, each connected component is a penny graph, although edges in different components may have different lengths. Every penny graph is a unit disk graph and a matchstick graph. Like planar graphs more generally, they obey the four color theorem, but this theorem is easier to prove for penny graphs. Testing whether a graph is a penny graph, or finding its maximum independent set, is NP-hard; however, both upper and lower bounds are known for the size of the maximum independent set, higher than the bounds that are possible for arbitrary planar graphs. Properties Number of edges Every vertex in a penny graph has at most six neighboring vertices; here the number six is the kissing number for circles in the plane. However, the pennies on the boundary of the convex hull have fewer neighbors. Counting more precisely this reduction in neighbors for boundary pennies leads to a precise bound on the number of edges in any penny graph: a penny graph with n vertices has at most 3n − √(12n − 3) edges (rounded down to the nearest integer). Some penny graphs, formed by arranging the pennies
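The edge bound can be evaluated directly. The sketch below assumes the bound 3n − √(12n − 3), rounded down (Harborth's bound on touching pairs of unit circles); for seven pennies in a hexagonal patch (one centre penny, six around it) the bound of 12 edges is attained.

```python
import math

def max_penny_graph_edges(n):
    """Upper bound (Harborth's bound, assumed here) on the number of
    edges of a penny graph with n vertices: floor(3n - sqrt(12n - 3))."""
    return math.floor(3 * n - math.sqrt(12 * n - 3))

# Three mutually touching pennies form a triangle: 3 edges, meeting the bound.
# Seven pennies in a hexagonal patch (one centre, six around) give 12 edges.
print(max_penny_graph_edges(3), max_penny_graph_edges(7))   # 3 12
```

The bound is tight exactly for penny arrangements cut from the triangular (hexagonal) packing, which is where every interior penny reaches the kissing number of six.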
https://en.wikipedia.org/wiki/Medical%20cannabis
Medical cannabis, or medical marijuana (MMJ), is cannabis and cannabinoids that are prescribed by physicians for their patients. The use of cannabis as medicine has not been rigorously tested due to production and governmental restrictions, resulting in limited clinical research to define the safety and efficacy of using cannabis to treat diseases. Preliminary evidence has indicated that cannabis might reduce nausea and vomiting during chemotherapy and reduce chronic pain and muscle spasms. Regarding non-inhaled cannabis or cannabinoids, a 2021 review found that it provided little relief against chronic pain and sleep disturbance, and caused several transient adverse effects, such as cognitive impairment, nausea, and drowsiness. Short-term use increases the risk of minor and major adverse effects. Common side effects include dizziness, feeling tired, vomiting, and hallucinations. Long-term effects of cannabis are not clear. Concerns include memory and cognition problems, risk of addiction, schizophrenia in young people, and the risk of children taking it by accident. Many cultures have used cannabis for therapeutic purposes for thousands of years. Some American medical organizations have requested removal of cannabis from the list of Schedule I controlled substances maintained by the United States federal government, followed by regulatory and scientific review. Others oppose its legalization, such as the American Academy of Pediatrics. Medical cannabis can be administered through various methods, including capsules, lozenges, tinctures, dermal patches, oral or dermal sprays, cannabis edibles, and vaporizing or smoking dried buds. Synthetic cannabinoids are available for prescription use in some countries, such as dronabinol and nabilone. Countries that allow the medical use of whole-plant cannabis include Argentina, Australia, Canada, Chile, Colombia, Germany, Greece, Israel, Italy, the Netherlands, Peru, Poland, Portugal, Spain, and Uruguay. In the United Stat
https://en.wikipedia.org/wiki/Gesine%20Reinert
Gesine Reinert is a German statistician who is University Professor in Statistics at the University of Oxford. She is a Fellow of Keble College, Oxford, a Fellow of the Alan Turing Institute, and a Fellow of the Institute of Mathematical Statistics. Her research concerns the probability theory and statistics of biological sequences and biological networks. Reinert has also been associated with the M. Lothaire pseudonymous mathematical collaboration on combinatorics on words. Education Reinert earned a diploma in mathematics from the University of Göttingen in 1989. She went on to graduate study in applied mathematics at the University of Zurich, completing her Ph.D. in 1994. Her dissertation, in probability theory, was A Weak Law of Large Numbers for Empirical Measures via Stein's Method, and Applications, and was supervised by Andrew Barbour. Career Reinert worked as a lecturer at the University of Southern California from 1994 to 1996 and the University of California, Los Angeles from 1996 to 1998, and as a senior research fellow at King's College, Cambridge from 1998 to 2000. She joined the Oxford faculty in 2000, and was given a professorship there in 2004.
https://en.wikipedia.org/wiki/Mycoplasma%20haemomuris
Mycoplasma haemomuris, formerly known as Haemobartonella muris and Bartonella muris, is a Gram-negative bacillus. It is known to cause anemia in rats and mice.
https://en.wikipedia.org/wiki/Diagnostic%20microbiology
Diagnostic microbiology is the study of microbial identification. Since the discovery of the germ theory of disease, scientists have been finding ways to harvest specific organisms. Using methods such as differential media or genome sequencing, physicians and scientists can observe novel functions in organisms for more effective and accurate diagnosis of organisms. Methods used in diagnostic microbiology are often used to take advantage of a particular difference in organisms and attain information about what species it can be identified as, which is often through a reference of previous studies. New studies provide information that others can reference so that scientists can attain a basic understanding of the organism they are examining. Aerobic vs anaerobic Anaerobic organisms require an oxygen-free environment. When culturing anaerobic microbes, broths are often flushed with nitrogen gas to extinguish oxygen present, and growth can also occur on media in a chamber without oxygen present. Sodium resazurin can be added to indicate redox potential. Cultures are to be incubated in an oxygen-free environment for 48 hours at 35 °C before growth is examined. Anaerobic bacteria collection can come from a variety of sources in patient samples, including blood, bile, bone marrow, cerebrospinal fluid, direct lung aspirate, tissue biopsies from a normally sterile site, fluid from a normally sterile site (like a joint), dental, abscess, abdominal or pelvic abscess, knife, gunshot, or surgical wound, or severe burn. Incubation length Incubation times vary based upon the microbe that requires culturing. Traditional culturing techniques, for example, require less than 24 hours culture time for Escherichia coli but 6–8 weeks for successful culturing of Mycobacterium tuberculosis before definitive results are expressed. A benefit of non-culture tests is that physicians and microbiologists are not handicapped by waiting periods. Incubation follows a growth curve variable
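The growth-curve behavior mentioned above can be illustrated with a simple logistic model. This is only a sketch: the rate constants below are hypothetical, and real batch cultures also show lag and death phases that the logistic model ignores.

```python
import math

def logistic_growth(n0, k, r, t):
    """Logistic model of a closed batch culture: population n(t) grows
    roughly exponentially at rate r, then plateaus at carrying capacity k."""
    return k / (1 + ((k - n0) / n0) * math.exp(-r * t))

# Hypothetical fast grower (rapid doubling, culturable within a day)
# versus a slow grower: same model, very different time scales.
fast = [logistic_growth(1e3, 1e9, r=2.0, t=h) for h in (0, 12, 24)]
slow = [logistic_growth(1e3, 1e9, r=0.01, t=h) for h in (0, 12, 24)]
```

After 24 hours the fast grower has saturated near its carrying capacity while the slow grower has barely begun to multiply, mirroring why E. coli cultures are read within a day while M. tuberculosis takes weeks.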
https://en.wikipedia.org/wiki/Gel%20electrophoresis%20of%20nucleic%20acids
Nucleic acid electrophoresis is an analytical technique used to separate DNA or RNA fragments by size and, in some cases, conformation. Nucleic acid molecules which are to be analyzed are set upon a viscous medium, the gel, where an electric field induces the nucleic acids (which are negatively charged due to their sugar-phosphate backbone) to migrate toward the anode (which is positively charged because this is an electrolytic rather than galvanic cell). The separation of these fragments is accomplished by exploiting the mobilities with which different sized molecules are able to pass through the gel. Longer molecules migrate more slowly because they experience more resistance within the gel. Because the size of the molecule affects its mobility, smaller fragments end up nearer to the anode than longer ones in a given period. After some time, the voltage is removed and the pattern of separated bands is analyzed. For larger separations between similar sized fragments, either the voltage or the run time can be increased. Extended runs at low voltage generally yield the best resolution. Voltage is, however, not the sole factor in determining the electrophoresis of nucleic acids. The nucleic acid to be separated can be prepared in several ways before separation by electrophoresis. In the case of large DNA molecules, the DNA is frequently cut into smaller fragments using a DNA restriction endonuclease (or restriction enzyme). In other instances, such as PCR-amplified samples, enzymes present in the sample that might affect the separation of the molecules are removed through various means before analysis. Once the nucleic acid is properly prepared, the samples of the nucleic acid solution are placed in the wells of the gel and a voltage is applied across the gel for a specified amount of time. The DNA fragments of different lengths are visualized using a fluorescent dye specific for DNA, such as ethidium bromide. The gel shows bands corresponding to different nucleic acid molecules populat
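Over a useful range, migration distance is roughly linear in the logarithm of fragment length, so fragment sizes are commonly estimated by calibrating against a ladder of known sizes run in an adjacent lane. The sketch below uses entirely hypothetical distances and sizes to show the calculation:

```python
import math

def fit_log_linear(ladder):
    """Least-squares fit of log10(size_bp) = a + b * distance_mm
    from (distance, size) calibration points of a known ladder."""
    xs = [d for d, _ in ladder]
    ys = [math.log10(s) for _, s in ladder]
    n = len(ladder)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical ladder: band distances (mm) and fragment sizes (bp).
ladder = [(10, 10000), (30, 1000), (50, 100)]
a, b = fit_log_linear(ladder)
unknown_mm = 20
est_size = 10 ** (a + b * unknown_mm)   # interpolated size of an unknown band
```

This log-linear interpolation is only approximate near the resolution limits of the gel, which is one reason ladders are chosen to bracket the expected fragment sizes.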
https://en.wikipedia.org/wiki/Ferrier%20Lecture
The Ferrier Lecture is a Royal Society lectureship given every three years "on a subject related to the advancement of natural knowledge on the structure and function of the nervous system". It was created in 1928 to honour the memory of Sir David Ferrier, a neurologist who was the first British scientist to electrically stimulate the brain for the purpose of scientific study. In its 90-year history, the Lecture has been given 30 times, never more than once by the same person. The first woman to be awarded the honour was Prof. Christine Holt in 2017. The first lecture was given in 1929 by Charles Scott Sherrington, and was titled "Some functional problems attaching to convergence". The most recent lecture was given in 2017 by Prof. Christine Holt, recognised for "understanding of the key molecular mechanisms involved in nerve growth, guidance and targeting which has revolutionised our knowledge of growing axon tips". In 1971, the lecture was given by two individuals (David Hunter Hubel and Torsten Nils Wiesel) on the same topic, with the title "The function and architecture of the visual cortex". List of Lecturers
https://en.wikipedia.org/wiki/Software%20deployment
Software deployment is all of the activities that make a software system available for use. The general deployment process consists of several interrelated activities with possible transitions between them. These activities can occur on the producer side, on the consumer side, or both. Because every software system is unique, the precise processes or procedures within each activity can hardly be defined. Therefore, "deployment" should be interpreted as a general process that has to be customized according to specific requirements or characteristics. History When computers were extremely large, expensive, and bulky (mainframes and minicomputers), the software was often bundled together with the hardware by manufacturers. If business software needed to be installed on an existing computer, this might require an expensive, time-consuming visit by a systems architect or a consultant. For complex, on-premises installation of enterprise software today, this can still sometimes be the case. However, with the development of mass-market software for the new age of microcomputers in the 1980s came new forms of software distribution: first cartridges, then Compact Cassettes, then floppy disks, then (in the 1990s and later) optical media, the internet and flash drives. This meant that software deployment could be left to the customer. However, it was also increasingly recognized over time that configuration of the software by the customer was important and that this should ideally have a user-friendly interface (rather than, for example, requiring the customer to edit registry entries on Windows). In pre-internet software deployments, deployments (and their closely related cousin, new software releases) were of necessity expensive, infrequent, bulky affairs. It is arguable therefore that the spread of the internet made end-to-end agile software development possible. Indeed, the advent of cloud computing and software as a service meant that software could be deployed to a
https://en.wikipedia.org/wiki/Noncommutative%20harmonic%20analysis
In mathematics, noncommutative harmonic analysis is the field in which results from Fourier analysis are extended to topological groups that are not commutative. Since locally compact abelian groups have a well-understood theory, Pontryagin duality, which includes the basic structures of Fourier series and Fourier transforms, the major business of non-commutative harmonic analysis is usually taken to be the extension of the theory to all groups G that are locally compact. The case of compact groups is understood, qualitatively and after the Peter–Weyl theorem from the 1920s, as being generally analogous to that of finite groups and their character theory. The main task is therefore the case of G that is locally compact, not compact and not commutative. The interesting examples include many Lie groups, and also algebraic groups over p-adic fields. These examples are of interest and frequently applied in mathematical physics, and contemporary number theory, particularly automorphic representations. What to expect is known as the result of basic work of John von Neumann. He showed that if the von Neumann group algebra of G is of type I, then L2(G) as a unitary representation of G is a direct integral of irreducible representations. It is parametrized therefore by the unitary dual, the set of isomorphism classes of such representations, which is given the hull-kernel topology. The analogue of the Plancherel theorem is abstractly given by identifying a measure on the unitary dual, the Plancherel measure, with respect to which the direct integral is taken. (For Pontryagin duality the Plancherel measure is some Haar measure on the dual group to G, the only issue therefore being its normalization.) For general locally compact groups, or even countable discrete groups, the von Neumann group algebra need not be of type I and the regular representation of G cannot be written in terms of irreducible representations, even though it is unitary and completely reducible. An examp
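Von Neumann's decomposition and the abstract Plancherel theorem described above can be written compactly. The following is a sketch in commonly used notation, not a full statement (in general the direct integral also carries multiplicity, with fibers of the form H_π ⊗ H̄_π): for G locally compact and of type I, with unitary dual Ĝ and Plancherel measure μ,

```latex
L^2(G) \;\cong\; \int_{\widehat{G}}^{\oplus} H_\pi \,\mathrm{d}\mu(\pi),
\qquad
\|f\|_{L^2(G)}^2 \;=\; \int_{\widehat{G}} \big\|\pi(f)\big\|_{\mathrm{HS}}^2 \,\mathrm{d}\mu(\pi),
```

where π(f) = ∫_G f(g)π(g) dg is the operator-valued Fourier transform and ‖·‖_HS denotes the Hilbert–Schmidt norm. When G is abelian every H_π is one-dimensional, Ĝ is the character group, and this reduces to the classical Plancherel theorem of Pontryagin duality.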
https://en.wikipedia.org/wiki/Blood%20smear
A blood smear, peripheral blood smear or blood film is a thin layer of blood smeared on a glass microscope slide and then stained in such a way as to allow the various blood cells to be examined microscopically. Blood smears are examined in the investigation of hematological (blood) disorders and are routinely employed to look for blood parasites, such as those of malaria and filariasis. Preparation A blood smear is made by placing a drop of blood on one end of a slide, and using a spreader slide to disperse the blood over the slide's length. The aim is to get a region, called a monolayer, where the cells are spaced far enough apart to be counted and differentiated. The monolayer is found in the "feathered edge" created by the spreader slide as it draws the blood forward. The slide is left to air dry, after which the blood is fixed to the slide by immersing it briefly in methanol. The fixative is essential for good staining and presentation of cellular detail. After fixation, the slide is stained to distinguish the cells from each other. Routine analysis of blood in medical laboratories is usually performed on blood films stained with Romanowsky stains such as Wright's stain, Giemsa stain, or Diff-Quik. Wright-Giemsa combination stain is also a popular choice. These stains allow for the detection of white blood cell, red blood cell, and platelet abnormalities. Hematopathologists often use other specialized stains to aid in the differential diagnosis of blood disorders. After staining, the monolayer is viewed under a microscope using magnification up to 1000x. Individual cells are examined and their morphology is characterized and recorded. Clinical significance Blood smear examination is usually performed in conjunction with a complete blood count in order to investigate abnormal results or confirm results that the automated analyzer has flagged as unreliable. Microscopic examination of the shape, size, and coloration of red blood cells is useful for determin
https://en.wikipedia.org/wiki/Compton%20scattering
Compton scattering (or the Compton effect) is the scattering of a high-frequency photon after an interaction with a charged particle, usually an electron. When the photon hits loosely bound electrons in the outer valence shells of atoms or molecules, it can eject them. The effect was discovered in 1923 by Arthur Holly Compton while researching the scattering of X-rays by light elements, and earned him the Nobel Prize in Physics in 1927. The Compton effect deviated significantly from the then-dominant classical theories, requiring both special relativity and quantum mechanics to explain the interaction between high-frequency photons and charged particles. Photons can interact with matter at the atomic level (e.g. the photoelectric effect and Rayleigh scattering), at the nucleus, or with just an electron. Pair production and the Compton effect occur at the level of the electron. When a high-frequency photon, such as an X-ray or gamma-ray photon, scatters off a charged particle, there is a decrease in the energy of the photon and thus an increase in its wavelength. This is the Compton effect. By conservation of energy, the energy lost by the photon is transferred to the recoiling particle (an electron that recoils in this way is called a Compton recoil electron). If the charged particle initially carries more energy than the photon, the reverse occurs and the photon instead gains energy; this is known as inverse Compton scattering. Introduction In Compton's original experiment (see Fig. 1), the energy of the X-ray photon (≈17 keV) was significantly larger than the binding energy of the atomic electron, so the electrons could be treated as being free after scattering. The amount by which the light's wavelength changes is called the Compton shift. Although Compton scattering off the nucleus exists, Compton scattering usually refers to the interaction inv
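For scattering off a free electron, the Compton shift has the closed form Δλ = (h / mₑc)(1 − cos θ), where the prefactor h/mₑc ≈ 2.43 pm is the electron's Compton wavelength. A quick numerical check using CODATA constants:

```python
import math

H = 6.62607015e-34      # Planck constant, J*s (exact in the 2019 SI)
M_E = 9.1093837015e-31  # electron rest mass, kg
C = 299792458.0         # speed of light, m/s (exact)

def compton_shift(theta_rad):
    """Increase in photon wavelength after scattering through angle theta,
    for scattering off a free electron at rest."""
    return (H / (M_E * C)) * (1 - math.cos(theta_rad))

shift_90 = compton_shift(math.pi / 2)   # one Compton wavelength, ~2.43 pm
shift_180 = compton_shift(math.pi)      # maximum (backscatter): twice that
```

Note the shift depends only on the scattering angle, not on the incident wavelength, which is why the effect is noticeable for X-rays and gamma rays (whose wavelengths are comparable to picometres) but negligible for visible light.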
https://en.wikipedia.org/wiki/Paulette%20Libermann
Paulette Libermann (14 November 1919 – 10 July 2007) was a French mathematician, specializing in differential geometry. Early life and education Libermann was one of three sisters born to a family of Russian-Ukrainian Yiddish-speaking Jewish immigrants to Paris. After attending the Lycée Lamartine, she began her university studies in 1938 at the École normale supérieure de jeunes filles, a college in Sèvres for training women to become school teachers. Due to the reforms of the new director Eugénie Cotton, who wanted her school to be at the same level as the École Normale Supérieure, Libermann benefited from being taught by leading mathematicians such as Élie Cartan, Jacqueline Ferrand and André Lichnerowicz. Two years later, upon completion of her studies, she was prevented from taking the agrégation and becoming a teacher because of the anti-Jewish laws instituted by the German occupation. However, thanks to a scholarship provided by Cotton, she began doing research under Cartan's supervision. In 1942, she and her family escaped Paris for Lyon, where they hid from the persecutions of Klaus Barbie for two years. After the liberation of Paris in 1944, she returned to Sèvres and completed her studies, obtaining the agrégation. Career Libermann taught briefly in a school at Douai, and then received a scholarship to study at the University of Oxford between 1945 and 1947, where she obtained a bachelor's degree under the supervision of J. H. C. Whitehead. From 1947 to 1951 she held a teaching position at a school for girls in Strasbourg. However, at the encouragement of Élie Cartan, during this period she also continued her research at the Université Louis Pasteur. In 1951 she left teaching for a research position at the Centre national de la recherche scientifique, and in 1953 she completed her doctoral thesis, entitled Sur le problème d'équivalence de certaines structures infinitésimales [On the equivalence problem of certain infinitesimal structures], under the supervision of Charles
https://en.wikipedia.org/wiki/Lateral%20supracondylar%20ridge
The lateral supracondylar ridge is a prominent, rough margin on the lower part of the lateral border of the humerus. It presents an anterior lip for the origin of forearm extensors, including the brachioradialis muscle above, and the extensor carpi radialis longus muscle below. It also presents a posterior lip for the triceps brachii, and an intermediate ridge for the attachment of the lateral intermuscular septum. Clinical significance The lateral supracondylar ridge may be broken in a supracondylar humerus fracture, common in children.
https://en.wikipedia.org/wiki/Suffix%20tree%20clustering
Suffix Tree Clustering, often abbreviated as STC, is an approach to clustering that uses suffix trees. A suffix tree built over a collection of word strings keeps track of all shared phrases (word n-grams of any length), while allowing differing strings to be inserted incrementally in linear time. This has the advantage of ensuring that a large number of clusters can be handled sequentially. However, a potential disadvantage is that it also increases the number of candidate documents that need to be examined when handling large sets of data. Suffix tree clusters can be either decompositional or agglomerative in nature, depending on the type of data being handled.
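The first phase of STC groups documents into base clusters by the phrases they share. The sketch below illustrates that idea with a naive enumeration of word n-grams; a real implementation would use a linear-time suffix tree (e.g. Ukkonen's algorithm) rather than this quadratic substitute, and all names and thresholds here are illustrative.

```python
from collections import defaultdict

def base_clusters(docs, min_docs=2):
    """Map each phrase shared by at least `min_docs` documents to the set
    of document ids containing it. Every phrase of a document is a prefix
    of one of its suffixes, which is what a suffix tree indexes compactly."""
    phrase_docs = defaultdict(set)
    for doc_id, text in enumerate(docs):
        words = text.lower().split()
        for i in range(len(words)):                  # every suffix...
            for j in range(i + 1, len(words) + 1):   # ...and each of its prefixes
                phrase_docs[tuple(words[i:j])].add(doc_id)
    return {" ".join(p): ids for p, ids in phrase_docs.items()
            if len(ids) >= min_docs}

docs = ["cat ate cheese", "mouse ate cheese too", "cat ate mouse"]
clusters = base_clusters(docs)
```

In the full STC algorithm, base clusters with sufficient document overlap are then merged, which is where the agglomerative behavior mentioned above comes in.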
https://en.wikipedia.org/wiki/Cluster%20of%20differentiation
The cluster of differentiation (also known as cluster of designation or classification determinant and often abbreviated as CD) is a protocol used for the identification and investigation of cell surface molecules providing targets for immunophenotyping of cells. In terms of physiology, CD molecules can act in numerous ways, often acting as receptors or ligands important to the cell. A signal cascade is usually initiated, altering the behavior of the cell (see cell signaling). Some CD proteins do not play a role in cell signaling, but have other functions, such as cell adhesion. CD for humans is numbered up to 371. Nomenclature The CD nomenclature was proposed and established in the 1st International Workshop and Conference on Human Leukocyte Differentiation Antigens (HLDA), which was held in Paris in 1982. This system was intended for the classification of the many monoclonal antibodies (mAbs) generated by different laboratories around the world against epitopes on the surface molecules of leukocytes (white blood cells). Since then, its use has expanded to many other cell types, and more than 370 unique CD clusters and subclusters have been identified. The proposed surface molecule is assigned a CD number once two specific monoclonal antibodies (mAbs) are shown to bind to the molecule. If the molecule has not been well characterized, or has only one mAb, it is usually given the provisional indicator "w" (as in "CDw186"). For instance, CD2 mAbs are reagents that react with a 50-kDa transmembrane glycoprotein expressed on T cells. The CD designations were used to describe the recognized molecules, but had to be clarified by attaching the term antigen or molecule to the designation (e.g., CD2 molecule). Currently, "CD2" is generally used to designate the molecule, and "CD2 antibody" is used to designate the antibody. Cell populations are usually defined using a '+' or a '−' symbol to indicate whether a certain cell fraction expresses or lacks a CD molecule. For