https://en.wikipedia.org/wiki/Ostiole
An ostiole is a small hole or opening through which algae or fungi release their mature spores. The word is a diminutive of "ostium", "opening". The term is also used in higher plants, for example to denote the opening of the involuted syconium (fig inflorescence) through which fig wasps enter to pollinate and breed. Sometimes a stomatal aperture is called an "ostiole". See also Ostium (disambiguation)
https://en.wikipedia.org/wiki/British%20Mycological%20Society
The British Mycological Society is a learned society established in 1896 to promote the study of fungi. Formation The British Mycological Society (BMS) was formed by the combined efforts of two local societies: the Woolhope Naturalists' Field Club of Hereford and the Yorkshire Naturalists' Union. The Curator of the Hereford Club, Dr. H. G. Bull, convinced the members in 1867 to undertake the particular study of mushrooms. While the mycological efforts of the Club diminished somewhat after Dr. Bull's death, the Yorkshire Union founded its Mycological Committee in 1892. This Committee attracted the involvement of many eminent mycologists including George Edward Massee (1845–1917), James Needham (1849–1913), Charles Crossland (1844–1916), and Henry Thomas Soppitt (1843–1899). Mycologist Kathleen Sampson was a member for sixty years, as well as serving as president in 1938. The need for a national organisation and for a journal in which to publish their observations led Cooke, Rea, Massee, and other mycologists (including Charles Crossland and James Needham) to found the Society in 1896. The Society's founding officers were Rea (Secretary), Crossland (Treasurer), and Massee (President). The choice of the latter as President was based on his international reputation (with more than 250 mycological publications) and role as the mycologist at the Royal Botanical Gardens, Kew (where he replaced Cooke as mycologist in 1893). In 1897, Rea assumed the additional role of Treasurer, continuing as Secretary (until 1918), and was also Editor (until 1930). However, Massee and a number of Yorkshire mycologists soon left the BMS, preferring to remain with the Yorkshire Naturalists' Union. Membership By 1903, the Society's members numbered over a hundred; this had increased to over four hundred shortly after World War II, and reached over two thousand by 2006. Before World War II, Honorary Membership was awarded to: 1905 Émile Boudier (1828–1920) 1916 P
https://en.wikipedia.org/wiki/Sporidia
Sporidia are the result of asexual reproduction, through the process of budding, by homokaryotic smut fungi (which are not pathogenic). Thus far, this has only been observed in vitro.
https://en.wikipedia.org/wiki/Contraharmonic%20mean
In mathematics, a contraharmonic mean is a function complementary to the harmonic mean. The contraharmonic mean is a special case of the Lehmer mean, L_p(x_1, ..., x_n) = (x_1^p + ... + x_n^p) / (x_1^(p−1) + ... + x_n^(p−1)), where p = 2. Definition The contraharmonic mean of a set of positive numbers is defined as the arithmetic mean of the squares of the numbers divided by the arithmetic mean of the numbers: C(x_1, ..., x_n) = (x_1² + ... + x_n²) / (x_1 + ... + x_n). Properties It is easy to show that this satisfies the characteristic properties of a mean of some list of values x: min(x) ≤ C(x) ≤ max(x), and C(t·x_1, ..., t·x_n) = t·C(x_1, ..., x_n) for t > 0. The first property implies the fixed point property, that for all k > 0, C(k, k, ..., k) = k. The contraharmonic mean is higher in value than the arithmetic mean and also higher than the root mean square: min(x) ≤ H ≤ G ≤ L ≤ A ≤ R ≤ C ≤ max(x), where x is a list of values, H is the harmonic mean, G is the geometric mean, L is the logarithmic mean, A is the arithmetic mean, R is the root mean square and C is the contraharmonic mean. Unless all values of x are the same, the ≤ signs above can be replaced by <. The name contraharmonic may be due to the fact that when taking the mean of only two variables, the contraharmonic mean is as high above the arithmetic mean as the arithmetic mean is above the harmonic mean (i.e., the arithmetic mean of the two variables is equal to the arithmetic mean of their harmonic and contraharmonic means). Two-variable formulae From the formulas for the arithmetic mean and harmonic mean of two variables we have: A(a, b) = (a + b)/2, H(a, b) = 2ab/(a + b), and C(a, b) = (a² + b²)/(a + b). Notice that for two variables the average of the harmonic and contraharmonic means is exactly equal to the arithmetic mean: (H(a, b) + C(a, b))/2 = A(a, b). As a gets closer to 0 then H(a, b) also gets closer to 0. The harmonic mean is very sensitive to low values. On the other hand, the contraharmonic mean is sensitive to larger values, so as a approaches 0 then C(a, b) approaches b (so their average remains A(a, b)). There are two other notable relationships between 2-variable means. First, the geometric mean of the arithmetic and harmonic means is equal to the geometric mean of the two values: √(A(a, b)·H(a, b)) = √(ab) = G(a, b). The second relationship is that the geometric mean of the ar
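These definitions can be checked with a minimal Python sketch (function and variable names are my own, not from the article); it verifies the two-variable identity that the arithmetic mean is the average of the harmonic and contraharmonic means:

```python
def arithmetic(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

def harmonic(xs):
    """Harmonic mean of positive values."""
    return len(xs) / sum(1.0 / x for x in xs)

def contraharmonic(xs):
    """Contraharmonic mean: arithmetic mean of squares over arithmetic mean."""
    return sum(x * x for x in xs) / sum(xs)

a, b = 2.0, 6.0
A, H, C = arithmetic([a, b]), harmonic([a, b]), contraharmonic([a, b])
print(A, H, C)          # 4.0 3.0 5.0
# C sits as far above A (by 1) as A sits above H (by 1):
assert abs((H + C) / 2 - A) < 1e-12
```

Note that C = 5 exceeds both the arithmetic mean (4) and the root mean square (√20 ≈ 4.47), as the ordering above requires.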
https://en.wikipedia.org/wiki/Ocular%20prosthesis
An ocular prosthesis, artificial eye or glass eye is a type of craniofacial prosthesis that replaces an absent natural eye following an enucleation, evisceration, or orbital exenteration. The prosthesis fits over an orbital implant and under the eyelids. Though often referred to as a glass eye, the ocular prosthesis roughly takes the shape of a convex shell and is made of medical-grade acrylic plastic. A few ocular prostheses today are made of cryolite glass. A variant of the ocular prosthesis is a very thin hard shell known as a scleral shell which can be worn over a damaged or eviscerated eye. Makers of ocular prosthetics are known as ocularists. An ocular prosthesis does not provide vision; this would be a visual prosthesis. Someone with an ocular prosthesis is altogether blind on the affected side and has monocular (one-sided) vision. History The earliest known evidence of the use of an ocular prosthesis is that of a woman found in Shahr-I Sokhta, Iran, dating back to 2900–2800 BC. It has a hemispherical form and a diameter of just over 2.5 cm (1 inch). It consists of very light material, probably bitumen paste. The surface of the artificial eye is covered with a thin layer of gold, engraved with a central circle (representing the iris) and gold lines patterned like sun rays. On both sides of the eye are drilled tiny holes, through which a golden thread could hold the eyeball in place. Since microscopic research has shown that the eye socket bore clear imprints of the golden thread, the eyeball must have been worn during her lifetime. In addition to this, an early Hebrew text references a woman who wore an artificial eye made of gold. Roman and Egyptian priests are known to have produced artificial eyes as early as the fifth century BC, constructed from painted clay attached to cloth and worn outside the socket. The first in-socket artificial eyes were made of gold with colored enamel, later evolving into the use of glass (thus the name "glass eye") by the Venet
https://en.wikipedia.org/wiki/Butter%20salt
Butter salt is a seasoning developed in the late twentieth century for the purpose of combining the flavours found in salt and butter. It is a fine, golden powder, originally salt, enriched with butter flavouring. It is often used as a seasoning for popcorn. It is said to impart a "rich, buttery flavour". The contents are usually salt, artificial butter flavoring, and yellow food colouring. See also Molly McButter Popcorn seasoning
https://en.wikipedia.org/wiki/Mini-Z
Mini-Z is a brand name for a popular line of 1:28-scale electric radio-controlled cars manufactured by Kyosho Corporation, a Japanese manufacturer of various radio-controlled devices. Kyosho makes a huge number of bodies for the Mini-Z. The wheelbase can range from 86 mm to 106 mm. The bodies are all highly detailed, realistic looking, and fully painted with a high-gloss paint. The bodies are so realistic that many are collected as display models, and the bodies come with a dummy chassis and wheels for display purposes. Popular bodies for racing are the 98 mm wheelbase bodies of the McLaren and Ferrari Enzo. The Enzo allows for a very wide track due to the extreme offset wheels and low center of gravity, with quite a bit of steering bite due to the extreme nose on the car, while the McLaren offers light weight and very dynamic, nimble handling. 94 mm chassis are also popular for racing due to lower polar moment of inertia and weight distribution. Classic bodies include Lancia Stratos, Lotus Europa, Porsche 934 and 935, Lamborghini Countach, Shelby Cobra and many others. The Mini-Z comes with parts to adjust wheelbase and motor location to fit the body. Different bodies will handle differently due to wheelbase and distribution of the masses. A rear-engine car like the Porsche 934 has a rear motor mount with the motor behind the axle, so the Mini-Z replicates somewhat the handling characteristics of the real car. The Mini-Z can be extensively modified with parts both from Kyosho Corporation and from aftermarket suppliers. The cars and modification parts are typically available in hobby stores and through online hobby retailers. Many hobbyists race their cars against others. Top speeds for a stock Mini-Z can reach 12 mph. With a moderately modified Mini-Z using off-the-shelf parts, 35 mph is achievable. A highly modified Mini-Z can actually surpass 61 mph or 99 km/h. However, given the size of the cars, their scale speed would be equivalent to roughly 1,700 mph (61 mph × 28). At that speed, it is impo
https://en.wikipedia.org/wiki/RiskMetrics
The RiskMetrics variance model (also known as the exponential smoother) was first established in 1989, when Sir Dennis Weatherstone, the new chairman of J.P. Morgan, asked for a daily report measuring and explaining the risks of his firm. Nearly four years later in 1992, J.P. Morgan launched the RiskMetrics methodology to the marketplace, making the substantive research and analysis that satisfied Sir Dennis Weatherstone's request freely available to all market participants. In 1998, as client demand for the group's risk management expertise exceeded the firm's internal risk management resources, the Corporate Risk Management Department was spun off from J.P. Morgan as RiskMetrics Group with 23 founding employees. The RiskMetrics technical document was revised in 1996. In 2001, it was revised again in Return to RiskMetrics. In 2006, a new method for modeling risk factor returns was introduced (RM2006). On 25 January 2008, RiskMetrics Group listed on the New York Stock Exchange (NYSE: RISK). In June 2010, RiskMetrics was acquired by MSCI for $1.55 billion. Risk measurement process Portfolio risk measurement can be broken down into steps. The first is modeling the market that drives changes in the portfolio's value. The market model must be sufficiently specified so that the portfolio can be revalued using information from the market model. The risk measurements are then extracted from the probability distribution of the changes in portfolio value. The change in value of the portfolio is typically referred to by portfolio managers as profit and loss, or P&L. Risk factors Risk management systems are based on models that describe potential changes in the factors affecting portfolio value. These risk factors are the building blocks for all pricing functions. In general, the factors driving the prices of financial securities are equity prices, foreign exchange rates, commodity prices, interest rates, correlation and volatility. By generating future scenarios for each risk f
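The "exponential smoother" mentioned above can be sketched in a few lines of Python. The recursion σ²_t = λσ²_{t−1} + (1 − λ)r²_{t−1}, with decay factor λ = 0.94 for daily data, follows the published RiskMetrics technical document; the function name and the choice to seed with the first squared return are my own:

```python
def ewma_variance(returns, lam=0.94):
    """Exponentially weighted moving-average variance of a return series.
    Recent squared returns carry more weight; older ones decay by `lam`."""
    var = returns[0] ** 2                     # seed with the first squared return
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2  # RiskMetrics-style update
    return var

# A constant return series is a fixed point of the recursion:
print(ewma_variance([0.01] * 250))            # ~= 1e-4, i.e. 1% daily volatility
```

The design point of the smoother is that a single parameter λ replaces the choice of estimation window: the effective memory of the estimator is roughly 1/(1 − λ) observations.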
https://en.wikipedia.org/wiki/Mathematica%3A%20A%20World%20of%20Numbers...%20and%20Beyond
Mathematica: A World of Numbers... and Beyond is a kinetic and static exhibition of mathematical concepts designed by Charles and Ray Eames, originally debuted at the California Museum of Science and Industry in 1961. Duplicates have since been made, and they (as well as the original) have been moved to other institutions. History In March, 1961 a new science wing at the California Museum of Science and Industry in Los Angeles opened. The IBM Corporation had been asked by the Museum to make a contribution; IBM in turn asked the famous California designer team of Charles Eames and his wife Ray Eames to come up with a good proposal. The result was that the Eames Office was commissioned by IBM to design an interactive exhibition called Mathematica: A World of Numbers... and Beyond. This was the first of many exhibitions designed by the Eames Office. The exhibition stayed at the Museum until January 1998, making it the longest running of any corporate sponsored museum exhibition. Furthermore, it is the only one of the dozens of exhibitions designed by the Office of Charles and Ray Eames that is still extant. This original Mathematica exhibition was reassembled for display at the Alyce de Roulet Williamson Gallery at Art Center College of Design in Pasadena, California, July 30 through October 1, 2000. It is now owned by and on display at the New York Hall of Science, though it currently lacks the overhead plaques with quotations from mathematicians that were part of the original installation. Duplicates In November, 1961 an exact duplicate was made for Chicago's Museum of Science and Industry, where it was shown until late 1980. From there it was sold and relocated to the Museum of Science in Boston, Massachusetts, where it is permanently on display. The Boston installation bears the closest resemblance to the original Eames design, including numerous overhead plaques featuring historic quotations from famous mathematicians. As part of a refurbishment, a graphic p
https://en.wikipedia.org/wiki/Tilorone
Tilorone (trade names Amixin, Lavomax and others) is the first recognized synthetic, small molecular weight compound that is an orally active interferon inducer. It is used as an antiviral drug in some countries which do not require double-blind placebo-controlled studies, including Russia. It is effective against Ebola virus in mice. Pharmacology Tilorone activates the production of interferon.
https://en.wikipedia.org/wiki/Nickel%20titanium
Nickel titanium, also known as nitinol, is a metal alloy of nickel and titanium, where the two elements are present in roughly equal atomic percentages. Different alloys are named according to the weight percentage of nickel; e.g., nitinol 55 and nitinol 60. Nitinol alloys exhibit two closely related and unique properties: the shape memory effect and superelasticity (also called pseudoelasticity). Shape memory is the ability of nitinol to undergo deformation at one temperature, stay in its deformed shape when the external force is removed, then recover its original, undeformed shape upon heating above its "transformation temperature". Superelasticity is the ability for the metal to undergo large deformations and immediately return to its undeformed shape upon removal of the external load. Nitinol can deform 10–30 times as much as ordinary metals and return to its original shape. Whether nitinol behaves with the shape memory effect or superelasticity depends on whether it is above the transformation temperature of the specific alloy. Below the transformation temperature it exhibits the shape memory effect, and above that temperature it behaves superelastically. History The word "nitinol" is derived from its composition and its place of discovery: (Nickel Titanium-Naval Ordnance Laboratory). William J. Buehler along with Frederick Wang, discovered its properties during research at the Naval Ordnance Laboratory in 1959. Buehler was attempting to make a better missile nose cone, which could resist fatigue, heat and the force of impact. Having found that a 1:1 alloy of nickel and titanium could do the job, in 1961 he presented a sample at a laboratory management meeting. The sample, folded up like an accordion, was passed around and flexed by the participants. One of them applied heat from his pipe lighter to the sample and, to everyone's surprise, the accordion-shaped strip contracted and took its previous shape. While the potential applications for nitinol were real
https://en.wikipedia.org/wiki/Metrical%20task%20system
Task systems are mathematical objects used to model the set of possible configurations of online algorithms. They were introduced by Borodin, Linial and Saks (1992) to model a variety of online problems. A task system determines a set of states and costs to change states. Task systems obtain as input a sequence of requests such that each request assigns processing times to the states. The objective of an online algorithm for task systems is to create a schedule that minimizes the overall cost incurred due to processing the tasks with respect to the states and due to the cost to change states. If the cost function to change states is a metric, the task system is a metrical task system (MTS). This is the most common type of task system. Metrical task systems generalize online problems such as paging, list accessing, and the k-server problem (in finite spaces). Formal Definition A task system is a pair (S, d) where S is a set of states and d: S × S → R≥0 is a distance function. If d is a metric, (S, d) is a metrical task system. An input to the task system is a sequence T_1, T_2, ..., T_m such that for each i, T_i is a vector of |S| non-negative entries that determine the processing costs for the states when processing the i-th task. An algorithm for the task system produces a schedule π that determines the sequence of states. For instance, π(i) = s means that the i-th task is run in state s. The processing cost of a schedule with initial state π(0) is cost(π) = Σ_{i=1..m} [ d(π(i−1), π(i)) + T_i(π(i)) ]. The objective of the algorithm is to find a schedule such that the cost is minimized. Known Results As usual for online problems, the most common measure to analyze algorithms for metrical task systems is competitive analysis, where the performance of an online algorithm is compared to the performance of an optimal offline algorithm. For deterministic online algorithms, there is a tight bound of 2|S| − 1 on the competitive ratio due to Borodin et al. (1992). For randomized online algorithms on an n-state metrical task system, the competitive ratio is lower bounded by Ω(log n / log log n) and upper bounded by O(log² n · log log n). The lower bound is due to Bartal et al. (2006, 2005).
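The optimal offline cost that online algorithms are compared against is easy to compute by dynamic programming over states: before each task the schedule may switch states (paying the distance), then pays the task's processing cost in its current state. A small Python sketch (the function name and the toy instance are my own, not from the article):

```python
import math

def offline_cost(states, d, tasks, start):
    """Optimal offline cost of a task system (S, d).
    d: dict-of-dicts distance; tasks: list of dicts mapping state -> cost.
    opt[s] = cheapest way to serve all tasks seen so far and end in state s."""
    opt = {s: (0.0 if s == start else math.inf) for s in states}
    for task in tasks:
        # move from the best predecessor state, then process the task here
        opt = {s: min(opt[t] + d[t][s] for t in states) + task[s]
               for s in states}
    return min(opt.values())

# Two states at unit distance; each task is cheap in one state only:
S = ["a", "b"]
dist = {"a": {"a": 0, "b": 1}, "b": {"a": 1, "b": 0}}
tasks = [{"a": 0, "b": 5}, {"a": 5, "b": 0}]
print(offline_cost(S, dist, tasks, start="a"))   # 1.0: serve task 1 in a, move to b
```

An online algorithm sees each task only after committing to a state, which is exactly why its cost can exceed this offline optimum by the competitive ratios quoted above.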
https://en.wikipedia.org/wiki/Fictive%20motion
Fictive motion is the metaphorical motion of an object or abstraction through space. Fictive motion has become a subject of study in psycholinguistics and cognitive linguistics. In fictive motion sentences, a motion verb applies to a subject that is not literally capable of movement in the physical world, as in the sentence, "The fence runs along the perimeter of the house." Fictive motion is so called because it is attributed to material states, objects, or abstract concepts that cannot (sensibly) be said to move themselves through physical space. Fictive motion sentences are pervasive in English and other languages. History Cognitive linguist Leonard Talmy discussed many of the spatial and linguistic properties of fictive motion in a book chapter called "Fictive motion in language and 'ception" (Talmy 1996). He provided further insights in his seminal book, Toward a Cognitive Semantics Vol. 1, in 2000. Talmy began analyzing the semantics of fictive motion in the late 1970s and early 1980s but used the term "virtual motion" at that time (e.g. Talmy 1983). Fictive motion has since been investigated by cognitive scientists interested in whether and how it evokes dynamic imagery. Methods of investigation have included reading tasks, eye-tracking tasks and drawing tasks. Influence on perception of time A recent avenue of research has focused on fictive motion's influence on perceptions of time. People often speak about time in terms of motion. English speakers may describe themselves as moving through time toward or past events with statements such as "we're entering the holidays" or "we slipped past the due date." They may also talk about events as moving toward or past themselves with statements such as "tough times are approaching us" or "summer vacation has passed". Broadly speaking, metaphorical talk about time borrows from two different perspectives for conceptualizing motion. In the ego-moving metaphor, one progresses along a timeline toward the futur
https://en.wikipedia.org/wiki/Tallgrass%20Aspen%20Parkland
The Tallgrass Aspen Parkland is an ecoregion located in southeastern Manitoba and northwestern Minnesota. The area is characterized by a mosaic of habitat types, including tallgrass prairie, aspen woodland, sedge meadow wetlands, riparian woodland, and oak savanna. A number of endangered and threatened species occur in the area, including the western prairie fringed orchid and Dakota skipper. One of Minnesota's only wild elk herds utilizes the area as well. A number of conservation organizations, as well as provincial and state governments are involved in conservation activities in the Tallgrass Aspen Parkland. Major conservation partners include The Nature Conservancy, the Nature Conservancy of Canada, the Manitoba Conservation Data Centre, the Minnesota Department of Natural Resources, the Manitoba Tall Grass Prairie Preserve, and Manitoba Conservation. These partners collaborated to produce a Conservation Area Plan for the area in 2006.
https://en.wikipedia.org/wiki/Rake%20%28angle%29
A rake is an angle of slope measured from horizontal, or in some contexts from a vertical line perpendicular to horizontal. A 60° rake would mean that the line is pointing 60° up from horizontal, either forwards or backwards relative to the object. Usage Though the term may be used in a general manner, it is commonly applied in several specific contexts. The rake of a ship's prow is the angle at which the prow rises from the water (the rake below water being called the bow rake). A motorcycle or bicycle fork rake is the angle at which the forks are angled down towards the ground. See also caster angle, which is the angular displacement of the steering axis from the vertical axis of a steered wheel. In machining and sawing the rake angle is the angle from the cutting head to the object being worked on (with a perpendicular angle conventionally being a 0° rake). In geology the rake is the angle at which one rock moves against another in a geological fault. In a theatre or opera house the stage can be raked to slope up towards the back of the stage to allow better viewing for the audience. See also Pitch angle, one of the angular degrees of freedom of any stiff body (for example a vehicle), describing rotation about the side-to-side axis
https://en.wikipedia.org/wiki/Speculative%20multithreading
Thread Level Speculation (TLS), also known as Speculative Multithreading or Speculative Parallelization, is a technique to speculatively execute a section of computer code that is anticipated to be executed later, in parallel with the normal execution, on a separate independent thread. Such a speculative thread may need to make assumptions about the values of input variables. If these prove to be invalid, then the portions of the speculative thread that rely on these input variables will need to be discarded and squashed. If the assumptions are correct, the program can complete in a shorter time, provided the thread was able to be scheduled efficiently. Description TLS extracts threads from serial code and executes them speculatively in parallel with a safe thread. The speculative thread will need to be discarded or re-run if its presumptions about the input state prove to be invalid. It is a dynamic (runtime) parallelization technique that can uncover parallelism that static (compile-time) parallelization techniques may fail to exploit, because at compile time thread independence cannot be guaranteed. For the technique to achieve the goal of reducing overall execution time, there must be spare CPU resources on which the speculative threads can run efficiently in parallel with the main safe thread. TLS assumes optimistically that a given portion of code (generally loops) can be safely executed in parallel. To do so, it divides the iteration space into chunks that are executed in parallel by different threads. A hardware or software monitor ensures that sequential semantics are kept (in other words, that the execution progresses as if the loop were executing sequentially). If a dependence violation appears, the speculative framework may choose to stop the entire parallel execution and restart it; to stop and restart the offending threads and all their successors, in order to be fed with correct data; or to stop exclusively the offending thread and its successors that have consumed i
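The assume–validate–squash cycle can be illustrated with a toy Python model. This is only an illustration of the idea, not how hardware or compiler TLS is implemented, and all names are invented: a speculative thread consumes a predicted input while the producer computes the real one, and the speculative result is kept only if the prediction was correct.

```python
from concurrent.futures import ThreadPoolExecutor

def run_with_speculation(produce, consume, predicted):
    """Run `consume` speculatively on a predicted input while `produce`
    computes the real input; squash and re-execute on a misprediction."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        real_future = pool.submit(produce)             # the safe thread
        spec_future = pool.submit(consume, predicted)  # the speculative thread
        real = real_future.result()
        speculative = spec_future.result()
    if real == predicted:
        return speculative   # assumption held: reuse the speculative work
    return consume(real)     # assumption failed: discard result and re-run

print(run_with_speculation(lambda: 10, lambda x: x * x, predicted=10))  # 100
print(run_with_speculation(lambda: 7,  lambda x: x * x, predicted=10))  # 49
```

When the prediction holds, producer and consumer overlap in time; when it fails, the consumer runs twice, which is the squash-and-restart cost the text describes.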
https://en.wikipedia.org/wiki/Cell%20culture%20assay
A cell culture assay is any method used to assess the cytotoxicity of a material. This refers to the in vitro assessment of a material to determine whether it releases toxic chemicals into cells, and whether the quantity released is sufficient to kill cells, either directly or indirectly, through the inhibition of cell metabolic pathways. Cell culture evaluations are the precursor to whole-animal studies and are a way to determine whether significant cytotoxicity exists for the given material. Cell culture assays are standardized by ASTM, ISO, and BSI (the British Standards Institution). See also Microphysiometry
https://en.wikipedia.org/wiki/Superfluid%20film
Superfluidity is a phenomenon where a fluid, or a fraction of a fluid, loses all its viscosity and can flow without resistance; such a fluid can form thin films. Superfluid helium, for example, forms a 30-nm-thick film on the surface of any container. The film's properties cause the helium to climb the walls of the container and, if the container is not closed, flow out. Superfluidity and superconductivity are macroscopic manifestations of quantum mechanics. There is considerable interest, both theoretical and practical, in these quantum phase transitions. There has been a tremendous amount of work done in the field of phase transitions and critical phenomena in two dimensions. Much of the interest in this field is because as the number of dimensions increases, the number of exactly solvable models diminishes drastically. In three or more dimensions one must resort to a mean field theory approach. The theory of superfluid transitions in two dimensions is known as the Kosterlitz–Thouless (KT) theory. The 2D XY model, where the order parameter is characterized by an amplitude and a phase, is the universality class for this transition. Experimental methods In looking at phase transitions in thin films, specifically helium, the two main experimental signatures are the superfluid fraction and the heat capacity. If either of these measurements were to be done on a superfluid film in a typical open container, the film signal would be overwhelmed by the background signal from the container. Therefore, when studying superfluid films, it is of paramount importance to study a system of large surface area so as to enhance the film signal. There are several ways of doing this. In the first, a long thin strip of material such as PET film is rolled up into a "jelly roll" configuration. The result is a film that is a long continuous plane, referred to as a planar film. A second way is to have a highly porous material such as porous gold, Vycor, or Aerogel. This results in a multiply connected fi
https://en.wikipedia.org/wiki/Screened%20subnet
In network security a screened subnet refers to the use of one or more logical screening routers as a firewall to define three separate subnets. An external router (sometimes called an access router) separates the external network from a perimeter network, and an internal router (sometimes called a choke router) separates the perimeter network from the internal network. The perimeter network, also called a border network or demilitarized zone (DMZ), is intended for hosting servers (sometimes called bastion hosts) that are accessible from or have access to both the internal and external networks. The purpose of a screened subnet or DMZ is to establish a network with heightened security that is situated between an external and presumed hostile network, such as the Internet or an extranet, and an internal network. A screened subnet is an essential concept for e-commerce or any entity that has a presence on the World Wide Web or is using electronic payment systems or other network services, because of the prevalence of hackers, advanced persistent threats, computer worms, botnets, and other threats to networked information systems. Physical separation of routers Separating the firewall system into two component routers achieves greater potential throughput by reducing the computational load on each router. As each component router of the screened subnet firewall needs to implement only one general task, each router has a less complex configuration. A screened subnet or DMZ can also be achieved by a single firewall device with three network interfaces. Relationship to DMZ The term demilitarized zone in a military context refers to an area in which treaties or agreements between contending groups forbid military installations and activities, often along an established frontier or boundary between two or more military powers or alliances. The similarity to network security is that the screened network (DMZ) has reduced fortifications because it
https://en.wikipedia.org/wiki/Expected%20shortfall
Expected shortfall (ES) is a risk measure, a concept used in the field of financial risk measurement to evaluate the market risk or credit risk of a portfolio. The "expected shortfall at q% level" is the expected return on the portfolio in the worst q% of cases. ES is an alternative to value at risk that is more sensitive to the shape of the tail of the loss distribution. Expected shortfall is also called conditional value at risk (CVaR), average value at risk (AVaR), expected tail loss (ETL), and superquantile. ES estimates the risk of an investment in a conservative way, focusing on the less profitable outcomes. For high values of q it ignores the most profitable but unlikely possibilities, while for small values of q it focuses on the worst losses. On the other hand, unlike the discounted maximum loss, even for lower values of q the expected shortfall does not consider only the single most catastrophic outcome. A value of q often used in practice is 5%. Expected shortfall is considered a more useful risk measure than VaR because it is a coherent spectral measure of financial portfolio risk. It is calculated for a given quantile level q and is defined to be the mean loss of portfolio value given that a loss is occurring at or below the q-quantile. Formal definition If X (an element of an Lp space) is the payoff of a portfolio at some future time and 0 < α < 1, then we define the expected shortfall as ES_α(X) = −(1/α) ∫₀^α VaR_γ(X) dγ, where VaR_γ is the value at risk. This can be equivalently written as ES_α(X) = −(1/α) ( E[X 1_{X ≤ x_α}] − x_α (P[X ≤ x_α] − α) ), where x_α = inf{x : P(X ≤ x) ≥ α} is the lower α-quantile and 1_A is the indicator function. Note that the second term vanishes for random variables with continuous distribution functions. The dual representation is ES_α(X) = inf_{Q ∈ Q_α} E^Q[X], where Q_α is the set of probability measures Q which are absolutely continuous to the physical measure P such that dQ/dP ≤ α⁻¹ almost surely. Note that dQ/dP is the Radon–Nikodym derivative of Q with respect to P. Expected shortfall can be generalized to a general class of coherent risk measures on Lp spaces with a corresponding dual characterization in the corres
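On historical data the definition reduces to a simple empirical estimator: ES at level q is the average of the worst q fraction of observed returns. A small Python sketch (the estimator form and all names are my own, not from the article):

```python
import math

def expected_shortfall(returns, q=0.05):
    """Historical expected shortfall: the average of the worst `q` fraction
    of observed returns, reported as a positive loss figure."""
    ordered = sorted(returns)                  # worst (most negative) first
    k = max(1, math.ceil(q * len(ordered)))    # number of tail observations
    return -sum(ordered[:k]) / k

# Ten daily returns; at q = 0.2 the tail holds the two worst observations:
rets = [-0.10, -0.04, 0.00, 0.01, 0.02, 0.03, 0.01, 0.02, 0.00, 0.01]
print(expected_shortfall(rets, q=0.2))         # ~= (0.10 + 0.04) / 2 = 0.07
```

This makes the contrast with VaR concrete: the historical VaR at the same level is just the boundary loss of the tail (0.04 here), whereas ES averages everything beyond it and therefore reacts to how bad the tail actually is.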
https://en.wikipedia.org/wiki/Schloss%20Esterh%C3%A1zy
Schloss Esterházy is a palace in Eisenstadt, Austria, the capital of the Burgenland state. It was constructed in the late 13th century, and came under ownership of the Hungarian Esterházy family in 1622. Under Paul I, 1st Prince Esterházy of Galántha the estate was converted into a baroque castle which remained the principal residence and center of administration of the family for over 300 years. History The architectural history of the building involves a transition from an actual medieval castle, built for defense, to a palace meant for comfort and ostentatious display. The moats were removed in the early 19th century, and the architectural style was modified at various points to fit the taste of the times. Early period 1364: The palace comes into the possession of the powerful Kanizsai family and consequently experiences a substantial development. 1371: King Louis acquires and develops the castle into a "medieval city castle" included in the northwest perimeter of the city of Kismarton. 1622: Ownership falls under the possession of the Esterházy family. Baroque building phase (1663–1672) After the death of Count Ladislaus Esterházy in the battle of Vezekény in 1652, his younger brother Paul I, 1st Prince Esterházy of Galántha inherited the palace. The additions he made took nearly ten years to complete, and gave it the facade we see today. The rich stucco decoration was made by the Italian master Andrea Bertinalli. 18th century changes There were few changes made during the high and late baroque periods. In the 18th century, the interior design and staircases were about all that changed. Most areas were equipped with furnaces and stucco ceilings. The only large construction work in the palace was the renewal of the two main staircases, which are presently the same. The palace was one of the summer residences of the Esterházy family during the time of Joseph Haydn. Classical building phase Prince Anton Esterházy built considerably, despite his bein
https://en.wikipedia.org/wiki/Coordination%20number
In chemistry, crystallography, and materials science, the coordination number, also called ligancy, of a central atom in a molecule or crystal is the number of atoms, molecules or ions bonded to it. The ion/molecule/atom surrounding the central ion/molecule/atom is called a ligand. This number is determined somewhat differently for molecules than for crystals. For molecules and polyatomic ions the coordination number of an atom is determined by simply counting the other atoms to which it is bonded (by either single or multiple bonds). For example, [Cr(NH3)2Cl2Br2]− has Cr3+ as its central cation, which has a coordination number of 6 and is described as hexacoordinate. The common coordination numbers are 4, 6 and 8. Molecules, polyatomic ions and coordination complexes In chemistry, coordination number, defined originally in 1893 by Alfred Werner, is the total number of neighbors of a central atom in a molecule or ion. The concept is most commonly applied to coordination complexes. Simple and commonplace cases The most common coordination number for d-block transition metal complexes is 6. The coordination number does not distinguish the geometry of such complexes, i.e. octahedral vs trigonal prismatic. For transition metal complexes, coordination numbers range from 2 (e.g., AuI in Ph3PAuCl) to 9 (e.g., ReVII in [ReH9]2−). Metals in the f-block (the lanthanoids and actinoids) can accommodate higher coordination number due to their greater ionic radii and availability of more orbitals for bonding. Coordination numbers of 8 to 12 are commonly observed for f-block elements. For example, with bidentate nitrate ions as ligands, CeIV and ThIV form the 12-coordinate ions [Ce(NO3)6]2− (ceric ammonium nitrate) and [Th(NO3)6]2−. When the surrounding ligands are much smaller than the central atom, even higher coordination numbers may be possible. One computational chemistry study predicted a particularly stable ion composed of a central lead ion coordinated with n
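The counting rule described above can be sketched in a few lines. This is an illustrative sketch, not code from any chemistry library: each ligand contributes its count multiplied by its denticity, i.e. the number of donor atoms through which it binds the central atom.

```python
# Illustrative sketch: coordination number as a ligand count.
# Each entry is (ligand name, how many of it, donor atoms per ligand).
def coordination_number(ligands):
    return sum(count * denticity for _, count, denticity in ligands)

# [Cr(NH3)2Cl2Br2]- : six monodentate ligands -> hexacoordinate
cr_complex = [("NH3", 2, 1), ("Cl", 2, 1), ("Br", 2, 1)]
print(coordination_number(cr_complex))  # 6

# [Ce(NO3)6]2- : six bidentate nitrate ligands -> 12-coordinate
ce_complex = [("NO3", 6, 2)]
print(coordination_number(ce_complex))  # 12
```

The nitrate example shows why denticity matters: six ligands, but twelve bonded donor atoms.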
https://en.wikipedia.org/wiki/Underground%20stem
Underground stems are modified plant parts that derive from stem tissue but exist under the soil surface. They function as storage tissues for food and nutrients, facilitate the propagation of new clones, and aid in perennation (survival from one growing season to the next). Types of underground stems include bulbs, corms, rhizomes, stolons, and tubers. Plants have two structures or axes of growth, which can be best seen from seed germination and growth. Seedlings develop two axes of growth: stems, which develop upward out of the soil, and roots, which develop downward. The roots are modified to have root hairs and branch indiscriminately with cells that take in water and nutrients, while the stems are modified to move water and nutrients to and from the leaves and flowers. Stems have nodes with buds where leaves and flowers arise at specific locations, while roots do not. Plants use underground stems to multiply by asexual reproduction and to survive from one year to the next, usually through dormancy. Some plants produce stems modified to store energy and preserve a location of potential growth, allowing them to survive a cold or dry period that is normally a time of inactive growth; when that period is over, the plants resume new growth from the underground stems. Being underground protects the stems from the elements during the dormancy period, such as freezing and thawing in winter, extreme heat and drought in summer, or other potentially harmful conditions such as fire. Underground stems can also protect plants from heavy grazing pressure: the plant may be eaten to the ground, but new growth can emerge from belowground stems that herbivores cannot reach. Several plants, including weedy species, use underground stems to spread and colonize large areas; since these stems do not have to be strong or self-supporting, less energy and fewer resources are needed to produce them, and such plants often have more mass underground than above ground. Types of underground s
https://en.wikipedia.org/wiki/Survivin
Survivin, also called baculoviral inhibitor of apoptosis repeat-containing 5 or BIRC5, is a protein that, in humans, is encoded by the BIRC5 gene. Survivin is a member of the inhibitor of apoptosis (IAP) family. The survivin protein functions to inhibit caspase activation, thereby leading to negative regulation of apoptosis or programmed cell death. This has been shown by disruption of survivin induction pathways leading to an increase in apoptosis and a decrease in tumour growth. The survivin protein is expressed highly in most human tumours and fetal tissue, but is completely absent in terminally differentiated cells. These data suggest survivin might provide a new target for cancer therapy that would discriminate between transformed and normal cells. Survivin expression is also highly regulated by the cell cycle and is only expressed in the G2-M phase. It is known that survivin localizes to the mitotic spindle by interaction with tubulin during mitosis and may play a contributing role in regulating mitosis. The molecular mechanisms of survivin regulation are still not well understood, but regulation of survivin seems to be linked to the p53 protein. It also is a direct target gene of the Wnt pathway and is upregulated by beta-catenin. IAP family of anti-apoptotic proteins Survivin is a member of the IAP family of antiapoptotic proteins. It is shown to be conserved in function across evolution as homologues of the protein are found both in vertebrates and invertebrates. The first members of the IAPs identified were the baculovirus IAPs, Cp-IAP and Op-IAP, which bind to and inhibit caspases as a mechanism that contributes to the virus's efficient infection and replication cycle in the host. Later, five more human IAPs that included XIAP, c-IAP1, c-IAP2, NAIP, and survivin were discovered. Survivin, like the others, was discovered by its structural homology to the IAP family of proteins in human B-cell lymphoma. The human IAPs XIAP, c-IAP1, and c-IAP2 have been shown to bind t
https://en.wikipedia.org/wiki/Shrinking%20city
Shrinking cities are dense cities that have experienced a notable population loss; the phenomenon is also known as urban depopulation. Emigration is a common reason for city shrinkage. Since the infrastructure of such cities was built to support a larger population, its maintenance can become a serious concern. A related phenomenon is counterurbanization. Definition Origins The phenomenon of shrinking cities generally refers to a metropolitan area that experiences significant population loss in a short period of time. The process is also known as counterurbanization, metropolitan deconcentration, and metropolitan turnaround. It was popularized in reference to Eastern Europe post-socialism, when old industrial regions came under Western privatization and capitalism. Shrinking cities in the United States, on the other hand, have been forming since 2006 in dense urban centers while external suburban areas continue to grow. Suburbanization in tandem with deindustrialization, human migration, and the 2008 Great Recession all contribute to the origins of shrinking cities in the U.S. Scholars estimate that one in six to one in four cities worldwide are shrinking, both in countries with expanding economies and in those undergoing deindustrialization. However, there are some issues with the concept of shrinking cities, as it seeks to group together areas that undergo depopulation for a variety of complex reasons. These may include an aging population, shifting industries, intentional shrinkage to improve quality of life, or a transitional phase, all of which require different responses and plans. Causes There are various theoretical explanations for the shrinking city phenomenon. Hollander et al. and Glazer cite railroads in port cities, the depreciation of national infrastructure (i.e., highways), and suburbanization as possible causes of de-urbanization. Pallagst also suggests that shrinkage is a response to deindustrialization, as jobs move from the city core to cheaper land on the periphery. This case has been obser
https://en.wikipedia.org/wiki/DMARC
Domain-based Message Authentication, Reporting and Conformance (DMARC) is an email authentication protocol. It is designed to give email domain owners the ability to protect their domain from unauthorized use, commonly known as email spoofing. The purpose and primary outcome of implementing DMARC is to protect a domain from being used in business email compromise attacks, phishing email, email scams and other cyber threat activities. Once the DMARC DNS entry is published, any receiving email server can authenticate the incoming email based on the instructions published by the domain owner within the DNS entry. If the email passes the authentication, it will be delivered and can be trusted. If the email fails the check, depending on the instructions held within the DMARC record the email could be delivered, quarantined or rejected. DMARC extends two existing email authentication mechanisms, Sender Policy Framework (SPF) and DomainKeys Identified Mail (DKIM). It allows the administrative owner of a domain to publish a policy in their DNS records to specify how to check the From: field presented to end users and how the receiver should deal with failures, and provides a reporting mechanism for actions performed under those policies. DMARC is defined in the Internet Engineering Task Force's published document RFC 7489, dated March 2015, as "Informational". Overview A DMARC policy allows a sender's domain to indicate that their email messages are protected by SPF and/or DKIM, and tells a receiver what to do if neither of those authentication methods passes – such as to reject the message or quarantine it. The policy can also specify how an email receiver can report back to the sender's domain about messages that pass and/or fail. These policies are published in the public Domain Name System (DNS) as text TXT records. DMARC does not directly address whether or not an email is spam or otherwise fraudulent. Instead, DMARC can require that a message not only pass DKIM or SPF val
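A DMARC policy is published as a DNS TXT record at the `_dmarc` subdomain. The following is an illustrative sketch for a hypothetical domain `example.com`, using standard policy tags (`v`, `p`, `rua`, `pct`):

```
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com; pct=100"
```

Here `p=quarantine` asks receivers to quarantine mail that fails both SPF and DKIM alignment, `rua` nominates a mailbox for aggregate reports, and `pct` applies the policy to 100% of failing messages.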
https://en.wikipedia.org/wiki/Bradbury%E2%80%93Nielsen%20shutter
A Bradbury–Nielsen shutter (or Bradbury–Nielsen gate) is a type of electrical ion gate, which was first proposed in an article by Norris Bradbury and Russel A. Nielsen, where they used it as an electron filter. Today they are used in the field of mass spectrometry, in both TOF mass spectrometers and ion mobility spectrometers, as well as Hadamard transform mass spectrometers (a variant of TOF-MS). The Bradbury–Nielsen shutter is ideal for injecting short pulses of ions and can be used to improve the mass resolution of TOF instruments by reducing the initial pulse size as compared to other methods of ion injection. Theory of operation The concept behind the Bradbury–Nielsen shutter is to apply a high frequency voltage in a 180° out-of-phase manner to alternate wires in a grid which is orthogonal to the path of the ion beam. This results in charged particles only passing directly through the shutter at certain times in the voltage phase (φ=nπ/2), when the potential difference between the grid wires is zero. At other times the ion beam is deflected to some angle by the potential difference between the neighboring wires. This deflection is divergent, with ions that pass through alternate slits being deflected in opposite directions. The maximum deflection angle can be calculated by tan α = k Vp / V0, where α is the deflection angle, k is a deflection constant, Vp is the wire voltage (+Vp on one wire set and −Vp on the other), and V0 is the ion acceleration voltage. The deflection constant k can be calculated by k = π / (2 ln[cot(πR/(2d))]), where R is the wire radius and d is the wire spacing. Micromachined ion gates A Bradbury–Nielsen gate micromachined from a silicon-on-insulator wafer has been reported.
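The two formulas above can be combined in a short numerical sketch. The sample wire radius, spacing, and voltages below are invented for illustration:

```python
import math

def deflection_constant(R, d):
    """k = pi / (2 ln[cot(pi R / (2 d))]) for wire radius R and spacing d."""
    cot = 1.0 / math.tan(math.pi * R / (2.0 * d))
    return math.pi / (2.0 * math.log(cot))

def deflection_angle_deg(Vp, V0, k):
    """tan(alpha) = k Vp / V0; returns alpha in degrees."""
    return math.degrees(math.atan(k * Vp / V0))

# Example: 10 um wire radius, 100 um spacing (only the ratio R/d matters):
k = deflection_constant(R=0.01, d=0.1)
print(round(k, 3))  # 0.852

# +/-10 V on the wire sets, 1 kV ion acceleration voltage:
print(round(deflection_angle_deg(Vp=10, V0=1000, k=k), 2))  # 0.49
```

As the ratio R/d grows, cot(πR/2d) shrinks toward 1, its logarithm toward 0, and k (hence the deflection) increases.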
https://en.wikipedia.org/wiki/Perceptual%20Evaluation%20of%20Audio%20Quality
Perceptual Evaluation of Audio Quality (PEAQ) is a standardized algorithm for objectively measuring perceived audio quality, developed in 1994-1998 by a joint venture of experts within Task Group 6Q of the International Telecommunication Union's Radiocommunication Sector (ITU-R). It was originally released as ITU-R Recommendation BS.1387 in 1998 and last updated in 2023. It utilizes software to simulate perceptual properties of the human ear and then integrates multiple model output variables into a single metric. PEAQ characterizes the perceived audio quality as subjects would do in a listening test according to ITU-R BS.1116. PEAQ results principally model mean opinion scores that cover a scale from 1 (bad) to 5 (excellent). The Subjective Difference Grade (SDG), which measures the degree of compression damage (impairment) is defined as the difference between the opinion scores of tested version and the reference (source). The SDG typically ranges from 0 (no perceived impairment) to -4 (terrible impairment). The Objective Difference Grade (ODG) is the actual output of the algorithm, designed to match SDG. Motivation The need to conserve bandwidth has led to developments in the compression of the audio data to be transmitted. Various encoding methods remove both redundancy and perceptual irrelevancy in the audio signal so that the bit rate required to encode the signal is significantly reduced. They take into account knowledge of human auditory perception and typically achieve a reduced bit rate by ignoring audio information that is not likely to be heard by most listeners. Traditional audio measurements like frequency response based on sinusoidal sweeps, S/N, THD+N do not necessarily correlate well with the audio codec quality. A psychoacoustic model must be used to predict how the information is masked by louder audio content adjacent in time and frequency. Since subjective listening tests are time-consuming, expensive and impractical for everyday use, it was
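The SDG definition amounts to a simple difference of grades on the BS.1116 impairment scale. An illustrative sketch (the sample grades are invented):

```python
# Subjective Difference Grade: grade of the item under test minus
# grade of the reference, each on the 5-point ITU-R BS.1116 scale.
# 0 means no perceived impairment; -4 means maximal impairment.
def subjective_difference_grade(grade_test, grade_reference):
    return grade_test - grade_reference

# A test item graded 3 against a reference graded 5:
print(subjective_difference_grade(3, 5))  # -2
```

PEAQ's Objective Difference Grade is the algorithm's estimate of this quantity, so a well-calibrated ODG should track the SDG obtained from listening tests.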
https://en.wikipedia.org/wiki/List%20of%20cosmetic%20ingredients
Ingredients of cosmetic products are listed following International Nomenclature of Cosmetic Ingredients (INCI). These INCI names often differ greatly from systematic chemical nomenclature or from more common trivial names. The below tables are sorted as follows: A B C D E G H I L M N P S T
https://en.wikipedia.org/wiki/Cauchy%27s%20theorem%20%28geometry%29
Cauchy's theorem is a theorem in geometry, named after Augustin Cauchy. It states that convex polytopes in three dimensions with congruent corresponding faces must be congruent to each other. That is, any polyhedral net formed by unfolding the faces of the polyhedron onto a flat surface, together with gluing instructions describing which faces should be connected to each other, uniquely determines the shape of the original polyhedron. For instance, if six squares are connected in the pattern of a cube, then they must form a cube: there is no convex polyhedron with six square faces connected in the same way that does not have the same shape. This is a fundamental result in rigidity theory: one consequence of the theorem is that, if one makes a physical model of a convex polyhedron by connecting together rigid plates for each of the polyhedron's faces with flexible hinges along the polyhedron's edges, then this ensemble of plates and hinges will necessarily form a rigid structure. Statement Let P and Q be combinatorially equivalent 3-dimensional convex polytopes; that is, they are convex polytopes with isomorphic face lattices. Suppose further that each pair of corresponding faces from P and Q are congruent to each other, i.e. equal up to a rigid motion. Then P and Q are themselves congruent. To see that convexity is necessary, consider a regular icosahedron. One can "push in" a vertex to create a nonconvex polyhedron that is still combinatorially equivalent to the regular icosahedron. Another way to see this is to take the pentagonal pyramid around a vertex and reflect it with respect to its base. History The result originated in Euclid's Elements, where solids are called equal if the same holds for their faces. This version of the result was proved by Cauchy in 1813 based on earlier work by Lagrange. An error in Cauchy's proof of the main lemma was corrected by Ernst Steinitz, Isaac Jacob Schoenberg, and Aleksandr Danilovich Aleksandrov. The corrected proof of
https://en.wikipedia.org/wiki/ATM%20SafetyPIN%20software
ATM SafetyPIN software is a software application that would allow users of automated teller machines (ATMs) to alert law enforcement of a forced cash withdrawal (such as in robbery) by entering their personal identification number (PIN) in reverse order. The system was patented by Illinois lawyer Joseph Zingher. SafetyPIN is not currently used in ATM systems, despite widely circulated rumors originating from a chain letter e-mail, mainly due to issues regarding reversible PINs being incompatible with the system and potential security vulnerabilities that could arise if implemented. History The concept of a backup emergency PIN system, or duress code, for ATM systems has been around since at least July 30, 1986, when Representative Mario Biaggi, a former police officer, proposed it in the U.S. Congressional Record, pp. 18232 et seq. Biaggi then proposed House Resolution 785 in 1987 which would have had the FBI track the problem of express kidnappings and evaluate the idea of an emergency PIN system. HR785 died in committee without debate. Zingher has not been successful in marketing licenses for his patent. Police in New York, New Jersey, Ohio, Illinois, and Kansas have supported the concept. Police support prompted the Illinois legislature to pass a law making it mandatory on all ATMs in Illinois. Shortly after it was passed, the law was amended by a "follow-on" bill that changed its meaning to the exact opposite of what its supporters had sought. In 2006, an e-mail chain letter hoax circulated that claimed a reverse PIN duress code system is in place universally. American Banker reported on January 2, 2007, that no PIN-reversal duress code is used on any ATM as of that date. In September 2013 the hoax was still circulating in Australia. The same kind of e-mail chain letter hoax is still circulated on Tumblr and Facebook, as well as in India and other parts of the world. Were the system implemented, palindromic PINs such as 5555 or 2112 then would
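The duress check described above can be sketched as follows. This is purely illustrative of the proposed (never deployed) scheme, not code from any ATM system:

```python
# Illustrative sketch of the reverse-PIN duress idea.
def is_duress_entry(stored_pin: str, entered_pin: str) -> bool:
    reversed_pin = stored_pin[::-1]
    # A palindromic PIN (e.g. "5555" or "2112") reads the same both
    # ways, so entering it reversed cannot signal duress.
    if reversed_pin == stored_pin:
        return False
    return entered_pin == reversed_pin

print(is_duress_entry("1234", "4321"))  # True: reversed entry flags duress
print(is_duress_entry("2112", "2112"))  # False: palindrome has no distinct reverse
```

The palindrome branch makes concrete one of the objections raised against the scheme: some PINs simply cannot carry the duress signal.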
https://en.wikipedia.org/wiki/Wolves%20and%20moose%20on%20Isle%20Royale
The single predator-single prey relationship between wolves and moose on Isle Royale in Lake Superior is unique, and has been the subject of detailed study for over 50 years. Isle Royale, the principal island of Isle Royale National Park in Michigan in the United States, is an isolated island with little migration of animals into and out of the island, and as a national park, human interaction and impact on the two species is also limited. Both the wolves and the moose first became established populations on Isle Royale in the 1900s. Over the fifty years of the study, the populations of both moose and wolves have shown repeated spikes and declines and have not settled to a balanced relationship. The moose populations have ranged from 500 to 2500 while the number of wolves has ranged from almost 50 down to two. From 2018 to 2019, 19 wolves were released at Isle Royale in hopes of bringing stability to the ecosystem, and as of 2020, there are estimated to be 14 wolves remaining on the island. The relationship between wolves and moose on Isle Royale has been the subject of the longest-running predator–prey research study, begun in 1958. The wolves have been subject to inbreeding and carry a spinal deformity. As of the 2014 count, there were only 9 wolves on the island, with the 2015–2017 counts showing only 2. A review completed in 2014 determined that new wolves would not be introduced into the park to attempt a genetic rescue, but as of December 2016, the National Park Service had instead decided to introduce 20 to 30 wolves to the island. In 2018, three females and one male wolf from Minnesota were transferred to the island system. Isle Royale National Park is made up of about 400 islands, and is in the northwest portion of Lake Superior. It is about from Michigan's shore, and from the Canadian shore. The main island is about long, and wide at the widest point, with an area of . There are no roads, and no motorized vehicles are allowed on the island. The park is cl
https://en.wikipedia.org/wiki/North%20of%20the%20Yukon%20%28Disney%20comics%29
"North of the Yukon" is a 24-page Disney comics adventure story featuring Scrooge McDuck and his nephews, Donald Duck and Huey, Dewey, & Louie. It was written and drawn by Carl Barks. This was his last story involving Scrooge's adventures in Alaska. It was published in September 1965, and later reprinted in May 1993. Gemstone Publishing later reprinted the story again in 2005 for a Donald Duck/Uncle Scrooge graphic novel with another story inspired by this one called "Somewhere in Nowhere". The character of Barko was inspired by an actual sled dog named Balto, who participated in the 1925 serum run to Nome. Barks had read an article about Balto in an issue of National Geographic, and was inspired to create this character. Plot The story starts with a photographer from Jolt magazine, wanting to do a picture story on Scrooge. Donald is partly to blame after he tries to convince Scrooge that if he does it, he will be paid fifty thousand dollars. Scrooge refuses, saying that if more people knew that he was the Richest Duck in the World, "every chisler from Cape Town to Nome would be waylaying me!" It's not until the photographer threatens to rip a five thousand dollar check in front of McDuck that he gives in. Scrooge even tells them how he made his first billion dollars in the Alaskan Gold Rush. Soon after the magazine is published, Scrooge is shocked to see that they extended his story to "ten pages of hogwash" and even calling him "a bashful King Midas". Just as Scrooge had feared, people come up to him asking for money. Just as he thinks things cannot get any worse, an old enemy in Goldboom, Alaska sees the article. The fiend then comes up with a plot. Since Scrooge gave him an I.O.U. in 1898, the sum doubled every month for the last sixty-seven years! It isn't long until Scrooge gets a summons from the same fiend known as Soapy Slick. Scrooge remembers how Slick was a crooked moneylender who cared for nothing but money. Scrooge doesn't worry, however; he tells D
https://en.wikipedia.org/wiki/Peter%20Junger
Peter D. Junger (1933 – November 2006) was a computer law professor and Internet activist, most famous for having fought against the U.S. government's regulations of and export controls on encryption software. The case, Junger v. Daley (6th Cir. 2000), held that computer source code is protected by the First Amendment. The US government prohibited publication of encryption software on the Internet, arguing that encryption software was a "munition" subject to export controls. Junger filed suit in 1996 challenging the regulations. Junger also did significant legal theoretical work on the interplay between intellectual property, computer law, and the First Amendment. He defined himself as a "First Amendment absolutist." Biography Junger grew up in Wyoming, graduating from Harvard University in 1955 and Harvard Law School in 1958. From January 1959 to December 1960 he was an enlisted man in the U.S. Army serving in West Germany. After practicing law from 1961 to 1970, he accepted a faculty position at Case Western Reserve University's School of Law. He retired and was Professor of Law Emeritus in 2001. Junger was also a practicing Buddhist, president of his local Buddhist Temple from 2003 to 2006. Peter Junger died in November, 2006, at the age of 73 at his home in Cleveland. He was survived by his mother, Genevieve Junger (born 1901).
https://en.wikipedia.org/wiki/Fresnel%20rhomb
A Fresnel rhomb is an optical prism that introduces a 90° phase difference between two perpendicular components of polarization, by means of two total internal reflections. If the incident beam is linearly polarized at 45° to the plane of incidence and reflection, the emerging beam is circularly polarized, and vice versa. If the incident beam is linearly polarized at some other inclination, the emerging beam is elliptically polarized with one principal axis in the plane of reflection, and vice versa. The rhomb usually takes the form of a right parallelepiped — that is, a right parallelogram-based prism. If the incident ray is perpendicular to one of the smaller rectangular faces, the angle of incidence and reflection at both of the longer faces is equal to the acute angle of the parallelogram. This angle is chosen so that each reflection introduces a phase difference of 45° between the components polarized parallel and perpendicular to the plane of reflection. For a given, sufficiently high refractive index, there are two angles meeting this criterion; for example, an index of 1.5 requires an angle of 50.2° or 53.3°. Conversely, if the angle of incidence and reflection is fixed, the phase difference introduced by the rhomb depends only on its refractive index, which typically varies only slightly over the visible spectrum. Thus the rhomb functions as if it were a wideband quarter-wave plate — in contrast to a conventional birefringent (doubly-refractive) quarter-wave plate, whose phase difference is more sensitive to the frequency (color) of the light. The material of which the rhomb is made — usually glass — is specifically not birefringent. The Fresnel rhomb is named after its inventor, the French physicist Augustin-Jean Fresnel, who developed the device in stages between 1817 and 1823. During that time he deployed it in crucial experiments involving polarization, birefringence, and optical rotation, all of which contributed to the eventual acceptance of his tr
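The two angles quoted for an index of 1.5 can be checked numerically with the standard Fresnel phase-shift formula for total internal reflection. The function name and sample values below are ours, for illustration:

```python
import math

def tir_phase_difference(theta_deg, n_rel):
    """Relative s-p phase difference (degrees) on total internal
    reflection, for relative refractive index n_rel = n_out / n_in < 1."""
    t = math.radians(theta_deg)
    s2 = math.sin(t) ** 2
    return 2 * math.degrees(math.atan(
        math.cos(t) * math.sqrt(s2 - n_rel ** 2) / s2))

# Glass of index 1.5 against air: n_rel = 1/1.5.
# Both quoted angles give close to the desired 45 degree shift:
for angle in (50.2, 53.3):
    print(round(tir_phase_difference(angle, 1 / 1.5), 1))  # 45.0
```

Two reflections at either angle therefore accumulate the 90° total difference that turns 45° linear polarization into circular.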
https://en.wikipedia.org/wiki/Cambrian%20House
Cambrian House began as a crowdsourcing community that pioneered the technology to tap crowds for the best software ideas. To power open innovation in other businesses, they developed a crowdsourcing platform Chaordix – the technology to harness a crowd for breakthrough ideas. Results Launched in 2006, the original Cambrian House community attracted 50,000+ members and more than 7000 ideas from the crowd. Weaknesses in the idea-community model included the challenge of convincing users to read and rate a rapidly growing pool of ideas, the relatively low quality of some ideas, the management complexity of distributed development, and the large number of duplicate submissions. These weaknesses have been successfully overcome by other companies including InnoCentive, IStockphoto, and T-shirt design company Threadless. In the news Cambrian House has been discussed, primarily in relation to crowdsourcing in a variety of media including: CBC Radio's - The Current, The Financial Times and Economist Intelligence Unit, in addition to new media such as TechDigest and Mashable. Cambrian House is also the subject of an eponymous Harvard Business case, which examines and discusses the concept of peer production, the crowdsourcing business model, and the possibility of using prediction markets to estimate market demand for products. The company has also gained a degree of notoriety for its unconventional promotional efforts, including feeding 1000 pizzas to Google unannounced, auctioning a celebrity endorsed laptop for charity, and weighing a goat
https://en.wikipedia.org/wiki/Minimal%20subtraction%20scheme
In quantum field theory, the minimal subtraction scheme, or MS scheme, is a particular renormalization scheme used to absorb the infinities that arise in perturbative calculations beyond leading order, introduced independently by Gerard 't Hooft and Steven Weinberg in 1973. The MS scheme consists of absorbing only the divergent part of the radiative corrections into the counterterms. In the similar and more widely used modified minimal subtraction, or MS-bar scheme, one absorbs the divergent part plus a universal constant that always arises along with the divergence in Feynman diagram calculations into the counterterms. When using dimensional regularization, i.e. d = 4 − 2ε, it is implemented by rescaling the renormalization scale: μ² → μ² e^(γ_E)/(4π), with γ_E the Euler–Mascheroni constant.
https://en.wikipedia.org/wiki/Renormalon
In physics, a renormalon (a term suggested by 't Hooft) is a particular source of divergence seen in perturbative approximations to quantum field theories (QFT). When a formally divergent series in a QFT is summed using Borel summation, the associated Borel transform of the series can have singularities as a function of the complex transform parameter. The renormalon is a possible type of singularity arising in this complex Borel plane, and is a counterpart of an instanton singularity. Associated with such singularities, renormalon contributions are discussed in the context of quantum chromodynamics (QCD) and usually have the power-like form (Λ/Q)^p as functions of the momentum Q (here Λ is the momentum cut-off). They are cited against the usual logarithmic effects like ln(Λ/Q). Brief history Perturbation series in quantum field theory are usually divergent, as was first indicated by Freeman Dyson. According to the Lipatov method, the n-th order contribution of perturbation theory into any quantity can be evaluated at large n in the saddle-point approximation for functional integrals and is determined by instanton configurations. This contribution behaves usually as n! in dependence on n and is frequently associated with approximately the same (n!) number of Feynman diagrams. Lautrup has noted that there exist individual diagrams giving approximately the same contribution. In principle, it is possible that such diagrams are automatically taken into account in Lipatov's calculation, because its interpretation in terms of diagrammatic technique is problematic. However, 't Hooft put forward a conjecture that Lipatov's and Lautrup's contributions are associated with different types of singularities in the Borel plane, the former with instanton ones and the latter with renormalon ones. Existence of instanton singularities is beyond any doubt, while existence of renormalon ones was never proved rigorously in spite of numerous efforts. Among the essential contributions one should mention the
https://en.wikipedia.org/wiki/Consensus%20theorem
In Boolean algebra, the consensus theorem or rule of consensus is the identity: xy ∨ ¬xz ∨ yz = xy ∨ ¬xz. The consensus or resolvent of the terms xy and ¬xz is yz. It is the conjunction of all the unique literals of the terms, excluding the literal that appears unnegated in one term and negated in the other. If y includes a term that is negated in z (or vice versa), the consensus term yz is false; in other words, there is no consensus term. The conjunctive dual of this equation is: (x ∨ y)(¬x ∨ z)(y ∨ z) = (x ∨ y)(¬x ∨ z). Proof Consensus The consensus or consensus term of two conjunctive terms of a disjunction is defined when one term contains the literal a and the other the literal ¬a, an opposition. The consensus is the conjunction of the two terms, omitting both a and ¬a, and repeated literals. For example, the consensus of ¬xyz and w¬y is ¬xwz. The consensus is undefined if there is more than one opposition. For the conjunctive dual of the rule, the consensus y ∨ z can be derived from (x ∨ y) and (¬x ∨ z) through the resolution inference rule. This shows that the LHS is derivable from the RHS (if A → B then A → AB; replacing A with RHS and B with (y ∨ z)). The RHS can be derived from the LHS simply through the conjunction elimination inference rule. Since RHS → LHS and LHS → RHS (in propositional calculus), then LHS = RHS (in Boolean algebra). Applications In Boolean algebra, repeated consensus is the core of one algorithm for calculating the Blake canonical form of a formula. In digital logic, including the consensus term in a circuit can eliminate race hazards. History The concept of consensus was introduced by Archie Blake in 1937, related to the Blake canonical form. It was rediscovered by Samson and Mills in 1954 and by Quine in 1955. Quine coined the term 'consensus'. Robinson used it for clauses in 1965 as the basis of his "resolution principle".
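The consensus operation lends itself to a short sketch. Here a conjunctive term is modeled as a frozenset of literal strings, with "~" marking negation; this representation is our choice for illustration:

```python
def consensus(term1, term2):
    """Consensus of two conjunctive terms (frozensets of literals).

    A literal is a variable name, optionally prefixed with '~' for
    negation. Returns None when the consensus is undefined, i.e.
    when there is not exactly one opposition.
    """
    def negate(lit):
        return lit[1:] if lit.startswith("~") else "~" + lit

    oppositions = {lit for lit in term1 if negate(lit) in term2}
    if len(oppositions) != 1:
        return None  # zero or multiple oppositions: no consensus term
    lit = oppositions.pop()
    # Conjoin both terms, dropping the opposed pair; duplicate
    # literals merge automatically because terms are sets.
    return (term1 | term2) - {lit, negate(lit)}

# Consensus of x.y and (~x).z is y.z:
print(sorted(consensus(frozenset({"x", "y"}), frozenset({"~x", "z"}))))  # ['y', 'z']
# Undefined when there are two oppositions:
print(consensus(frozenset({"x", "y"}), frozenset({"~x", "~y"})))  # None
```

Repeated application of this operation, together with absorption, is the heart of the Blake canonical form algorithm mentioned above.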
https://en.wikipedia.org/wiki/NtrC
NtrC (Nitrogen regulatory protein C) is the protein required for RNA polymerase containing the prokaryotic transcription factor sigma N (sigma 54) to form an open complex and thereby activate glnA transcription. The closed → open conformational change of the sigma N–RNA polymerase complex around the glutamine synthetase gene promoter requires ATP and involves the formation of a loop between the enhancer and the promoter regions, which may be facilitated by DNA-bending proteins (such as IHF). The NtrC proteins bind at two sites located at positions −160 and −80 upstream of the transcription start point.
https://en.wikipedia.org/wiki/Dense-in-itself
In general topology, a subset A of a topological space X is said to be dense-in-itself or crowded if A has no isolated point. Equivalently, A is dense-in-itself if every point of A is a limit point of A. Thus A is dense-in-itself if and only if A ⊆ A′, where A′ is the derived set of A.

A dense-in-itself closed set is called a perfect set. (In other words, a perfect set is a closed set without isolated points.)

The notion of a dense set is distinct from dense-in-itself. This can sometimes be confusing, as "X is dense in X" (always true) is not the same as "X is dense-in-itself" (X has no isolated point).

Examples

A simple example of a set that is dense-in-itself but not closed (and hence not a perfect set) is the set of irrational numbers (considered as a subset of the real numbers). This set is dense-in-itself because every neighborhood of an irrational number contains at least one other irrational number. On the other hand, the set of irrationals is not closed because every rational number lies in its closure. Similarly, the set of rational numbers is also dense-in-itself but not closed in the space of real numbers.

The above examples, the irrationals and the rationals, are also dense sets in their topological space, namely ℝ. As an example that is dense-in-itself but not dense in its topological space, consider ℚ ∩ [0, 1]. This set is not dense in ℝ but is dense-in-itself.

Properties

A singleton subset of a space X can never be dense-in-itself, because its unique point is isolated in it. The dense-in-itself subsets of any space are closed under unions. In a dense-in-itself space, they include all open sets. In a dense-in-itself T1 space they include all dense sets. However, spaces that are not T1 may have dense subsets that are not dense-in-itself: for example, in the space X = {a, b} with the indiscrete topology, the set A = {a} is dense, but is not dense-in-itself.

The closure of any dense-in-itself set is a perfect set. In general, the intersection of two dense-in-itself sets is not dense-in-itself. But
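For finite spaces the definitions can be checked mechanically, since a topology is just an explicit list of open sets. The toy sketch below (all names are mine) reproduces the indiscrete-topology example from the Properties section, where a singleton is dense in the space but not dense-in-itself:

```python
def isolated_points(opens, A):
    """Points a in A for which some open set U satisfies U & A == {a}."""
    A = frozenset(A)
    return {a for a in A
            if any(frozenset(U) & A == {a} for U in opens)}

def is_dense_in_itself(opens, A):
    """A is dense-in-itself iff it has no isolated point."""
    return not isolated_points(opens, A)

def is_dense(opens, A):
    """A is dense in the space iff every nonempty open set meets A."""
    A = frozenset(A)
    return all((not U) or (frozenset(U) & A) for U in opens)

# Indiscrete topology on X = {1, 2}: the only open sets are {} and X.
indiscrete = [set(), {1, 2}]
```

Here {1} is dense (the only nonempty open set, X itself, meets it) yet its single point is isolated in it, so it is not dense-in-itself.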
https://en.wikipedia.org/wiki/Substrate%20%28biology%29
In biology, a substrate is the surface on which an organism (such as a plant, fungus, or animal) lives. A substrate can include biotic or abiotic materials and animals. For example, encrusting algae that lives on a rock (its substrate) can itself be a substrate for an animal that lives on top of the algae. Inert substrates are used as growing support materials in the hydroponic cultivation of plants. In biology, substrates are often activated by the nanoscopic process of substrate presentation.

In agriculture and horticulture

Cellulose substrate
Expanded clay aggregate (LECA)
Rock wool
Potting soil
Soil

In animal biotechnology

Requirements for animal cell and tissue culture

Requirements for animal cell and tissue culture are the same as described for plant cell, tissue and organ culture (In Vitro Culture Techniques: The Biotechnological Principles). Desirable requirements are (i) an air-conditioned room, (ii) a hot room with temperature recorder, (iii) a microscope room for carrying out microscopic work, where different types of microscopes should be installed, (iv) a dark room, (v) a service room, (vi) a sterilization room for sterilization of glassware and culture media, and (vii) a preparation room for media preparation, etc. In addition, there should be storage areas where the following are kept properly: (i) liquids - ambient (4-20°C), (ii) glassware - shelving, (iii) plastics - shelving, (iv) small items - drawers, (v) specialized equipment - cupboard, slow turnover, (vi) chemicals - lidded containers.

For cell growth

There are many types of vertebrate cells that require support for their growth in vitro, otherwise they will not grow properly. Such cells are called anchorage-dependent cells. Therefore, many substrates, which may be of adhesive (e.g. plastic, glass, palladium, metallic surfaces, etc.) or non-adhesive (e.g. agar, agarose, etc.) types, may be used as discussed below: Plastic as a substrate. Disposable plastics are a cheaper substrate, as they are commonly made
https://en.wikipedia.org/wiki/IBM%207080
The IBM 7080 was a variable word length BCD transistor computer in the IBM 700/7000 series commercial architecture line, introduced in August 1961, that provided an upgrade path from the vacuum tube IBM 705 computer. The 7080 weighed about .

After the introduction of the IBM 7070, in June 1960, as an upgrade path for both the IBM 650 and IBM 705 computers, IBM realized that it was so incompatible with the 705 that few users of that system wanted to upgrade to the 7070. That prompted the development of the 7080, which was fully compatible with all models of the 705 and added many improvements.

IBM 705 compatibility modes

For backward compatibility with the IBM 705 the machine had two switches on the operator's control panel, 705 I-II and 40K memory, that selected the mode the machine started in.

705 I mode — 20,000 characters (705 I-II On, 40K memory Off): indirect addressing disabled, communication channels disabled
705 II mode — 40,000 characters (705 I-II On, 40K memory On): indirect addressing disabled, communication channels disabled
705 III mode — 40,000 characters (705 I-II Off, 40K memory On): indirect addressing enabled, communication channels enabled
705 III mode — 80,000 characters (705 I-II Off, 40K memory Off): indirect addressing enabled, communication channels enabled

Software can then command the 7080 to enter full 7080 mode from any 705 startup mode.

7080 mode — 160,000 characters: indirect addressing enabled, communication channels enabled

Regardless of mode, the 7080 operates at full 7080 speed. The 7080 system included the IBM 7305 Central Storage and I/O Control unit, the IBM 7102 Arithmetic and Logical unit, the IBM 7302 Core Storage unit, the IBM 7153 Console Control unit, and the IBM 7804 Power unit. The IBM 7622 Signal Control unit could be added to the system to convert transistor signal levels to levels usable with first generation equipment, allowing all 705 peripherals, including punched card input/output w
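As a reading aid only, the two-switch startup table can be transcribed into a small decoder. This is a modern Python illustration of the documented switch behaviour (all names are mine); it is in no way representative of 7080 software:

```python
def startup_mode(sw_705_I_II_on, sw_40k_on):
    """Map the console switch positions (705 I-II, 40K memory) to the
    705-compatibility startup mode: (mode name, capacity in characters,
    indirect addressing enabled, communication channels enabled)."""
    if sw_705_I_II_on and not sw_40k_on:
        return ("705 I", 20_000, False, False)
    if sw_705_I_II_on and sw_40k_on:
        return ("705 II", 40_000, False, False)
    if not sw_705_I_II_on and sw_40k_on:
        return ("705 III", 40_000, True, True)
    return ("705 III", 80_000, True, True)   # both switches Off
```

Note that full 7080 mode (160,000 characters) is not reachable from the switches alone; per the text above it is entered under software command from any startup mode.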
https://en.wikipedia.org/wiki/Three-process%20view
The three-process view is a psychological term coined by Janet E. Davidson and Robert Sternberg. According to this concept, there are three kinds of insight: selective-encoding, selective-comparison, and selective-combination. Selective-encoding insight – Distinguishing what is important in a problem and what is irrelevant. (i.e. filter) Selective-comparison insight – Identifying information by finding a connection between acquired knowledge and experience. Selective-combination insight – Identifying a problem through understanding the different components and putting everything together.
https://en.wikipedia.org/wiki/Discitis
Discitis, or diskitis, is an infection in the intervertebral disc space that affects different age groups. Symptoms include severe back pain, leading to lack of mobility. In adults, it can lead to severe consequences, such as sepsis or epidural abscess, but it can also spontaneously resolve, especially in children under 8 years of age. Discitis occurs post-surgically in approximately 1–2 percent of patients after spinal surgery. There is debate as to the cause. Diagnosis is usually apparent on MRI, although plain X-rays and CT examinations can be suggestive. Treatment usually includes antibiotics, and reducing the mobility of the affected region. Description Discitis is an infection in the intervertebral disc space. It affects different age groups. Signs and symptoms Symptoms include severe back pain, leading to lack of mobility. In adults, it can lead to severe consequences, such as sepsis or epidural abscess, but it can also spontaneously resolve, especially in children under 8 years of age. Discitis occurs post-surgically in approximately 1–2 percent of patients after spinal surgery. Some very young children may refuse to walk and arching of the back is possible. In post-operative situations, the symptoms occur within a week and result in severe low back pain or neck pain (depending on the surgical location). If untreated, the discitis may resolve on its own, causing spontaneous fusion of the intervertebral disc space, cause a chronic low grade infection, or progress to osteomyelitis and possibly even an epidural abscess. In case of concomitant inflammation of one or more vertebrae (in such cases usually involving the areas adjacent to the intervertebral disc spaces) the condition is called spondylodiscitis. Causes There is debate as to the cause, although hematogenous seeding of the offending organism is favored as well as direct spread. Spontaneous discitis is usually from hematologic spread from a urinary or respiratory infection while discitis from a pos
https://en.wikipedia.org/wiki/List%20of%20Bohol%20provincial%20symbols
The following is a list of symbols of Bohol province, Philippines. Provincial symbols
https://en.wikipedia.org/wiki/Frank%20Kelly%20%28mathematician%29
Francis Patrick Kelly, CBE, FRS (born 28 December 1950) is Professor of the Mathematics of Systems at the Statistical Laboratory, University of Cambridge. He served as Master of Christ's College, Cambridge from 2006 to 2016.

Kelly's research interests are in random processes, networks and optimisation, especially in very large-scale systems such as telecommunication or transportation networks. In the 1980s, he worked with colleagues in Cambridge and at British Telecom's Research Labs on Dynamic Alternative Routing in telephone networks, which was implemented in BT's main digital telephone network. He has also applied the economic theory of pricing to congestion control and fair resource allocation in the internet. From 2003 to 2006 he served as Chief Scientific Advisor to the United Kingdom Department for Transport.

Kelly was elected a Fellow of the Royal Society in 1989. In December 2006 he was elected 37th Master of Christ's College, Cambridge. He was appointed Commander of the Order of the British Empire (CBE) in the 2013 New Year Honours for services to mathematical science.

Awards

1979 Davidson Prize of the University of Cambridge
1989 Guy Medal in Silver of the Royal Statistical Society
1989 Fellow of the Royal Society
1992 Lanchester Prize of the Institute for Operations Research and the Management Sciences
1997 Naylor Prize of the London Mathematical Society
2001 Honorary D.Sc. from Heriot-Watt University
2005 IEEE Koji Kobayashi Computers and Communications Award
2006 Companionship of OR by the Operational Research Society
2008 John von Neumann Theory Prize of the Institute for Operations Research and the Management Sciences
2009 SIGMETRICS Achievement Award
2009 EURO Gold Medal from European Operational Research Society
2013 Commander of the Order of the British Empire in the New Years Honours List for "services to mathematical sciences"
2015 IEEE Alexander Graham Bell Medal, for "Creating principled mathematical foundations for the d
https://en.wikipedia.org/wiki/Insect%20sting%20allergy
Insect sting allergy is the term commonly given to the allergic response of an animal in response to the bite or sting of an insect. Typically, insects which generate allergic responses are either stinging insects (wasps, bees, hornets and ants ) or biting insects (mosquitoes, ticks). Stinging insects inject venom into their victims, whilst biting insects normally introduce anti-coagulants into their victims. The great majority of insect allergic animals just have a simple allergic response – a reaction local to the sting site which appears as just a swelling arising from the release of histamine and other chemicals from the body tissues near to the sting site. The swelling, if allergic, can be helped by the provision of an anti-histamine ointment as well as an ice pack. This is the typical response for all biting insects and many people have this common reaction. Mosquito allergy may result in a collection of symptoms called skeeter syndrome that occur after a bite. This syndrome may be mistaken for an infection such as cellulitis. In anaphylactic patients the response is more aggressive leading to a systemic reaction where the response progresses from the sting site around the whole body. This is potentially something very serious and can lead to anaphylaxis, which is potentially life-threatening. Epidemiology The majority of individuals who receive a sting from an insect experience local reactions. It is estimated that 5-10% of individuals will experience a generalized systemic reaction that can involve symptoms ranging from hives to wheezing and even anaphylaxis. In the United States approximately 40 people die each year from anaphylaxis due to stinging insect allergy. Potentially life-threatening reactions occur in 3% of adults and 0.4–0.8% of children. Prevention A 2012 meta-analysis found that venom immunotherapy is an effective prophylactic treatment against insect bite and sting allergic reactions, and significantly improves the quality of life of pe
https://en.wikipedia.org/wiki/Constructive%20perception
Constructive perception is the theory of perception in which the perceiver uses sensory information and other sources of information to construct a cognitive understanding of a stimulus. In contrast to this top-down approach, there is the bottom-up approach of direct perception. Perception is more of a hypothesis, and the evidence to support this is that "perception allows behaviour to be generally appropriate to non-sensed object characteristics," meaning that we react to obvious things that, for example, are like doors even though we only see a "long, narrow rectangle as the door is ajar."

Also known as intelligent perception, constructive perception shows the relationship between intelligence and perception. This comes from the importance of high-order thinking and learning in perception. During perception, hypotheses are formed and tested about percepts that are based on three things: sensory data, knowledge, and high-level cognitive processes. Visual sensations are usually correctly attributed because we unconsciously assimilate information from many sources and then unconsciously make judgments based on this information. The philosophy of Immanuel Kant explains that our perception of the world is reciprocal; it both is affected by and affects our experience of the world.

Evidence of constructive perception

Context effects are not explained by bottom-up accounts of perception. Irving Biederman performed experiments that demonstrated dramatic context effects. Similarly, Stephen Palmer carried out an experiment in which the participants were asked to identify objects after they were shown either a relevant or an irrelevant context. They might be shown a scene of a baseball game, followed by images of a baseball, a car, and a phone. The stimulus most relevant to the context, the baseball, was recognized more quickly than those that were irrelevant, the car and the phone.

Perceptual constancy gives evidence that high-level constructive processes occur during perceptio
https://en.wikipedia.org/wiki/Immunohaematology
Immunohematology is a branch of hematology and transfusion medicine which studies antigen-antibody reactions and analogous phenomena as they relate to the pathogenesis and clinical manifestations of blood disorders. A person employed in this field is referred to as an immunohematologist. Their day-to-day duties include blood typing, cross-matching and antibody identification.

Immunohematology and Transfusion Medicine is a medical postgraduate specialty in many countries. The specialist Immunohematology and Transfusion Physician provides expert opinion for difficult transfusions, massive transfusions, incompatibility work-up, therapeutic plasmapheresis, cellular therapy, irradiated blood therapy, leukoreduced and washed blood products, stem cell procedures, platelet-rich plasma therapies, HLA and cord blood banking. Other research avenues are in the fields of stem cell research, regenerative medicine and cellular therapy.

Immunohematology is one of the specialized branches of medical science. It deals with the concepts and clinical techniques related to modern transfusion therapy. Efforts to save human lives by transfusing blood have been recorded for several centuries. The era of blood transfusion, however, really began when William Harvey described the circulation of blood in 1616.

See also
Clinical laboratory scientist
Transfusion medicine
https://en.wikipedia.org/wiki/Human%20body%20temperature
Normal human body-temperature (normothermia, euthermia) is the typical temperature range found in humans. The normal human body temperature range is typically stated as . Human body temperature varies. It depends on sex, age, time of day, exertion level, health status (such as illness and menstruation), what part of the body the measurement is taken at, state of consciousness (waking, sleeping, sedated), and emotions. Body temperature is kept in the normal range by a homeostatic function known as thermoregulation, in which adjustment of temperature is triggered by the central nervous system. Methods of measurement Taking a human's temperature is an initial part of a full clinical examination. There are various types of medical thermometers, as well as sites used for measurement, including: In the rectum (rectal temperature) In the mouth (oral temperature) Under the arm (axillary temperature) In the ear (tympanic temperature) On the skin of the forehead over the temporal artery Using heat flux sensors Variations Temperature control (thermoregulation) is a homeostatic mechanism that keeps the organism at optimum operating temperature, as the temperature affects the rate of chemical reactions. In humans, the average internal temperature is widely accepted to be 37 °C (98.6 °F), a "normal" temperature established in the 1800s. But newer studies show that average internal temperature for men and women is . No person always has exactly the same temperature at every moment of the day. Temperatures cycle regularly up and down through the day, as controlled by the person's circadian rhythm. The lowest temperature occurs about two hours before the person normally wakes up. Additionally, temperatures change according to activities and external factors. In addition to varying throughout the day, normal body temperature may also differ as much as from one day to the next, so that the highest or lowest temperatures on one day will not always exactly match the high
https://en.wikipedia.org/wiki/Nigrescence
Nigrescence is a word with a Latin origin. It describes a process of becoming black or developing a racial identity. Nigrescence extends through history and impacts those victimized by racism and white supremacy. Recent psychological adaptations instigated identity formation for persons of African American descent. The process of enslavement typically included deliberate and forceful repression of traditional languages and mental development to stifle the desire for freedom and to make freedom feel unattainable and unrealistic. Slave owners knew that physical restraints were never as effective as broken spirits. Hundreds of years later, the descendants of the African diaspora struggle to process any form of trauma, which typically results in delayed progress in emotional development.

Professor William E. Cross Jr. included a theory of Nigrescence in his groundbreaking book Shades of Black: Diversity in African American Identity, which was published in 1991. His theory assumed that African Americans are "believed to be socialized into the predominant culture, which resulted in diminished racial identification", and thus the Nigrescence model posits that an encounter with an instance of racism or racial discrimination may precipitate the exploration and formation of racial identity, and foster a deeper understanding of the role race plays in the lives of African Americans. African Americans then proceed through a series of distinct psychological stages as they move from self-degradation to self-pride over time. This model is depicted in the 6th episode of the 2023 TV series Unprisoned.

Charles Thomas came up with the concept of negromachy. He believed there was a confusion of self-worth where the person shows inappropriate dependence on white society for self-definition. He created a five-stage nigrescence model. Bailey Jackson created a four-stage nigrescence model. Frantz Fanon coined the term "to become black", which he framed as a process of developing a black ide
https://en.wikipedia.org/wiki/Obturator%20canal
The obturator canal is a passageway formed in the obturator foramen by part of the obturator membrane and the pelvis. It connects the pelvis to the thigh. Structure The obturator canal is formed between the obturator membrane and the pelvis. The obturator artery, obturator vein, and obturator nerve all travel through the canal. Clinical significance An obturator hernia is a type of hernia involving an intrusion into the obturator canal. The obturator nerve can be compressed in the obturator canal. The obturator canal may be compressed during pregnancy and major traumatic injuries, causing obturator syndrome. See also Obturator fascia
https://en.wikipedia.org/wiki/Canal%20%28anatomy%29
In anatomy, a canal (or canalis in Latin) is a tubular passage or channel which connects different regions of the body.

Examples include:

Cranial Region
Alveolar canals
Carotid canal
Facial canal
Greater palatine canal
Incisive canals
Infraorbital canal
Mandibular canal
Optic canal
Palatovaginal canal
Pterygoid canal

Abdominal Region
Inguinal canal

Pelvic Region
Anal canal
Pudendal canal

Upper Extremities
Suprascapular canal
Carpal canal
Ulnar canal
Radial canal

Lower Extremities
Adductor canal
Femoral canal
Obturator canal

See also
Foramen
Canaliculus
https://en.wikipedia.org/wiki/Einstein%E2%80%93Brillouin%E2%80%93Keller%20method
The Einstein–Brillouin–Keller method (EBK) is a semiclassical method (named after Albert Einstein, Léon Brillouin, and Joseph B. Keller) used to compute eigenvalues in quantum-mechanical systems. EBK quantization is an improvement on Bohr–Sommerfeld quantization, which did not consider the caustic phase jumps at classical turning points. This procedure is able to reproduce exactly the spectrum of the 3D harmonic oscillator, particle in a box, and even the relativistic fine structure of the hydrogen atom.

In 1976–1977, Michael Berry and M. Tabor derived an extension of the Gutzwiller trace formula for the density of states of an integrable system starting from EBK quantization. There have been a number of recent results on computational issues related to this topic, for example, the work of Eric J. Heller and Emmanuel David Tannenbaum using a partial differential equation gradient descent approach.

Procedure

Given a separable classical system defined by coordinates (q_i, p_i), in which every pair (q_i, p_i) describes a closed function or a periodic function in q_i, the EBK procedure involves quantizing the line integrals of p_i over the closed orbit of q_i:

I_i = (1/2π) ∮ p_i dq_i = ħ (n_i + μ_i/4 + b_i/2)

where I_i is the action-angle coordinate, n_i is a non-negative integer, and μ_i and b_i are Maslov indexes. μ_i corresponds to the number of classical turning points in the trajectory of q_i (Dirichlet boundary condition), and b_i corresponds to the number of reflections with a hard wall (Neumann boundary condition).

Examples

1D Harmonic oscillator

The Hamiltonian of a simple harmonic oscillator is given by

H = p²/(2m) + mω²x²/2

where p is the linear momentum and x the position coordinate. The action variable is given by

I = (1/2π) ∮ p dx = (4/2π) ∫₀^(x₀) √(2m(E − mω²x²/2)) dx

where we have used that H = E is the energy and that the closed trajectory is 4 times the trajectory from 0 to the turning point x₀. The integral turns out to be I = E/ω. Under EBK quantization there are two soft turning points in each orbit, μ_x = 2 and b_x = 0. Finally, that yields E = ħω(n + 1/2), which is the exact result for quantization of the quantum harmonic oscillator.

2D hydrogen atom

Th
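The harmonic-oscillator action integral can also be evaluated numerically and compared with the exact levels E = ħω(n + 1/2). The Python sketch below (function names are mine; units with ħ = 1 assumed) evaluates I = (1/2π)∮ p dx by the midpoint rule and checks that an EBK level indeed carries action n + 1/2:

```python
import math

def action_integral(E, m=1.0, omega=1.0, steps=200_000):
    """I = (1/2pi) * closed-orbit integral of p dx for
    H = p^2/(2m) + m*omega^2*x^2/2 at energy E.
    The closed orbit is 4x the path from 0 to the turning point."""
    x_t = math.sqrt(2.0 * E / m) / omega          # classical turning point
    h = x_t / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h                          # midpoint rule
        total += math.sqrt(2.0 * m * (E - 0.5 * m * omega**2 * x**2))
    return 4.0 * total * h / (2.0 * math.pi)

def ebk_energy(n, omega=1.0, hbar=1.0):
    """EBK level from I = hbar*(n + mu/4) with mu = 2 soft turning points."""
    return hbar * omega * (n + 0.5)
```

Numerically the integral reproduces I = E/ω, so at E = ebk_energy(n) the action is n + 1/2, i.e. the quantization condition is satisfied exactly, as the closed-form calculation shows.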
https://en.wikipedia.org/wiki/Urban%20wild
An urban wild is a remnant of a natural ecosystem found in the midst of an otherwise highly developed urban area. Utility Urban wilds, particularly those of several acres or more, are often intact ecological systems that can provide essential ecosystem functions such as the filtering of urban run-off, the storing and slowing the flow of stormwater, amelioration of the warming effect of urban development, and generally benefiting local air quality. Typically, urban wilds are home to native vegetation and animal life as well as some introduced species. Urban wilds are vital to species of migratory birds that have nested in a given area since prior to its urbanization. Preservation Without formal protection, urban wilds are vulnerable to development. However, achieving formal protection of a large urban wild can be difficult. Land tenure of a single ecological area can be complex, with multiple public and private entities owning adjacent properties. Key strategies used in the preservation of urban wilds have included conservation restrictions that keep complex land tenure systems in place while protecting the entire landscape. Public/private partnerships have also been successful in protecting urban wilds. The urban wilds prioritized by municipalities tend to be partial wetlands that perform a range of ecological services while contributing to the biological diversity of the region. Passive parks There is some discussion about whether natural areas that are not at an appropriate scale to perform significant ecosystem services should instead be categorized as passive parks as opposed to urban wilds. Smaller urban wilds are used for passive recreation and have less value to the city in terms of enhancing ecosystem function.
https://en.wikipedia.org/wiki/List%20of%20national%20dances
This is a list of national dances. This may be a formal or informal designation. Not all nations officially recognize a national dance or dances. By country
https://en.wikipedia.org/wiki/International%20Conference%20on%20High%20Energy%20Physics
ICHEP or International Conference on High Energy Physics is one of the most prestigious international scientific conferences in the field of particle physics, bringing together leading theorists and experimentalists of the world. It was first held in 1950, and has been held biennially since 1960. Since the first conferences of the series took place in Rochester, New York, the event is also commonly referred to as the Rochester conference.

Past conferences

I Rochester (1950)
II Rochester (1952)
III Rochester (1952)
IV Rochester (1954)
V Rochester (1955)
VI Rochester (1956)
VII Rochester (1957)
VIII Geneva (1958)
IX Kiev (1959)
X Rochester (1960)
XI Geneva (1962)
XII Dubna (1964)
XIII Berkeley (1966)
XIV Vienna (1968)
XV Kiev (1970)
XVI Chicago (1972)
XVII London (1974)
XVIII Tbilisi (1976)
XIX Tokyo (1978)
XX Madison (1980)
XXI Paris (1982)
XXII Leipzig (1984)
XXIII Berkeley (1986)
XXIV Munich (1988)
XXV Singapore (1990)
XXVI Dallas (1992)
XXVII Glasgow (1994)
XXVIII Warsaw (1996)
XXIX Vancouver (1998)
XXX Osaka (2000)
XXXI Amsterdam (2002)
XXXII Beijing (2004)
XXXIII Moscow (2006)
XXXIV Philadelphia (2008)
XXXV Paris (2010)
XXXVI Melbourne (2012)
XXXVII Valencia (2014)
XXXVIII Chicago (2016)
XXXIX Seoul (2018)
XL Prague (2020), virtual
XLI Bologna (2022)
https://en.wikipedia.org/wiki/Average%20path%20length
Average path length, or average shortest path length, is a concept in network topology that is defined as the average number of steps along the shortest paths for all possible pairs of network nodes. It is a measure of the efficiency of information or mass transport on a network.

Concept

Average path length is one of the three most robust measures of network topology, along with its clustering coefficient and its degree distribution. Some examples are: the average number of clicks which will lead you from one website to another, or the number of people you will have to communicate through, on average, to contact a complete stranger. It should not be confused with the diameter of the network, which is defined as the longest geodesic, i.e., the longest shortest path between any two nodes in the network (see Distance (graph theory)).

The average path length distinguishes an easily negotiable network from one that is complicated and inefficient, with a shorter average path length being more desirable. However, the average path length is simply what the path length will most likely be. The network itself might have some very remotely connected nodes and many nodes which are neighbors of each other.

Definition

Consider an unweighted directed graph G with the set of vertices V. Let d(v1, v2), where v1, v2 ∈ V, denote the shortest distance between v1 and v2. Assume that d(v1, v2) = 0 if v2 cannot be reached from v1. Then, the average path length l_G is:

l_G = (1 / (n·(n − 1))) · Σ_{i ≠ j} d(v_i, v_j)

where n is the number of vertices in G.

Applications

In a real network like the Internet, a short average path length facilitates the quick transfer of information and reduces costs. The efficiency of mass transfer in a metabolic network can be judged by studying its average path length. A power grid network will have fewer losses if its average path length is minimized. Most real networks have a very short average path length leading to the concept of a small world where everyone is connected to everyone else through a very short path. As a result, most m
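For unweighted graphs the definition translates directly into one breadth-first search per source node. A minimal Python sketch (adjacency-dict representation and function name are my own; unreachable pairs contribute 0, matching the convention in the Definition section):

```python
from collections import deque

def average_path_length(adj):
    """adj maps each node to an iterable of its (out-)neighbours.
    Averages d(u, v) over all ordered pairs u != v, with d = 0
    for unreachable pairs."""
    nodes = list(adj)
    n = len(nodes)
    total = 0
    for src in nodes:
        # BFS from src gives shortest hop-counts in an unweighted graph.
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for node, d in dist.items() if node != src)
    return total / (n * (n - 1)) if n > 1 else 0.0
```

For the 3-node path 1–2–3 (edges listed in both directions) the six ordered pairs have distances 1, 2, 1, 1, 2, 1, giving an average of 8/6 = 4/3.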
https://en.wikipedia.org/wiki/Malleolus
A malleolus is the bony prominence on each side of the human ankle. Each leg is supported by two bones, the tibia on the inner side (medial) of the leg and the fibula on the outer side (lateral) of the leg. The medial malleolus is the prominence on the inner side of the ankle, formed by the lower end of the tibia. The lateral malleolus is the prominence on the outer side of the ankle, formed by the lower end of the fibula. The word malleolus (), plural malleoli (), comes from Latin and means "small hammer". (It is cognate with mallet.) Medial malleolus The medial malleolus is found at the foot end of the tibia. The medial surface of the lower extremity of tibia is prolonged downward to form a strong pyramidal process, flattened from without inward - the medial malleolus. The medial surface of this process is convex and subcutaneous. The lateral or articular surface is smooth and slightly concave, and articulates with the talus. The anterior border is rough, for the attachment of the anterior fibers of the deltoid ligament of the ankle-joint. The posterior border presents a broad groove, the malleolar sulcus, directed obliquely downward and medially, and occasionally double; this sulcus lodges the tendons of the Tibialis posterior and Flexor digitorum longus. The summit of the medial malleolus is marked by a rough depression behind, for the attachment of the deltoid ligament. The major structure that passes anterior to the medial malleolus is the saphenous vein. Structures that pass behind medial malleolus deep to the flexor retinaculum: Tibialis posterior tendon Flexor digitorum longus Posterior tibial artery Posterior tibial vein Tibial nerve Flexor hallucis longus Lateral malleolus The lateral malleolus is found at the foot end of the fibula, of a pyramidal form, and somewhat flattened from side to side; it descends to a lower level than the medial malleolus. The medial surface presents in front a smooth triangular surface, convex from above do
https://en.wikipedia.org/wiki/Center%20for%20Drug%20Evaluation%20and%20Research
The Center for Drug Evaluation and Research (CDER, pronounced "see'-der") is a division of the U.S. Food and Drug Administration (FDA) that monitors most drugs as defined in the Food, Drug, and Cosmetic Act. Some biological products are also legally considered drugs, but they are covered by the Center for Biologics Evaluation and Research. The center reviews applications for brand-name, generic, and over-the-counter pharmaceuticals, manages US current Good Manufacturing Practice (cGMP) regulations for pharmaceutical manufacturing, determines which medications require a medical prescription, monitors advertising of approved medications, and collects and analyzes safety data about pharmaceuticals that are already on the market. CDER receives considerable public scrutiny, and thus implements processes that tend toward objectivity and tend to isolate decisions from being attributed to specific individuals. The decisions on approval will often make or break a small company's stock price (e.g., Martha Stewart and ImClone), so the markets closely watch CDER's decisions. The center has around 1,300 employees in "review teams" that evaluate and approve new drugs. Additionally, the CDER employs a "safety team" with 72 employees to determine whether new drugs are unsafe or present risks not disclosed in the product's labeling. The FDA's budget for approving, labeling, and monitoring drugs is roughly $290 million per year. The safety team monitors the effects of more than 3,000 prescription drugs on 200 million people with a budget of about $15 million a year. Patrizia Cavazzoni is the current director of CDER. Responsibilities CDER reviews New Drug Applications to ensure that the drugs are safe and effective. Its primary objective is to ensure that all prescription and over-the-counter (OTC) medications are safe and effective when used as directed. The FDA requires a four-phased series of clinical trials for testing drugs. Phase I involves testing new drugs on healthy
https://en.wikipedia.org/wiki/Moreton%20wave
A Moreton wave, Solar Tsunami, or Moreton-Ramsey wave is the chromospheric signature of a large-scale solar coronal shock wave. Described as a kind of solar "tsunami", they are generated by solar flares. They are named for American astronomer Gail Moreton, an observer at the Lockheed Solar Observatory in Burbank, and Harry E. Ramsey, an observer who spotted them in 1959 at the Sacramento Peak Observatory. Moreton discovered them in time-lapse photography of the chromosphere in the light of the Balmer alpha transition. There were few follow-up studies for decades, but the 1995 launch of the Solar and Heliospheric Observatory (SOHO) led to observation of the coronal waves that cause Moreton waves, and Moreton waves became a research topic again. (SOHO's EIT instrument also discovered a different wave type, called "EIT waves".) The reality of Moreton waves (also known as fast-mode MHD waves) has also been confirmed by the two Solar Terrestrial Relations Observatory (STEREO) spacecraft, which observed a 100,000-km-high wave of hot plasma and magnetism, moving at 250 km/s, in conjunction with a large coronal mass ejection in February 2009. Moreton measured the waves propagating at speeds of 500–1500 km/s. Yutaka Uchida interpreted Moreton waves as MHD fast-mode shock waves propagating in the corona, and linked them to type II radio bursts, which are radio-wave discharges created when coronal mass ejections accelerate shocks. Moreton waves can be observed primarily in the Hα band. See also Asteroseismology Gravity wave Helioseismology OSO 8 Solar prominence Solar spicule Solar transition region
https://en.wikipedia.org/wiki/List%20of%20mathematic%20operators
In mathematics, an operator or transform is a function from one space of functions to another. Operators occur commonly in engineering, physics and mathematics. Many are integral operators and differential operators. In the following, L denotes an operator that takes a function to another function. The domain and codomain are unspecified function spaces, such as Hardy space, Lp space, Sobolev space, or, more vaguely, the space of holomorphic functions. See also List of transforms List of Fourier-related transforms Transfer operator Fredholm operator Borel transform Glossary of mathematical symbols
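As a concrete illustration (not part of the article's list), an operator can be modeled in code as a higher-order function: it consumes a function and returns a new function. The sketch below implements a finite-difference differentiation operator; the function names and the step size h are arbitrary choices for the example.

```python
def differentiation_operator(h=1e-6):
    """Return an operator L mapping a function f to an approximation of f'."""
    def L(f):
        def Lf(x):
            # central finite difference: (f(x+h) - f(x-h)) / (2h)
            return (f(x + h) - f(x - h)) / (2 * h)
        return Lf
    return L

L = differentiation_operator()
square = lambda x: x * x
deriv = L(square)  # deriv approximates the function x -> 2x
```

Here L is itself a function between function spaces, which is exactly the sense of "operator" used above.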
https://en.wikipedia.org/wiki/B.A.T.M.A.N.
The Better Approach to Mobile Ad-hoc Networking (B.A.T.M.A.N.) is a routing protocol for multi-hop mobile ad hoc networks which is under development by the German "Freifunk" community and intended to replace the Optimized Link State Routing Protocol (OLSR). B.A.T.M.A.N.'s crucial point is the decentralization of knowledge about the best route through the network — no single node has all the data. This technique eliminates the need to spread information about network changes to every node in the network. The individual node only saves information about the "direction" it received data from and sends its data accordingly. The data gets passed from node to node, and packets get individual, dynamically created routes. A network of collective intelligence is created. In early 2007, the B.A.T.M.A.N. developers started experimenting with the idea of routing on layer 2 (Ethernet layer) instead of layer 3. To differentiate from the layer 3 routing daemon, the suffix "adv" (for: advanced) was chosen. Instead of manipulating routing tables based on information exchanged via UDP/IP, it provides a virtual network interface and transparently transports Ethernet packets on its own. The batman-adv kernel module has been part of the official Linux kernel since 2.6.38. Operation B.A.T.M.A.N. has elements of classical routing protocols: It detects other B.A.T.M.A.N. nodes and finds the best way (route) to these. It also keeps track of new nodes and informs its neighbors about their existence. In static networks, network administrators or technicians decide which computer is reached via which way or cable. As radio networks undergo constant changes and low participation-thresholds are a vital part of the "Freifunk"-networks' foundation, this task has to be automated as much as possible. On a regular basis, every node sends out a broadcast, thereby informing all its neighbors about its existence. The neighbors then relay this message to their neighbors, and so on. This carries the
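The flooding-and-counting idea described above can be sketched in a toy simulation. This is not the real protocol (which uses sequence numbers, sliding windows, and link-quality penalties); the node names and the simple "most OGMs received" scoring rule are invented for illustration. Each node records, per originator, how many originator messages (OGMs) arrived via each direct neighbor, and routes toward the neighbor with the best count — the "direction" the article mentions.

```python
from collections import defaultdict

class Node:
    """Toy B.A.T.M.A.N.-style node: tracks, per originator, how many OGMs
    arrived via each direct neighbor, and routes toward the best one."""
    def __init__(self, name):
        self.name = name
        # ogm_counts[originator][neighbor] = number of OGMs received that way
        self.ogm_counts = defaultdict(lambda: defaultdict(int))

    def receive_ogm(self, originator, via_neighbor):
        if originator != self.name:  # ignore our own rebroadcast OGMs
            self.ogm_counts[originator][via_neighbor] += 1

    def best_next_hop(self, originator):
        routes = self.ogm_counts.get(originator)
        if not routes:
            return None
        # the neighbor that delivered the most OGMs is the best "direction"
        return max(routes, key=routes.get)

node = Node("A")
# A hears B's OGMs 9 times directly via B, and 4 times relayed via C:
for _ in range(9):
    node.receive_ogm("B", via_neighbor="B")
for _ in range(4):
    node.receive_ogm("B", via_neighbor="C")
```

Note that node A never learns the full topology — only which neighbor is the most reliable direction toward each originator, which is the decentralization the protocol is built on.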
https://en.wikipedia.org/wiki/Postmortem%20studies
Postmortem studies are a type of neurobiological research that provides information to researchers and to individuals who will have to make medical decisions in the future. Postmortem researchers conduct a longitudinal study of the brain of an individual who has some sort of phenomenological condition (i.e. cannot speak, trouble moving the left side of the body, Alzheimer's, etc.), examining the brain after death. Researchers look for lesions in the brain that could have influenced cognitive or motor functions. These irregularities, damage, or other cerebral anomalies observed in the brain are attributed to the individual's pathophysiology and environmental surroundings. Postmortem studies give researchers a unique opportunity to study attributes of the brain that cannot be studied in a living person, and allow researchers to investigate the causes of, and potential cures for, certain diseases. It is critical for researchers to develop hypotheses in order to discover the characteristics that are meaningful to a particular disorder; the results of the study help the researcher trace specific behaviors to locations in the brain. When tissue from a postmortem study is obtained, it is imperative that the researcher ensure the quality is adequate for study. This is especially important when an individual is researching gene expression (i.e. DNA, RNA, and proteins). Key ways researchers monitor quality include determining the pain level and time of death of the individual, the pH of the tissue, refrigeration time and storage temperature, time until the brain tissue is frozen, and the thawing conditions, as well as finding out specific information about the individual's life, such as age, sex, legal or illegal substance use, and a treatment history. Background Postmortem studies have been used to further the understanding of the brain for centuries. Before the time of the MRI,
https://en.wikipedia.org/wiki/Time-sharing%20system%20evolution
This article covers the evolution of time-sharing systems, providing links to major early time-sharing operating systems and showing their subsequent evolution. The meaning of the term time-sharing has shifted from its original usage. From 1949 to 1960, time-sharing was used to refer to multiprogramming; it evolved to mean multi-user interactive computing. Time-sharing Time-sharing was first proposed in the mid- to late-1950s and first implemented in the early 1960s. The concept was born out of the realization that a single expensive computer could be efficiently utilized by enabling multiprogramming, and, later, by allowing multiple users simultaneous interactive access. In 1984, Christopher Strachey wrote that he considered the change in the meaning of the term time-sharing to be a source of confusion and not what he meant when he wrote his original paper in 1959. Without time-sharing, an individual user would enter bursts of information followed by long pauses; but with a group of users working at the same time, the pauses of one user would be filled by the activity of the others. Similarly, small slices of time spent waiting for disk, tape, or network input could be granted to other users. Given an optimal group size, the overall process could be very efficient. Each user would use their own computer terminal, initially electromechanical teleprinters such as the Teletype Model 33 ASR or the Friden Flexowriter; from about 1970 these were progressively superseded by CRT-based units such as the DEC VT05, Datapoint 2200 and Lear Siegler ADM-3A. Terminals were initially linked to a nearby computer via current loop or serial cables, by conventional telegraph circuits provided by PTTs, and over specialist digital leased lines such as T1. Modems, such as the Bell 103 and its successors, allowed remote and higher-speed use over the analogue voice telephone network. Family tree of major systems See details and additional systems in the table below. Relationships shown here are for
https://en.wikipedia.org/wiki/Curvatures%20of%20the%20stomach
The curvatures of the stomach refer to the long, convex, lateral surface and the shorter, concave, medial surface of the organ, which are referred to as the greater and lesser curvatures, respectively. The greater curvature, which begins at the cardiac notch and arches backwards, passing inferiorly to the left, is four or five times as long as the lesser curvature, which attaches to the hepatogastric ligament and is supplied by the left gastric artery and the right gastric branch of the hepatic artery. Greater curvature The greater curvature of the stomach forms the lower left or lateral border of the stomach. Surface Starting from the cardiac orifice at the incisura cardiaca, it forms an arch backward, upward, and to the left; the highest point of the convexity is on a level with the sixth left costal cartilage. From this level it may be followed downward and forward, with a slight convexity to the left, as low as the cartilage of the ninth rib; it then turns to the right, to the end of the pylorus. Directly opposite the incisura angularis of the lesser curvature the greater curvature presents a dilatation, which is the left extremity of the pyloric part; this dilatation is limited on the right by a slight groove, the sulcus intermedius, which is about 2.5 cm from the duodenopyloric constriction. The portion between the sulcus intermedius and the duodenopyloric constriction is termed the pyloric antrum. At its commencement the greater curvature is covered by peritoneum continuous with that covering the front of the organ. The left part of the curvature gives attachment to the gastrolienal ligament, while to its anterior portion are attached the two layers of the greater omentum, separated from each other by the gastroepiploic vessels. Blood supply There are three arteries which primarily supply the greater curvature: short gastric arteries — upper part and the gastroepiploic vessels: gastric branches of left gastro-omental artery — middle part gastric b
https://en.wikipedia.org/wiki/Serositis
Serositis refers to inflammation of the serous tissues of the body, the tissues lining the lungs (pleura), heart (pericardium), and the inner lining of the abdomen (peritoneum) and organs within. It is commonly found with fat wrapping or creeping fat. Causes Serositis is seen in numerous conditions: Lupus erythematosus (SLE), for which it is one of the criteria, Rheumatoid arthritis Familial Mediterranean fever (FMF) Chronic kidney failure / Uremia Juvenile idiopathic arthritis Inflammatory bowel disease (especially Crohn's disease) Acute appendicitis Diffuse cutaneous systemic sclerosis See also Hyaloserositis
https://en.wikipedia.org/wiki/Paul%20K.%20Guillow%2C%20Inc.
Paul K. Guillow, Inc., commonly known as Guillow's, is an American manufacturer of balsa wood model aircraft kits. The company was founded by Paul K. Guillow in 1926 in Wakefield, Massachusetts, and was originally called NuCraft Toys. Founder Born in 1893, Paul Guillow was a naval aviator during World War I, and returned from Europe with an interest in aviation-related toys. He later went on to graduate from Worcester Polytechnic Institute. Guillow died in 1951. Company history Among the company's earliest products were a card game called The Lindy Flying Game, which was introduced in 1927, and a board game called Crash: The New Airplane Game, which was introduced in 1928. Soon after Charles Lindbergh's famous solo transatlantic flight in 1927, a craze for all things aeronautical swept over America. Guillow's capitalized on that fad by introducing a line of balsa wood model kits. The first line of Guillow's balsa non-flying shelf model kits consisted of twelve different World War I biplane fighters with six-inch wingspans that retailed for 10 cents each. Each kit contained a 3-view plan, balsa wood, cement, two bottles of colored aircraft dope, and a strip of bamboo for wing and landing gear struts; this was considered relatively good value for such toys at that time. In 1933, demand for the kits was high enough to enable Guillow's to move out of the family barn where it had started, and into its present-day location in Wakefield. In the 1940s, the company also supplemented the production of model airplanes with the publication of several books on the construction of flying model planes. During World War II, the supply of balsa wood was diverted to the war effort for the manufacture of rafts and life jackets. Guillow's was forced to use alternative materials like cardboard or pine wood to manufacture the model kits. In the meantime the company also diversified into building target drone aircraft as training aids for gunners. After the war, to meet changing cus
https://en.wikipedia.org/wiki/Lateral%20earth%20pressure
Lateral earth pressure is the pressure that soil exerts in the horizontal direction. The lateral earth pressure is important because it affects the consolidation behavior and strength of the soil and because it is considered in the design of geotechnical engineering structures such as retaining walls, basements, tunnels, deep foundations and braced excavations. The earth pressure problem dates from the beginning of the 18th century, when Gautier listed five areas requiring research, one of which was the dimensions of gravity-retaining walls needed to hold back soil. However, the first major contribution to the field of earth pressures was made several decades later by Coulomb, who considered a rigid mass of soil sliding upon a shear surface. Rankine extended earth pressure theory by deriving a solution for a complete soil mass in a state of failure, as compared with Coulomb's solution which had considered a soil mass bounded by a single failure surface. Originally, the Rankine's theory considered the case of only cohesionless soils. However, this theory has subsequently been extended by Bell to cover the case of soils possessing both cohesion and friction. Caquot and Kerisel modified Muller-Breslau's equations to account for a nonplanar rupture surface. The coefficient of lateral earth pressure The coefficient of lateral earth pressure, K, is defined as the ratio of the horizontal effective stress, σ’h, to the vertical effective stress, σ’v. The effective stress is the intergranular stress calculated by subtracting the pore pressure from the total stress as described in soil mechanics. K for a particular soil deposit is a function of the soil properties and the stress history. The minimum stable value of K is called the active earth pressure coefficient, Ka; the active earth pressure is obtained, for example, when a retaining wall moves away from the soil. The maximum stable value of K is called the passive earth pressure coefficient, Kp; the passive earth press
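The active and passive coefficients described above have simple closed forms in Rankine's theory for cohesionless soil with a level backfill: Ka = (1 − sin φ)/(1 + sin φ) and Kp = (1 + sin φ)/(1 − sin φ), where φ is the soil's angle of internal friction. The sketch below assumes those classical formulas (the article does not state them explicitly):

```python
import math

def rankine_coefficients(phi_degrees):
    """Rankine active (Ka) and passive (Kp) earth pressure coefficients
    for a cohesionless soil with friction angle phi and level backfill."""
    s = math.sin(math.radians(phi_degrees))
    ka = (1 - s) / (1 + s)  # wall moving away from the soil
    kp = (1 + s) / (1 - s)  # wall pushed into the soil
    return ka, kp

ka, kp = rankine_coefficients(30)  # a typical sand: Ka = 1/3, Kp = 3
```

Multiplying either coefficient by the vertical effective stress σ'v gives the corresponding horizontal effective stress, as in the definition of K above.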
https://en.wikipedia.org/wiki/Addressin
Mucosal vascular addressin cell adhesion molecule 1 (MAdCAM-1) is a protein that in humans is encoded by the MADCAM1 gene. The protein encoded by this gene is an endothelial cell adhesion molecule that interacts preferentially with the leukocyte beta7 integrin LPAM-1 (alpha4 / beta7), L-selectin, and VLA-4 (alpha4 / beta1) on myeloid cells to direct leukocytes into mucosal and inflamed tissues. It is a member of the immunoglobulin superfamily and is similar to ICAM-1 and VCAM-1. Nomenclature Addressin is a lesser-used term to describe the group of adhesion molecules that are involved with lymphocyte homing, commonly found at high-endothelial venules (HEVs) where lymphocytes exit the blood and enter the lymph node. Addressins are the ligands to the homing receptors of lymphocytes. The task of these ligands and their receptors is to determine which tissue the lymphocyte will enter next. They carry carbohydrates in order to be recognized by L-selectin. Addressins physically bind to mobile lymphocytes to guide them to the HEVs. Examples of molecules that are often referred to as addressins are CD34 and GlyCAM-1 on HEVs in peripheral lymph nodes, and MAdCAM-1 on endothelial cells in the intestine. Function In terms of migration, MAdCAM-1 is selectively expressed on mucosal endothelial cells, driving memory T-cell re-circulation through mucosal tissues. In contrast, and indeed the main difference between the two molecules, ICAM molecules are involved with naïve T-cell re-circulation. Whereas MAdCAM-1 is selectively expressed, ICAM is broadly expressed on inflamed endothelium. Peripheral node addressins Peripheral node addressins (PNAd) are carbohydrate residues that are lymphocyte homing receptor ligands that are expressed on the HEVs of peripheral lymph nodes. These proteins collectively bind to L-selectin to guide lymphocytes such as mature naïve B and T cells into the lymph node. During the development of secondary lymphoid organs, PNAd expression is upregulat
https://en.wikipedia.org/wiki/Perineal%20branches%20of%20posterior%20femoral%20cutaneous%20nerve
The perineal branches of the posterior femoral cutaneous nerve are distributed to the skin at the upper and medial side of the thigh. One long perineal branch, inferior pudendal (long scrotal nerve), curves forward below and in front of the ischial tuberosity, pierces the fascia lata, and runs forward beneath the superficial fascia of the perineum to the skin of the scrotum in the male, and of the labium majus in the female. It communicates with the inferior anal nerves and the posterior scrotal nerves. See also Posterior cutaneous nerve of thigh
https://en.wikipedia.org/wiki/X-wave
In physics, X-waves are localized solutions of the wave equation that travel at a constant velocity in a given direction. X-waves can be sound, electromagnetic, or gravitational waves. They are built as a non-monochromatic superposition of Bessel beams. Ideal X-waves carry infinite energy, but finite-energy realizations have been observed in various frameworks. X-wave pulses can have superluminal phase and group velocity. In optics, X-wave solutions have been reported within a quantum mechanical formulation. See also Nonlinear X-wave Droplet-shaped wave
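For concreteness, a standard axially symmetric X-wave can be written (this is a textbook form, not taken from the article) as a superposition of Bessel beams sharing a common axicon angle θ:

```latex
\Psi(\rho, z, t) = \int_0^\infty S(k)\, J_0\!\left(k \rho \sin\theta\right)\,
e^{\, i k \cos\theta \,(z - V t)}\, dk,
\qquad V = \frac{c}{\cos\theta},
```

where J0 is the zeroth-order Bessel function and S(k) is the pulse spectrum. Because each monochromatic component travels with the same axial speed V = c/cos θ > c, the profile propagates rigidly at a superluminal group velocity, as noted above.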
https://en.wikipedia.org/wiki/Moon%20Pilot
Moon Pilot is a 1962 American Technicolor science fiction comedy film from Walt Disney Productions, released through Buena Vista Distribution, directed by James Neilson, and starring Tom Tryon, Brian Keith, Edmond O'Brien, Dany Saval, and Tommy Kirk. The film is based on Robert Buckner's 1960 novel Starfire, and reflects Disney's interest in America's space program during John F. Kennedy's presidential era in the early 1960s. Plot Air Force Capt. Richmond Talbot inadvertently volunteers to make the first manned flight around the Moon. He is ordered to keep the upcoming flight a secret, even from his family on his upcoming leave. On his flight to visit his family, Talbot is approached by Lyrae, a mysterious "foreign" girl who seems to know all about the astronaut's coming mission. She approaches Talbot to warn him about possible defects in his spacecraft. He assumes she is a spy, runs away from her, and contacts the Air Force. The Air Force orders him home and places him under the protection of "National Security", a thinly disguised FBI. Eventually, Lyrae reveals that she is a friendly alien from the planet Beta Lyrae. She wants to offer him a special paint formula that, when applied to his rocket, will safeguard his brain from "proton rays". Enchanted by the young woman, Talbot sneaks away from the agents who have been guarding him to spend more time with Lyrae. Eventually, after his rocket is launched, Lyrae appears by his side and convinces him to visit her planet with her. Talbot informs Mission Control that he will be a little late coming back. The film ends with Mission Control totally confounded by the bizarre transmissions it receives as the two sing a romantic duet about her home planet, Beta Lyrae. Cast Tom Tryon as Captain Richmond Talbot Dany Saval as Lyrae Brian Keith as Major General Vanneman Edmond O'Brien as McClosky Tommy Kirk as Walter Talbot Bob Sweeney as Sen. Henry McGuire Kent Smith as Secretary of the Air Force S
https://en.wikipedia.org/wiki/Thomas%20Fuller%20%28mental%20calculator%29
Thomas Fuller (1710 – December 1790), also known as "Negro Demus" and the "Virginia Calculator", was an enslaved African renowned for his mathematical abilities. History Born in Africa, likely somewhere between present-day Liberia and Benin, Fuller was enslaved and shipped to America in 1724 at the age of 14. He became the legal property of Elizabeth Cox of Alexandria, Virginia. Despite his mathematical skill, Fuller was illiterate. Ethnomathematics researcher Ron Eglash theorizes that Fuller could have been Bassari, comparing his abilities to their mathematical traditions. Before colonialism, the Bassari had "specialists who were trained in the memorization of sums". Stories of his mathematical achievements spread along the Eastern Seaboard and reached as far as France and Germany, becoming fuel for the abolitionist movement. Documentation of abilities When Fuller was about 70 years old, William Hartshorne and Samuel Coates, members of the Pennsylvania Abolition Society (PAS), heard about Fuller's "extraordinary powers in arithmetic." The pair stopped their travels to investigate Fuller. Benjamin Rush, physician and Founding Father, had sought proof of Black intelligence, through the lens of "scientific achievement", to bolster antislavery causes. Through his membership in PAS, Rush was acquainted with Hartshorne and Coates and reported on their interview of Fuller in the Columbian Magazine. In this report, Rush stressed the credibility of Hartshorne and Coates. Rush retold how Hartshorne and Coates tested Fuller's mathematical abilities as follows: First. Upon being asked, how many seconds there are in a year and a half, he answered in about two minutes, 47,304,000. Second. On being asked how many seconds a man has lived, who is seventy years, seventeen days and twelve hours old, he answered, in a minute and a half, 2,210,500,800. One of the gentlemen, who employed himself with his pen in making these calculations, told him he was wrong, and that the
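Both of the answers quoted above can be checked directly. The first assumes a 365-day year; the second, 2,210,500,800 seconds, comes out exactly only if 17 of the seventy years are counted as leap years. The sketch below is an illustrative check of the arithmetic, not part of the historical record:

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# First question: seconds in a year and a half (365-day year).
# 1.5 * 365 = 547.5 days
year_and_a_half = int(1.5 * 365 * SECONDS_PER_DAY)

# Second question: seconds in 70 years, 17 days, 12 hours,
# assuming 17 of the 70 years were leap years (17 extra days).
days = 70 * 365 + 17 + 17
seconds_lived = days * SECONDS_PER_DAY + 12 * 60 * 60
```

Reproducing these sums mentally, without pen or paper, in under two minutes is what made Fuller's feat so remarkable to his examiners.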
https://en.wikipedia.org/wiki/Calcific%20tendinitis
Calcific tendinitis is a common condition where calcium phosphate deposits form in a tendon, sometimes causing pain at the affected site. Deposits can occur in several places in the body, but are by far most common in the rotator cuff of the shoulder. Around 80% of those with deposits experience symptoms, typically chronic pain during certain shoulder movements, or sharp acute pain that worsens at night. Calcific tendinitis is typically diagnosed by physical exam and X-ray imaging. The disease often resolves completely on its own, but is typically treated with non-steroidal anti-inflammatory drugs to relieve pain, rest and physical therapy to promote healing, and in some cases various procedures to break down or remove the calcium deposits. Adults aged 30–50 are most commonly affected by calcific tendinitis. It is twice as common in women as men, and is not associated with exercise. Calcifications in the rotator cuff were first described by Ernest Codman in 1934. The name "calcifying tendinitis" was coined by Henry Plenk in 1952. Signs and symptoms Up to 20% of those with calcific tendinitis have no symptoms. For those with symptoms, the symptoms vary based on the phase of the disease. In the initial "formative phase", when the calcium deposits are being formed, people rarely experience any symptoms. Those that do have symptoms tend to have intermittent shoulder pain, particularly during forward shoulder flexion (i.e. lifting the arm in front of the body). In the "resorptive phase", when the calcium deposit is breaking down, many experience severe acute pain that worsens at night. Those affected tend to hold the shoulder rotated inwards to alleviate pain, and have difficulty lying on the affected shoulder. Some people experience heat and redness at the affected shoulder, as well as a limited range of motion. Cause Calcific tendinitis is caused by deposits of calcium phosphate crystals in the tendons of the shoulder. These deposits are most frequently found in the
https://en.wikipedia.org/wiki/Insulin-like%20growth%20factor-binding%20protein
The insulin-like growth factor-binding protein (IGFBP) serves as a transport protein for insulin-like growth factor 1 (IGF-1). Function Approximately 98% of IGF-1 is always bound to one of six binding proteins (IGF-BP). IGFBP-3, the most abundant protein, accounts for 80% of all IGF binding. IGF-1 binds to IGFBP-3 in a 1:1 molar ratio. IGF-BP also binds to IGF-1 inside the liver, allowing growth hormone to continuously act upon the liver to produce more IGF-1. IGF binding proteins (IGFBPs) are proteins of 24 to 45 kDa. All six IGFBPs share 50% homology with each other and have binding affinities for IGF-I and IGF-II at the same order of magnitude as the ligands have for the IGF-IR. The IGFBPs help to lengthen the half-life of circulating IGFs in all tissues, including the prostate. Individual IGFBPs may act to enhance or attenuate IGF signaling depending on their physiological context (i.e. cell type). Even with these similarities, some characteristics are different: chromosomal location, heparin binding domains, RGD recognition site, preference for binding IGF-I or IGF-II, and glycosylation and phosphorylation differences. These structural differences can have a tremendous impact on how the IGFBPs interact with cellular basement membranes. Family members In humans, IGFBPs are transcribed from the following seven genes: IGFBP1 IGFBP2 IGFBP3 IGFBP4 IGFBP5 IGFBP6 IGFBP7 See also Insulin-like growth factor receptor
https://en.wikipedia.org/wiki/Liability-driven%20investment%20strategy
Liability-driven investment policies and asset management decisions are those largely determined by the sum of current and future liabilities attached to the investor, be it a household or an institution. As it purports to associate constantly both sides of the balance sheet in the investment process, it has been called a "holistic" investment methodology. In essence, the liability-driven investment strategy (LDI) is an investment strategy of a company or individual based on the cash flows needed to fund future liabilities. It is sometimes referred to as a "dedicated portfolio" strategy. It differs from a “benchmark-driven” strategy, which is based on achieving better returns than an external index such as the S&P 500 or a combination of indices that invest in the same types of asset classes. LDI is designed for situations where future liabilities can be predicted with some degree of accuracy. For individuals, the classic example would be the stream of withdrawals from a retirement portfolio that a retiree will make to pay living expenses from the date of retirement to the date of death. For companies, the classic example would be a pension fund that must make future payouts to pensioners over their expected lifetimes (see below). LDI for individuals A retiree following an LDI strategy begins by estimating the income needed each year in the future. Social security payments and any other income is subtracted from the income needed to determine how much will have to be withdrawn each year from the money in the retirement portfolio to meet the income need. These withdrawals become the liabilities that the investment strategy targets. The portfolio must be invested so as to provide the cash flows that match the withdrawals each year, after factoring in adjustments for inflation, irregular spending (such as an ocean cruise every other year), and so on. Individual bonds provide the ability to match the cash flows needed, which is why the term "cash flow matching" is s
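The individual case described above can be sketched numerically. In this invented example (the dollar amounts and rates are illustrative only), a retiree needs a fixed real income, subtracts social security, and grows the resulting withdrawal with inflation each year; those withdrawals are the liabilities the portfolio must be invested to fund:

```python
def ldi_withdrawals(income_needed, other_income, inflation, years):
    """Project the annual withdrawals (liabilities) an LDI portfolio must
    fund: the income gap, grown by inflation each subsequent year."""
    gap = income_needed - other_income
    return [gap * (1 + inflation) ** year for year in range(years)]

# Needs $60,000/yr, receives $20,000 social security, 2% inflation, 3 years:
liabilities = ldi_withdrawals(60_000, 20_000, 0.02, 3)
# -> roughly [40000, 40800, 41616]
```

Under a cash-flow-matching approach, each projected liability would then be paired with a bond maturing in the corresponding year, which is why individual bonds suit this strategy.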
https://en.wikipedia.org/wiki/Canonical%20S-expressions
A Canonical S-expression (or csexp) is a binary encoding form of a subset of general S-expression (or sexp). It was designed for use in SPKI to retain the power of S-expressions and ensure canonical form for applications such as digital signatures while achieving the compactness of a binary form and maximizing the speed of parsing. The particular subset of general S-expressions applicable here is composed of atoms, which are byte strings, and parentheses used to delimit lists or sub-lists. These S-expressions are fully recursive. While S-expressions are typically encoded as text, with spaces delimiting atoms and quotation marks used to surround atoms that contain spaces, when using the canonical encoding each atom is encoded as a length-prefixed byte string. No whitespace separating adjacent elements in a list is permitted. The length of an atom is expressed as an ASCII decimal number followed by a ":". Example The sexp (this "Canonical S-expression" has 5 atoms) becomes the csexp (4:this22:Canonical S-expression3:has1:55:atoms) No quotation marks are required to escape the space character internal to the atom "Canonical S-expression", because the length prefix clearly points to the end of the atom. There is no whitespace separating an atom from the next element in the list. Properties Uniqueness of canonical encoding: Forbidding whitespace between list elements and providing just one way of encoding atoms ensures that every S-expression has exactly one encoded form. Thus, we can decide whether two S-expressions are equivalent by comparing their encodings. Support for binary data: Atoms can be any binary string. So, a cryptographic hash value or a public key modulus that would otherwise have to be encoded in base64 or some other printable encoding can be expressed in csexp as its binary bytes. Support for type-tagging encoded information: A csexp includes a non-S-expression construct for indicating the encoding of a string, when that encoding is not
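The encoding rule above is simple enough to sketch directly. This minimal illustrative encoder handles only atoms and lists, and uses Python strings for readability (a real csexp encoder would operate on byte strings, since atoms may be arbitrary binary data; the lengths below are correct only for ASCII atoms):

```python
def encode_csexp(expr):
    """Encode a nested structure of strings (atoms) and lists as a csexp.
    Each atom becomes '<decimal length>:<contents>'; lists are wrapped in
    parentheses with no separating whitespace."""
    if isinstance(expr, str):
        return f"{len(expr)}:{expr}"
    return "(" + "".join(encode_csexp(e) for e in expr) + ")"

s = encode_csexp(["this", "Canonical S-expression", "has", "5", "atoms"])
# s == "(4:this22:Canonical S-expression3:has1:55:atoms)"
```

Because there is exactly one encoding per expression, two S-expressions can be compared for equality by comparing these strings byte for byte, which is the property digital-signature applications rely on.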
https://en.wikipedia.org/wiki/Vergence%20%28optics%29
In optics, vergence is the angle formed by rays of light that are not perfectly parallel to one another. Rays that move closer to the optical axis as they propagate are said to be converging, while rays that move away from the axis are diverging. These imaginary rays are always perpendicular to the wavefront of the light, thus the vergence of the light is directly related to the radii of curvature of the wavefronts. A convex lens or concave mirror will cause parallel rays to focus, converging toward a point. Beyond that focal point, the rays diverge. Conversely, a concave lens or convex mirror will cause parallel rays to diverge. Light does not actually consist of imaginary rays and light sources are not single-point sources, thus vergence is typically limited to simple ray modeling of optical systems. In a real system, the vergence is a product of the diameter of a light source, its distance from the optics, and the curvature of the optical surfaces. An increase in curvature causes an increase in vergence and a decrease in focal length, and the image or spot size (waist diameter) will be smaller. Likewise, a decrease in curvature decreases vergence, resulting in a longer focal length and an increase in image or spot diameter. This reciprocal relationship between vergence, focal length, and waist diameter is constant throughout an optical system, and is referred to as the optical invariant. A beam that is expanded to a larger diameter will have a lower degree of divergence, but if condensed to a smaller diameter the divergence will be greater. The simple ray model fails for some situations, such as for laser light, where Gaussian beam analysis must be used instead. Definition In geometrical optics, vergence describes the curvature of optical wavefronts. Vergence is defined as $V = \frac{n}{r}$, where $n$ is the medium's refractive index and $r$ is the distance from the point source to the wavefront. Vergence is measured in units of dioptres (D), which are equivalent to m−1. This desc
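The definition V = n/r can be illustrated with a small numeric sketch (the function name and example values are illustrative, not from the article):

```python
# Vergence in dioptres of a wavefront at distance r (metres) from a point
# source, in a medium of refractive index n: V = n / r.

def vergence(n: float, r_metres: float) -> float:
    return n / r_metres

# A wavefront 0.5 m from a point source in air (n ≈ 1.0):
print(vergence(1.0, 0.5))  # → 2.0 dioptres
```

By the usual sign convention, distances for diverging light are taken as negative, yielding negative vergence; the sketch above leaves sign handling to the caller.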
https://en.wikipedia.org/wiki/L%C3%A9on%20Van%20Hove
Léon Charles Prudent Van Hove (10 February 1924 – 2 September 1990) was a Belgian physicist and a Director General of CERN. He developed a scientific career spanning mathematics, solid state physics, elementary particle and nuclear physics to cosmology. Biography Van Hove studied mathematics and physics at the Université Libre de Bruxelles (ULB). In 1946 he received his PhD in mathematics at the ULB. From 1949 to 1954 he worked at the Institute for Advanced Study in Princeton, New Jersey by virtue of his meeting with Robert Oppenheimer. Later he worked at the Brookhaven National Laboratory and was a professor and Director of the Theoretical Physics Institute at the University of Utrecht in the Netherlands. In the 1950s he laid the theoretical foundations for the analysis of inelastic neutron scattering in terms of the dynamic structure factor. In 1958, he was awarded the Francqui Prize in Exact Sciences. In 1959, he received an invitation to become the head of the Theory Division at CERN in Geneva. In 1975 Prof. Van Hove was appointed CERN Director-General, with John Adams, responsible for the research activities of the Organization. The LEP project was proposed during Van Hove's tenure as Director General. Awards Francqui Prize, 1958 Dannie Heineman Prize for Mathematical Physics, 1962 Member, American Academy of Arts and Sciences, 1964 Max Planck Medal, 1974 Member, United States National Academy of Sciences, 1980 Member, American Philosophical Society, 1980 There is a square, Square Van Hove, named after Van Hove at CERN, Geneva, Switzerland. See also Quark–gluon plasma Quasielastic scattering Quasielastic neutron scattering List of Directors General of CERN Théophile de Donder Hilbrand J. Groenewold for the Groenewold–Van Hove theorem
https://en.wikipedia.org/wiki/Radioactivity%20in%20the%20life%20sciences
Radioactivity is generally used in life sciences for highly sensitive and direct measurements of biological phenomena, and for visualizing the location of biomolecules radiolabelled with a radioisotope. All atoms exist as stable or unstable isotopes and the latter decay at a given half-life ranging from attoseconds to billions of years; radioisotopes useful to biological and experimental systems have half-lives ranging from minutes to months. In the case of the hydrogen isotope tritium (half-life = 12.3 years) and carbon-14 (half-life = 5,730 years), these isotopes derive their importance from all organic life containing hydrogen and carbon and therefore can be used to study countless living processes, reactions, and phenomena. Most short-lived isotopes are produced in cyclotrons, linear particle accelerators, or nuclear reactors, and their relatively short half-lives give them high maximum theoretical specific activities, which is useful for detection in biological systems. Radiolabeling is a technique used to track the passage of a molecule that incorporates a radioisotope through a reaction, metabolic pathway, cell, tissue, organism, or biological system. The reactant is 'labeled' by replacing specific atoms by their isotope. Replacing an atom with its own radioisotope is an intrinsic label that does not alter the structure of the molecule. Alternatively, molecules can be radiolabeled by chemical reactions that introduce an atom, moiety, or functional group that contains a radionuclide. For example, radio-iodination of peptides and proteins with biologically useful iodine isotopes is easily done by an oxidation reaction that replaces the hydroxyl group with iodine on tyrosine and histidine residues. Another example is to use chelators such as DOTA that can be chemically coupled to a protein; the chelator in turn traps radiometals, thus radiolabeling the protein. This has been used for introducing Yttrium-90 onto a monoclonal antibody for therapeutic purpose
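The half-lives quoted above determine how much label remains over an experiment's timescale, via the standard decay law N(t) = N₀·2^(−t/t½). A small illustrative sketch (function name assumed):

```python
# Fraction of a radioisotope remaining after t years, given its half-life:
# N(t) / N0 = 2 ** (-t / t_half).

def fraction_remaining(t_years: float, half_life_years: float) -> float:
    return 2.0 ** (-t_years / half_life_years)

# Tritium (half-life 12.3 y) vs carbon-14 (half-life 5,730 y) after 24.6 years:
print(fraction_remaining(24.6, 12.3))          # → 0.25 (exactly two half-lives)
print(round(fraction_remaining(24.6, 5730), 4))  # carbon-14 is barely depleted
```

This contrast is why tritium labels suit experiments lasting months to years, while carbon-14 behaves as effectively non-decaying on laboratory timescales.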
https://en.wikipedia.org/wiki/Enterprise%20encryption%20gateway
An enterprise encryption gateway (EEG) is a layer 2 encryption device, similar to VPN, that allows for strong authentication and encryption for data across a wireless medium. Unlike a residential gateway, an enterprise gateway typically has both an LAN and WLAN interface, where the EEG acts a bridge between the two. The client devices have client-side authentication/encryption software, and the EEGs are the encryption termination point in the network. Benefits of these devices include offloading the encryption duties from the access points. Autonomous access points are placed downstream from the EEGs and may act as an 802.1X authenticator.
https://en.wikipedia.org/wiki/Dynamic%20priority%20scheduling
Dynamic priority scheduling is a type of scheduling algorithm in which the priorities are calculated during the execution of the system. The goal of dynamic priority scheduling is to adapt to dynamically changing progress and to form an optimal configuration in a self-sustained manner. It can be very hard to produce well-defined policies to achieve the goal, depending on the difficulty of a given problem. Earliest deadline first scheduling and least slack time scheduling are examples of dynamic priority scheduling algorithms. Optimal Schedulable Utilization The idea of real-time scheduling is to confine processor utilization under the schedulable utilization of a certain scheduling algorithm, which is scaled from 0 to 1. Higher schedulable utilization means higher utilization of resources and a better algorithm. In preemptible scheduling, dynamic priority scheduling such as earliest deadline first (EDF) provides the optimal schedulable utilization of 1, in contrast to less than 0.69 with fixed priority scheduling such as rate-monotonic (RM). In the periodic real-time task model, a task's processor utilization is defined as execution time over period. Every set of periodic tasks with total processor utilization less than or equal to the schedulable utilization of an algorithm can be feasibly scheduled by that algorithm. Unlike fixed priority, dynamic priority scheduling can dynamically prioritize task deadlines, achieving optimal schedulable utilization in the preemptible case. See also Earliest deadline first scheduling Least slack time scheduling
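The utilization bounds above can be sketched as simple schedulability tests for task sets given as (execution time, period) pairs; the task set and function names below are illustrative assumptions:

```python
# Schedulable-utilization tests for periodic tasks on one processor.

def utilization(tasks):
    """Total processor utilization: sum of execution_time / period."""
    return sum(c / t for c, t in tasks)

def edf_schedulable(tasks):
    # EDF (dynamic priority) is optimal: feasible iff U <= 1.
    return utilization(tasks) <= 1.0

def rm_schedulable(tasks):
    # Sufficient (not necessary) bound for rate-monotonic fixed priorities:
    # U <= n * (2**(1/n) - 1), which tends to ln 2 ≈ 0.693 as n grows.
    n = len(tasks)
    return utilization(tasks) <= n * (2 ** (1 / n) - 1)

tasks = [(1, 4), (2, 5), (2.5, 10)]   # U = 0.25 + 0.4 + 0.25 = 0.9
print(edf_schedulable(tasks))  # → True
print(rm_schedulable(tasks))   # → False (0.9 exceeds the RM bound ≈ 0.78 for n = 3)
```

The example shows the practical gap: a task set with 90% utilization is feasible under EDF but fails the RM sufficient test (a task set failing the RM bound may still happen to be RM-schedulable, since the bound is only sufficient).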
https://en.wikipedia.org/wiki/Mathematikum
The Mathematikum is a science museum, located in Gießen, Germany, which offers a huge variety of mathematical hands-on exhibits. It was founded by Albrecht Beutelspacher, a German mathematician. The Mathematikum opened its doors to visitors on 19 November 2002. It was inaugurated by the German president Johannes Rau. Since then, the museum has attracted more than 1,500,000 visitors. Annually the museum is visited by more than 150,000 people. The museum is open every day of the week, including Sundays and Mondays. Concept The purpose of the Mathematikum is to let people of any age, gender and qualification learn mathematics by personal experience, rather than through formulae or equations, and hardly ever using numbers and symbols. Visitors can therefore learn by participating in more than 150 interactive exhibits in the museum, gathering a different mathematical experience from each of the exhibits. Exhibits Mathematical experiments include mirrors, a Leonardo bridge, soap films, and puzzles. Once every month on a Tuesday, a mathematician is invited. The mathematician is interviewed by professor Beutelspacher on Beutelspachers Sofa (Beutelspacher's Couch). At the end of the interview the audience can talk to the guest and ask them questions. Awards 2004: IQ Award External links http://www.mathematikum.de Museums in Hesse Science museums in Germany Museums established in 2002 2002 establishments in Germany Giessen
https://en.wikipedia.org/wiki/Propionyl-CoA
Propionyl-CoA is a coenzyme A derivative of propionic acid. It is composed of a 24 total carbon chain (without the coenzyme, it is a 3 carbon structure) and its production and metabolic fate depend on which organism it is present in. Several different pathways can lead to its production, such as through the catabolism of specific amino acids or the oxidation of odd-chain fatty acids. It later can be broken down by propionyl-CoA carboxylase or through the methylcitrate cycle. In different organisms, however, propionyl-CoA can be sequestered into controlled regions, to alleviate its potential toxicity through accumulation. Genetic deficiencies regarding the production and breakdown of propionyl-CoA also have great clinical and human significance. Production There are several different pathways through which propionyl-CoA can be produced: Propionyl-CoA, a three-carbon structure, is considered to be a minor species of propionic acid. Therefore, odd-number chains of fatty acids are oxidized to yield both propionyl-CoA as well as acetyl-CoA. Propionyl-CoA is later converted into succinyl-CoA through biotin-dependent propionyl-CoA carboxylase (PCC) and B12-dependent methylmalonyl-CoA mutase (MCM), sequentially. Propionyl-CoA is not only produced from the oxidation of odd-chain fatty acids, but also by the oxidation of amino acids including methionine, valine, isoleucine, and threonine. Furthermore, catabolism of amino acids can also be a result of the conversion of propionyl-CoA to methylmalonyl-CoA by propionyl-CoA carboxylase. Cholesterol oxidation, which forms bile acids, also forms propionyl-CoA as a side product. In an experiment performed by Suld et al., when combining liver mitochondria and propionic acid with the addition of coenzyme A, labeled isotopes of propionic acid were degraded. However, following 5β-cholestane-3α,7α,12α,26-tetrol-26,27-C14 incubation, propionyl-CoA was able to be rescued along with the formation of bile. Metabolic fate The metabolic (
https://en.wikipedia.org/wiki/Developmental%20systems%20theory
Developmental systems theory (DST) is an overarching theoretical perspective on biological development, heredity, and evolution. It emphasizes the shared contributions of genes, environment, and epigenetic factors on developmental processes. DST, unlike conventional scientific theories, is not directly used to help make predictions for testing experimental results; instead, it is seen as a collection of philosophical, psychological, and scientific models of development and evolution. As a whole, these models argue for the inadequacy of the modern evolutionary synthesis on the roles of genes and natural selection as the principal explanation of living structures. Developmental systems theory embraces a large range of positions that expand biological explanations of organismal development and hold modern evolutionary theory as a misconception of the nature of living processes. Overview All versions of developmental systems theory espouse the view that: All biological processes (including both evolution and development) operate by continually assembling new structures. Each such structure transcends the structures from which it arose and has its own systematic characteristics, information, functions and laws. Conversely, each such structure is ultimately irreducible to any lower (or higher) level of structure, and can be described and explained only on its own terms. Furthermore, the major processes through which life as a whole operates, including evolution, heredity and the development of particular organisms, can only be accounted for by incorporating many more layers of structure and process than the conventional concepts of ‘gene’ and ‘environment’ normally allow for. In other words, although it does not claim that all structures are equal, developmental systems theory is fundamentally opposed to reductionism of all kinds. In short, developmental systems theory intends to formulate a perspective which does not presume the causal (or ontological) priority of any p
https://en.wikipedia.org/wiki/Curve%20%28tonality%29
In image editing, a curve is a remapping of image tonality, specified as a function from input level to output level, used as a way to emphasize colours or other elements in a picture. Curves can usually be applied to all channels together in an image, or to each channel individually. Applying a curve to all channels typically changes the brightness in part of the spectrum. Light parts of a picture can be easily made lighter and dark parts darker to increase contrast. Applying a curve to individual channels can be used to stress a colour. This is particularly efficient in the Lab colour space due to the separation of luminance and chromaticity, but it can also be used in RGB, CMYK or whatever other colour models the software supports. See also Blend modes Image histogram Hurter–Driffield curve Tone reproduction curve
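Since a curve is just a function from input level to output level, it is commonly applied as a precomputed lookup table (LUT). A hedged sketch for 8-bit levels (the smoothstep-style S-curve used here is an arbitrary example, not a standard from any editor):

```python
# Apply a tonal curve by precomputing output for every 8-bit input level.

def build_curve_lut(curve):
    """Tabulate the curve over 0..255, clamping to the valid 8-bit range."""
    return [min(255, max(0, round(curve(i)))) for i in range(256)]

# An S-shaped contrast curve: darkens shadows, lightens highlights.
lut = build_curve_lut(lambda x: 255 * ((x / 255) ** 2 * (3 - 2 * (x / 255))))

pixels = [0, 64, 128, 192, 255]
print([lut[p] for p in pixels])  # midtones near 128 stay put; extremes push apart
```

Applying the same LUT per channel reproduces the "all channels together" mode described above; applying different LUTs to R, G, and B (or to Lab channels) gives per-channel colour adjustments.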
https://en.wikipedia.org/wiki/Fred%20Diamond
Fred Irvin Diamond (born November 19, 1964) is a mathematician, known for his role in proving the modularity theorem for elliptic curves. His research interest is in modular forms and Galois representations. Life Diamond received his B.A. from the University of Michigan in 1984, and received his Ph.D. in mathematics from Princeton University in 1988 as a doctoral student of Andrew Wiles. He has held positions at Brandeis University and Rutgers University, and is currently a professor at King's College London. Diamond is the author of several research papers, and is also a coauthor along with Jerry Shurman of A First Course in Modular Forms, in the Graduate Texts in Mathematics series published by Springer-Verlag.
https://en.wikipedia.org/wiki/List%20of%20temperature%20sensors
Mechanical temperature sensors Thermometer Therm Electrical temperature sensors Thermistor- Thermistors are thermally sensitive resistors whose prime function is to exhibit a large, predictable and precise change in electrical resistance when subjected to a corresponding change in body temperature. Negative Temperature Coefficient (NTC) thermistors exhibit a decrease in electrical resistance when subjected to an increase in body temperature and Positive Temperature Coefficient (PTC) thermistors exhibit an increase in electrical resistance when subjected to an increase in body temperature. Thermocouple Resistance thermometer Silicon bandgap temperature sensor Integrated circuit sensors The integrated circuit sensors may come in a variety of interfaces — analogue or digital; for digital, these could be Serial Peripheral Interface, SMBus/I2C or 1-Wire. In OpenBSD, many of the I2C temperature sensors from the below list have been supported and are accessible through the generalised hardware sensors framework since OpenBSD 3.9 (2006), which has also included an ad-hoc method of automatically scanning the I2C bus by default during system boot since 2006 as well. In NetBSD, many of these I2C sensors are also supported and are accessible through the envsys framework, although none are enabled by default outside of Open Firmware architectures like macppc, and a manual configuration is required before first use on i386 or amd64. Remote uncooled IR thermal radiometer sensors are also commonly used in integrated circuits. List Non-exhaustive list of products classified by manufacturer. Legend Manufacturer : IC Manufacturer Part Number : IC Part Number Output Type : We can find 3 different Output types : Analog, Digital and Switch Designation : IC Designation Temperature Range : Die temperature range where the IC can operate. Accuracy (Typical) : Typical IC accuracy Accuracy (Max) : Maximum IC accuracy Linear Temperature Slope : Linear temperature slope
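The NTC behaviour described above is often approximated with the beta-parameter model R(T) = R₀·exp(B·(1/T − 1/T₀)). A small sketch, assuming typical (not vendor-specific) part values of 10 kΩ at 25 °C and B = 3950 K:

```python
import math

# Beta-parameter model for an NTC thermistor; temperatures in kelvin.
def ntc_resistance(t_celsius, r0=10_000.0, b=3950.0, t0_celsius=25.0):
    t = t_celsius + 273.15
    t0 = t0_celsius + 273.15
    return r0 * math.exp(b * (1.0 / t - 1.0 / t0))

# Resistance falls as temperature rises (negative temperature coefficient):
print(round(ntc_resistance(0.0)))    # well above 10 kΩ
print(round(ntc_resistance(25.0)))   # → 10000 (the reference point)
print(round(ntc_resistance(50.0)))   # well below 10 kΩ
```

A PTC thermistor would show the opposite trend; for higher accuracy over wide ranges, the three-coefficient Steinhart–Hart equation is normally used instead of the beta model.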
https://en.wikipedia.org/wiki/Fiwix
Fiwix is an operating system kernel based on the UNIX architecture and fully focused on being POSIX compatible. It is designed and developed mainly as a hobbyist operating system, but it also serves educational purposes. It runs on the i386 hardware platform and is compatible with a good base of existing GNU applications. It follows the System V Application Binary Interface and is also mostly compatible with the Linux 2.0 system call ABI. The FiwixOS 3.2 operating system is a Fiwix distribution. It uses the Fiwix kernel, includes the GNU toolchain (GCC, Binutils, Make), uses Newlib v4.2.0 as its C standard library, and Ext2 as its primary file system. Features Features according to the official website include: GRUB Multiboot Specification v1 compliant. Full 32bit protected mode non-preemptive kernel. POSIX compliant (mostly). For i386 processors and higher. Process groups, sessions and job control. Interprocess communication with pipes and signals. UNIX System V IPC (semaphores, message queues and shared memory). BSD file locking mechanism (POSIX restricted to whole file and advisory only). Virtual memory management up to 4GB (1GB physical only and no swapping yet). Linux 2.0 ABI system calls compatibility (mostly). ELF-386 executable format support (statically and dynamically linked). Round Robin based scheduler algorithm (no priorities yet). VFS abstraction layer. Ext2 filesystem support with 1KB, 2KB and 4KB block sizes. Minix v1 and v2 filesystem support. Linux-like Proc filesystem support (read only). ISO9660 filesystem support with Rock Ridge extensions. RAMdisk device support. Initial RAMdisk (initrd) image support. SVGALib based applications support. PCI local bus support. Virtual consoles support (up to 12). Keyboard driver with Linux keymaps support. Frame buffer device support for VESA VBE 2.0+ compliant graphic cards. Serial port RS-232 driver support. Remote serial console support. QEMU Bochs-style debug console support. Basi
https://en.wikipedia.org/wiki/Mathematical%20Grammar%20School
Mathematical Grammar School (, abbr. "MG" or "MGB"), is a special school for gifted and talented students of mathematics, physics and informatics located in Belgrade, Serbia. It is ranked number one at International Science Olympiads by the number of medals won by its students (more than 400). The School has developed its own Mathematical Grammar School Curriculum in various mathematics, physics, and IT subjects. There are approx. 160 professors employed, mostly scientists. One half of the professors come from University of Belgrade staff, the Institute of Physics Belgrade, and the Mathematical Institute of the Serbian Academy of Sciences and Arts. More than half of the professors are former students of the School. The School's staff maintains connections to, collaborates with, and frequently visits the world's leading scientific institutions, such as CERN, the Joint Institute for Nuclear Research – Dubna, Lomonosov Moscow State University, UC Berkeley, Oxford, Cambridge, Warwick University, and Imperial College London. During the previous decade, students received full scholarships to UC Berkeley, Oxford, Cambridge, Warwick University, Imperial College London, the Massachusetts Institute of Technology (MIT), Columbia University, Stanford University, Harvard University, and University College London. The rest mostly obtain full scholarships from the University of Belgrade. The School has 550 students, aged 12–19. There are 155 girls and 395 boys. The average professors' work experience is 18 years. Reputation Mathematical Grammar School is ranked as number one in the world in the field of mathematics. The School is famous for its unique results in international competitions and for the results its students achieve at later stages of their university education. The School has approx. 2,000 PhD holders among its alumni, and approx. 6,000 Master of Science degree holders. Its advanced curriculum earned it an esteemed reputation as a breeding ground for future scientists, researchers, and industry leaders. Ma
https://en.wikipedia.org/wiki/CAPS%20%28buffer%29
CAPS is the common name for N-cyclohexyl-3-aminopropanesulfonic acid, a chemical used as a buffering agent in biochemistry. Its useful pH range is 9.7–11.1. The similar substance N-cyclohexyl-2-hydroxy-3-aminopropanesulfonic acid (CAPSO) is also used as a buffering agent in biochemistry. See also CHES Good's buffers § List of Good's buffers
https://en.wikipedia.org/wiki/Idle%20scan
An idle scan is a TCP port scan method for determining what services are open on a target computer without leaving traces pointing back at oneself. This is accomplished by using packet spoofing to impersonate another computer (called a "zombie") so that the target believes it's being accessed by the zombie. The target will respond in different ways depending on whether the port is open, which can in turn be detected by querying the zombie. Overview This action can be done through common software network utilities such as nmap and hping. The attack involves sending forged packets to a specific machine target in an effort to find distinct characteristics of another zombie machine. The attack is sophisticated because there is no interaction between the attacker computer and the target: the attacker interacts only with the "zombie" computer. This exploit serves two purposes: as a port scanner and as a mapper of trusted IP relationships between machines. The target system interacts with the "zombie" computer, and differences in behavior can be observed using different "zombies", with evidence of different privileges granted by the target to different computers. The overall intention behind the idle scan is to "check the port status while remaining completely invisible to the targeted host." Origins Discovered by Salvatore Sanfilippo (also known by his handle "Antirez") in 1998, the idle scan has been used by many black hat "hackers" to covertly identify open ports on a target computer in preparation for attacking it. Although it was originally named dumb scan, the term idle scan was coined in 1999, after the publication of a proof-of-concept 16-bit identification field (IPID) scanner named idlescan, by Filipe Almeida (aka LiquidK). This type of scan can also be referred to as a zombie scan; all the nomenclature is due to the nature of one of the computers involved in the attack. TCP/IP basics The design and operation of the Internet is based on the Internet Protoc
https://en.wikipedia.org/wiki/Dichloroisocyanuric%20acid
Dichloroisocyanuric acid, also known as dichlor or dichloro-s-triazinetrione and marketed under many names (e.g. troclosene), is a chemical compound with the formula (C(O)NCl)2(C(O)NH). It is a colourless solid. Synthesis Dichloroisocyanuric acid is manufactured by chlorination of cyanuric acid: (C(O)NH)3 + 2 Cl2 → (C(O)NCl)2(C(O)NH) + 2 HCl Mechanism of action Dichloroisocyanuric acid is an oxidizer, reacting with water to release chlorine. Although the bleaching agent in most chlorine-based bleach is sodium hypochlorite, the sodium salt of dichloroisocyanuric acid, sodium dichloroisocyanurate, is the active ingredient in commercial disinfectants, bactericides, algicides, and cleaning agents such as the pulverized cleanser Comet. See also Trichloroisocyanuric acid (trichlor)
https://en.wikipedia.org/wiki/Clostridial%20necrotizing%20enteritis
Clostridial necrotizing enteritis (CNE) is a severe and potentially fatal type of food poisoning caused by a β-toxin of Clostridium perfringens, Type C. It occurs in some developing regions, particularly in New Guinea, where it is known as pig-bel. The disease was also documented in Germany following World War II, where it was called Darmbrand (literally "bowel fire," or bowel necrosis). The toxin is normally inactivated by certain proteolytic enzymes and by normal cooking, but when these protections are impeded by diverse factors, and high protein is consumed, the disease can emerge. Sporadic and extremely rare cases occur in diabetics. In New Guinea, where people generally have low-protein diets apart from tribal feasts, a number of factors—diet and endemic helminth infections among them—compound to result in pig-bel. Preterm infants The majority of preterm infants who develop NEC are generally healthy, feeding well, and growing prior to developing NEC. The most frequent sign of NEC is a sudden change in feeding tolerance, which can be manifest by numerous clinical signs listed below. While gastric residuals are often seen in early NEC, there is no evidence that routine measurement of gastric residual volumes in asymptomatic infants is a useful guide to prevent or detect the onset of NEC, or help to advance feeds. The timing of the onset of symptoms varies and appears to be inversely related to gestational age (GA). There appears to be a bimodal distribution (early versus late onset) based on GA. For example, the median age at onset of NEC in infants with a GA of less than 26 weeks was 23 days (late), and for those with a GA of greater than 31 weeks, the median age at onset was 11 days (early). Laboratory findings of infants presenting with NEC often include anemia, thrombocytopenia, evidence of disseminated intravascular coagulopathy (DIC), and in 20 percent of cases a positive blood culture. Signs and symptoms CNE is a necrotizing inflammation of the small b
https://en.wikipedia.org/wiki/Habit%20%28biology%29
Habit, equivalent to habitus in some applications in biology, refers variously to aspects of behaviour or structure, as follows: In zoology (particularly in ethology), habit usually refers to aspects of more or less predictable behaviour, instinctive or otherwise, though it also has broader application. Habitus refers to the characteristic form or morphology of a species. In botany, habit is the characteristic form in which a given species of plant grows (see plant habit). Behavior In zoology, habit (not to be confused with habitus as described below) usually refers to a specific behavior pattern, either adopted, learned, pathological, innate, or directly related to physiology. For example: ...the [cat] was in the habit of springing upon the [door knocker] in order to gain admission... If these sensitive parrots are kept in cages, they quickly take up the habit of feather plucking. The spider monkey has an arboreal habit and rarely ventures onto the forest floor. The brittle star has the habit of breaking off arms as a means of defense. Mode of life (or lifestyle, modus vivendi) is a concept related to habit, and it is sometimes referred to as the habit of an animal. It may refer to the locomotor capabilities (motile, sessile, errant, sedentary), feeding behaviour and mechanisms, nutrition mode (free-living, parasitic, holozoic, saprotrophic, trophic type), type of habitat (terrestrial, arboreal, aquatic, marine, freshwater, seawater, benthic, pelagic, nektonic, planktonic, etc.), period of activity (diurnal, nocturnal), types of ecological interaction, etc. The habits of plants and animals often change in response to changes in their environment. For example: if a species develops a disease or there is a drastic change of habitat or local climate, or it is removed to a different region, then the normal habits may change. Such changes may be either pathological or adaptive. Structure In botany, habit is the general appearance, growth form,
https://en.wikipedia.org/wiki/Batman%3A%20Legacy
Legacy is a crossover story arc in the Batman comic book series, which is a sequel to another Batman story arc, Contagion and also serves as a follow-up to the Knightfall story arc. The tagline is: "The stakes are higher than they've ever been as Batman and his outnumbered forces race to solve a riddle from the distant past that threatens to erase all of mankind's tomorrow". The story concerns the returning outbreak of a lethal disease in Gotham City, and Batman's attempts to combat it with his closest allies by discovering its origin in the Middle East. The disease is known as the Apocalypse Plague, the Filovirus, Ebola Gulf A, or its more popular nickname: the Clench. An unlikely alliance searches the world for a possible cure including: Batman, Robin, Oracle, Nightwing, Huntress, Azrael, and Catwoman. There, Batman faces two of his deadliest foes: Ra's al Ghul and Bane. The Gotham Knights travel throughout the world as they race to stop the League of Assassins from releasing the pure strain of the virus across the globe, and Gotham itself would be a place for the rematch between the Dark Knight and Bane. Ra's continues his search for the suitable mate for his daughter, Talia al Ghul. Meanwhile, Batman leads the chase for a cure to save the life of Tim Drake (Robin) and prevent the end of the world. The events of this lead into Batman: Cataclysm (though there was a gap of over a year between the two story arcs), which itself leads into Batman: No Man's Land. 1st Edition - 1st printing. Collects Batman (1940-2011) #533-534, Batman: Shadow of the Bat (1992-2000) #53-54, Catwoman (1993-2001 2nd Series) #35-36, Detective Comics (1937-2011 1st Series) #699-702, and Robin (1993-2009) #31-33. Written by Chuck Dixon, Doug Moench, and Alan Grant. Art by Graham Nolan, Jim Aparo, Staz Johnson, Dave Taylor, Jim Balent, and Mike Wieringo. Batman and his small cadre of allies race against the clock to stop a threat from the past that may wipe out mankind's future. Reading
https://en.wikipedia.org/wiki/Nullary%20constructor
In computer programming, a nullary constructor is a constructor that takes no arguments, also known as a 0-argument constructor, no-argument constructor, or default constructor. Object-oriented constructors In object-oriented programming, a constructor is code that is run when an object is created. Default constructors of objects are usually nullary. Java example

public class Example {
    protected int data;

    /* Nullary constructor */
    public Example() {
        this(0);
    }

    /* Non-nullary constructor */
    public Example(final int data) {
        this.data = data;
    }
}

Algebraic data types In algebraic data types, a constructor is one of many tags that wrap data. If a constructor does not take any data arguments, it is nullary. Haskell example

-- nullary type constructor with two nullary data constructors
data Bool = False | True

-- non-nullary type constructor with one non-nullary data constructor
data Point a = Point a a

-- non-nullary type constructor with...
data Maybe a = Nothing -- ...nullary data constructor
             | Just a  -- ...unary data constructor
https://en.wikipedia.org/wiki/Energetic%20space
In mathematics, more precisely in functional analysis, an energetic space is, intuitively, a subspace of a given real Hilbert space equipped with a new "energetic" inner product. The motivation for the name comes from physics, as in many physical problems the energy of a system can be expressed in terms of the energetic inner product. An example of this will be given later in the article. Energetic space Formally, consider a real Hilbert space $X$ with the inner product $(\cdot|\cdot)$ and the norm $\|\cdot\|$. Let $Y$ be a linear subspace of $X$ and $B : Y \to X$ be a strongly monotone symmetric linear operator, that is, a linear operator satisfying $(Bu|v) = (u|Bv)$ for all $u, v$ in $Y$, and $(Bu|u) \ge c\,\|u\|^2$ for some constant $c > 0$ and all $u$ in $Y$. The energetic inner product is defined as $(u|v)_E = (Bu|v)$ for all $u, v$ in $Y$, and the energetic norm is $\|u\|_E = (u|u)_E^{1/2}$ for all $u$ in $Y$. The set $Y$ together with the energetic inner product is a pre-Hilbert space. The energetic space $X_E$ is defined as the completion of $Y$ in the energetic norm. $X_E$ can be considered a subset of the original Hilbert space $X$, since any Cauchy sequence in the energetic norm is also Cauchy in the norm of $X$ (this follows from the strong monotonicity property of $B$). The energetic inner product is extended from $Y$ to $X_E$ by $(u|v)_E = \lim_{n\to\infty} (u_n|v_n)_E$, where $(u_n)$ and $(v_n)$ are sequences in $Y$ that converge to points in $X_E$ in the energetic norm. Energetic extension The operator $B$ admits an energetic extension $B_E : X_E \to X_E^*$, defined on $X_E$ with values in the dual space $X_E^*$, that is given by the formula $\langle B_E u \mid v \rangle_E = (u|v)_E$ for all $u, v$ in $X_E$. Here, $\langle \cdot \mid \cdot \rangle_E$ denotes the duality bracket between $X_E^*$ and $X_E$, so $\langle B_E u \mid v \rangle_E$ actually denotes $(B_E u)(v)$. If $u$ and $v$ are elements in the original subspace $Y$, then $\langle B_E u \mid v \rangle_E = (u|v)_E = (Bu|v)$ by the definition of the energetic inner product. If one views $Bu$, which is an element in $X$, as an element in the dual $X^*$ via the Riesz representation theorem, then $Bu$ will also be in the dual $X_E^*$ (by the strong monotonicity property of $B$). Via these identifications, it follows from the above formula that $B_E u = Bu$. In different words, the original operator $B : Y \to X$ can be viewed as an operator $B : Y \to X_E^*$, and then $B_E : X_E \to X_E^*$ is simply the function extension of $B$ from $Y$ to $X_E$. An example from physics Consider a strin