https://en.wikipedia.org/wiki/Death%20zone
In mountaineering, the death zone refers to altitudes above a certain point where the pressure of oxygen is insufficient to sustain human life for an extended time span. This point is generally tagged as 8,000 m (26,000 ft), where atmospheric pressure is less than 356 millibars (35.6 kPa). The concept was conceived in 1953 by Edouard Wyss-Dunant, a Swiss doctor, who called it the lethal zone. All 14 peaks above 8000 m in the death zone are located in the Himalaya and Karakoram of Asia. Many deaths in high-altitude mountaineering have been caused by the effects of the death zone, either directly by loss of vital functions or indirectly by wrong decisions made under stress or by physical weakening leading to accidents. An extended stay above 8,000 m without supplementary oxygen will result in deterioration of bodily functions and death. Physiological background The human body has its optimal endurance at low elevations. The concentration of oxygen (O2) in air is 20.9%, so the partial pressure of O2 (PO2) at sea level is about 21.2 kPa (159 mmHg). In healthy individuals, this saturates hemoglobin, the oxygen-binding red pigment in red blood cells. Atmospheric pressure decreases with altitude while the O2 fraction remains essentially constant, so PO2 decreases with altitude as well. It is about half of its sea-level value at roughly 5,500 m, near the altitude of the Mount Everest base camp, and less than a third at 8,849 m, the summit of Mount Everest. When PO2 drops, the body responds with altitude acclimatization. Additional red blood cells are manufactured; the heart beats faster; non-essential body functions are suppressed; food-digestion efficiency declines (as the body suppresses the digestive system in favor of increasing its cardiopulmonary reserves); and one breathes more deeply and more frequently. But acclimatization requires days or even weeks. Failure to acclimatize may result in altitude sickness, including high-altitude pulmonary edema (HAPE) or high-altitude cerebral edema (HACE). Humans have survived for 2 years at [ of atmospheric pressure], which appears to be ne
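To make the partial-pressure arithmetic above concrete, here is a minimal Python sketch; the O2 fraction comes from the text, while the atmospheric-pressure figures are assumed representative values used only for illustration.

```python
# Minimal sketch: partial pressure of O2 = O2 fraction x total atmospheric pressure.
# The pressure figures below are assumed representative values, not data from the article.
O2_FRACTION = 0.209

pressures_kpa = {
    "sea level (0 m)": 101.3,
    "Everest base camp (~5,300 m)": 54.0,   # roughly half of sea level
    "Everest summit (~8,850 m)": 33.7,      # roughly a third of sea level
}

for place, pressure in pressures_kpa.items():
    po2 = O2_FRACTION * pressure
    print(f"{place}: total {pressure:.1f} kPa -> PO2 about {po2:.1f} kPa")
```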
https://en.wikipedia.org/wiki/Sensu
Sensu is a Latin word meaning "in the sense of". It is used in a number of fields including biology, geology, linguistics, semiotics, and law. Commonly it refers to how strictly or loosely an expression is used in describing any particular concept, but it also appears in expressions that indicate the convention or context of the usage. Common qualifiers Sensu is the ablative case of the noun sensus, here meaning "sense". It is often accompanied by an adjective (in the same case). Three such phrases are: sensu stricto – "in the strict sense", abbreviation s.s. or s.str.; sensu lato – "in the broad sense", abbreviation s.l.; sensu amplo – "in a relaxed, generous (or 'ample') sense", a similar meaning to sensu lato. Søren Kierkegaard uses the phrase sensu eminenti to mean "in the pre-eminent [or most important or significant] sense". When appropriate, comparative and superlative adjectives may also be used to convey the meaning of "more" or "most". Thus sensu stricto becomes sensu strictiore ("in the stricter sense" or "more strictly speaking") and sensu strictissimo ("in the strictest possible sense" or "most strictly speaking"). Current definitions of the plant kingdom (Plantae) offer a biological example of when such phrases might be used. One definition of Plantae is that it consists of all green plants (comprising green algae and land plants), all red algae and all glaucophyte algae. A stricter definition excludes the red and glaucophyte algae; the group defined in this way could be called Plantae in sensu stricto. An even stricter definition excludes green algae, leaving only land plants; the group defined in this way could be called Plantae in sensu strictiore. Conversely, where convenient, some authors derive expressions such as "sensu non strictissimo", meaning "not in the narrowest possible sense". A similar form is in use to indicate the sense of a particular context, such as "Nonmonophyletic groups are ... nonnatural (sensu cladistics) in that .
https://en.wikipedia.org/wiki/Chronemics
Chronemics is an anthropological, philosophical, and linguistic subdiscipline that describes how time is perceived, coded, and communicated across a given culture. It is one of several subcategories to emerge from the study of nonverbal communication. According to the Encyclopedia of Special Education, "Chronemics includes time orientation, understanding and organisation, the use of and reaction to time pressures, the innate and learned awareness of time, by physically wearing or not wearing a watch, arriving, starting, and ending late or on time." A person's perception of and the values they place on time play a considerable role in their communication process. The use of time can affect lifestyles, personal relationships, and work life. Across cultures, people usually have different time perceptions, and this can result in conflicts between individuals. Time perceptions include punctuality, interactions, and willingness to wait. Definition Chronemics is the study of the use of time in nonverbal communication. Time perceptions include punctuality, willingness to wait, and interactions. The use of time can affect lifestyles, daily agendas, speed of speech, movements, and how long people are willing to listen. Thomas J. Bruneau, a professor in communication at Radford University who focused his studies on nonverbal communication, interpersonal communication, and intercultural communication, coined the term "chronemics" in the late 1970s to help define the function of time in human interaction. Time can be used as an indicator of status. For example, in most companies the boss can interrupt progress to hold an impromptu meeting in the middle of the work day, yet the average worker would have to make an appointment to see the boss. The way in which different cultures perceive time can influence communication as well. Monochronic time A monochronic time system means that things are done one at a time and time is segmented into small precise units. Under this system, time
https://en.wikipedia.org/wiki/Index%20register
An index register in a computer's CPU is a processor register (or an assigned memory location) used for pointing to operand addresses during the run of a program. It is useful for stepping through strings and arrays. It can also be used for holding loop iterations and counters. In some architectures it is used for reading/writing blocks of memory. Depending on the architecture it may be a dedicated index register or a general-purpose register. Some instruction sets allow more than one index register to be used; in that case additional instruction fields may specify which index registers to use. Generally, the contents of an index register are added to (or in some cases subtracted from) an immediate address (that can be part of the instruction itself or held in another register) to form the "effective" address of the actual data (operand). Special instructions are typically provided that test the index register and, if the test fails, increment the index register by an immediate constant and branch, typically to the start of the loop. While normally processors that allow an instruction to specify multiple index registers add the contents together, IBM had a line of computers in which the contents were or'd together. Index registers have proved useful for doing vector/array operations and in commercial data processing for navigating from field to field within records. In both uses index registers substantially reduced the amount of memory used and increased execution speed. History In early computers without any form of indirect addressing, array operations had to be performed by modifying the instruction address, which required several additional program steps and used up more computer memory, a scarce resource in computer installations of the early era (as well as in early microcomputers two decades later). Index registers, commonly known as B-lines in early British computers, as B-registers on some machines and X-registers on others, were first used in the British M
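As a rough illustration of the addressing scheme described above (not modeled on any particular architecture), the following Python sketch forms an effective address by adding an index register to a base address and steps through an array with a test-increment-branch style loop.

```python
# Minimal sketch (not from the article): forming an effective address by adding
# the contents of an index register to an immediate/base address, then stepping
# the index register through an array the way a test-and-branch instruction would.
memory = {100 + i: v for i, v in enumerate([3, 1, 4, 1, 5, 9, 2, 6])}  # array stored at base address 100

base_address = 100      # immediate address encoded in the instruction
index_register = 0      # contents of the index register
limit = 8               # number of elements to visit
total = 0

while index_register < limit:                            # "test the index register"
    effective_address = base_address + index_register    # effective address = base + index
    total += memory[effective_address]                    # operand fetched via the effective address
    index_register += 1                                   # increment by an immediate constant and branch back

print("sum of array elements:", total)
```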
https://en.wikipedia.org/wiki/Puzzle%20Panic
Puzzle Panic, also known as Ken Uston's Puzzle Panic, is a video game created by blackjack strategist Ken Uston, Bob Polin (designer of Blue Max), and Ron Karr. It was published by Epyx in 1984 for the Atari 8-bit family and Commodore 64. Gameplay The player guides Benny, a light bulb, through a series of 11 puzzles, each with varying difficulty settings (a total of over 40 levels). At the completion of each level, there are a few available exits, each bearing an obscure symbol, which take Benny forward or back in the game (or possibly to repeat the level). The final level, the "Metasequence," is a cryptic puzzle with a non-explicit objective. Its original purpose was part of a contest: those who solved it correctly by the August 13, 1984 deadline could enter in a drawing to win a weekend at an Atlantic City casino with co-creator Ken Uston. Development A pre-release version of the game was called PuzzleMania. Reception Steve Panak wrote in ANALOG Computing, "Puzzle Panic is so radically different, so unlike anything else you've ever set your cathode-raybloodshot eyes on, that there's no readily memorable program to compare it with," and called the game "addictive." He disliked the brief window for winning the contest; it had already expired by the time he played. Fred Pinho wrote in Antic:
https://en.wikipedia.org/wiki/Deep%20Blue%20%28chess%20computer%29
Deep Blue was a chess-playing expert system run on a unique purpose-built IBM supercomputer. It was the first computer to win a game, and the first to win a match, against a reigning world champion under regular time controls. Development began in 1985 at Carnegie Mellon University under the name ChipTest. It then moved to IBM, where it was first renamed Deep Thought, then again in 1989 to Deep Blue. It first played world champion Garry Kasparov in a six-game match in 1996, where it lost four games to two. It was upgraded in 1997 and, in a six-game re-match, it defeated Kasparov by winning two games and drawing three. Deep Blue's victory is considered a milestone in the history of artificial intelligence and has been the subject of several books and films. History While a doctoral student at Carnegie Mellon University, Feng-hsiung Hsu began development of a chess-playing supercomputer under the name ChipTest. The machine won the North American Computer Chess Championship in 1987, and Hsu and his team followed up with a successor, Deep Thought, in 1988. After Hsu received his doctorate in 1989, he and Murray Campbell joined IBM Research to continue their project to build a machine that could defeat a world chess champion. Their colleague Thomas Anantharaman briefly joined them at IBM before leaving for the finance industry and being replaced by programmer Arthur Joseph Hoane. Jerry Brody, a long-time employee of IBM Research, subsequently joined the team in 1990. After Deep Thought's two-game 1989 loss to Kasparov, IBM held a contest to rename the chess machine: the winning name, "Deep Blue", submitted by Peter Fitzhugh Brown, was a play on IBM's nickname, "Big Blue". After a scaled-down version of Deep Blue played Grandmaster Joel Benjamin, Hsu and Campbell decided that Benjamin was the expert they were looking for to help develop Deep Blue's opening book, so they hired him to assist with the preparations for Deep Blue's matches against Garry Kasparov. In 1995, a Deep
https://en.wikipedia.org/wiki/Ricker%20model
The Ricker model, named after Bill Ricker, is a classic discrete population model which gives the expected number N_{t+1} (or density) of individuals in generation t + 1 as a function of the number of individuals N_t in the previous generation: N_{t+1} = N_t exp(r (1 - N_t / k)). Here r is interpreted as an intrinsic growth rate and k as the carrying capacity of the environment. Unlike some other models such as the logistic map, the carrying capacity in the Ricker model is not a hard barrier that cannot be exceeded by the population; it only determines the overall scale of the population. The Ricker model was introduced in 1954 by Ricker in the context of stock and recruitment in fisheries. The model can be used to predict the number of fish that will be present in a fishery. Subsequent work has derived the model under other assumptions such as scramble competition, within-year resource-limited competition, or even as the outcome of source-sink Malthusian patches linked by density-dependent dispersal. The Ricker model is a limiting case of the Hassell model, which takes the form N_{t+1} = k_1 N_t / (1 + k_2 N_t)^c. When c = 1, the Hassell model is simply the Beverton–Holt model. See also Population dynamics of fisheries Notes
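A minimal simulation sketch of the Ricker recursion given above; the growth rate, carrying capacity, and starting population below are illustrative assumptions.

```python
import math

def ricker_step(n, r, k):
    """One generation of the Ricker map: N_{t+1} = N_t * exp(r * (1 - N_t / k))."""
    return n * math.exp(r * (1.0 - n / k))

# Illustrative (assumed) parameters: intrinsic growth rate r and carrying capacity k.
r, k = 0.8, 1000.0
n = 50.0  # assumed initial population
for t in range(20):
    n = ricker_step(n, r, k)

print(f"population after 20 generations: {n:.1f}")  # settles near k for small r
```

For small r the iterates approach the carrying capacity, while larger r produces oscillations and eventually chaotic dynamics, which is why the model is a standard example in discrete population dynamics.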
https://en.wikipedia.org/wiki/3D%20cell%20culturing%20by%20magnetic%20levitation
3D cell culturing by magnetic levitation method (MLM) is the application of growing 3D tissue by inducing cells treated with magnetic nanoparticle assemblies in spatially varying magnetic fields, using neodymium magnetic drivers and promoting cell to cell interactions by levitating the cells up to the air/liquid interface of a standard petri dish. The magnetic nanoparticle assemblies consist of magnetic iron oxide nanoparticles, gold nanoparticles, and the polymer polylysine. 3D cell culturing is scalable, with the capability for culturing 500 cells to millions of cells or from single dish to high-throughput low volume systems. Once magnetized cultures are generated, they can also be used as the building block material, or the "ink", for the magnetic 3D bioprinting process. Overview Standard monolayer cell culturing on tissue culture plastic has notably improved our understanding of basic cell biology, but it does not replicate the complex 3D architecture of in vivo tissue, and it can significantly modify cell properties. This often compromises experiments in basic life science, leads to misleading drug-screening results on efficacy and toxicity, and produces cells that may lack the characteristics needed for developing tissue regeneration therapies. The future of cell culturing for fundamental studies and biomedical applications lies in the creation of multicellular structure and organization in three dimensions. Many schemes for 3D culturing are being developed or marketed, such as bio-reactors or protein-based gel environments. A 3D cell culturing system known as the Bio-Assembler uses biocompatible polymer-based reagents to deliver magnetic nanoparticles to individual cells so that an applied magnetic driver can levitate cells off the bottom of the cell culture dish and rapidly bring cells together near the air-liquid interface. This initiates cell-cell interactions in the absence of any artificial surface or matrix. Magnetic fields are designed to rapidly fo
https://en.wikipedia.org/wiki/Eumycetozoa
Eumycetozoa, or true slime molds, is a diverse group of protists that behave as slime molds and develop fruiting bodies, either as sorocarps or as sporocarps. It is a monophyletic group or clade within the phylum Amoebozoa that contains the myxogastrids, dictyostelids and protosporangiids. Characteristics Eumycetozoa is a clade that includes three groups of amoebozoan protists: Myxogastria, Dictyostelia and Protosporangiida—also known as Myxomycetes, Dictyosteliomycetes and Ceratiomyxomycetes, respectively. It is defined on a node-based approach as the least inclusive clade containing the species Dictyostelium discoideum (a dictyostelid), Physarum polycephalum (a myxogastrid) and Ceratiomyxa fruticulosa (a protosporangiid). All known members of Eumycetozoa generate fruiting bodies, either as sorocarps (in dictyostelids) or as sporocarps (in myxogastrids and protosporangiids). Within their life cycle, they may appear as single haploid amoeboid cells (in dictyostelids), or as flagellated amoebae with two cilia that give rise to obligate amoebae with no cilia, from which the sporocarps develop (in myxogastrids and protosporangiids). The flagellated amoebae of myxogastrids and protosporangiids and the non-flagellated amoebae of dictyostelids have a flat cell shape. They form wide pseudopodia with acutely pointed subpseudopodia (i.e. smaller pseudopodia that grow beneath). Unlike other amoebae, the pseudopodia lack a prominent streaming of granular cytoplasm. In eumycetozoans where sexual reproduction is well studied, the zygote cannibalizes haploid amoebae. Evolution Eumycetozoa is a well-supported clade within Amoebozoa. In independent phylogenetic analyses, it has been consistently recovered as the sister group to Archamoebae. The Eumycetozoa+Archamoebae clade is, in turn, the sister group to Variosea. Within Eumycetozoa, Dictyostelia has a basal position while Myxogastria and Protosporangiida form a clade. Together, these three groups are part of the large
https://en.wikipedia.org/wiki/Voigt%20profile
The Voigt profile (named after Woldemar Voigt) is a probability distribution given by a convolution of a Cauchy-Lorentz distribution and a Gaussian distribution. It is often used in analyzing data from spectroscopy or diffraction. Definition Without loss of generality, we can consider only centered profiles, which peak at zero. The Voigt profile is then V(x; σ, γ) = ∫ G(x′; σ) L(x − x′; γ) dx′ (the integral taken over all x′), where x is the shift from the line center, G(x; σ) = exp(−x²/(2σ²)) / (σ√(2π)) is the centered Gaussian profile, and L(x; γ) = γ / (π(x² + γ²)) is the centered Lorentzian profile. The defining integral can be evaluated as V(x; σ, γ) = Re[w(z)] / (σ√(2π)), where Re[w(z)] is the real part of the Faddeeva function evaluated for z = (x + iγ) / (σ√2). In the limiting cases of σ = 0 and γ = 0, V(x; σ, γ) simplifies to L(x; γ) and G(x; σ), respectively. History and applications In spectroscopy, a Voigt profile results from the convolution of two broadening mechanisms, one of which alone would produce a Gaussian profile (usually, as a result of the Doppler broadening), and the other would produce a Lorentzian profile. Voigt profiles are common in many branches of spectroscopy and diffraction. Due to the expense of computing the Faddeeva function, the Voigt profile is sometimes approximated using a pseudo-Voigt profile. Properties The Voigt profile is normalized (its integral over all x equals 1), since it is a convolution of normalized profiles. The Lorentzian profile has no moments (other than the zeroth), and so the moment-generating function for the Cauchy distribution is not defined. It follows that the Voigt profile will not have a moment-generating function either, but the characteristic function for the Cauchy distribution is well defined, as is the characteristic function for the normal distribution. The characteristic function for the (centered) Voigt profile will then be the product of the two: exp(−σ²t²/2 − γ|t|). Since normal distributions and Cauchy distributions are stable distributions, they are each closed under convolution (up to change of scale), and it follows that the Voigt distributions are also closed under convolution. Cumulative distribution function Using the above definition for z ,
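The Faddeeva-function evaluation described above can be checked numerically. Here is a small sketch assuming NumPy and SciPy are available (scipy.special.wofz computes w(z)); the width parameters are illustrative, and the script also verifies the normalization property.

```python
import numpy as np
from scipy.special import wofz  # Faddeeva function w(z)

def voigt(x, sigma, gamma):
    """Centered Voigt profile: V(x) = Re[w(z)] / (sigma*sqrt(2*pi)),
    with z = (x + i*gamma) / (sigma*sqrt(2))."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

# Illustrative (assumed) widths of the Gaussian and Lorentzian components.
sigma, gamma = 1.0, 0.5
x = np.linspace(-200.0, 200.0, 400_001)
dx = x[1] - x[0]
area = voigt(x, sigma, gamma).sum() * dx
print(f"numerical area under the profile: {area:.4f}")  # close to 1, as expected for a normalized profile
```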
https://en.wikipedia.org/wiki/High-nutrient%2C%20low-chlorophyll%20regions
High-nutrient, low-chlorophyll (HNLC) regions are regions of the ocean where the abundance of phytoplankton is low and fairly constant despite the availability of macronutrients. Phytoplankton rely on a suite of nutrients for cellular function. Macronutrients (e.g., nitrate, phosphate, silicic acid) are generally available in higher quantities in surface ocean waters, and are the typical components of common garden fertilizers. Micronutrients (e.g., iron, zinc, cobalt) are generally available in lower quantities and include trace metals. Macronutrients are typically available in millimolar concentrations, while micronutrients are generally available in micro- to nanomolar concentrations. In general, nitrogen tends to be a limiting ocean nutrient, but in HNLC regions it is never significantly depleted. Instead, these regions tend to be limited by low concentrations of metabolizable iron. Iron is a critical phytoplankton micronutrient necessary for enzyme catalysis and electron transport. Between the 1930s and '80s, it was hypothesized that iron is a limiting ocean micronutrient, but there were not sufficient methods reliably to detect iron in seawater to confirm this hypothesis. In 1989, high concentrations of iron-rich sediments in nearshore coastal waters off the Gulf of Alaska were detected. However, offshore waters had lower iron concentrations and lower productivity despite macronutrient availability for phytoplankton growth. This pattern was observed in other oceanic regions and led to the naming of three major HNLC zones: the North Pacific Ocean, the Equatorial Pacific Ocean, and the Southern Ocean. The discovery of HNLC regions has fostered scientific debate about the ethics and efficacy of iron fertilization experiments which attempt to draw down atmospheric carbon dioxide by stimulating surface-level photosynthesis. It has also led to the development of hypotheses such as grazing control which poses that HNLC regions are formed, in part, from the grazing
https://en.wikipedia.org/wiki/Glycoside%20hydrolase%20family%2014
In molecular biology, Glycoside hydrolase family 14 is a family of glycoside hydrolases. Glycoside hydrolases are a widespread group of enzymes that hydrolyse the glycosidic bond between two or more carbohydrates, or between a carbohydrate and a non-carbohydrate moiety. A classification system for glycoside hydrolases, based on sequence similarity, has led to the definition of >100 different families. This classification is available on the CAZy web site, and also discussed at CAZypedia, an online encyclopedia of carbohydrate active enzymes. Glycoside hydrolase family 14 CAZY GH_14 comprises enzymes with only one known activity; beta-amylase (). A Glu residue has been proposed as a catalytic residue, but it is not known if it is the nucleophile or the proton donor. Beta-amylase is an enzyme that hydrolyzes 1,4-alpha-glucosidic linkages in starch-type polysaccharide substrates so as to remove successive maltose units from the non-reducing ends of the chains. Beta-amylase is present in certain bacteria as well as in plants. Three highly conserved sequence regions are found in all known beta-amylases. The first of these regions is located in the N-terminal section of the enzymes and contains an aspartate which is known to be involved in the catalytic mechanism. The second, located in a more central location, is centred on a glutamate which is also involved in the catalytic mechanism. The 3D structure of a complex of soybean beta-amylase with an inhibitor (alpha-cyclodextrin) has been determined to 3.0A resolution by X-ray diffraction. The enzyme folds into large and small domains: the large domain has a (beta alpha)8 super-secondary structural core, while the smaller is formed from two long loops extending from the beta-3 and beta-4 strands of the (beta alpha)8 fold. The interface of the two domains, together with shorter loops from the (beta alpha)8 core, form a deep cleft, in which the inhibitor binds. Two maltose molecules also bind in the cleft, one sharing a
https://en.wikipedia.org/wiki/M4V
The M4V file format is a video container format developed by Apple and is very similar to the MP4 format. The primary difference is that M4V files may optionally be protected by DRM copy protection. Apple uses M4V to encode video files in its iTunes Store. Unauthorized reproduction of M4V files may be prevented using Apple's FairPlay copy protection. A FairPlay-protected M4V file can only be played on a computer authorized (using iTunes) with the account that was used to purchase the video. In QuickTime, M4V videos using FairPlay DRM are identified as "AVC0 Media". Besides Apple iTunes and the Apple QuickTime Player, M4V files can also be opened and played with Media Player Classic, K-Multimedia Player, RealPlayer, Zoom Player, VLC media player, MPlayer, DivX Plus Player, and Nero Showtime (included with Nero Multimedia Suite). The format without DRM can also be played in the webOS Video Player for use on the Palm Pre, Palm Pixi smartphones. It is also playable by the Android operating system with its video player. It is used as the default video conversion format for HandBrake and Air Video Server on the Macintosh. Some other video players can also recognize and play M4V files if the file extension is changed from ".m4v" to ".mp4". HandBrake-produced M4V files can also be played on the PlayStation 3, with full Dolby Digital 5.1 surround support. See also Comparison of (audio/video) container formats List of multimedia (audio/video) codecs List of open-source codecs Comparison of video codecs Comparison of audio coding formats
https://en.wikipedia.org/wiki/Computer%20science%20and%20engineering
Computer science and engineering (CSE) is an academic program at many universities which comprises computer science classes (e.g. data structures and algorithms) and computer engineering classes (e.g. computer architecture). There is no clear division in computing between science and engineering, just as in the field of materials science and engineering. CSE is also a term often used in Europe to translate the name of engineering informatics academic programs. It is offered at both the undergraduate and postgraduate levels, with specializations. Academic courses Academic programs vary between colleges, but typically include a combination of topics in computer science, computer engineering, and electrical engineering. Undergraduate courses usually include programming, algorithms and data structures, computer architecture, operating systems, computer networks, parallel computing, embedded systems, algorithm design, circuit analysis and electronics, digital logic and processor design, computer graphics, scientific computing, software engineering, database systems, digital signal processing, virtualization, computer simulations and games programming. CSE programs also include core subjects of theoretical computer science such as theory of computation, numerical methods, machine learning, programming theory and paradigms. Modern academic programs also cover emerging computing fields like image processing, data science, robotics, bio-inspired computing, computational biology, autonomic computing and artificial intelligence. Most CSE programs require introductory mathematical knowledge, hence the first year of study is dominated by mathematical courses, primarily discrete mathematics, mathematical analysis, linear algebra, probability, and statistics, as well as the basics of electrical and electronic engineering, physics, and electromagnetism. Example universities with CSE majors and departments APJ Abdul Kalam Technological University American International University-B
https://en.wikipedia.org/wiki/Mathematical%20Models%20and%20Methods%20in%20Applied%20Sciences
Mathematical Models and Methods in Applied Sciences is a journal founded in 1991 and published by World Scientific. It covers: mathematical modelling of systems in the applied sciences (physics, mathematical physics, natural, and technological sciences); qualitative and quantitative analysis of mathematical physics and technological sciences; and numerical and computer treatment of mathematical models or real systems. Abstracting and indexing The journal is abstracted and indexed in: Science Citation Index ISI Alerting Services CompuMath Citation Index Current Contents/Physical, Chemical & Earth Sciences Mathematical Reviews Inspec Zentralblatt MATH External links Journal Website World Scientific academic journals Mathematics journals Academic journals established in 1991
https://en.wikipedia.org/wiki/Alethic%20modality
Alethic modality (from Greek ἀλήθεια = truth) is a linguistic modality that indicates modalities of truth, in particular the modalities of logical necessity, contingency, possibility and impossibility. Alethic modality is often associated with epistemic modality in research, and it has been questioned whether this modality should be considered distinct from epistemic modality which denotes the speaker's evaluation or judgment of the truth. The criticism states that there is no real difference between "the truth in the world" (alethic) and "the truth in an individual's mind" (epistemic). An investigation has not found a single language in which alethic and epistemic modalities would be formally distinguished, for example by the means of a grammatical mood. In such a language, "A circle can't be square", "can't be" would be expressed by an alethic mood, whereas for "He can't be that wealthy", "can't be" would be expressed by an epistemic mood. As we can see, this is not a distinction drawn in English grammar. "You can't give these plants too much water." is a well-known play on the distinction between perhaps alethic and hortatory or injunctive modalities (it can mean either "it is impossible to give these plants too much water = giving them too much water is harmless" or "you must not give these plants too much water = giving them too much water is harmful"). The dilemma is fairly easily resolved when listening through paralinguistic cues and particularly suprasegmental cues (intonation). So while there may not be a morphologically based alethic mood, this does not seem to preclude the usefulness of distinguishing between these two types of modes. Alethic modality might then concern what are considered to be apodictic statements.
https://en.wikipedia.org/wiki/Torsion%20%28mechanics%29
In the field of solid mechanics, torsion is the twisting of an object due to an applied torque. Torsion is expressed in either the pascal (Pa), an SI unit for newtons per square metre, or in pounds per square inch (psi), while torque is expressed in newton metres (N·m) or foot-pound force (ft·lbf). In sections perpendicular to the torque axis, the resultant shear stress in this section is perpendicular to the radius. In non-circular cross-sections, twisting is accompanied by a distortion called warping, in which transverse sections do not remain plane. For shafts of uniform cross-section unrestrained against warping, the torsion is: T = J_T τ / r = J_T G φ / ℓ, where: T is the applied torque or moment of torsion in N·m. τ (tau) is the maximum shear stress at the outer surface. J_T is the torsion constant for the section. For circular rods, and tubes with constant wall thickness, it is equal to the polar moment of inertia of the section, but for other shapes, or split sections, it can be much less. For more accuracy, finite element analysis (FEA) is the best method. Other calculation methods include membrane analogy and shear flow approximation. r is the perpendicular distance between the rotational axis and the farthest point in the section (at the outer surface). ℓ is the length of the object to or over which the torque is being applied. φ (phi) is the angle of twist in radians. G is the shear modulus, also called the modulus of rigidity, and is usually given in gigapascals (GPa), lbf/in2 (psi), or lbf/ft2 or in ISO units N/mm2. The product J_T G is called the torsional rigidity w_T. Properties The shear stress at a point within a shaft is τ = T r / J_T, where r here is the distance from the axis of rotation to the point considered. Note that the highest shear stress occurs on the surface of the shaft, where the radius is maximum. High stresses at the surface may be compounded by stress concentrations such as rough spots. Thus, shafts for use in high torsion are polished to a fine surface finish to reduce the maximum stress in the shaft and increase their service life. The angle o
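A brief worked example of the circular-shaft relations above; the shaft dimensions, shear modulus, and applied torque are assumed illustrative values, not figures from the article.

```python
import math

# Assumed illustrative solid circular shaft: 25 mm diameter, 1.2 m long,
# steel-like shear modulus G ~ 79 GPa, applied torque 150 N*m.
d = 0.025          # shaft diameter, m
length = 1.2       # shaft length, m
G = 79e9           # shear modulus, Pa
T = 150.0          # applied torque, N*m

J_T = math.pi * d**4 / 32          # torsion constant = polar moment of inertia for a circular section
r = d / 2                          # outer-surface radius, where the shear stress is maximum
tau_max = T * r / J_T              # maximum shear stress, Pa
phi = T * length / (J_T * G)       # angle of twist, radians

print(f"max shear stress: {tau_max / 1e6:.1f} MPa")
print(f"angle of twist:   {math.degrees(phi):.2f} degrees")
```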
https://en.wikipedia.org/wiki/Azagny%20virus
Azagny virus (AZGV) is an Orthohantavirus found in West African pygmy shrews. The virus was named after the Azagny National Park, where some sample collecting occurred.
https://en.wikipedia.org/wiki/Intrinsic%20metric
In the mathematical study of metric spaces, one can consider the arclength of paths in the space. If two points are at a given distance from each other, it is natural to expect that one should be able to get from the first point to the second along a path whose arclength is equal to (or very close to) that distance. The distance between two points of a metric space relative to the intrinsic metric is defined as the infimum of the lengths of all paths from the first point to the second. A metric space is a length metric space if the intrinsic metric agrees with the original metric of the space. If the space has the stronger property that there always exists a path that achieves the infimum of length (a geodesic) then it is called a geodesic metric space or geodesic space. For instance, the Euclidean plane is a geodesic space, with line segments as its geodesics. The Euclidean plane with the origin removed is not geodesic, but is still a length metric space. Definitions Let (M, d) be a metric space, i.e., M is a collection of points (such as all of the points in the plane, or all points on the circle) and d is a function that provides us with the distance between points in M. We define a new metric d_I on M, known as the induced intrinsic metric, as follows: d_I(x, y) is the infimum of the lengths of all paths from x to y. Here, a path from x to y is a continuous map γ : [0, 1] → M with γ(0) = x and γ(1) = y. The length of such a path is defined as explained for rectifiable curves. We set d_I(x, y) = +∞ if there is no path of finite length from x to y (this is consistent with the infimum definition since the infimum of the empty set within the closed interval [0,+∞] is +∞). The mapping d ↦ d_I is idempotent, i.e. (d_I)_I = d_I. If d_I(x, y) = d(x, y) for all points x and y in M, we say that (M, d) is a length space or a path metric space and the metric d is intrinsic. We say that the metric d has approximate midpoints if for any ε > 0 and any pair of points x and y in M there exists c in M such that d(x, c) and d(c, y) are both smaller than d(x, y)/2 + ε. Examples Euclidean space with the ordinary Euclidean metri
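For reference, the induced intrinsic metric and the path length it relies on can be written compactly in the notation introduced above (standard definitions, restated here):

```latex
% Induced intrinsic metric: infimum of path lengths between two points,
% with the length of a path taken as the supremum over partitions of [0,1].
d_I(x,y) \;=\; \inf\bigl\{\, L(\gamma) \;:\; \gamma\colon[0,1]\to M
      \ \text{continuous},\ \gamma(0)=x,\ \gamma(1)=y \,\bigr\},
\qquad
L(\gamma) \;=\; \sup_{0=t_0<t_1<\dots<t_n=1}\ \sum_{i=1}^{n} d\bigl(\gamma(t_{i-1}),\gamma(t_i)\bigr).
```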
https://en.wikipedia.org/wiki/Audio%20crossover
Audio crossovers are a type of electronic filter circuitry that splits an audio signal into two or more frequency ranges, so that the signals can be sent to loudspeaker drivers that are designed to operate within different frequency ranges. The crossover filters can be either active or passive. They are often described as two-way or three-way, which indicate, respectively, that the crossover splits a given signal into two frequency ranges or three frequency ranges. Crossovers are used in loudspeaker cabinets, power amplifiers in consumer electronics (hi-fi, home cinema sound and car audio) and pro audio and musical instrument amplifier products. For the latter two markets, crossovers are used in bass amplifiers, keyboard amplifiers, bass and keyboard speaker enclosures and sound reinforcement system equipment (PA speakers, monitor speakers, subwoofer systems, etc.). Crossovers are used because most individual loudspeaker drivers are incapable of covering the entire audio spectrum from low frequencies to high frequencies with acceptable relative volume and absence of distortion. Most hi-fi speaker systems and sound reinforcement system speaker cabinets use a combination of multiple loudspeaker drivers, each catering to a different frequency band. A standard simple example is in hi-fi and PA system cabinets that contain a woofer for low and mid frequencies and a tweeter for high frequencies. Since a sound signal source, be it recorded music from a CD player or a live band's mix from an audio console, has all of the low, mid and high frequencies combined, a crossover circuit is used to split the audio signal into separate frequency bands that can be separately routed to loudspeakers, tweeters or horns optimized for those frequency bands. Passive crossovers are probably the most common type of audio crossover. They use a network of passive electrical components (e.g., capacitors, inductors and resistors) to split up an amplified signal coming from one power amplifie
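To illustrate the band-splitting idea in a digital setting, here is a minimal sketch of a two-way crossover built from a Butterworth low-pass/high-pass pair; it assumes NumPy and SciPy are available, and the sample rate, filter order, and 2 kHz crossover frequency are arbitrary illustrative choices (a real passive crossover would use capacitors, inductors and resistors instead).

```python
import numpy as np
from scipy import signal

fs = 48_000          # sample rate, Hz (assumed)
f_crossover = 2_000  # crossover frequency, Hz (assumed)
order = 4            # filter order (assumed)

# Design the two filter sections as second-order sections for numerical stability.
sos_low = signal.butter(order, f_crossover, btype="lowpass", fs=fs, output="sos")
sos_high = signal.butter(order, f_crossover, btype="highpass", fs=fs, output="sos")

# Full-range test signal: a 100 Hz tone (woofer range) mixed with an 8 kHz tone (tweeter range).
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 8_000 * t)

to_woofer = signal.sosfilt(sos_low, x)    # low band, routed to the woofer
to_tweeter = signal.sosfilt(sos_high, x)  # high band, routed to the tweeter

print("RMS of woofer band:", round(float(np.sqrt(np.mean(to_woofer**2))), 3))
print("RMS of tweeter band:", round(float(np.sqrt(np.mean(to_tweeter**2))), 3))
```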
https://en.wikipedia.org/wiki/CIML%20community%20portal
The computational intelligence and machine learning (CIML) community portal is an international multi-university initiative. Its primary purpose is to help facilitate a virtual scientific community infrastructure for all those involved with, or interested in, computational intelligence and machine learning. This includes CIML research-, education, and application-oriented resources residing at the portal and others that are linked from the CIML site. Overview The CIML community portal was created to facilitate an online virtual scientific community wherein anyone interested in CIML can share research, obtain resources, or simply learn more. The effort is currently led by Jacek Zurada (principal investigator), with Rammohan Ragade and Janusz Wojtusiak, aided by a team of 25 volunteer researchers from 13 different countries. The ultimate goal of the CIML community portal is to accommodate and cater to a broad range of users, including experts, students, the public, and outside researchers interested in using CIML methods and software tools. Each community member and user will be guided through the portal resources and tools based on their respective CIML experience (e.g. expert, student, outside researcher) and goals (e.g. collaboration, education). A preliminary version of the community's portal, with limited capabilities, is now operational and available for users. All electronic resources on the portal are peer-reviewed to ensure high quality and cite-ability for literature. Further reading Jacek M. Zurada, Janusz Wojtusiak, Fahmida Chowdhury, James E. Gentle, Cedric J. Jeannot, and Maciej A. Mazurowski, Computational Intelligence Virtual Community: Framework and Implementation Issues, Proceedings of the IEEE World Congress on Computational Intelligence, Hong Kong, June 1–6, 2008. Jacek M. Zurada, Janusz Wojtusiak, Maciej A. Mazurowski, Devendra Mehta, Khalid Moidu, Steve Margolis, Toward Multidisciplinary Collaboration in the CIML Virtual Community, Proc
https://en.wikipedia.org/wiki/SARS%20conspiracy%20theory
The SARS conspiracy theory began to emerge during the severe acute respiratory syndrome (SARS) outbreak in China in the spring of 2003, when Sergei Kolesnikov, a Russian scientist and a member of the Russian Academy of Medical Sciences, first publicized his claim that the SARS coronavirus is a synthesis of measles and mumps. According to Kolesnikov, this combination cannot be formed in the natural world and thus the SARS virus must have been produced under laboratory conditions. Another Russian scientist, Nikolai Filatov, head of Moscow's epidemiological services, had earlier commented that the SARS virus was probably man-made. However, independent labs concluded these claims to be premature since the SARS virus is a coronavirus, whereas measles and mumps are paramyxoviruses. The primary differences between a coronavirus and a paramyxovirus are in their structures and method of infection, thus making it implausible for a coronavirus to have been created from two paramyxoviruses. The widespread reporting of claims by Kolesnikov and Filatov caused controversy on many Chinese internet discussion boards and chat rooms. Many Chinese believed that the SARS virus could be a biological weapon manufactured by the United States, which perceived China as a potential threat. The failure to find the source of the SARS virus further convinced these people and many more that SARS was artificially synthesised and spread by some individuals and even governments. Circumstantial evidence suggests that the SARS virus crossed over to humans from Asian palm civets ("civet cats"), a type of animal that is often killed and eaten in Guangdong, where SARS was first discovered. Supporters of the conspiracy theory suggest that SARS caused the most serious harm in mainland China, Hong Kong, Taiwan and Singapore, regions where most Chinese reside, while the United States, Europe and Japan were not affected as much. However, the highest mortality from SARS outside of China occurred in Canada w
https://en.wikipedia.org/wiki/Synthetic-aperture%20magnetometry
Synthetic-aperture magnetometry (SAM) is a method for analysis of data obtained from magnetoencephalography (MEG) and electroencephalography (EEG). SAM is a nonlinear beamforming approach which can be thought of as a spatial filter. See also Aperture synthesis
https://en.wikipedia.org/wiki/Phagocyte
Phagocytes are cells that protect the body by ingesting harmful foreign particles, bacteria, and dead or dying cells. Their name comes from the Greek phagein, "to eat" or "devour", and "-cyte", the suffix in biology denoting "cell", from the Greek kutos, "hollow vessel". They are essential for fighting infections and for subsequent immunity. Phagocytes are important throughout the animal kingdom and are highly developed within vertebrates. One litre of human blood contains about six billion phagocytes. They were discovered in 1882 by Ilya Ilyich Mechnikov while he was studying starfish larvae. Mechnikov was awarded the 1908 Nobel Prize in Physiology or Medicine for his discovery. Phagocytes occur in many species; some amoebae behave like macrophage phagocytes, which suggests that phagocytes appeared early in the evolution of life. Phagocytes of humans and other animals are called "professional" or "non-professional" depending on how effective they are at phagocytosis. The professional phagocytes include many types of white blood cells (such as neutrophils, monocytes, macrophages, mast cells, and dendritic cells). The main difference between professional and non-professional phagocytes is that the professional phagocytes have molecules called receptors on their surfaces that can detect harmful objects, such as bacteria, that are not normally found in the body. Non-professional phagocytes do not have efficient phagocytic receptors, such as those for opsonins. Phagocytes are crucial in fighting infections, as well as in maintaining healthy tissues by removing dead and dying cells that have reached the end of their lifespan. During an infection, chemical signals attract phagocytes to places where the pathogen has invaded the body. These chemicals may come from bacteria or from other phagocytes already present. The phagocytes move by a method called chemotaxis. When phagocytes come into contact with bacteria, the receptors on the phagocyte's surface will bind to them. This b
https://en.wikipedia.org/wiki/Angle-resolved%20photoemission%20spectroscopy
Angle-resolved photoemission spectroscopy (ARPES) is an experimental technique used in condensed matter physics to probe the allowed energies and momenta of the electrons in a material, usually a crystalline solid. It is based on the photoelectric effect, in which an incoming photon of sufficient energy ejects an electron from the surface of a material. By directly measuring the kinetic energy and emission angle distributions of the emitted photoelectrons, the technique can map the electronic band structure and Fermi surfaces. ARPES is best suited for the study of one- or two-dimensional materials. It has been used by physicists to investigate high-temperature superconductors, graphene, topological materials, quantum well states, and materials exhibiting charge density waves. ARPES systems consist of a monochromatic light source to deliver a narrow beam of photons, a sample holder connected to a manipulator used to position the sample of a material, and an electron spectrometer. The equipment is contained within an ultra-high vacuum (UHV) environment, which protects the sample and prevents scattering of the emitted electrons. After being dispersed along two perpendicular directions with respect to kinetic energy and emission angle, the electrons are directed to a detector and counted to provide ARPES spectra—slices of the band structure along one momentum direction. Some ARPES instruments can extract a portion of the electrons alongside the detector to measure the polarization of their spin. Principle Electrons in crystalline solids can only populate states of certain energies and momenta, others being forbidden by quantum mechanics. They form a continuum of states known as the band structure of the solid. The band structure determines if a material is an insulator, a semiconductor, or a metal, how it conducts electricity and in which directions it conducts best, or how it behaves in a magnetic field. Angle-resolved photoemission spectroscopy determines the band
https://en.wikipedia.org/wiki/Comparative%20physiology
Comparative physiology is a subdiscipline of physiology that studies and exploits the diversity of functional characteristics of various kinds of organisms. It is closely related to evolutionary physiology and environmental physiology. Many universities offer undergraduate courses that cover comparative aspects of animal physiology. According to Clifford Ladd Prosser, "Comparative Physiology is not so much a defined discipline as a viewpoint, a philosophy." History Originally, as narrated in a recent history of the field, physiology focused primarily on human beings, in large part from a desire to improve medical practices. When physiologists first began comparing different species it was sometimes out of simple curiosity to understand how organisms work but also stemmed from a desire to discover basic physiological principles. This use of specific organisms convenient to study specific questions is known as the Krogh Principle. Methodology C. Ladd Prosser, a founder of modern comparative physiology, outlined a broad agenda for comparative physiology in his 1950 edited volume (see summary and discussion in Garland and Carter): 1. To describe how different kinds of animals meet their needs. This amounts to cataloging functional aspects of biological diversity, and has recently been criticized as "stamp collecting" with the suggestion that the field should move beyond that initial, exploratory phase. 2. The use of physiological information to reconstruct phylogenetic relationships of organisms. In principle physiological information could be used just as morphological information or DNA sequence is used to measure evolutionary divergence of organisms. In practice, this has rarely been done, for at least four reasons: physiology doesn't leave many fossil cues, it can't be measured on museum specimens, it is difficult to quantify as compared with morphology or DNA sequences, and physiology is more likely to be adaptive than DNA, and so subject to parallel
https://en.wikipedia.org/wiki/Mir-873%20microRNA%20precursor%20family
In molecular biology, mir-873 microRNA is a short RNA molecule. MicroRNAs function to regulate the expression levels of other genes by several mechanisms. Dysmasculinisation in males Significant miR-873 reductions have been observed in the brains of dysmasculinised second-generation males following the introduction of prenatal stress factors. Sexual dimorphisms in the brain are altered and there is female-like expression of key neurodevelopment genes. The common target gene for miR-873, β-glycan, has increased expression. It is thought that miR-873, along with other miRNAs, may play a role in organisation of the sexually dimorphic brain due to a marked response to organisational testosterone. Chromatin remodelling miR-873 is associated with translational regulation of factors involved in chromatin remodelling, and is downregulated by the human T-cell leukaemia virus type 1 (HTLV-1) transactivating protein, Tax. Specifically, it has been found to directly target the chromatin remodelling factors p300 and p/CAF. See also MicroRNA
https://en.wikipedia.org/wiki/Proper%20zero-signal%20collector%20current
Consider an NPN transistor circuit. During the positive half-cycle of the signal, the base is positive with respect to the emitter and hence the base-emitter junction is forward biased. This causes a base current and a much larger collector current to flow. The positive half-cycle of the signal is amplified in the collector. During the negative half-cycle, the base-emitter junction is reverse biased and hence no current flows. No output flows during the negative half-cycle of the signal. Thus the output, which contains only the amplified positive half-cycles, is not a faithful reproduction of the signal. To avoid this, a sufficient battery (bias) source in the base circuit keeps the input circuit forward biased even during the peak of the negative half-cycle. When no signal is applied, a DC current I_C will flow in the collector circuit due to the battery. This is known as the zero signal collector current. The value of the zero signal collector current should be at least equal to the maximum collector current due to the AC signal alone.
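A tiny numeric sketch of the sizing rule stated above; all component values are assumed for illustration. It shows that choosing the zero signal collector current equal to the peak AC collector current keeps the total collector current from ever dropping below zero, so neither half-cycle is lost.

```python
# Assumed illustrative values, not taken from the article.
beta = 100                  # transistor current gain (assumed)
i_b_signal_peak = 20e-6     # peak AC base current, A (assumed)
i_c_signal_peak = beta * i_b_signal_peak      # peak AC collector current = 2 mA

# Sizing rule: zero-signal (bias) collector current >= peak AC collector current.
i_c_zero_signal = i_c_signal_peak             # minimum acceptable bias current

# The total collector current then swings between these limits and never goes negative,
# so both half-cycles of the input are amplified.
i_c_min = i_c_zero_signal - i_c_signal_peak   # 0 mA at the negative peak (just avoids cutoff)
i_c_max = i_c_zero_signal + i_c_signal_peak   # 4 mA at the positive peak
print(f"bias {i_c_zero_signal*1e3:.1f} mA, swing {i_c_min*1e3:.1f} mA to {i_c_max*1e3:.1f} mA")
```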
https://en.wikipedia.org/wiki/Real-time%20adaptive%20security
Real-time Adaptive Security is the network security model necessary to accommodate the emergence of multiple perimeters and moving parts on the network, and increasingly advanced threats targeting enterprises. Adaptive security can watch a network for malicious traffic and behavioral anomalies, ferret out end point vulnerabilities, identify real-time changes to systems, automatically enforce end point protections and access rules, block malicious traffic, follow a compliance dashboard while providing audit data, and more. Among the key features of an adaptive security infrastructure are security platforms that share and correlate information rather than point solutions, so that, for example, the heuristics system can communicate its suspicions to the firewall. Other features include finer-grained controls, automation (in addition to human intervention), on-demand security services, security as a service, and integration of security and management data. Rather than adding security to custom applications after they go operational, security models would be created at the design phase of an app. A major change with this model of real-time adaptive security is shifting authorization management and policy to an on-demand service that contains details and policy enforcement that matches compliance and can adapt to the user's situation when he or she is trying to access an application, for instance. Dependence on Machine Learning Adapting to a changing network in real time depends heavily on machine learning, which models the behaviour of users on the network. Adaptive authentication relies on machine learning to build, over time, a baseline of how normal users behave. Recent advances in machine learning offer a brighter prospect for integrating artificial intelligence into real-time adaptation. A unique risk score is computed, which helps define and decide the likelihood of security issues, thereby ensuring escalated
https://en.wikipedia.org/wiki/Hot%20spare
A hot spare or warm spare or hot standby is used as a failover mechanism to provide reliability in system configurations. The hot spare is active and connected as part of a working system. When a key component fails, the hot spare is switched into operation. More generally, a hot standby can be used to refer to any device or system that is held in readiness to overcome an otherwise significant start-up delay. Examples Examples of hot spares are components such as A/V switches, computers, network printers, and hard disks. The equipment is powered on, or considered "hot," but not actively functioning in (i.e. used by) the system. Electrical generators may be held on hot standby, or a steam train may be held at the shed fired up (literally hot) ready to replace a possible failure of an engine in service. Explanation In designing a reliable system, it is recognized that there will be failures. At the extreme, a complete system can be duplicated and kept up to date—so in the event of the primary system failing, the secondary system can be switched in with little or no interruption. More often, a hot spare is a single vital component without which the entire system would fail. The spare component is integrated into the system in such a way that in the event of a problem, the system can be altered to use the spare component. This may be done automatically or manually, but in either case it is normal to have some means of error detection. A hot spare does not necessarily give 100% availability or protect against temporary loss of the system during the switching process; it is designed to significantly reduce the time that the system is unavailable. Hot standby may have a slightly different connotation of being active but not productive to hot spare, that is it is a state rather than object. For example, in a national power grid, the supply of power needs to be balanced to demand over a short term. It can take many hours to bring a coal-fired power station up to produ
https://en.wikipedia.org/wiki/Krull%27s%20separation%20lemma
In abstract algebra, Krull's separation lemma is a lemma in ring theory. It was proved by Wolfgang Krull in 1928. Statement of the lemma Let I be an ideal and let S be a multiplicative system (i.e. S is closed under multiplication) in a ring R, and suppose I ∩ S = ∅. Then there exists a prime ideal P satisfying I ⊆ P and P ∩ S = ∅.
https://en.wikipedia.org/wiki/Chord%20%28geometry%29
A chord (from the Latin chorda, meaning "bowstring") of a circle is a straight line segment whose endpoints both lie on a circular arc. If a chord were to be extended infinitely in both directions into a line, the object is a secant line. The perpendicular line segment from the chord's midpoint to the arc is called the sagitta (Latin for "arrow"). More generally, a chord is a line segment joining two points on any curve, for instance, on an ellipse. A chord that passes through a circle's center point is the circle's diameter. In circles Among properties of chords of a circle are the following: Chords are equidistant from the center if and only if their lengths are equal. Equal chords are subtended by equal angles from the center of the circle. A chord that passes through the center of a circle is called a diameter and is the longest chord of that specific circle. If the line extensions (secant lines) of chords AB and CD intersect at a point P, then their lengths satisfy AP·PB = CP·PD (power of a point theorem). In conics The midpoints of a set of parallel chords of a conic are collinear (midpoint theorem for conics). In trigonometry Chords were used extensively in the early development of trigonometry. The first known trigonometric table, compiled by Hipparchus, tabulated the value of the chord function for every 7.5 degrees. In the second century AD, Ptolemy of Alexandria compiled a more extensive table of chords in his book on astronomy, giving the value of the chord for angles ranging from 1/2 to 180 degrees by increments of 1/2 degree. The circle was of diameter 120, and the chord lengths are accurate to two base-60 digits after the integer part. The chord function is defined geometrically as shown in the picture. The chord of an angle is the length of the chord between two points on a unit circle separated by that central angle. The angle θ is taken in the positive sense and must lie in the interval 0 ≤ θ ≤ π (radian measure). The chord function can be related to the modern s
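The chord of a central angle θ on a circle of radius r is 2r·sin(θ/2); the short sketch below evaluates this relation on a circle of diameter 120, echoing Ptolemy's convention, purely as an illustration rather than a reconstruction of the ancient tables.

```python
import math

def chord(theta_deg, radius=60):
    """Length of the chord subtending a central angle theta (in degrees) on a circle
    of the given radius; Ptolemy's tables used a circle of diameter 120 (radius 60)."""
    return 2 * radius * math.sin(math.radians(theta_deg) / 2)

for theta in (30, 60, 90, 120, 180):
    print(f"crd({theta:3d} deg) = {chord(theta):8.4f}")
# On this circle crd(60) equals the radius (60), and crd(180) equals the diameter (120).
```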
https://en.wikipedia.org/wiki/Tux%2C%20of%20Math%20Command
Tux, of Math Command (TuxMath, for short) is an open source arcade-style video game for learning arithmetic, initially created for Linux. History The first alpha of the game was released by its initial developer, Bill Kendrick, in September 2001, days prior to the September 11, 2001 attacks. It was decided that the imagery of exploding buildings was no longer suitable. Eventually the city imagery was replaced with igloos, to match the arctic theme of Tux, the Linux penguin, who stars in the game. Since version 1.7.0 the game has also included a multiplayer mode and a factor/fraction activity called Factoroids. Gameplay The game-play mechanic is based loosely on that of the arcade game Missile Command, but with comets falling on cities, rather than missiles. Like Missile Command, players attempt to protect their cities, but rather than using a trackball-controlled targeting cross-hair, players solve math problems that label each comet, which causes a laser to destroy it. Features The game has multi-user support (useful for schools), a LAN multiplayer mode, on-screen tutorials and a training mode - over 50 bundled lessons ranging from simple number typing up through all four basic arithmetic operations with negative numbers and "missing number" questions (e.g. "3 x ? = 12"). Being an open source project, multi-platform support for Linux, Windows, Mac OS X, BeOS and others is available. Localization to over thirty (human) languages was created by the game's community. Also included is "Factoroids", a clone of the classic Atari video game "Asteroids", modified into an activity for practising factorization. JavaScript Version Originally written in the C language and based on the SDL library, TuxMath was rewritten in JavaScript in 2022, allowing it to be played from a web browser or from a smartphone. The web version of TuxMath allows playing the main game, in which the player solves operations to shoot comets, but not the "Factoroids" game. It adds an "autolevel" option which adjus
https://en.wikipedia.org/wiki/Total%20absorption%20spectroscopy
Total absorption spectroscopy is a measurement technique that allows the measurement of the gamma radiation emitted in the different nuclear gamma transitions that may take place in the daughter nucleus after its unstable parent has decayed by means of the beta decay process. This technique can be used for beta decay studies related to beta feeding measurements within the full decay energy window for nuclei far from stability. It is implemented with a special type of detector, the "total absorption spectrometer" (TAS), made of a scintillator crystal that almost completely surrounds the activity to be measured, covering a solid angle of approximately 4π. Also, in an ideal case, it should be thick enough to have a peak efficiency close to 100%, in this way its total efficiency is also very close to 100% (this is one of the reasons why it is called "total" absorption spectroscopy). Finally, it should be blind to any other type of radiation. The gamma rays produced in the decay under study are collected by photomultipliers attached to the scintillator material. This technique may solve the problem of the Pandemonium effect. There is a change in philosophy when measuring with a TAS. Instead of detecting the individual gamma rays (as high-resolution detectors do), it will detect the gamma cascades emitted in the decay. Then, the final energy spectrum will not be a collection of different energy peaks coming from the different transitions (as can be expected in the case of a germanium detector), but a collection of peaks situated at an energy that is the sum of the different energies of all the gammas of the cascade emitted from each level. This means that the energy spectrum measured with a TAS will be in reality a spectrum of the levels of the nuclei, where each peak is a level populated in the decay. Since the efficiency of these detectors is close to 100%, it is possible to see the feeding to the high excitation levels that usually can not be seen by high-resolution
https://en.wikipedia.org/wiki/Amkette%20EvoTV
EvoTV is a range of digital media players developed by Amkette that brings internet and Web 2.0 features to the television. EvoTV was envisioned as a substitute for the growing range of Smart TVs in the market. Instead of replacing complete television sets to get Internet-based features, users could connect an EvoTV and get the same features at a much lower cost. EvoTV is based on the Android operating system, can connect to the internet wirelessly, and can stream local, network and internet media and audio files. Once connected, users can access Android Play Store apps through EvoTV on their television. The first EvoTV was launched in the middle of 2012 and received many positive reviews and awards. In 2013 Amkette EvoTV won the NDTV Gadget Award for the Best Consumer Electronic. In 2014 two new versions of EvoTV were introduced, bringing the total variants to five. As of 2022 the Web site evotv.amkette.com was dead, and the search term "EvoTV" found no matches on the Amkette Web site. 2014 models With the continuing demand for EvoTV, two new models were introduced in May 2014 - EvoTV Media Center and Android Central. These were focused on online and offline media consumption. Features EvoView - is the user interface, which allows customization of the home screen for applications, videos, and weblinks. EvoRemote - is a feature-packed remote control which has gathered much praise as an intuitive and advanced way to interact with EvoTV and access all the features available on it. It comes with a touch-sensitive button to control the movement of the EvoTV pointer. MyYTViewer - is an application that allows creation of live groups of YouTube videos based on various criteria such as channels, category, terms and more. Motion Gaming - allows users to play games with the motion of the remote control. EvoTV XL In December 2012 EvoTV XL, a next-generation EvoTV device, was launched and received many positive reviews and awards. It supports better
https://en.wikipedia.org/wiki/Vincula%20tendina
Within each osseo-aponeurotic canal, the tendons of the flexor digitorum superficialis and flexor digitorum profundus are connected to each other, and to the phalanges, by slender, tendinous bands, called vincula tendina. Structure There are around three to seven vincula for each flexor tendon. Vincula tendina can be classified into two types according to their morphology. The vincula brevia (short), which are two in number in each finger, and consist of triangular bands of fibers, one connecting the tendon of the flexor digitorum superficialis to the front of the first interphalangeal joint and head of the first phalanx, and the other the tendon of the flexor digitorum profundus to the front of the second interphalangeal joint and head of the second phalanx. The vincula longa (long and slender), one which connects the flexor digitorum superficialis to the base of the first phalanx, and the other which connects the under surfaces of the tendons of the flexor digitorum profundus to those of the subjacent flexor digitorum superficialis after the tendons of the former have passed through the latter. Function The vincula tendina carry blood supply to the flexor digitorum superficialis and profundus tendons. The vincula breve helps facilitate digital flexion following injury to the distal flexor digitorum profundus tendon. Additional images
https://en.wikipedia.org/wiki/Sequence%20assembly
In bioinformatics, sequence assembly refers to aligning and merging fragments from a longer DNA sequence in order to reconstruct the original sequence. This is needed as DNA sequencing technology might not be able to 'read' whole genomes in one go, but rather reads small pieces of between 20 and 30,000 bases, depending on the technology used. Typically, the short fragments (reads) result from shotgun sequencing genomic DNA, or gene transcripts (ESTs). The problem of sequence assembly can be compared to taking many copies of a book, passing each of them through a shredder with a different cutter, and piecing the text of the book back together just by looking at the shredded pieces. Besides the obvious difficulty of this task, there are some extra practical issues: the original may have many repeated paragraphs, and some shreds may be modified during shredding to have typos. Excerpts from another book may also be added in, and some shreds may be completely unrecognizable. Genome assemblers The first sequence assemblers began to appear in the late 1980s and early 1990s as variants of simpler sequence alignment programs to piece together vast quantities of fragments generated by automated sequencing instruments called DNA sequencers. As the sequenced organisms grew in size and complexity (from small viruses through plasmids to bacteria and finally eukaryotes), the assembly programs used in these genome projects needed increasingly sophisticated strategies to handle: terabytes of sequencing data which need processing on computing clusters; identical and nearly identical sequences (known as repeats) which can, in the worst case, increase the time and space complexity of algorithms quadratically; DNA read errors in the fragments from the sequencing instruments, which can confound assembly. Faced with the challenge of assembling the first larger eukaryotic genomes—the fruit fly Drosophila melanogaster in 2000 and the human genome just a year later—scientists developed
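A minimal sketch of the basic merging step behind simple overlap-based assemblers (added as an illustration; the reads and the minimum-overlap threshold below are made up, and real assemblers use far more sophisticated data structures):

/* Greedy merge of two reads by their longest suffix-prefix overlap. */
#include <stdio.h>
#include <string.h>

/* Length of the longest suffix of a that equals a prefix of b. */
static size_t overlap(const char *a, const char *b) {
    size_t la = strlen(a), lb = strlen(b);
    size_t max = la < lb ? la : lb;
    for (size_t k = max; k > 0; --k)
        if (strncmp(a + la - k, b, k) == 0)
            return k;
    return 0;
}

int main(void) {
    const char *r1 = "AGCCTAGGT";   /* hypothetical read 1 */
    const char *r2 = "TAGGTCCAT";   /* hypothetical read 2 */
    size_t k = overlap(r1, r2);
    if (k >= 3) {                   /* arbitrary minimum-overlap threshold */
        char merged[64];
        snprintf(merged, sizeof merged, "%s%s", r1, r2 + k);
        printf("overlap=%zu merged=%s\n", k, merged);  /* AGCCTAGGTCCAT */
    }
    return 0;
}

Repeating this step over all read pairs, always merging the pair with the largest overlap first, gives the classic greedy assembly strategy; repeats and read errors, as described above, are exactly what breaks this naive approach on real genomes.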
https://en.wikipedia.org/wiki/Autographa%20californica%20multiple%20nucleopolyhedrovirus
Autographa californica multiple nucleopolyhedrovirus (AcMNPV) is a nucleopolyhedrovirus belonging to the family Baculoviridae. It has a double-stranded DNA genome that is 133,894 base pairs in length with 155 ORFs. The virus forms occluded bodies called polyhedra, each containing multiple virions. AcMNPV has been shown to infect more than thirty lepidopteran hosts from 10 families. See also Autographa californica
https://en.wikipedia.org/wiki/CSS%20framework
A CSS framework is a library allowing for easier, more standards-compliant web design using the Cascading Style Sheets language. Most of these frameworks contain at least a grid. More functional frameworks also come with more features and additional JavaScript based functions, but are mostly design oriented and focused around interactive UI patterns. This detail differentiates CSS frameworks from other JavaScript frameworks. Two notable and widely used examples are Bootstrap and Foundation. CSS frameworks offer different modules and tools: reset style sheet grid especially for responsive web design web typography set of icons in sprites or icon fonts styling for tooltips, buttons, elements of forms parts of graphical user interfaces like accordion, tabs, slideshow or modal windows (Lightbox) equalizer to create equal height content often used CSS helper classes (left, hide) Bigger frameworks use a CSS interpreter like Less or Sass. List of notable CSS frameworks See also Comparison of layout engines (Cascading Style Sheets)
https://en.wikipedia.org/wiki/RealVNC
RealVNC is a company that provides remote access software. Their VNC Connect software consists of a server (VNC Server) and client (VNC Viewer) application, which exchange data over the RFB protocol to allow the Viewer to control the Server's screen remotely. The application is used, for example, by IT support engineers to provide helpdesk services to remote users. History Andy Harter and other members of the original VNC team at AT&T founded RealVNC Limited in 2002. The automotive division of RealVNC spun out as a separate company (VNC Automotive) in 2018. Platforms, editions, versions For a desktop-to-desktop connection RealVNC runs on Windows, macOS, and many Unix-like operating systems. A list of supported platforms can be found on the website. A RealVNC client also runs on the Java platform and on the Apple iPhone, iPod touch and iPad and Google Android devices. A Windows-only client, VNC Viewer Plus was launched in 2010, designed to interface to the embedded server on Intel AMT chipsets found on Intel vPro motherboards. RealVNC removed VNC Viewer Plus from sale on 28th February 2021. For remote access to view one computer desktop on another, RealVNC requires one of three subscriptions: Home – free registration and activation required Professional – commercial version geared towards home or small-business users, with authentication and encryption, remote printing, chat and file transfer Enterprise – commercial version geared towards enterprises, with enhanced authentication and encryption, remote printing, chat, file transfer, and command-line deployment As of release 4.3 (released August 2007), separate versions of both the Personal and Enterprise editions exist for 32-bit and 64-bit systems. Release 4.6 included features such as HTTP proxy support, chat, an address book, remote printing, unicode support, and connection notification. Users must activate each of the server versions ("Home", "Professional", "Enterprise"). With the release of VNC 5.0 l
https://en.wikipedia.org/wiki/Meningeal%20branches%20of%20vertebral%20artery
The meningeal branch of the vertebral artery (posterior meningeal branch) springs from the vertebral artery opposite the foramen magnum, ramifies between the bone and dura mater in the cerebellar fossa, and supplies the falx cerebelli. It is frequently represented by one or two small branches.
https://en.wikipedia.org/wiki/Structuralism%20%28biology%29
Biological or process structuralism is a school of biological thought that objects to an exclusively Darwinian or adaptationist explanation of natural selection such as is described in the 20th century's modern synthesis. It proposes instead that evolution is guided differently, basically by more or less physical forces which shape the development of an animal's body, and sometimes implies that these forces supersede selection altogether. Structuralists have proposed different mechanisms that might have guided the formation of body plans. Before Darwin, Étienne Geoffroy Saint-Hilaire argued that animals shared homologous parts, and that if one was enlarged, the others would be reduced in compensation. After Darwin, D'Arcy Thompson hinted at vitalism and offered geometric explanations in his classic 1917 book On Growth and Form. Adolf Seilacher suggested mechanical inflation for "pneu" structures in Ediacaran biota fossils such as Dickinsonia. Günter P. Wagner argued for developmental bias, structural constraints on embryonic development. Stuart Kauffman favoured self-organisation, the idea that complex structure emerges holistically and spontaneously from the dynamic interaction of all parts of an organism. Michael Denton argued for laws of form by which Platonic universals or "Types" are self-organised. Stephen J. Gould and Richard Lewontin proposed biological "spandrels", features created as a byproduct of the adaptation of nearby structures. Gerd B. Müller and Stuart A. Newman argued that the appearance in the fossil record of most of the current phyla in the Cambrian explosion was "pre-Mendelian" evolution caused by physical factors. Brian Goodwin, described by Wagner as part of "a fringe movement in evolutionary biology", denies that biological complexity can be reduced to natural selection, and argues that pattern formation is driven by morphogenetic fields. Darwinian biologists have criticised structuralism, emphasising that there is plentiful evidence bot
https://en.wikipedia.org/wiki/Test%20point
A test point is a location within an electronic circuit that is used to either monitor the state of the circuitry or to inject test signals. Test points have two primary uses: During manufacturing they are used to verify that a newly assembled device is working correctly. Any equipment that fails this testing is either discarded or sent to a rework station to attempt to repair the manufacturing defects. After sale of the device to a customer, test points may be used at a later time to repair the device if it malfunctions, or if the device needs to be re-calibrated after having components replaced. Test points can be labeled and may include pins for attachment of alligator clips or may have complete connectors for test clips. Modern miniature surface-mount electronics often simply have a row of unlabelled, tinned solder pads. The device is placed into a test fixture that holds the device securely, and a special surface-contact connector plate is pressed down onto the solder pads to connect them all as a group. Electronics manufacturing
https://en.wikipedia.org/wiki/IEEE%201355
IEEE Standard 1355-1995, IEC 14575, or ISO 14575 is a data communications standard for Heterogeneous Interconnect (HIC). IEC 14575 is a low-cost, low latency, scalable serial interconnection system, originally intended for communication between large numbers of inexpensive computers. IEC 14575 lacks many of the complexities of other data networks. The standard defined several different types of transmission media (including wires and optic fiber), to address different applications. Since the high-level network logic is compatible, inexpensive electronic adapters are possible. IEEE 1355 is often used in scientific laboratories. Promoters include large laboratories, such as CERN, and scientific agencies. For example, the ESA advocates a derivative standard called SpaceWire. Goals The protocol was designed for a simple, low cost switched network made of point-to-point links. This network sends variable length data packets reliably at high speed. It routes the packets using wormhole routing. Unlike Token Ring or other types of local area networks (LANs) with comparable specifications, IEEE 1355 scales beyond a thousand nodes without requiring higher transmission speeds. The network is designed to carry traffic from other types of networks, notably Internet Protocol and Asynchronous Transfer Mode (ATM), but does not depend on other protocols for data transfers or switching. In this, it resembles Multiprotocol Label Switching (MPLS). IEEE 1355 had goals like Futurebus and its derivatives Scalable Coherent Interface (SCI), and InfiniBand. The packet routing system of IEEE 1355 is also similar to VPLS, and uses a packet labeling scheme similar to MPLS. IEEE 1355 achieves its design goals with relatively simple digital electronics and very little software. This simplicity is valued by many engineers and scientists. Paul Walker (see links ) said that when implemented in an FPGA, the standard takes about a third the hardware resources of a UART (a standard seria
https://en.wikipedia.org/wiki/The%20Algebra%20of%20Infinite%20Justice
The Algebra of Infinite Justice (2001) is a collection of essays written by Booker Prize winner Arundhati Roy. The book discusses a wide range of issues including political euphoria in India over its successful nuclear bomb tests, the effect of public works projects on the environment, the influence of foreign multinational companies on policy in poorer countries, and the "war on terror". Some of the essays in the collection were republished later, along with later writing, in her book My Seditious Heart. The official introduction to the book by Penguin India states : A few weeks after India detonated a thermonuclear device in 1998, Arundhati Roy wrote ‘The End of Imagination’. The essay attracted worldwide attention as the voice of a brilliant Indian writer speaking out with clarity and conscience against nuclear weapons. Over the next three and a half years, she wrote a series of political essays on a diverse range of momentous subjects: from the illusory benefits of big dams, to the downside of corporate globalization and the US Government’s war against terror. Essays The end of Imagination This is the name of the first essay in the 2001 book. It was later used as the title of a comprehensive collection of Roy's essays in 2016 The greater common good Essay concerning the controversial Sardar Sarovar Dam project in India's Narmada Valley. Power politics This essay examines Indian dam construction and challenges the idea that only "experts" can influence economic policy. It explores the human costs of the privatization of India’s power supply and the construction of monumental dams in India. This is the second essay in the original 2001 book. There is also a 2002 book of Roy's essays with this title Power Politics. The ladies have feelings so... The Algebra of Infinite Justice War is peace The world doesn't have to choose between the Taliban and the US government. All the beauty of the world—literature, music, art—lies between these two fundamentalis
https://en.wikipedia.org/wiki/Exterior%20algebra
In mathematics, the exterior algebra of a vector space V is a graded associative algebra ∧V. Elements in ∧nV are called n-multivectors, and are given by a sum of n-blades ("products" of n elements of V); it is an abstraction of oriented lengths, areas, volumes and more generally oriented n-volumes for n ≥ 0. The algebra product ∧ on ∧V is called the exterior product or wedge product; informally, it acts by taking the product of oriented volumes. Every element in ∧V is a sum of terms of the form v1∧ ... ∧ vk, where k ≥ 0 and v1, ..., vk ∈ V. Such a term is called a k-blade, and it corresponds geometrically to the parallelotope spanned by its factors. This product satisfies the alternating property v∧v = 0 for all v ∈ V. This implies that it is antisymmetric, u∧v = −v∧u, and more generally any blade flips sign whenever two of its constituent vectors are exchanged, corresponding to a parallelotope of opposite orientation. The exterior algebra is also called the Grassmann algebra, after Hermann Grassmann. Formal definitions There are several equivalent ways to define the exterior algebra of a vector space V. Definition as a quotient of the tensor algebra Let V be a vector space over a field. Recall the tensor algebra TV. As a vector space it is spanned by symbols v1⊗ ... ⊗vk for k ≥ 0 and vi ∈ V, and the only relations are those specifying that these objects are linear in each variable vi. The exterior algebra ∧V is the quotient of TV by the two-sided ideal generated by all elements of the form v⊗v with v ∈ V; the image of ⊗ under this quotient is written ∧, and ∧V inherits the structure of a graded algebra from TV. An element of ∧V may be written (non-uniquely) as a finite sum v1∧ ... ∧ vk1+ w1∧ ... ∧ wk2 + ... + z1∧ ... ∧ zkn where each letter is an element of V, and ki, n ≥ 0. It is called a k-vector if all ki = k, in which case its rank is the minimum n for which it can be written in the above form. Rank one k-vectors v1∧ ... ∧ vk are also called k-blades. Definition in terms of explicit basis Let e1, ..., en be a basis of V. Motivating examples The first two examples assume a metric tensor field and an orientation; the third example does not assume either. Areas in t
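A small worked computation (added as an illustration of the bilinearity and alternating property just described) in a space with basis vectors e1, e2:

(e_1 + e_2)\wedge(e_1 + 2e_2) = e_1\wedge e_1 + 2\,e_1\wedge e_2 + e_2\wedge e_1 + 2\,e_2\wedge e_2 = 2\,e_1\wedge e_2 - e_1\wedge e_2 = e_1\wedge e_2 .

The coefficient 1 in front of e_1∧e_2 is the determinant of the 2×2 matrix of coordinates of the two vectors, i.e. the signed area of the parallelogram they span, which is exactly the "oriented area" interpretation of a 2-blade.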
https://en.wikipedia.org/wiki/WDMA%20%28computer%29
The Word DMA (WDMA) interface was the fastest method used to transfer data between the computer (through the Advanced Technology Attachment (ATA) controller) and an ATA device until Ultra Direct Memory Access (UDMA) was implemented. Single/Multiword DMA took over from Programmed input/output (PIO) as the choice of interface between ATA devices and the computer. The WDMA interface is grouped into different modes. In single transfer mode, only one word (16-bit) will be transferred between the device and the computer before returning control to the CPU, and later it will repeat this cycle, allowing the CPU to process data while data is transferred. In multiword transfer mode (block mode), once a transfer has begun it will continue until all words are transferred. Two additional Advanced Timing modes have been defined in the CompactFlash specification 2.1. Those are Multiword DMA mode 3 and Multiword DMA mode 4. They are specific to CompactFlash. Multiword DMA is only permitted for CompactFlash devices configured in True IDE mode. AT Attachment the category
https://en.wikipedia.org/wiki/Carla%20Cotwright-Williams
Carla Denise Cotwright-Williams (born November 6) is an American mathematician who works as a Technical Director and Data Scientist for the United States Department of Defense. She was the second African-American woman to earn a doctorate in mathematics at the University of Mississippi. Early life and education She is the daughter of a police officer and grew up in South Central Los Angeles, moving to a better neighborhood in Los Angeles as a teenager. She went to Westchester High School and there attended summer enrichment programs for underrepresented students that included courses at the University of California, Los Angeles, and a field trip to see the Space Shuttle at NASA's Armstrong Flight Research Center on Edwards Air Force Base. She graduated in 1991. As an undergraduate at California State University, Long Beach, Cotwright-Williams started in engineering. Then, as a math major, she struggled initially and earned low enough grades to be academically disqualified from the university, but worked hard to return as a student in good standing, eventually earning a bachelor's degree in mathematics in 2000. She then earned a master's degree in mathematics from Southern University in Baton Rouge, Louisiana, in 2002. Initially intending to follow a science & math Ph.D. track, she was persuaded to shift to pure mathematics under the mentorship of an African-American professor, Stella R. Ashford, who became the supervisor for her master's thesis in number theory, Unique Factorization in Bi-Quadratic Number Fields. She went on to doctoral studies at the University of Mississippi, where she became president of the Graduate Student Council and earned a second master's degree along the way in 2004. She completed her Ph.D. at the University of Mississippi in 2006. Her dissertation was supervised by T. James Reid and concerned matroid theory. She was the second African-American woman to earn a doctorate in mathematics at the university, and was part of a group of
https://en.wikipedia.org/wiki/Pkg-config
pkg-config is a computer program that defines and supports a unified interface for querying installed libraries for the purpose of compiling software that depends on them. It allows programmers and installation scripts to work without explicit knowledge of detailed library path information. pkg-config was originally designed for Linux, but it is now also available for BSD, Microsoft Windows, macOS, and Solaris. It outputs various information about installed libraries. This information may include: Parameters (flags) for C or C++ compiler Parameters (flags) for linker Version of the package in question The first implementation was written in shell. Later, it was rewritten in C using the GLib library. Synopsis When a library is installed (automatically through the use of an RPM, deb, or other binary packaging system or by compiling from the source), a .pc file should be included and placed into a directory with other .pc files (the exact directory is dependent upon the system and outlined in the pkg-config man page). This file has several entries. These entries typically contain a list of dependent libraries that programs using the package also need to compile. Entries also typically include the location of header files, version information and a description. Here is an example .pc file for libpng: prefix=/usr/local exec_prefix=${prefix} libdir=${exec_prefix}/lib includedir=${exec_prefix}/include Name: libpng Description: Loads and saves PNG files Version: 1.2.8 Libs: -L${libdir} -lpng12 -lz Cflags: -I${includedir}/libpng12 This file demonstrates how libpng informs that its libraries can be found in /usr/local/lib and its headers in /usr/local/include, that the library name is libpng, and that the version is 1.2.8. It also gives the additional linker flags that are needed to compile code that uses this library. Here is an example of usage of pkg-config while compiling: $ gcc -o test test.c $(pkg-config --libs --cflags libpng) pkg-config can be used by bu
https://en.wikipedia.org/wiki/Raikov%27s%20theorem
Raikov's theorem, named for Russian mathematician Dmitrii Abramovich Raikov, is a result in probability theory. It is well known that if each of two independent random variables ξ1 and ξ2 has a Poisson distribution, then their sum ξ=ξ1+ξ2 has a Poisson distribution as well. It turns out that the converse is also valid. Statement of the theorem Suppose that a random variable ξ has a Poisson distribution and admits a decomposition as a sum ξ=ξ1+ξ2 of two independent random variables. Then the distribution of each summand is a shifted Poisson distribution. Comment Raikov's theorem is similar to Cramér's decomposition theorem. The latter result claims that if a sum of two independent random variables has a normal distribution, then each summand is normally distributed as well. It was also proved by Yu. V. Linnik that a convolution of a normal distribution and a Poisson distribution possesses a similar property. An extension to locally compact Abelian groups Let be a locally compact Abelian group. Denote by the convolution semigroup of probability distributions on , and by the degenerate distribution concentrated at . Let . The Poisson distribution generated by the measure is defined as a shifted distribution of the form One has the following Raikov theorem on locally compact Abelian groups: Let be the Poisson distribution generated by the measure . Suppose that , with . If is either an infinite order element, or has order 2, then is also a Poisson distribution. In the case of being an element of finite order , can fail to be a Poisson distribution.
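As a quick illustration of the elementary direction mentioned above (this computation is added here and is not part of the article), if ξ1 and ξ2 are independent with Poisson parameters λ1 and λ2, the convolution formula gives

P(\xi_1+\xi_2=n) = \sum_{k=0}^{n} \frac{e^{-\lambda_1}\lambda_1^{k}}{k!}\,\frac{e^{-\lambda_2}\lambda_2^{\,n-k}}{(n-k)!} = \frac{e^{-(\lambda_1+\lambda_2)}}{n!}\sum_{k=0}^{n}\binom{n}{k}\lambda_1^{k}\lambda_2^{\,n-k} = \frac{e^{-(\lambda_1+\lambda_2)}(\lambda_1+\lambda_2)^{n}}{n!},

so ξ1+ξ2 is Poisson with parameter λ1+λ2. Raikov's theorem is the much deeper converse of this computation.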
https://en.wikipedia.org/wiki/Wake%20knot
The Wake knot or Ormond knot is an English heraldic knot used historically as an heraldic badge by the Wake family, lords of the manor of Bourne in Lincolnshire, and also by the Butler family, Earls of Ormond. Form It is based on the Carrick bend, a knot that connects two ropes, but the Wake knot is depicted joining a rope and a strap. Usage It is depicted in the coat of arms of Bourne Town Council and Bourne Academy, Lincolnshire, where the Wakes were lords of the manor. The crest of the arms of the Isle of Ely County Council was a human hand grasping a trident around which an eel was entwined; on the wrist of the hand was a Wake knot, representing Hereward the Wake. The crest of No. 2 Squadron RAF includes a Wake knot; its motto is Hereward.
https://en.wikipedia.org/wiki/Discrete%20calculus
Discrete calculus, or the calculus of discrete functions, is the mathematical study of incremental change, in the same way that geometry is the study of shape and algebra is the study of generalizations of arithmetic operations. The word calculus is a Latin word, meaning originally "small pebble"; as such pebbles were used for calculation, the meaning of the word has evolved and today usually means a method of computation. Meanwhile, calculus, originally called infinitesimal calculus or "the calculus of infinitesimals", is the study of continuous change. Discrete calculus has two entry points, differential calculus and integral calculus. Differential calculus concerns incremental rates of change and the slopes of piece-wise linear curves. Integral calculus concerns accumulation of quantities and the areas under piece-wise constant curves. These two points of view are related to each other by the fundamental theorem of discrete calculus. The study of the concepts of change starts with their discrete form. The development is dependent on a parameter, the increment of the independent variable. If we so choose, we can make the increment smaller and smaller and find the continuous counterparts of these concepts as limits. Informally, the limit of discrete calculus as the increment approaches zero is infinitesimal calculus. Even though it serves as a discrete underpinning of calculus, the main value of discrete calculus is in applications. Two initial constructions Discrete differential calculus is the study of the definition, properties, and applications of the difference quotient of a function. The process of finding the difference quotient is called differentiation. Given a function defined at several points of the real line, the difference quotient at one of those points is a way of encoding the small-scale (i.e., from that point to the next) behavior of the function. By finding the difference quotient of a function at every pair of consecutive points in its domain, it is possible to produce a new func
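A compact illustration of the two constructions just described (added here; the step size h, the function f and the endpoints are generic): for a function f defined on points spaced h apart, the difference quotient and the telescoping sum that plays the role of the fundamental theorem are

\frac{\Delta f(x)}{h} = \frac{f(x+h)-f(x)}{h}, \qquad \sum_{i=0}^{n-1}\Delta f(a+ih) = f(a+nh)-f(a),

so summing the successive differences of f recovers the total change of f over the interval, just as integrating a derivative does in the continuous setting.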
https://en.wikipedia.org/wiki/Delitto%20perfetto
Delitto perfetto () is a genetic technique for in vivo site-directed mutagenesis in yeast. This name is the Italian term for "perfect murder", and it refers to the ability of the technique to create desired genetic changes without leaving any foreign DNA in the genome. Background This technique was developed by a group at the National Institute of Environmental Health Sciences (NIEHS) composed of Michael A. Resnick, Francesca Storici (now at Georgia Institute of Technology), and L. Kevin Lewis (now at Southwest Texas State University). The method uses synthetic oligonucleotides in combination with the cellular process of homologous recombination. Consequently, it is well suited for genetic manipulation of yeast, which has highly efficient homologous recombination. The delitto perfetto approach has been used to produce single and multiple point mutations, gene truncations or insertions, and whole gene deletions (including essential genes). Advantages The primary advantage of this technique is its ability to eliminate any foreign DNA from the genome after the mutagenesis process. This ensures there are no selectable markers or exogenous sequences used for targeting left in the genome that may cause unforeseen effects. The delitto perfetto technique is also simpler compared to other methods for in vivo site-directed mutagenesis. Other methods require multiple cloning steps and extensive DNA sequencing to confirm mutagenesis, which is often a complicated and inefficient process. There is great flexibility in this approach because after the CORE cassette is inserted (see Method Overview for details), multiple mutations in the gene of interest can be made easily and quickly. This method can be applied to other organisms where homologous recombination is efficient, such as the moss Physcomitrella patens, DT40 chicken cells, or E. coli. In addition, human genes can be studied and similarly genetically manipulated in yeast by using yeast artificial chromosomes (YA
https://en.wikipedia.org/wiki/Systems%20Tool%20Kit
Systems Tool Kit (formerly Satellite Tool Kit), often referred to by its initials STK, is a multi-physics software application from Analytical Graphics, Inc. (an Ansys company) that enables engineers and scientists to perform complex analyses of ground, sea, air, and space platforms, and to share results in one integrated environment. At the core of STK is a geometry engine for determining the time-dynamic position and attitude of objects ("assets"), and the spatial relationships among the objects under consideration including their relationships or accesses given a number of complex, simultaneous constraining conditions. STK has been developed since 1989 as a commercial off the shelf software tool. Originally created to solve problems involving Earth-orbiting satellites, it is now used in the aerospace and defense communities and for many other applications. STK is used in government, commercial, and defense applications around the world. Clients of AGI are organizations such as NASA, ESA, CNES, DLR, Boeing, JAXA, ISRO, Lockheed Martin, Northrop Grumman, Airbus, The US DoD, and Civil Air Patrol. History In 1989, the three founders of Analytical Graphics, Inc. — Paul Graziani, Scott Reynolds, and Jim Poland, left GE Aerospace to create Satellite Tool Kit (STK) as an alternative to bespoke, project-specific aerospace software. The original version of STK ran only on Sun Microsystems computers, but as PCs became more powerful, the code was converted to run on Windows. STK was first adopted by the aerospace community for orbit analysis and access calculations (when a satellite can see a ground-station or image target), but as the software was expanded, more modules were added that included the ability to perform calculations for communications systems, radar, interplanetary missions and orbit collision avoidance. The addition of 3D viewing capabilities led to the adoption of the STK by military users for real-time visualization of air, land and sea forces as we
https://en.wikipedia.org/wiki/Push%20Access%20Protocol
Push Access Protocol (or PAP) is a protocol defined in WAP-164 of the Wireless Application Protocol (WAP) suite from the Open Mobile Alliance. PAP is used for communicating with the Push Proxy Gateway, which is usually part of a WAP Gateway. PAP is intended for use in delivering content from Push Initiators to Push Proxy Gateways for subsequent delivery to narrow band devices, including mobile phones and pagers. Example messages include news, stock quotes, weather, traffic reports, and notification of events such as email arrival. With Push functionality, users are able to receive information without having to request it. In many cases it is important for the user to get the information as soon as it is available. The Push Access Protocol is not intended for use over the air. PAP is designed to be independent of the underlying transport protocol. PAP specifies the following possible operations between the Push Initiator and the Push Proxy Gateway: Submit a Push Cancel a Push Query for status of a Push Query for wireless device capabilities Result notification The interaction between the Push Initiators and the Push Proxy Gateways is in the form of XML messages. Operations Push Submission The purpose of the Push Submission is to deliver a Push message from a Push Initiator to a PPG, which should then deliver the message to a user agent in a device on the wireless network. The Push message contains a control entity and a content entity, and MAY contain a capabilities entity. The control entity is an XML document that contains control information (push-message) for the PPG to use in processing the message for delivery. The content entity represents content to be sent to the wireless device. The capabilities entity contains client capabilities assumed by the Push Initiator and is in the RDF [RDF] format as defined in the User Agent Profile [UAPROF]. The PPG MAY use the capabilities information to validate that the message is appropriate for the cli
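As a concrete illustration of the XML control entity mentioned above, a push submission might look roughly like the following sketch. The element and attribute names follow the general shape of the PAP DTD, but they, the push ID and the address are schematic placeholders and should be checked against the WAP-164/PAP specification rather than taken from this excerpt:

<?xml version="1.0"?>
<pap>
  <!-- control entity: tells the PPG how to deliver the push -->
  <push-message push-id="example-push-0001">
    <!-- hypothetical recipient; the WAPPUSH address format is defined by the spec -->
    <address address-value="WAPPUSH=+15551230000/TYPE=PLMN@ppg.example.com"/>
    <quality-of-service delivery-method="confirmed"/>
  </push-message>
</pap>

The content entity (the news item, stock quote, and so on) and any capabilities entity are carried in further MIME parts of the same submission alongside this control document.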
https://en.wikipedia.org/wiki/Tert-Butylhydroquinone
tert-Butylhydroquinone (TBHQ, tertiary butylhydroquinone) is a synthetic aromatic organic compound which is a type of phenol. It is a derivative of hydroquinone, substituted with a tert-butyl group. Applications Food preservative In foods, TBHQ is used as a preservative for unsaturated vegetable oils and many edible animal fats. It does not cause discoloration even in the presence of iron, and does not change flavor or odor of the material to which it is added. It can be combined with other preservatives such as butylated hydroxyanisole (BHA). As a food additive, its E number is E319. It is added to a wide range of foods. Its primary advantage is extending storage life. Other In perfumery, it is used as a fixative to lower the evaporation rate and improve stability. It is used industrially as a stabilizer to inhibit autopolymerization of organic peroxides. It is used as an antioxidant in biodiesel. It is also added to varnishes, lacquers, resins, and oil-field additives. Safety and regulation The European Food Safety Authority (EFSA) and the United States Food and Drug Administration (FDA) have evaluated TBHQ and determined that it is safe to consume at the concentration allowed in foods. The FDA and European Union both set an upper limit of 0.02% (200mg/kg) of the oil or fat content in foods. At very high doses, it has some negative health effects on lab animals, such as producing precursors to stomach tumors and damage to DNA. A number of studies have shown that prolonged exposure to very high doses of TBHQ may be carcinogenic, especially for stomach tumors. Other studies, however, have shown opposite effects including inhibition against HCA-induced carcinogenesis (by depression of metabolic activation) for TBHQ and other phenolic antioxidants (TBHQ was one of several, and not the most potent). The EFSA considers TBHQ to be noncarcinogenic. A 1986 review of scientific literature concerning the toxicity of TBHQ determined that a wide margin of safety
https://en.wikipedia.org/wiki/Gene%20nomenclature
Gene nomenclature is the scientific naming of genes, the units of heredity in living organisms. It is also closely associated with protein nomenclature, as genes and the proteins they code for usually have similar nomenclature. An international committee published recommendations for genetic symbols and nomenclature in 1957. The need to develop formal guidelines for human gene names and symbols was recognized in the 1960s and full guidelines were issued in 1979 (Edinburgh Human Genome Meeting). Several other genus-specific research communities (e.g., Drosophila fruit flies, Mus mice) have adopted nomenclature standards, as well, and have published them on the relevant model organism websites and in scientific journals, including the Trends in Genetics Genetic Nomenclature Guide. Scientists familiar with a particular gene family may work together to revise the nomenclature for the entire set of genes when new information becomes available. For many genes and their corresponding proteins, an assortment of alternate names is in use across the scientific literature and public biological databases, posing a challenge to effective organization and exchange of biological information. Standardization of nomenclature thus tries to achieve the benefits of vocabulary control and bibliographic control, although adherence is voluntary. The advent of the information age has brought gene ontology, which in some ways is a next step of gene nomenclature, because it aims to unify the representation of gene and gene product attributes across all species. Relationship with protein nomenclature Gene nomenclature and protein nomenclature are not separate endeavors; they are aspects of the same whole. Any name or symbol used for a protein can potentially also be used for the gene that encodes it, and vice versa. But owing to the nature of how science has developed (with knowledge being uncovered bit by bit over decades), proteins and their corresponding genes have not always been discov
https://en.wikipedia.org/wiki/3p%20deletion%20syndrome
3p deletion syndrome is a rare genetic disorder caused by the deletion of small fragments of chromosome 3. Presentation Reported symptoms in patients with 3p deletion syndrome include intellectual disability, delayed psychomotor development, abnormal facial features, muscular hypotonia, epilepsy, and deformation of the gastrointestinal and urinary tracts. Clinical phenotypes are often relatively mild, and genetic testing is required for diagnosis.
https://en.wikipedia.org/wiki/Tammann%20and%20H%C3%BCttig%20temperatures
The Tammann temperature (also spelled Tamman temperature) and the Hüttig temperature of a given material are approximations to the absolute temperatures at which atoms in a bulk crystal lattice (Tammann) or on the surface (Hüttig) of the solid material become sufficiently mobile to diffuse readily, and are consequently more chemically reactive and susceptible to recrystallization, agglomeration or sintering. These temperatures are equal to one-half (Tammann) or one-third (Hüttig) of the absolute temperature of the compound's melting point. The absolute temperatures are usually measured in kelvins. Tammann and Hüttig temperatures are important considerations for catalytic activity, segregation and sintering. The Tammann temperature is important for reactive compounds like explosives and fuel oxidizers, such as potassium chlorate (KClO3, TTammann = 42 °C), potassium nitrate (KNO3, TTammann = 31 °C), and sodium nitrate (NaNO3, TTammann = 17 °C), which may unexpectedly react at much lower temperatures than their melting or decomposition temperatures. The bulk compounds should be contrasted with nanoparticles, which exhibit melting-point depression, meaning that they have significantly lower melting points than the bulk material, and correspondingly lower Tammann and Hüttig temperatures. For instance, 2 nm gold nanoparticles melt at only about 327 °C, in contrast to 1065 °C for bulk gold. History The Tammann temperature was pioneered by Gustav Tammann, a German professor of solid-state chemistry and physics, in the first half of the 20th century. He considered lattice motion very important for the reactivity of matter and quantified his theory as a ratio of the material's absolute temperature to the absolute temperature of its melting point. Dividing a solid's absolute temperature by its absolute melting point yields this ratio, with both values measured in kelvins (K): TTammann = c · Tmelting, where c is a constant dimensionless number. The threshold temperature for act
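A rough numerical sketch of the one-half and one-third rules above (added as an illustration; the melting points used below are common literature values, not taken from the article):

/* Tammann and Huettig temperatures from the absolute melting point. */
#include <stdio.h>

static double tammann_c(double melt_c) { return (melt_c + 273.15) * 0.5 - 273.15; }
static double huettig_c(double melt_c) { return (melt_c + 273.15) / 3.0 - 273.15; }

int main(void) {
    const char  *name[] = { "KClO3", "KNO3", "NaNO3" };
    const double melt[] = { 356.0,   334.0,  308.0 };   /* melting points, deg C */
    for (int i = 0; i < 3; ++i)
        printf("%-6s T_Tammann = %5.1f C   T_Huettig = %6.1f C\n",
               name[i], tammann_c(melt[i]), huettig_c(melt[i]));
    return 0;   /* Tammann values come out near 41, 30 and 17 deg C */
}

The computed Tammann values agree, to within rounding, with the 42 °C, 31 °C and 17 °C figures quoted in the text, which is simply the one-half rule applied to each salt's absolute melting point.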
https://en.wikipedia.org/wiki/Climate%20as%20complex%20networks
The field of complex networks has emerged as an important area of science to generate novel insights into the nature of complex systems. The application of network theory to climate science is a young and emerging field. To identify and analyze patterns in global climate, scientists model climate data as complex networks. Unlike most real-world networks where nodes and edges are well defined, in climate networks, nodes are identified as the sites in a spatial grid of the underlying global climate data set, which can be represented at various resolutions. Two nodes are connected by an edge depending on the degree of statistical similarity (that may be related to dependence) between the corresponding pairs of time-series taken from climate records. The climate network approach enables novel insights into the dynamics of the climate system over different spatial and temporal scales. Construction of climate networks Depending upon the choice of nodes and/or edges, climate networks may take many different forms, shapes, sizes and complexities. Tsonis et al. introduced the field of complex networks to climate. In their model, the nodes for the network were constituted by a single variable (the 500 hPa geopotential height) from NCEP/NCAR Reanalysis datasets. In order to estimate the edges between nodes, the correlation coefficient at zero time lag was estimated between all possible pairs of nodes. A pair of nodes was considered to be connected if their correlation coefficient was above a threshold of 0.5. Steinhaeuser and team introduced the novel technique of multivariate networks in climate by constructing networks from several climate variables separately and capturing their interaction in a multivariate predictive model. It was demonstrated in their studies that, in the context of climate, extracting predictors based on cluster attributes yields informative precursors that improve predictive skill. Kawale et al. presented a graph-based approach to find dipoles in pressure data. Given the importance of tele
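A minimal sketch of the edge rule described above (illustrative only: the two time series below are synthetic stand-ins for grid-point records, and a real construction would loop over all grid-point pairs):

/* Connect two grid points when the zero-lag Pearson correlation of their
 * time series exceeds a threshold, here 0.5 as in the text. */
#include <stdio.h>
#include <math.h>

static double pearson(const double *x, const double *y, int n) {
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (int i = 0; i < n; ++i) {
        sx += x[i]; sy += y[i];
        sxx += x[i] * x[i]; syy += y[i] * y[i]; sxy += x[i] * y[i];
    }
    double cov = sxy - sx * sy / n;
    return cov / sqrt((sxx - sx * sx / n) * (syy - sy * sy / n));
}

int main(void) {
    double a[] = { 1.0, 2.0, 3.0, 4.0, 5.0, 6.0 };
    double b[] = { 1.1, 2.3, 2.9, 4.2, 4.8, 6.1 };   /* strongly correlated */
    double r = pearson(a, b, 6);
    printf("r = %.3f -> %s\n", r, fabs(r) > 0.5 ? "edge" : "no edge");
    return 0;
}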
https://en.wikipedia.org/wiki/P6%20%28microarchitecture%29
The P6 microarchitecture is the sixth-generation Intel x86 microarchitecture, implemented by the Pentium Pro microprocessor that was introduced in November 1995. It is frequently referred to as i686. It was planned to be succeeded by the NetBurst microarchitecture used by the Pentium 4 in 2000, but was revived for the Pentium M line of microprocessors. The successor to the Pentium M variant of the P6 microarchitecture is the Core microarchitecture, which in turn is also derived from P6. P6 was used within Intel's mainstream offerings from the Pentium Pro to the Pentium III, and was widely known for low power consumption, excellent integer performance, and relatively high instructions per cycle (IPC). Features The P6 core was the sixth generation Intel microprocessor in the x86 line. The first implementation of the P6 core was the Pentium Pro CPU in 1995, the immediate successor to the original Pentium design (P5). P6 processors dynamically translate IA-32 instructions into sequences of buffered RISC-like micro-operations, then analyze and reorder the micro-operations to detect parallelizable operations that may be issued to more than one execution unit at once. The Pentium Pro was the first x86 microprocessor designed by Intel to use this technique, though the NexGen Nx586, introduced in 1994, did so earlier. Other features first implemented in the x86 space in the P6 core include: Speculative execution and out-of-order completion (called "dynamic execution" by Intel), which required new retire units in the execution core. This lessened pipeline stalls, and in part enabled greater speed-scaling of the Pentium Pro and successive generations of CPUs. Superpipelining, which increased the pipeline depth from the Pentium's 5 stages to the 14 stages of the Pentium Pro and early models of the Pentium III (Coppermine), and eventually morphed into a pipeline of fewer than 10 stages in the Pentium M for the embedded and mobile market, due to energy inefficiency and higher voltage issues that were encountered in the pr
https://en.wikipedia.org/wiki/List%20of%20WLAN%20channels
Wireless LAN (WLAN) channels are frequently accessed using IEEE 802.11 protocols, and equipment that does so is sold mostly under the trademark Wi-Fi. Other equipment also accesses the same channels, such as Bluetooth. The radio frequency (RF) spectrum is vital for wireless communications infrastructure. The 802.11 standard provides several distinct radio frequency bands for use in Wi-Fi communications: 860/900 MHz, 2.4 GHz, 3.6 GHz, 4.9 GHz, 5 GHz, 5.9 GHz, 6 GHz, 45 GHz and 60 GHz. Each range is divided into a multitude of channels. In the standards, channels are numbered at 5 MHz spacing within a band (except in the 45/60 GHz band, where they are 0.54/1.08/2.16 GHz apart), and the number linearly relates to the centre frequency of the channel. Although channels are numbered at 5 MHz spacing, transmitters generally occupy at least 20 MHz, and standards allow for channels to be bonded together to form wider channels for faster throughput. Countries apply their own regulations to allowable channels, allowed users and maximum power levels within these frequency ranges. The ISM band ranges are also often used. 860/900 MHz (802.11ah) 802.11ah operates in sub-gigahertz unlicensed bands. Each world region supports different sub-bands, and the channel numbering depends on the starting frequency of the sub-band it belongs to. Therefore, there is no global channel numbering plan, and channel numbers are incompatible between world regions (and even between sub-bands of the same world region). The following sub-bands are defined in the 802.11ah specifications: 2.4 GHz (802.11b/g/n/ax) Fourteen channels are designated in the 2.4 GHz range, spaced 5 MHz apart from each other except for a 12 MHz space before channel 14. Interference happens when two networks try to operate in the same band, or when their bands overlap. The two modulation methods used have different characteristics of band usage and therefore occupy different widths: The DSSS method used by legacy 802.1
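A small illustration of the linear channel-number/centre-frequency relation in the 2.4 GHz band (added here; the mapping below is the standard 802.11b/g/n/ax one, including the 12 MHz gap before channel 14, and is not quoted from the excerpt):

/* Centre frequency of a 2.4 GHz Wi-Fi channel in MHz. */
#include <stdio.h>

static int centre_mhz_2g4(int channel) {
    if (channel >= 1 && channel <= 13) return 2407 + 5 * channel;
    if (channel == 14)                 return 2484;   /* Japan-only, after the 12 MHz gap */
    return -1;                                        /* not a 2.4 GHz channel */
}

int main(void) {
    for (int ch = 1; ch <= 14; ++ch)
        printf("channel %2d -> %d MHz\n", ch, centre_mhz_2g4(ch));
    return 0;   /* channel 1 -> 2412 MHz, ..., channel 13 -> 2472 MHz, channel 14 -> 2484 MHz */
}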
https://en.wikipedia.org/wiki/Regulation%20of%20algorithms
Regulation of algorithms, or algorithmic regulation, is the creation of laws, rules and public sector policies for promotion and regulation of algorithms, particularly in artificial intelligence and machine learning. For the subset of AI algorithms, the term regulation of artificial intelligence is used. The regulatory and policy landscape for artificial intelligence (AI) is an emerging issue in jurisdictions globally, including in the European Union. Regulation of AI is considered necessary to both encourage AI and manage associated risks, but challenging. Another emerging topic is the regulation of blockchain algorithms (the use of smart contracts must be regulated), which is mentioned along with the regulation of AI algorithms. Many countries have enacted regulations on high-frequency trading, which is shifting into the realm of AI algorithms as technology progresses. The motivation for regulation of algorithms is the apprehension of losing control over the algorithms, whose impact on human life increases. Multiple countries have already introduced regulations for automated credit score calculation, where a right to explanation is mandatory for those algorithms. For example, the IEEE has begun developing a new standard to explicitly address ethical issues and the values of potential future users. Bias, transparency, and ethics concerns have emerged with respect to the use of algorithms in diverse domains ranging from criminal justice to healthcare—many fear that artificial intelligence could replicate existing social inequalities along race, class, gender, and sexuality lines. Regulation of artificial intelligence Public discussion In 2016, Joy Buolamwini founded the Algorithmic Justice League after a personal experience with biased facial detection software in order to raise awareness of the social implications of artificial intelligence through art and research. In 2017 Elon Musk advocated regulation of algorithms in the context of the existential risk from artifi
https://en.wikipedia.org/wiki/David%20Bevan%20%28mathematician%29
David Bevan is an English mathematician, computer scientist and software developer. He is known for Bevan's theorem, which gives the asymptotic enumeration of grid classes of permutations and for his work on enumerating the class of permutations avoiding the pattern 1324. He is also known for devising weighted reference counting, an approach to computer memory management that is suitable for use in distributed systems. Work and research Bevan is a lecturer in combinatorics in the department of Mathematics and Statistics at the University of Strathclyde. He has degrees in mathematics and computer science from the University of Oxford and a degree in theology from the London School of Theology. He received his PhD in mathematics from The Open University in 2015; his thesis, On the growth of permutation classes, was supervised by Robert Brignall. In 1987, as a research scientist at GEC's Hirst Research Centre in Wembley, he developed an approach to computer memory management, called weighted reference counting, that is suitable for use in distributed systems. During the 1990s, while working for the Summer Institute of Linguistics in Papua New Guinea, he developed a computer program, called FindPhone, that was widely used by field linguists to analyse phonetic data in order to understand the phonology of minority languages. While employed by Pitney Bowes, he was a major contributor to the development of the FreeType text rendering library. Bevan's mathematical research has concerned areas of enumerative combinatorics, particularly in relation to permutation classes. He established that the growth rate of a monotone grid class of permutations is equal to the square of the spectral radius of a related bipartite graph. He has also determined bounds on the growth rate of the class of permutations avoiding the pattern 1324. In the Acknowledgements sections of his journal articles, he often includes the Latin phrase Soli Deo gloria. Selected publications External links
https://en.wikipedia.org/wiki/Renal%20dysplasia-limb%20defects%20syndrome
Renal dysplasia-limb defects syndrome (RL syndrome), also known as Ulbright–Hodes syndrome, is a very rare autosomal recessive congenital disorder. It has been described in three infants, all of whom died shortly after birth. Presentation RL syndrome is characterized by renal dysplasia, growth retardation, phocomelia or mesomelia, radiohumeral fusion (joining of radius and humerus), rib abnormalities, anomalies of the external genitalia and potter-like facies among many others. Genetics RL syndrome is inherited in an autosomal recessive manner. This means the defective gene responsible for the disorder is located on an autosome, and two copies of the defective gene (one inherited from each parent) are required in order to be born with the disorder. The parents of an individual with an autosomal recessive disorder both carry one copy of the defective gene, but usually do not experience any signs or symptoms of the disorder. Diagnosis Treatment
https://en.wikipedia.org/wiki/Program%20slicing
In computer programming, program slicing is the computation of the set of program statements, the program slice, that may affect the values at some point of interest, referred to as a slicing criterion. Program slicing can be used in debugging to locate the source of errors more easily. Other applications of slicing include software maintenance, optimization, program analysis, and information flow control. Slicing techniques have seen rapid development since the original definition by Mark Weiser. At first, slicing was only static, i.e., applied to the source code with no other information than the source code. Bogdan Korel and Janusz Laski introduced dynamic slicing, which works on a specific execution of the program (for a given execution trace). Other forms of slicing exist, for instance path slicing. Static slicing Based on the original definition of Weiser, informally, a static program slice S consists of all statements in program P that may affect the value of variable v in a statement x. The slice is defined for a slicing criterion C=(x,v) where x is a statement in program P and v is a variable in x. A static slice includes all the statements that can affect the value of variable v at statement x for any possible input. Static slices are computed by backtracking dependencies between statements. More specifically, to compute the static slice for (x,v), we first find all statements that can directly affect the value of v before statement x is encountered. Recursively, for each statement y which can affect the value of v in statement x, we compute the slices for all variables z in y that affect the value of v. The union of all those slices is the static slice for (x,v). Example For example, consider the C program below. Let's compute the slice for ( write(sum), sum ). The value of sum is directly affected by the statements "sum = sum + i + w" if N>1 and "int sum = 0" if N <= 1. So, slice( write(sum), sum) is the union of three slices and the "in
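The program the example refers to is not included in this excerpt; the following small C fragment is a hypothetical reconstruction consistent with the statements quoted above ("int sum = 0", "sum = sum + i + w", "write(sum)") and is added purely as an illustration. N is assumed to be a parameter and write() an output routine declared elsewhere, as in the original example.

/* assumes a declaration such as: void write(int); */
void example(int N) {
    int i;
    int sum = 0;               /* in the slice: defines sum */
    int product = 1;           /* not in the slice for (write(sum), sum) */
    int w = 7;                 /* in the slice: used in the loop body */
    for (i = 1; i < N; ++i) {  /* in the slice: controls the assignment to sum */
        sum = sum + i + w;     /* in the slice: affects sum when N > 1 */
        product = product * i; /* not in the slice */
    }
    write(sum);                /* the slicing criterion (write(sum), sum) */
    write(product);
}

Every statement marked "in the slice" can affect the value printed by write(sum), while the statements involving product can be removed without changing that value, which is exactly what the backward dependency computation described above determines.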
https://en.wikipedia.org/wiki/Helianthus%20%C3%97%20laetiflorus
Helianthus × laetiflorus, the cheerful sunflower or perennial sunflower, is a plant in the family Asteraceae. It is widespread in scattered locations across much of Canada from Newfoundland to British Columbia, and in the central and eastern United States as far south as Texas and Georgia. Description Helianthus × laetiflorus is a herbaceous plant with alternate, simple leaves on green stems. The flowers are yellow and are borne in late summer.
https://en.wikipedia.org/wiki/Sum%20rule%20in%20quantum%20mechanics
In quantum mechanics, a sum rule is a formula for transitions between energy levels, in which the sum of the transition strengths is expressed in a simple form. Sum rules are used to describe the properties of many physical systems, including solids, atoms, atomic nuclei, and nuclear constituents such as protons and neutrons. The sum rules are derived from general principles, and are useful in situations where the behavior of individual energy levels is too complex to be described by a precise quantum-mechanical theory. In general, sum rules are derived by using Heisenberg's quantum-mechanical algebra to construct operator equalities, which are then applied to the particles or energy levels of a system. Derivation of sum rules Assume that the Hamiltonian has a complete set of eigenfunctions with eigenvalues : For the Hermitian operator we define the repeated commutator iteratively by: The operator is Hermitian since is defined to be Hermitian. The operator is anti-Hermitian: By induction one finds: and also For a Hermitian operator we have Using this relation we derive: The result can be written as For this gives: See also Oscillator strength Sum rules (quantum field theory) QCD sum rules
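The formulas in this excerpt did not survive extraction. As a hedged reconstruction of the standard setup (notation assumed, not quoted from the article), the eigenvalue equation, the repeated commutator, and the best-known special case, a Thomas–Reiche–Kuhn-type energy-weighted sum rule for a Hermitian operator, can be written as:

```latex
\hat{H}\,|n\rangle = E_n\,|n\rangle, \qquad
\hat{C}^{(0)} \equiv \hat{A}, \qquad
\hat{C}^{(k)} \equiv \bigl[\hat{H},\,\hat{C}^{(k-1)}\bigr],
\qquad\text{and, for } k = 1,\qquad
\sum_n \left(E_n - E_0\right)\bigl|\langle n|\hat{A}|0\rangle\bigr|^{2}
  = \tfrac{1}{2}\,\langle 0|\bigl[\hat{A},[\hat{H},\hat{A}]\bigr]|0\rangle .
```

The right-hand side is a ground-state expectation value of a double commutator, which is exactly the kind of operator equality the article says is applied to the levels of a system.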
https://en.wikipedia.org/wiki/Loggly
SolarWinds Loggly is a cloud-based log management and analytics service provider based in San Francisco, California. Jon Gifford, Raffael Marty, and Kord Campbell founded the company in 2009, and Charlie Oppenheimer was the CEO of Loggly until its announced acquisition by SolarWinds (as part of the SolarWinds Cloud division of brands) on January 8, 2018. History In 2009, Jon Gifford, Raffael Marty, and Kord Campbell founded Loggly. App47, a mobile application management provider, partnered with Loggly in September 2012. The company chose Loggly because of its software-as-a-service (SaaS) deployment option. In September 2013, Loggly released "Generation 2", an updated version of its service. The update included log collection through standard syslog protocols and a graphical web interface that allowed users to use a point-and-click process to find log events and generate charts. That month, Loggly completed a $10.5 million funding round led by Cisco and Data Collective. Trinity Ventures, True Ventures and Matrix Partners also participated in the round. In October 2014, the company announced a $15 million series C funding round led by Harmony Partners. Matrix Partners, Trinity Ventures, Cisco, Data Collective, and True Ventures also participated. The funding round raised Loggly's total investment funding to $33.4 million. The company released Loggly Dynamic Field Explorer, a new user experience that aims to reduce the time developers spend on identifying and troubleshooting problems, that month. On January 8, 2018, the company announced that they are now part of SolarWinds. Operations Loggly is headquartered in San Francisco, California. The company had 54 employees and 10,000 customers in October 2017. Loggly records log data from any device and reports it in a real-time management platform with trend data. Technology Loggly is a cloud-based log management service provider. It does not require the use of proprietary software agents to collect log data. The serv
https://en.wikipedia.org/wiki/Nicking%20enzyme%20amplification%20reaction
Nicking Enzyme Amplification Reaction (NEAR) is a method for in vitro DNA amplification like the polymerase chain reaction (PCR). NEAR is isothermal, replicating DNA at a constant temperature using a polymerase (and nicking enzyme) to exponentially amplify the DNA at a temperature range of 55 °C to 59 °C. One disadvantage of PCR is that it consumes time using heat to uncoil the double-stranded DNA into single strands (a process called denaturation). This leads to amplification times of typically thirty minutes or more for significant production of amplified products. Potential advantages of NEAR over PCR are increased speed and lower energy requirements, characteristics that are shared with other isothermal amplification schemes. A major disadvantage of NEAR relative to PCR is that production of nonspecific amplification products is a common issue with isothermal amplification reactions. The NEAR reaction uses naturally occurring or engineered endonucleases that introduce a strand break on only one strand of a double-stranded DNA cleavage site. The ability of several of these enzymes to catalyze isothermal DNA amplification was disclosed but not claimed in the patents issued for the enzymes themselves.
https://en.wikipedia.org/wiki/Library%20Hub%20Discover
Library Hub Discover is a union catalog operated by Jisc. It replaces Copac and SUNCAT. Its user interface is centred around a simple search engine-like query box.
https://en.wikipedia.org/wiki/Uridine%20triphosphate
Uridine-5′-triphosphate (UTP) is a pyrimidine nucleoside triphosphate, consisting of the organic base uracil linked to the 1′ carbon of the ribose sugar, and esterified with tri-phosphoric acid at the 5′ position. Its main role is as substrate for the synthesis of RNA during transcription. UTP is the precursor for the production of CTP via CTP synthetase. UTP can be biosynthesized from UDP by Nucleoside Diphosphate Kinase after using the phosphate group from ATP. UDP + ATP ⇌ UTP + ADP; both UTP and ATP are energetically equal. The homologue in DNA is thymidine triphosphate (TTP or dTTP). UTP also has a deoxyribose form (dUTP). Role in metabolism UTP also has the role of a source of energy or an activator of substrates in metabolic reactions, like that of ATP, but more specific. When UTP activates a substrate (like Glucose-1-phosphate), UDP-glucose is formed and inorganic phosphate is released. UDP-glucose enters the synthesis of glycogen. UTP is used in the metabolism of galactose, where the activated form UDP-galactose is converted to UDP-glucose. UDP-glucuronate is used to conjugate bilirubin to a more water-soluble bilirubin diglucuronide. UTP is also used to activate amino sugars like Glucosamine-1-phosphate to UDP-glucosamine, and N-acetyl-glucosamine-1-phosphate to UDP-N-acetylglucosamine. Role in receptor mediation UTP also has roles in mediating responses by extracellular binding to the P2Y receptors of cells. UTP and its derivatives are still being investigated for their applications in human medicine. See also CTP synthase
https://en.wikipedia.org/wiki/Multi-trials%20technique
The multi-trials technique by Schneider et al. is employed for distributed algorithms and allows symmetry to be broken efficiently. Symmetry breaking is necessary, for instance, in resource allocation problems, where many entities want to access the same resource concurrently. Message-passing algorithms typically employ one attempt to break symmetry per message exchange. The multi-trials technique extends this approach by making several attempts with every message exchange. For example, in a simple algorithm for computing an O(Δ) vertex coloring, where Δ denotes the maximum degree in the graph, every uncolored node randomly picks an available color and keeps it if no neighbor (concurrently) chooses the same color. For the multi-trials technique, a node gradually increases the number of chosen colors in every communication round. The technique can yield more than an exponential reduction in the required communication rounds. However, if the maximum degree Δ is small, more efficient techniques exist, e.g. the (extended) coin-tossing technique by Richard Cole and Uzi Vishkin.
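The coloring step can be sketched as follows. This is not the authors' code: the round structure, the palette size (2Δ here), and the conflict rule are assumptions chosen only to contrast one proposal per round with several proposals per round.

```python
import random

# Hedged sketch of rounds of randomized vertex coloring on an undirected
# graph given as adjacency lists.  In the basic scheme each uncolored node
# proposes ONE color per round; the multi-trials idea is to let it propose
# `trials` candidate colors, keeping one that no neighbor proposed or holds.

def coloring_round(colors, adj, palette_size, trials=1):
    proposals = {}
    for v in adj:
        if colors[v] is None:
            taken = {colors[u] for u in adj[v] if colors[u] is not None}
            free = [c for c in range(palette_size) if c not in taken]
            proposals[v] = set(random.sample(free, min(trials, len(free))))
    for v, candidates in proposals.items():
        blocked = {colors[u] for u in adj[v] if colors[u] is not None}
        for u in adj[v]:
            blocked |= proposals.get(u, set())
        keepable = sorted(candidates - blocked)
        if keepable:
            colors[v] = keepable[0]   # no neighbor can keep the same color

# Example: a triangle (Δ = 2) colored with a palette of size 2Δ = 4.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
colors = {v: None for v in adj}
while any(c is None for c in colors.values()):
    coloring_round(colors, adj, palette_size=4, trials=2)
print(colors)
```

With trials=1 this reduces to the simple single-attempt scheme; raising trials lets a node succeed in a round even when some of its candidate colors collide with neighbors' proposals.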
https://en.wikipedia.org/wiki/Poisson-Dirichlet%20distribution
In probability theory, a branch of mathematics, Poisson-Dirichlet distributions are probability distributions on the set of nonnegative, non-increasing sequences with sum 1, depending on two parameters and . The distribution can be defined as follows. One considers independent random variables such that follows the beta distribution of parameters and . Then, the Poisson-Dirichlet distribution of parameters and is the law of the random decreasing sequence containing and the products . This definition is due to Jim Pitman and Marc Yor. It generalizes Kingman's law, which corresponds to the particular case . Number theory Patrick Billingsley has proven the following result: if is a uniform random integer in , if is a fixed integer, and if are the largest prime divisors of (with arbitrarily defined if has fewer than prime factors), then the joint distribution of converges to the law of the first elements of a distributed random sequence, when goes to infinity. Random permutations and Ewens's sampling formula The Poisson-Dirichlet distribution of parameters and is also the limiting distribution, for going to infinity, of the sequence , where is the length of the largest cycle of a uniformly distributed permutation of order . If for , one replaces the uniform distribution by the distribution on such that , where is the number of cycles of the permutation , then we get the Poisson-Dirichlet distribution of parameters and . The probability distribution is called Ewens's distribution, and comes from Ewens's sampling formula, first introduced by Warren Ewens in population genetics, in order to describe the probabilities associated with counts of how many different alleles are observed a given number of times in the sample.
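The parameter symbols were stripped from this excerpt; the construction described above is usually stated with two parameters, here written alpha and theta, and a stick-breaking (GEM) sequence whose ranked weights follow the Poisson-Dirichlet law. The sketch below simulates a truncated draw under that assumed convention.

```python
import numpy as np

# Hedged sketch: simulate the stick-breaking construction usually associated
# with the two-parameter Poisson-Dirichlet law PD(alpha, theta).  The
# parameter convention (0 <= alpha < 1, theta > -alpha) and the Beta(1-alpha,
# theta + i*alpha) sticks are assumptions; the infinite sequence is truncated.

def sample_pd(alpha, theta, n_sticks=1000, seed=None):
    rng = np.random.default_rng(seed)
    i = np.arange(1, n_sticks + 1)
    v = rng.beta(1.0 - alpha, theta + i * alpha)          # stick fractions
    residual = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    weights = v * residual                                 # GEM weights
    return np.sort(weights)[::-1]                          # decreasing order

print(sample_pd(0.5, 1.0, seed=0)[:5])   # five largest weights of one draw
```

Sorting the weights in decreasing order is what turns the size-biased (GEM) sequence into a draw from the Poisson-Dirichlet distribution itself.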
https://en.wikipedia.org/wiki/Madeira%20finch
The Madeira finch (Goniaphea leucocephala) is a hypothetical species of recently extinct small passerine bird from the Portuguese archipelago of Madeira, which is biogeographically part of Macaronesia. It should not be confused with the extant Madeiran chaffinch (Fringilla coelebs maderensis). Discovery The only known account is the description and illustration of a bird made by British author Thomas Edward Bowdich during the autumn of 1823, while on his third voyage to Africa, and published by his widow Sarah Bowdich Lee in the 1825 book Excursions in Madeira and Porto Santo. No remains of this bird survive; if collected it could have been lost while en route to Europe, as happened to most of Bowdich's specimens. Description Bowdich's original text and footnote read: Taxonomy Bowdich's concern that G. leucocephala might not be native to Madeira was echoed by other authors, who placed it in the New World genus Passerina rather than the Fringillidae sensu stricto. However, this was rejected as implausible by Harald Pieper in 1985. Pieper found abundant subfossil remains of fringillids in Madeira, which included at least one clearly new species of the genus Acanthis or a close relative. But despite noting Bowdich's account and supporting its interpretation as belonging to an endemic, recently extinct fringillid from Madeira, he refrained from assigning any remains to this species. Extinction No similar bird was seen or described again, so the species must have disappeared before Richard Thomas Lowe surveyed Madeira and Porto Santo in 1853. It could have disappeared due to human alteration of its habitat or introduced predators. The natural vegetation of the islands was considerably altered after settlement began in 1420, particularly in the lowlands, which caused the extinction of several bird species including flightless rails and quails. By 1859, Charles Darwin noted that the islands were strangely devoid of endemic species. Pieper, who studied pre-settlement
https://en.wikipedia.org/wiki/Pontecorvo%E2%80%93Maki%E2%80%93Nakagawa%E2%80%93Sakata%20matrix
In particle physics, the Pontecorvo–Maki–Nakagawa–Sakata matrix (PMNS matrix), Maki–Nakagawa–Sakata matrix (MNS matrix), lepton mixing matrix, or neutrino mixing matrix is a unitary mixing matrix which contains information on the mismatch of quantum states of neutrinos when they propagate freely and when they take part in weak interactions. It is a model of neutrino oscillation. This matrix was introduced in 1962 by Ziro Maki, Masami Nakagawa, and Shoichi Sakata, to explain the neutrino oscillations predicted by Bruno Pontecorvo. The PMNS matrix The Standard Model of particle physics contains three generations or "flavors" of neutrinos, , , and , each labeled with a subscript showing the charged lepton that it partners with in the charged-current weak interaction. These three eigenstates of the weak interaction form a complete, orthonormal basis for the Standard Model neutrino. Similarly, one can construct an eigenbasis out of three neutrino states of definite mass, , , and , which diagonalize the neutrino's free-particle Hamiltonian. Observations of neutrino oscillation established experimentally that for neutrinos, as for quarks, these two eigenbases are different – they are 'rotated' relative to each other. Consequently, each flavor eigenstate can be written as a combination of mass eigenstates, called a "superposition", and vice versa. The PMNS matrix, with components corresponding to the amplitude of mass eigenstate in terms of flavor "", "", ""; parameterizes the unitary transformation between the two bases: The vector on the left represents a generic neutrino expressed in the flavor-eigenstate basis, and on the right is the PMNS matrix multiplied by a vector representing that same neutrino in the mass-eigenstate basis. A neutrino of a given flavor is thus a "mixed" state of neutrinos with distinct mass: If one could measure directly that neutrino's mass, it would be found to have mass with probability . The PMNS matrix for antineutrinos is identical
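The defining matrix relation referred to above did not survive extraction. A conventional rendering (notation assumed, reconstructed from the surrounding description rather than quoted from the article) of the flavor-basis vector as the PMNS matrix acting on the mass-basis vector, together with the probability statement for a mass measurement, is:

```latex
\begin{pmatrix} \nu_{e} \\ \nu_{\mu} \\ \nu_{\tau} \end{pmatrix}
=
\begin{pmatrix}
U_{e1} & U_{e2} & U_{e3} \\
U_{\mu 1} & U_{\mu 2} & U_{\mu 3} \\
U_{\tau 1} & U_{\tau 2} & U_{\tau 3}
\end{pmatrix}
\begin{pmatrix} \nu_{1} \\ \nu_{2} \\ \nu_{3} \end{pmatrix},
\qquad
P\!\left(\nu_{\alpha}\ \text{found with mass}\ m_{i}\right) = \left|U_{\alpha i}\right|^{2}.
```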
https://en.wikipedia.org/wiki/Multivalued%20function
In mathematics, a multivalued function is a set-valued function with additional properties depending on context. The terms multifunction and many-valued function are sometimes also used. A multivalued function of sets f : X → Y is a subset Γf of X × Y. Write f(x) for the set of those y ∈ Y with (x,y) ∈ Γf. If f is an ordinary function, it is a multivalued function by taking its graph. Ordinary functions are called single-valued functions to distinguish them. Motivation The term multivalued function originated in complex analysis, from analytic continuation. It often occurs that one knows the value of a complex analytic function in some neighbourhood of a point . This is the case for functions defined by the implicit function theorem or by a Taylor series around . In such a situation, one may extend the domain of the single-valued function along curves in the complex plane starting at . In doing so, one finds that the value of the extended function at a point depends on the chosen curve from to ; since none of the new values is more natural than the others, all of them are incorporated into a multivalued function. For example, let be the usual square root function on positive real numbers. One may extend its domain to a neighbourhood of in the complex plane, and then further along curves starting at , so that the values along a given curve vary continuously from . Extending to negative real numbers, one gets two opposite values for the square root—for example for —depending on whether the domain has been extended through the upper or the lower half of the complex plane. This phenomenon is very frequent, occurring for nth roots, logarithms, and inverse trigonometric functions. To define a single-valued function from a complex multivalued function, one may distinguish one of the multiple values as the principal value, producing a single-valued function on the whole plane which is discontinuous along certain boundary curves. Alternatively, dealing with the multivalued function allo
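A small numeric illustration of the square-root example (the paths and the sample count are arbitrary choices, not taken from the article): following the square root continuously along the upper versus the lower unit semicircle from 1 to -1 ends at +i and -i respectively, which are exactly the two branch values.

```python
import numpy as np

# Continue sqrt(z) along z = exp(i*theta) from 1 to -1 through the upper
# half-plane (theta: 0 -> pi) and the lower half-plane (theta: 0 -> -pi).
# Along each path the continuous square root is exp(i*theta/2).
theta_up = np.linspace(0.0, np.pi, 200)
theta_dn = np.linspace(0.0, -np.pi, 200)
sqrt_up = np.exp(1j * theta_up / 2)[-1]   # value reached at z = -1
sqrt_dn = np.exp(1j * theta_dn / 2)[-1]
print(sqrt_up, sqrt_dn)                   # approximately 1j and -1j
```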
https://en.wikipedia.org/wiki/RTI%20International
Research Triangle Institute, trading as RTI International, is a nonprofit organization headquartered in the Research Triangle Park in North Carolina, USA. RTI provides research and technical services. It was founded in 1958 with $500,000 in funding from local businesses and the three North Carolina universities that form the Research Triangle. RTI research has covered topics like HIV/AIDS, healthcare, education curriculum and the environment. The US Agency for International Development accounts for about 35 percent of RTI's research revenue. History In 1954, a building contractor, met with the North Carolina state treasurer and the president of Wachovia to discuss building a research park in North Carolina to attract new industries to the region. They obtained support for the concept from the state governor, Luther Hodges, and the three universities that form the research triangle: University of North Carolina at Chapel Hill, Duke University and North Carolina State University. The Research Triangle Institute (now RTI International) was formed by the park's founders as the research park's first tenant in 1958. The following January, they announced that $1.425 million had been raised by the Research Triangle Foundation to fund the park and that $500,000 of it had been set aside for RTI International. RTI started with three divisions: Isotope Development, Operational Sciences and Statistics Research. Its first contract was a $4,500 statistical study of morbidity data from Tennessee. In RTI's first year of operation, it had 25 staff and $240,000 in research contracts. Its early work was focused on statistics, but within a few years RTI expanded into radioisotopes, organic chemistry and polymers. In 1960, the institute had its first international research contract for an agricultural census in Nigeria. RTI won contracts with the Department of Education, Defense Department, NASA and the Atomic Energy Commission, growing to $3.4 million in contracts in 1964 and $85 mil
https://en.wikipedia.org/wiki/Octagon
In geometry, an octagon (from the Greek ὀκτάγωνον oktágōnon, "eight angles") is an eight-sided polygon or 8-gon. A regular octagon has Schläfli symbol {8} and can also be constructed as a quasiregular truncated square, t{4}, which alternates two types of edges. A truncated octagon, t{8} is a hexadecagon, {16}. A 3D analog of the octagon can be the rhombicuboctahedron with the triangular faces on it like the replaced edges, if one considers the octagon to be a truncated square. Properties The sum of all the internal angles of any octagon is 1080°. As with all polygons, the external angles total 360°. If squares are constructed all internally or all externally on the sides of an octagon, then the midpoints of the segments connecting the centers of opposite squares form a quadrilateral that is both equidiagonal and orthodiagonal (that is, whose diagonals are equal in length and at right angles to each other). The midpoint octagon of a reference octagon has its eight vertices at the midpoints of the sides of the reference octagon. If squares are constructed all internally or all externally on the sides of the midpoint octagon, then the midpoints of the segments connecting the centers of opposite squares themselves form the vertices of a square. Regularity A regular octagon is a closed figure with sides of the same length and internal angles of the same size. It has eight lines of reflective symmetry and rotational symmetry of order 8. A regular octagon is represented by the Schläfli symbol {8}. The internal angle at each vertex of a regular octagon is 135° ( radians). The central angle is 45° ( radians). Area The area of a regular octagon of side length a is given by In terms of the circumradius R, the area is In terms of the apothem r (see also inscribed figure), the area is These last two coefficients bracket the value of pi, the area of the unit circle. The area can also be expressed as where S is the span of the octagon, or the second-shortest diagonal
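The area formulas referred to above did not survive extraction. The standard expressions, reconstructed here from well-known geometry rather than copied from the article, in terms of the side a, the circumradius R, and the apothem r, are:

```latex
A = 2\left(1+\sqrt{2}\right)a^{2} \approx 4.828\,a^{2},
\qquad
A = 2\sqrt{2}\,R^{2} \approx 2.828\,R^{2},
\qquad
A = 8\left(\sqrt{2}-1\right)r^{2} \approx 3.314\,r^{2}.
```

The last two coefficients, about 2.828 and 3.314, are the ones said to bracket the value of pi. In terms of the span S = (1 + sqrt(2)) a, the area can also be written A = S^2 - a^2, which reduces to the first expression above.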
https://en.wikipedia.org/wiki/Margit%20Voigt
Margit Voigt is a German mathematician specializing in graph theory and graph coloring. She is a professor of operations research at the University of Applied Sciences Dresden. Voigt completed her Ph.D. in 1992 at the Technische Universität Ilmenau. Her dissertation, Über die chromatische Zahl einer speziellen Klasse unendlicher Graphen [On the chromatic number of a special class of infinite graphs] was jointly supervised by Rainer Bodendiek and Hansjoachim Walther. Her results include the first known planar graph that requires five colors for list coloring, and a counterexample to a related conjecture that list coloring of planar graphs requires at most one more color than graph coloring for the same graphs.
https://en.wikipedia.org/wiki/Wikidata
Wikidata is a collaboratively edited multilingual knowledge graph hosted by the Wikimedia Foundation. It is a common source of open data that Wikimedia projects such as Wikipedia, and anyone else, can use under the CC0 public domain license. Wikidata is a wiki powered by the software MediaWiki, including its Wikibase extension for semi-structured data. Concept Wikidata is a document-oriented database, focused on items, which represent any kind of topic, concept, or object. Each item is allocated a unique, persistent identifier, a positive integer prefixed with the upper-case letter Q, known as a "QID". This enables the basic information required to identify the topic that the item covers to be translated without favouring any language. Item labels need not be unique. For example, there are two items named "Elvis Presley": one representing the American singer and actor, and another representing his self-titled album. However, the combination of a label and its description must be unique. To avoid ambiguity, an item's unique identifier (QID) is therefore linked to this combination. Main parts Fundamentally, an item consists of: An identifier (the QID), related to a label and a description. Optionally, multiple aliases and some number of statements (and their properties and values). Statements Statements are how any information known about an item is recorded in Wikidata. Formally, they consist of key–value pairs, which match a property (such as "author", or "publication date") with one or more entity values (such as "Sir Arthur Conan Doyle" or "1902"). For example, the informal English statement "milk is white" would be encoded by a statement pairing the property with the value under the item . Statements may map a property to more than one value. For example, the "occupation" property for Marie Curie could be linked with the values "physicist" and "chemist", to reflect the fact that she engaged in both occ
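The item structure described above can be pictured with a small schematic. This is not Wikidata's actual JSON serialization, and the identifiers are made up; it only mirrors the label/description/alias/statement layout and the multi-valued "occupation" example from the text.

```python
# Schematic only: a made-up stand-in for how an item groups its parts.
item = {
    "id": "Q0",                                  # hypothetical QID
    "labels": {"en": "Marie Curie"},
    "descriptions": {"en": "physicist and chemist"},
    "aliases": {"en": ["Maria Sklodowska-Curie"]},   # illustrative alias
    "statements": {
        "occupation": ["physicist", "chemist"],  # one property, two values
    },
}

# A statement is simply a property (key) paired with one or more values.
for prop, values in item["statements"].items():
    print(prop, "->", values)
```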
https://en.wikipedia.org/wiki/Friction%20torque
In mechanics, friction torque is the torque caused by the frictional force that occurs when two objects in contact move. Like all torques, it is a rotational force that may be measured in newton meters or pound-feet. Engineering Friction torque can be disruptive in engineering. There are a variety of measures engineers may choose to take to eliminate these disruptions. Ball bearings are an example of an attempt to minimize the friction torque. Friction torque can also be an asset in engineering. Bolts, nuts, and screws are often designed to be fastened with a given amount of torque, where the friction is adequate during use or operation for the bolt, nut, or screw to remain safely fastened. This is true with such applications as lug nuts retaining wheels to vehicles, or equipment subjected to vibration with sufficiently well-attached bolts, nuts, or screws to prevent the vibration from shaking them loose. Examples When a cyclist applies the brake to the front wheel, the bicycle tips forward due to the frictional torque between the wheel and the ground. When a golf ball hits the ground it begins to spin, in part because of the friction torque arising from the contact between the ball and the ground. See also Torque Force Engineering Mechanics Moment (physics)
https://en.wikipedia.org/wiki/Worst-case%20distance
In fabrication, the yield Y=(number of good samples)/(total number of samples) is one of the most important measures. Engineers already try to maximize the yield in the design phase by using simulation techniques and statistical models. Often the data follows the well-known bell-shaped normal distribution, and for such distributions there is a simple direct relationship between the design margin (to a given specification limit) and the yield. If we express the specification margin in terms of standard deviation sigma, we can immediately calculate yield Y according to this specification. The concept of worst-case distance (WCD) extends this simple idea so that it can be applied to more complex problems (like having non-normal distributions, multiple specs, etc.). The WCD is a metric originally applied in electronic design for yield optimization and design centering, nowadays also applied as a metric for quantifying electronic system and device robustness. For yield optimization in electronic circuit design the WCD relates the following yield-influencing factors to each other: Statistical distribution of design parameters usually based on the used technology process Operating range of operating conditions the design will work in Performance specification for performance parameters Although the strict mathematical formalism may be complex, in a simple interpretation the WCD is the maximum of all possible (i.e. being within the specification limits) performance variances divided by the distance to the performance specification, given that the performance variances are evaluated over the space spanned by the operating range. Note: this interpretation is valid for normally (Gaussian) distributed variables and performances; conveniently, the "specification margin" of a design is almost intuitively related to the yield: with a larger "safety margin" to the limit(s) the design is more on the safe side and production will contain fewer failing samples.
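The margin-to-yield relationship for the normal-distribution case mentioned above can be made concrete with a short sketch. A single one-sided specification limit is assumed, and the function name is illustrative; the yield is then just the standard normal CDF evaluated at the margin in units of sigma.

```python
from math import erf, sqrt

# Yield for a one-sided spec limit that is k standard deviations away from
# the mean of a normally distributed performance parameter: Phi(k).
def yield_from_sigma_margin(k: float) -> float:
    return 0.5 * (1.0 + erf(k / sqrt(2.0)))

for k in (1, 2, 3, 4):
    print(f"{k} sigma margin -> yield {yield_from_sigma_margin(k):.5%}")
```

This prints roughly 84.1%, 97.7%, 99.87%, and 99.997%, which is the intuitive "larger safety margin, fewer failing samples" relationship the article alludes to.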
https://en.wikipedia.org/wiki/Segue
A segue is a transition from one topic or section to the next. The term is derived from Italian segue, which literally means "follows". In music In music, segue is a direction to the performer. It means continue (the next section) without a pause. The term attacca is used synonymously. For written music, it implies a transition from one section to the next without any break. In improvisation, it is often used for transitions created as a part of the performance, leading from one section to another. In live performance, a segue can occur during a jam session, where the improvisation of the end of one song progresses into a new song. Segues can even occur between groups of musicians during live performance. For example, as one band finishes its set, members of the following act replace members of the first band one by one, until a complete band swap occurs. In recorded music, a segue sometimes means a seamless change between one song and another, sometimes achieved through beatmatching, especially on dance and disco recordings. However, as noted by composer John Williams in the liner notes for his Star Wars soundtrack album, a series of musical ideas can be juxtaposed with no transitions whatsoever. Arrangements that involve or create the effect of a classical musical suite may be used in many pieces or progressive rock recordings, but by definition, a segue does not involve a bridging transition; it is an abrupt change of musical idea. Frank Zappa made extensive use of the segue technique, joining the elements of his albums without breaks. This was first used in 1966 on Zappa's Freak Out!, and a year later on the Beatles' Sgt. Pepper's Lonely Hearts Club Band. In some Brazilian musical styles, where it is called "emendar" ("to splice"), in particular in Samba and Forró Pé de Serra, it is very commonly used in live performances, creating sets that usually last around 20 minutes but can sometimes take more than an hour, switching seamlessly between differ
https://en.wikipedia.org/wiki/Treedom
Treedom is a platform that allows anyone to plant trees in different countries of the world. Treedom also allows the 'owners' of the planted trees to receive images of the trees that have just been planted, along with their GPS coordinates and updates from the project they are part of. Procedure The project developer who applies to become a "tree planter" has to make a formal request in the form of the "project". The submission is reviewed to exclude projects that require cutting other trees to make space, violate the law, involve planting invasive species, and the like. The farmer confirms that a tree has been planted with the help of a specialized mobile application that captures both a photo and GPS coordinates. These reports are then manually checked, verifying the location, quality of the image and species of the planted tree. Trees that do not take root within the first three years must be replanted. Treedom claims to inspect at least 25% of these projects on site each year. Additionally, 5% of the planted trees are put aside as a so-called "Project Reserve" that is intended to cover possible loss of trees and the related absorption (like trees that die after the third year, for which a substitution is not provided). A user can then order the planting of a selected tree online, paying as for any other web purchase. Tree planters Treedom works in collaboration with small collectives of farmers, local communities and NGOs across different countries, including Kenya, Tanzania, Guatemala, Ecuador, Italy, Haiti, Nepal, Pakistan and Peru. For trees that bear fruit, the fruit is reckoned to belong to the farmer who planted the tree. The farmer planting a tree remains responsible for its growth and care, while the organization provides support by arranging agroforestry training and income opportunities. The platform is known to promote the welfare of farmers, including female farmers, for whom it announced a 'Mother's Day campaign' in March 2020 with the aim to spread awareness ab
https://en.wikipedia.org/wiki/Rolling%20resistance
Rolling resistance, sometimes called rolling friction or rolling drag, is the force resisting the motion when a body (such as a ball, tire, or wheel) rolls on a surface. It is mainly caused by non-elastic effects; that is, not all the energy needed for deformation (or movement) of the wheel, roadbed, etc., is recovered when the pressure is removed. Two forms of this are hysteresis losses (see below), and permanent (plastic) deformation of the object or the surface (e.g. soil). Note that the slippage between the wheel and the surface also results in energy dissipation. Although some researchers have included this term in rolling resistance, some suggest that this dissipation term should be treated separately from rolling resistance because it is due to the applied torque to the wheel and the resultant slip between the wheel and ground, which is called slip loss or slip resistance. In addition, only the so-called slip resistance involves friction, therefore the name "rolling friction" is to an extent a misnomer. Analogous with sliding friction, rolling resistance is often expressed as a coefficient times the normal force. This coefficient of rolling resistance is generally much smaller than the coefficient of sliding friction. Any coasting wheeled vehicle will gradually slow down due to rolling resistance including that of the bearings, but a train car with steel wheels running on steel rails will roll farther than a bus of the same mass with rubber tires running on tarmac/asphalt. Factors that contribute to rolling resistance are the (amount of) deformation of the wheels, the deformation of the roadbed surface, and movement below the surface. Additional contributing factors include wheel diameter, load on wheel, surface adhesion, sliding, and relative micro-sliding between the surfaces of contact. The losses due to hysteresis also depend strongly on the material properties of the wheel or tire and the surface. For example, a rubber tire will have higher rolling res
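As a worked illustration of the coefficient form mentioned above (the numbers are typical textbook values assumed for a passenger car with rubber tires on asphalt, not figures taken from the article):

```latex
F_{rr} = C_{rr}\,N,
\qquad
F_{rr} \approx 0.01 \times \left(1000\ \mathrm{kg} \times 9.81\ \mathrm{m/s^{2}}\right) \approx 98\ \mathrm{N},
```

so a coasting 1000 kg car would feel a retarding force on the order of 100 N from rolling resistance alone, far less than typical sliding-friction forces for the same normal load.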
https://en.wikipedia.org/wiki/Potassium%20channel%20tetramerisation%20domain
K+ channel tetramerisation domain is the N-terminal, cytoplasmic tetramerisation domain (T1) of voltage-gated K+ channels. It defines molecular determinants for subfamily-specific assembly of alpha-subunits into functional tetrameric channels. It is distantly related to the BTB/POZ domain . Potassium channels Potassium channels are the most diverse group of the ion channel family. They are important in shaping the action potential, and in neuronal excitability and plasticity. The potassium channel family is composed of several functionally distinct isoforms, which can be broadly separated into 2 groups: the practically non-inactivating 'delayed' group and the rapidly inactivating 'transient' group. These are all highly similar proteins, with only small amino acid changes causing the diversity of the voltage-dependent gating mechanism, channel conductance and toxin binding properties. Each type of K+ channel is activated by different signals and conditions depending on their type of regulation: some open in response to depolarisation of the plasma membrane; others in response to hyperpolarisation or an increase in intracellular calcium concentration; some can be regulated by binding of a transmitter, together with intracellular kinases; while others are regulated by GTP-binding proteins or other second messengers. In eukaryotic cells, K+ channels are involved in neural signalling and generation of the cardiac rhythm, act as effectors in signal transduction pathways involving G protein-coupled receptors (GPCRs) and may have a role in target cell lysis by cytotoxic T-lymphocytes. In prokaryotic cells, they play a role in the maintenance of ionic homeostasis. Alpha subunits of the channels All K+ channels discovered so far possess a core of alpha subunits, each comprising either one or two copies of a highly conserved pore loop domain (P-domain). The P-domain contains the sequence (T/SxxTxGxG), which has been termed the K+ selectivity sequence. In families that cont
https://en.wikipedia.org/wiki/Poincar%C3%A9%20complex
In mathematics, and especially topology, a Poincaré complex (named after the mathematician Henri Poincaré) is an abstraction of the singular chain complex of a closed, orientable manifold. The singular homology and cohomology groups of a closed, orientable manifold are related by Poincaré duality. Poincaré duality is an isomorphism between homology and cohomology groups. A chain complex is called a Poincaré complex if its homology groups and cohomology groups have the abstract properties of Poincaré duality. A Poincaré space is a topological space whose singular chain complex is a Poincaré complex. These are used in surgery theory to analyze manifold algebraically. Definition Let be a chain complex of abelian groups, and assume that the homology groups of are finitely generated. Assume that there exists a map , called a chain-diagonal, with the property that . Here the map denotes the ring homomorphism known as the augmentation map, which is defined as follows: if , then . Using the diagonal as defined above, we are able to form pairings, namely: , where denotes the cap product. A chain complex C is called geometric if a chain-homotopy exists between and , where is the transposition/flip given by . A geometric chain complex is called an algebraic Poincaré complex, of dimension n, if there exists an infinite-ordered element of the n-dimensional homology group, say , such that the maps given by are group isomorphisms for all . These isomorphisms are the isomorphisms of Poincaré duality. Example The singular chain complex of an orientable, closed n-dimensional manifold is an example of a Poincaré complex, where the duality isomorphisms are given by capping with the fundamental class . See also Poincaré space
https://en.wikipedia.org/wiki/Lewy%20body
Lewy bodies are the inclusion bodies – abnormal aggregations of protein – that develop inside nerve cells affected by Parkinson's disease (PD), the Lewy body dementias (Parkinson's disease dementia and dementia with Lewy bodies (DLB)), and some other disorders. They are also seen in cases of multiple system atrophy, particularly the parkinsonian variant (MSA-P). Lewy bodies appear as spherical masses in the cytoplasm that displace other cell components. For instance, some Lewy bodies tend to displace the nucleus to one side of the cell. There are two main kinds of Lewy bodies: classical and cortical. Lewy bodies may be found in the midbrain (within the substantia nigra) or within the cortex. A classical Lewy body is an eosinophilic cytoplasmic inclusion consisting of a dense core surrounded by a halo of 10 nm wide radiating fibrils, the primary structural component of which is alpha-synuclein. History In 1910, Fritz Heinrich Lewy was studying in Berlin for his doctorate. He was the first doctor to notice some unusual proteins in the brain, comparing them to earlier findings by Gonzalo Rodríguez Lafora. In 1913, Lafora described another case, and acknowledged Lewy as the discoverer, naming them cuerpos intracelulares de Lewy (intracellular Lewy bodies). Konstantin Nikolaevich Trétiakoff found them in 1919 in the substantia nigra of PD brains, called them corps de Lewy and is credited with the eponym. In 1923, Lewy published his findings in a book, The Study on Muscle Tone and Movement. Including Systematic Investigations on the Clinic, Physiology, Pathology, and Pathogenesis of Paralysis agitans. Eliasz Engelhardt, who is in the neurology department at Federal University of Rio de Janeiro, argued in 2017 that Lafora should be credited with the eponym, because he named them six years before Trétiakoff. Nonetheless, Trétiakoff is still the primary figure acknowledged for coining the term, “Lewy bodies.” According to the Journal of the History of the Neurosciences,
https://en.wikipedia.org/wiki/Djenkolic%20acid
Djenkolic acid (or sometimes jengkolic acid) is a sulfur-containing non-protein amino acid naturally found in the djenkol beans of the Southeast Asian plant Archidendron jiringa. Its chemical structure is similar to cystine but contains a methylene (single carbon) unit in between the two sulfur atoms. There is about 20 grams of djenkolic acid per kilogram of dry djenkol beans, and it has also been reported in smaller amounts in the seeds of other leguminous plants such as Leucaena esculenta (2.2 g/kg) and Pithecolobium ondulatum (2.8 g/kg). Toxicity The toxicity of djenkolic acid in humans arises from its poor solubility under acidic conditions after consumption of the djenkol bean. The amino acid precipitates into crystals which cause mechanical irritation of the renal tubules and urinary tract, resulting in symptoms such as abdominal discomfort, loin pains, severe colic, nausea, vomiting, dysuria, gross hematuria, and oliguria, occurring 2 to 6 hours after the beans were ingested. Urine analysis of patients reveals erythrocytes, epithelial cells, protein, and the needle-like crystals of djenkolic acid. Urolithiasis can also happen, with djenkolic acid as the nucleus. In young children it has also been reported to produce painful swelling of the genitalia. Treatment for this toxicity requires hydration to increase urine flow and alkalinization of urine by sodium bicarbonate. Furthermore, this poisoning can be prevented when consuming djenkol beans by boiling them beforehand, since djenkolic acid is removed from the beans. Discovery and synthesis Djenkolic acid was first isolated by Van Veen and Hyman in 1933 from the urine of the natives of Java who had eaten the djenkol bean and were suffering from poisoning. They then isolated the djenkolic acid crystals by treating the djenkol beans with barium hydroxide at 30°C for a prolonged period of time. Du Vigneaud and Patterson managed to synthesize djenkolic acid by condensation of methylene chloride with 2 moles
https://en.wikipedia.org/wiki/Yamartino%20method
The Yamartino method is an algorithm for calculating an approximation of the circular variance of wind direction during a single pass through the incoming data. Background The simple method for calculating circular variance requires two passes through the list of values. The first pass determines the circular mean of those values, while the second pass determines the variance. This double-pass method requires access to all values. There is also a single-pass method for calculating the standard deviation, but this method is unsuitable for angular data such as wind direction. Trying to calculate angular moments by naively applying the standard formulas to angular expressions yields absurd results. For example, a dataset that measures wind directions of 1° and 359° would average to 180°, but expressing the same data as 1° and -1° (equal to 359°) would give an average of 0°. Thus, we define circular moments by placing all measured angles on a unit circle, then calculating the moments of these points. The Yamartino method, introduced by Robert J. Yamartino in 1984, solves both problems. A further discussion of the Yamartino method, along with other methods of estimating the standard deviation of wind direction, can be found in Farrugia & Micallef. It is possible to calculate the exact standard deviation in one pass. However, that method needs slightly more calculation effort. Algorithm Over the time interval to be averaged across, n measurements of wind direction (θ) will be made and two totals are accumulated without storage of the n individual values. At the end of the interval the calculations are as follows: with the average values of sin θ and cos θ defined as Then the average wind direction is given via the four-quadrant arctan(x,y) function as From twenty different functions for σθ using variables obtained in a single pass of the wind direction data, Yamartino found the best function to be where The key here is to remember that sin²θ + cos²θ = 1 so that for
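The accumulation and the estimator can be sketched as follows. The running sums of sin θ and cos θ are exactly the two totals described above; the closing-form coefficient (2/sqrt(3) − 1) is supplied here from the commonly quoted published formula rather than from this excerpt.

```python
import math

# Single-pass sketch of the Yamartino (1984) estimator for the mean and
# standard deviation of wind direction.  Angles are in degrees; only the
# running sums of sin and cos are stored.
def yamartino(directions_deg):
    n = 0
    s_sum = c_sum = 0.0
    for theta in directions_deg:                 # single pass over the data
        rad = math.radians(theta)
        s_sum += math.sin(rad)
        c_sum += math.cos(rad)
        n += 1
    sa, ca = s_sum / n, c_sum / n
    mean_dir = math.degrees(math.atan2(sa, ca)) % 360.0
    eps = math.sqrt(max(0.0, 1.0 - (sa * sa + ca * ca)))
    sigma = math.asin(eps) * (1.0 + (2.0 / math.sqrt(3.0) - 1.0) * eps ** 3)
    return mean_dir, math.degrees(sigma)

# The pathological example from the text: 1 deg and 359 deg.
print(yamartino([1.0, 359.0]))   # mean ~0 deg, sigma ~1 deg (not 180 deg)
```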
https://en.wikipedia.org/wiki/Code%20page%201104
Code page 1104 (CCSID 1104), also known as CP1104, F7DEC, ISO-IR-025 or NF Z 62-010 (1973) is an IBM code page number assigned to the French variant of DEC's National Replacement Character Set (NRCS). The 7-bit character set was introduced for DEC's computer terminal systems, starting with the VT200 series in 1983, but it is also used by IBM for their DEC emulation. ISO-IR-025 was previously also the French variant of ISO 646 (NF Z 62-010), it was superseded by Code page 1010 (NF Z 62-010:1982, ISO-IR-069) in that respect, from which it differs in only one point. It is also a close derivation from ASCII, with only nine code points differing. Code page layout See also Code page 1010 (similar ISO 646-FR code page) Code page 1020 (French-Canadian NRCS) National Replacement Character Set (NRCS)
https://en.wikipedia.org/wiki/Piezoelectric%20coefficient
The piezoelectric coefficient or piezoelectric modulus, usually written d33, quantifies the volume change when a piezoelectric material is subjected to an electric field, or the polarization on the application of stress. In general, piezoelectricity is described by a tensor of coefficients. External links List of piezoelectric materials Table of properties for lead zirconate titanate Piezoelectric terminology Piezoelectric Constant (or coefficient); a simple explanation Electrical phenomena
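In tensor notation, the converse and direct effects alluded to above are commonly written as follows (standard textbook definitions reconstructed here, not quoted from the article; S is strain, E the electric field, D the electric displacement, T the stress):

```latex
d_{ij} = \left(\frac{\partial S_{j}}{\partial E_{i}}\right)_{T}
       = \left(\frac{\partial D_{i}}{\partial T_{j}}\right)_{E},
\qquad
d_{33} = \left(\frac{\partial S_{3}}{\partial E_{3}}\right)_{T},
```

so d33 relates the field applied along the poling axis to the strain produced along that same axis.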
https://en.wikipedia.org/wiki/Blitzar
In astronomy, blitzars are a hypothetical type of neutron star, specifically pulsars that can rapidly collapse into black holes if their spinning slows down. Heino Falcke and Luciano Rezzolla proposed these stars in 2013 as an explanation for fast radio bursts. Overview These stars, if they exist, are thought to start from a neutron star with a mass that would cause it to collapse into a black hole if it were not rapidly spinning. Instead, the neutron star spins fast enough so that its centrifugal force overcomes gravity. This makes the neutron star a typical but doomed pulsar whose strong magnetic field radiates energy away and slows its spin. Eventually the weakening centrifugal force is no longer able to halt the pulsar from collapsing into a black hole. At that moment, part of the pulsar's magnetic field outside the black hole is suddenly cut off from its vanished source. This magnetic energy is instantly transformed into a burst of wide spectrum radio energy. As of January 2015, seven radio events detected so far might represent such possible collapses; they are projected to occur every 10 seconds within the observable universe. Because the magnetic field had previously cleared the surrounding space of gas and dust, there is no nearby material that will fall into the new black hole. Thus there is no burst of X-rays or gamma rays that usually happens when other black holes form. If blitzars exist, they may offer a new way to observe details of black hole formation.
https://en.wikipedia.org/wiki/Lecture%20Notes%20in%20Physics
Lecture Notes in Physics (LNP) is a book series published by Springer Science+Business Media in the field of physics, including articles related to both research and teaching. It was established in 1969. See also Lecture Notes in Computer Science Lecture Notes in Mathematics External links Publications established in 1969 Physics books Series of books Springer Science+Business Media books Books of lectures
https://en.wikipedia.org/wiki/Boston%20MXP
The Boston MXP was an Internet Exchange Point in Boston, Massachusetts and Quincy, Massachusetts. It was founded by MAI in 1996. It supported 10 megabit and 100 megabit connections on copper, and gigabit connections on fiber. Global NAPs, the main sponsor of the Boston MXP, was sold on February 29, 2012. The Boston MXP is no longer operational and was replaced by the Boston Internet Exchange. See also Internet Exchange Point External links Official Boston MXP website profile on PeeringDB Internet exchange points in the United States Network access Telecommunications in the United States Communications in Massachusetts
https://en.wikipedia.org/wiki/CCR4-Not
Carbon Catabolite Repression—Negative On TATA-less, or CCR4-Not, is a multiprotein complex that functions in gene expression. The complex has multiple enzymatic activities as both a poly(A) 3′-5′ exonuclease and a ubiquitin ligase. The complex is present both in the nucleus where it regulates transcription and in the cytoplasm where it associates with translating ribosomes and RNA processing bodies. In mammalian cells, it has functions in the regulation of the cell cycle, chromatin modification, activation and inhibition of transcription initiation, control of transcription elongation, RNA export, nuclear RNA surveillance, and DNA damage repair in the nucleus. The Ccr4–Not complex plays an important role in mRNA decay and protein quality control in the cytoplasm. Subunits The human CCR4-Not complex is composed of structural (non-catalytic) subunits and those that have exonuclease and E3 ligase activity. Some but not all of the human subunits are conserved in budding yeast. Molecular weights of the human subunits are from UniProt. See also Deadenylation Gene expression
https://en.wikipedia.org/wiki/Journal%20of%20Graph%20Theory
The Journal of Graph Theory is a peer-reviewed mathematics journal specializing in graph theory and related areas, such as structural results about graphs, graph algorithms with theoretical emphasis, and discrete optimization on graphs. The scope of the journal also includes related areas in combinatorics and the interaction of graph theory with other mathematical sciences. It is published by John Wiley & Sons. The journal was established in 1977 by Frank Harary. The editors-in-chief are Paul Seymour (Princeton University) and Carsten Thomassen (Technical University of Denmark). Abstracting and indexing The journal is abstracted and indexed in the Science Citation Index Expanded, Scopus, and Zentralblatt MATH. According to the Journal Citation Reports, the journal has a 2020 impact factor of 0.857.
https://en.wikipedia.org/wiki/History%20of%20information%20theory
The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948. In this revolutionary and groundbreaking paper, work which Shannon had substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that "The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point." With it came the ideas of the information entropy and redundancy of a source, and its relevance through the source coding theorem; the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem; the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; and of course the bit, a new way of seeing the most fundamental unit of information. Before 1948 Early telecommunications Some of the oldest methods of telecommunications implicitly use many of the ideas that would later be quantified in information theory. Modern telegraphy, starting in the 1830s, used Morse code, in which more common letters (like "E", which is expressed as one "dot") are transmitted more quickly than less common letters (like "J", which is expressed by one "dot" followed by three "dashes"). The idea of encoding information in this manner is the cornerstone of lossless data compression. A hundred years later, frequency modulation illustrated that bandwidth can be considered merely another degree of freedom. The vocoder, now largely looked at as an audio engineering curiosity, was originally designed in 1
https://en.wikipedia.org/wiki/Stefan%20Marinov
Stefan Marinov () (1 February 1931 – 15 July 1997) was a Bulgarian physicist, researcher, writer and lecturer who promoted anti-relativistic theoretical viewpoints, and later in his life defended the ideas of perpetual motion and free energy. In 1997 he self-published experimental results that confirmed classical electromagnetism and disproved that a machine constructed by Marinov himself could be a source of perpetual motion. Devastated by the negative results, he committed suicide in Graz, Austria on 15 July 1997. Life and education Marinov was born on 1 February 1931 in Sofia to a family of intellectual communists. In 1948 he finished Soviet College in Prague, then studied physics at the Czech Technical University in Prague and Sofia University. He was an Assistant Professor of Physics from 1960 to 1974 at Sofia University. In 1966–67, 1974, and 1977 he was subject to compulsory psychiatric treatment in Sofia because of his political dissent. In September 1977 Marinov received a passport and he successfully emigrated out of the country, moving to Brussels. In 1978, Marinov moved to Washington, D.C. Later he lived in Italy and Austria. In his later years, Marinov earned a living as a groom for horses. On 15 July 1997, Marinov jumped to his death from a staircase at a library at the University of Graz, after leaving suicide notes. He was 66 years old and was survived by his son Marin Marinov, who at the time was a vice-Minister of Industry of Bulgaria. Work One of Marinov's interests was the quest for free energy sources via construction of toy theories (new axiomatic systems that putatively describe our physical reality) and their experimental testing against mainstream physical theories. In 1992 Marinov wrote a letter to German Federal Chancellor Helmut Kohl in support of a German company, Becocraft, that was doing research into "free energy" technologies and had recently been the target of lawsuits. In the letter, Marinov threatened to set himself on fire a