https://en.wikipedia.org/wiki/Compositio%20Mathematica
Compositio Mathematica is a monthly peer-reviewed mathematics journal established by L.E.J. Brouwer in 1935. It is owned by the Foundation Compositio Mathematica, and since 2004 it has been published on behalf of the Foundation by the London Mathematical Society in partnership with Cambridge University Press. According to the Journal Citation Reports, the journal has a 2020 2-year impact factor of 1.456 and a 2020 5-year impact factor of 1.696. The editors-in-chief are Fabrizio Andreatta, David Holmes, Bruno Klingler, and Éric Vasserot. Early history The journal was established by L. E. J. Brouwer in response to his dismissal from Mathematische Annalen in 1928. An announcement of the new journal was made in a 1934 issue of the American Mathematical Monthly. In 1940, the publication of the journal was suspended due to the German occupation of the Netherlands.
https://en.wikipedia.org/wiki/Russian%20Morse%20code
The Russian Morse code approximates the Morse code for the Latin alphabet. It was enacted by the Russian government in 1856. To memorize the codes, practitioners use mnemonics known as напевы (loosely translated "melodies" or "chants"). The "melody" corresponding to a character is a sung phrase: syllables containing the vowels а, о, and ы correspond to dashes and are sung long, while syllables containing other vowels, as well as the syllable ай, correspond to dots and are sung short. The specific "melodies" employed differ among various schools. The correspondences between Cyrillic and Latin letters were codified in MTK-2, KOI-7, and KOI-8. Table & Melody See also Morse code Morse code for non-Latin alphabets
https://en.wikipedia.org/wiki/Maidenhead%20Locator%20System
The Maidenhead Locator System (a.k.a. QTH Locator and IARU Locator) is a geocode system used by amateur radio operators to succinctly describe their geographic coordinates, replacing the deprecated QRA locator, which was limited to European contacts. Its purpose is to be concise, accurate, and robust in the face of interference and other adverse transmission conditions. The Maidenhead Locator System can describe locations anywhere in the world. Maidenhead locators are also commonly referred to as QTH locators, grid locators or grid squares, although the "squares" are distorted on any non-equirectangular cartographic projection. Use of the terms QTH locator and QRA locator was initially discouraged, as it caused confusion with the older QRA locator system. The only abbreviation recommended to indicate a Maidenhead reference in Morse code and radio teleprinter transmission was LOC, as in LOC KN28LH. John Morris G4ANB originally devised the system, and it was adopted at a meeting of the IARU VHF Working Group in Maidenhead, England, in 1980. History Amateur radio contests on VHF and UHF are often scored based on the distance of contacts, typically 1 point per kilometre, so there is a need for amateurs to exchange their locations over the air. To facilitate this, following the growth of the sport in the 1950s, the German QRA locator system was adopted in 1959. The QRA locator system was limited to describing European coordinates, and by the mid-1970s there was a growing need for a global locator system. By the time of their April 1980 meeting in Maidenhead, England, the VHF Working Group had received twenty different proposals to replace the QRA locator grid. That devised by John Morris (G4ANB) was deemed the best. At the 1999 IARU Conference in Lillehammer it was decided that the latitude and longitude used as a reference for determining locators should be based on the World Geodetic System 1984 (WGS-84). Description of the system A Maidenhe
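The actual encoding interleaves longitude and latitude, narrowing the cell with each character pair. A minimal Python sketch of the common 6-character field/square/subsquare form (a hypothetical helper, not from any standard library; the case convention for the subsquare letters varies in practice):

```python
def maidenhead(lat, lon):
    """Encode latitude/longitude (degrees) as a 6-character Maidenhead locator."""
    lon += 180.0  # shift longitude to 0..360
    lat += 90.0   # shift latitude to 0..180
    # Field: 18 x 18 cells of 20 deg lon x 10 deg lat, letters A-R
    loc = chr(ord('A') + int(lon // 20)) + chr(ord('A') + int(lat // 10))
    # Square: 10 x 10 cells of 2 deg lon x 1 deg lat, digits 0-9
    loc += str(int(lon % 20 // 2)) + str(int(lat % 10 // 1))
    # Subsquare: 24 x 24 cells of 5' lon x 2.5' lat, letters a-x
    loc += chr(ord('a') + int(lon % 2 * 12)) + chr(ord('a') + int(lat % 1 * 24))
    return loc

# The town of Maidenhead itself (about 51.52 N, 0.72 W) falls in grid square IO91
print(maidenhead(51.52, -0.72))  # -> IO91pm
```

The LOC KN28LH example in the text corresponds to coordinates near 48.3 N, 24.96 E under this encoding.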
https://en.wikipedia.org/wiki/List%20of%20mobile%20telephone%20prefixes%20by%20country
This is a list of mobile telephone prefixes by country. International prefixes table See also List of country calling codes Notes
https://en.wikipedia.org/wiki/Edgar%20Silinsh
Edgar Imant Silinsh (Edgar Aleksandrovich Silinsh; 21 March 1927 – 26 May 1998) was a Soviet and Latvian scientist in the field of semiconductor physics and philosophy of science, academician of the Latvian Academy of Sciences (1992). Biography Edgar Silinsh was born the 4th child in the family of the prosperous farmer Aleksandrs Siliņš (1875–1934) on the "Veclapsas" farmstead in Līgatne municipality of Riga district. During his school years, he was mostly interested in literature and history rather than in physical sciences. Due to the start of World War II and the death of his mother in 1943, E. Silinsh was forced to discontinue his education; nevertheless, in 1946 he took secondary school exams and enrolled at the Faculty of Chemistry of the State University of Latvia (SUL). The choice of natural sciences was rooted in Edgar's awareness that the humanities were ideologically constrained in the Soviet Union. Yet he was forced to cease his studies even in the field of chemistry during the Stalinist repressions of 1949, because of class-based mistreatment. After that, E. Silinsh worked as a laboratory assistant for 14 years, and for the last 12 of these he was employed at the Central Laboratory of the Riga Plant of Electrical Machine Building (RER). At this institution, E. Silinsh could for the first time perform actual scientific research, mostly in the field of atomic spectroscopy. In 1958, two of his communications were included in the X All-Union Spectroscopy Conference in Lvov. In total, Edgar Silinsh published 26 scientific and technical papers in the field of atomic and molecular spectroscopy during his years at RER, as well as 16 technical and technological publications of other kinds. During the "Khrushchev Thaw", Edgar Silinsh was finally able to enroll (in 1957) at and graduate (in 1961) from the Faculty of Physics and Mathematics of the University of Latvia. In 1962, he began his extramural studies for the Candidate of Sciences degree (equivalent to the Western PhD) at the S. Vavilov State Opt
https://en.wikipedia.org/wiki/Florentine%20flask
A florentine flask, also known as a florentine receiver, florentine separator or essencier (from the French), with other shapes called a florentine vase or florentine vessel, is an oil–water separator fed with the condensed vapours of a steam distillation in a fragrance extraction process. Description When the raw material is heated with steam from boiling water, volatile fragrant compounds and steam leave the still. The vapours are cooled in the condenser and become liquid. The liquid runs into the florentine receiver, where the water and essential oil phases separate. The essential oil phase separates from the water because the oils have a different density than water and are not water-soluble. There are two main types of florentines in use. One separates essential oils of lower density than water, for example lavender oil, which accumulate in a layer floating on the water. This kind of florentine has to be airtight to reduce the loss of volatile substances. The other type is intended for oils that are denser than water, where the oil accumulates beneath the water phase, for instance cinnamon, wintergreen, vetiver, patchouli or cloves. The floating water phase prevents the loss of volatile compounds from the oil. There are also florentines that are able to accommodate oils that are either denser or less dense than water. The separated water is a herbal distillate and can be fed back into the still or may in some cases be sold as herbal water. Because small droplets of oil are entrained with the water flowing out of the florentine, the yield can sometimes be increased by using more than one florentine in series. For laboratory use, a small glass florentine without a base is called a florentine vase, as it has a slight resemblance to a small amphora. Larger glass receivers with a base are called florentine flasks or essenciers. Glass is normally only used for vessels up to 15 litres; above this size glass is too fragile, so metal is used for larger capacities.
https://en.wikipedia.org/wiki/Statistical%20static%20timing%20analysis
Conventional static timing analysis (STA) has been a stock analysis algorithm for the design of digital circuits over the last 30 years. However, in recent years the increased variation in semiconductor devices and interconnect has introduced a number of issues that cannot be handled by traditional (deterministic) STA. This has led to considerable research into statistical static timing analysis (SSTA), which replaces the normal deterministic timing of gates and interconnects with probability distributions, and gives a distribution of possible circuit outcomes rather than a single outcome. Comparison with conventional STA Deterministic STA is popular for good reasons: It requires no vectors, so it does not miss paths. The run time is linear in circuit size (for the basic algorithm). The result is conservative. It typically uses some fairly simple libraries (typically delay and output slope as a function of input slope and output load). It is easy to extend to incremental operation for use in optimization. STA, while very successful, has a number of limitations: It cannot easily handle within-die correlation, especially if spatial correlation is included. It needs many corners to handle all possible cases. If there are significant random variations, then being conservative at all times makes the analysis too pessimistic to yield competitive products. Changes to address various correlation problems, such as CPPR (Common Path Pessimism Removal), make the basic algorithm slower than linear time, or non-incremental, or both. SSTA attacks these limitations more or less directly. First, SSTA uses sensitivities to find correlations among delays. Then it uses these correlations when computing how to add statistical distributions of delays. There is no technical reason why deterministic STA could not be enhanced to handle correlation and sensitivities, by keeping a vector of sensitivities with each value as SSTA does. Historically, this seemed like a big burden to add to STA, wh
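Why correlation matters when adding delay distributions can be seen in a small Monte Carlo sketch (illustrative numbers only, not from any real timing library): two gate delays share a sensitivity to one process parameter, so their sigmas do not add in quadrature as an independence assumption would predict.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# One shared process parameter (e.g. channel length) models within-die correlation;
# each gate delay = nominal + sensitivity * shared parameter + independent local noise.
shared = rng.normal(0.0, 1.0, n)
d1 = 1.0 + 0.10 * shared + rng.normal(0.0, 0.05, n)
d2 = 1.2 + 0.10 * shared + rng.normal(0.0, 0.05, n)

path = d1 + d2  # statistical sum of two correlated delays along a path

# Correct sigma: shared sensitivities add first, sqrt(0.2**2 + 2*0.05**2) ~ 0.212.
# An independence assumption gives sqrt(2*(0.1**2 + 0.05**2)) ~ 0.158 -- optimistic.
print(path.mean(), path.std())
```

Keeping a vector of sensitivities per arrival time, as SSTA does, is exactly what allows the correlated term to be summed before the independent terms are root-sum-squared.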
https://en.wikipedia.org/wiki/Elementary%20effects%20method
The elementary effects (EE) method is the most used screening method in sensitivity analysis. EE is applied to identify non-influential inputs for a computationally costly mathematical model or for a model with a large number of inputs, where the cost of estimating other sensitivity analysis measures such as the variance-based measures is not affordable. Like all screening methods, the EE method provides qualitative sensitivity measures, i.e. measures which allow the identification of non-influential inputs or which allow the input factors to be ranked in order of importance, but do not quantify exactly the relative importance of the inputs. Methodology To exemplify the EE method, consider a mathematical model with k input factors X_1, ..., X_k. Let Y = f(X_1, ..., X_k) be the output of interest (a scalar for simplicity). The original EE method of Morris provides two sensitivity measures for each input factor: the measure μ, assessing the overall importance of an input factor on the model output; the measure σ, describing non-linear effects and interactions. These two measures are obtained through a design based on the construction of a series of r trajectories in the space of the inputs, where inputs are randomly moved One-At-a-Time (OAT). In this design, each model input is assumed to vary across p selected levels in the space of the input factors. The region of experimentation Ω is thus a k-dimensional p-level grid. Each trajectory is composed of k+1 points, since input factors move one by one by a step Δ while all the others remain fixed. Along each trajectory the so-called elementary effect for each input factor X_i is defined as d_i(X) = [Y(X_1, ..., X_i + Δ, ..., X_k) − Y(X_1, ..., X_k)] / Δ, where Δ is any selected multiple of 1/(p−1) such that the transformed point is still in Ω. For each input, r elementary effects are estimated by randomly sampling r points X. Usually r ~ 4-10, depending on the number of input factors, on the computational cost of the model and on the choice of the number of levels p, since a high number of levels to
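The trajectory design above can be sketched in a few lines of NumPy. This is a simplified illustration, not a reference implementation: it assumes inputs scaled to the unit hypercube, an even number of levels p with the standard step Δ = p/(2(p−1)), moves only in the +Δ direction, and reports the μ* variant (mean of absolute effects) together with σ.

```python
import numpy as np

def elementary_effects(f, k, r=4, p=4, seed=0):
    """Morris OAT screening on [0,1]^k: returns (mu_star, sigma) for each input."""
    rng = np.random.default_rng(seed)
    delta = p / (2 * (p - 1))             # standard choice of step
    effects = [[] for _ in range(k)]
    # Base points use the lower p/2 grid levels so that x + delta stays in [0,1].
    base_levels = np.arange(p // 2) / (p - 1)
    for _ in range(r):
        x = rng.choice(base_levels, size=k)
        y = f(x)
        for i in rng.permutation(k):      # one trajectory: k+1 model evaluations
            x = x.copy()
            x[i] += delta                 # move input i, keep the others fixed
            y_new = f(x)
            effects[i].append((y_new - y) / delta)
            y = y_new
    ee = np.array(effects)                # shape (k, r): r effects per input
    return np.abs(ee).mean(axis=1), ee.std(axis=1)
```

For a purely linear input, every elementary effect equals its coefficient (so σ is zero); a non-influential input yields μ* = 0; a nonlinear input shows up with σ > 0.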
https://en.wikipedia.org/wiki/The%20Art%20of%20Racing%20in%20the%20Rain
The Art of Racing in the Rain is a 2008 novel by American author Garth Stein that is narrated by a dog named Enzo. The novel was a New York Times bestseller for 156 weeks. A film adaptation of the same name, directed by Simon Curtis and starring Milo Ventimiglia, Amanda Seyfried, and Kevin Costner as the voice of Enzo, was released in 2019. Summary The novel follows the story of Denny Swift, a race car driver and customer representative at a Seattle BMW dealership, and his dog, Enzo, who believes in the legend that a dog "who is prepared" will be reincarnated in his next life as a human. Enzo sets out to prepare, with The Seattle Times calling his journey "a struggle to hone his humanness, to make sense of the good, the bad and the unthinkable." Enzo spends most of his days watching and learning from television, gleaning what he can about his owner's greatest passion, race car driving—and relating it to life. He watches as Denny marries Eve, as their daughter Zoe is born, and as Eve develops brain cancer, which only he can detect through his acute sense of smell. Enzo eventually plays a key role in Denny's child-custody battle with his in-laws and distills his observations of the human condition in the mantra "that which you manifest is before you." Enzo helps Denny through his ups and downs, and helps him win Zoe back. Background Inspiration for the novel came after Stein watched the 1998 Mongolian documentary State of Dogs, and then later in 2004 heard poet Billy Collins give a reading of the poem "The Revenant", told from a dog's point of view. Stein had originally named the dog "Juan Pablo" after Colombian race car driver Juan Pablo Montoya, but changed the name at the suggestion of his wife, naming the dog instead after Enzo Ferrari, founder of the famous Italian automobile marque. The race car driving experience of the novel's character, Denny, is based on Stein's own experience in racing cars, and on another race
https://en.wikipedia.org/wiki/Kaye%20and%20Laby
Tables of Physical and Chemical Constants and Some Mathematical Functions is a scientific reference work. First compiled and published in 1911 by the physicists G. W. C. Kaye and T. H. Laby, it is more commonly known as Kaye and Laby. It is a standard textbook for scientists and engineers. The final print edition was the 16th in 1995, after which the entire content was made available online in association with the National Physical Laboratory. The online version was removed on 21 May 2019, the day after the redefinition of the SI base units. An archived version is still available online, but is no longer maintained, and does not have the updated values of physical constants.
https://en.wikipedia.org/wiki/Catabolite%20Control%20Protein%20A
Catabolite Control Protein A (CcpA) is a master regulator of carbon metabolism in gram-positive bacteria. It is a member of the LacI/GalR transcription regulator family. In contrast to most LacI/GalR proteins, CcpA is allosterically regulated principally by a protein-protein interaction, rather than a protein-small molecule interaction. CcpA interacts with the phosphorylated forms of Hpr and Crh, which are formed when high concentrations of glucose or fructose-1,6-bisphosphate are present in the cell. Interaction with Hpr or Crh modulates the DNA sequence specificity of CcpA, allowing it to bind operator DNA to modulate transcription. The small molecules glucose-6-phosphate and fructose-1,6-bisphosphate are also known allosteric effectors, fine-tuning CcpA function. Structure The DNA-binding functional unit of CcpA consists of a homodimer. The N-terminal region of each monomer forms a DNA-binding site while the C-terminal portion forms a "regulatory" domain. A short linker connects the N-terminal DNA-binding domain and the C-terminal regulatory domain, and partially contacts DNA when CcpA is bound. The LacI/GalR subfamily can be functionally subdivided based on the presence or absence of a "YxxPxxxAxxL" motif in the linker sequence; CcpA belongs to the subdivision containing this motif. The regulatory domain is further subdivided into an N-terminal and a C-terminal subdomain. Small molecule effector binding occurs in the cleft between these subdomains. Binding to phosphorylated Hpr/Crh occurs along the regulatory domain's N-subdomain.
https://en.wikipedia.org/wiki/Fertilisation%20of%20Orchids
Fertilisation of Orchids is a book by English naturalist Charles Darwin published on 15 May 1862 under the full explanatory title On the Various Contrivances by Which British and Foreign Orchids Are Fertilised by Insects, and On the Good Effects of Intercrossing. Darwin's previous book, On the Origin of Species, had briefly mentioned evolutionary interactions between insects and the plants they fertilised, and this new idea was explored in detail. Field studies and practical scientific investigations that were initially a recreation for Darwin—a relief from the drudgery of writing—developed into enjoyable and challenging experiments. Aided in his work by his family, friends, and a wide circle of correspondents across Britain and worldwide, Darwin tapped into the contemporary vogue for growing exotic orchids. The book was his first detailed demonstration of the power of natural selection, and explained how complex ecological relationships resulted in the coevolution of orchids and insects. The view has been expressed that the book led directly or indirectly to all modern work on coevolution and the evolution of extreme specialisation. It influenced botanists, and revived interest in the neglected idea that insects played a part in pollinating flowers. It opened up the new study areas of pollination research and reproductive ecology, directly related to Darwin's ideas on evolution, and supported his view that natural selection led to a variety of forms through the important benefits achieved by cross-fertilisation. Although the general public showed less interest and sales of the book were low, it established Darwin as a leading botanist. Orchids was the first in a series of books on his innovative investigations into plants. The book describes how the relationship between insects and plants resulted in the beautiful and complex forms which natural theology attributed to a grand designer. By showing how practical adaptations develop from cumulative minor variations
https://en.wikipedia.org/wiki/On-Line%20Encyclopedia%20of%20Integer%20Sequences
The On-Line Encyclopedia of Integer Sequences (OEIS) is an online database of integer sequences. It was created and maintained by Neil Sloane while researching at AT&T Labs. He transferred the intellectual property and hosting of the OEIS to the OEIS Foundation in 2009. Sloane is the chairman of the OEIS Foundation. OEIS records information on integer sequences of interest to both professional and amateur mathematicians, and is widely cited. It contains over 360,000 sequences, making it the largest database of its kind. Each entry contains the leading terms of the sequence, keywords, mathematical motivations, literature links, and more, including the option to generate a graph or play a musical representation of the sequence. The database is searchable by keyword, by subsequence, or by any of 16 fields. History Neil Sloane started collecting integer sequences as a graduate student in 1964 to support his work in combinatorics. The database was at first stored on punched cards. He published selections from the database in book form twice: A Handbook of Integer Sequences (1973), containing 2,372 sequences in lexicographic order and assigned numbers from 1 to 2372. The Encyclopedia of Integer Sequences with Simon Plouffe (1995), containing 5,488 sequences and assigned M-numbers from M0000 to M5487. The Encyclopedia includes references to the corresponding sequences (which may differ in their few initial terms) in A Handbook of Integer Sequences as N-numbers from N0001 to N2372 (instead of 1 to 2372). The Encyclopedia includes the A-numbers that are used in the OEIS, whereas the Handbook did not. These books were well received and, especially after the second publication, mathematicians supplied Sloane with a steady flow of new sequences. The collection became unmanageable in book form, and when the database had reached 16,000 entries Sloane decided to go online—first as an email service (August 1994), and soon after as a website (1996). As a spin-off fro
https://en.wikipedia.org/wiki/Binary%20black%20hole
A binary black hole (BBH), or black hole binary, is a system consisting of two black holes in close orbit around each other. Like black holes themselves, binary black holes are often divided into stellar binary black holes, formed either as remnants of high-mass binary star systems or by dynamic processes and mutual capture; and binary supermassive black holes, believed to be a result of galactic mergers. For many years, proving the existence of binary black holes was made difficult because of the nature of black holes themselves and the limited means of detection available. However, in the event that a pair of black holes were to merge, an immense amount of energy should be given off as gravitational waves, with distinctive waveforms that can be calculated using general relativity. Therefore, during the late 20th and early 21st century, binary black holes became of great interest scientifically as a potential source of such waves and a means by which gravitational waves could be proven to exist. Binary black hole mergers would be one of the strongest known sources of gravitational waves in the universe, and thus offer a good chance of directly detecting such waves. As the orbiting black holes give off these waves, the orbit decays, and the orbital period decreases. This stage is called binary black hole inspiral. The black holes will merge once they are close enough. Once merged, the single hole settles down to a stable form, via a stage called ringdown, where any distortion in the shape is dissipated as more gravitational waves. In the final fraction of a second the black holes can reach extremely high velocity, and the gravitational wave amplitude reaches its peak. The existence of stellar-mass binary black holes (and gravitational waves themselves) was finally confirmed when LIGO detected GW150914 (detected September 2015, announced February 2016), a distinctive gravitational wave signature of two merging stellar-mass black holes of around 30 solar masses each
https://en.wikipedia.org/wiki/Circle%20criterion
In nonlinear control and stability theory, the circle criterion is a stability criterion for nonlinear time-varying systems. It can be viewed as a generalization of the Nyquist stability criterion for linear time-invariant (LTI) systems. Overview Consider a linear system subject to non-linear feedback, i.e. a non-linear element φ(t, y) is present in the feedback loop. Assume that the element satisfies a sector condition μ₁y² ≤ yφ(t, y) ≤ μ₂y², and (to keep things simple) that the open loop system is stable. Then the closed loop system is globally asymptotically stable if the Nyquist locus does not penetrate the circle having as diameter the segment [−1/μ₁, −1/μ₂] located on the x-axis. General description Consider the nonlinear system ẋ(t) = Ax(t) + bφ(y(t)), y(t) = cᵀx(t), with open-loop transfer function G(s) = cᵀ(sI − A)⁻¹b. Suppose that (1) the sector condition μ₁y² ≤ yφ(y) ≤ μ₂y² holds for all y; (2) A is stable (Hurwitz); and (3) Re[(1 + μ₂G(iω)) / (1 + μ₁G(iω))] > 0 for all ω ∈ ℝ. Then there exist c > 0 and δ > 0 such that for any solution of the system the following relation holds: |x(t)| ≤ c·e^(−δt)·|x(0)| for t ≥ 0. Condition 3 is also known as the frequency condition, and condition 1 as the sector condition. External links Sufficient Conditions for Dynamical Output Feedback Stabilization via the Circle Criterion Popov and Circle Criterion (Cam UK) Stability analysis using the circle criterion in Mathematica
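The graphical test in the overview can be checked numerically: sample the Nyquist locus G(iω) and verify it stays outside the critical disk whose diameter on the real axis runs from −1/μ₁ to −1/μ₂. A sketch with a hypothetical example plant G(s) = 1/((s+1)(s+2)) (stable) and sector [1, 3]:

```python
import numpy as np

mu1, mu2 = 1.0, 3.0                      # sector bounds on the nonlinearity

# Critical disk: diameter on the real axis from -1/mu1 to -1/mu2.
center = -(1 / mu1 + 1 / mu2) / 2        # -2/3
radius = abs(1 / mu1 - 1 / mu2) / 2      # 1/3

w = np.logspace(-3, 3, 20_000)           # frequency grid (rad/s)
s = 1j * w
G = 1 / ((s + 1) * (s + 2))              # samples of the Nyquist locus

# Sufficient graphical condition: the locus must not penetrate the disk.
min_dist = np.abs(G - center).min()
stable = min_dist > radius
print(stable, min_dist)
```

Here the locus stays clear of the disk (its closest approach is roughly 0.62 versus a radius of 1/3), so the closed loop with any nonlinearity in the sector [1, 3] is concluded stable. This is only the sufficient graphical test; it does not replace checking the sector condition itself.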
https://en.wikipedia.org/wiki/S1C6x
The S1C6x series is a family of 4-bit microcontrollers introduced by Epson. The series includes the S1C60 and S1C63 families: the S1C60 is the low-end, low-power version, while the S1C63 is the high-end version. The family is used in many applications, as it contains specialized peripherals such as an LCD driver, dot-matrix driver, FSK demodulator, and R/F converter. Technical description The S1C6x series is a CISC Harvard architecture with 12-bit instructions and an 8,192-word instruction space. It uses a 4-bit word, either in binary format or as a BCD digit, and has 16 memory-mapped registers in a register window together with two accumulators, two 12-bit pointers, and a stack pointer for use in subroutines. Most instructions operate on either two registers or a register and an immediate value, but the S1C6x series also has some memory-memory and memory-immediate instructions.
https://en.wikipedia.org/wiki/Worldwide%20LHC%20Computing%20Grid
The Worldwide LHC Computing Grid (WLCG), formerly (until 2006) the LHC Computing Grid (LCG), is an international collaborative project that consists of a grid-based computer network infrastructure incorporating over 170 computing centers in 42 countries. It was designed by CERN to handle the prodigious volume of data produced by Large Hadron Collider (LHC) experiments. By 2012, data from over 300 trillion (3×10^14) LHC proton-proton collisions had been analyzed, and LHC collision data was being produced at approximately 25 petabytes per year. The LHC Computing Grid is the world's largest computing grid. Background The Large Hadron Collider at CERN was designed to test the existence of the Higgs boson, an important but elusive piece of knowledge that had been sought by particle physicists for over 40 years. A very powerful particle accelerator was needed, because Higgs bosons might not be seen in lower energy experiments, and because vast numbers of collisions would need to be studied. Such a collider would also produce unprecedented quantities of collision data requiring analysis. Therefore, advanced computing facilities were needed to process the data. Description A design report was published in 2005. It was announced to be ready for data on 3 October 2008. A popular 2008 press article predicted "the internet could soon be made obsolete" by its technology. CERN had to publish its own articles trying to clear up the confusion. It incorporates both private fiber optic cable links and existing high-speed portions of the public Internet. At the end of 2010, the Grid consisted of some 200,000 processing cores and 150 petabytes of disk space, distributed across 34 countries. The data stream from the detectors provides approximately 300 GByte/s of data, which after filtering for "interesting events", results in a data stream of about 300 MByte/s. The CERN computer center, considere
https://en.wikipedia.org/wiki/Higher-order%20singular%20value%20decomposition
In multilinear algebra, the higher-order singular value decomposition (HOSVD) of a tensor is a specific orthogonal Tucker decomposition. It may be regarded as one type of generalization of the matrix singular value decomposition. It has applications in computer vision, computer graphics, machine learning, scientific computing, and signal processing. Some aspects can be traced as far back as F. L. Hitchcock in 1928, but it was L. R. Tucker who developed for third-order tensors the general Tucker decomposition in the 1960s, further advocated by L. De Lathauwer et al. in their Multilinear SVD work that employs the power method, and by Vasilescu and Terzopoulos, who developed M-mode SVD, a parallel algorithm that employs the matrix SVD. The term higher order singular value decomposition (HOSVD) was coined by De Lathauwer, but the algorithm referred to commonly in the literature as the HOSVD and attributed to either Tucker or De Lathauwer was developed by Vasilescu and Terzopoulos. Robust and L1-norm-based variants of HOSVD have also been proposed. Definition For the purpose of this article, the abstract tensor 𝒜 is assumed to be given in coordinates with respect to some basis as an M-way array, also denoted by 𝒜 ∈ ℂ^(I₁×I₂×⋯×I_M), where M is the number of modes and the order of the tensor. ℂ denotes the complex numbers, which include both the real numbers and the pure imaginary numbers. Let U_m ∈ ℂ^(I_m×I_m) be a unitary matrix containing a basis of the left singular vectors of the standard mode-m flattening 𝒜_[m] of 𝒜, such that the jth column of U_m corresponds to the jth largest singular value of 𝒜_[m]. Observe that the mode/factor matrix U_m does not depend on the specific definition of the mode-m flattening. By the properties of the multilinear multiplication, we have 𝒜 = 𝒜 ×₁ (U₁U₁ᴴ) ×₂ ⋯ ×_M (U_M U_Mᴴ) = (𝒜 ×₁ U₁ᴴ ×₂ ⋯ ×_M U_Mᴴ) ×₁ U₁ ×₂ ⋯ ×_M U_M, where ᴴ denotes the conjugate transpose. The second equality is because the U_m's are unitary matrices. Define now the core tensor 𝒮 := 𝒜 ×₁ U₁ᴴ ×₂ ⋯ ×_M U_Mᴴ. Then the HOSVD of 𝒜 is the decomposition 𝒜 = 𝒮 ×₁ U₁ ×₂ ⋯ ×_M U_M. The above construction shows that every tensor has a HOSVD. Comp
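The construction in the definition translates almost line by line into NumPy: compute each factor matrix from the SVD of a mode-m flattening, then apply the conjugate transposes to form the core. A compact sketch (helper names are our own, not a library API):

```python
import numpy as np

def mode_mult(T, M, m):
    """Mode-m product: multiply matrix M along mode m of tensor T."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, m)), 0, m)

def hosvd(A):
    """Classic HOSVD: factor matrices from mode-flattening SVDs, then the core."""
    U = []
    S = A
    for m in range(A.ndim):
        Am = np.moveaxis(A, m, 0).reshape(A.shape[m], -1)  # mode-m flattening
        Um = np.linalg.svd(Am, full_matrices=True)[0]      # left singular vectors
        U.append(Um)
        S = mode_mult(S, Um.conj().T, m)                   # apply U_m^H to the core
    return S, U

# Sanity check: A is recovered as S x_1 U1 x_2 U2 x_3 U3.
A = np.random.default_rng(0).normal(size=(3, 4, 5))
S, U = hosvd(A)
R = S
for m, Um in enumerate(U):
    R = mode_mult(R, Um, m)
print(np.allclose(R, A))  # -> True
```

Because each U_m is unitary, the reconstruction is exact up to floating-point error; truncating columns of the U_m instead yields the usual low-multilinear-rank approximation.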
https://en.wikipedia.org/wiki/Bernstein%27s%20problem
In differential geometry, Bernstein's problem is as follows: if the graph of a function on Rⁿ⁻¹ is a minimal surface in Rⁿ, does this imply that the function is linear? This is true for n at most 8, but false for n at least 9. The problem is named for Sergei Natanovich Bernstein who solved the case n = 3 in 1914. Statement Suppose that f is a function of n − 1 real variables. The graph of f is a surface in Rⁿ, and the condition that this is a minimal surface is that f satisfies the minimal surface equation div(∇f / √(1 + |∇f|²)) = 0. Bernstein's problem asks whether an entire function (a function defined throughout Rⁿ⁻¹) that solves this equation is necessarily a degree-1 polynomial. History Bernstein proved Bernstein's theorem, that a graph of a real function on R² that is also a minimal surface in R³ must be a plane. Fleming (1962) gave a new proof of Bernstein's theorem by deducing it from the fact that there is no non-planar area-minimizing cone in R³. De Giorgi (1965) showed that if there is no non-planar area-minimizing cone in Rⁿ⁻¹ then the analogue of Bernstein's theorem is true for graphs in Rⁿ, which in particular implies that it is true in R⁴. Almgren (1966) showed there are no non-planar minimizing cones in R⁴, thus extending Bernstein's theorem to R⁵. Simons (1968) showed there are no non-planar minimizing cones in R⁷, thus extending Bernstein's theorem to R⁸. He also showed that the surface defined by x₁² + x₂² + x₃² + x₄² = x₅² + x₆² + x₇² + x₈² is a locally stable cone in R⁸, and asked if it is globally area-minimizing. Bombieri, De Giorgi and Giusti (1969) showed that Simons' cone is indeed globally minimizing, and that in Rⁿ for n ≥ 9 there are graphs that are minimal but not hyperplanes. Combined with the result of Simons, this shows that the analogue of Bernstein's theorem is true in Rⁿ for n ≤ 8, and false in higher dimensions.
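The minimal surface equation can be checked symbolically; a small SymPy sketch for the n = 3 case (f a function of two variables), showing that any affine function is a solution while a saddle is not:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

def minimal_surface_residual(f):
    """Left-hand side of div( grad f / sqrt(1 + |grad f|^2) ) = 0."""
    fx, fy = sp.diff(f, x), sp.diff(f, y)
    w = sp.sqrt(1 + fx**2 + fy**2)
    return sp.simplify(sp.diff(fx / w, x) + sp.diff(fy / w, y))

# Any degree-1 polynomial solves the equation: planes are minimal surfaces.
print(minimal_surface_residual(2*x + 3*y + 1))  # -> 0

# The saddle z = x^2 - y^2 does not: its residual is a nonzero expression.
print(minimal_surface_residual(x**2 - y**2))
```

This only illustrates the equation itself; the hard content of Bernstein's theorem is that for n ≤ 8 the affine functions are the *only* entire solutions.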
https://en.wikipedia.org/wiki/Electrowetting
Electrowetting is the modification of the wetting properties of a surface (which is typically hydrophobic) with an applied electric field. History The electrowetting of mercury and other liquids on variably charged surfaces was probably first explained by Gabriel Lippmann in 1875 and was certainly observed much earlier. A. N. Frumkin used surface charge to change the shape of water drops in 1936. The term electrowetting was first introduced in 1981 by G. Beni and S. Hackwood to describe an effect proposed for designing a new type of display device, for which they received a patent. The use of a "fluid transistor" in microfluidic circuits for manipulating chemical and biological fluids was first investigated by J. Brown in 1980, and later funded in 1984–1988 under NSF Grants 8760730 & 8822197. That work employed insulating dielectric and hydrophobic layer(s) (EWOD), immiscible fluids, and DC or RF power, together with mass arrays of miniature interleaved (saw-tooth) electrodes with large or matching indium tin oxide (ITO) electrodes, to digitally relocate nano droplets in linear, circular, and directed paths, pump or mix fluids, fill reservoirs, and control fluid flow electronically or optically. Later, in collaboration with J. Silver at the NIH, EWOD-based electrowetting was disclosed for single and immiscible fluids to move, separate, hold, and seal arrays of digital PCR sub-samples. Electrowetting using an insulating layer on top of a bare electrode was later studied by Bruno Berge in 1993. Electrowetting on this dielectric-coated surface is called electrowetting-on-dielectric (EWOD) to distinguish it from conventional electrowetting on the bare electrode. Electrowetting can be demonstrated by replacing the metal electrode in the EWOD system with a semiconductor. Electrowetting is also observed when a reverse bias is applied to a conducting droplet (e.g. mercury) which has been placed directly onto a semiconductor surface (e.g. silicon) to form a Schottky contact in a Schottky diode el
https://en.wikipedia.org/wiki/Transverse%20cervical%20veins
The transverse cervical veins are veins that cross the neck.
https://en.wikipedia.org/wiki/George%20Kellie
Dr George Kellie MD, FRSE (1770–1829) was a Scottish surgeon who, together with Alexander Monro secundus, gave his name to the Monro-Kellie doctrine, a concept which relates intracranial pressure to the volume of intracranial contents and is a basic tenet of the modern understanding of the neuropathology of raised intracranial pressure. The doctrine states that since the skull is incompressible, and the volume inside the skull is therefore fixed, any increase in the volume of one of the cranial constituents must be compensated by a decrease in the volume of another. Previous research about George Kellie (1770–1829) may have been hampered by a widely cited incorrect year of birth, by the spelling of his name as Kellie or Kelly, and by confusion with his father, also a surgeon in Leith, who had the same name and was subject to similar spelling variations.

Early life
George Kellie was born in Leith, the seaport for Edinburgh, which was at that time the fifth largest town in Scotland. His parents, George Kellie (1742–1805), originally from Dunbar, East Lothian, and Catherin [sic] McCall of Haddington, East Lothian, had married in South Leith in August 1769. On his baptismal entry in the parish of Dunbar, East Lothian, for 6 October 1742, George senior's surname is spelt Kellie, as is that of his father. In the South Leith parish records of his marriage to Catherin McCall in August 1764 and the record of the birth of his son, the spelling is given as 'Kelly'. George Kellie senior practised as a surgeon, and while there is no record of his registration as a surgical apprentice in Wallis's extensive listing of British medical and surgical apprentices, that listing shows that he trained three apprentices between 1771 and 1775. The street directories for Edinburgh and Leith for the years 1773–1805 show that ‘George Kelly’ senior practised as a surgeon in Tolbooth Wynd, Leith, the only Kelly or Kellie listed in Leith for that period. In 1774 he published a paper describing a case of extensive surgical emphysema wh
https://en.wikipedia.org/wiki/Baroreflex
The baroreflex or baroreceptor reflex is one of the body's homeostatic mechanisms that helps to maintain blood pressure at nearly constant levels. The baroreflex provides a rapid negative feedback loop in which an elevated blood pressure causes the heart rate to decrease; decreased blood pressure lessens baroreflex activation and causes the heart rate to increase, restoring blood pressure. The reflex's receptors sense pressure changes by responding to changes in the tension of the arterial wall. The baroreflex can begin to act in less than the duration of a cardiac cycle (fractions of a second), and baroreflex adjustments are thus key factors in dealing with postural hypotension, the tendency for blood pressure to decrease on standing due to gravity. The system relies on specialized neurons, known as baroreceptors, chiefly in the aortic arch and carotid sinuses, to monitor changes in blood pressure and relay them to the medulla oblongata. Baroreceptors are stretch receptors and respond to the pressure-induced stretching of the blood vessel in which they are found. Baroreflex-induced changes in blood pressure are mediated by both branches of the autonomic nervous system: the parasympathetic and sympathetic nerves. Baroreceptors are active even at normal blood pressures, so their activity informs the brain about both increases and decreases in blood pressure. The body contains two other, slower-acting systems to regulate blood pressure: the heart releases atrial natriuretic peptide when blood pressure is too high, and the kidneys sense and correct low blood pressure with the renin–angiotensin system.

Anatomy
Baroreceptors are present in the atria of the heart and vena cavae, but the most sensitive baroreceptors are in the carotid sinuses and aortic arch. While the carotid sinus baroreceptor axons travel within the glossopharyngeal nerve (CN IX), the aortic arch baroreceptor axons travel within the vagus nerve (CN X). Baroreceptor activity travels along these ne
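The negative feedback loop described above can be caricatured numerically. The toy model below uses invented constants, not physiological data: heart rate is nudged in the direction opposite to the pressure "error" signal, while pressure relaxes toward a value set by the heart rate, so an initial pressure disturbance decays back toward the set point.

```python
def simulate(set_point=93.0, gain=0.1, beats=50):
    """Toy baroreflex: heart rate is adjusted opposite to the pressure error."""
    hr, bp = 60.0, 120.0                  # start from an elevated pressure
    for _ in range(beats):
        error = bp - set_point            # baroreceptor "stretch" signal
        hr -= gain * error                # high pressure -> slow the heart
        bp = 0.9 * bp + 0.1 * (set_point + (hr - 60.0))  # pressure follows heart rate
    return hr, bp

hr, bp = simulate()
# bp has been driven most of the way back from 120 toward the 93 set point
```

The essential point the sketch illustrates is the sign of the coupling: a rise in pressure slows the heart, which in turn lowers pressure, which is exactly the negative feedback the article describes.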
https://en.wikipedia.org/wiki/Bisection%20%28software%20engineering%29
Bisection is a method used in software development to identify change sets that result in a specific behavior change. It is mostly employed for finding the patch that introduced a bug. Another application area is finding the patch that indirectly fixed a bug.

Overview
The process of locating the changeset that introduced a specific regression was described as "source change isolation" in 1997 by Brian Ness and Viet Ngo of Cray Research. Regression testing was performed on Cray's compilers in editions comprising one or more changesets. Editions with known regressions could not be validated until developers addressed the problem. Source change isolation narrowed the cause to a single changeset that could then be excluded from editions, unblocking them with respect to this problem, while the author of the change worked on a fix. Ness and Ngo outlined linear search and binary search methods of performing this isolation. Code bisection has the goal of minimizing the effort to find a specific change set. It employs a divide and conquer algorithm that depends on having access to the code history, which is usually preserved by revision control in a code repository.

Bisection method
Code bisection algorithm
Code history has the structure of a directed acyclic graph which can be topologically sorted. This makes it possible to use a divide and conquer search algorithm which:
- splits up the search space of candidate revisions
- tests for the behavior in question
- reduces the search space depending on the test result
- re-iterates the steps above until a range with at most one bisectable patch candidate remains

Algorithmic complexity
Bisection is in LSPACE, having an algorithmic complexity of O(log n), with n denoting the number of revisions in the search space; it is similar to a binary search.

Desirable repository properties
For code bisection it is desirable that each revision in the search space can be built and tested independently.

Monotonicity
For the bisection algorithm to id
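In the common case of a linear history, the divide and conquer search above reduces to a plain binary search over the ordered revision list. A minimal sketch (illustrative names; real tools such as git bisect must also handle DAG-shaped histories):

```python
def bisect(revisions, is_bad):
    """Return the first revision for which is_bad(rev) is True.

    Assumes a linear history on which is_bad is monotonic: False for
    every revision before the offending change, True from it onward.
    """
    lo, hi = 0, len(revisions) - 1        # invariant: first bad rev is in [lo, hi]
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(revisions[mid]):
            hi = mid                      # culprit is at mid or earlier
        else:
            lo = mid + 1                  # culprit is after mid
    return revisions[lo]

# Hypothetical ten-revision history in which "r6" introduced the bug:
history = [f"r{i}" for i in range(10)]
print(bisect(history, lambda rev: int(rev[1:]) >= 6))  # -> r6
```

Each iteration halves the candidate range, which is where the O(log n) test count quoted above comes from.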
https://en.wikipedia.org/wiki/VM/386
VM/386 is a multitasking operating system or 'control program' that took early advantage of the capabilities of Intel's 386 processor. By utilizing Virtual 8086 mode, users were able to run their existing text-based and graphical DOS software in safely separated environments. The system offered a high degree of control, with the ability to set memory limits, CPU usage and scheduling parameters, device assignments, and interrupt priorities through a virtual machine manager menu. Unique CONFIG.SYS and AUTOEXEC.BAT files, and even different DOS versions, could be configured for each application. In 1991 the vendor announced intentions to support DPMI 1.0 in VM/386.

Overview
VM/386 had initially been developed by Softguard Systems, a producer of copy-protection software, with plans to include features like non-DOS system support, but financial constraints forced its sale to Intelligent Graphics Corporation (IGC), which launched the product in 1987. It won a PC Magazine award for technical excellence in 1988. The company also introduced a multi-user version, which allowed a number of serial terminals and even graphical systems to be connected to a single 386 computer. Current versions of the software have built on the multi-user support, and can handle tens of users in a networked environment with Windows 3.11 support, access controls, virtual memory and device sharing, among other features. A version of the software designed to cooperate with Unix was bundled with Everex Systems workstations. The system now sees use mainly in vertical applications like point-of-sale systems, where its ability to run reliably on cheap, reliable hardware outweighs any gains from newer operating systems that are more complex and less reliable. Early competition included DESQview 386, Sunny Hill Software's Omniview, StarPath Systems' Vmos/3, and Windows/386 2.01. As the target market shifted away from single-user systems to multiple-user setups with many serial terminals, it began to compete
https://en.wikipedia.org/wiki/HP-IL
The HP-IL (Hewlett-Packard Interface Loop) was a short-range interconnection bus or network introduced by Hewlett-Packard in the early 1980s. It enabled many devices such as printers, plotters, displays, storage devices (floppy disk drives and tape drives), test equipment, etc. to be connected to programmable calculators such as the HP-41C, HP-71B and HP-75C/D, the 80-series and HP-110 computers, as well as generic ISA bus based PCs.

Principles
As its name implies, an HP-IL network formed a loop (i.e. it was a ring network): each device in the loop had a pair of two-wire connections, one designated in, which received messages from the previous device in the loop, and one designated out, which delivered messages to the next device in the loop. One device on the loop is designated the controller, and manages all other devices on the loop. HP-IL cables utilize a unique two-pin connector design with polarizing "D"-shaped shells, and can be connected together without further adapters to extend their length. HP-IL uses a token passing protocol for media access control: messages are passed from one device to the next until they return to the originator. When the loop is initialized, the controller sends an "Auto Address 1" message to the first device; that device (and each subsequent device) takes the number in the message it receives as its own address, and then forwards the message with the address incremented to the next device. When the "Auto Address n" message finally returns to the controller, it can tell how many devices are on the loop (n-1). Up to 31 devices can be addressed using this method. Once addresses are assigned, the controller can then assign "talker" or "listener" roles to any device on the loop. By addressing each device in turn, and using the "Send Accessory ID" message, the controller can determine the role and capability of each device on the loop. When the controller assigns the listener role to a device, that device accepts and processes data re
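The auto-addressing pass can be modeled in a few lines: each simulated device adopts the address it receives and forwards it incremented by one, and the value that returns to the controller gives the device count. This is a schematic sketch of the message flow only, not of HP-IL's electrical signalling or frame format:

```python
def auto_address(num_devices):
    """Send 'Auto Address 1' around a loop and return the assigned addresses."""
    addresses = []
    message = 1                       # controller emits "Auto Address 1"
    for _ in range(num_devices):
        addresses.append(message)     # device adopts the address it receives...
        message += 1                  # ...and forwards the message incremented
    # The message has now returned to the controller as "Auto Address n",
    # so the controller counts n - 1 devices on the loop.
    devices_on_loop = message - 1
    return addresses, devices_on_loop

addrs, count = auto_address(4)
# addrs == [1, 2, 3, 4]; count == 4, matching the n-1 rule with n == 5
```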
https://en.wikipedia.org/wiki/Chinese%20Character%20Code%20for%20Information%20Interchange
The Chinese Character Code for Information Interchange (CCCII) is a character set developed by the Chinese Character Analysis Group in Taiwan. It was first published in 1980, and significantly expanded in 1982 and 1987. It is used mostly by library systems. It is one of the earliest established and most sophisticated encodings for traditional Chinese (predating the establishment of Big5 in 1984 and CNS 11643 in 1986). It is distinguished by its unique system for encoding simplified versions and other variants of its main set of hanzi characters. A variant of an earlier version of CCCII is used by the Library of Congress as part of MARC-8, under the name East Asian Character Code (EACC, ANSI/NISO Z39.64), where it comprises part of MARC 21's JACKPHY support. However, EACC contains fewer characters than the most recent versions of CCCII. Work at Apple based on Research Libraries Group's CJK Thesaurus, which was used to maintain EACC, was one of the direct predecessors of Unicode's Unihan set.

Design
Byte ranges
CCCII is designed as a 94^n set, as defined by ISO/IEC 2022. Each Chinese character is represented by a 3-byte code in which each byte is 7-bit, between 0x21 and 0x7E inclusive. Thus, the maximum number of Chinese characters representable in CCCII is 94×94×94 = 830584. In practice the number of characters encodable by CCCII would be less than this number, because variant characters are encoded in related ISO 2022 planes under CCCII, so most of the code points would have to be reserved for variants. In practice, however, bytes outside of these ranges are sometimes used. The code 0x212320 is used by some implementations as an ideographic space. A CCCII specification used by libraries in Hong Kong uses codes starting with 0x2120 for punctuation and symbols. The first byte 0x7F is used by some variants to encode codes for some otherwise unavailable Unified Repertoire and Ordering or CJK Unified Ideographs Extension A hanzi (e.g. 0x7F3449 for U+3449 or 0x7F
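The byte ranges above fix the size of the code space, which a short check makes concrete (the helper names are illustrative, not part of any CCCII specification):

```python
LO, HI = 0x21, 0x7E                   # each of the three bytes is 7-bit, 0x21-0x7E

def is_valid_code(code):
    """Check that a 3-byte CCCII code uses only bytes in the 94-value range."""
    b = [(code >> 16) & 0xFF, (code >> 8) & 0xFF, code & 0xFF]
    return all(LO <= x <= HI for x in b)

per_byte = HI - LO + 1                # 94 values per byte
print(per_byte ** 3)                  # -> 830584 representable codes
print(is_valid_code(0x212121))        # -> True: the first code point
print(is_valid_code(0x212320))        # -> False: the 0x20 byte lies outside
                                      #    the nominal range, as the text notes
```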
https://en.wikipedia.org/wiki/E-commerce%20credit%20card%20payment%20system
Electronic commerce, commonly known as e-commerce, eCommerce, or e-business, consists of the buying and selling of products or services over electronic systems such as the Internet and other computer networks. The amount of trade conducted electronically has grown extraordinarily with widespread Internet usage. Commerce conducted in this way spurs and draws on innovations in electronic funds transfer, supply chain management, Internet marketing, online transaction processing, electronic data interchange (EDI), inventory management systems, and automated data collection systems. Modern electronic commerce typically uses the World Wide Web at least at some point in the transaction's lifecycle, although it can encompass a wider range of technologies such as e-mail as well. A large percentage of electronic commerce is conducted entirely electronically for virtual items such as access to premium content on a website, but most electronic commerce involves the transportation of physical items in some way. Online retailers are sometimes known as e-tailers and online retail is sometimes known as e-tail. Almost all big retailers have an electronic commerce presence on the World Wide Web. Electronic commerce that is conducted between businesses is referred to as business-to-business or B2B. Electronic commerce that is conducted between businesses and consumers, on the other hand, is referred to as business-to-consumer or B2C. This is the type of electronic commerce conducted by companies such as Amazon.com. Online shopping is a form of electronic commerce where the buyer is connected directly online to the seller's computer, usually via the Internet. There is no specific intermediary service. The sale and purchase transaction is completed electronically and interactively in real time, such as when buying a new book on Amazon.com. If an intermediary is present, then the sale and purchase transaction is called consumer-to-consumer, such as an online auction
https://en.wikipedia.org/wiki/Homokaryotic
Homokaryotic (adj.) is a term used to refer to multinucleate cells where all nuclei are genetically identical. In multinucleate cells, the nuclei share one common cytoplasm, as is found in the hyphal cells or mycelium of filamentous fungi.

See also
Dikaryon
Eukaryote
Prokaryote
Cell biology
https://en.wikipedia.org/wiki/Chung%20Kwei%20%28algorithm%29
Chung Kwei is a spam filtering algorithm based on the TEIRESIAS Algorithm for finding coding genes within bulk DNA. It is named after Zhong Kui, a figure in Chinese folklore. See also Spam (electronic) CAN-SPAM Act of 2003 DNSBL SpamAssassin External links Official Report TEIRESIAS: Sequence Pattern Discovery, from IBM Bioinformatics Group DNA technique protects against "evil" emails, from NewScientist.com "DNA analysis" spots e-mail spam, from BBC News Networking algorithms Anti-spam
https://en.wikipedia.org/wiki/Stamen
The stamen (plural: stamina or stamens) is the pollen-producing reproductive organ of a flower. Collectively the stamens form the androecium.

Morphology and terminology
A stamen typically consists of a stalk called the filament and an anther which contains microsporangia. Most commonly anthers are two-lobed (each lobe is termed a locule) and are attached to the filament either at the base or in the middle area of the anther. The sterile tissue between the lobes is called the connective, an extension of the filament containing conducting strands. It can be seen as an extension on the dorsal side of the anther. A pollen grain develops from a microspore in the microsporangium and contains the male gametophyte. The size of anthers differs greatly, from a tiny fraction of a millimeter in Wolffia spp. up to five inches (13 centimeters) in Canna iridiflora and Strelitzia nicolai. The stamens in a flower are collectively called the androecium. The androecium can consist of as few as one-half stamen (i.e. a single locule), as in Canna species, or as many as 3,482 stamens, which have been counted in the saguaro (Carnegiea gigantea). The androecium in various species of plants forms a great variety of patterns, some of them highly complex. It generally surrounds the gynoecium and is surrounded by the perianth. A few members of the family Triuridaceae, particularly Lacandonia schismatica and Lacandonia braziliana, along with a few species of Trithuria (family Hydatellaceae), are exceptional in that their gynoecia surround their androecia.

Etymology
Stamen is the Latin word meaning "thread" (originally thread of the warp, in weaving). Filament derives from classical Latin filum, meaning "thread". Anther derives from French anthère, from classical Latin anthera, meaning "medicine extracted from the flower", in turn from Ancient Greek ἀνθηρά (anthērá), feminine of ἀνθηρός (anthērós) meaning "flowery", from ἄνθος (ánthos) meaning "flower". Androecium (plural: androecia) derives from Ancient Greek ἀνήρ (anḗr) meanin
https://en.wikipedia.org/wiki/Richard%20Lewontin
Richard Charles Lewontin (March 29, 1929 – July 4, 2021) was an American evolutionary biologist, mathematician, geneticist, and social commentator. A leader in developing the mathematical basis of population genetics and evolutionary theory, he applied techniques from molecular biology, such as gel electrophoresis, to questions of genetic variation and evolution. In a pair of seminal 1966 papers co-authored with J. L. Hubby in the journal Genetics, Lewontin helped set the stage for the modern field of molecular evolution. In 1979, he and Stephen Jay Gould introduced the term "spandrel" into evolutionary theory. From 1973 to 1998, he held an endowed chair in zoology and biology at Harvard University, and from 2003 until his death in 2021 he was a research professor there. From a sociological perspective, Lewontin strongly opposed genetic determinism and neodarwinism as expressed in the fields of sociobiology and evolutionary psychology. Previously, as a member of Science for the People, he denounced the involvement of prominent scientists in Pentagon programs aimed at developing weapons for the Vietnam War. From the 1990s, he condemned the lobbying of GMOs by the "genetic-industrial complex". Early life and education Lewontin was born in New York City to parents descended from late 19th-century Ashkenazi Jewish immigrants. His father was a broker of textiles, and his mother a homemaker. He attended Forest Hills High School and the École Libre des Hautes Études in New York. In 1951 he graduated from Harvard College with a BS degree in biology. In 1952, Lewontin received an MS degree in mathematical statistics, followed by a PhD degree in zoology in 1954, both from Columbia University, where he was a student of Theodosius Dobzhansky. He held faculty positions at North Carolina State University, the University of Rochester, and the University of Chicago. In 1973 Lewontin was appointed as Alexander Agassiz Professor of Zoology and Professor of Biology at Harvard Uni
https://en.wikipedia.org/wiki/Thodex
Koineks Teknoloji A.Ş, known online as Thodex, is a defunct Turkey-based cryptocurrency exchange founded by Faruk Fatih Özer. Thodex was the first global exchange based in Turkey and, of the 40 cryptocurrency exchanges in Turkey at the time, it was one of the largest. It had 391,000 users when it froze in April 2021 and the owner fled Turkey with the clients' money. Thodex was founded as Koineks in 2017, at the time only the fourth cryptocurrency exchange to be founded in Turkey. It established Turkey's first Bitcoin ATMs. Koineks went global in 2020, changing its name to Thodex in March 2020. That year, Thodex was also licensed by the Financial Crimes Enforcement Network as a Money Service Business (MSB). The total transaction volume on Thodex was ~$3 billion as of November 2020. The exchange ran various rewards campaigns to draw in new users and new capital.

Shutdown
What would turn out to be the last of these rewards campaigns ran from 15 March to 15 April 2021. It was a Dogecoin rewards campaign in which Thodex stated that it would distribute 2 million dogecoin, 150 per new user who signed up. This campaign was set up in the month leading up to what is known by fans of Dogecoin as Dogeday (April 20), a day on which Dogecoin fans expected the price of Dogecoin to rise significantly. Due to the hype, over half of the trading on the exchange was in Dogecoin. On 19 April, users experienced some disruptions in their transactions. Following complaints, Thodex stated that the issues were due to the exchange being under a cyberattack, and the next day, April 20, trading on Thodex was halted entirely. Thodex assured users in a statement that things would return to normal in 4–5 days and that trading had been temporarily shut down because "big-name banks" were interested in investing substantially as partners of the exchange, and it needed to temporarily suspend trading for the partnership to settle. In the meantime, Faruk Fatih Özer, the owner of the exchange, left Turkey
https://en.wikipedia.org/wiki/Bel%E2%80%93Robinson%20tensor
In general relativity and differential geometry, the Bel–Robinson tensor is a tensor defined in the abstract index notation by:

T_{abcd} = C_{aecf} C_b{}^e{}_d{}^f + {}^*C_{aecf} \, {}^*C_b{}^e{}_d{}^f

Alternatively, expanding the left dual {}^*C_{aecf} = \tfrac{1}{2} \epsilon_{ae}{}^{pq} C_{pqcf},

T_{abcd} = C_{aecf} C_b{}^e{}_d{}^f + \tfrac{1}{4} \epsilon_{ae}{}^{pq} \epsilon_b{}^{e}{}_{rs} C_{pqcf} C^{rs}{}_d{}^f

where C_{abcd} is the Weyl tensor and \epsilon_{abcd} the Levi-Civita tensor. It was introduced by Lluís Bel in 1959. The Bel–Robinson tensor is constructed from the Weyl tensor in a manner analogous to the way the electromagnetic stress–energy tensor is built from the electromagnetic tensor. Like the electromagnetic stress–energy tensor, the Bel–Robinson tensor is totally symmetric and traceless:

T_{abcd} = T_{(abcd)}, \qquad T^{a}{}_{acd} = 0

In general relativity, there is no unique definition of the local energy of the gravitational field. The Bel–Robinson tensor is a possible definition for local energy, since it can be shown that whenever the Ricci tensor vanishes (i.e. in vacuum), the Bel–Robinson tensor is divergence-free:

\nabla^{a} T_{abcd} = 0
https://en.wikipedia.org/wiki/Cancer%20Genome%20Anatomy%20Project
The Cancer Genome Anatomy Project (CGAP), created by the National Cancer Institute (NCI) in 1997 and introduced by Al Gore, is an online database on normal, pre-cancerous and cancerous genomes. It also provides tools for viewing and analysis of the data, allowing for identification of genes involved in various aspects of tumor progression. The goal of CGAP is to characterize cancer at a molecular level by providing a platform with readily accessible updated data and a set of tools such that researchers can easily relate their findings to existing knowledge. There is also a focus on the development of software tools that improve the usage of large and complex datasets. The project is directed by Daniela S. Gerhard, and includes sub-projects or initiatives, notable ones being the Cancer Chromosome Aberration Project (CCAP) and the Genetic Annotation Initiative (GAI). CGAP contributes to many databases, and organisations such as the NCBI contribute to CGAP's databases. The eventual outcomes of CGAP include establishing a correlation between a particular cancer's progression and its therapeutic outcome, improved evaluation of treatment, and the development of novel techniques for prevention, detection and treatment. This is achieved by characterisation of the mRNA products of biological tissue.

Research
Background
The fundamental cause of cancer is the inability of a cell to regulate its gene expression. To characterise a specific type of cancer, the proteins that are produced from the altered gene expression, or the mRNA precursors to those proteins, can be examined. CGAP works to associate a particular cell's expression profile, molecular signature or transcriptome, which is essentially the cell's fingerprint, with the cell's phenotype. Expression profiles therefore exist with consideration to cancer type and stage of progression.

Sequencing
CGAP's initial goal was to establish a Tumor Gene Index (TGI) to store the expression profiles. This would have contributions to both n
https://en.wikipedia.org/wiki/Zdenka%20Samish
Zdenka Samish (, also spelled Samisch) (March 13, 1904 – March 8, 2008) was a Czech-Israeli food technology researcher. One of the first agricultural researchers in Mandatory Palestine and then Israel, she studied methods for industrial processing of fruits and vegetables, canning, and food infestation. Her research was published in numerous peer-reviewed journals. She was director of the Department of Food Technology at the Agricultural Research Station (Volcani Center) in Rehovot from the early 1950s to 1969. Early life and education Zdenka (Devorah) Kohn was born in Prague to Otto and Vilma Kohn. Active in the Zionist youth movement, she immigrated to Palestine in 1924. In 1926 she and her husband, Moshe Rudolf Samish, also a Czech native, went to California to complete their degrees at UC Davis and UC Berkeley. She received her B.S. in 1931 at UC Davis and her M.A. in household science in 1933 at UC Berkeley; her masters thesis was on "The Effect of Excess Viosterol and of Parathyroid Extract upon the Tissues of Rats". Career In 1934 the couple returned to Palestine and she began working as a chemist at a fruit canning factory in Rehovot. In 1937 she joined the experimental research station in that city, and in 1946 was named director of the laboratory for canned fruits and vegetables. In 1949 she became an instructor at the Hebrew University of Jerusalem's Faculty of Agriculture in Rehovot on the subject of food technology. After 1951 she became director of food technology at the Agricultural Research Station in Rehovot. Research In 1946 Samish received a grant from the Mandatory government to develop methods for producing juices and concentrates from citrus fruits. In 1947 she received a U.S. patent for the manufacture of dried citrus fruit paste (fruit leather). Other research projects included a joint U.S.-Israeli study of microorganisms found in fruit and vegetable pulp; techniques for squeezing olives and producing olive oil; tomato paste production;
https://en.wikipedia.org/wiki/Vector%20field
In vector calculus and physics, a vector field is an assignment of a vector to each point in a space, most commonly Euclidean space ℝ^n. A vector field on a plane can be visualized as a collection of arrows with given magnitudes and directions, each attached to a point on the plane. Vector fields are often used to model, for example, the speed and direction of a moving fluid throughout three-dimensional space, such as the wind, or the strength and direction of some force, such as the magnetic or gravitational force, as it changes from one point to another. The elements of differential and integral calculus extend naturally to vector fields. When a vector field represents force, the line integral of a vector field represents the work done by a force moving along a path, and under this interpretation conservation of energy is exhibited as a special case of the fundamental theorem of calculus. Vector fields can usefully be thought of as representing the velocity of a moving flow in space, and this physical intuition leads to notions such as the divergence (which represents the rate of change of volume of a flow) and curl (which represents the rotation of a flow). A vector field is a special case of a vector-valued function, whose domain's dimension has no relation to the dimension of its range; for example, the position vector of a space curve is defined only for a smaller subset of the ambient space. Likewise, in n coordinates, a vector field on a domain in n-dimensional Euclidean space can be represented as a vector-valued function that associates an n-tuple of real numbers to each point of the domain. This representation of a vector field depends on the coordinate system, and there is a well-defined transformation law (covariance and contravariance of vectors) in passing from one coordinate system to the other. Vector fields are often discussed on open subsets of Euclidean space, but also make sense on other subsets such as surfaces, where they associate an a
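The divergence and curl can be made concrete numerically. The sketch below takes the planar field F(x, y) = (−y, x), a rigid rotation, and estimates both quantities at a point by central differences (an illustrative finite-difference recipe, not taken from the article):

```python
def F(x, y):
    """A rotational vector field on the plane."""
    return (-y, x)

def divergence(f, x, y, h=1e-5):
    """dFx/dx + dFy/dy by central differences."""
    dfx_dx = (f(x + h, y)[0] - f(x - h, y)[0]) / (2 * h)
    dfy_dy = (f(x, y + h)[1] - f(x, y - h)[1]) / (2 * h)
    return dfx_dx + dfy_dy

def curl_z(f, x, y, h=1e-5):
    """dFy/dx - dFx/dy, the scalar curl in the plane."""
    dfy_dx = (f(x + h, y)[1] - f(x - h, y)[1]) / (2 * h)
    dfx_dy = (f(x, y + h)[0] - f(x, y - h)[0]) / (2 * h)
    return dfy_dx - dfx_dy

# A pure rotation neither expands nor compresses the flow...
print(round(divergence(F, 1.0, 2.0), 6))  # -> 0.0
# ...but rotates it with constant strength, here 2:
print(round(curl_z(F, 1.0, 2.0), 6))      # -> 2.0
```

Zero divergence matches the "no change of volume" reading, and the constant nonzero curl matches the "rotation of a flow" reading given above.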
https://en.wikipedia.org/wiki/Pair%20of%20pants%20%28mathematics%29
In mathematics, a pair of pants is a surface which is homeomorphic to the three-holed sphere. The name comes from considering one of the removed disks as the waist and the two others as the cuffs of a pair of pants. Pairs of pants are used as building blocks for compact surfaces in various theories. Two important applications are to hyperbolic geometry, where decompositions of closed surfaces into pairs of pants are used to construct the Fenchel–Nielsen coordinates on Teichmüller space, and to topological quantum field theory, where they are the simplest non-trivial cobordisms between 1-dimensional manifolds.

Pants and pants decomposition
Pants as topological surfaces
A pair of pants is any surface that is homeomorphic to a sphere with three holes, which formally is the result of removing from the sphere three open disks with pairwise disjoint closures. Thus a pair of pants is a compact surface of genus zero with three boundary components. The Euler characteristic of a pair of pants is equal to −1, and the only other surface with this property is the punctured torus (a torus minus an open disk).

Pants decompositions
The importance of the pairs of pants in the study of surfaces stems from the following property: define the complexity of a connected compact surface of genus g with b boundary components to be ξ = max(3g − 3 + b, 0), and for a non-connected surface take the sum over all components. Then the only surfaces with negative Euler characteristic and complexity zero are disjoint unions of pairs of pants. Furthermore, for any surface S and any simple closed curve γ on S which is not homotopic to a boundary component, the compact surface obtained by cutting S along γ has a complexity that is strictly less than that of S. In this sense, pairs of pants are the only "irreducible" surfaces among all surfaces of negative Euler characteristic. By a recursion argument, this implies that for any surface there is a system of simple closed curves which cut the surface into pairs of pants. This is ca
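Writing the complexity of a connected compact surface of genus g with b boundary components as ξ(g, b) = max(3g − 3 + b, 0) and its Euler characteristic as χ(g, b) = 2 − 2g − b, the characterization of pairs of pants can be checked by brute force on small (g, b):

```python
def complexity(g, b):
    """xi(g, b) = max(3g - 3 + b, 0) for a connected compact surface."""
    return max(3 * g - 3 + b, 0)

def euler_char(g, b):
    """chi(g, b) = 2 - 2g - b."""
    return 2 - 2 * g - b

# Among small surfaces, only the pair of pants (genus 0, three boundary
# components) combines negative Euler characteristic with complexity zero.
pants_like = [(g, b) for g in range(4) for b in range(5)
              if euler_char(g, b) < 0 and complexity(g, b) == 0]
print(pants_like)  # -> [(0, 3)]
```

For comparison, the punctured torus (g = 1, b = 1) also has χ = −1 but complexity 1, so it is excluded, in agreement with the "irreducibility" property above.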
https://en.wikipedia.org/wiki/Electronic%20entropy
Electronic entropy is the entropy of a system attributable to electrons' probabilistic occupation of states. This entropy can take a number of forms. The first form can be termed a density-of-states-based entropy. The Fermi–Dirac distribution implies that each eigenstate of a system, i, is occupied with a certain probability, p_i. As the entropy is given by a sum over the probabilities of occupation of those states, there is an entropy associated with the occupation of the various electronic states. In most molecular systems, the energy spacing between the highest occupied molecular orbital and the lowest unoccupied molecular orbital is usually large, and thus the probabilities associated with the occupation of the excited states are small. Therefore, the electronic entropy in molecular systems can safely be neglected. Electronic entropy is thus most relevant for the thermodynamics of condensed phases, where the density of states at the Fermi level can be quite large, and the electronic entropy can thus contribute substantially to thermodynamic behavior. A second form of electronic entropy can be attributed to the configurational entropy associated with localized electrons and holes. This entropy is similar in form to the configurational entropy associated with the mixing of atoms on a lattice. Electronic entropy can substantially modify phase behavior, as in lithium ion battery electrodes, high temperature superconductors, and some perovskites. It is also the driving force for the coupling of heat and charge transport in thermoelectric materials, via the Onsager reciprocal relations.

From the density of states
General Formulation
The entropy due to a set of states that can be either occupied with probability p_i or empty with probability 1 − p_i can be written as:

S = −k_B Σ_i [ p_i ln p_i + (1 − p_i) ln(1 − p_i) ],

where k_B is the Boltzmann constant. For a continuously distributed set of states as a function of energy, such as the eigenstates in an electronic band structure, the above sum can be written as an integral over the
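The density-of-states form of the entropy can be evaluated directly from Fermi–Dirac occupations. A self-contained sketch with arbitrary level energies and temperature (in the same energy units), with the entropy reported in units of k_B:

```python
import math

def fermi_dirac(e, mu, kT):
    """Occupation probability of a level at energy e."""
    return 1.0 / (1.0 + math.exp((e - mu) / kT))

def electronic_entropy(levels, mu, kT):
    """Entropy in units of k_B: -sum[p ln p + (1 - p) ln(1 - p)]."""
    s = 0.0
    for e in levels:
        p = fermi_dirac(e, mu, kT)
        if 0.0 < p < 1.0:                 # fully empty/full levels contribute 0
            s -= p * math.log(p) + (1.0 - p) * math.log(1.0 - p)
    return s

# A level at the Fermi energy (p = 1/2) contributes ln 2 per level,
# while a level far above it contributes almost nothing:
print(round(electronic_entropy([0.0], mu=0.0, kT=0.025), 4))   # -> 0.6931
print(electronic_entropy([5.0], mu=0.0, kT=0.025) < 1e-6)      # -> True
```

This makes the molecular-vs-metallic contrast quantitative: only levels within a few kT of the Fermi energy contribute, which is why a large density of states at the Fermi level is needed for the electronic entropy to matter.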
https://en.wikipedia.org/wiki/Data%20General/One
The Data General/One (DG-1) was a laptop introduced in 1984 by Data General. Description The nine-pound battery-powered 1984 Data General/One ran MS-DOS and had dual 3.5" diskettes, a 79-key full-stroke keyboard, 128 KB to 512 KB of RAM, and a monochrome LCD screen capable of either the standard 80×25 characters or full CGA graphics (640×200). It was a laptop comparable in capabilities to desktops of the era. History The Data General/One offered several features in comparison with contemporary portable computers. For instance, the popular 1983 Radio Shack TRS-80 Model 100, a non-PC-compatible machine, was comparably sized. It was a small battery-operated computer resting in one's lap, but had a 40×8 character (240×64 pixel) screen, a rudimentary ROM-based menu in lieu of a full OS, and no built-in floppy. IBM's 1984 Portable PC was comparable in capability with desktops, but was not battery operable and, being much larger and heavier, was by no means a laptop. Drawbacks The DG-1 was only a modest success. One problem was its use of 3.5" diskettes. Popular software titles were thus not widely available (5.25" still being the standard), a serious issue since then-common diskette copy-protection schemes made it difficult for users to copy software into that format. The device achieved moderate success in a large OEM deal with Allen-Bradley, where it was private labelled as a T-45 "programming terminal" and was resold from 1987 to 1991, with thousands of units sold. The CPU was a CMOS version of the 8086, compatible with the IBM PC's 8088 except that it ran slightly slower, at 4.0 MHz instead of the standard 4.77 MHz. Unlike the Portable PC, the DG-1 laptop could not take regular PC/XT expansion cards. RS-232 serial ports were built-in, but the CMOS (low battery consumption) serial I/O chip available at design time, a CMOS version of the Intel 8251, was register-incompatible with the 8250 serial IC standard for the IBM PC. As a result, software written for the PC ser
https://en.wikipedia.org/wiki/Cenovis
Cenovis is a dark brown food paste from Switzerland consisting of yeast extract, onions, carrots and spices. It is similar to English Marmite, Brazilian Cenovit, and Australian Vegemite. It is rich in vitamin B1. It is used to flavour soups, sausages, and salads. The most popular way to consume Cenovis, however, is to spread it on a slice of buttered bread, as stated on the product's packaging (it can also be blended directly into butter and then spread on bread, or used as a filling in croissants and buns). The company does not disclose whether the Swiss Cenovis was a licensed product from the older German one. In contrast to comparable yeast extracts, the Swiss Cenovis, like Thomy mustard, was sold in tubes early on and is somewhat lighter and more liquid. Protein versus vitamin Since the beginning of the 20th century, many attempts have been made to turn brewer's yeast into food, chiefly because of its availability and its nutritional value. The English Marmite (1902) and the Australian Vegemite (1922) became successful products. In 1912, Casimir Funk discovered an active ingredient against deficiency diseases, which he called a vitamin. The high thiamine content (vitamin B1) then became the selling point of nutritional yeast, more effective in advertising than its long-known protein content. Origins in Germany In 1915, Cenovis Nahrungsmittelwerke GmbH was founded in Munich as a brewer's yeast and malt factory, which also produced by-products of these goods and other foods such as oatmeal and baking powder, making it one of Maggi's main competitors. The German Cenovis vitamin extract was available from around 1920 in jars labeled "unbegrenzt haltbar" (unlimited shelf life). The image of the Cenovis products was associated with the life reform movement (from which the Reformhäuser emerged). It was reported in 1921 that the Cenovis yeast extract consisted of cleaned and de-bittered brewer's yeast and had a honey-like con
https://en.wikipedia.org/wiki/Numeric%20precision%20in%20Microsoft%20Excel
As with other spreadsheets, Microsoft Excel works only to limited accuracy because it retains only a certain number of figures to describe numbers (it has limited precision). With some exceptions regarding erroneous values, infinities, and denormalized numbers, Excel calculates in double-precision floating-point format from the IEEE 754 specification (besides numbers, Excel uses a few other data types). Although Excel allows display of up to 30 decimal places, its precision for any specific number is no more than 15 significant figures, and calculations may have an accuracy that is even less, due to five issues: round-off, truncation, binary storage, accumulation of the deviations of the operands in calculations, and, worst of all, catastrophic cancellation when subtracting values of similar magnitude. Accuracy and binary storage In the top figure the fraction 1/9000 in Excel is displayed. Although this number has a decimal representation that is an infinite string of ones, Excel displays only the leading 15 figures. In the second line, the number one is added to the fraction, and again Excel displays only 15 figures. In the third line, one is subtracted from the sum using Excel. Because the sum has only eleven 1s after the decimal, the true difference when '1' is subtracted is three 0s followed by a string of eleven 1s. However, the difference reported by Excel is three 0s followed by a 15-digit string of thirteen 1s and two extra erroneous digits. Thus, the numbers Excel calculates with are not the numbers that it displays. Moreover, the error in Excel's answer is not simply round-off error; it is an effect in floating-point calculations called 'cancellation'. The inaccuracy in Excel calculations is more complicated than errors due to a precision of 15 significant figures. Excel's storage of numbers in binary format also affects its accuracy. To illustrate, the lower figure tabulates the simple addition for several values of
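The 1/9000 example described above can be reproduced outside Excel, because any IEEE 754 double-precision arithmetic shows the same cancellation effect. This is a minimal sketch in Python, whose floats are the same binary64 format:

```python
# IEEE 754 double precision, the same format Excel uses internally.
x = 1.0 / 9000.0   # decimal expansion 0.000111111..., infinite in binary too
y = 1.0 + x        # adding 1 discards the low-order bits of x
diff = y - 1.0     # subtracting 1 back is exact, but the lost bits stay lost

print(f"{x:.20f}")     # the original fraction
print(f"{diff:.20f}")  # what survives the round trip: trailing erroneous digits
print(diff == x)       # the round trip does not return the original value
```

The subtraction itself introduces no new error (it is exact for values this close to 1); the erroneous trailing digits come from the earlier rounding of 1 + x, which is exactly the 'cancellation' effect the article describes.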
https://en.wikipedia.org/wiki/Well-known%20URI
A well-known URI is a Uniform Resource Identifier for URL path prefixes that start with /.well-known/. They are implemented in webservers so that requests to the servers for well-known services or information are available at consistent, well-known locations across servers. Description Well-known URIs are Uniform Resource Identifiers defined by the IETF in RFC 8615. They are URL path prefixes that start with /.well-known/. This implementation is in response to the common expectation for web-based protocols to require certain services or information be available at URLs consistent across servers, regardless of the way URL paths are organized on a particular host. The URIs are implemented in webservers so that requests to the servers for well-known services or information are available at consistent, well-known locations across servers. The IETF has defined a simple way for web servers to hold metadata that any user agent (e.g., web browser) can request. The metadata is useful for various tasks, including directing a web user to use a mobile app instead of the website or indicating the different ways that the site can be secured. The well-known locations are used by web servers to share metadata with user agents; sometimes these are files and sometimes these are requests for information from the web server software itself. The way to declare the different metadata requests that can be provided is standardized by the IETF so that other developers know how to find and use this information. Use The path of a well-known URI begins with the characters /.well-known/, and its scheme is "http", "https", or another scheme that has explicitly been specified to use well-known URIs. As an example, if an application hosts the service "example", the corresponding well-known URIs on https://www.example.com/ would start with https://www.example.com/.well-known/example. Information shared by a web site as a well-known service is expected to meet a specific standard. S
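The path construction described above is mechanical, so a short sketch can make it concrete. The helper function name is an illustrative assumption; the /.well-known/ prefix and the article's example service are from the text:

```python
from urllib.parse import urlsplit, urlunsplit

WELL_KNOWN_PREFIX = "/.well-known/"

def well_known_uri(origin, service):
    """Build the well-known URI for `service` on an HTTP(S) origin,
    following the path-prefix scheme of RFC 8615 as described above."""
    scheme, netloc, _, _, _ = urlsplit(origin)
    if scheme not in ("http", "https"):
        raise ValueError("this sketch only handles the http and https schemes")
    # The well-known path replaces whatever path the origin URL carried:
    return urlunsplit((scheme, netloc, WELL_KNOWN_PREFIX + service, "", ""))

print(well_known_uri("https://www.example.com/", "example"))
# -> https://www.example.com/.well-known/example
```

Note that the well-known location depends only on the origin (scheme and host), never on how the site organizes the rest of its URL paths, which is the whole point of the mechanism.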
https://en.wikipedia.org/wiki/Hailey%E2%80%93Hailey%20disease
Hailey–Hailey disease (HHD), or familial benign chronic pemphigus or familial benign pemphigus, was originally described by the Hailey brothers (Hugh Edward and William Howard) in 1939. It is a genetic disorder that causes blisters to form on the skin. Signs and symptoms HHD is characterized by outbreaks of rashes and blisters on the skin. Affected areas of skin undergo repeated blistering and inflammation, and may be painful to the touch. Areas where the skin folds, as well as the armpits, groin, neck, buttocks and under the breasts, are most commonly affected. In addition to blistering, other symptoms which accompany HHD include acantholysis, erythema and hyperkeratosis. Causes The cause of the disease is haploinsufficiency of the ATP2C1 gene, located on chromosome 3, which encodes the protein hSPCA1. A mutation on one copy of the gene causes only half of this necessary protein to be made, and the cells of the skin do not adhere together properly due to malformation of intercellular desmosomes, causing acantholysis, blisters and rashes. There is no known cure. Diagnosis Classification While the term pemphigus typically refers to "a rare group of blistering autoimmune diseases" affecting "the skin and mucous membranes", Hailey–Hailey disease is not an autoimmune disorder and there are no autoantibodies. According to the International Pemphigus & Pemphigoid Foundation (IPPF), "familial benign chronic pemphigus, or Hailey-Hailey disease, is a different condition from Pemphigus". Differential diagnosis The differential diagnosis includes intertrigo, candidiasis, frictional or contact dermatitis, and inverse psoriasis. A biopsy and/or family history can confirm the diagnosis. The lack of oral lesions and intercellular antibodies distinguishes familial benign pemphigus from other forms of pemphigus. Treatment Topical steroid preparations often help outbreaks; use of the weakest corticosteroid that is effective is recommended to help prevent thinning of the skin. Drugs such
https://en.wikipedia.org/wiki/Natural%20%28music%29
In music theory, a natural (♮) is an accidental which cancels previous accidentals or the key signature and represents the unaltered pitch of a note. Examples It can appear in a key signature or as an accidental. An example of an A note with a natural accidental in place is shown below. A note is natural when it is neither lowered nor raised by the key signature or by accidentals. The natural notes are A, B, C, D, E, F, and G, represented by the white keys on the keyboard of a piano or organ. On a modern concert harp, the middle position of the seven pedals that alter the tuning of the strings gives the natural pitch for each string. The scale of A minor or C major is sometimes regarded as the central, natural or basic minor scale or major scale, because all of its notes are natural notes, whereas every other major scale in the circle of fifths has at least one accidental in it. The notes F♭, C♭, E♯, B♯, and most notes inflected by double-flats and double-sharps correspond in pitch with natural notes; however, they are not regarded as natural notes but rather as enharmonic equivalents of them, and are just as much chromatically inflected notes as most sharped and flatted notes that are represented by black keys on a keyboard. The natural sign is derived from a square b used to denote B♮ in medieval music (in contrast with the round b denoting B♭, which became the flat symbol). The Unicode character MUSIC NATURAL SIGN '♮' (U+266E) should display as a natural sign. Its HTML entity is &natur;. Notation In musical notation, a natural sign (♮) is used to cancel a flat or sharp from either a preceding note or the key signature (for example, at a change from A♭ major to F major). Naturals are assumed (by default) in key signatures and mentioned only in key signature changes. However, in some cases, as in John Foulds' A World Requiem, the naturals are sometimes omitted in the position of the canceled flats or sharps in the previous key signature. It seems to be aimed at reducing writi
https://en.wikipedia.org/wiki/Social%20value%20orientations
In social psychology, social value orientation (SVO) is a person's preference about how to allocate resources (e.g. money) between the self and another person. SVO corresponds to how much weight a person attaches to the welfare of others in relation to their own. Since people are assumed to vary in the weight they attach to other people's outcomes in relation to their own, SVO is an individual difference variable. The general concept underlying SVO has become widely studied in a variety of different scientific disciplines, such as economics, sociology, and biology, under a multitude of different names (e.g. social preferences, other-regarding preferences, welfare tradeoff ratios, social motives, etc.). Historical background The SVO construct has its history in the study of interdependent decision making, i.e. strategic interactions between two or more people. The advent of game theory in the 1940s provided a formal language for describing and analyzing situations of interdependence based on utility theory. As a simplifying assumption for analyzing strategic interactions, it was generally presumed that people only consider their own outcomes when making decisions in interdependent situations, rather than taking into account the interaction partners' outcomes as well. However, the study of human behavior in social dilemma situations, such as the prisoner's dilemma, revealed that some people do in fact appear to have concerns for others. In the prisoner's dilemma, participants are asked to take the role of two criminals. In this situation, they are to pretend that they are a pair of criminals being interrogated by detectives in separate rooms. Both participants are being offered a deal and have two options: a participant may remain silent, or may confess and implicate his or her partner. If both participants choose to remain silent, they will each receive only a light sentence. If both participants confess, they will receive a moderate sentence. Conversely, if one participant re
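The idea of "weight attached to the welfare of others" can be expressed as a simple utility function. The additive form below (own payoff plus a weighted other's payoff) is one common formalization in the SVO literature, but the function name, the weights, and the example allocations are illustrative assumptions for this sketch, not details from the article:

```python
def svo_utility(own_payoff, other_payoff, weight_on_other):
    """Illustrative SVO-style evaluation of an allocation: the decision maker
    values their own payoff plus a weighted term for the other person's payoff.
    weight_on_other = 0 models a purely individualistic orientation."""
    return own_payoff + weight_on_other * other_payoff

# An individualist (weight 0) prefers the selfish split; a prosocial
# decision maker (weight 1, maximizing joint outcomes) prefers the equal one.
equal_split = (50, 50)    # (own, other): joint total 100
selfish_split = (85, 10)  # (own, other): joint total 95
for w in (0.0, 1.0):
    prefers_equal = svo_utility(*equal_split, w) > svo_utility(*selfish_split, w)
    print(w, prefers_equal)
```

This is exactly why SVO works as an individual difference variable: the same pair of allocations is ranked differently depending on the weight a person places on the other's outcome.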
https://en.wikipedia.org/wiki/Line%20spectral%20pairs
Line spectral pairs (LSP) or line spectral frequencies (LSF) are used to represent linear prediction coefficients (LPC) for transmission over a channel. LSPs have several properties (e.g. smaller sensitivity to quantization noise) that make them superior to direct quantization of LPCs. For this reason, LSPs are very useful in speech coding. LSP representation was developed by Fumitada Itakura at Nippon Telegraph and Telephone (NTT) in 1975. From 1975 to 1981, he studied problems in speech analysis and synthesis based on the LSP method. In 1980, his team developed an LSP-based speech synthesizer chip. LSP is an important technology for speech synthesis and coding, and in the 1990s was adopted by almost all international speech coding standards as an essential component, contributing to the enhancement of digital speech communication over mobile channels and the internet worldwide. LSPs are used in the code-excited linear prediction (CELP) algorithm, developed by Bishnu S. Atal and Manfred R. Schroeder in 1985. Mathematical foundation The LP polynomial A(z) can be expressed as A(z) = [P(z) + Q(z)] / 2, where: P(z) = A(z) + z^−(p+1) A(z^−1) and Q(z) = A(z) − z^−(p+1) A(z^−1). By construction, P is a palindromic polynomial and Q an antipalindromic polynomial; physically P(z) corresponds to the vocal tract with the glottis closed and Q(z) with the glottis open. It can be shown that: The roots of P and Q lie on the unit circle in the complex plane. The roots of P alternate with those of Q as we travel around the circle. As the coefficients of P and Q are real, the roots occur in conjugate pairs. The Line Spectral Pair representation of the LP polynomial consists simply of the location of the roots of P and Q (i.e. the angles ω such that the roots take the form e^(jω)). As they occur in pairs, only half of the actual roots (conventionally those between 0 and π) need be transmitted. The total number of coefficients for both P and Q is therefore equal to p, the number of original LP coefficients (not counting a_0 = 1). A common algorithm for finding these is to evaluate the polynomial at a sequence of closely
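The construction of P and Q from the LPC coefficients can be sketched directly. For illustration this uses a direct root-finder rather than the grid-evaluation algorithm the article mentions; the function name and the toy second-order predictor are assumptions for the sketch:

```python
import numpy as np

def lsp_from_lpc(a):
    """Given LPC polynomial coefficients a = [1, a1, ..., ap] of A(z),
    form P(z) = A(z) + z^-(p+1) A(1/z) and Q(z) = A(z) - z^-(p+1) A(1/z),
    then return the angles in (0, pi) of their unit-circle roots."""
    a = np.asarray(a, dtype=float)
    ext = np.concatenate([a, [0.0]])   # A(z) padded to degree p+1 in z^-1
    mirror = ext[::-1]                 # coefficients of z^-(p+1) A(1/z)
    p_poly = ext + mirror              # palindromic
    q_poly = ext - mirror              # antipalindromic
    angles = []
    for poly in (p_poly, q_poly):
        ang = np.angle(np.roots(poly))
        # keep one member of each conjugate pair; drop the trivial roots at z = +/-1
        ang = ang[(ang > 1e-9) & (ang < np.pi - 1e-9)]
        angles.append(np.sort(ang))
    return angles

# A toy stable 2nd-order predictor: A(z) = 1 - 1.2 z^-1 + 0.5 z^-2.
p_ang, q_ang = lsp_from_lpc([1.0, -1.2, 0.5])
print(p_ang, q_ang)  # one line spectral frequency from each polynomial
```

For this stable A(z), the two returned frequencies interlace as the theory requires (the P root at arccos 0.85 lies below the Q root at arccos 0.35's... complement on the circle, together with the trivial roots at z = ±1), and p frequencies total, matching the original p LP coefficients.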
https://en.wikipedia.org/wiki/VIATRA
VIATRA is an open-source model transformation framework based on the Eclipse Modeling Framework (EMF) and hosted by the Eclipse Foundation. VIATRA supports the development of model transformations with specific focus on event-driven, reactive transformations, i.e., rule-based scenarios where transformations occur as reactions to certain external changes in the model. Building upon an incremental query support for locating patterns and changes in the model, VIATRA offers a language (the VIATRA Query Language, VQL) to define transformations and a reactive transformation engine to execute certain transformations upon changes in the underlying model. Application domains VIATRA, as an open-source framework offering, serves as a central integration point and enabler engine in various applications, both in an industrial and in an academic context. Earlier versions of the framework have been intensively used for providing tool support for developing and verifying critical embedded systems in numerous European research projects such as DECOS, MOGENTES, INDEXYS and SecureChange. As a major industrial application of VIATRA, it is utilized as the underlying model querying and transformation engine of the IncQuery Suite. Thus, VIATRA is a key technical component in several industrial collaborations around model-based systems engineering (MBSE), fostering innovative systems engineering practices in domains like aerospace, manufacturing, industrial automation and automotive. Furthermore, via the applications of the IncQuery Suite, VIATRA serves as the foundation for model-based endeavors of ongoing, large-scale European industrial digitalization endeavors, such as the Arrowhead Tools and the Embrace projects. VIATRA is well integrated with Eclipse Modeling tools. However, VIATRA works outside the Eclipse environment as well, as demonstrated by the IncA project using the JetBrains MPS platform. Functionality VIATRA provides the following main services: An increme
https://en.wikipedia.org/wiki/Bedding%20%28horticulture%29
Many types of flowering plants are available to plant in flower gardens or flower beds. The floral industry calls these plants bedding plants. These fast-growing plants in seasonal flower beds create colourful displays during spring, summer, fall or winter, depending on the climate. Plants used for bedding are generally annuals, but biennials, tender perennials, and succulents are also used. Flowering bedding plants are also grown in containers and pots positioned on patios, terraces, decks and other areas around houses. Large containers of bedding plants are used in public displays along city streets and plazas, and hung from city light posts. Types of gardens with bedding plants Replanting the whole flower beds of large, formal gardens of bedding plants, as seen in parks and municipal displays, two or three times a year is a costly and labor-intensive process. Towns and cities are encouraged to produce impressive displays by campaigns such as "Britain in Bloom" or "America in Bloom". Home gardeners usually have bedding plants in containers or in beds in the front yard or back yard. Spring, fall, winter gardens - temperatures are moderate to cool Plants used for spring bedding are often biennials (sown one year to flower the next) or hardy, but short-lived, perennials. Spring-flowering bulbs such as tulips are often used, typically with forget-me-nots, wallflowers, winter pansies and polyanthus. Hardy annuals are sown directly into the ground early in the season (poppy, stock, sunflower, clarkia, godetia, eschscholzia, nigella, dianthus) or transplanted after purchase at a local garden centre. Hardy biennial plants, or perennials treated as biennials, are sown in one year to flower the next, and discarded after flowering (antirrhinum, polyanthus, wallflower, daisy, foxglove, some dianthus, some poppies, campanula, delphinium, aubrieta, aquilegia, cornflower, pansies). Planted in autumn to give a display until early spring, the plants used for winter bedding are mainly
https://en.wikipedia.org/wiki/Picture%20language
In formal language theory, a picture language is a set of pictures, where a picture is a 2D array of characters over some alphabet. For example, the language defines the language of rectangles composed of the character . This language contains pictures such as: The study of picture languages was initially motivated by the problems of pattern recognition and image processing, but two-dimensional patterns also appear in the study of cellular automata and other parallel computing models. Some formal systems have been created to define picture languages, such as array grammars and tiling systems.
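The rectangle language described above can be given a concrete membership test. The specific alphabet character did not survive extraction, so the symbol "a" below is an illustrative assumption, as is the function name:

```python
def is_rectangle_picture(picture, symbol="a"):
    """Membership test for the picture language of non-empty rectangles
    uniformly filled with one symbol: a picture is a list of strings
    (rows) that must all have the same width and contain only `symbol`."""
    if not picture or not picture[0]:
        return False  # the empty picture is excluded from this language
    width = len(picture[0])
    return all(len(row) == width and set(row) == {symbol} for row in picture)

print(is_rectangle_picture(["aaa", "aaa"]))  # a 2x3 rectangle: in the language
print(is_rectangle_picture(["aa", "aaa"]))   # ragged rows: not a rectangle
```

A recognizer like this is the 2D analogue of a string-language membership test, which is why picture languages connect naturally to pattern recognition and cellular automata.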
https://en.wikipedia.org/wiki/Phase%20shift%20torque%20measurement
Phase shift torque measurement is carried out by a torquemeter that uses a shaft connected between rotating machines, such as a turbine and compressor or a jet engine under test and a dynamometer. A gear at each end of the shaft is surrounded by a coil, and each gear produces a sine wave in its coil's eddy current magnetic field. Under no load the two waves are in phase with each other; as a load is applied to the shaft, the shaft twists and the waves shift out of phase with each other. Using the torsional stiffness of the shaft, calculated from the elastic modulus of the material (for twist, the relevant modulus is the shear modulus), the load or torque through the shaft can be measured highly accurately at speeds up to 150,000 rpm and torques up to 400,000 N·m.
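The twist-to-torque conversion can be sketched with the standard torsion relation T = G·J·θ/L for a solid circular shaft. All names, the tooth-count convention (one electrical cycle per gear tooth), and the numbers below are illustrative assumptions, not values from the article:

```python
import math

def torque_from_phase_shift(phase_deg, gear_teeth, shear_modulus,
                            shaft_length, shaft_diameter):
    """Sketch of the phase-shift torque relation T = G * J * theta / L.
    One electrical cycle of the pickup corresponds to one gear tooth, so the
    mechanical twist angle is the measured electrical phase divided by the
    tooth count. SI units throughout (Pa, m); result in N*m."""
    twist = math.radians(phase_deg) / gear_teeth           # mechanical twist (rad)
    polar_moment = math.pi * shaft_diameter**4 / 32.0      # J of a solid round shaft
    return shear_modulus * polar_moment * twist / shaft_length

# e.g. a 3-degree electrical phase shift on a 0.5 m steel shaft
# (G ~ 79 GPa) with 60-tooth gears and a 0.1 m diameter:
print(torque_from_phase_shift(3.0, 60, 79e9, 0.5, 0.1))
```

The measurement is attractive at high speed precisely because the relation is linear: doubling the phase shift doubles the inferred torque, and nothing touches the rotating shaft.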
https://en.wikipedia.org/wiki/Gemini%20%28protocol%29
Gemini is an application-layer internet communication protocol for accessing remote documents, similar to the Hypertext Transfer Protocol (HTTP) and Gopher. It comes with a special document format, commonly referred to as "gemtext", which allows linking to other documents. Started by a pseudonymous person known as Solderpunk, the protocol is being finalized collaboratively and has not been submitted to the IETF for standardization. History The Gemini project was started in June 2019 by Solderpunk. Additional work has been done by an informal community of users. According to Solderpunk's FAQ, Gemini is not intended to replace Gopher or HTTP, but to co-exist with them. Much of the development happened on the Gemini mailing list until the list disappeared at the end of 2021 due to a hardware issue. The creation of the Usenet newsgroup comp.infosystems.gemini in October 2021 was the first new newsgroup in the Big Eight hierarchy in eight years. Design The Gemini specification defines both the Gemini protocol and a native file format for that protocol, analogous to HTML for HTTP, known as "gemtext". The design is inspired by Gopher, but with modernisations such as mandatory use of Transport Layer Security (TLS) for connections and a hypertext format as the native content type. The design is deliberately not easily extensible, in order to meet a project goal of simplicity. Protocol Gemini is designed within the framework of the Internet protocol suite and, like HTTP(S), Gemini functions as a request–response protocol in the client–server computing model. A Gemini server should listen on TCP port 1965. A Gemini browser, for example, may be the client, and an application running on a computer hosting a Gemini site may be the server. The client sends a Gemini request message to the server, and the server sends back a response message. Gemini uses a separate connection to the same server for every resource request. Gemini mandates the use of TLS with privacy-
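The request–response exchange is simple enough to sketch without a network: a request is a single CRLF-terminated URL line (limited to 1024 bytes of URL in the published spec), and a response begins with a two-digit status code and a meta field. The function names below are assumptions for the sketch; in a real client these bytes would be sent over a TLS connection to port 1965:

```python
def build_gemini_request(url):
    """A Gemini request is one line: the absolute URL followed by CRLF.
    The spec caps the URL at 1024 bytes."""
    data = (url + "\r\n").encode("utf-8")
    if len(data) > 1024 + 2:  # URL limit plus the CRLF terminator
        raise ValueError("URL too long for a Gemini request")
    return data

def parse_gemini_header(header_line):
    """A response header is '<two-digit status> <meta>' plus CRLF; for a 2x
    status the meta is the MIME type of the body (e.g. text/gemini), for a
    3x status it is the redirect target."""
    status, _, meta = header_line.rstrip("\r\n").partition(" ")
    return int(status), meta

print(build_gemini_request("gemini://example.org/"))
print(parse_gemini_header("20 text/gemini\r\n"))
```

Because there are no request headers, methods, or persistent connections to negotiate, this single line in each direction is essentially the entire protocol surface, which reflects the project's deliberate goal of non-extensible simplicity.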
https://en.wikipedia.org/wiki/DermIS
The DermIS (Dermatology Internet Service or Dermatology Information System) is a web site providing images and information on diagnosis in dermatology. It is a project of the Department of Clinical Social Medicine of the University of Heidelberg and the Department of Dermatology of the University of Erlangen, and provides information in seven languages: Turkish, Japanese, Portuguese, Spanish, French, German and English. It includes the Dermatology Online Atlas (DOIA), a database of images of conditions. See also List of cutaneous conditions
https://en.wikipedia.org/wiki/Artificial%20intelligence
Artificial intelligence (AI) is the intelligence of machines or software, as opposed to the intelligence of humans or animals. It is also the field of study in computer science that develops and studies intelligent machines. "AI" may also refer to the machines themselves. AI technology is widely used throughout industry, government and science. Some high-profile applications are: advanced web search engines (e.g., Google Search), recommendation systems (used by YouTube, Amazon, and Netflix), understanding human speech (such as Siri and Alexa), self-driving cars (e.g., Waymo), generative or creative tools (ChatGPT and AI art), and competing at the highest level in strategic games (such as chess and Go). Artificial intelligence was founded as an academic discipline in 1956. The field went through multiple cycles of optimism followed by disappointment and loss of funding, but after 2012, when deep learning surpassed all previous AI techniques, there was a vast increase in funding and interest. The various sub-fields of AI research are centered around particular goals and the use of particular tools. The traditional goals of AI research include reasoning, knowledge representation, planning, learning, natural language processing, perception, and support for robotics. General intelligence (the ability to solve an arbitrary problem) is among the field's long-term goals. To solve these problems, AI researchers have adapted and integrated a wide range of problem-solving techniques, including search and mathematical optimization, formal logic, artificial neural networks, and methods based on statistics, operations research, and economics. AI also draws upon psychology, linguistics, philosophy, neuroscience and many other fields. Goals The general problem of simulating (or creating) intelligence has been broken down into sub-problems. These consist of particular traits or capabilities that researchers expect an intelligent system to display. The traits described below hav
https://en.wikipedia.org/wiki/Multipolar%20exchange%20interaction
Magnetic materials with strong spin-orbit interaction, such as LaFeAsO, PrFe4P12, YbRu2Ge2, UO2, NpO2, Ce1−xLaxB6, URu2Si2 and many other compounds, are found to have magnetic ordering constituted by high-rank multipoles, e.g. quadrupole, octupole, etc. Due to the strong spin-orbit coupling, multipoles are automatically introduced to the system when the total angular momentum quantum number J is larger than 1/2. If those multipoles are coupled by some exchange mechanism, they can tend to order, just as the spins do in the conventional spin-1/2 Heisenberg problem. Besides multipolar ordering, many hidden order phenomena are believed to be closely related to multipolar interactions. Tensor operator expansion Basic concepts Consider a quantum mechanical system with Hilbert space spanned by the states |J, M⟩, where J is the total angular momentum and M is its projection on the quantization axis. Then any quantum operator can be represented using this basis set as a matrix of dimension (2J + 1). Therefore, one can define (2J + 1)^2 matrices to completely expand any quantum operator in this Hilbert space. Taking J = 1/2 as an example, a quantum operator A can be expanded as A = Σ_{m,m′} A_{m m′} |m⟩⟨m′|. Obviously, the matrices |m⟩⟨m′| form a basis set in the operator space; any quantum operator defined in this Hilbert space can be expanded by these operators. In the following, let us call such a set of matrices a super basis, to distinguish it from the eigen basis of quantum states. More specifically, the above super basis can be called a transition super basis because it describes the transition between states |m⟩ and |m′⟩. In fact, this is not the only super basis that does the trick. We can also use the Pauli matrices σx, σy, σz and the identity matrix to form a super basis. Since the rotation properties of the Pauli matrices follow the same rules as the rank 1 tensor of cubic harmonics, and the identity matrix follows the same rules as the rank 0 tensor, this basis set can be called a cubic super basis. Another commonly used super basis is the spherical harmonic super basis, which is built by r
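The J = 1/2 expansion above can be checked numerically: any 2×2 operator decomposes uniquely in the cubic super basis {1, σx, σy, σz} using the orthogonality Tr(σj σk) = 2δjk. The dictionary layout and function name are illustrative choices for this sketch:

```python
import numpy as np

# The cubic super basis for J = 1/2: the identity plus the three Pauli matrices.
I2 = np.eye(2, dtype=complex)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)
BASIS = {"1": I2, "x": SX, "y": SY, "z": SZ}

def expand_in_pauli_basis(op):
    """Coefficients c_k = Tr(sigma_k . op) / 2, using the orthogonality
    Tr(sigma_j sigma_k) = 2 delta_jk of this super basis."""
    return {name: np.trace(mat @ op) / 2 for name, mat in BASIS.items()}

# Example: the transition-super-basis element |up><down| (a pure "transition"
# operator) decomposes in the cubic super basis as (sigma_x + i sigma_y) / 2.
op = np.array([[0, 1], [0, 0]], dtype=complex)
coeffs = expand_in_pauli_basis(op)
print(coeffs)
```

This makes the article's point concrete: the transition super basis and the cubic super basis span the same 4-dimensional operator space, and converting between them is just a linear change of basis.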
https://en.wikipedia.org/wiki/Ren%C3%A9%20Chaloult
René Chaloult (January 26, 1901 – December 20, 1978) was a nationalist politician in Quebec, Canada. Background He was born on January 26, 1901, in Quebec City. Political career Chaloult first won a seat to the Legislative Assembly of Quebec as a Union Nationale candidate in the 1936 election in the district of Kamouraska. In 1937, he and colleagues Oscar Drouin, Joseph-Ernest Grégoire, Philippe Hamel and Adolphe Marcoux left the Union Nationale. Chaloult joined the Liberals and won re-election in the 1939 election as the Member for the district of Lotbinière. During World War II, Chaloult opposed conscription. He won re-election as an Independent in Québec-Comté electoral district in the 1944 and 1948 elections, but was defeated in the 1952 election and in the district of Jonquière-Kénogami in the 1956 election. Chaloult retired to live at his summer home in Kamouraska. Each year on July 1, he would fly the Quebec flag outside his summer home at half-staff to show his nationalist inclinations. Death He died on December 20, 1978. Legacy For many years, Chaloult urged Quebec to adopt a distinctive design for its flag. On November 19, 1946, Chaloult entered a motion to provide Quebec with a unique flag. Two years later, the motion was to be voted on January 21, 1948. However, the opportunistic government of Maurice Duplessis instead issued a decree creating the current Quebec flag. Footnotes 1901 births 1978 deaths French Quebecers Politicians from Quebec City Quebec Liberal Party MNAs Union Nationale (Quebec) MNAs Université Laval alumni Flag designers
https://en.wikipedia.org/wiki/Oprozomib
Oprozomib (codenamed ONX 0912 and PR-047) is an orally active second-generation proteasome inhibitor developed by Proteolix, which was acquired by Onyx Pharmaceuticals, an Amgen subsidiary, in 2009. It selectively inhibits chymotrypsin-like activity of both the constitutive proteasome (PSMB5) and immunoproteasome (LMP7). It is being investigated for the treatment of hematologic malignancies, specifically, multiple myeloma, with Phase 1b studies ongoing (as of February 16, 2016). Being an epoxyketone derivative, oprozomib is structurally related to carfilzomib and has the added benefit of being orally bioavailable. Like carfilzomib, it is active against bortezomib-resistant multiple myeloma cells. Oprozomib was granted orphan drug status for the treatment of Waldenström's macroglobulinaemia and multiple myeloma in 2014. See also Ixazomib (trade name Ninlaro) — an orally available boronic acid-derived proteasome inhibitor approved for the treatment of multiple myeloma
https://en.wikipedia.org/wiki/H2AFX
H2A histone family member X (usually abbreviated as H2AX) is a type of histone protein from the H2A family encoded by the H2AFX gene. An important phosphorylated form is γH2AX (S139), which forms when double-strand breaks appear. In humans and other eukaryotes, the DNA is wrapped around histone octamers, consisting of core histones H2A, H2B, H3 and H4, to form chromatin. H2AX contributes to nucleosome-formation, chromatin-remodeling and DNA repair, and is also used in vitro as an assay for double-strand breaks in dsDNA. Formation of γH2AX H2AX becomes phosphorylated on serine 139, then called γH2AX, as a reaction on DNA double-strand breaks (DSB). The kinases of the PI3-family (Ataxia telangiectasia mutated, ATR and DNA-PKcs) are responsible for this phosphorylation, especially ATM. The modification can happen accidentally during replication fork collapse or in the response to ionizing radiation but also during controlled physiological processes such as V(D)J recombination. γH2AX is a sensitive target for looking at DSBs in cells. The presence of γH2AX by itself, however, is not the evidence of the DSBs. The role of the phosphorylated form of the histone in DNA repair is under discussion but it is known that because of the modification the DNA becomes less condensed, potentially allowing space for the recruitment of proteins necessary during repair of DSBs. Mutagenesis experiments have shown that the modification is necessary for the proper formation of ionizing radiation induced foci in response to double strand breaks, but is not required for the recruitment of proteins to the site of DSBs. Function DNA damage response The histone variant H2AX constitutes about 2-25% of the H2A histones in mammalian chromatin. When a double-strand break occurs in DNA, a sequence of events occurs in which H2AX is altered. Very early after a double-strand break, a specific protein that interacts with and affects the architecture of chromatin is phosphorylated and then relea
https://en.wikipedia.org/wiki/Moeritherium
Moeritherium ("the beast from Lake Moeris") is an extinct genus of primitive proboscideans. These prehistoric mammals are related to the elephant and, more distantly, to sea cows and hyraxes. They lived during the Eocene epoch. Description Moeritherium was a rotund semi-aquatic mammal with short, stubby legs that lived about 37–35 million years ago. Its body shape and lifestyle demonstrate convergent evolution with pigs, tapirs, and the pygmy hippopotamus. Moeritherium was smaller than most or all later proboscideans, standing only high at the shoulder and weighing . The shape of the skull suggests that, while Moeritherium did not have an elephant-like trunk, it may have had a broad flexible upper lip like a tapir's for grasping aquatic vegetation. The second incisor teeth formed small tusks, although these would have looked more like the teeth of a hippo than those of a modern elephant. Ecology Isotopic analysis suggests that Moeritherium was amphibious/semiaquatic and probably consumed freshwater plants. Discovery In 1901, Charles William Andrews described Moeritherium lyonsi from fossil remains found in the Qasr el Sagha Formation in the Al Fayyum in Egypt. In 1902, Andrews described Moeritherium gracile from fossil remains of a smaller specimen found in the same area, in a fluvio-marine formation, that is, a river-estuary wetland to brackish-lagoon paleoenvironment. In 1904, the first Moeritherium trigodon fossils were discovered by Charles Andrews in the deposits of an oasis in Al Fayyum. The genus has also been found at other sites around North and West Africa. In 1911, Max Schlosser of Munich divided Moeritherium lyonsi into two species: Moeritherium lyonsi, a large form from the Qasr el Sagha formation, and a new large species, M. andrewsi, from a fluvio-marine formation. In 2006, Moeritherium chehbeurameuri was described from fossil remains found at the early late Eocene locality of Bir El Ater, Algeria. Classification Moeritherium is not thought to be directly ancestral to
https://en.wikipedia.org/wiki/8192
8192 is the natural number following 8191 and preceding 8193. 8192 is a power of two: 8192 = 2^13 (2 to the 13th power). Because it is two times a sixth power (8192 = 2 × 4^6), it is also a Bhaskara twin. That is, 8192 has the property that twice its square is a cube (2 × 8192^2 = 2^27 = (2^9)^3) and twice its cube is a square (2 × 8192^3 = 2^40 = (2^20)^2). In computing 8192 (2^13) is the maximum number of fragments of an IPv4 datagram, since the fragment offset field of the IPv4 header is 13 bits wide.
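The Bhaskara-twin property claimed above can be checked directly with integer arithmetic. The following short sketch (not part of the article) verifies both halves of the claim:

```python
from math import isqrt

# Verify 8192's Bhaskara twin property: twice its square is a perfect
# cube and twice its cube is a perfect square.
n = 8192
assert n == 2 ** 13 == 2 * 4 ** 6            # the two factorizations given above

twice_square = 2 * n ** 2                    # 2 * 2^26 = 2^27 = (2^9)^3
twice_cube = 2 * n ** 3                      # 2 * 2^39 = 2^40 = (2^20)^2

cube_root = round(twice_square ** (1 / 3))   # float cube root, rounded to integer
print(cube_root ** 3 == twice_square)        # True: 2^27 is a perfect cube
print(isqrt(twice_cube) ** 2 == twice_cube)  # True: 2^40 is a perfect square
```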
https://en.wikipedia.org/wiki/Xenia%20%28plants%29
Xenia (also known as the Xenia effect) in plants is the effect of pollen on seeds and fruit of the fertilized plant. The effect is separate from the contribution of the pollen towards the next generation. The term was coined in 1881 by the botanist Wilhelm Olbers Focke to refer to effects on maternal tissues, including the seed coat and pericarp, but at that time endosperm was also thought to be a maternal tissue, and the term became closely associated with endosperm effects. The term metaxenia was later coined and is still sometimes used to describe the effects on purely maternal tissues. Endosperm effects in the seed One of the most familiar examples of xenia is the different colours that can be produced in maize (Zea mays) by assortment of alleles via individual pollen grains. Such maize cobs are cultivated for decorative purposes. The endosperm tissue, which makes up most of the bulk of a maize seed, is not produced by the mother plant, but is the product of fertilization, and genetic factors carried by the pollen affect its colour. For example, a yellow-seeded race may have its yellow colour determined by a recessive allele. If it receives pollen from a purple-seeded race that has one copy of a dominant allele for purple colour and one copy of the recessive allele for yellow seed, the resulting cob will have some yellow and some purple seeds. Qualities affected in the endosperm of sorghum may include starchiness, sweetness, waxiness, or other aspects. Fruit-growth effects The vigour of the seeds forming inside a fruit can affect the growth of the fruit itself. For example, in two plant species whose fruit ripen asynchronously (Vaccinium corymbosum and Amelanchier arborea) the fruit with more seeds ripened faster. Xenia and genetically engineered crops Because there is concern about pollen from genetically modified (GM) crops, male-sterile forms are being considered, particularly of maize. Male-fertile non-GM plants must then be grown with the GM crop to
https://en.wikipedia.org/wiki/Hyman%20Bass
Hyman Bass (; born October 5, 1932) is an American mathematician, known for work in algebra and in mathematics education. From 1959 to 1998 he was Professor in the Mathematics Department at Columbia University. He is currently the Samuel Eilenberg Distinguished University Professor of Mathematics and Professor of Mathematics Education at the University of Michigan. Life Born to a Jewish family in Houston, Texas, he earned his B.A. in 1955 from Princeton University and his Ph.D. in 1959 from the University of Chicago. His thesis, titled Global dimensions of rings, was written under the supervision of Irving Kaplansky. He has held visiting appointments at the Institute for Advanced Study in Princeton, New Jersey, Institut des Hautes Études Scientifiques and École Normale Supérieure (Paris), Tata Institute of Fundamental Research (Bombay), University of Cambridge, University of California, Berkeley, University of Rome, IMPA (Rio), National Autonomous University of Mexico, Mittag-Leffler Institute (Stockholm), and the University of Utah. He was president of the American Mathematical Society. Bass formerly chaired the Mathematical Sciences Education Board (1992–2000) at the National Academy of Sciences, and the Committee on Education of the American Mathematical Society. He was the President of ICMI from 1999 to 2006. Since 1996 he has been collaborating with Deborah Ball and her research group at the University of Michigan on the mathematical knowledge and resources entailed in the teaching of mathematics at the elementary level. He has worked to build bridges between diverse professional communities and stakeholders involved in mathematics education. Work His research interests have been in algebraic K-theory, commutative algebra and algebraic geometry, algebraic groups, geometric methods in group theory, and ζ functions on finite simple graphs. Awards and recognitions Bass was elected as a member of the National Academy of Sciences in 1982. In 1983, he was elec
https://en.wikipedia.org/wiki/Electronic%20Yellow%20Pages
Electronic Yellow Pages are online versions of traditional printed business directories produced by telephone companies around the world. Typical functionalities of online yellow pages include alphabetical listings of businesses and search functionality over the business database by name, business type or location. Since Electronic Yellow Pages are not limited by space considerations, they often contain far more comprehensive business information, such as vicinity maps, company profiles, product information, and more. An advantage of Electronic Yellow Pages is that they can be updated in real time; listed businesses are therefore not constrained by the once-a-year publishing of the printed version, which leads to greater accuracy of the listings, since contact information may change at any time. Before the popularity of the internet, business telephone numbers in the United Kingdom could be searched by accessing a remote computer terminal by modem. The initial prototype of this was superseded in 1990 with a commercial service. This service allowed searches via name, business classification and locality for business listings, and a free text field was provided to allow "unstructured text" searching of adverts. This dialup service was available via Prestel and "BT Gold" services. The Electronic Yellow Pages service was superseded in the mid-1990s by the internet service www.yell.com. A similar system called Phonebase for published residential phone numbers was discontinued in the 1990s, being superseded by a web-based search interface. History The first true online Yellow Pages was created by an independent YP publisher in Seattle, Washington, called Banana Pages. This was the first print directory registered with both the YPPA (Yellow Pages Publishers Association) and the ADP (Association of Directory Publishers) to place its listings online. The Yellow Pages product was the brainchild of the co-owner brothers of the company, Peter and John Richards.
https://en.wikipedia.org/wiki/Thiolava
"Candidatus Thiolava", represented by its sole species "Candidatus Thiolava veneris" (meaning Venus's hair), is a genus of bacteria discovered growing in stringlike mats after an eruption of the submarine volcano Tagoro near the Canary Islands. The International Institute of Species Exploration named Thiolava veneris one of its 2018 Top 10 New Species. The mats host a wide variety of other sea life. Physical characteristics Thiolava veneris was found growing in laterally extensive mats in an area recently obliterated by underwater volcanism. The bacteria were discovered growing at about 130 m water depth, near the summit of the submarine volcano Tagoro. The mats of white, hair-like filaments formed by this bacterium cover an area of approximately 2,000 m2 around the newly formed volcanic cone. Each bacterium is 3–6 μm, and the cells form white trichomes, or chains, consisting of three helical strands surrounded by a protective sheath. The sheaths are 36 to 90 μm wide and up to 3 cm long. Thiolava veneris may grow at temperatures ranging from 15 to 37 °C, with optimal growth at around 30 °C. One study of the site, which found evidence of phytosterol breakdown by the microbial communities, identified Thiolava veneris among the organisms involved. Thiolava veneris is motile and has many polar flagella, allowing it to move through the surrounding water and sediment. Gram-negative bacteria such as Thiolava veneris have an envelope made up of an outer membrane, a layer of peptidoglycan, and an inner membrane called the cytoplasmic membrane. Metabolism Thiolava veneris belongs to the Epsilonproteobacteria, a large bacterial phylogenetic group that also includes other sulfur-oxidizing hydrothermal vent bacteria. Unusually, T. veneris ca
https://en.wikipedia.org/wiki/List%20of%20amateur%20radio%20modes
The following is a list of the modes of radio communication used in the amateur radio hobby. Modes of communication Amateurs use a variety of voice, text, image, and data communications modes over radio. Generally, new modes can be tested in the amateur radio service, although national regulations may require disclosure of a new mode to permit radio licensing authorities to monitor the transmissions. Encryption, for example, is not generally permitted in the amateur radio service except for the special purpose of satellite vehicle control uplinks. The following is a partial list of the modes of communication used, where the mode includes both modulation types and operating protocols. Morse code Morse code is called the original digital mode. Radio telegraphy, designed for machine-to-machine communication, is the direct on/off keying of a continuous-wave carrier by Morse code symbols. Often called amplitude-shift keying (ASK), it may be considered an amplitude-modulated mode of communication, and is rightfully considered the first digital data mode. Although more than 140 years old, bandwidth-efficient Morse code, originally developed by Samuel Morse and Alfred Vail in the 1840s, uses techniques that were not fully understood until much later, under the modern terms of source coding or data compression. Alfred Vail intuitively understood efficient code design: the bandwidth efficiency of Morse code arises because its encodings are variable length, and Vail assigned the shortest encodings to the most-used symbols and the longest encodings to the least-used symbols. It was not until one hundred years later that Shannon's modern information theory (1948) described Vail's coding technique for Morse code, giving it a firm footing in a mathematically based theory. Shannon's information theory resulted in similarly efficient data-encoding technologies which use bandwidth like Morse code, such as the modern Huffman, arithmetic, and Lempel–Ziv codes. Although c
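Vail's shortest-code-for-commonest-letter principle can be illustrated with a few genuine International Morse Code encodings. The letter frequencies implied by the two groups below are rough background knowledge, not figures from the article:

```python
# A few International Morse Code encodings (dot = ".", dash = "-").
MORSE = {"E": ".", "T": "-", "A": ".-", "I": "..", "N": "-.",
         "Q": "--.-", "J": ".---", "Z": "--..", "X": "-..-"}

# Average code length (in dots and dashes) for five very common English
# letters versus four rare ones.
common = sum(len(MORSE[c]) for c in "ETAIN") / 5
rare = sum(len(MORSE[c]) for c in "QJZX") / 4
print(common, rare)   # 1.6 4.0 -- frequent letters get far shorter codes
```

This is exactly the variable-length idea later formalized by Huffman coding: minimizing the average code length by matching code length to symbol frequency.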
https://en.wikipedia.org/wiki/Log%20management
Log management (LM) comprises an approach to dealing with large volumes of computer-generated log messages (also known as audit records, audit trails, event-logs, etc.). Log management generally covers: Log collection Centralized log aggregation Long-term log storage and retention Log rotation Log analysis (in real-time and in bulk after storage) Log search and reporting. Overview The primary drivers for log management implementations are concerns about security, system and network operations (such as system or network administration) and regulatory compliance. Logs are generated by nearly every computing device, and can often be directed to different locations both on a local file system or remote system. Effectively analyzing large volumes of diverse logs can pose many challenges, such as: Volume: log data can reach hundreds of gigabytes of data per day for a large organization. Simply collecting, centralizing and storing data at this volume can be challenging. Normalization: logs are produced in multiple formats. The process of normalization is designed to provide a common output for analysis from diverse sources. Velocity: The speed at which logs are produced from devices can make collection and aggregation difficult Veracity: Log events may not be accurate. This is especially problematic for systems that perform detection, such as intrusion detection systems. Users and potential users of log management may purchase complete commercial tools or build their own log-management and intelligence tools, assembling the functionality from various open-source components, or acquire (sub-)systems from commercial vendors. Log management is a complicated process and organizations often make mistakes while approaching it. Logging can produce technical information usable for the maintenance of applications or websites. It can serve: to define whether a reported bug is actually a bug to help analyze, reproduce and solve bugs to help test new features i
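The normalization challenge described above can be sketched in a few lines. In this hedged example, the two input formats, the field names, and the sample log lines are all invented for illustration; real log managers handle many more formats and edge cases:

```python
import re

# Map two assumed input formats (a syslog-like line and an Apache-style
# access-log line) onto one common record shape.
SYSLOG = re.compile(r"^(\w{3} +\d+ [\d:]+) (\S+) (\S+): (.*)$")
APACHE = re.compile(r'^(\S+) - - \[([^\]]+)\] "([^"]*)" (\d{3})')

def normalize(line):
    m = SYSLOG.match(line)
    if m:
        ts, host, proc, msg = m.groups()
        return {"source": "syslog", "host": host, "message": msg}
    m = APACHE.match(line)
    if m:
        ip, ts, request, status = m.groups()
        return {"source": "apache", "host": ip, "message": f"{request} -> {status}"}
    # Pass through anything we cannot parse, so no data is silently dropped.
    return {"source": "unknown", "host": None, "message": line}

print(normalize("Oct  5 13:02:11 web01 sshd: Accepted publickey for admin"))
print(normalize('203.0.113.9 - - [05/Oct/2024:13:02:12 +0000] "GET / HTTP/1.1" 200'))
```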
https://en.wikipedia.org/wiki/International%20Grape%20Genome%20Program
The International Grape Genomics Program (IGGP) is a collaborative genome project dedicated to determining the genome sequence of the grapevine Vitis vinifera. It is a multinational project involving research centers in Australia, Canada, Chile, France, Germany, Italy, South Africa, Spain, and the United States. The project was established on the premise that whereas the Vitis family provides the world's most economically important fruit, its biology is still poorly understood. Many centuries of viticulture have provided many well-informed wine-producing centres throughout the world, yet exactly how a grapevine plant responds and interacts with the physical environment and deals with abiotic stresses, pests and diseases is currently unknown. Agricultural technology surrounding Vitis has been traditionally based upon specific genotypes, which in the main have relied on "vegetative multiplication" and control of growing conditions to improve quality and yield. While advances in quality have certainly been achieved, it has involved increased costs and is in danger of incurring unsustainable environmental overheads. The argument is that the relatively unknown biology of Vitis is capable of delivering desired viticultural improvements without the associated ongoing costs, and establishing its genome sequence will examine the role individual genes play in viticulture, improving grape characteristics and quality in a predictable way. Initial discoveries As of March 2007, the project has mapped over half of the grapevine genome. In the course of their research, the Cooperative Research Centre for Viticulture (CRCV), based at the CSIRO Plant Industry Horticulture Unit in Adelaide, Australia (one of the IGGP collaborating centres) discovered that white grapes only exist today as a result of a rare genetic mutation which took place thousands of years ago. White grapes are believed to have arisen due to the extremely rare and independent mutation of two similar and adjace
https://en.wikipedia.org/wiki/Infant%20sleep
Infant sleep is the act of sleeping by an infant or a newborn. It differs significantly from sleep during adulthood. Unlike in adults, sleep early in infancy initially does not follow a circadian rhythm. Infant sleep also appears to have two main modes, active, associated with movement, and quiet, associated with stillness, each exhibiting distinct neurological firing patterns. Sleep duration is also shorter. As the infant ages, sleep begins to follow a circadian rhythm and sleep duration increases. Infants nap frequently. Infants are also particularly vulnerable during sleep; they are prone to suffocation and SIDS. As a result, "safe" sleep techniques have been the subject of several public health campaigns. Infant sleep practices vary widely between cultures and over history; historically, infants would sleep on the ground with their parents. In many modern cultures, infants sleep in a variety of types of infant beds or share a bed with their parents. Infant sleep disturbance is common, and even normal infant sleep patterns can cause considerable disruption to parents' sleep. As a result, there is a wide variety of interventions and products intended to improve infant sleep; however, their efficacy is variable. Infant sleep training is one common intervention for poor infant sleep. Poor infant sleep is linked to maternal depression. Normal infant sleep In the first week of life, infants will sleep during both the day and night and will wake to feed. Sleep cycle duration is usually short, from 2–4 hours. Over the first two weeks, infants average 16–18 hours of sleep daily. A circadian rhythm has not yet been established, and the infant sleeps equally during the night and day. In the first month of life, 95% of infants will wake during the night. At around 2 months, a day-night pattern begins to gradually develop. At around 3 months, the sleep cycle may increase to 3–6 hours, and the majority of infants will still wake in the night to feed. By 4 months, the average infant sleeps
https://en.wikipedia.org/wiki/Polychromatic%20symmetry
Polychromatic symmetry is a colour symmetry which interchanges three or more colours in a symmetrical pattern. It is a natural extension of dichromatic symmetry. The coloured symmetry groups are derived by adding to the position coordinates (x and y in two dimensions; x, y and z in three dimensions) an extra coordinate, k, which takes three or more possible values (colours). An example of an application of polychromatic symmetry is that crystals of substances containing molecules or ions in triplet states, that is with an electronic spin of magnitude 1, should sometimes have structures in which the spins of these groups have projections of +1, 0 and −1 onto local magnetic fields. If these three cases are present with equal frequency in an orderly array, then the magnetic space group of such a crystal should be three-coloured. Example The group p3 has three different rotation centres of order three (120°), but no reflections or glide reflections. There are two distinct ways of colouring the p3 pattern with three colours: p3[3]1 and p3[3]2, where the figure in square brackets indicates the number of colours, and the subscript distinguishes between multiple cases of coloured patterns. Taking a single motif in the pattern p3[3]1, it has a symmetry operation 3', consisting of a rotation by 120° and a cyclic permutation of the three colours white, green and red, as shown in the animation. This pattern p3[3]1 has the same colour symmetry as M. C. Escher's Hexagonal tessellation with animals: study of regular division of the plane with reptiles (1939). Escher reused the design in his 1943 lithograph Reptiles. Group theory Initial research by Wittke and Garrido (1959) and by Niggli and Wondratschek (1960) identified the relation between the colour groups of an object and the subgroups of the object's geometric symmetry group. In 1961 van der Waerden and Burckhardt built on the earlier work by showing that colour groups can be defined as follows: in a colour group of a pat
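The order-3 behaviour of the coloured operation 3' (a 120° rotation combined with the colour cycle white → green → red) can be sketched numerically. The motif position below is an arbitrary illustrative choice, not taken from a specific pattern:

```python
import cmath

# The coloured symmetry operation 3': rotate 120 degrees about the origin
# and cyclically permute the three colours white -> green -> red -> white.
ROT = cmath.exp(2j * cmath.pi / 3)        # 120-degree rotation as a unit complex number
CYCLE = {"white": "green", "green": "red", "red": "white"}

def op_3prime(point):
    pos, colour = point
    return (pos * ROT, CYCLE[colour])

p = (1 + 0j, "white")                     # an arbitrary coloured motif
q = op_3prime(op_3prime(op_3prime(p)))    # apply 3' three times
print(abs(q[0] - p[0]) < 1e-9, q[1] == p[1])   # True True: 3' has order 3
```

Both the geometric part (rotation by 360°) and the colour part (a full cycle of three colours) return to the identity after three applications, which is why the combined operation generates a group of order three.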
https://en.wikipedia.org/wiki/Cytogenetics
Cytogenetics is essentially a branch of genetics, but is also a part of cell biology/cytology (a subdivision of human anatomy), that is concerned with how the chromosomes relate to cell behaviour, particularly to their behaviour during mitosis and meiosis. Techniques used include karyotyping, analysis of G-banded chromosomes, other cytogenetic banding techniques, as well as molecular cytogenetics such as fluorescence in situ hybridization (FISH) and comparative genomic hybridization (CGH). History Beginnings Chromosomes were first observed in plant cells by Carl Nägeli in 1842. Their behavior in animal (salamander) cells was described by Walther Flemming, the discoverer of mitosis, in 1882. The name was coined by another German anatomist, von Waldeyer in 1888. The next stage took place after the development of genetics in the early 20th century, when it was appreciated that the set of chromosomes (the karyotype) was the carrier of the genes. Levitsky seems to have been the first to define the karyotype as the phenotypic appearance of the somatic chromosomes, in contrast to their genic contents. Investigation into the human karyotype took many years to settle the most basic question: how many chromosomes does a normal diploid human cell contain? In 1912, Hans von Winiwarter reported 47 chromosomes in spermatogonia and 48 in oogonia, concluding an XX/XO sex determination mechanism. Painter in 1922 was not certain whether the diploid number of humans was 46 or 48, at first favoring 46. He revised his opinion later from 46 to 48, and he correctly insisted on humans having an XX/XY system of sex-determination. Considering their techniques, these results were quite remarkable. In science books, the number of human chromosomes remained at 48 for over thirty years. New techniques were needed to correct this error. Joe Hin Tjio working in Albert Levan's lab was responsible for finding the approach: Using cells in culture Pre-treating cells in a hypotonic solution, whi
https://en.wikipedia.org/wiki/Partition%20%28politics%29
In politics, a partition is a change of political borders cutting through at least one territory considered a homeland by some community. History Brendan O'Leary distinguishes partition from secession, which takes place within existing recognized political units. For Dubnov and Robson, partition is the physical division of territory along ethno-religious lines into separate nation-states. They locate partition in the context of post-World War I peacebuilding and the "new conversations surrounding ethnicity, nationhood, and citizenship" that emerged out of it. The post-war agreements, such as the League of Nations mandate system, promoted "a new political language of ethnic separatism as a central aspect of national self-determination, while protecting and disguising continuities and even expansions of French and, especially, British imperial powers". While Ranabir Samaddar identifies the dissolution of Austria-Hungary as an example of partition, resulting from competing national ambitions, he agrees partition gained prominence following World War I, particularly with the division of the Ottoman Empire. By this point, he argues, ethnicity had become the primary justification of border proposals. After World War II, Dubnov and Robson argue, partition transformed from "an imperial tactic into an organizing principle" of world diplomacy. Scholarship has closely linked partition to violence. Tracing the precedent for the Partition of Ireland in population resettlements across former Ottoman Empire territories and the making of national 'majorities' and 'minorities', Dubnov and Robson emphasise how partitions after Ireland contained proposals to transfer "inconvenient populations in addition to forcible territorial division into separate states", which they note had violent consequences for local actors who were devolved the task of "carving out physically separate political entities on the ground and making them ethnically homogenous". T.G. Fraser notes how Britain p
https://en.wikipedia.org/wiki/Spinlock
In software engineering, a spinlock is a lock that causes a thread trying to acquire it to simply wait in a loop ("spin") while repeatedly checking whether the lock is available. Since the thread remains active but is not performing a useful task, the use of such a lock is a kind of busy waiting. Once acquired, spinlocks will usually be held until they are explicitly released, although in some implementations they may be automatically released if the thread being waited on (the one that holds the lock) blocks or "goes to sleep". Because they avoid overhead from operating system process rescheduling or context switching, spinlocks are efficient if threads are likely to be blocked for only short periods. For this reason, operating-system kernels often use spinlocks. However, spinlocks become wasteful if held for longer durations, as they may prevent other threads from running and require rescheduling. The longer a thread holds a lock, the greater the risk that the thread will be interrupted by the OS scheduler while holding the lock. If this happens, other threads will be left "spinning" (repeatedly trying to acquire the lock), while the thread holding the lock is not making progress towards releasing it. The result is an indefinite postponement until the thread holding the lock can finish and release it. This is especially true on a single-processor system, where each waiting thread of the same priority is likely to waste its quantum (allocated time where a thread can run) spinning until the thread that holds the lock is finally finished. Implementing spinlocks correctly is challenging because programmers must take into account the possibility of simultaneous access to the lock, which could cause race conditions. Generally, such an implementation is possible only with special assembly language instructions, such as atomic (i.e. un-interruptible) test-and-set operations and cannot be easily implemented in programming languages not supporting truly atomic operations
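The acquire loop described above can be sketched as follows. Python exposes no user-level atomic test-and-set instruction, so this educational sketch emulates that single atomic step with a tiny guard lock; a real kernel spinlock would use a hardware atomic instruction instead:

```python
import threading

class SpinLock:
    """Educational spinlock sketch: busy-wait on an emulated test-and-set.
    A real implementation would use one hardware atomic instruction where
    this class uses the _guard lock."""

    def __init__(self):
        self._locked = False
        self._guard = threading.Lock()  # stands in for the atomic instruction

    def _test_and_set(self):
        # Atomically read the old value and set the flag to True.
        with self._guard:
            old = self._locked
            self._locked = True
            return old

    def acquire(self):
        while self._test_and_set():     # spin (busy-wait) until we observe False
            pass

    def release(self):
        self._locked = False

counter = 0
lock = SpinLock()

def worker():
    global counter
    for _ in range(5000):
        lock.acquire()
        counter += 1                    # critical section
        lock.release()

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)                          # 10000: no increments were lost
```

The spin in acquire() is exactly the busy waiting the article describes: the waiting thread stays runnable and burns CPU until the holder releases the lock, which is why spinlocks suit only short critical sections.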
https://en.wikipedia.org/wiki/DiProDB
DiProDB is a database designed to collect and analyse thermodynamic, structural and other dinucleotide properties. See also Protein database (disambiguation)
https://en.wikipedia.org/wiki/Index%20of%20information%20theory%20articles
This is a list of information theory topics. A Mathematical Theory of Communication algorithmic information theory arithmetic coding channel capacity Communication Theory of Secrecy Systems conditional entropy conditional quantum entropy confusion and diffusion cross-entropy data compression entropic uncertainty (Hirschman uncertainty) entropy encoding entropy (information theory) Fisher information Hick's law Huffman coding information bottleneck method information-theoretic security information theory joint entropy Kullback–Leibler divergence lossless compression negentropy noisy-channel coding theorem (Shannon's theorem) principle of maximum entropy quantum information science range encoding redundancy (information theory) Rényi entropy self-information Shannon–Hartley theorem
https://en.wikipedia.org/wiki/Sommerfeld%20effect
In mechanics, the Sommerfeld effect is a phenomenon arising from feedback in the energy exchange between vibrating systems: for example, for a motor mounted on a rocking table, under given conditions, energy transmitted to the motor results not in higher revolutions but in stronger vibrations of the table. It is named after Arnold Sommerfeld. In 1902, A. Sommerfeld analyzed the vibrations caused by a motor driving an unbalanced weight and wrote that "This experiment corresponds roughly to the case in which a factory owner has a machine set on a poor foundation running at 30 horsepower. He achieves an effective level of just 1/3, however, because only 10 horsepower are doing useful work, while 20 horsepower are transferred to the foundational masonry". The first mathematical descriptions of the Sommerfeld effect were given by I. Blekhman and V. Kononenko. Hidden attractors in the Sommerfeld effect In the theory of hidden oscillations, the Sommerfeld effect is explained by multistability: the phase space of a dynamical model without stationary states contains two coexisting hidden attractors, one of which attracts trajectories from the vicinity of zero initial data (corresponding to the typical start-up of the motor), while the other corresponds to the desired mode of operation with a higher frequency of rotation. Depending on the model under consideration, the coexisting hidden attractors may be either periodic or chaotic; such dynamical models with the Sommerfeld effect are the earliest known mechanical example of a system without equilibria and with hidden attractors. For example, the Sommerfeld effect with hidden attractors can be observed in dynamic models of drilling rigs, where the electric motor may excite torsional vibrations of the drill.
https://en.wikipedia.org/wiki/RTLinux
RTLinux is a hard real-time operating system (RTOS) microkernel that runs the entire Linux operating system as a fully preemptive process. The hard real-time property makes it possible to control robots, data acquisition systems, manufacturing plants, and other time-sensitive instruments and machines from RTLinux applications. The design was patented. Despite the similar name, it is not related to the Real-Time Linux project of the Linux Foundation. RTLinux was developed by Victor Yodaiken, Michael Barabanov, Cort Dougan and others at the New Mexico Institute of Mining and Technology and then as a commercial product at FSMLabs. Wind River Systems acquired FSMLabs embedded technology in February 2007 and made a version available as Wind River Real-Time Core for Wind River Linux. As of August 2011, Wind River has discontinued the Wind River Real-Time Core product line, effectively ending commercial support for the RTLinux product. Background The key RTLinux design objective was to add hard real-time capabilities to a commodity operating system to facilitate the development of complex control programs with both capabilities. For example, one might want to develop a real-time motor controller that used a commodity database and exported a web operator interface. Instead of attempting to build a single operating system that could support real-time and non-real-time capabilities, RTLinux was designed to share a computing device between a real-time and a non-real-time operating system, so that (1) the real-time operating system could never be blocked from execution by the non-real-time operating system and (2) components running in the two different environments could easily share data. As the name implies, RTLinux was originally designed to use Linux as the non-real-time system, but it eventually evolved so that the RTCore real-time kernel could run with either Linux or Berkeley Software Distribution (BSD) Unix. Multi-Environment Real-Time (MERT) was the first exam
https://en.wikipedia.org/wiki/Cohn%27s%20theorem
In mathematics, Cohn's theorem states that an nth-degree self-inversive polynomial has as many roots in the open unit disk as the reciprocal polynomial of its derivative. Cohn's theorem is useful for studying the distribution of the roots of self-inversive and self-reciprocal polynomials in the complex plane. An nth-degree polynomial p(z) = a_0 + a_1 z + \cdots + a_n z^n is called self-inversive if there exists a fixed complex number \omega of modulus 1 (|\omega| = 1) so that p(z) = \omega p^*(z), where p^*(z) = z^n \overline{p(1/\bar{z})} = \bar{a}_n + \bar{a}_{n-1} z + \cdots + \bar{a}_0 z^n is the reciprocal polynomial associated with p(z) and the bar means complex conjugation. Self-inversive polynomials have many interesting properties. For instance, their roots are all symmetric with respect to the unit circle, and a polynomial whose roots are all on the unit circle is necessarily self-inversive. The coefficients of self-inversive polynomials satisfy the relations a_k = \omega \bar{a}_{n-k}, 0 \le k \le n. In the case where \omega = 1, a self-inversive polynomial becomes a complex-reciprocal polynomial (also known as a self-conjugate polynomial). If its coefficients are real then it becomes a real self-reciprocal polynomial. The formal derivative of p(z) is an (n − 1)th-degree polynomial given by q(z) = p'(z) = a_1 + 2 a_2 z + \cdots + n a_n z^{n-1}. Therefore, Cohn's theorem states that both p(z) and the polynomial q^*(z) = z^{n-1} \overline{q(1/\bar{z})} = z^{n-1} \overline{p'(1/\bar{z})} have the same number of roots in |z| < 1. See also Geometrical properties of polynomial roots
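The theorem can be illustrated numerically with NumPy. The example polynomial below is an arbitrary choice: its roots are paired under the inversion z → 1/conj(z) (0.5 with 2.0, and 0.4j with 2.5j), which makes it self-inversive (here with ω = −1):

```python
import numpy as np

# A self-inversive polynomial: roots come in pairs reflected through the
# unit circle, so exactly 2 of the 4 roots lie in the open unit disk.
roots = [0.5, 2.0, 0.4j, 2.5j]
p = np.poly(roots)                  # coefficients, highest degree first

def reciprocal(coeffs):
    # q*(z) = z^deg * conj(q(1/conj(z))): reverse and conjugate the coefficients
    return np.conj(coeffs[::-1])

def roots_in_unit_disk(coeffs):
    return int(sum(abs(r) < 1 for r in np.roots(coeffs)))

dp = np.polyder(p)                  # the derivative p', of degree n - 1
print(roots_in_unit_disk(p), roots_in_unit_disk(reciprocal(dp)))
# Cohn's theorem: both counts agree (here, 2)
```

Equivalently, since the roots of the reciprocal of p' are the inverses 1/conj(w) of the roots w of p', the count equals the number of critical points of p lying outside the closed unit disk.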
https://en.wikipedia.org/wiki/Synthetic%20ion%20channels
Synthetic ion channels are de novo chemical compounds that insert into lipid bilayers, form pores, and allow ions to flow from one side to the other. They are man-made analogues of natural ion channels, and are thus also known as artificial ion channels. Compared to biological channels, they usually allow fluxes of similar magnitude but are minuscule in size (less than 5 kDa vs. more than 100 kDa), diverse in molecular architecture, and may rely on diverse supramolecular interactions to pre-form the active, conducting structures. Synthetic channels, like natural channels, are usually characterized by a combination of single-molecule (e.g., voltage-clamp of planar bilayers) and ensemble techniques (flux in vesicles). The study of synthetic ion channels can potentially lead to new single-molecule sensing technologies as well as new therapeutics. History While semi-synthetic ion channels, often based on modified peptidic channels like gramicidin, had been prepared since the 1970s, the first attempt to prepare a synthetic ion channel was made in 1982 using a substituted β-cyclodextrin. Inspired by gramicidin, this molecule was designed to be a barrel-shaped entity spanning a single leaflet of a bilayer membrane, becoming "active" only when two molecules in opposite leaflets come together in an end-to-end fashion. While the compound does induce ion-fluxes in vesicles, the data does not unambiguously show channel formation (as opposed to other transport mechanisms; see Mechanism). Na+ transport by such channels was first reported by two groups of investigators in 1989–1990. With the adoption of the voltage-clamp technique in synthetic channel research in the early 1990s, researchers were able to observe quantized electrical activities from synthetic molecules, often considered the signature evidence for ion channels. This led to a sustained increase in research activity over the next two decades. In 2009, over 25 peer-reviewed papers were published on the topic
https://en.wikipedia.org/wiki/John%20Armstrong%20%28model%20railroader%29
John H. Armstrong (November 18, 1920 – July 28, 2004) was a mechanical engineer, inventor, editor, prolific author, and model railroader best known for layout design and operations. He was married for 44 years to Ellen Palmer. They had four children. Early life He was born and raised in Canandaigua, New York, and began designing his Canandaigua Southern Railroad model layout when he was 14 years old. After earning a mechanical engineering degree from Purdue University, he settled in Silver Spring, Maryland in the late 1940s. He was employed at the Naval Ordnance Laboratory of the United States Navy in White Oak, Maryland and contributed to the design of weapons systems for nuclear submarines. Following his retirement from the Navy, he was a contributing editor for Railway Age magazine for ten years. Model railroad construction In evenings and on weekends he began building his Canandaigua Southern Railroad O scale layout in the basement of the modest Armstrong family home, carefully cutting the cross-ties from balsa wood, setting them on rail-beds made from scale-sized gravel, and then laying out each length of track and carefully nailing it into place with tiny railroad spikes to scale that were hammered into the cross-ties one at a time. Armstrong was extensively published in the U.S. railroading press, publishing 13 books including "Railroad: What It Is, What It Does" (1978) and articles for Model Railroader and trade magazine Railway Age. He has over 300 references in the Trains Magazine Index. Influence on the hobby Armstrong wrote many books and articles on the subject of railroading and model railroading. His personal model railroad, the Canandaigua Southern, was the subject of many newspaper and magazine articles by other writers. In the late 1940s, Armstrong submitted a track plan to a contest sponsored by the magazine Model Railroader. His plan was so successful that it led to an invitation to contribute an article to the magazine on the Canandaig
https://en.wikipedia.org/wiki/Creatures%202
Creatures 2 is the second game in the Creatures artificial life game series made by Creature Labs, and the sequel to the 1996 game Creatures. It features three species: the cute, dependent Norns, the cantankerous Grendels and the industrious Ettins. The game tries to simulate life, and includes a complex two-dimensional ecology of plants, animals and insects, which provide the environment for the three main species to live and develop in. The player interacts with the world using a hand-shaped cursor, and tries to encourage the creatures' development by manipulating various objects around the world, guiding the creatures using the cursor and encouraging the creatures to speak. Gameplay features new to Creatures 2 include a new physics model and a global weather system, along with brand-new applets and a world twice the size of the Creatures 1 world. The executable file for the game was in fact an interpreter for its scripting language, thus allowing users to make total conversions or derivative works from the game. Gameplay Like the other games in the series, Creatures 2 is mostly open-ended, with no predetermined goals, allowing the player to raise Norns at their own pace. In each new world, the player begins in the incubator cavern area. The hatchery, one of the game's in-built applets, allows the player to add Norn eggs to the world, which can be hatched via the incubator. Norns may also be downloaded from the internet and imported for use in the game. Once Norns reach adolescence at around an hour old, they are ready to breed, and female Norns will begin an oestrogen cycle. When Norns mate, there is a long kissing sound with a pop at the end – known as a kisspop. As in real life, not all mating results in a pregnancy. The Norn reproductive process can be monitored via the Breeder's Kit. At the beginning, the player is only able to navigate a small area of Albia. As their Norns explore the world, however, more
https://en.wikipedia.org/wiki/Resin%20%28software%29
Resin is a web server and Java application server from Caucho Technology. In addition to Resin (GPL), Resin Pro is available for enterprise and production environments with a license. Resin supports the Java EE standard as well as a mod_php/PHP-like engine called Quercus. While Resin (GPL) is free for use in production, Resin Pro includes optimizations such as:
- built-in caching
- public, private, or hybrid clustering
- advanced administration
- health system
- HTTP session replication
- distributed cache replication
- auto-recovery & diagnostic reports
Although a Java-based server, key pieces of Resin's core networking are written in highly optimized C. Caucho states Java is the layer that allows Resin to be "full featured" while C provides the speed. Resin, which was released in 1999, predates Apache Tomcat, and is one of the most mature application servers and web servers. Product features Resin Pro has been engineered to include:
- Dynamic Clustering: Locking was replaced with non-locking atomic operations, cleared contention bottlenecks, improved the async/epoll performance, and reduced thread overhead to handle 100,000 requests per second.
- Cloud Support: Elastic cluster members can be added or removed using a single command. Cluster topology, load balancing, caching, messaging and management automatically adapt to dynamic servers.
- Compiled PHP on the JVM: Improves performance, scalability and security of PHP applications by allowing PHP code to directly call Java objects.
- Security through OpenSSL integration: A comprehensive security framework for application authentication, authorization and transport-level SSL-based security.
- Smart Software Load Balancer: Application load is shared among resources automatically to balance them.
- Proxy cache: Faster application performance is possible with Java caching by saving the results of long calculations and reducing database load and application response time.
Scalability Elastic Clustering / Cloud support 3rd gene
https://en.wikipedia.org/wiki/Bohr%20Festival
The Bohr Festival () was a series of seven lectures given by Niels Bohr from 12 to 22 June 1922 at the Institute of Theoretical Physics in Göttingen. These were the Wolfskehl Lectures, funded by the Wolfskehl Foundation. Taking place in the fortnight leading up to the Göttingen International Handel Festival, it became known as the Bohr Festival. In 1991, Friedrich Hund suggested that James Franck was responsible for the comparison. In the lectures Bohr outlined the current development of the Bohr-Sommerfeld theory, remarking "how incomplete and uncertain everything still is".
https://en.wikipedia.org/wiki/Spatial%20distribution
A spatial distribution in statistics is the arrangement of a phenomenon across the Earth's surface, and a graphical display of such an arrangement is an important tool in geographical and environmental statistics. A graphical display of a spatial distribution may summarize raw data directly or may reflect the outcome of a more sophisticated data analysis. Many different aspects of a phenomenon can be shown in a single graphical display by using a suitable choice of different colours to represent differences. One example of such a display could be observations made to describe the geographic patterns of features, both physical and human, across the earth. The information included could be where units of something are, how many units of the thing there are per unit of area, and how sparsely or densely packed they are. Patterns of spatial distribution Usually, for a phenomenon that changes in space, there is a pattern that determines the location of the subject of the phenomenon and its intensity or size, in X and Y coordinates. The scientific challenge is trying to identify the variables that affect this pattern. The issue can be demonstrated with several simple examples: The spatial distribution of the human population The spatial distribution of the population and development are closely related to each other, especially in the context of sustainability. The challenges related to the spatial spread of a population include: rapid urbanization and population concentration, rural population, urban management and poverty housing, displaced persons and refugees. Migration is a basic element in the spatial distribution of a population, and it may remain a key driver in the coming decades, especially as an element of urbanization in developing countries. The spatial distribution of economic activity in the world In a pair of studies from Brown University by urban economist J. Vernon Henderson, with co-authors Adam Storeygard and David Weil, the spati
https://en.wikipedia.org/wiki/Polycentric%20networks
In public policy, a polycentric network is a group of distinct local, regional, or national entities that work co-operatively towards a common goal. Proponents claim that such networks can better adapt to changing issues collectively than individually, thus providing network participants better results from relevant efforts. Urban contexts Robert Kloosterman and Bart Lambregts define polycentric urban regions as collections of historically distinct jurisdictions that are administratively and politically independent. These jurisdictions are in close proximity and well connected through infrastructure. The literature on polycentric urban regions is limited and unconsolidated, so diverse concepts exist. Evert Meijers claimed that polycentric networks are especially prominent in Europe. Rural polycentric networks are nearly non-existent. Urban polycentric networks draw heavily on economic network theories. According to Meijers, “individual cities in these collections of distinct but proximally-located cities relate to each other in a synergetic way, making the whole network of cities more than the sum of its parts”. Implementation Polycentric networks have different spatial characteristics, reflecting a micro, meso, or macro-level of connections in a given region. These different scales allow flexible and convertible networks for spatial planning in complex regions and systems. Micro-level: intra-urban or intra-regional aspects within a certain city region. The emphasis at this level is “urban functional and economic complementarities” which make “cooperation and improved links” major engines of regional economic performance and “promote integrated spatial development strategies for city clusters”. Meso-level: inter-metropolitan issues within a delimited area. The emphasis at this level is very similar to the micro-level, with added specialization. Macro-level: inter-metropolitan issues on a continental or global scale. At the macro level polycentricism is con
https://en.wikipedia.org/wiki/WUHF
WUHF (channel 31) is a television station in Rochester, New York, United States, affiliated with the Fox network. It is owned by Sinclair Broadcast Group, which provides certain services to dual ABC/CW affiliate WHAM-TV (channel 13) under a local marketing agreement (LMA) with Deerfield Media. Both stations share studios on West Henrietta Road (NY 15) in Henrietta (with a Rochester mailing address), while WUHF's transmitter is located on Pinnacle Hill on the border between Rochester and Brighton. History WUHF began operations on January 27, 1980, as a general entertainment independent station running cartoons, sitcoms (classic and recent), movies, drama series, and religious programs. It was, at the time, the only independent station in the Rochester market. The station was owned by Malrite and the General Manager was Jerry Carr who was the former The Weather Outside personality. Apparently, by sheer coincidence, the station re-used a call sign which was previously used by a different and unrelated station which operated on the same channel 31, albeit in New York City. The latter station had only used the WUHF calls for its first year of experimental operation (1961–62); it is now Ion Television owned-and-operated station WPXN-TV. In 1983, former underground cartoonist Brian Bram produced and hosted All Night Live, a program aired live from midnight to 7 a.m. on Fridays and Saturdays. Bram's show was a showcase for regional bands including Personal Effects, Cousin Al and the Relatives, and The Degrads. On October 9, 1986, WUHF became a charter affiliate of Fox for Rochester and was branded as "Fox 31". Most of the religious shows were gone by then. However, WUHF was initially still programmed as an independent station since Fox would only air one program, The Late Show Starring Joan Rivers until April 1987, and even then, would not present an entire week's worth of programming until 1993. In 1989, Act III Broadcasting bought the station from Malrite Communication
https://en.wikipedia.org/wiki/G-strain
In EPR spectroscopy, g-strain refers to broadening of g-values caused by small sample inhomogeneities, such as slight variations in the orientation of the paramagnetic centers. The phenomenon is indicated by broadening of the g-values that depends on the frequency of the spectrometer, such as X- or Q-band. If the line width were determined only by hyperfine couplings (which are field-independent), then the line widths would also be field-independent, but they often are not. In iron-sulfur proteins, some other metalloproteins, and some solids, g-strain can be substantial.
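The frequency dependence described above follows from the resonance condition B = hν/(gμB): a spread Δg in g maps to a field spread ΔB = (hν/μB)·Δg/g², which grows linearly with spectrometer frequency. The sketch below uses illustrative numbers (g ≈ 2, Δg = 0.01, not from the article) to compare X-band and Q-band.

```python
# Illustrative numbers, not from the article: a g ~ 2 center with a
# g-spread of 0.01, observed at X-band (~9.5 GHz) and Q-band (~34 GHz).
H = 6.62607015e-34       # Planck constant, J*s
MU_B = 9.2740100783e-24  # Bohr magneton, J/T

def field_width_mT(nu_hz, g=2.0, delta_g=0.01):
    # Resonance field B = h*nu / (g*mu_B); a spread delta_g in g gives
    # a field spread dB = (h*nu / mu_B) * delta_g / g**2, here in mT.
    return (H * nu_hz / MU_B) * delta_g / g**2 * 1e3

for nu in (9.5e9, 34e9):  # X-band, Q-band
    print(f"{nu/1e9:4.1f} GHz: g-strain linewidth ~ {field_width_mT(nu):.2f} mT")
```

The linewidth scales with ν (here roughly 1.7 mT at X-band vs. 6.1 mT at Q-band), whereas a purely hyperfine-broadened line would have the same width at both frequencies.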
https://en.wikipedia.org/wiki/Fibre%20Channel%20electrical%20interface
The Fibre Channel electrical interface is one of two related Fibre Channel standards that can be used to physically interconnect computer devices. The other standard is a Fibre Channel optical interface, which is not covered in this article. Fibre Channel signal characteristics Fibre channel electrical signals are sent over a duplex differential interface. This usually consists of twisted-pair cables with a nominal impedance of 75 ohms (single-ended) or 150 ohms (differential). This is a genuine differential signalling system, so no ground reference is carried through the cable, except for the shield. Signalling is AC-coupled, with the series capacitors located at the transmitter end of the link. The definition of the Fibre Channel signalling voltage is complex. Eye-diagrams are defined for both the transmitter and receiver. There are many eye-diagram parameters which must all be met to be compliant with the standard. In simple terms, the transmitter circuit must output a signal with a minimum of 600 mV peak-to-peak differential, maximum 2000 mV peak-to-peak differential. A good signal looks rather like a sine-wave with a fundamental frequency of half the data rate, so 1 GHz for a typical system running at 2 gigabits per second. The Bit-Error Rate (BER) objective for Fibre Channel systems is 1 in 10¹² (1 bit in 1,000,000,000,000 bits). At 2 Gbit/s this equates to seven errors per hour. Errors are therefore a routine event, and the receiver circuitry must contain error-handling logic. In order to achieve such a low error-rate, jitter "budgets" are defined for the transmitter and cables. Fibre Channel connector pinouts There are various Fibre Channel connectors in use in the computer industry. Details of their pinouts are distributed between different official documents. The following sections describe the most common Fibre Channel pinouts with some comments about the purpose of their electrical signals. The most familiar Fibre Channel connectors are
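The errors-per-hour figure follows directly from the line rate and the BER objective; a minimal sketch:

```python
# Expected bit errors per hour at a given line rate and bit-error rate.
def errors_per_hour(bit_rate_bps, ber):
    return bit_rate_bps * 3600 * ber

# 2 Gbit/s link at the Fibre Channel BER objective of 1e-12:
print(errors_per_hour(2e9, 1e-12))  # 7.2 errors per hour
```

At 2 Gbit/s a link carries 7.2 × 10¹² bits per hour, so a BER of 10⁻¹² leaves roughly seven expected errors per hour, consistent with the figure above.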
https://en.wikipedia.org/wiki/Zilch%20%28company%29
Zilch is a London-based BNPL payment platform. Co-founded in 2018 by Philip Belamant and Sean O’Connor, it was publicly launched in September 2020. As of 2023, Zilch has over 3 million users. In April 2021 Zilch announced a valuation of over $500 million. Shortly after in July 2021, the company announced a funding deal partnership with Goldman Sachs and DMG Ventures. In October 2021 Zilch reported an overall valuation of $2 billion, making it the fastest company in Europe to reach double unicorn status. In April 2022 Zilch and Experian announced a partnership to begin reciprocal reporting of credit information in BNPL. Zilch extended its funding in June 2022, bringing the total amount to $193 million. Zilch partnered with StepChange in February 2023, working to implement StepChange Direct's service directly into its platform.
https://en.wikipedia.org/wiki/Spread-spectrum%20time-domain%20reflectometry
Spread-spectrum time-domain reflectometry (SSTDR) is a measurement technique to identify faults, usually in electrical wires, by observing reflected spread spectrum signals. This type of time-domain reflectometry can be used in various high-noise and live environments. SSTDR systems also have the benefit of being able to precisely locate the fault. Specifically, SSTDR is accurate to within a few centimeters for wires carrying 400 Hz aircraft signals as well as MIL-STD-1553 data bus signals. An SSTDR system can be run on a live wire because the spread spectrum signals can be isolated from the system noise and activity. At the most basic level, the system works by sending spread spectrum signals down a wireline and waiting for those signals to be reflected back to the SSTDR system. The reflected signal is then correlated with a copy of the sent signal. Mathematical algorithms are applied to both the shape and timing of the signals to locate either the short or the end of an open circuit. Detecting intermittent faults in live wires Spread-spectrum time domain reflectometry is used in detecting intermittent faults in live wires. From buildings and homes to aircraft and naval ships, this technology can discover irregular shorts on live wire running 400 Hz, 115 V. For accurate location of a wiring system's fault, the SSTDR correlates the PN code with the signal on the line, then stores the exact location of the correlation before the arc dissipates. Present SSTDR systems can collect a complete data set in under 5 ms. SSTDR technology allows for analysis of a network of wires. One SSTDR sensor can measure up to 4 junctions in a branched wire system. See also Spread spectrum Time-domain reflectometry
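The core correlation step can be sketched in a few lines: the received waveform is cross-correlated with a stored copy of the transmitted pseudo-noise (PN) code, and the lag of the correlation peak gives the round-trip delay to the reflection, hence the distance to the fault. This toy simulation (idealized signal model of my own, not from the article) shows the peak surviving heavy noise, which is why the method works on live wires.

```python
import numpy as np

rng = np.random.default_rng(0)

# Transmit a pseudo-noise (PN) code of +/-1 chips down the line.
pn = rng.choice([-1.0, 1.0], size=127)

# Simulate a reflection: the code returns attenuated, delayed by
# 40 samples (the round-trip time to the fault), buried in noise.
delay = 40
rx = np.zeros(512)
rx[delay:delay + pn.size] += 0.5 * pn        # reflected copy
rx += 0.5 * rng.standard_normal(rx.size)     # live-wire noise and activity

# Cross-correlate the received signal with the stored PN copy; the
# peak lag recovers the delay, and therefore the distance to the fault.
corr = np.correlate(rx, pn, mode="valid")
print(int(np.argmax(corr)))                  # 40
```

The correlation gain is the code length (127 here), which is what lets the reflected code be pulled out of noise of comparable amplitude.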
https://en.wikipedia.org/wiki/Combustion%20models%20for%20CFD
Combustion models for CFD refers to combustion models for computational fluid dynamics. Combustion is defined as a chemical reaction in which a hydrocarbon fuel reacts with an oxidant to form products, accompanied with the release of energy in the form of heat. As an integral part of engineering applications such as internal combustion engines, aircraft engines, rocket engines, furnaces, and power station combustors, combustion must be addressed throughout the design, analysis and performance-characterization stages of these applications. Given the added complexity of chemical kinetics and of establishing a reacting flow mixture environment, proper modeling physics has to be incorporated during computational fluid dynamic (CFD) simulations of combustion. Hence the following discussion presents a general outline of the various adequate models incorporated with the Computational fluid dynamic code to model the process of combustion. Overview Computational fluid dynamics modeling of combustion calls upon the proper selection and implementation of a model suitable to faithfully represent the complex physical and chemical phenomena associated with any combustion process. The model should be competent enough to deliver information on the species concentrations, their volumetric generation or destruction rates, and changes in system parameters such as enthalpy, temperature and mixture density. The model should be capable of solving the general transport equations for fluid flow and heat transfer as well as the additional equations of combustion chemistry and chemical kinetics required by the desired simulation environment. Critical considerations in combustion phenomena The major considerations during any general combustion process include the mixing time scale and the reacting time scale of the process. The flame type and the type of mixing of flow streams of the constituents also have to be taken into a
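The reacting time scale mentioned above is typically set by Arrhenius kinetics, and its ratio to the mixing time scale (the Damköhler number) drives model selection. The sketch below uses hypothetical single-step parameters (A, Ea, and the mixing time are illustrative values of my own, not from any particular model) to show how the comparison is made.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_k(A, Ea, T):
    # Rate constant k = A * exp(-Ea / (R*T)).
    return A * math.exp(-Ea / (R * T))

# Hypothetical single-step fuel-oxidation parameters:
A, Ea = 1.1e10, 1.5e5          # pre-exponential (1/s), activation energy (J/mol)
tau_mix = 1e-3                 # assumed turbulent mixing time scale, s

for T in (800.0, 1200.0, 1600.0):
    k = arrhenius_k(A, Ea, T)
    tau_chem = 1.0 / k          # reacting (chemical) time scale ~ 1/k
    Da = tau_mix / tau_chem     # Damkohler number: Da >> 1 -> mixing-limited
    print(f"T = {T:6.0f} K   k = {k:10.3e} 1/s   Da = {Da:9.3e}")
```

With these numbers the chemistry is slow (Da << 1, kinetics-limited) at 800 K and fast (Da >> 1, mixing-limited) at 1600 K, which is the kind of regime distinction a combustion model must respect.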
https://en.wikipedia.org/wiki/Import%20ratio
Import ratio, in economics and government finance, is the ratio of total imports of a country to that country’s total foreign exchange (FX) reserves. The ratio can be inverted and is referred to as the reserves to imports ratio. This ratio divides a country's average foreign exchange reserve by a country's average monthly level of imports. Relation to sovereign risk Credit restructuring is made more likely by a higher amount of imports relative to FX reserves. A less developed country will pay for imports with its foreign exchange reserves. The more it imports, the faster these reserves are used up. Since satisfying a country's needs is considered more important than repaying foreign creditors, the more a country imports relative to its foreign exchange reserves, the greater the probability of debt rescheduling.
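The inverted (reserves-to-imports) ratio is often quoted as "months of import cover". A toy computation with hypothetical figures (the reserve and import values below are invented for illustration):

```python
# Hypothetical figures, both in billions of USD:
reserves = 60.0          # average FX reserves
monthly_imports = 12.0   # average monthly imports

import_ratio = (12 * monthly_imports) / reserves  # annual imports / reserves
months_of_cover = reserves / monthly_imports      # inverted ratio

print(import_ratio)     # 2.4 - higher means greater rescheduling risk
print(months_of_cover)  # 5.0 months of import cover
```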
https://en.wikipedia.org/wiki/WTTO
WTTO (channel 21) is a television station licensed to Homewood, Alabama, United States, serving the Birmingham area as an affiliate of The CW. It is owned by Sinclair Broadcast Group alongside MyNetworkTV affiliate WABM (channel 68) and ABC affiliate WBMA-LD (channel 58). The stations share studios at the Riverchase office park on Concourse Parkway in Hoover (with a Birmingham mailing address), while WTTO's transmitter is located atop Red Mountain, near the Goldencrest neighborhood of southwestern Birmingham. In Tuscaloosa, west Alabama, and the western portions of the Birmingham area, WTTO's CW channel and two subchannels of WBMA-LD are rebroadcast on WDBB (channel 17), which is licensed to Bessemer. It is owned by Cunningham Broadcasting and managed by Sinclair under a local marketing agreement (LMA); however, Sinclair effectively owns WDBB, as the majority of Cunningham's stock is owned by the family of deceased group founder Julian Smith. WTTO had a tortuous history prior to starting operations. It took nearly two decades for the station to be approved and built. Once on air, the station was a successful independent for the Birmingham area. It served as the Fox affiliate for the market from 1990 to 1996, when an affiliation shuffle resulted in the loss of the affiliation. History Early history of UHF channel 21 in central Alabama The UHF channel 21 allocation in Central Alabama was originally allocated to Gadsden. The first television station in the region to occupy the allocation was WTVS, which operated during the 1950s as an affiliate of the DuMont Television Network, and was one of the earliest UHF television stations in the United States. However, it was never able to gain a viewership foothold against the region's other stations; its owners ceased the operations of WTVS in 1957, as it had suffered from severely limited viewership due to the lack of television sets in Central Alabama that were capable of receiving stations on the UHF band (electronics
https://en.wikipedia.org/wiki/NP-completeness
In computational complexity theory, a problem is NP-complete when: It is a decision problem, meaning that for any input to the problem, the output is either "yes" or "no". When the answer is "yes", this can be demonstrated through the existence of a short (polynomial length) solution. The correctness of each solution can be verified quickly (namely, in polynomial time) and a brute-force search algorithm can find a solution by trying all possible solutions. The problem can be used to simulate every other problem for which we can verify quickly that a solution is correct. In this sense, NP-complete problems are the hardest of the problems to which solutions can be verified quickly. If we could find solutions of some NP-complete problem quickly, we could quickly find the solutions of every other problem to which a given solution can be easily verified. The name "NP-complete" is short for "nondeterministic polynomial-time complete". In this name, "nondeterministic" refers to nondeterministic Turing machines, a way of mathematically formalizing the idea of a brute-force search algorithm. Polynomial time refers to an amount of time that is considered "quick" for a deterministic algorithm to check a single solution, or for a nondeterministic Turing machine to perform the whole search. "Complete" refers to the property of being able to simulate everything in the same complexity class. More precisely, each input to the problem should be associated with a set of solutions of polynomial length, whose validity can be tested quickly (in polynomial time), such that the output for any input is "yes" if the solution set is non-empty and "no" if it is empty. The complexity class of problems of this form is called NP, an abbreviation for "nondeterministic polynomial time". A problem is said to be NP-hard if everything in NP can be transformed in polynomial time into it even though it may not be in NP. Conversely, a problem is NP-complete if it is both in NP and NP-hard. The NP
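The verify-quickly/search-exhaustively asymmetry described above can be made concrete with Boolean satisfiability, the canonical NP-complete problem. In this sketch (a small hypothetical 3-variable instance, not from the article), checking one candidate assignment takes time linear in the formula size, while the brute-force search standing in for the nondeterministic machine tries all 2^n assignments.

```python
from itertools import product

# CNF formula as a list of clauses; a literal +v means variable v,
# -v means its negation. Hypothetical instance:
# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
cnf = [(1, -2), (2, 3), (-1, -3)]

def verify(cnf, assignment):
    """Polynomial-time check that an assignment satisfies the formula."""
    return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in cnf)

def brute_force(cnf, n):
    """Exponential search: try all 2^n assignments."""
    for bits in product([False, True], repeat=n):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        if verify(cnf, assignment):
            return assignment
    return None

print(brute_force(cnf, 3))
```

Verifying one assignment touches each literal once; finding one without extra insight costs up to 2^n calls to the verifier, which is exactly the gap the P versus NP question asks about.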
https://en.wikipedia.org/wiki/Partial%20discharge
In electrical engineering, partial discharge (PD) is a localized dielectric breakdown (DB) (which does not completely bridge the space between the two conductors) of a small portion of a solid or fluid electrical insulation (EI) system under high voltage (HV) stress. While a corona discharge (CD) is usually revealed by a relatively steady glow or brush discharge (BD) in air, partial discharges within a solid insulation system are not visible. PD can occur in a gaseous, liquid, or solid insulating medium. It often starts within gas voids, such as voids in solid epoxy insulation or bubbles in transformer oil. Protracted partial discharge can erode solid insulation and eventually lead to breakdown of insulation. Discharge mechanism PD usually begins within voids, cracks, or inclusions within a solid dielectric, at conductor-dielectric interfaces within solid or liquid dielectrics, or in bubbles within liquid dielectrics. Since PDs are limited to only a portion of the insulation, the discharges only partially bridge the distance between electrodes. PD can also occur along the boundary between different insulating materials. Partial discharges within an insulating material are usually initiated within gas-filled voids within the dielectric. Because the dielectric constant of the void is considerably less than that of the surrounding dielectric, the electric field across the void is significantly higher than that across an equivalent distance of dielectric. If the voltage stress across the void is increased above the corona inception voltage (CIV) for the gas within the void, PD activity will start within the void. PD can also occur along the surface of solid insulating materials if the surface tangential electric field is high enough to cause a breakdown along the insulator surface. This phenomenon commonly manifests itself on overhead line insulators, particularly on contaminated insulators during days of high humidity. Overhead lines use air as their insulation medium.
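The field enhancement in a void can be estimated from continuity of electric displacement: for a thin void whose faces are normal to the applied field, E_void ≈ ε_r(solid) × E_solid. A sketch with illustrative numbers (the epoxy permittivity and stress level below are typical values assumed for illustration, not from the article):

```python
# Thin gas-filled void with faces normal to the applied field:
# continuity of D gives eps_void * E_void = eps_solid * E_solid,
# so E_void = (eps_r_solid / eps_r_void) * E_solid.
def void_field(e_solid_kv_per_mm, eps_r_solid, eps_r_void=1.0):
    return (eps_r_solid / eps_r_void) * e_solid_kv_per_mm

# Illustrative: epoxy (relative permittivity ~3.6) stressed at 3 kV/mm.
e_void = void_field(3.0, 3.6)
print(e_void)  # 10.8 kV/mm in the void, well above air's ~3 kV/mm strength
```

The void field comfortably exceeds the breakdown strength of the gas inside, which is why discharges ignite in voids long before the bulk dielectric fails.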
https://en.wikipedia.org/wiki/Huntington%27s%20disease
Huntington's disease (HD), also known as Huntington's chorea, is an incurable neurodegenerative disease that is mostly inherited. The earliest symptoms are often subtle problems with mood or mental/psychiatric abilities. A general lack of coordination and an unsteady gait often follow. It is also a basal ganglia disease causing a hyperkinetic movement disorder known as chorea. As the disease advances, uncoordinated, involuntary body movements of chorea become more apparent. Physical abilities gradually worsen until coordinated movement becomes difficult and the person is unable to talk. Mental abilities generally decline into dementia, depression, apathy, and impulsivity at times. The specific symptoms vary somewhat between people. Symptoms usually begin between 30 and 50 years of age, most often around 40, but can start at any age. The disease may develop earlier in each successive generation. About eight percent of cases start before the age of 20 years, and are known as juvenile HD, which typically present with the slow movement symptoms of Parkinson's disease rather than those of chorea. HD is typically inherited from an affected parent, who carries a mutation in the huntingtin gene (HTT). However, up to 10% of cases are due to a new mutation. The huntingtin gene provides the genetic information for huntingtin protein (Htt). Expansion of CAG (cytosine-adenine-guanine) repeats, known as a trinucleotide repeat expansion, in the gene coding for the huntingtin protein results in an abnormal mutant protein (mHtt), which gradually damages brain cells through a number of possible mechanisms. Diagnosis is by genetic testing, which can be carried out at any time, regardless of whether or not symptoms are present. This fact raises several ethical debates: the age at which an individual is considered mature enough to choose testing; whether parents have the right to have their children tested; and managing confidentiality and disclosure of test resul
https://en.wikipedia.org/wiki/Comparison%20of%20programming%20languages%20%28string%20functions%29
String functions are used in computer programming languages to manipulate a string or query information about a string (some do both). Most programming languages that have a string datatype will have some string functions although there may be other low-level ways within each language to handle strings directly. In object-oriented languages, string functions are often implemented as properties and methods of string objects. In functional and list-based languages a string is represented as a list (of character codes), therefore all list-manipulation procedures could be considered string functions. However such languages may implement a subset of explicit string-specific functions as well. For functions that manipulate strings, modern object-oriented languages, like C# and Java, have immutable strings and return a copy (in newly allocated dynamic memory), while others, like C, manipulate the original string unless the programmer copies data to a new string. See for example Concatenation below. The most basic example of a string function is the length(string) function. This function returns the number of characters in a string; e.g. length("hello world") would return 11. Other languages may have string functions with similar or exactly the same syntax or parameters or outcomes. For example, in many languages the length function is usually represented as len(string). The below list of common functions aims to help limit this confusion. Common string functions (multi language reference) String functions common to many languages are listed below, including the different names used. The below list of common functions aims to help programmers find the equivalent function in a language. Note, string concatenation and regular expressions are handled in separate pages. Statements in guillemets (« … ») are optional. CharAt
{ Example in Pascal }
var
  MyStr: string = 'Hello, World';
  MyChar: Char;
begin
  MyChar := MyStr[2]; // 'e'
# Example in ALGOL 68 #
"Hello, W
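For comparison with the Pascal snippet, the same length and CharAt behavior in Python (added here as an extra example, not part of the article's own table):

```python
# Example in Python
s = "Hello, World"
print(len("hello world"))  # 11
print(s[1])                # 'e' - Python indexes from 0, Pascal from 1
```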
https://en.wikipedia.org/wiki/Material%20flow%20management
Material flow management (MFM) is a method of efficiently managing materials. The triad of environmental, social, and economic orientation makes MFM a tool of high importance in the field of sustainable development and the circular economy. Historically, material flow management can be understood as an implementation-oriented advancement of the methodology of material flow analysis (MFA). MFM was established as a policy tool after the UN Earth Summit conference in Rio de Janeiro in 1992. The German Bundestag outlined the targets and specific goals of MFM in a special report by an Enquete Commission.

See also

Material flow accounting
https://en.wikipedia.org/wiki/Mark%20D.%20McDonnell
Mark Damian McDonnell (born 28 February 1975) is an electronic engineer and mathematician, notable for his work on stochastic resonance and, more specifically, suprathreshold stochastic resonance.

Education

McDonnell graduated from the Salesian College, Adelaide. He received a BSc in Mathematical & Computer Sciences (1997), a BE (Hons) in Electrical & Electronic Engineering (1998), and a BSc (Hons) in Applied Mathematics (2001), all from the University of Adelaide, Australia. He received his PhD in Electrical & Electronic Engineering (2006), under Derek Abbott and Charles E. M. Pearce, also from the University of Adelaide, for a thesis entitled Theoretical Aspects of Stochastic Signal Quantisation and Suprathreshold Stochastic Resonance. During the course of his PhD, he was also a visiting scholar at the University of Warwick, UK, under Nigel G. Stocks.

Career

McDonnell worked as a research assistant in electromagnetic propagation and ice-penetrating radar, and as a computer systems engineer, at the University of Adelaide. His main research interests are in the field of nonlinear signal processing, with applications in computational neuroscience, complex systems, lossy compression, reliable communication, and coding of noisy signals.

Honors

In 2002, McDonnell was awarded a D. R. Stranks Fellowship, and in 2003 he was awarded a Santa Fe Institute Complex Systems Fellowship, as well as the AFUW Doreen MacCarthy Bursary. In 2004 he was the recipient of an Australian Academy of Science Young Researcher's Award. He was awarded the Postgraduate Alumni University Medal for his PhD thesis. In 2007, he won a Fresh Science award, the Gertrude Rohan Prize, and an Australian Postdoctoral Fellowship that he took up at the University of South Australia.

Books by McDonnell

Mark D. McDonnell, Nigel G. Stocks, Charles E. M. Pearce, and Derek Abbott, Stochastic Resonance, Cambridge University Press, 2008.

See also

Stochastic resonance
Suprathreshold stochastic resonance
St
https://en.wikipedia.org/wiki/Data%20virtualization
Data virtualization is an approach to data management that allows an application to retrieve and manipulate data without requiring technical details about the data, such as how it is formatted at source or where it is physically located, and it can provide a single customer view (or single view of any other entity) of the overall data. Unlike the traditional extract, transform, load ("ETL") process, the data remains in place, and real-time access is given to the source system for the data. This reduces the risk of data errors, avoids the workload of moving around data that may never be used, and does not attempt to impose a single data model on the data (an example of heterogeneous data is a federated database system). The technology also supports the writing of transaction data updates back to the source systems. To resolve differences in source and consumer formats and semantics, various abstraction and transformation techniques are used. This concept and software is a subset of data integration and is commonly used within business intelligence, service-oriented architecture data services, cloud computing, enterprise search, and master data management.

Applications, benefits and drawbacks

The defining feature of data virtualization is that the data used remains in its original locations and real-time access is established to allow analytics across multiple sources. This aids in resolving some technical difficulties, such as compatibility problems when combining data from various platforms, lowering the risk of error caused by faulty data, and guaranteeing that the newest data is used. Furthermore, avoiding the creation of a new database containing personal information can make it easier to comply with privacy regulations. As a result, data virtualization creates new possibilities for data use. However, with data virtualization, the connection to all necessary data sources must be operational as there is no local copy of the data, which is one of the main drawbacks
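The core idea described above (data stays in the source systems, a virtual layer fetches it on demand and maps per-source formats onto one consumer schema) can be sketched as follows. This is a hedged, minimal illustration: the source names (crm_source, erp_source) and the function virtual_customer_view are hypothetical, standing in for real source adapters in a data-virtualization product.

```python
# Two mock source systems with different native formats; in a real deployment
# these would be live connections (e.g. a CRM database and an ERP API).
crm_source = [{"CustomerName": "Ada", "Country": "UK"}]
erp_source = [("Ada", "ada@example.com")]

def virtual_customer_view(name):
    """Resolve a single customer view on demand, without materializing a copy.

    Each call delegates to the source systems at access time (the
    'real-time access' described above) and applies per-source
    transformations to produce one consumer schema.
    """
    record = {}
    for row in crm_source:                       # real-time access to source 1
        if row["CustomerName"] == name:
            record["name"] = row["CustomerName"] # transform: rename field
            record["country"] = row["Country"]
    for cust, email in erp_source:               # real-time access to source 2
        if cust == name:
            record["email"] = email              # transform: tuple -> field
    return record

print(virtual_customer_view("Ada"))
# {'name': 'Ada', 'country': 'UK', 'email': 'ada@example.com'}
```

The drawback noted above is also visible in the sketch: if either source list were unreachable, the view could not be resolved, because no local copy exists to fall back on.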
https://en.wikipedia.org/wiki/Double%20harmonic%20scale
The double harmonic major scale is a musical scale with a flattened second and sixth degree. This is also known as Mayamalavagowla, Bhairav Raga, Byzantine scale, Arabic (Hijaz Kar), and Gypsy major. It can be likened to a gypsy scale because of the diminished step between the 1st and 2nd degrees. Arabic scale may also refer to any Arabic mode, the simplest of which, however, to Westerners, resembles the double harmonic major scale.

Details

The sequence of steps comprising the double harmonic scale is:

half, augmented second, half, whole, half, augmented second, half

Or, in relation to the tonic note:

minor second, major third, perfect fourth and fifth, minor sixth, major seventh, octave

However, this scale is commonly represented with the first and last half step each being represented with quarter tones. The non-quarter-tone form is identical, in terms of notes, to the North Indian Thaat named Bhairav and the South Indian (Carnatic) Melakarta named Mayamalavagowla.

The double harmonic scale is arrived at by either:

lowering both the second and sixth of the Ionian mode by a semitone;
lowering the second note and raising the third note of the harmonic minor scale by one semitone;
raising the seventh of the Phrygian dominant scale (a mode of the harmonic minor scale) by a semitone (the Phrygian dominant in turn is produced by raising the third of the diatonic Phrygian mode, a mode of the major scale, by a semitone);
raising the third of the Neapolitan minor scale by a semitone;
lowering the second note of the harmonic major scale by a semitone;
combining the lower half of the Phrygian dominant scale with the upper half of the harmonic minor scale.

It is referred to as the "double harmonic" scale because it contains two harmonic tetrads featuring augmented seconds. By contrast, both the harmonic major and harmonic minor scales contain only one augmented second, located between their sixth and seventh degrees. The scale contains a built-in tritone substitution, a domina
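The step sequence given above (half, augmented second, half, whole, half, augmented second, half) can be sketched in code by counting semitones: half = 1, augmented second = 3, whole = 2. Starting on C, this yields the degrees listed above (minor second Db, major third E, perfect fourth F and fifth G, minor sixth Ab, major seventh B, octave). The enharmonic note spellings below are a simplification; a proper notation engine would choose flats or sharps by scale degree.

```python
# Derive the double harmonic major scale from its step pattern.
# Steps in semitones: half = 1, augmented second = 3, whole = 2.
STEPS = [1, 3, 1, 2, 1, 3, 1]  # half, aug 2nd, half, whole, half, aug 2nd, half

# Simplified pitch-class names (flat spellings chosen to match the C scale).
NOTE_NAMES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def double_harmonic(tonic=0):
    """Return the pitch classes of the double harmonic scale on the given tonic."""
    pcs = [tonic]
    for step in STEPS:
        pcs.append((pcs[-1] + step) % 12)  # walk up the step pattern, mod octave
    return pcs

scale = double_harmonic(0)  # starting on C
print([NOTE_NAMES[p] for p in scale])
# ['C', 'Db', 'E', 'F', 'G', 'Ab', 'B', 'C']
```

Note that the steps sum to 12 semitones, confirming the pattern spans exactly one octave, and that the two augmented seconds (Db–E and Ab–B) are the "double harmonic" tetrad intervals described above.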