https://en.wikipedia.org/wiki/Zero%20field%20NMR
Zero- to ultralow-field (ZULF) NMR is the acquisition of nuclear magnetic resonance (NMR) spectra of chemicals with magnetically active nuclei (spins 1/2 and greater) in an environment carefully screened from magnetic fields (including the Earth's field). ZULF NMR experiments typically involve the use of passive or active shielding to attenuate Earth's magnetic field. This is in contrast to the majority of NMR experiments, which are performed in high magnetic fields provided by superconducting magnets. In ZULF experiments the dominant interactions are nuclear spin-spin couplings, and the coupling between spins and the external magnetic field is a perturbation to this. There are a number of advantages to operating in this regime: magnetic-susceptibility-induced line broadening is attenuated, which reduces inhomogeneous broadening of the spectral lines for samples in heterogeneous environments. Another advantage is that the low-frequency signals readily pass through conductive materials such as metals due to the increased skin depth; this is not the case for high-field NMR, for which the sample containers are usually made of glass, quartz or ceramic. High-field NMR employs inductive detectors to pick up the radiofrequency signals, but this would be inefficient in ZULF NMR experiments, since the signal frequencies are typically much lower (on the order of hertz to kilohertz). The development of highly sensitive magnetic sensors in the early 2000s, including SQUIDs, magnetoresistive sensors, and SERF atomic magnetometers, made it possible to detect NMR signals directly in the ZULF regime. Earlier ZULF NMR experiments relied on indirect detection, in which the sample had to be shuttled from the shielded ZULF environment into a high magnetic field for detection with a conventional inductive pick-up coil. One successful implementation used atomic magnetometers with rubidium vapor cells operating at zero magnetic field to detect zero-field NMR. Without a large magnetic fie
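The skin-depth argument above can be made concrete with a quick calculation. A minimal sketch (the copper resistivity and the two example frequencies below are illustrative choices, not values from the text):

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m


def skin_depth(resistivity, frequency, mu_r=1.0):
    """Classical skin depth: delta = sqrt(2 * rho / (omega * mu))."""
    omega = 2 * math.pi * frequency
    return math.sqrt(2 * resistivity / (omega * MU_0 * mu_r))


# Copper, resistivity ~1.68e-8 ohm*m (illustrative literature value).
rho_cu = 1.68e-8
print(skin_depth(rho_cu, 100))    # millimetre scale at 100 Hz (ZULF regime)
print(skin_depth(rho_cu, 500e6))  # micrometre scale at 500 MHz (high field)
```

The few-millimetre penetration at hertz-to-kilohertz frequencies, versus micrometres at typical high-field NMR frequencies, is what allows ZULF signals to pass through metal sample containers.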
https://en.wikipedia.org/wiki/I%27ve%20Got%20My%20Love%20to%20Keep%20Me%20Warm
"I've Got My Love to Keep Me Warm" is a popular song copyrighted in 1937 by its composer, Irving Berlin, and first recorded by (i) Ray Noble (January 5, 1937), Howard Barrie, vocalist; (ii) Red Norvo (January 8, 1937), Mildred Bailey, vocalist; (iii) and Billie Holiday with her orchestra (January 12, 1937). The song – sung by Dick Powell and Alice Faye – debuted on film February 12, 1937, in the musical, On the Avenue. Background The Noble, Norvo, and film renditions were successful that year, as well as the other 1937 recordings that included Billie Holiday and Glen Gray (vocal by Kenny Sargent). Les Brown's instrumental version, arranged by Skip Martin and recorded in 1946 as Columbia #38324, became a million-seller and Billboard top ten song in 1949. Brown said that he got a call from Columbia Records after he performed the song telling him to record it, only to respond that he had recorded it three years earlier. That same year, vocal group The Mills Brothers also had a chart hit with their version on Decca #24550. Other recordings Although not strictly a Christmas song as the lyrics make no mention of the holiday, it has been recorded for many artists' Christmas albums and is a standard part of the holiday song repertoire in the U.S. Artists such as Rosemary Clooney, Doris Day, Dean Martin, Bette Midler, Rod Stewart, Dinah Washington and Idina Menzel (in a duet with Billy Porter) are among those who have covered it. Ella Fitzgerald recorded this for her 1958 Verve release Ella Fitzgerald Sings the Irving Berlin Songbook. During the Big Band era, the song was also recorded by several leading "sweet jazz bands" including Shep Fields and his Rippling Rhythm Orchestra in 1937. Bibliography Annotations Notes
https://en.wikipedia.org/wiki/Kin%20recognition
Kin recognition, also called kin detection, is an organism's ability to distinguish between close genetic kin and non-kin. In evolutionary biology and psychology, such an ability is presumed to have evolved for inbreeding avoidance, though animals do not typically avoid inbreeding. An additional adaptive function sometimes posited for kin recognition is a role in kin selection. There is debate over this, since in strict theoretical terms kin recognition is not necessary for kin selection or the cooperation associated with it. Rather, social behaviour can emerge by kin selection in the demographic conditions of 'viscous populations', with organisms interacting in their natal context, without active kin discrimination, since social participants by default typically share recent common origin. Since kin selection theory emerged, much research has investigated the possible role of kin recognition mechanisms in mediating altruism. Taken as a whole, this research suggests that active powers of recognition play a negligible role in mediating social cooperation relative to less elaborate cue-based and context-based mechanisms, such as familiarity, imprinting and phenotype matching. Because cue-based 'recognition' predominates in social mammals, outcomes are non-deterministic in relation to actual genetic kinship; instead, outcomes simply correlate reliably with genetic kinship in an organism's typical conditions. A well-known human example of an inbreeding avoidance mechanism is the Westermarck effect, in which unrelated individuals who happen to spend their childhood in the same household find each other sexually unattractive. Similarly, due to the cue-based mechanisms that mediate social bonding and cooperation, unrelated individuals who grow up together in this way are also likely to demonstrate strong social and emotional ties, and enduring altruism. Theoretical background The English evolutionary biologist W. D. Hamilton's theory of inclusive fitness, a
https://en.wikipedia.org/wiki/Pearl%20barley
Pearl barley, or pearled barley, is barley that has been processed to remove its fibrous outer hull and polished to remove some or all of the bran layer. It is the most common form of barley for human consumption because it cooks faster and is less chewy than other, less-processed forms of the grain such as "hulled barley" (or "barley groats", also known as "pot barley" and "Scotch barley"). Fine barley flour is prepared from milled pearl barley. Pearl barley is similar to wheat in its caloric, protein, vitamin and mineral content, though some varieties are higher in lysine. It is used mainly in soups, stews, and potages. It is the primary ingredient of the Italian dish orzotto and one of the main ingredients of the Jewish dish cholent and the Polish soup krupnik.
https://en.wikipedia.org/wiki/Ulnar%20carpal%20collateral%20ligament
The ulnar collateral ligament (internal lateral ligament, ulnar carpal collateral ligament or ulnar collateral ligament of the wrist joint) is a rounded cord, attached above to the end of the styloid process of the ulna, and dividing below into two fasciculi, one of which is attached to the medial side of the triquetral bone, the other to the pisiform and flexor retinaculum.
https://en.wikipedia.org/wiki/Montia%20fontana
Montia fontana, commonly known as blinks or water blinks, water chickweed or annual water miner's lettuce, is a herbaceous annual plant of the genus Montia. It is a common plant that can be found in wet environments around the globe, from the tropics to the Arctic. It is quite variable in morphology, taking a variety of forms, and is sometimes aquatic. Montia fontana is divided into four subspecies: subsp. fontana, subsp. amporitana, subsp. chondrosperma and subsp. variabilis. In some countries, such as Spain, it is eaten as a salad and is in high demand in markets and restaurants specializing in foraged foods such as wild mushrooms.
https://en.wikipedia.org/wiki/D37D
The D37D Minuteman III flight computer was initially supplied with the LGM-30G missile, as part of the NS-20 navigation system. The NS-20 D37D flight computer is a miniaturized general purpose (serial transmission) digital computer. The new NS-50 missile guidance computer (MGC) is built around a 16-bit high-speed microprocessor chip set. They are both designed to solve real-time positional error problems under the adverse conditions encountered in airborne weapon systems. They accept and process data and generate steering signals with sufficient accuracy and speed to meet the requirements of the inertial guidance and flight control systems of the Minuteman ICBMs. Computer operation is controlled by an internally stored program which is loaded from a magnetic tape cartridge at the launch facility (LF). Both the D37D computer and the MGC are designed and programmed to control the Minuteman III missile throughout the powered portion of flight. After thrust termination they also control the PBV for the reentry vehicle (RV) deployment phase. In addition, they control the alignment of the inertial platform and test/monitor the guidance & control (G&C) system and other components to determine continued readiness while missiles are in alert status. The D37D computer began to be replaced by the MGC in 2000 as part of the Guidance Replacement Program (GRP), with fielding planned through 2008. The MGC incorporates the amplifier assembly functions. When a launch is commanded, a complete retesting of the G&C system is made prior to entering the flight program. During flight, the computer uses missile attitude, change of attitude rate, and velocity signal inputs to solve a series of guidance, steering, and control equations. It also generates missile steering commands and controls staging and thrust termination. Finally, the computer determines whether or not to provide pre-arm signals to the warhead. The pre-arm decision is based on flight safety checks made during powered fli
https://en.wikipedia.org/wiki/4-PPBP
4-PPBP is a neuroprotective cyclic amine which binds to sigma receptors. 4-PPBP decreases neuronal nitric oxide synthase (nNOS) activity and ischemia-evoked nitric oxide (NO) production. Its neuroprotective effect involves the prevention of ischemia-induced intracellular Ca2+ dysregulation. 4-PPBP protects neurons through a mechanism that activates the transcription factor cyclic adenosine monophosphate response element-binding protein (CREB). 4-PPBP-associated neuroprotection increases expression of Bcl-2, which is regulated by CREB. See also 3-PPP
https://en.wikipedia.org/wiki/Sigma-2%20receptor
The sigma-2 receptor (σ2R) is a sigma receptor subtype that has attracted attention due to its involvement in diseases such as cancer and neurological diseases. It is currently under investigation for its potential diagnostic and therapeutic uses. Although the sigma-2 receptor was identified as a separate pharmacological entity from the sigma-1 receptor in 1990, the gene that codes for the receptor was identified as TMEM97 only in 2017. TMEM97 was shown to regulate the cholesterol transporter NPC1 and to be involved in cholesterol homeostasis. The sigma-2 receptor is a four-pass transmembrane protein located in the endoplasmic reticulum. It has been found to play a role in hormone signaling and calcium signaling, in neuronal signaling, in cell proliferation and death, and in binding of antipsychotics. Classification The sigma-2 receptor is located in the lipid raft. It is found in several areas of the brain, including high densities in the cerebellum, motor cortex, hippocampus, and substantia nigra. It is also highly expressed in the lungs, liver, and kidneys. Function The sigma-2 receptor takes part in a number of normal-function roles, including cell proliferation and both neuronal and non-neuronal signaling. Much of sigma-2 receptor function relies on signaling cascades. The receptor's interactions with the EGFR and PGRMC1 proteins allow sigma-2 receptors to play diverse roles within the cell through Ras, PLC, and PI3K signaling. Non-neuronal signaling Binding of a number of hormones and steroids, including testosterone, progesterone, and cholesterol, has been found to occur with sigma-2 receptors, though in some cases with lower affinity than to the sigma-1 receptor. Signaling caused by this binding is thought to occur via a calcium secondary messenger and calcium-dependent phosphorylation, and in association with sphingolipids following endoplasmic reticulum release of calcium. Known effects include decrease of expression of effectors in th
https://en.wikipedia.org/wiki/PB-28
PB-28 is an agonist of the sigma-2 receptor. It is derived from cyclohexylpiperazine.
https://en.wikipedia.org/wiki/Shib%20Narayan%20Das
Shib Narayan Das (born 16 December 1946) is the first craftsman of the first national flag of Bangladesh. Designing the national flag of Bangladesh As desired by the then NUCLEUS (Shadhin Bangla Biplobi Parishad), using a matchstick and yellow paint he marked the map of Bangladesh over the red circle of the proposed flag in room 108 of the then Iqbal Hall of Dhaka University (DU), from midnight on 6 June until dawn on 7 June 1970. A. S. M. Abdur Rab, Shahjahan Siraj, Kazi Aref Ahmed, Manirul Islam, Swapan Kumar Choudhury, Quamrul Alam Khan (Khasru), Hasanul Haq Inu, Yousuf Salahuddin Ahmed and some others were also involved and present at the time. On behalf of the NUCLEUS, the same flag was unfurled by Abdur Rab, along with AFM Mahbubul Haq, Das and other student league leaders, from the rooftop of the western porch of the Arts Faculty of DU on March 2, 1971, shortly before the outbreak of the Bangladesh Liberation War. After the independence of Bangladesh in 1971, the map was removed from the flag because of the difficulty of rendering it correctly on both sides of the flag. Artist Quamrul Hassan was asked to report on the design, color, shape and explanation. On 17 January 1972, the new design was made the official national flag of Bangladesh.
https://en.wikipedia.org/wiki/Inferior%20ligament%20of%20epididymis
The inferior ligament of the epididymis is a strand of fibrous tissue which is covered by a reflection of the tunica vaginalis and connects the lower aspect of the epididymis with the testis.
https://en.wikipedia.org/wiki/Superior%20ligament%20of%20epididymis
The superior ligament of the epididymis is a strand of fibrous tissue which is covered by a reflection of the tunica vaginalis and connects the upper aspect of the epididymis with the testis.
https://en.wikipedia.org/wiki/Obelisk%20posture
The obelisk posture is a handstand-like position that some dragonflies and damselflies assume to prevent overheating on sunny days. The abdomen is raised until its tip points at the sun, minimizing the surface area exposed to solar radiation. When the sun is close to directly overhead, the vertical alignment of the insect's body suggests an obelisk. Function and occurrence Dragonflies may also raise their abdomens for other reasons. For instance, male blue dashers (Pachydiplax longipennis) assume an obelisk-like posture while guarding their territories or during conflicts with other males, displaying the blue pruinescence on their abdomens to best advantage. However, both females and males will raise their abdomens at high temperature and lower them again if shaded. This behavior can be demonstrated in the laboratory by heating captive blue dashers with a 250-watt lamp, and has been shown to be effective in stopping or slowing the rise in their body temperature. The obelisk posture has been observed in about 30 species in the demoiselle, clubtail, and skimmer families. All are "perchers": sit-and-wait predators that fly up from a perch to take prey and perch again to eat it. Since they spend most of their time stationary, perchers have the most opportunity to thermoregulate by adjusting their position. Other forms of postural thermoregulation Some species, including the dragonhunter (Hagenius brevistylus), reduce exposure to the sun by perching with the abdomen pointed downward, rather than upward. The tropical skimmer dragonfly Diastatops intensa, whose wings are mostly black, points its wings rather than its abdomen at the sun, apparently to reduce the heat they absorb. While flying, some saddlebags gliders (genus Tramea) lower their abdomens into the shade provided by dark patches at the bases of their hindwings. The same behavior has been observed in Pseudothemis zonata, which has a similar hindwing patch. Dragonflies also use postural thermoregul
https://en.wikipedia.org/wiki/Genome-wide%20association%20study
In genomics, a genome-wide association study (GWA study, or GWAS) is an observational study of a genome-wide set of genetic variants in different individuals to see if any variant is associated with a trait. GWA studies typically focus on associations between single-nucleotide polymorphisms (SNPs) and traits like major human diseases, but can equally be applied to any other genetic variants and any other organisms. When applied to human data, GWA studies compare the DNA of participants having varying phenotypes for a particular trait or disease. These participants may be people with a disease (cases) and similar people without the disease (controls), or they may be people with different phenotypes for a particular trait, for example blood pressure. This approach is known as phenotype-first, in which the participants are classified first by their clinical manifestation(s), as opposed to genotype-first. Each person gives a sample of DNA, from which millions of genetic variants are read using SNP arrays. If there is significant statistical evidence that one type of the variant (one allele) is more frequent in people with the disease, the variant is said to be associated with the disease. The associated SNPs are then considered to mark a region of the human genome that may influence the risk of disease. GWA studies investigate the entire genome, in contrast to methods that specifically test a small number of pre-specified genetic regions. Hence, GWAS is a non-candidate-driven approach, in contrast to gene-specific candidate-driven studies. GWA studies identify SNPs and other variants in DNA associated with a disease, but they cannot on their own specify which genes are causal. The first successful GWAS, published in 2002, studied myocardial infarction. This study design was then implemented in the landmark 2005 GWA study investigating patients with age-related macular degeneration, which found two SNPs with significantly altered allele frequency compared to healthy cont
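The case-control allele comparison described above amounts to a contingency-table test at each variant. A minimal sketch for a single SNP, using made-up allele counts (not data from any real study):

```python
from scipy.stats import chi2_contingency

# Hypothetical allele counts at one SNP (rows: cases, controls;
# columns: allele A, allele a). These numbers are invented for illustration.
counts = [[1200, 800],    # cases
          [1000, 1000]]   # controls

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```

A small p-value indicates that allele A is more frequent in cases than expected by chance. In a real GWA study this test is repeated across millions of variants, so the per-SNP significance threshold must be far stricter than the conventional 0.05 to account for multiple testing.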
https://en.wikipedia.org/wiki/Polymerase%20cycling%20assembly
Polymerase cycling assembly (or PCA, also known as assembly PCR) is a method for the assembly of large DNA oligonucleotides from shorter fragments. The process uses the same technology as PCR, but takes advantage of DNA hybridization and annealing as well as DNA polymerase to amplify a complete sequence of DNA in a precise order based on the single-stranded oligonucleotides used in the process. It thus allows for the production of synthetic genes and even entire synthetic genomes. PCA principles Much like how primers are designed such that there is a forward primer and a reverse primer capable of allowing DNA polymerase to fill in the entire template sequence, PCA uses the same technology but with multiple oligonucleotides. While in PCR the customary size of oligonucleotides used is 18 base pairs, in PCA lengths of up to 50 are used to ensure uniqueness and correct hybridization. Each oligonucleotide is designed to be part of either the top or the bottom strand of the target sequence. As well as the basic requirement of being able to tile the entire target sequence, these oligonucleotides must also have the usual properties: similar melting temperatures, freedom from hairpins, and moderate GC content, to avoid the same complications as in PCR. During the polymerase cycles, the oligonucleotides anneal to complementary fragments and are then filled in by polymerase. Each cycle thus increases the length of various fragments randomly, depending on which oligonucleotides find each other. It is critical that there is complementarity between all the fragments in some way, or a final complete sequence will not be produced, as polymerase requires a template to follow. After this initial construction phase, additional primers encompassing both ends are added to perform a regular PCR reaction, amplifying the target sequence away from all the shorter incomplete fragments. A gel purification can then be used to identify and isolate the complete sequence. Typical reaction A typical reacti
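The tiling requirement above can be sketched in code. This is a simplified illustration with a hypothetical target sequence; a real design would also balance melting temperatures, GC content, and hairpin avoidance, which are ignored here:

```python
# Simplified sketch of PCA oligo design: tile a target sequence with
# alternating top- and bottom-strand oligos whose ends overlap so that
# adjacent oligos can anneal to each other during assembly.

def revcomp(seq):
    """Reverse complement of a DNA string."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]


def tile_oligos(target, step=20, overlap=10):
    """Alternate top- and bottom-strand oligos along `target`;
    adjacent oligos share `overlap` bases."""
    oligos, i, top = [], 0, True
    while i < len(target):
        frag = target[i:i + step + overlap]
        oligos.append(frag if top else revcomp(frag))
        top = not top
        i += step
    return oligos


# Hypothetical 60-base target, invented for illustration.
target = "ATGGCTAGCTAGGATCCTTAAGGCGCGCCAATTCGATCGATCGGTACCATGGTTAATTAA"
oligos = tile_oligos(target)
# The 3' end of the first (top-strand) oligo is complementary to the
# second (bottom-strand) oligo over the shared overlap region:
assert oligos[0][-10:] == revcomp(oligos[1])[:10]
```

The assertion checks exactly the property the text calls critical: every fragment must be complementary to its neighbor so that polymerase has a template to extend.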
https://en.wikipedia.org/wiki/Supersymmetry%20algebras%20in%201%20%2B%201%20dimensions
A two-dimensional Minkowski space, i.e. a flat space with one time and one spatial dimension, has the two-dimensional Poincaré group IO(1,1) as its symmetry group. The respective Lie algebra is called the Poincaré algebra. It is possible to extend this algebra to a supersymmetry algebra, which is a $\mathbb{Z}_2$-graded Lie superalgebra. The most common ways to do this are discussed below.

$\mathcal{N} = (2,2)$ algebra

Let the Lie algebra of IO(1,1) be generated by the following generators: $H$ is the generator of the time translation, $P$ is the generator of the space translation, and $M$ is the generator of Lorentz boosts. For the commutators between these generators, see Poincaré algebra. The $\mathcal{N} = (2,2)$ supersymmetry algebra over this space is a supersymmetric extension of this Lie algebra with the four additional generators (supercharges) $Q_+$, $Q_-$, $\bar{Q}_+$, $\bar{Q}_-$, which are odd elements of the Lie superalgebra. Under Lorentz transformations the generators $Q_+$ and $\bar{Q}_+$ transform as left-handed Weyl spinors, while $Q_-$ and $\bar{Q}_-$ transform as right-handed Weyl spinors. The algebra is given by the Poincaré algebra plus

$Q_+^2 = Q_-^2 = \bar{Q}_+^2 = \bar{Q}_-^2 = 0$,
$\{Q_\pm, \bar{Q}_\pm\} = 2(H \pm P)$,
$\{Q_+, Q_-\} = Z$, $\{\bar{Q}_+, \bar{Q}_-\} = Z^*$,
$\{Q_+, \bar{Q}_-\} = \tilde{Z}$, $\{\bar{Q}_+, Q_-\} = \tilde{Z}^*$,
$[M, Q_\pm] = \mp Q_\pm$, $[M, \bar{Q}_\pm] = \mp \bar{Q}_\pm$,

where all remaining commutators vanish, and $Z$ and $\tilde{Z}$ are complex central charges. The supercharges are related via $\bar{Q}_\pm = Q_\pm^\dagger$. $H$, $P$, and $M$ are Hermitian.

Subalgebras of the $\mathcal{N} = (2,2)$ algebra

The $\mathcal{N} = (0,2)$ and $\mathcal{N} = (2,0)$ subalgebras: The $\mathcal{N} = (0,2)$ subalgebra is obtained from the $\mathcal{N} = (2,2)$ algebra by removing the generators $Q_-$ and $\bar{Q}_-$. Thus its anti-commutation relations are given by $Q_+^2 = \bar{Q}_+^2 = 0$ and $\{Q_+, \bar{Q}_+\} = 2(H + P)$, plus the commutation relations above that do not involve $Q_-$ or $\bar{Q}_-$. Both generators are left-handed Weyl spinors. Similarly, the $\mathcal{N} = (2,0)$ subalgebra is obtained by removing $Q_+$ and $\bar{Q}_+$ and fulfills $Q_-^2 = \bar{Q}_-^2 = 0$ and $\{Q_-, \bar{Q}_-\} = 2(H - P)$. Both supercharge generators are right-handed.

The $\mathcal{N} = (1,1)$ subalgebra: The $\mathcal{N} = (1,1)$ subalgebra is generated by two generators $Q_+^1$ and $Q_-^1$ given by $Q_\pm^1 = e^{i\nu_\pm} Q_\pm + e^{-i\nu_\pm} \bar{Q}_\pm$ for two real numbers $\nu_+$ and $\nu_-$. By definition, both supercharges are real, i.e. $(Q_\pm^1)^\dagger = Q_\pm^1$. They transform as Majorana–Weyl spinors under Lorentz transformations. Their anti-commutation relations are given by $(Q_\pm^1)^2 = 2(H \pm P)$ and $\{Q_+^1, Q_-^1\} = Z^1$, where $Z^1$ is a real central charge.

The $\mathcal{N} = (0,1)$ and $\mathcal{N} = (1,0)$ subalgebras: These algebras can be obtained from the $\mathcal{N} = (1,1)$ subalgebra by removing $Q_-^1$ resp. $Q_+^1$ fr
https://en.wikipedia.org/wiki/Stefan%27s%20formula
In thermodynamics, Stefan's formula says that the specific surface energy at a given interface is determined by the respective enthalpy difference $\Delta H$: $\sigma = \beta\,\Delta H / (N_A^{1/3} V_m^{2/3})$, where $\sigma$ is the specific surface energy, $N_A$ is the Avogadro constant, $\beta$ is a steric dimensionless coefficient, and $V_m$ is the molar volume.
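A minimal sketch of the arithmetic behind the formula, using illustrative values (the enthalpy, molar volume, and steric coefficient below are assumptions chosen for demonstration, not values from the text):

```python
# Evaluate sigma = beta * dH / (N_A**(1/3) * V_m**(2/3)) for roughly
# water-like numbers: dH ~ 40.65 kJ/mol, V_m ~ 18 mL/mol, and an
# assumed steric coefficient beta = 0.1 (purely illustrative).
N_A = 6.022e23        # Avogadro constant, 1/mol
dH = 40.65e3          # enthalpy difference, J/mol (illustrative)
V_m = 18e-6           # molar volume, m^3/mol (illustrative)
beta = 0.1            # steric coefficient (assumed)

sigma = beta * dH / (N_A ** (1 / 3) * V_m ** (2 / 3))
print(f"sigma ~ {sigma:.3f} J/m^2")
```

Note that $N_A^{1/3} V_m^{2/3}$ carries units of mol$^{-1/3}$ (m$^3$/mol)$^{2/3}$, so dividing a molar enthalpy (J/mol) by it yields J/m$^2$, as a surface energy must.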
https://en.wikipedia.org/wiki/Center%20%28category%20theory%29
In category theory, a branch of mathematics, the center (or Drinfeld center, after Soviet-American mathematician Vladimir Drinfeld) is a variant of the notion of the center of a monoid, group, or ring to a category. Definition The center of a monoidal category $\mathcal{C}$, denoted $\mathcal{Z}(\mathcal{C})$, is the category whose objects are pairs (A,u) consisting of an object A of $\mathcal{C}$ and a natural isomorphism $u_X : A \otimes X \to X \otimes A$ satisfying $u_{X \otimes Y} = (1_X \otimes u_Y) \circ (u_X \otimes 1_Y)$ and $u_1 = 1_A$ (this is actually a consequence of the first axiom). An arrow from (A,u) to (B,v) in $\mathcal{Z}(\mathcal{C})$ consists of an arrow $f : A \to B$ in $\mathcal{C}$ such that $(1_X \otimes f) \circ u_X = v_X \circ (f \otimes 1_X)$ for all X. This definition of the center appears in . Equivalently, the center may be defined as $\mathcal{Z}(\mathcal{C}) = \operatorname{End}_{\mathcal{C} \otimes \mathcal{C}^{op}}(\mathcal{C})$, i.e., the endofunctors of $\mathcal{C}$ which are compatible with the left and right action of $\mathcal{C}$ on itself given by the tensor product. Braiding The category $\mathcal{Z}(\mathcal{C})$ becomes a braided monoidal category with the tensor product on objects defined as $(A,u) \otimes (B,v) = (A \otimes B, w)$, where $w_X = (u_X \otimes 1_B) \circ (1_A \otimes v_X)$, and the obvious braiding. Higher categorical version The categorical center is particularly useful in the context of higher categories. This is illustrated by the following example: the center of the (abelian) category of R-modules, for a commutative ring R, is the category of R-modules again. The center of a monoidal ∞-category C can be defined, analogously to the above, as $\mathcal{Z}(C) = \operatorname{End}_{C \otimes C^{op}}(C)$. Now, in contrast to the above, the center of the derived category of R-modules (regarded as an ∞-category) is given by the derived category of modules over the cochain complex encoding the Hochschild cohomology, a complex whose degree 0 term is R (as in the abelian situation above), but includes higher terms given by derived Hom. The notion of a center in this generality is developed by . Extending the above-mentioned braiding on the center of an ordinary monoidal category, the center of a monoidal ∞-category becomes an $E_2$-monoidal category. More generally, the center of an $E_k$-monoidal category is an algebra object in $E_k$-monoidal categories and therefore, by Dunn additivity, an $E_{k+1}$-monoidal category. Examples has shown that the Drinfeld center of the category of sheaves
https://en.wikipedia.org/wiki/Desmond%20Paul%20Henry
Desmond Paul Henry (1921–2004) was a Manchester University Lecturer and Reader in Philosophy (1949–82). He was one of the first British artists to experiment with machine-generated visual effects at the time of the emerging global computer art movement of the 1960s (The Cambridge Encyclopaedia 1990 p. 289; Levy 2006 pp. 178–180). During this period, Henry constructed a succession of three electro-mechanical drawing machines from modified bombsight analogue computers, which were employed in World War II bombers to calculate the accurate release of bombs onto their targets (O'Hanrahan 2005). Henry's machine-generated effects resemble complex versions of the abstract, curvilinear graphics which accompany Microsoft's Windows Media Player. Henry's machine-generated effects may therefore also be said to represent early examples of computer graphics: "the making of line drawings with the aid of computers and drawing machines" (Franke 1971, p. 41). During the 1970s Henry focused on developing his Cameraless Photography experiments. He went on to make a fourth and a fifth drawing machine in 1984 and 2002 respectively. These later machines, however, were based on a mechanical pendulum design and not bombsight computers (O'Hanrahan 2005). Artistic career It was thanks to artist L. S. Lowry, working in collaboration with the then director of Salford Art Gallery, A. Frape, that Henry's artistic career was launched in 1961, when he beat a thousand contestants to win a local art competition at Salford Art Gallery entitled "London Opportunity". The picture that won Henry this prize was one based on his own photo-chemical technique, and not a machine drawing. The prize for winning this competition was a one-man show in London at the Reid Gallery. Lowry knew how crucial such a London show could be in bringing an artist to public attention. As one of the competition judges, Lowry visited Henry's home in Burford Drive, Manchester, to view his range of artistic work. (O'Hanra
https://en.wikipedia.org/wiki/Remote%20diagnostics
Remote diagnostics is the act of diagnosing a given symptom, issue or problem from a distance. Instead of the subject being co-located with the person or system performing the diagnostics, in remote diagnostics the two can be separated by physical distance (e.g., Earth-Moon). Important information is exchanged through wired or wireless links. When limited to systems, a generally accepted definition is: "To improve reliability of vital or capital-intensive installations and reduce the maintenance costs by avoiding unplanned maintenance, by monitoring the condition of the system remotely." Process elements for remote diagnostics Remotely monitor selected vital system parameters Analyze the data to detect trends Compare with known or expected behavior data After detecting performance degradation, predict the failure moment by extrapolation Order parts and/or plan maintenance, to be executed when really necessary, but in time to prevent a failure or stop Typical uses Medical use (see Remote guidance) Formula One racecars Space (Apollo project and others) Telephone systems such as a PABX Connected cars Reasons for use Limit local personnel to a minimum (Gemini, Apollo capsules: too tight to fit all technicians) Limit workload of local personnel Limit risks (exposure to dangerous environments) Central expertise (locally solve small problems; remotely/centrally solve complex problems by experts) Efficiency: reduce travel time to get expert and system or subject together Remote diagnostics and maintenance Remote diagnostics and maintenance (RDM) refers to both diagnosis of the fault or faults and taking corrective (maintenance) actions, like changing settings to improve performance or prevent problems like breakdown, wear and tear. RDM can replace manpower at a location with experts at a central location, in order to save manpower or prevent hazardous situations (in space, for instance). Increasing globalisation and more an
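The trend-detection and extrapolation steps listed above can be sketched numerically. A minimal example (the sensor values and failure threshold are invented for illustration): fit a line to a steadily degrading parameter and solve for when it crosses the threshold.

```python
import numpy as np

# Hypothetical condition-monitoring data: a parameter sampled once per
# day, degrading steadily; failure is assumed when it drops below 80.0.
days = np.arange(6)                                   # t = 0..5
values = np.array([100.0, 98.0, 96.0, 94.0, 92.0, 90.0])
THRESHOLD = 80.0

slope, intercept = np.polyfit(days, values, 1)        # linear trend
t_fail = (THRESHOLD - intercept) / slope              # predicted crossing
print(f"predicted failure around day {t_fail:.1f}")   # -> day 10.0
```

This is the essence of predictive maintenance: the failure moment is estimated early enough that parts can be ordered and maintenance planned before an unplanned stop occurs. Real systems would use more robust models than a straight-line fit.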
https://en.wikipedia.org/wiki/Stearyl%20palmityl%20tartrate
Stearyl palmityl tartrate is a derivative of tartaric acid used as an emulsifier. It is produced by esterification of tartaric acid with commercial-grade stearyl alcohol, which generally consists of a mixture of the fatty alcohols stearyl and palmityl alcohol. Stearyl palmityl tartrate consists mainly of diesters, with minor amounts of monoester and of unchanged starting materials. Use Stearyl palmityl tartrate is used as an emulsifier under the E number E 483. The Food and Agriculture Organization of the United Nations sets limits of use at 4 g/kg for bakery wares and 5 g/kg for dessert products. Law Use of stearyl palmityl tartrate is prohibited in Australia.
https://en.wikipedia.org/wiki/Siegel%27s%20lemma
In mathematics, specifically in transcendental number theory and Diophantine approximation, Siegel's lemma refers to bounds on the solutions of linear equations obtained by the construction of auxiliary functions. The existence of these polynomials was proven by Axel Thue; Thue's proof used Dirichlet's box principle. Carl Ludwig Siegel published his lemma in 1929. It is a pure existence theorem for a system of linear equations. Siegel's lemma has been refined in recent years to produce sharper bounds on the estimates given by the lemma. Statement Suppose we are given a system of M linear equations in N unknowns such that N > M, say $\sum_{j=1}^{N} a_{ij} X_j = 0$ for $i = 1, \dots, M$, where the coefficients $a_{ij}$ are rational integers, not all 0, and bounded by B. The system then has a solution with the $X_j$ all rational integers, not all 0, and bounded by $(NB)^{M/(N-M)}$. Bombieri and Vaaler gave the following sharper bound for the $X_j$: $\max_j |X_j| \le \left(D^{-1}\sqrt{\det(AA^T)}\right)^{1/(N-M)}$, where D is the greatest common divisor of the M × M minors of the matrix A, and $A^T$ is its transpose. Their proof involved replacing the pigeonhole principle by techniques from the geometry of numbers. See also Diophantine approximation
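A small numerical illustration of the existence bound (the coefficient vector here is an arbitrary example, not from the text): with M = 1 equation in N = 3 unknowns and coefficients bounded by B, the lemma guarantees a nonzero integer solution with max |X_j| at most (NB)^(M/(N-M)).

```python
from itertools import product

# One equation (M=1) in three unknowns (N=3): 3*x1 - 2*x2 + 1*x3 = 0.
# Coefficients are bounded by B = 3, so the bound is (3*3)**(1/2) = 3.
coeffs = (3, -2, 1)
M, N = 1, 3
B = max(abs(c) for c in coeffs)
bound = (N * B) ** (M / (N - M))

# Brute-force search for nonzero integer solutions within the bound.
limit = int(bound)
solutions = [
    x for x in product(range(-limit, limit + 1), repeat=N)
    if any(x) and sum(c * xi for c, xi in zip(coeffs, x)) == 0
]
print(len(solutions), "solutions within the Siegel bound")
```

The lemma only asserts existence; the brute-force search confirms that at least one nonzero integer solution lies inside the guaranteed box.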
https://en.wikipedia.org/wiki/Calcium%20inosinate
Calcium inosinate is a calcium salt of inosinic acid. Under the E number E633, it is a food additive used as a flavor enhancer.
https://en.wikipedia.org/wiki/Journal%20of%20Physics%3A%20Condensed%20Matter
Journal of Physics: Condensed Matter is a weekly peer-reviewed scientific journal established in 1989 and published by IOP Publishing. The journal covers all areas of condensed matter physics including soft matter and nanostructures. The editor-in-chief is Gianfranco Pacchioni (University of Milano-Bicocca). The journal was formed by the merger of Journal of Physics C: Solid State Physics and Journal of Physics F: Metal Physics in 1989. Abstracting and indexing According to the Journal Citation Reports, the journal has a 2022 impact factor of 2.7.
https://en.wikipedia.org/wiki/Project%20Vamp
Project Vamp was a U.S. Navy project for the U.S. Hydrographic Office, which consisted of a special coastal survey along the Virginia and Massachusetts shores during 1954 and 1955. Oceanography
https://en.wikipedia.org/wiki/Mechanical%20explanations%20of%20gravitation
Mechanical explanations of gravitation (or kinetic theories of gravitation) are attempts to explain the action of gravity with the aid of basic mechanical processes, such as pressure forces caused by pushes, without the use of any action at a distance. These theories were developed from the 16th until the 19th century in connection with the aether. However, such models are no longer regarded as viable theories within the mainstream scientific community, and general relativity is now the standard model to describe gravitation without the use of actions at a distance. Modern "quantum gravity" hypotheses also attempt to describe gravity by more fundamental processes such as particle fields, but they are not based on classical mechanics. Screening This theory is probably the best-known mechanical explanation. It was developed for the first time by Nicolas Fatio de Duillier in 1690, re-invented by, among others, Georges-Louis Le Sage (1748), Lord Kelvin (1872) and Hendrik Lorentz (1900), and criticized by James Clerk Maxwell (1875) and Henri Poincaré (1908). The theory posits that the force of gravity is the result of tiny particles or waves moving at high speed in all directions, throughout the universe. The intensity of the flux of particles is assumed to be the same in all directions, so an isolated object A is struck equally from all sides, resulting in only an inward-directed pressure but no net directional force. With a second object B present, however, a fraction of the particles that would otherwise have struck A from the direction of B is intercepted, so B works as a shield, so to speak; that is, from the direction of B, A will be struck by fewer particles than from the opposite direction. Likewise, B will be struck by fewer particles from the direction of A than from the opposite direction. One can say that A and B are "shadowing" each other, and the two bodies are pushed toward each other by the resulting imbalance of forces. This shadow effect obeys the inverse-square law.
https://en.wikipedia.org/wiki/The%20Facts%20of%20Life%20%28Darlington%20book%29
The Facts of Life is a book published in 1953 by C. D. Darlington on the subject of race, heredity and evolution. Darlington was a major contributor to the field of genetics around the time of the modern synthesis.
https://en.wikipedia.org/wiki/Distributed%20economy
Distributed economies (DE) is a term that was coined by Allan Johansson et al. in 2005. Definition There is no official definition for DE, but it could be described as a regional approach to promote innovation by small and medium-sized enterprises, as well as sustainable development. The concept is illustrated in the figure below, which shows centralised, decentralised and distributed economies respectively. Features The relations in DE are much more complex than those in a centralised economy. This feature makes the whole economy more stable – leaf nodes no longer rely on just one central node. It also resembles ecological networks, making it a good practical example of industrial ecology. A big advantage of DE is that it enables entities within the network to work much more with regional/local natural resources, finances, human capital, knowledge, technology, and so on. It also makes the entities more flexible in responding to local market needs, generating a stronger innovation drive. By doing this, they become a better reflection of their social environment, and in that way they can improve quality of life. The concept of DE is not at all a new invention – this is how most pre-industrial economies were organised. However, information technology has opened new doors for the concept: information can be shared much more easily, and small-scale production facilities (rapid prototyping) are becoming cheaper. The DE concept works well with the development of fab labs. Not all industries are fit for DE; for example, many chemical processes only become economically feasible and efficient on a large scale. On the other hand, bio-energy and consumer products are interesting candidates. See also Decentralized planning (economics) Distributism Long tail Open-design movement Slow design
https://en.wikipedia.org/wiki/IBM%20IntelliStation
The IntelliStation is a family of workstations developed by IBM and first released in March 1997 as the follow-on to the PC Series 360 and 365. Certain IntelliStation M Pro Series models were nearly identical in hardware to low-end IBM Netfinity 1000 Series network servers (with variants in included video cards and SCSI options). In February 2002, POWER processor-based workstations, previously sold directly under the eServer pSeries brand, were also placed under the IntelliStation umbrella. The last IntelliStation models were discontinued in January 2009, ending the product line.
IntelliStation Pro
Intel or AMD processor-based workstations, discontinued in March 2008.
IntelliStation A Pro
Type 6224 (March 2004 to July 2005)
Dual AMD Opteron Models 244, 246, 248, 250 and 256 (no dual-core support)
Up to 16 GB PC3200 memory
Ultra320 SCSI or SATA150 HDD
10/100/1000 Mbit Ethernet
Graphic adapter options: Nvidia Quadro NVS 280, Nvidia Quadro FX 1100, Nvidia Quadro FX 3000, Nvidia Quadro FX 4000
Type 6217 (April 2005 to April 2007)
Dual AMD Opteron Models 250, 252, 254, 256 or dual-core Model 275, 280 or 285
Up to 16 GB PC3200 memory
Ultra320 SCSI or SATA150 HDD
10/100/1000 Mbit Ethernet
Graphic adapter options: Nvidia Quadro NVS 280, Nvidia Quadro NVS 285, Nvidia Quadro FX 1400, Nvidia Quadro FX 1500, Nvidia Quadro FX 3400, Nvidia Quadro FX 3500, Nvidia Quadro FX 4500, Nvidia Quadro FX 4500 X2, 3DLabs Wildcat Realizm 800
IntelliStation E Pro
Type 6893 (June 1998 to June 1999)
Intel Pentium II at 350, 400 or 450 MHz (100 MHz FSB)
Up to 384 MB PC100 memory
Ultra Wide SCSI or ATA HDD
10/100 Mbit Ethernet
Graphic adapter options: Matrox Millennium II, Matrox Millennium G200, 3DLabs Permedia 2A
Type 6893 (March 1999 to June 2000)
Intel Pentium III at 450, 500, 550 or 600 MHz (100 MHz FSB)
Up to 768 MB PC100 memory
Ultra Wide SCSI or ATA HDD
10/100 Mbit Ethernet
Graphic adapter options: Matrox Millennium G200, Matrox Millennium G400, IBM/Diamond Fire GL1 grap
https://en.wikipedia.org/wiki/Pleurotus%20dryinus
Pleurotus dryinus, commonly known as the veiled oyster mushroom, is a species of fungus in the family Pleurotaceae. It grows on dead wood and is also a weak pathogen, infecting especially broad-leaved trees. Naming The species name is a Latinised version of the Greek word "dryinos" (δρύῐνος), meaning "related to oak", which refers to one of its main hosts. The original definition of this fungus as Agaricus dryinus was made by Persoon in 1800. In 1871, in his "Führer in die Pilzkunde" ("Guide to mycology"), Paul Kummer introduced Pleurotus as a genus and defined three similar ringed species: Pleurotus corticatus, Pleurotus Albertini and P. dryinus. They were distinguished because only P. corticatus has intertwined ("anastomosing") gills on the stem and P. Albertini is bigger and grows on conifer wood rather than oak. However, nowadays all three are considered to be forms of the same species. The name dryinus takes precedence because it is the oldest. Also, in 1874 Fries defined a species Pleurotus tephrotrichus, having a deeper grey colour, which again has been incorporated into P. dryinus but may be distinguished as the variety P. dryinus var. tephrotrichus. The English name "Veiled Oyster Mushroom" has been given to this species. Description The following sections use the given references throughout. General The cap, growing to about 13 cm, is pale, beige or (in variety tephrotrichus) greyish; later it can turn yellowish. Veil remnants may adhere to the edge. At first it is velvety (tomentose) and the tomentum can develop into grey-brown scales; in old specimens the surface becomes bare and may crack. The whitish or pale brownish lateral stem may be very short or up to about 8 cm long, generally with a membranous ring. The gills are decurrent well down the stipe and may anastomose (criss-cross) at the lower extreme. They are white or cream. The smell is described as "pleasant" or "slightly polypore-like" or "complex, a bit fruity or sourish". The odour
https://en.wikipedia.org/wiki/List%20of%20Belarusian%20flags
The following is a list of flags of Belarus. State flag Presidential flag Presidential institutions Military flags Governmental flags Subdivision flags Political flags Minority flags Historical flags Former national flag proposals See also Flag of Belarus Coat of arms of Belarus
https://en.wikipedia.org/wiki/Signature-tagged%20mutagenesis
Signature-tagged mutagenesis (STM) is a genetic technique used to study gene function. Recent advances in genome sequencing have allowed us to catalogue a large variety of organisms' genomes, but the function of the genes they contain is still largely unknown. Using STM, the function of the product of a particular gene can be inferred by disabling it and observing the effect on the organism. The original and most common use of STM is to discover which genes in a pathogen are involved in virulence in its host, to aid the development of new medical therapies/drugs. Basic premise The gene in question is inactivated by insertional mutation; a transposon is used which inserts itself into the gene sequence. When that gene is transcribed and translated into a protein, the insertion of the transposon affects the protein structure and (in theory) prevents it from functioning. In STM, mutants are created by random transposon insertion and each transposon contains a different 'tag' sequence that uniquely identifies it. If an insertional mutant bacterium exhibits a phenotype of interest, such as susceptibility to an antibiotic it was previously resistant to, its genome can be sequenced and searched (using a computer) for any of the tags used in the experiment. When a tag is located, the gene that it disrupts is also thus located (it will reside somewhere between a start and stop codon which mark the boundaries of the gene). STM can be used to discover which genes are critical to a pathogen's virulence by injecting a 'pool' of different random mutants into an animal model (e.g. a mouse infection model) and observing which of the mutants survive and proliferate in the host. Those mutant pathogens that don't survive in the host must have an inactivated gene, required for virulence. Hence, this is an example of a negative selection method.
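The negative-selection bookkeeping at the heart of STM can be sketched in a few lines: tags present in the input pool but absent from the pool recovered from the host mark mutants that failed to survive. The tag sequences and gene names below are invented for illustration; real experiments use many more tags, detected by hybridization or sequencing:

```python
# Sketch of the negative-selection step in signature-tagged mutagenesis:
# each mutant carries a unique tag; tags recovered from the host after
# infection are compared with the input pool, and missing tags identify
# mutants that failed to proliferate - i.e. candidate virulence genes.
# (Tag sequences and gene names are hypothetical.)
input_pool = {
    "ACGTTGCA": "gene_A",
    "TTGACCGT": "gene_B",
    "GGCATACT": "gene_C",
    "CATGGTTA": "gene_D",
}

recovered_tags = {"ACGTTGCA", "CATGGTTA"}   # tags sequenced from the output pool

attenuated = {
    tag: gene for tag, gene in input_pool.items()
    if tag not in recovered_tags
}
print(attenuated)   # mutants lost in the host: gene_B and gene_C disrupted
```

The set difference is the whole trick: a mutant missing from the output pool implicates the gene its transposon disrupted.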
https://en.wikipedia.org/wiki/Multi-user%20MIMO
Multi-user MIMO (MU-MIMO) is a set of multiple-input and multiple-output (MIMO) technologies for multipath wireless communication, in which multiple users or terminals, each radioing over one or more antennas, communicate with one another. In contrast, single-user MIMO (SU-MIMO) involves a single multi-antenna-equipped user or terminal communicating with precisely one other similarly equipped node. Analogous to how OFDMA adds multiple-access capability to OFDM in the cellular-communications realm, MU-MIMO adds multiple-user capability to MIMO in the wireless realm. SDMA, massive MIMO, coordinated multipoint (CoMP), and ad hoc MIMO are all related to MU-MIMO; each of those technologies often leverages spatial degrees of freedom to separate users. Technology MU-MIMO leverages multiple users as spatially distributed transmission resources, at the cost of somewhat more expensive signal processing. In comparison, conventional single-user MIMO (SU-MIMO) uses only the multiple-antenna dimensions of a single local device. MU-MIMO algorithms apply to MIMO systems in which more than one user or connection is active. MU-MIMO may be divided into two categories: MIMO broadcast channels (MIMO BC) and MIMO multiple-access channels (MIMO MAC), for downlink and uplink situations, respectively. Again in comparison, SU-MIMO may be represented as point-to-point, pairwise MIMO. To remove ambiguity between the words receiver and transmitter, we can adopt the terms access point (AP) or base station, and user. An AP is the transmitter and a user the receiver for downlink connections, and vice versa for uplink connections. Homogeneous networks are freed from this distinction, since they tend to be bi-directional. MIMO broadcast (MIMO BC) MIMO BC represents a MIMO downlink case where a single sender transmits to multiple receivers within the wireless network. Examples of advanced transmit processing for MIMO BC are interference-aware precoding and SDMA-based downlink user scheduling. For advanced-
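One standard transmit-processing technique for the MIMO BC, zero-forcing precoding, can be sketched as follows. This is a minimal illustration with a randomly drawn channel and no power normalization, not a complete system model:

```python
import numpy as np

# Zero-forcing precoding sketch for the MIMO broadcast channel: an AP
# with M antennas serves K single-antenna users (K <= M). Given the
# channel matrix H (K x M), the precoder W = H^H (H H^H)^{-1} is a
# right inverse of H, so each user sees only its own data stream and
# no inter-user interference. Transmit-power scaling is omitted.
rng = np.random.default_rng(0)
K, M = 3, 4                                      # 3 users, 4 AP antennas
H = rng.normal(size=(K, M)) + 1j * rng.normal(size=(K, M))

W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # M x K ZF precoder

s = np.array([1.0, -1.0, 1.0])                   # one symbol per user
y = H @ (W @ s)                                  # per-user received samples
print(np.round(y.real, 6))                       # each user recovers its symbol
```

In practice the inversion amplifies transmit power when H is ill-conditioned, which is why regularized variants and user scheduling (as mentioned above) matter.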
https://en.wikipedia.org/wiki/Signal-to-interference%20ratio
The signal-to-interference ratio (SIR or S/I), also known as the carrier-to-interference ratio (CIR or C/I), is the quotient between the average received modulated carrier power S or C and the average received co-channel interference power I, i.e. crosstalk, from transmitters other than the useful signal. The CIR resembles the carrier-to-noise ratio (CNR or C/N), which is the signal-to-noise ratio (SNR or S/N) of a modulated signal before demodulation. A distinction is that interfering radio transmitters contributing to I may be controlled by radio resource management, while N involves noise power from other sources, typically additive white Gaussian noise (AWGN). Carrier-to-noise-and-interference ratio (CNIR) The C/I ratio is studied in interference-limited systems, i.e. where I dominates over N, typically in cellular radio systems and broadcasting systems where frequency channels are reused in order to achieve a high level of area coverage. The C/N is studied in noise-limited systems. If both situations can occur, the carrier-to-noise-and-interference ratio (CNIR or C/(N+I)) may be studied. See also Carrier-to-noise ratio (CNR or C/N) Carrier-to-receiver noise density C/N0 Co-channel interference (CCI) Crosstalk Signal-to-noise ratio (SNR or S/N) SINAD (ratio of signal-plus-noise-plus-distortion to noise-plus-distortion)
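The ratios above are straightforward to compute from received powers. A small sketch with arbitrary illustrative power levels (not from any real link budget) shows an interference-limited case, where C/(N+I) sits close to C/I rather than C/N:

```python
import math

# Toy C/I, C/N and C/(N+I) computation from received powers in watts.
# The power values are invented for illustration; here I >> N, so the
# link is interference limited and CNIR is dominated by the interference.
C = 2e-9        # carrier power, W
I = 4e-11       # co-channel interference power, W
N = 1e-11       # noise power, W

cir_db  = 10 * math.log10(C / I)        # carrier-to-interference ratio, dB
cnr_db  = 10 * math.log10(C / N)        # carrier-to-noise ratio, dB
cnir_db = 10 * math.log10(C / (N + I))  # combined C/(N+I), dB

print(round(cir_db, 2), round(cnr_db, 2), round(cnir_db, 2))
# 16.99 23.01 16.02
```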
https://en.wikipedia.org/wiki/Auxiliary%20function
In mathematics, auxiliary functions are an important construction in transcendental number theory. They are functions that appear in most proofs in this area of mathematics and that have specific, desirable properties, such as taking the value zero for many arguments, or having a zero of high order at some point. Definition Auxiliary functions are not a rigorously defined kind of function; rather, they are functions which are either explicitly constructed or at least shown to exist and which provide a contradiction to some assumed hypothesis, or otherwise prove the result in question. Creating a function during the course of a proof in order to prove the result is not a technique exclusive to transcendence theory, but the term "auxiliary function" usually refers to the functions created in this area. Explicit functions Liouville's transcendence criterion Because of the naming convention mentioned above, auxiliary functions can be dated back to their source simply by looking at the earliest results in transcendence theory. One of these first results was Liouville's proof that transcendental numbers exist, when he showed that the so-called Liouville numbers were transcendental. He did this by discovering a transcendence criterion which these numbers satisfied. To derive this criterion he started with a general algebraic number α and found some property that this number would necessarily satisfy. The auxiliary function he used in the course of proving this criterion was simply the minimal polynomial of α, which is the irreducible polynomial f with integer coefficients such that f(α) = 0. This function can be used to estimate how well the algebraic number α can be approximated by rational numbers p/q. Specifically, if α has degree d at least two, then he showed that |f(p/q)| ≥ 1/q^d, and also, using the mean value theorem, that there is some constant depending on α, say c(α), such that |f(p/q)| ≤ c(α)|α − p/q|. Combining these results gives |α − p/q| ≥ 1/(c(α)q^d), a lower bound that every algebraic number of degree d ≥ 2 must satisfy; therefore any n
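Liouville's inequality is easy to check numerically. The sketch below verifies it for α = √2 (degree d = 2, minimal polynomial f(x) = x² − 2), with the constant 1/4 chosen as a rough but provably safe value for approximations near √2:

```python
# Numerical check of Liouville's inequality for α = √2: there is a
# constant c(α) with |α − p/q| ≥ c(α)/q² for every rational p/q.
# For p = round(√2·q) one has
#   |√2 − p/q| = |2q² − p²| / (q²·(√2 + p/q)) ≥ 1 / (q²·(2√2 + 1)),
# and 1/(2√2 + 1) ≈ 0.26 > 1/4, so c = 0.25 is safe.
alpha = 2 ** 0.5
c = 0.25

for q in range(1, 200):
    p = round(alpha * q)                     # best numerator for this q
    assert abs(alpha - p / q) >= c / q ** 2, (p, q)
print("the bound |sqrt(2) - p/q| >= 1/(4*q**2) holds for all q < 200")
```

A Liouville number violates every such bound (for every degree d), which is exactly why it cannot be algebraic.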
https://en.wikipedia.org/wiki/GeckOS
GeckOS is a multitasking operating system for the MOS 6502 and compatible processors such as the MOS 6510. The GeckOS operating system is one of the few successful attempts to implement a Unix-like operating system on the 6502 architecture. Overview The system offers some Unix-like functionality including pre-emptive multitasking, multithreading, semaphores, signals, binary relocation, TCP/IP networking via SLIP, and a 6502 standard library. GeckOS includes native support for the Commodore PET (32 KB and 96 KB models), Commodore 64 and the CS/A65 homebrew system. Due to the platform-independent nature of the kernel code, GeckOS is advertised as an extremely easy OS to port to alternative 6502 platforms. Binary compatibility with the LUnix operating system can be attained when the lib6502 shared library is used. Due to the small fixed-location stack of the 6502, and because an external MMU is rarely provided, multitasking is somewhat limited. The OS supports a maximum of four tasks when a shared stack space is used. This can be increased to sixteen tasks when stack snapshotting is enabled, although this is done at the expense of some system speed. A webserver is integrated into the SLIP daemon. Unix on 6502 architecture While early versions of Unix ran on machines such as early-model PDP-11 computers, which were comparable to the Commodore 64 in memory and processor performance, there were architectural differences: the 6502 lacks a kernel mode, has only three 8-bit registers versus the PDP-11's eight 16-bit general registers, and uses a small fixed stack. These architectural limitations make implementing a Unix-like operating system on the 6502 challenging. Other viable Unix-like implementations on the 6502 include (non-exhaustively) LUnix, Asterix (Chris Baird) and ACE (Chris "Polar" Baird). GeckOS is arguably more complete in some respects, with ACE being stronger in terms of standard Unix utilities but weaker in the operating system area.
https://en.wikipedia.org/wiki/Wellcome%20Genome%20Campus
The Wellcome Genome Campus is a scientific research campus built in the grounds of Hinxton Hall, Hinxton in Cambridgeshire, England. Campus The Campus is home to several institutes and organisations in genomics and computational biology. The Campus is part of the Wellcome Trust, a global charitable foundation that exists to improve health, and houses the Wellcome Sanger Institute, the European Bioinformatics Institute (EBI), the bioinformatics outstation of the European Molecular Biology Laboratory (EMBL), and a number of biotech companies whose UK offices are located in the BioData Innovation Centre, which acts as an incubator for businesses of all sizes. Today, the Campus is a globally significant hub for scientific, business, educational and cultural activities in genomic and biodata sciences. Over the next 15 years, there are significant plans to grow the Campus, extending its facilities to expand its community of scientific talent and business leaders with aligned interests. Campus Facilities As a leading hub of genomic science in Europe, the campus provides a range of facilities and services to support academic research and businesses that operate in the genomics and biodata market. Its facilities include: 350+ seminars, lectures and training courses a year (free and paid-for) to help educate staff and keep their skills current. A state-of-the-art conference centre that attracts thousands of visitors every year. Connecting Science, a group of experts who help the wider public, schools and colleges to learn about and explore genomic science and its impact on research, health and society. A number of cafes and restaurants across the site providing spaces for eating out, informal meetings, relaxing and working. A 15-acre Wetlands Nature Reserve to enjoy and relax in all year round. Hinxton Hall, a historical setting for conference hosts, VIP events and meetings. A modern gymnasium and an all-weather tennis court. A library with an extensive
https://en.wikipedia.org/wiki/Reciprocal%20Fibonacci%20constant
The reciprocal Fibonacci constant, or ψ, is defined as the sum of the reciprocals of the Fibonacci numbers: ψ = Σ_{k≥1} 1/F_k. The ratio of successive terms in this sum tends to the reciprocal of the golden ratio. Since this is less than 1, the ratio test shows that the sum converges. The value of ψ is known to be approximately 3.359886. Gosper describes an algorithm for fast numerical approximation of its value. The reciprocal Fibonacci series itself provides O(k) digits of accuracy for k terms of expansion, while Gosper's accelerated series provides O(k 2) digits. ψ is known to be irrational; this property was conjectured by Paul Erdős, Ronald Graham, and Leonard Carlitz, and proved in 1989 by Richard André-Jeannin. A continued fraction representation of the constant is also known. See also List of sums of reciprocals
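Since the terms shrink geometrically (ratio → 1/φ ≈ 0.618), direct summation converges quickly; the sketch below uses exact rational arithmetic before the final conversion:

```python
from fractions import Fraction

# Direct summation of the reciprocal Fibonacci series 1/F1 + 1/F2 + ...
# Terms decay like φ^(-k), so 100 terms are accurate far beyond the
# 10 digits compared below. Fractions avoid rounding until the end.
def reciprocal_fibonacci(terms: int) -> float:
    a, b = 1, 1                  # F1, F2
    total = Fraction(0)
    for _ in range(terms):
        total += Fraction(1, a)
        a, b = b, a + b
    return float(total)

psi = reciprocal_fibonacci(100)
print(f"{psi:.10f}")             # 3.3598856662
```

This is the naive O(k)-digits method mentioned above; Gosper's accelerated series converges quadratically faster in the number of correct digits.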
https://en.wikipedia.org/wiki/PEST%20sequence
A PEST sequence is a peptide sequence that is rich in proline (P), glutamic acid (E), serine (S) and threonine (T). It is associated with proteins that have a short intracellular half-life, so might act as a signal peptide for protein degradation. This may be mediated via the proteasome or calpain.
https://en.wikipedia.org/wiki/Listeriolysin%20O
Listeriolysin O (LLO) is a hemolysin produced by the bacterium Listeria monocytogenes, the pathogen responsible for causing listeriosis. The toxin may be considered a virulence factor, since it is crucial for the virulence of L. monocytogenes. Biochemistry Listeriolysin O is a non-enzymatic, cytolytic, thiol-activated, cholesterol-dependent cytolysin; hence, it is activated by reducing agents and inhibited by oxidizing agents. However, LLO differs from other thiol-activated toxins, since its cytolytic activity is maximized at a pH of 5.5. By maximizing activity at a pH of 5.5, LLO is selectively activated within the acidic phagosomes (average pH ~ 5.9) of cells that have phagocytosed L. monocytogenes. After LLO lyses the phagosome, the bacterium escapes into the cytosol, where it can grow intracellularly. Upon release from the phagosome, the toxin has little activity in the more basic cytosol. Furthermore, LLO permits L. monocytogenes to escape from phagosomes into the cytosol without damaging the plasma membrane of the infected cell. This allows the bacteria to live intracellularly, where they are protected from extracellular immune system factors such as the complement system and antibodies. LLO also causes dephosphorylation of histone H3 and deacetylation of histone H4 during the early phases of infection, prior to entry of L. monocytogenes into the host cell. The pore-forming activity is not involved in causing the histone modifications. The alterations of the histones cause the down regulation of genes encoding proteins involved in the inflammatory response. Thus, LLO may be important in subverting the host immune response to L. monocytogenes. A PEST-like sequence is present in LLO and is considered essential for virulence, since mutants lacking the sequence lysed the host cell. However, contrary to PEST's supposed role in protein degradation, evidence suggests that the PEST-like sequence may regulate LLO production in the cytosol rather than increase
https://en.wikipedia.org/wiki/Book%20of%20Optics
The Book of Optics (Arabic: Kitāb al-Manāẓir; Latin: De Aspectibus or Perspectiva) is a seven-volume treatise on optics and other fields of study composed by the medieval Arab scholar Ibn al-Haytham, known in the West as Alhazen or Alhacen (965–c. 1040 AD). The Book of Optics presented experimentally founded arguments against the widely held extramission theory of vision (as held by Euclid in his Optica), and proposed the modern intromission theory, the now accepted model that vision takes place by light entering the eye. The book is also noted for its early use of the scientific method, its description of the camera obscura, and its formulation of Alhazen's problem. The book strongly influenced the development of optics, physics and mathematics in Europe between the 13th and 17th centuries. Vision theory Before the Book of Optics was written, two theories of vision existed. The extramission or emission theory was forwarded by the mathematicians Euclid and Ptolemy, who asserted that certain forms of radiation are emitted from the eyes onto the object which is being seen. When these rays reached the object they allowed the viewer to perceive its color, shape and size. An early version of the intromission theory, held by the followers of Aristotle and Galen, argued that sight was caused by agents transmitted to the eyes from either the object or from its surroundings. Al-Haytham offered many reasons against the extramission theory, pointing to the fact that eyes can be damaged by looking directly at bright lights, such as the sun. He also pointed to the low probability that the eye could fill the entirety of space with visual rays the instant the eyelids are opened, as when an observer looks up into the night sky. Using the intromission theory as a foundation, he formed his own theory that an object emits rays of light from every point on its surface, which then travel in all directions, thereby allowing some light into a viewer's eyes. According to this theory, the object being viewed is considered to be a compilation of an
https://en.wikipedia.org/wiki/Menger%20curvature
In mathematics, the Menger curvature of a triple of points in n-dimensional Euclidean space Rn is the reciprocal of the radius of the circle that passes through the three points. It is named after the Austrian-American mathematician Karl Menger. Definition Let x, y and z be three points in Rn; for simplicity, assume for the moment that all three points are distinct and do not lie on a single straight line. Let Π ⊆ Rn be the Euclidean plane spanned by x, y and z and let C ⊆ Π be the unique Euclidean circle in Π that passes through x, y and z (the circumcircle of x, y and z). Let R be the radius of C. Then the Menger curvature c(x, y, z) of x, y and z is defined by c(x, y, z) = 1/R. If the three points are collinear, R can be informally considered to be +∞, and it makes rigorous sense to define c(x, y, z) = 0. If any of the points x, y and z are coincident, again define c(x, y, z) = 0. Using the well-known formula relating the side lengths of a triangle to its area, it follows that c(x, y, z) = 4A / (|x − y| |y − z| |z − x|), where A denotes the area of the triangle spanned by x, y and z. Another way of computing Menger curvature is the identity c(x, y, z) = 2 sin(∠xyz) / |x − z|, where ∠xyz is the angle made at the y-corner of the triangle spanned by x, y, z. Menger curvature may also be defined on a general metric space. If X is a metric space and x, y, and z are distinct points, let f be an isometry from {x, y, z} into R2. Define the Menger curvature of these points to be cX(x, y, z) = c(f(x), f(y), f(z)). Note that f need not be defined on all of X, just on {x, y, z}, and the value cX(x, y, z) is independent of the choice of f. Integral Curvature Rectifiability Menger curvature can be used to give quantitative conditions for when sets in Rn may be rectifiable. For a Borel measure μ on a Euclidean space define c²(μ) = ∫∫∫ c(x, y, z)² dμ(x) dμ(y) dμ(z). A Borel set E is rectifiable if c²(H¹|E) < ∞, where H¹|E denotes one-dimensional Hausdorff measure restricted to the set E. The basic intuition behind the result is that Menger curvature measures how straight a given triple of points is (the smaller c(x, y, z) is, the closer x, y, and z are to being collinear), and this int
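The side-length/area identity c = 4A/(|x−y| |y−z| |z−x|) gives a direct way to compute Menger curvature numerically; a small sketch using Heron's formula for the area:

```python
import math

# Menger curvature of three points via c(x, y, z) = 4A / (a*b*c),
# where a, b, c are the pairwise distances and A is the triangle area
# from Heron's formula. Collinear or coincident triples return 0,
# matching the convention in the definition above.
def menger_curvature(x, y, z):
    a = math.dist(x, y)
    b = math.dist(y, z)
    c = math.dist(z, x)
    if 0.0 in (a, b, c):
        return 0.0                              # coincident points
    s = (a + b + c) / 2
    area_sq = max(s * (s - a) * (s - b) * (s - c), 0.0)  # clip rounding
    return 4 * math.sqrt(area_sq) / (a * b * c)

# Three points on the unit circle: circumradius 1, so curvature 1.
print(menger_curvature((1, 0), (0, 1), (-1, 0)))
```

The `max(..., 0.0)` guard handles near-collinear triples, where floating-point error can make Heron's product slightly negative.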
https://en.wikipedia.org/wiki/LIGO%20Scientific%20Collaboration
The LIGO Scientific Collaboration (LSC) is a scientific collaboration of international physics institutes and research groups dedicated to the search for gravitational waves. History The LSC was established in 1997, under the leadership of Barry Barish. Its mission is to ensure equal scientific opportunity for individual participants and institutions by organizing research, publications, and all other scientific activities, and it includes scientists from both LIGO Laboratory and collaborating institutions. Barish appointed Rainer Weiss as the first spokesperson. LSC members have access to the US-based Advanced LIGO detectors in Hanford, Washington and in Livingston, Louisiana, as well as the GEO 600 detector in Sarstedt, Germany. Under an agreement with the European Gravitational Observatory (EGO), LSC members also have access to data from the Virgo detector in Pisa, Italy. While the LSC and the Virgo Collaboration are separate organizations, they cooperate closely and are referred to collectively as "LVC". The KAGRA observatory's collaboration has joined the LIGO-Virgo collective, and the LIGO-Virgo-KAGRA collective is called "LVK". The current LSC Spokesperson is Patrick Brady of University of Wisconsin-Milwaukee. The Executive Director of the LIGO Laboratory is David Reitze from the University of Florida. On 11 February 2016, the LIGO and Virgo collaborations announced that they succeeded in making the first direct gravitational wave observation on 14 September 2015. In 2016, Barish received the Enrico Fermi Prize "for his fundamental contributions to the formation of the LIGO and LIGO-Virgo scientific collaborations and for his role in addressing challenging technological and scientific aspects whose solution led to the first detection of gravitational waves". Collaboration members Membership of LIGO Scientific Collaboration as of November 2015 is detailed in the table below. Notes
https://en.wikipedia.org/wiki/Load%20regulation
Load regulation is the capability to maintain a constant voltage (or current) level on the output channel of a power supply despite changes in the supply's load (such as a change in resistance value connected across the supply output). Definitions Load regulation of a constant-voltage source is defined by the equation: Load regulation (%) = 100% × (V_min-load − V_max-load) / V_nom-load, where: V_max-load is the voltage at maximum load. The maximum load is the one that draws the greatest current, i.e. the lowest specified load resistance (never short circuit); V_min-load is the voltage at minimum load. The minimum load is the one that draws the least current, i.e. the highest specified load resistance (possibly open circuit for some types of linear supplies, usually limited by pass transistor minimum bias levels); V_nom-load is the voltage at the typical specified load. For a constant-current supply, the above equation uses currents instead of voltages, and the maximum and minimum load values are when the largest and smallest specified voltage across the load are produced. For switching power supplies, the primary source of regulation error is switching ripple, rather than control loop precision. In such cases, load regulation is defined without normalizing to voltage at nominal load and has the unit of volts, not a percentage. Measurement A simple way to manually measure load regulation is to connect three parallel load resistors to the power supply, where two of the resistors, R2 and R3, are connected through switches while the other resistor, R1, is connected directly. The values of the resistors are selected such that R1 gives the highest load resistance, R1||R2 gives the nominal load resistance and either R1||R2||R3 or R2||R3 gives the lowest load resistance. A voltmeter is then connected in parallel to the resistors and the measured values of voltage for each load state can be used to calculate the load regulation as given in the equation above. Programmable loads are typically used to automate the measurement of load regulation. See also Line
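The three-resistor measurement can be simulated end-to-end. The sketch below models the supply as an ideal 12 V source behind a small output resistance; all component values are hypothetical:

```python
# Simulated load-regulation measurement with a three-resistor setup:
# R1 alone is the lightest load, R1||R2 the nominal load, and
# R1||R2||R3 the heaviest load. The supply is modelled (assumption)
# as an ideal 12 V source with a 0.5 ohm output resistance.
def parallel(*rs):
    return 1 / sum(1 / r for r in rs)

V_IDEAL, R_OUT = 12.0, 0.5                 # volts, ohms (assumed model)
R1, R2, R3 = 1000.0, 100.0, 25.0           # load resistors, ohms

def v_out(r_load):
    return V_IDEAL * r_load / (r_load + R_OUT)   # voltage-divider sag

v_min_load = v_out(R1)                     # lightest load: R1 alone
v_nom_load = v_out(parallel(R1, R2))       # nominal load: R1 || R2
v_max_load = v_out(parallel(R1, R2, R3))   # heaviest load: R1 || R2 || R3

load_reg_pct = 100 * (v_min_load - v_max_load) / v_nom_load
print(f"load regulation = {load_reg_pct:.2f}%")
```

A real measurement replaces the `v_out` model with voltmeter readings at each switch setting and applies the same formula.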
https://en.wikipedia.org/wiki/Line%20regulation
Line regulation is the ability of a power supply to maintain a constant output voltage despite changes to the input voltage, with the output current drawn from the power supply remaining constant. It is defined as Line regulation = ΔVo / ΔVi, where ΔVi is the change in input voltage while ΔVo is the corresponding change in output voltage. It is desirable for a power supply to maintain a stable output regardless of changes in the input voltage. Line regulation is important when the input voltage source is unstable or unregulated, since this would result in significant variations in the output voltage. The line regulation of an unregulated power supply is usually very high for a majority of operations, but it can be improved by using a voltage regulator. A low line regulation is always preferred. In practice, a well-regulated power supply should have a line regulation of at most 0.1%. In regulator device datasheets the line regulation is expressed as the percent change in output, relative to the nominal output, per volt of input change. Mathematically it is expressed as: Line regulation (%/V) = (ΔVo / Vo × 100%) / ΔVi. The unit here is %/V. For example, in the ABLIC Inc. S1206-series regulator device the typical line regulation is expressed as 0.05%/V, which means that the output of the regulator device changes by 0.05% per volt of input change, when the output of the device is set at 1 V. Moreover, the line regulation of the device expressed in the datasheet is temperature dependent. Usually the datasheets mention line regulation at 25 °C. See also Load regulation Linear regulator Notes Electrical power control
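The %/V figure can be computed directly from a swept input. The regulator model below is hypothetical, with an invented input sensitivity, purely to show the arithmetic:

```python
# Line-regulation figure in %/V: percent change in output voltage,
# relative to the nominal output, per volt of input change.
# The regulator is modelled (assumption) as a 1.2 V device whose
# output shifts by 0.4 mV per input volt.
V_NOM_OUT = 1.2                                # nominal output, volts

def v_out(v_in):
    return V_NOM_OUT + 0.0004 * (v_in - 5.0)   # assumed input sensitivity

v_in_low, v_in_high = 4.5, 5.5                 # sweep the input over 1 V
delta_vo = v_out(v_in_high) - v_out(v_in_low)
delta_vi = v_in_high - v_in_low

line_reg_pct_per_v = (100 * delta_vo / V_NOM_OUT) / delta_vi
print(f"line regulation = {line_reg_pct_per_v:.3f} %/V")
```

On the bench, `v_out` would be two voltmeter readings taken at the low and high input settings, with the load current held constant as the definition requires.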
https://en.wikipedia.org/wiki/Grotesque%20%28architecture%29
In architecture, a grotesque or chimera is a fantastic or mythical figure carved from stone and fixed to the walls or roof of a building. A chimera is a kind of grotesque in which the figure is a combination of animals (including humans). Grotesques are often called gargoyles, although the term gargoyle refers to figures carved specifically to drain water away from the sides of buildings. In the Middle Ages, the term babewyn was used to refer to both gargoyles and chimerae. This word is derived from an Italian word meaning "baboon". Grotesques often depict whimsical, mythical creatures in dramatic or humorous ways. They have historically been a key element of architecture in many periods, including the Renaissance and Medieval periods, and have developed stylistically in conjunction with these times. Although grotesques typically depict a wide range of subjects, they are often hybrids of different mythical, human, and animalistic features. Many scholars describe grotesques as being used to ward off evil and as reminders of the separation of the earth and the divine. Grotesques are predominantly carved into buildings of religious significance, in particular churches and cathedrals. Despite their presence in religious spaces, their anthropomorphic designs are largely not directly religious and instead are often more whimsical, without religious connotations. They commonly exist on high ledges and rooftops and are frequently positioned out of view from common areas. Prominent examples of preserved grotesques exist on buildings such as the Florence Cathedral and Notre-Dame de Paris. Historically, grotesques have also drawn significant design influence from sculptural trends, and their architects were often originally sculptors or artists. This meant that the widespread emergence of grotesques often converged with the popular art styles of the time, especially during the rise of the Gothic style and the accompanying addition of grotesques in architecture. Key
https://en.wikipedia.org/wiki/Case%20series
A case series (also known as a clinical series) is a type of medical research study that tracks subjects with a known exposure, such as patients who have received a similar treatment, or examines their medical records for exposure and outcome. Case series may be consecutive or non-consecutive, depending on whether all cases presenting to the reporting authors over a period were included, or only a selection. When information on more than three patients is included, the case series is considered to be a systematic investigation designed to contribute to generalizable knowledge (i.e., research), and therefore submission to an institutional review board (IRB) is required. Case series usually contain demographic information about the patient(s), for example age, gender, ethnic origin, etc. Case series have a descriptive study design; unlike studies that employ an analytic design (e.g. cohort studies, case-control studies or randomized controlled trials), case series do not, in themselves, involve hypothesis testing to look for evidence of cause and effect (though case-only analyses are sometimes performed in genetic epidemiology to investigate the association between an exposure and a genotype). Case series are especially vulnerable to selection bias; for example, studies that report on a series of patients with a certain illness and/or a suspected linked exposure draw their patients from a particular population (such as a hospital or clinic), which may not appropriately represent the wider population. The internal validity of case series studies is usually very low, due to the lack of a comparator group exposed to the same array of intervening variables. For example, the effects seen may be wholly or partly due to intervening effects such as the placebo effect, Hawthorne effect, Rosenthal effect, time effects, practice effects or the natural history effect. Calculating the difference in effects between two treatment groups assumed to be exposed to a very similar array of
https://en.wikipedia.org/wiki/Atchara
Atchara (also spelled achara or atsara) is a pickle made from grated unripe papaya originating from the Philippines. This dish is often served as a side dish for fried or grilled foods such as pork barbecue.

History

The name atchara originated from the Indian achar, which was transmitted to the Philippines via the acar of Indonesia, Malaysia, and Brunei.

Preparation

The primary ingredient is grated unripe papaya. Carrot slices, julienned ginger, bell pepper, onion and garlic make up the other vegetables. Raisins or pineapple chunks may be added, and chilis, freshly ground black pepper, red pepper flakes, or whole peppercorns complete the mixture. These are then mixed in a preserving solution of vinegar, sugar or syrup, and salt. The mixture is placed in airtight jars, where it will keep without refrigeration; however, once opened it is preferably kept chilled to maintain its flavour.

Variants

Atcharang maasim (sour pickles) - prepared in the same way as normal atchara, except that no sugar is added.
Atcharang labóng (pickled bamboo shoots) - prepared in the same way as atchara, but with bamboo shoots instead of papaya.
Atcharang dampalit (pickled sea purslane) - made from Sesuvium portulacastrum, called dampalit in Tagalog.
Atcharang ubod (pickled palm hearts) - made from palm hearts, called ubod in Tagalog.
Atcharang sayote (pickled chayote) - made from chayote, bell pepper, carrots, and ginger.

See also Philippine condiments
https://en.wikipedia.org/wiki/ACVRL1
Serine/threonine-protein kinase receptor R3 is an enzyme that in humans is encoded by the ACVRL1 gene. ACVRL1 is a receptor in the TGF beta signaling pathway. It is also known as activin receptor-like kinase 1, or ALK1.

Function

This gene encodes a type I cell-surface receptor for the TGF-beta superfamily of ligands. It shares with other type I receptors a high degree of similarity in serine-threonine kinase subdomains, a glycine- and serine-rich region (called the GS domain) preceding the kinase domain, and a short C-terminal tail. The encoded protein, sometimes termed ALK1, shares similar domain structures with other closely related ALK or activin receptor-like kinase proteins that form a subfamily of receptor serine/threonine kinases. Mutations in this gene are associated with hereditary hemorrhagic telangiectasia (HHT) type 2, also known as Rendu-Osler-Weber syndrome 2.

Pathology

Germline mutations of ACVRL1 are associated with:
hereditary hemorrhagic telangiectasia type 2 (Rendu-Osler-Weber syndrome 2)
pulmonary arteriovenous malformations

Somatic mosaicism in ACVRL1 is associated with severe pulmonary arterial hypertension. ACVRL1 directly interacts with low-density lipoprotein (LDL), which implies that it might initiate the early phases of atherosclerosis. Abnormal activity of ACVRL1 has been found to be closely associated with idiopathic pulmonary arterial hypertension.

As a drug target

Dalantercept is an experimental ALK1 inhibitor.

Closely/family related kinases (not to be confused with anaplastic lymphoma kinase (ALK)): ALK4 is ACVR1B, ALK7 is ACVR1C, and ALK5 is [part of] the TGF-β type I receptor.

See also TGF beta signaling pathway, see summary table for ALK*
https://en.wikipedia.org/wiki/Kolobok
Kolobok (Cyrillic: колобо́к) is the main character of an East Slavic national fairy tale of the same name, represented as a small yellow spherical bread-like being. The story is often called "Little Round Bun" and sometimes "The Runaway Bun." The fairy tale occurs widely in Slavic regions in a number of variations. A similar fairy tale, with a pancake rolling off, has also been recorded in German and Nordic regions. The plot is similar to that of "The Gingerbread Man" tale in English tradition. The Aarne-Thompson index classifies such stories under a common type, 2025.

Etymology

The origin of the word kolobok is not clear, and several versions have been proposed:
connected with Proto-Slavic *klǫbъ ("something twisted that has a round form, similar to a ball"; "club");
compared with a word meaning "a piece of bread";
from Proto-Slavic *kolo ("circle", "wheel"), that is, "that which is round and rolling";
from a word meaning "kind of wheat bread, pie".

In 2011, Ukraine claimed the origin of kolobok came from the Ukrainian word kolo meaning round, while Russia claimed its origin came from the Russian word kolob, meaning round dough, with risen dough called kolebyatka. There are also many other versions.

Plot

Kolobok, translated from Russian as "round bun", or referencing the Eastern European bread kalach or an old Russian round palt (based on the Swedish food item of the same name), suddenly comes to life and escapes from the home of an old man and old woman (often referred to as a grandpa and grandma). In the Russian fairy tale, the old man and old woman are very poor, and Kolobok is made with the very last food they scraped together to eat. Chased by the old man and old woman, the fairy tale's plot describes Kolobok's repetitive meetings with various animals (often a rabbit, wolf, and bear) who intend to eat it, but Kolobok cunningly escapes by distracting each animal with song, then running away before it can be caught. With each animal Kolobok sings a repeated song. The words of
https://en.wikipedia.org/wiki/Wallace%20H.%20Coulter%20Department%20of%20Biomedical%20Engineering
The Wallace H. Coulter Department of Biomedical Engineering is a department in the Emory University School of Medicine, Georgia Institute of Technology's College of Engineering, and Peking University College of Engineering dedicated to the study of and research in biomedical engineering, and is named after the pioneering engineer and Georgia Tech alumnus Wallace H. Coulter. The graduate program has consistently ranked 2nd in USNWR rankings, while the undergraduate program ranks 1st in USNWR rankings. History Establishment Georgia Tech Provost and Vice President Michael E. Thomas and the Emory Dean of Medicine Thomas J. Lawley established an Advisory Committee of Georgia Tech and Emory faculty to address new opportunities in biomedical engineering. The Committee met initially on June 2, 1997 and was charged to develop a set of recommendations for an innovative and unique Department of Biomedical Engineering that is joint with Georgia Tech and Emory and that will enable both institutions to maximize research and educational opportunities in fields of intersecting biomedical interest. The Committee was required to report to Drs. Thomas and Lawley no later than August 15, 1997. Naming Recognized as one of the most influential inventors of the twentieth century, Wallace Coulter studied electronics as a student at Georgia Tech in the early 1930s. Recognition The National Academy of Engineering awarded three professors in this department, Wendy C. Newstetter, Joseph M. Le Doux, and Paul Benkeser, with the 2019 Bernard M. Gordon Prize for Innovation in Engineering and Technology Education. They were recognized for "fusing problem-driven engineering education with learning science principles to create a pioneering program that develops leaders in biomedical engineering.”
https://en.wikipedia.org/wiki/Projection%20panel
A projection panel (also called an overhead display or LCD panel) is a device that, although no longer in production, served the role a data projector does today. It works with an overhead projector. The panel consists of a translucent LCD and a fan to keep it cool. The projection panel sits on the bed of the overhead projector and acts like a transparency sheet. The panels have a VGA input, and sometimes composite (RCA) and S-Video inputs. Later models have remotes with functions such as 'freeze', which freezes the image; this is useful for leaving something on the screen while doing other things. Earlier models only had 640x480 resolution, while newer ones had up to SVGA resolution. Proxima, one maker of the panels, included a magic wand and a sensor: the sensor detected where the wand was placed, creating an interactive effect, the equivalent of today's smart boards. Although they are no longer produced, used panels can be purchased for a fraction of the price of a data projector. The panels are quite dim, as they do not let a great deal of light through, so brightness can be a problem, even with a powerful overhead projector.
https://en.wikipedia.org/wiki/John%20ellipsoid
In mathematics, the John ellipsoid or Löwner-John ellipsoid E(K) associated to a convex body K in n-dimensional Euclidean space Rn can refer to the n-dimensional ellipsoid of maximal volume contained within K or the ellipsoid of minimal volume that contains K. Often, the minimal volume ellipsoid is called the Löwner ellipsoid, and the maximal volume ellipsoid is called the John ellipsoid (although John worked with the minimal volume ellipsoid in his original paper). One can also refer to the minimal volume circumscribed ellipsoid as the outer Löwner-John ellipsoid, and the maximal volume inscribed ellipsoid as the inner Löwner-John ellipsoid.

Properties

The John ellipsoid is named after the German-American mathematician Fritz John, who proved in 1948 that each convex body in Rn is circumscribed by a unique ellipsoid of minimal volume, and that the dilation of this ellipsoid by a factor of 1/n is contained inside the convex body.

The inner Löwner-John ellipsoid E(K) of a convex body K ⊂ Rn is the closed unit ball B in Rn if and only if B ⊆ K and there exist an integer m ≥ n and, for i = 1, ..., m, real numbers c_i > 0 and unit vectors u_i ∈ S^(n−1) ∩ ∂K such that

∑_{i=1}^{m} c_i u_i = 0

and, for all x ∈ Rn,

x = ∑_{i=1}^{m} c_i (x · u_i) u_i.

Applications

The computation of Löwner-John ellipsoids (and, more generally, the computation of minimal-volume polynomial level sets enclosing a set) has found many applications in control and robotics. In particular, computing Löwner-John ellipsoids has applications in obstacle collision detection for robotic systems, where the distance between a robot and its surrounding environment is estimated using a best ellipsoid fit. Löwner-John ellipsoids have also been used to approximate the optimal policy in portfolio optimization problems with transaction costs.

See also Steiner inellipse, the special case of the inner Löwner-John ellipsoid for a triangle. Fat object, related to radius of largest contained ball.
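As a worked instance of John's contact-point condition (my own example, using the standard basis vectors e_i): for the cube K = [−1, 1]^n, the unit ball is the inner Löwner-John ellipsoid, witnessed by the 2n contact points ±e_i, each with weight c_i = 1/2:

```latex
% Contact points u = \pm e_i with weights c = 1/2 for K = [-1,1]^n.
% First condition (weighted contact points sum to zero):
\sum_{i=1}^{n} \tfrac{1}{2}\, e_i + \sum_{i=1}^{n} \tfrac{1}{2}\,(-e_i) = 0,
% Second condition (decomposition of the identity), for all x \in \mathbb{R}^n:
\qquad
\sum_{i=1}^{n} \tfrac{1}{2}\,\langle x, e_i\rangle\, e_i
  + \sum_{i=1}^{n} \tfrac{1}{2}\,\langle x, -e_i\rangle\,(-e_i)
  = \sum_{i=1}^{n} \langle x, e_i\rangle\, e_i = x .
```

Note the weights must sum to n (take traces in the second condition), which the 2n weights of 1/2 satisfy.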
https://en.wikipedia.org/wiki/Entropy%20power%20inequality
In information theory, the entropy power inequality (EPI) is a result that relates the so-called "entropy power" of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication". Shannon also provided a sufficient condition for equality to hold; Stam (1959) showed that the condition is in fact necessary.

Statement of the inequality

For a random vector X : Ω → Rn with probability density function f : Rn → R, the differential entropy of X, denoted h(X), is defined to be

h(X) = −∫_{Rn} f(x) log f(x) dx

and the entropy power of X, denoted N(X), is defined to be

N(X) = (1 / (2πe)) e^(2h(X)/n).

In particular, N(X) = |K|^(1/n) when X is normally distributed with covariance matrix K. Let X and Y be independent random variables with probability density functions in the Lp space Lp(Rn) for some p > 1. Then

N(X + Y) ≥ N(X) + N(Y).

Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.

Alternative form of the inequality

The entropy power inequality can be rewritten in an equivalent form that does not explicitly depend on the definition of entropy power (see the Costa and Cover reference below). Let X and Y be independent random variables, as above. Then, let X′ and Y′ be independently distributed random variables with Gaussian distributions such that

h(X′) = h(X) and h(Y′) = h(Y).

Then,

h(X + Y) ≥ h(X′ + Y′).

See also Information entropy Information theory Limiting density of discrete points Self-information Kullback–Leibler divergence Entropy estimation
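The Gaussian equality case can be checked numerically. A minimal sketch (function names and the chosen variances are my own): for a 1-D Gaussian with variance σ², h = ½ ln(2πe σ²), so its entropy power is exactly σ², and for independent Gaussians the EPI holds with equality because variances add:

```python
import math

def entropy_power(h, n):
    """Entropy power N(X) = exp(2*h/n) / (2*pi*e) for differential entropy h (nats)."""
    return math.exp(2.0 * h / n) / (2.0 * math.pi * math.e)

def gaussian_entropy(var):
    """Differential entropy (nats) of a 1-D Gaussian with variance var."""
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

# X ~ N(0, 2), Y ~ N(0, 3) independent, so X + Y ~ N(0, 5).
nx = entropy_power(gaussian_entropy(2.0), 1)   # ≈ 2.0
ny = entropy_power(gaussian_entropy(3.0), 1)   # ≈ 3.0
nxy = entropy_power(gaussian_entropy(5.0), 1)  # ≈ 5.0
print(nx + ny, nxy)  # both ≈ 5.0, i.e. N(X+Y) = N(X) + N(Y)
```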
https://en.wikipedia.org/wiki/List%20of%20flags%20of%20Montenegro
This is a list of flags used in Montenegro. For more information about the national flag, visit the article Flag of Montenegro. National flags Standards Military Municipal flags Ethnic groups flags Historical flags National flags Royal flags Civil ensigns Military flags Political flags See also Montenegro Flag of Montenegro Coat of arms of Montenegro Armorial of Montenegro
https://en.wikipedia.org/wiki/Clock%20angle%20problem
Clock angle problems are a type of mathematical problem which involve finding the angle between the hands of an analog clock.

Math problem

Clock angle problems relate two different measurements: angles and time. The angle is typically measured in degrees, clockwise from the 12 o'clock mark. The time is usually based on a 12-hour clock. A method to solve such problems is to consider the rate of change of the angle in degrees per minute. The hour hand of a normal 12-hour analogue clock turns 360° in 12 hours (720 minutes), or 0.5° per minute. The minute hand rotates through 360° in 60 minutes, or 6° per minute.

Equation for the angle of the hour hand

θ_hour = 0.5° × M_Σ = 0.5° × (60 × H + M)

where:
θ_hour is the angle in degrees of the hand, measured clockwise from the 12;
H is the hour;
M is the minutes past the hour;
M_Σ is the number of minutes since 12 o'clock.

Equation for the angle of the minute hand

θ_min = 6° × M

where:
θ_min is the angle in degrees of the hand, measured clockwise from the 12 o'clock position;
M is the minute.

Example

The time is 5:24. The angle in degrees of the hour hand is θ_hour = 0.5° × (60 × 5 + 24) = 162°. The angle in degrees of the minute hand is θ_min = 6° × 24 = 144°.

Equation for the angle between the hands

The angle between the hands can be found using the following formula:

Δθ = |θ_hour − θ_min| = |0.5° × (60 × H + M) − 6° × M| = |0.5° × (60 × H − 11 × M)|

where H is the hour and M is the minute. If the angle is greater than 180 degrees then subtract it from 360 degrees.

Example 1

The time is 2:20. Δθ = |0.5° × (60 × 2 − 11 × 20)| = |0.5° × (120 − 220)| = 50°.

Example 2

The time is 10:16. Δθ = |0.5° × (60 × 10 − 11 × 16)| = |0.5° × (600 − 176)| = 212°, which is greater than 180°, so the angle between the hands is 360° − 212° = 148°.

When are the hour and minute hands of a clock superimposed?

The hour and minute hands are superimposed only when their angles are equal:

0.5° × (60 × H + M) = 6° × M, so M = (60/11) × H.

H is an integer in the range 0–11. This gives times of: 0:00, 1:05 5/11, 2:10 10/11, 3:16 4/11, 4:21 9/11, 5:27 3/11, 6:32 8/11, 7:38 2/11, 8:43 7/11, 9:49 1/11, 10:54 6/11, and 12:00. (5/11 of a minute is exactly 27 3/11 seconds.)

See also Clock position
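The formulas above translate directly into code. A minimal Python sketch (function names are my own), reproducing the two worked examples:

```python
def hour_hand_angle(hour, minute):
    """Angle of the hour hand in degrees, clockwise from 12 (0.5 deg/min)."""
    return 0.5 * ((hour % 12) * 60 + minute)

def minute_hand_angle(minute):
    """Angle of the minute hand in degrees, clockwise from 12 (6 deg/min)."""
    return 6.0 * minute

def angle_between_hands(hour, minute):
    """Smaller angle between the two hands, in degrees."""
    diff = abs(hour_hand_angle(hour, minute) - minute_hand_angle(minute))
    # If the raw difference exceeds 180 deg, take the reflex complement.
    return 360.0 - diff if diff > 180.0 else diff

print(angle_between_hands(2, 20))   # 50.0
print(angle_between_hands(10, 16))  # 148.0
```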
https://en.wikipedia.org/wiki/Tree%20hollow
A tree hollow or tree hole is a semi-enclosed cavity which has naturally formed in the trunk or branch of a tree. They are found mainly in old trees, whether living or not. Hollows form in many species of trees, and are a prominent feature of natural forests and woodlands, and act as a resource or habitat for a number of vertebrate and invertebrate animals. Hollows may form as the result of physiological stress from natural forces causing the excavating and exposure of the heartwood. Forces may include wind, fire, heat, lightning, rain, attack from insects (such as ants or beetles), bacteria, or fungi. Also, trees may self-prune, dropping lower branches as they reach maturity, exposing the area where the branch was attached. Many animals further develop the hollows using instruments such as their beak, teeth or claws. The size of hollows may depend on the age of the tree. For example, eucalypts develop hollows at all ages, but only from when the trees are 120 years old do they form hollows suitable for vertebrates, and it may take 220 years for hollows suitable for larger species to form. Hollows in fallen timber are also very important for animals such as echidnas, numbats, chuditch and many reptiles. In streams, hollow logs may be important to aquatic animals for shelter and egg attachment. Hollows are an important habitat for many wildlife species, especially where the use of hollows is obligate, as this means no other resource would be a feasible substitute. Animals may use hollows as diurnal or nocturnal shelter sites, as well as for rearing young, feeding, thermoregulation, and to facilitate ranging behaviour and dispersal. While use may also be opportunistic, rather than obligate, it may be difficult to determine the nature of a species' relationship to hollows—it may vary across a species' range, or depend on climatic conditions. Animals will select a hollow based on factors including entrance size and shape, depth, and degree of insulation. Such factor
https://en.wikipedia.org/wiki/Recursive%20partitioning
Recursive partitioning is a statistical method for multivariable analysis. Recursive partitioning creates a decision tree that strives to correctly classify members of the population by splitting it into sub-populations based on several dichotomous independent variables. The process is termed recursive because each sub-population may in turn be split an indefinite number of times, until the splitting process terminates after a particular stopping criterion is reached.

Recursive partitioning methods have been developed since the 1980s. Well-known methods of recursive partitioning include Ross Quinlan's ID3 algorithm and its successors, C4.5 and C5.0, and Classification and Regression Trees (CART). Ensemble learning methods such as random forests help to overcome a common criticism of these methods, their vulnerability to overfitting of the data, by employing different algorithms and combining their output in some way. This article focuses on recursive partitioning for medical diagnostic tests, but the technique has far wider applications. See decision tree. As compared to regression analysis, which creates a formula that health care providers can use to calculate the probability that a patient has a disease, recursive partitioning creates a rule such as 'If a patient has finding x, y, or z, they probably have disease q'. A variation is 'Cox linear recursive partitioning'.

Advantages and disadvantages

Compared to other multivariable methods, recursive partitioning has advantages and disadvantages.

Advantages are:
Generates clinically more intuitive models that do not require the user to perform calculations.
Allows varying prioritizing of misclassifications in order to create a decision rule that has more sensitivity or specificity.
May be more accurate.

Disadvantages are:
Does not work well for continuous variables.
May overfit data.

Examples

Examples are available of using recursive partitioning in research of diagnostic tests. Goldman used recursive par
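The splitting idea can be sketched in pure Python. This is a toy illustration of recursive partitioning on dichotomous variables using Gini impurity, not the ID3, C4.5, C5.0 or CART implementations named above; the data, feature names, and stopping rules are all hypothetical:

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def partition(rows, labels, features, min_size=2):
    """Recursively split on the dichotomous feature that most reduces
    Gini impurity; returns a nested dict representing the decision tree."""
    if gini(labels) == 0.0 or len(labels) < min_size or not features:
        return {"leaf": round(sum(labels) / len(labels), 3)}
    def split_cost(f):
        left = [y for r, y in zip(rows, labels) if r[f]]
        right = [y for r, y in zip(rows, labels) if not r[f]]
        n = len(labels)
        return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    best = min(features, key=split_cost)
    if split_cost(best) >= gini(labels):  # stop if no impurity reduction
        return {"leaf": round(sum(labels) / len(labels), 3)}
    rest = [f for f in features if f != best]
    yes = [(r, y) for r, y in zip(rows, labels) if r[best]]
    no = [(r, y) for r, y in zip(rows, labels) if not r[best]]
    return {"split_on": best,
            "yes": partition([r for r, _ in yes], [y for _, y in yes], rest, min_size),
            "no": partition([r for r, _ in no], [y for _, y in no], rest, min_size)}

# Hypothetical screening data: two binary findings, one binary diagnosis.
patients = [{"finding_x": 1, "finding_y": 0}, {"finding_x": 1, "finding_y": 1},
            {"finding_x": 0, "finding_y": 1}, {"finding_x": 0, "finding_y": 0}]
diagnosis = [1, 1, 0, 0]
tree = partition(patients, diagnosis, ["finding_x", "finding_y"])
print(tree)  # splits on finding_x; the two leaves predict 1.0 and 0.0
```

The nested dict plays the role of the clinical rule quoted above: "if a patient has finding x, they probably have disease q."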
https://en.wikipedia.org/wiki/Mediaroom
Mediaroom is a collection of software for operators to deliver Internet Protocol television (IPTV) subscription services, including content-protected, live, digital video recorder, video on demand, multiscreen, and applications. These services can be delivered via a range of devices inside and outside customers' homes, including wired and Wi-Fi set top boxes, PCs, tablets, smartphones and other connected devices, over both the operator's managed IP networks and "over the top" (OTT) or unmanaged networks. According to a marketing firm, Mediaroom was the market leader in IPTV for 2014.

History

Microsoft TV platform

Microsoft announced an UltimateTV service from DirecTV in October 2000, based on technology acquired from WebTV Networks (later renamed MSN TV). The software was called the Microsoft TV platform (which included the Foundation Edition); it had integrated digital video recorder (DVR) and Internet access capabilities. It was released on October 26, 2000. The software to decode and view digital video programming was derived from WebTV (later called MSN TV). UltimateTV had support for picture-in-picture and could record up to 35 hours of video content. The Internet capabilities were provided by Microsoft TV platform software, which was used for the TV guide. The TV guide could display programming schedules for 14 days, and recording could be scheduled for any of the shows. It could also be used to access e-mail. However, Microsoft lost distribution when DirecTV accepted an acquisition bid by EchoStar, who had their own DVR. By 2003, UltimateTV was taken off the market, even though it was still supported by DirecTV and the acquisition by EchoStar had failed. The UltimateTV development team in Mountain View, California was eliminated by early 2002. By June 2002, Moshe Lichtman replaced Jon DeVaan as leader of the division as more reductions were announced.

Foundation Edition

The Microsoft TV Foundation Edition platform integrated video-on-demand (VOD), DVR and HDTV programming with live telev
https://en.wikipedia.org/wiki/Seletar%20Aerospace%20Park
Seletar Aerospace Park is an industrial park in Singapore catering to the aerospace industries. Located in Seletar, the plan to develop 140 hectares of land adjacent to Seletar Airport will further strengthen Singapore's position as an aviation hub. The development of the new aerospace park is geared towards delivering additional space for industry expansion, and complements existing aerospace activities at Changi North and Loyang. Seletar Aerospace Park will host an integrated cluster of activities such as aerospace maintenance, repair and overhaul (MRO); design and manufacturing of aerospace systems and components; business and general aviation activities; and an aviation campus for the training of pilots, aviation professionals and technical personnel.

History

In May 2006, the Singapore Government, together with the Economic Development Board (EDB) and JTC Corporation, unveiled the plan for a new aerospace park. The decision was made as Singapore's aerospace industry had seen soaring growth potential and strong demand for aviation-related services. JTC Corporation was asked to carry out the master-planning and infrastructure improvements for Seletar Aerospace Park, in consultation with other government agencies. The development of the new aerospace hub is expected to take care of the industry's land needs for at least 10 years. A master plan for Seletar Aerospace Park was announced by JTC Corporation on 26 June 2007. Seletar Airport was upgraded to support the park, including lengthening the airport's runway and upgrading avionics systems to allow access for bigger aircraft. Aerospace design and manufacturing companies and training schools were given additional space with new roads and infrastructure. The development of the park would cost more than S$60 million and be done in phases. The park is expected to create 10,000 jobs, predominantly skilled and technical positions, and double the output of the aerospace sector from 2006's record of S$6.3 billion. E
https://en.wikipedia.org/wiki/Azure%20DevOps%20Server
Azure DevOps Server (formerly Team Foundation Server (TFS) and Visual Studio Team System (VSTS)) is a Microsoft product that provides version control (either with Team Foundation Version Control (TFVC) or Git), reporting, requirements management, project management (for both agile software development and waterfall teams), automated builds, testing and release management capabilities. It covers the entire application lifecycle and enables DevOps capabilities. Azure DevOps can be used as a back-end to numerous integrated development environments (IDEs) but is tailored for Microsoft Visual Studio and Eclipse on all platforms. On-premises vs. online Azure DevOps is available in two different forms: on-premises ("Server") and online ("Services"). The latter form is called Azure DevOps Services (formerly Visual Studio Online before it was renamed to Visual Studio Team Services in 2015). The cloud service is backed by the Microsoft Azure cloud platform. It uses the same code as the on-premises version of Azure DevOps, with minor modifications, and implements the most recent features. A user signs in using a Microsoft account to set up an environment, creating projects and adding team members. New features developed in short development cycles are added to the cloud version first. These features migrate to the on-premises version as updates, at approximately three-month intervals. Architecture Server architecture Azure DevOps is built on multi-tier, scalable architecture. The primary structure consists of an application tier responsible for processing logic and maintaining the web application portal (referred to as Team Web Access or TWA). Azure DevOps is built using Windows Communication Foundation web services. These may be consumed by any client, although the client object model is recommended. The data tier and application tier can exist on the same machine. To support scalability, the application tier can be load balanced and the data tier can be clustered. If us
https://en.wikipedia.org/wiki/Thambuli
Thambuli is a type of raita or buttermilk dish eaten in the Indian state of Karnataka. Thambuli, being a curd-based dish, is consumed with rice. Tambuli is derived from the Kannada ತಂಪು+ಹುಳಿ → ತಂಬುಳಿ; thampu means cool/cold, so thambuli is a cooling food. It is made mostly from greens and from vegetables such as carrot and beetroot as its main ingredients. It is prepared by grinding the vegetable with the spices and then mixing it with yogurt. All ingredients are used raw (as they are), without any cooking. The dish is also spelled tambli or tambuli. There are many varieties of thambuli: menthe thambuli, shunti (ginger) thambuli, and various other herbal thambulis prepared with leaves grown all over Karnataka. Many different seasonal vegetables and herbs can be used in the preparation of thambulis, such as doddapatre (ajwain) leaves, karibevu (curry) leaves, coriander leaves, poppy seeds and so on. Various recipes exist, with slight variations in the ingredients. Thambuli is generally prepared mild and not spicy. Fundamentally, thambuli has a few simple whole spices, roasted and ground with seasonal vegetables or herbs (some with coconut), added to buttermilk or curds. Tambuli is another authentic Karnataka recipe.
https://en.wikipedia.org/wiki/Fenna%E2%80%93Matthews%E2%80%93Olson%20complex
The Fenna–Matthews–Olson (FMO) complex is a water-soluble complex and was the first pigment-protein complex (PPC) to have its structure analyzed by X-ray crystallography. It appears in green sulfur bacteria and mediates the excitation energy transfer from light-harvesting chlorosomes to the membrane-embedded bacterial reaction center (bRC). Its structure is trimeric (C3 symmetry). Each of the three monomers contains eight bacteriochlorophyll a (BChl a) molecules. They are bound to the protein scaffold via chelation of their central magnesium atom either to amino acids of the protein (mostly histidine) or to water-bridged oxygen atoms (only one BChl a of each monomer). Since the structure is available, calculating structure-based optical spectra is possible for comparison with experimental optical spectra. In the simplest case, only the excitonic coupling of the BChls is taken into account. More realistic theories consider pigment-protein coupling. An important property is the local transition energy (site energy) of the BChls, which differs for each due to its individual local protein environment. The site energies of the BChls determine the direction of the energy flow.

Some structural information on the FMO-RC super complex is available; it was obtained by electron microscopy and from linear dichroism spectra measured on FMO trimers and FMO-RC complexes. From these measurements, two orientations of the FMO complex relative to the RC are possible. The orientation with BChl 3 and 4 close to the RC and BChl 1 and 6 (following Fenna and Matthews' original numbering) oriented towards the chlorosomes is useful for efficient energy transfer.

Test object

The complex is the simplest PPC appearing in nature and is therefore a suitable test object for the development of methods that can be transferred to more complex systems like photosystem I. Engel and co-workers observed that the FMO complex exhibits remarkably long quantum coherence, but after about a decade of debate, it was shown tha
https://en.wikipedia.org/wiki/Department%20of%20Petroleum%20Engineering%20and%20Applied%20Geophysics%2C%20NTNU
In 2017 the department was merged with the Department of Geology and Mineral Resources Engineering, forming the new Department of Geoscience and Petroleum. The Norwegian University of Science and Technology (NTNU) is the key university of science and technology in Norway. The Department of Petroleum Engineering and Applied Geophysics (IPT) was established in 1973, shortly after the start of production (Ekofisk field) from the Norwegian continental shelf. The department came to include petroleum engineering as well as geophysics, which is seen as a major strength of the petroleum education at NTNU. The department has an elected chairman and vice chairman, and four informal groups of professors: geophysics, drilling, production and reservoir engineering. The stated primary purpose of maintaining the informal groups is to take care of the teaching in their respective disciplines. Each group is responsible for offering a sufficient number of courses, semester projects and thesis projects at M.Sc. and Ph.D. levels in its discipline, and for making annual revisions of these in accordance with the needs of society and industry. The total number of professors, associate professors, assistant professors and adjunct professors is 32. The administrative staff is led by a department administrator and consists of a total of 6 secretaries. The technical support staff reports to the department head and consists of 8 engineers and technicians. Until 2000, the department was part of the Applied Earth Sciences faculty, together with the Geology department. Since then, it has been part of the Faculty of Engineering Science and Technology (one of a total of 10 departments). Brief historical statistics of the department: Established in 1973 More than 2000 graduated M.Sc.s More than 150 graduated Ph.D.s Around 120 M.Sc.s graduate every year Around 10 Ph.D.s graduate every year Currently around 120 full-time teachers, researchers and staff Around 450 students enrolled at B.Sc.
https://en.wikipedia.org/wiki/Keldysh%20formalism
In non-equilibrium physics, the Keldysh formalism is a general framework for describing the quantum mechanical evolution of a system in a non-equilibrium state or systems subject to time-varying external fields (electric field, magnetic field, etc.). Historically, it was foreshadowed by the work of Julian Schwinger and proposed almost simultaneously by Leonid Keldysh and, separately, Leo Kadanoff and Gordon Baym. It was further developed by later contributors such as O. V. Konstantinov and V. I. Perel. Extensions to driven-dissipative open quantum systems have been given not only for bosonic systems, but also for fermionic systems. The Keldysh formalism provides a systematic way to study non-equilibrium systems, usually based on the two-point functions corresponding to excitations in the system. The main mathematical object in the Keldysh formalism is the non-equilibrium Green's function (NEGF), which is a two-point function of particle fields. In this way, it resembles the Matsubara formalism, which is based on equilibrium Green functions in imaginary time and treats only equilibrium systems. Time evolution of a quantum system Consider a general quantum mechanical system. This system has the Hamiltonian H. Let the initial state of the system be the pure state |Ψ⟩. If we now add a time-dependent perturbation to this Hamiltonian, say H′(t), the full Hamiltonian is H(t) = H + H′(t), and hence the system will evolve in time under the full Hamiltonian. In this section, we will see how time evolution actually works in quantum mechanics. Consider a Hermitian operator O. In the Heisenberg picture of quantum mechanics, this operator is time-dependent and the state is not. The expectation value of the operator is given by ⟨O(t)⟩ = ⟨Ψ|O_H(t)|Ψ⟩, where, due to time evolution of operators in the Heisenberg picture, O_H(t) = U†(t, t₀) O U(t, t₀). The time-evolution unitary operator is the time-ordered exponential of an integral, U(t, t₀) = T exp(−i ∫_{t₀}^{t} dt′ H(t′)). (Note that if the Hamiltonian at one time commutes with the Hamiltonian at different times, then this can be simplified to
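The time-ordered exponential can be approximated numerically by slicing the interval and multiplying short-time propagators in time order, with later times acting on the left. A minimal sketch for a hypothetical single-spin Hamiltonian H(t) = f(t) σx, where each slice has the closed form exp(−iθσx) = cos θ · I − i sin θ · σx; the drive f(t) below is an arbitrary illustrative choice:

```python
import math

def mat_mul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def step_propagator(theta):
    """exp(-i * theta * sigma_x) = cos(theta) I - i sin(theta) sigma_x."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -1j * s], [-1j * s, c]]

def time_ordered_exponential(f, t0, t1, n=1000):
    """Approximate U(t1, t0) = T exp(-i \\int f(t) sigma_x dt) as a
    time-ordered product of short-time propagators: each later slice
    is multiplied on the left."""
    dt = (t1 - t0) / n
    U = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(n):
        t = t0 + (k + 0.5) * dt          # midpoint of each slice
        U = mat_mul(step_propagator(f(t) * dt), U)
    return U

# Example drive f(t) = sin(t)^2.  Since this H(t) commutes with itself
# at all times, the ordered product must reduce to exp(-i * Theta * sigma_x)
# with Theta = integral of f, which serves as a consistency check; for a
# non-commuting H(t) the ordering in the product is essential.
U = time_ordered_exponential(lambda t: math.sin(t) ** 2, 0.0, 3.0)
```

Unitarity of the result follows because each slice propagator is exactly unitary.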
https://en.wikipedia.org/wiki/Mallet%20%28habit%29
A mallet is a small-tree form of Eucalyptus found in Western Australia. Unlike the mallee, it is single-stemmed and lacks a lignotuber. Species of this form have a relatively long, slender trunk, steeply-angled branches, often a conspicuously dense terminal crown, and sometimes form thickets. Mallet species include: Brown mallet (Eucalyptus astringens) Blue mallet, blue-leaved mallet, Gardner's mallet (Eucalyptus gardneri) Green mallet (Eucalyptus clivicola) Salt River mallet, Sargent's mallet (Eucalyptus sargentii) Silver mallet (Eucalyptus falcata or Eucalyptus ornata) Steedman's mallet (Eucalyptus steedmanii) Swamp mallet (Eucalyptus spathulata) White mallet (Eucalyptus falcata or Eucalyptus spathulata) Fuchsia gum (Eucalyptus dolichorhyncha) See also Mallee Marlock Gimlet
https://en.wikipedia.org/wiki/Tyrosine%20kinase%202
Non-receptor tyrosine-protein kinase TYK2 is an enzyme that in humans is encoded by the TYK2 gene. TYK2 was the first member of the JAK family to be described (the other members are JAK1, JAK2, and JAK3). It has been implicated in IFN-α, IL-6, IL-10 and IL-12 signaling. Function This gene encodes a member of the tyrosine kinase family and, more specifically, of the Janus kinase (JAK) protein family. This protein associates with the cytoplasmic domain of type I and type II cytokine receptors and propagates cytokine signals by phosphorylating receptor subunits. It is also a component of both the type I and type III interferon signaling pathways. As such, it may play a role in anti-viral immunity. Cytokines play pivotal roles in immunity and inflammation by regulating the survival, proliferation, differentiation, and function of immune cells, as well as cells from other organ systems. Hence, targeting cytokines and their receptors is an effective means of treating such disorders. Type I and II cytokine receptors associate with Janus family kinases (JAKs) to effect intracellular signaling. Cytokines including interleukins, interferons and hemopoietins activate the Janus kinases, which associate with their cognate receptors. The mammalian JAK family has four members: JAK1, JAK2, JAK3 and tyrosine kinase 2 (TYK2). The connection between JAKs and cytokine signaling was first revealed when a screen for genes involved in interferon type I (IFN-1) signaling identified TYK2 as an essential element, which is activated by an array of cytokine receptors. TYK2 has broader and more profound functions in humans than previously appreciated on the basis of analysis of murine models, which indicate that TYK2 functions primarily in IL-12 and type I IFN signaling. TYK2 deficiency has more dramatic effects in human cells than in mouse cells. However, in addition to IFN-α and -β and IL-12 signaling, TYK2 has major effects on the transduction of IL-23, IL-10, and IL-6 signals. Since IL-6 si
https://en.wikipedia.org/wiki/Bonnet%20theorem
In the mathematical field of differential geometry, the fundamental theorem of surface theory deals with the problem of prescribing the geometric data of a submanifold of Euclidean space. Originally proved by Pierre Ossian Bonnet in 1867, it has since been extended to higher dimensions and non-Euclidean contexts. Bonnet's theorem Any surface in three-dimensional Euclidean space has a first and second fundamental form, which are automatically interrelated by the Gauss–Codazzi equations. Bonnet's theorem asserts a local converse to this result. Given an open region V in ℝ², let g and h be symmetric 2-tensors on V, with g additionally required to be positive-definite. If these are smooth and satisfy the Gauss–Codazzi equations, then Bonnet's theorem says that V is covered by open sets which can be smoothly embedded into ℝ³ with first fundamental form g and second fundamental form (relative to one of the two choices of unit normal vector field) h. Furthermore, each of these embeddings is uniquely determined up to a rigid motion of ℝ³. Bonnet's theorem is a corollary of the Frobenius theorem, upon viewing the Gauss–Codazzi equations as a system of first-order partial differential equations for the two coordinate derivatives of the position vector of an embedding, together with the normal vector. General formulations Bonnet's theorem can be naturally formulated for hypersurfaces in a Euclidean space of any dimension, and the result remains true in this context. Furthermore, the theorem can be extended from Bonnet's local formulation to a global formulation, allowing the domain to be any connected and simply-connected smooth manifold, with the result asserting the existence and uniqueness (up to a rigid motion) of a smooth immersion of it as a hypersurface of Euclidean space with first fundamental form g and second fundamental form h. The idea of the proof is to use the existence theory from the local formulation to construct the immersion along arbitrary curves emanating from a single point. Sim
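For reference, the Gauss–Codazzi equations can be written in classical surface notation, with first-fundamental-form coefficients E, F, G, second-fundamental-form coefficients L, M, N, and the Christoffel symbols of the first fundamental form. This is one common form; sign and index conventions vary between sources:

```latex
\begin{align*}
  &\text{Gauss equation:}
    &K &= \frac{LN - M^2}{EG - F^2},
    \quad K \text{ the Gaussian curvature determined by } g,\\
  &\text{Codazzi--Mainardi equations:}
    &L_v - M_u &= L\,\Gamma^1_{12} + M\,\bigl(\Gamma^2_{12} - \Gamma^1_{11}\bigr) - N\,\Gamma^2_{11},\\
  & &M_v - N_u &= L\,\Gamma^1_{22} + M\,\bigl(\Gamma^2_{22} - \Gamma^1_{12}\bigr) - N\,\Gamma^2_{12}.
\end{align*}
```

The Gauss equation expresses that the candidate curvature of h must match the intrinsic curvature of g, while the Codazzi–Mainardi equations constrain the derivatives of h.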
https://en.wikipedia.org/wiki/Andreas%20Blass
Andreas Raphael Blass (born October 27, 1947) is a mathematician, currently a professor at the University of Michigan. He works in mathematical logic, particularly set theory, and theoretical computer science. Blass graduated from the University of Detroit, where he was a Putnam Fellow in 1965, in 1966 with a B.S. in physics. He received his Ph.D. in 1970 from Harvard University, with a thesis on Orderings of Ultrafilters written under the supervision of Frank Wattenberg. Since 1970 he has been employed by the University of Michigan, first as a T.H. Hildebrandt Research Instructor (1970–72), then assistant professor (1972–76), associate professor (1976–84) and since 1984 he has been a full professor there. In 2014, he became a Fellow of the American Mathematical Society. Selected publications and results In 1984 Blass proved that the existence of a basis for every vector space is equivalent to the axiom of choice. He made important contributions in the development of the set theory of the reals and forcing. Blass was the first to point out connections between game semantics and linear logic. He has authored more than 200 research articles in mathematical logic and theoretical computer science, including:
https://en.wikipedia.org/wiki/L%C3%A9o-Pariseau%20Prize
The Léo-Pariseau Prize is a Québécois prize which is awarded annually to a distinguished individual working in the field of biological or health sciences. The prize is awarded by the Association francophone pour le savoir (Acfas), and is named after Léo Pariseau, the first president of Acfas. The award was inaugurated in 1944 and was the first Acfas prize. Prior to 1980 the prize was awarded to researchers in a large variety of disciplines, before being restricted to biological and health sciences. There are now ten annual prizes for researchers in different disciplines. Winners Source: Acfas – Prix de la Recherche Scientifique de l'Acfas – Prix Léo-Pariseau 1944 - Marie-Victorin Kirouac, botany, Université de Montréal 1945 - Paul-Antoine Giguère, chemistry, Université Laval 1946 - Marius Barbeau, ethnology, Université Laval 1947 - Jacques Rousseau, botany and ethnology, Université de Montréal 1948 - Léo Marion, chemistry, University of Ottawa 1949 - Jean Bruchési, history and political science, Université de Montréal 1950 - Louis-Charles Simard, pathology, Université de Montréal 1951 - Cyrias Ouellet, chemistry, Université Laval 1952 - Louis-Paul Dugal, physiology, Université de Montréal 1953 - Guy Frégault, history, Université de Montréal 1954 - Pierre Demers. physics, Université de Montréal 1955 - René Pomerleau, mycology, Université de Montréal 1956 - Marcel Rioux, anthropology, Université de Montréal 1957 - No prize awarded. 1958 - Roger Gaudry, chemistry, Université de Montréal 1959 - Lionel Daviault, entomology 1960 - Marcel Trudel, history, Université Laval 1961 - Raymond Lemieux, chemistry, University of Alberta 1962 - Charles-Philippe Leblond, histology, McGill University 1963 - Lionel Groulx, history, Université de Montréal 1964 - Larkin Kerwin, physics, Université Laval 1965 - Pierre Dansereau, ecology, Université du Québec à Montréal 1966 - Noël Mailloux, psychology, Université de Montréal 1967 - Albéric Boivin, physics, Université Laval 1968 - Lé
https://en.wikipedia.org/wiki/Knowledge%20space
In mathematical psychology and education theory, a knowledge space is a combinatorial structure used to formulate mathematical models describing the progression of a human learner. Knowledge spaces were introduced in 1985 by Jean-Paul Doignon and Jean-Claude Falmagne, and remain in extensive use in education theory. Modern applications include two computerized tutoring systems, ALEKS and the defunct RATH. Formally, a knowledge space assumes that a domain of knowledge is a collection of concepts or skills, each of which must eventually be mastered. Not all concepts are interchangeable; some require other concepts as prerequisites. Conversely, competency at one skill may ease the acquisition of another through similarity. A knowledge space marks out which collections of skills are feasible: they can be learned without mastering any other skills. Under reasonable assumptions, the collection of feasible competencies forms the mathematical structure known as an antimatroid. Researchers and educators usually explore the structure of a discipline's knowledge space as a latent class model. Motivation Knowledge Space Theory attempts to address shortcomings of standardized testing when used in educational psychometry. Common tests, such as the SAT and ACT, compress a student's knowledge into a very small range of ordinal ranks, in the process effacing the conceptual dependencies between questions. Consequently, the tests cannot distinguish between true understanding and guesses, nor can they identify a student's particular weaknesses, only the general proportion of skills mastered. The goal of knowledge space theory is to provide a language by which exams can communicate what the student can do and what the student is ready to learn. Model structure Knowledge Space Theory-based models presume that an educational subject can be modeled as a finite set Q of concepts, skills, or topics. Each feasible state of knowledge about Q is then a subset of Q; the set of
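The defining axioms of a knowledge space, that the family of feasible states contains the empty set and the full domain and is closed under union, can be checked directly. A small sketch; the three-skill prerequisite chain below is a made-up example:

```python
from itertools import combinations

def is_knowledge_space(domain, states):
    """Check the defining axioms of a knowledge space: the family of
    states contains the empty set and the full domain, and is closed
    under union."""
    states = {frozenset(s) for s in states}
    if frozenset() not in states or frozenset(domain) not in states:
        return False
    return all((a | b) in states for a, b in combinations(states, 2))

# Hypothetical 3-skill domain where 'a' is a prerequisite for 'b',
# and 'b' for 'c': the only feasible states form a chain.
domain = {'a', 'b', 'c'}
chain = [set(), {'a'}, {'a', 'b'}, {'a', 'b', 'c'}]
```

A family that omits the union of two of its states, for instance one containing {'a'} and {'b'} but not {'a', 'b'}, fails the check.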
https://en.wikipedia.org/wiki/Biodiversity%20Outcomes%20Framework
Canada's Biodiversity Outcomes Framework was approved by Ministers responsible for Environment, Forests, Parks, Fisheries and Aquaculture, and Wildlife in October 2006. It has been developed further to the Canadian Biodiversity Strategy, an implementation measure required under Article 6 of the United Nations Convention on Biological Diversity. Criticism of the Framework The Framework has been developed from the Canadian Biodiversity Strategy, which has been criticized as having a tendency to focus on species and to assign less importance to other scales of biodiversity from the genetic to the ecosystem level. See also Criticisms of the biodiversity paradigm
https://en.wikipedia.org/wiki/American%20Society%20of%20Cytopathology
The American Society of Cytopathology (ASC), founded in 1951, is a national professional society of physicians, cytotechnologists and scientists who are dedicated to cytopathology, which involves the cytologic method of diagnostic pathology. They have more than 3000 members including representatives for other countries. The ASC provides a forum where physicians and cytotechnologists can interact with one another. Although this society is involved with guidelines for cytotechnologists and admits certain qualified cytotechnologists, it should not be confused with the American Society for Cytotechnology (ASCT). The official journal of the ASC is the Journal of the American Society of Cytopathology (JASC) published bimonthly.
https://en.wikipedia.org/wiki/Biodiversity%20Convention%20Office
Canada's Biodiversity Convention Office (BCO) serves as National Focal Point for the United Nations Convention on Biological Diversity and the Canadian Biodiversity Strategy. BCO also provides a leadership role in the Biodiversity Conservation Working Group of the Commission for Environmental Cooperation and in the Conservation of Arctic Flora and Fauna (CAFF) working group of the Arctic Council. The BCO was established by Environment Canada in September 1991 to coordinate Canadian involvement in the negotiations of the Convention. Following Canada's ratification of the Convention in December 1992, attention shifted to development of a Canadian response. Under the guidance of the BCO, a Federal-Provincial-Territorial Working Group was charged by the Canadian Council of Ministers of the Environment with developing the Canadian Biodiversity Strategy. In 1996, all jurisdictions signed a statement of commitment to use the Strategy as a guide to implementing the Convention in Canada. In 2005, Ministers instructed the Federal-Provincial-Territorial Working Group to develop a corresponding outcomes-based framework for guiding and monitoring implementation of the Canadian Biodiversity Strategy. This Biodiversity Outcomes Framework was approved by Ministers responsible for Environment, Forests, Parks, Fisheries and Aquaculture, and Wildlife in October 2006. The BCO plays a policy coordinating, catalysing and facilitating role in national efforts to define Canada's response to the Convention and National Strategy. It operates through a network of contacts within and outside government. At the federal level, an Interdepartmental Committee on Biodiversity provides advice and guidance on domestic and international policy issues. The Federal/Provincial/Territorial Biodiversity Working Group focuses on national biodiversity issues. BCO also works with indigenous groups to enable their participation in meeting the objectives of the Convention and the Canadian Biodiversity Stra
https://en.wikipedia.org/wiki/Mathieu%20wavelet
The Mathieu equation is a linear second-order differential equation with periodic coefficients. The French mathematician Émile Léonard Mathieu first introduced this family of differential equations, nowadays termed Mathieu equations, in his “Memoir on vibrations of an elliptic membrane” in 1868. "Mathieu functions are applicable to a wide variety of physical phenomena, e.g., diffraction, amplitude distortion, inverted pendulum, stability of a floating body, radio frequency quadrupole, and vibration in a medium with modulated density" Elliptic-cylinder wavelets This is a wide family of wavelet systems that provide a multiresolution analysis. The magnitude of the detail and smoothing filters corresponds to first-kind Mathieu functions with odd characteristic exponent. The number of notches of these filters can be easily designed by choosing the characteristic exponent. Elliptic-cylinder wavelets derived by this method possess potential application in the fields of optics and electromagnetism due to their symmetry. Mathieu differential equations Mathieu's equation is related to the wave equation for the elliptic cylinder. In 1868, the French mathematician Émile Léonard Mathieu introduced a family of differential equations nowadays termed Mathieu equations. Given a, q ∈ ℝ, the Mathieu equation is given by y″ + [a − 2q cos(2w)]y = 0. The Mathieu equation is a linear second-order differential equation with periodic coefficients. For q = 0, it reduces to the well-known harmonic oscillator, a being the square of the frequency. The solutions of the Mathieu equation are the elliptic-cylinder harmonics, known as Mathieu functions. They have long been applied to a broad range of waveguide problems involving elliptical geometry, including: analysis of weak guiding for step-index elliptical-core optical fibres power transport of elliptical waveguides evaluating radiated waves of elliptical horn antennas elliptical annular microstrip antennas with arbitrary eccentricity scattering by a coated
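The reduction of Mathieu's equation to a harmonic oscillator at q = 0 gives a simple numerical check. A sketch using a classical fourth-order Runge–Kutta integrator; the step count and initial conditions are arbitrary choices:

```python
import math

def mathieu_solve(a, q, w_max, n=4000):
    """Integrate Mathieu's equation y'' + (a - 2 q cos 2w) y = 0 on
    [0, w_max] with y(0) = 1, y'(0) = 0, using classical RK4 on the
    first-order system (y, y'); returns y(w_max)."""
    def f(w, y, yp):
        return yp, -(a - 2.0 * q * math.cos(2.0 * w)) * y
    h = w_max / n
    w, y, yp = 0.0, 1.0, 0.0
    for _ in range(n):
        k1y, k1p = f(w, y, yp)
        k2y, k2p = f(w + h / 2, y + h / 2 * k1y, yp + h / 2 * k1p)
        k3y, k3p = f(w + h / 2, y + h / 2 * k2y, yp + h / 2 * k2p)
        k4y, k4p = f(w + h, y + h * k3y, yp + h * k3p)
        y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        yp += h / 6 * (k1p + 2 * k2p + 2 * k3p + k4p)
        w += h
    return y

# Sanity check: for q = 0 the equation is y'' + a y = 0, so the
# solution with these initial conditions is y(w) = cos(sqrt(a) * w).
```

For nonzero q the same integrator produces the (non-periodic, in general) elliptic-cylinder harmonics numerically.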
https://en.wikipedia.org/wiki/Netgear%20MP101
The Netgear MP101 was the first of a series of digital media receivers by Netgear. Family history The Netgear MP101's family also includes other devices such as the MP115, the EVA700, and the EVA8000. Appearance The Netgear MP101 is a small brushed-silver unit that can provide a link between a PC-based MP3 collection and a conventional hi-fi. Concept The MP101 requires a UPnP AV media server to provide access to digital media, while some other units (and the later EVA8000) can read directly from a Windows share (or a NAS device). Netgear does not manufacture the devices itself; they are instead produced by a third-party company and marketed as Netgear products. Implementation The MP101 is based on the ARM9 Marvell Libertas 88W8510H system-on-a-chip and has 8 MB of DRAM. Netgear licensed the ARM MP3 decoder software for use with the device. The MP101 runs the open-source eCos real-time operating system. Netgear made the source code available online. External links MP101 review (December 20, 2004) ARM press release: NetGear Builds Innovative Wireless MP3 Player Around ARM Powered SOC From Marvell (March 16, 2004) MP101 Internet audio players
https://en.wikipedia.org/wiki/PrintableString
A PrintableString is a restricted character string type in the ASN.1 notation. It is used to describe data that consists only of a specific printable subset of the ASCII character set. According to the ASN.1 specification of basic notation, the character set of PrintableString comprises the upper-case letters A–Z, the lower-case letters a–z, the digits 0–9, the space character, and the punctuation characters ' ( ) + , - . / : = ?. The PrintableString definition does not include the at sign (@) or ampersand (&). This sometimes causes problems for naive implementers who attempt to put an SMTP email address into an X.509 digital certificate Distinguished Name. The PrintableString definition also does not include the asterisk (*), which means it must not be used to represent a wildcard in an X.509 digital certificate Distinguished Name. See also The X.690 encoding standard for ASN.1 IA5String
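A validity check against this character set is straightforward. A minimal sketch; the function name is ours, not from any ASN.1 library:

```python
import string

# The PrintableString alphabet: upper- and lower-case letters, digits,
# space, and the punctuation characters ' ( ) + , - . / : = ?
PRINTABLE_STRING_CHARS = set(
    string.ascii_letters + string.digits + " '()+,-./:=?"
)

def is_printable_string(s):
    """Return True if every character of s is in the ASN.1
    PrintableString character set."""
    return all(c in PRINTABLE_STRING_CHARS for c in s)
```

Both problem cases from the text are rejected: an SMTP address fails on `@`, and a wildcard name fails on `*`.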
https://en.wikipedia.org/wiki/Collision%20problem
The r-to-1 collision problem is an important theoretical problem in complexity theory, quantum computing, and computational mathematics. The collision problem most often refers to the 2-to-1 version: given an even n and a function f : {1, …, n} → {1, …, n}, we are promised that f is either 1-to-1 or 2-to-1. We are only allowed to make queries about the value of f(x) for any x. The problem then asks how many such queries we need to make to determine with certainty whether f is 1-to-1 or 2-to-1. Classical solutions Deterministic Solving the 2-to-1 version deterministically requires n/2 + 1 queries, and in general distinguishing r-to-1 functions from 1-to-1 functions requires n/r + 1 queries. This is a straightforward application of the pigeonhole principle: if a function is r-to-1, then after n/r + 1 queries we are guaranteed to have found a collision. If a function is 1-to-1, then no collision exists. Thus, n/r + 1 queries suffice. If we are unlucky, then the first n/r queries could return distinct answers, so n/r + 1 queries are also necessary. Randomized If we allow randomness, the problem is easier. By the birthday paradox, if we choose Θ(√n) (distinct) queries at random, then with high probability we find a collision in any fixed 2-to-1 function. Quantum solution The BHT algorithm, which uses Grover's algorithm, solves this problem optimally by making only O(n^(1/3)) queries to f.
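The randomized birthday strategy is easy to simulate. A sketch with a hypothetical 2-to-1 function that pairs each x with its neighbour; the query budget of about 10√n is an arbitrary but comfortable choice:

```python
import random

def find_collision_randomized(f, n, trials=None):
    """Randomized collision search: query f at randomly chosen distinct
    points in {1, ..., n} and report a colliding pair if two queries
    agree.  By the birthday paradox, O(sqrt(n)) queries suffice with
    high probability when f is 2-to-1; returns None if no collision is
    seen (consistent with f being 1-to-1)."""
    if trials is None:
        trials = 10 * int(n ** 0.5) + 1
    seen = {}          # maps observed value -> query that produced it
    for x in random.sample(range(1, n + 1), min(trials, n)):
        y = f(x)
        if y in seen:
            return seen[y], x
        seen[y] = x
    return None

# Hypothetical 2-to-1 function on {1, ..., n}: x and x+1 share a value
# for odd x, i.e. f(1) = f(2) = 1, f(3) = f(4) = 2, and so on.
n = 10000
two_to_one = lambda x: (x + 1) // 2
```

The deterministic bound still applies: to be *certain* the function is 1-to-1 rather than 2-to-1, n/2 + 1 queries are unavoidable.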
https://en.wikipedia.org/wiki/Norton%20AntiBot
Norton AntiBot, developed by Symantec, monitored applications for damaging behavior. The application was designed to prevent computers from being hijacked and controlled by hackers. According to Symantec, over 6 million computers have been hijacked, and the majority of users are unaware of their computers being hacked. AntiBot was designed to be used in conjunction with other antivirus software. Unlike traditional antivirus products, AntiBot does not use signatures; there is a delay between when a vendor discovers a virus and distributes the signature. During the delay, computers can be affected. Instead, AntiBot attempts to identify a virus through its actions; viruses are malicious by nature. However, AntiBot was not intended to replace an antivirus product. The program uses technology licensed from Sana Security. The product has been discontinued after AVG acquired Sana Security in January 2009, developing a standalone program similar to AntiBot called AVG Identity protection, which was also discontinued and integrated in AVG Internet Security 2011. Product updates and technical support were available from Symantec for one year after a customer's last purchase or renewal. History Ed Kim, director of product management at Symantec, highlighted the rise of botnets. A botnet is a collection of compromised computers, known as bots, which hackers usually control for malicious purposes. Two main uses of botnets include identity theft and e-mail spam. Kim cited a 29 percent increase of bots from the first half of 2006 to the second half. In all, there were six million active bots by the end of 2006. On 7 June 2007, Symantec released a beta version of Norton AntiBot. AntiBot was designed to supplement a user's existing antivirus software. Unlike traditional antivirus software, AntiBot does not use signatures to identify malware. Instead, it monitors running applications for damaging or malicious behavior, licensing technology from Sana Security. AntiBot can also sup
https://en.wikipedia.org/wiki/Halbert%20L.%20Dunn%20Award
The Halbert L. Dunn Award is the most prestigious award presented by the National Association for Public Health Statistics and Information Systems (NAPHSIS). The award has been presented since 1981 providing national recognition of outstanding and lasting contributions to the field of vital and health statistics at the national, state, or local level. The award was established in honor of the late Halbert L. Dunn, M.D., Director of the National Office of Vital Statistics from 1936 to 1960. Dr. Dunn was highly instrumental in encouraging the states to establish state vital statistics associations and played a major role in developing NAPHSIS. The award is presented at the Hal Dunn Awards Luncheon during the association’s annual meeting. The winners of the Halbert L. Dunn Award have been: Source: NAPHSIS 1981 Deane Huxtable 1982 Loren Chancellor 1983 Vito Logrillo 1984 Carl Erhardt 1985 Irvin Franzen 1986 W. D. "Don" Carroll 1987 Margaret Shackelford 1988 John Brockert, State Registrar, Utah 1989 Margaret Watts 1990 John Patterson 1991 Patricia Potrzebowski, State Registrar, Pennsylvania 1992 Rose Trasatti, National Association for Public Health Statistics and Information Systems (NAPHSIS) 1993 Garland Land, State Registrar, Missouri 1994 George Van Amburg 1995 Jack Smith 1996 no award 1997 Ray Nashold 1998 Iwao Moriyama 1999 no award 2000 George Gay 2001 Dorothy Harshbarger, State Registrar, Alabama 2002 Lorne Phillips, State Registrar, Kansas 2003 Mary Anne Freedman, Director of the Division of Vital Statistics, NCHS 2004 no award 2005 Joe Carney 2006 Dan Friedman 2007 Harry Rosenberg, National Center for Health Statistics 2008 Alvin T. Onaka, Registrar, Hawaii 2009 Marshall Evans, National Center for Health Statistics 2010 Steven Schwartz, Registrar, New York City 2011 Charles Rothwell, Director, National Center for Health Statistics 2012 no award 2013 Stephanie Ventura, Director of Reproductive Statistics Branch, National
https://en.wikipedia.org/wiki/National%20Society%20of%20Hispanic%20Physicists
The National Society of Hispanic Physicists (NSHP) was established in 1996 with the goal of promoting the participation and advancement of Hispanic-Americans in physics and celebrating the contributions of Hispanic-American physicists to the study and teaching of physics. Brief history A grant from the Sloan Foundation was awarded to the Pan-American Association for Physics to establish the National Society of Hispanic Physicists (NSHP). The Founding Meeting of the Society was held in Austin at the University of Texas in April, 1996 and the first annual meeting was held in Houston, Texas in October 1997. The US-Mexico Workshop on Teaching Introductory Physics, the first major project undertaken by the Society, was held later that year in Monterrey, Mexico. The project was a bilingual joint venture between the NSHP and the Instituto Tecnológico y de Estudios Superiores de Monterrey (ITESM) to explore the goals of the introductory physics sequence and recent pedagogical developments to meet those goals. The Hispanic Physicist, the official newsletter of the NSHP, has been in publication since 1997. The NSHP meets jointly with other societies organizing sessions, hosting social functions, promoting discussions of diversity and inclusion issues in the physics community, and recognizing achievements of Hispanic-American physics students and faculty. The National Society of Hispanic Physicists has met annually with the Society for the Advancement of Chicanos and Native Americans in Science (SACNAS) since 1997 and twice with the American Association of Physics Teachers (AAPT) (in Austin, TX in 2003 and Albuquerque, NM in 2005). In addition, the NSHP has met at sectional meetings of the American Physical Society (APS) and the American Astronomical Society (AAS). More recently the Society has met annually with the National Society of Black Physicists. The organization was incorporated under the umbrella of the Southeastern Universities Research Association (SURA) in Aug
https://en.wikipedia.org/wiki/Signal%20subspace
In signal processing, signal subspace methods are empirical linear methods for dimensionality reduction and noise reduction. These approaches have attracted significant interest and investigation in the context of speech enhancement, speech modeling, and speech classification research. The signal subspace is also used in radio direction finding using the MUSIC algorithm. Essentially the methods represent the application of a principal component analysis (PCA) approach to ensembles of observed time-series obtained by sampling, for example sampling an audio signal. Such samples can be viewed as vectors in a high-dimensional vector space over the real numbers. PCA is used to identify a set of orthogonal basis vectors (basis signals) which capture as much as possible of the energy in the ensemble of observed samples. The vector space spanned by the basis vectors identified by the analysis is then the signal subspace. The underlying assumption is that the information in speech signals is almost completely contained in a small linear subspace of the overall space of possible sample vectors, whereas additive noise is typically distributed through the larger space isotropically (for example when it is white noise). By projecting a sample onto the signal subspace, that is, keeping only the component of the sample that is in the signal subspace defined by linear combinations of the first few most energized basis vectors, and throwing away the rest of the sample, which lies in the remainder of the space orthogonal to this subspace, a certain amount of noise filtering is obtained. Signal subspace noise reduction can be compared to Wiener filter methods. There are two main differences: The basis signals used in Wiener filtering are usually harmonic sine waves, into which a signal can be decomposed by Fourier transform. In contrast, the basis signals used to construct the signal subspace are identified empirically, and may for example be chirps, or particular character
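The projection step can be sketched in a toy setting: a one-dimensional signal subspace estimated from the sample covariance by power iteration, with each sample then projected onto it. This is a minimal illustration with zero-mean samples, not a production speech-enhancement method; the dimensions and noise levels below are made up:

```python
import random, math

def dominant_eigvec(cov, iters=200):
    """Power iteration for the leading eigenvector of a symmetric matrix."""
    d = len(cov)
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def project_onto_signal_subspace(samples):
    """Estimate a one-dimensional signal subspace from the sample
    covariance (mean assumed zero) and project each sample onto it."""
    d, n = len(samples[0]), len(samples)
    cov = [[sum(s[i] * s[j] for s in samples) / n for j in range(d)]
           for i in range(d)]
    v = dominant_eigvec(cov)
    denoised = []
    for s in samples:
        c = sum(si * vi for si, vi in zip(s, v))   # coordinate along v
        denoised.append([c * vi for vi in v])      # keep only that part
    return denoised, v

# Toy ensemble: a 1-D "signal" along (1, 1)/sqrt(2) plus weak
# isotropic noise, mimicking the assumption described in the text.
random.seed(0)
samples = []
for _ in range(500):
    a = random.gauss(0.0, 3.0)                     # signal amplitude
    nx, ny = random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)
    samples.append([a / math.sqrt(2) + nx, a / math.sqrt(2) + ny])
denoised, v = project_onto_signal_subspace(samples)
```

Because the signal energy dominates along one direction, the leading eigenvector recovers that direction, and discarding the orthogonal component removes most of the isotropic noise.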
https://en.wikipedia.org/wiki/History%20of%20molecular%20evolution
The history of molecular evolution starts in the early 20th century with "comparative biochemistry", but the field of molecular evolution came into its own in the 1960s and 1970s, following the rise of molecular biology. The advent of protein sequencing allowed molecular biologists to create phylogenies based on sequence comparison, and to use the differences between homologous sequences as a molecular clock to estimate the time since the last common ancestor. In the late 1960s, the neutral theory of molecular evolution provided a theoretical basis for the molecular clock, though both the clock and the neutral theory were controversial, since most evolutionary biologists held strongly to panselectionism, with natural selection as the only important cause of evolutionary change. After the 1970s, nucleic acid sequencing allowed molecular evolution to reach beyond proteins to highly conserved ribosomal RNA sequences, the foundation of a reconceptualization of the early history of life. Early history Before the rise of molecular biology in the 1950s and 1960s, a small number of biologists had explored the possibilities of using biochemical differences between species to study evolution. Alfred Sturtevant predicted the existence of chromosomal inversions in 1921 and, with Dobzhansky, constructed one of the first molecular phylogenies, on 17 strains of Drosophila pseudoobscura, from the accumulation of chromosomal inversions observed in the hybridization of polytene chromosomes. Ernest Baldwin worked extensively on comparative biochemistry beginning in the 1930s, and Marcel Florkin pioneered techniques for constructing phylogenies based on molecular and biochemical characters in the 1940s. However, it was not until the 1950s that biologists developed techniques for producing biochemical data for the quantitative study of molecular evolution. The first molecular systematics research was based on immunological assays and protein "fingerprinting" methods. Alan Boyden—building
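The molecular-clock idea can be made concrete with the Jukes–Cantor correction, one standard model for converting an observed fraction of differing sites into an estimated number of substitutions per site; the sequences and the substitution rate below are made up for illustration:

```python
import math

def jukes_cantor_distance(seq1, seq2):
    """Jukes-Cantor corrected substitutions per site between two aligned
    nucleotide sequences: d = -(3/4) ln(1 - 4p/3), where p is the
    observed proportion of differing sites."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

def divergence_time(d, rate):
    """Molecular-clock estimate of the time since the last common
    ancestor, given a substitution rate per site per unit time; the
    factor of two accounts for substitutions on both lineages."""
    return d / (2.0 * rate)

# Hypothetical aligned fragments differing at 2 of 20 sites (p = 0.1).
d = jukes_cantor_distance("ACGTACGTACGTACGTACGT",
                          "ACGTACGAACGTACGTACGA")
```

The correction inflates the raw difference p to account for multiple substitutions at the same site, which is why d is slightly larger than p.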
https://en.wikipedia.org/wiki/Netherlands%20Entomological%20Society
The Netherlands Entomological Society (, abbreviated NEV) was founded in 1845 for the purpose of improving and promoting entomology in the Netherlands. The society has more than 600 members. External links Official website: Nederlandse Entomologische Vereniging Entomological societies Scientific organisations based in the Netherlands 1845 establishments in the Netherlands Scientific organizations established in 1845
https://en.wikipedia.org/wiki/Panavision%20cameras
Panavision has been a manufacturer of cameras for the motion picture industry since the 1950s, beginning with anamorphic widescreen lenses. The lightweight Panaflex is credited with revolutionizing filmmaking. Other influential cameras include the Millennium XL and the digital video Genesis.

Panavision Silent Reflex
Panavision Silent Reflex (1967)
Panavision Super PSR (R-200 or Super R-200)

Panaflex
Panaflex (1972)
Panaflex-X (1974)

Panaflex Lightweight (1975)
The Panaflex Lightweight is a sync-sound 35 mm motion picture camera, stripped of all components not essential for work with "floating camera" systems such as the Steadicam. Contemporary cameras such as the Panavision Gold II can weigh as much as depending on configuration. The Panaflex Lightweight II (1993) is crystal-controlled in one-frame increments between four and 36 frames per second, and has a fixed focal-plane shutter. 200°, 180°, 172.8° or 144° shutters can be installed by Panavision prior to rental per the customer's order. This camera is still available through Panavision.

Panaflex Gold (1976)

Panaflex Gold II (1987)
The Panaflex Gold II is a sync-sound 35 mm motion picture camera. It is capable of crystal sync at 24 and 25 or 29.97 frame/s, and the non-sync speed is variable from 4–34 fps (frames per second) according to Panavision; the Gold II can safely run up to 40 fps, crystal controlled with a special board which can be fitted on request. It has a focal-plane shutter which can be adjusted from 50 to 200° while the camera is running, either by an external control unit or by manually turning a knob. Improvements over the Panavision Gold include a brighter viewfinder. While the movement remains essentially the same as the original Panaflex movement introduced in 1972, the Gold II's dual registration pins are "full-fitting" according to Panavision, implying a more precise grip on the film during exposure and thus greater sharpness. This camera is still available through Panavision.
https://en.wikipedia.org/wiki/Crying%20in%20the%20Rain
"Crying in the Rain" is a song composed by Carole King with lyrics by Howard Greenfield, originally recorded by American duo the Everly Brothers. The single peaked at number six on the US Billboard Hot 100 in 1962. The song was the only collaboration between songwriters Greenfield and King, both of whom worked for Aldon Music at the time of the song's composition. On a whim, two Aldon songwriting partnerships decided to switch partners for a day – Gerry Goffin (who normally worked with King) partnered with Greenfield's frequent writing partner, Jack Keller, leaving King and Greenfield to pair up for the day. Despite the commercial success of their collaboration, King and Greenfield never wrote another song together. Track listing Charts Tammy Wynette version In 1981, "Crying in the Rain" was notably covered by American country artist Tammy Wynette. It became a major hit after being released as a single that year. Wynette's version was produced by Chips Moman at the Moman Recording Studio in Las Vegas, Nevada. The recording session also included nine additional tracks that would appear on Wynette's 1981 studio album. The song was released as a single in July 1981. It reached number 18 on the Billboard Hot Country Singles chart that same year. "Crying in the Rain" became Wynette's third single to reach the country songs top 20 in the 1980s decade. The song was issued on Wynette's twenty-second studio album, You Brought Me Back (1981). Additionally, "Crying in the Rain" peaked at number 11 on the RPM Country Tracks chart in Canada around the same time. It was her highest-charting solo song on the RPM survey since 1979. Track listing 7-inch single A. "Crying in the Rain" – 3:12 B. "Bring My Baby Back to Me" – 3:25 Charts A-ha version In 1989, Norwegian band A-ha covered the song. It was the first single taken from their fourth studio album, East of the Sun, West of the Moon (1990). Following its success, A-ha became closer to the Everly Brothers, who had orig
https://en.wikipedia.org/wiki/Haar-like%20feature
Haar-like features are digital image features used in object recognition. They owe their name to their intuitive similarity with Haar wavelets and were used in the first real-time face detector. Historically, working with only image intensities (i.e., the RGB pixel values at each and every pixel of image) made the task of feature calculation computationally expensive. A publication by Papageorgiou et al. discussed working with an alternate feature set based on Haar wavelets instead of the usual image intensities. Paul Viola and Michael Jones adapted the idea of using Haar wavelets and developed the so-called Haar-like features. A Haar-like feature considers adjacent rectangular regions at a specific location in a detection window, sums up the pixel intensities in each region and calculates the difference between these sums. This difference is then used to categorize subsections of an image. For example, with a human face, it is a common observation that among all faces the region of the eyes is darker than the region of the cheeks. Therefore, a common Haar feature for face detection is a set of two adjacent rectangles that lie above the eye and the cheek region. The position of these rectangles is defined relative to a detection window that acts like a bounding box to the target object (the face in this case). In the detection phase of the Viola–Jones object detection framework, a window of the target size is moved over the input image, and for each subsection of the image the Haar-like feature is calculated. This difference is then compared to a learned threshold that separates non-objects from objects. Because such a Haar-like feature is only a weak learner or classifier (its detection quality is slightly better than random guessing) a large number of Haar-like features are necessary to describe an object with sufficient accuracy. In the Viola–Jones object detection framework, the Haar-like features are therefore organized in something called a classifier cascad
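The constant-time evaluation described above can be sketched with an integral image (summed-area table): each rectangle sum takes four table lookups regardless of rectangle size. This is an illustrative sketch, not the Viola–Jones implementation; the patch values and feature geometry below are assumptions chosen to mimic a dark eye band over a lighter cheek band.

```python
import numpy as np

def integral_image(img):
    # Summed-area table with a zero border: ii[r, c] = sum of img[:r, :c].
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, top, left, height, width):
    # Sum of any rectangle via four lookups into the integral image.
    return (ii[top + height, left + width] - ii[top, left + width]
            - ii[top + height, left] + ii[top, left])

def two_rect_haar(ii, top, left, height, width):
    # Vertical two-rectangle feature: upper-half sum minus lower-half sum.
    upper = rect_sum(ii, top, left, height // 2, width)
    lower = rect_sum(ii, top + height // 2, left, height // 2, width)
    return upper - lower

# Toy "face" patch: dark band (eyes) above a light band (cheeks).
patch = np.vstack([np.full((4, 8), 50), np.full((4, 8), 200)])
ii = integral_image(patch)
print(two_rect_haar(ii, 0, 0, 8, 8))  # -4800: strongly negative, dark over light
```

In a detector, this feature value would be compared against a learned threshold; many such weak responses are combined in the cascade the article goes on to describe.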
https://en.wikipedia.org/wiki/Cottrell%20equation
In electrochemistry, the Cottrell equation describes the change in electric current with respect to time in a controlled potential experiment, such as chronoamperometry. Specifically it describes the current response when the potential is a step function in time. It was derived by Frederick Gardner Cottrell in 1903. For a simple redox event, such as the ferrocene/ferrocenium couple, the current measured depends on the rate at which the analyte diffuses to the electrode. That is, the current is said to be "diffusion controlled". The Cottrell equation describes the case for an electrode that is planar but can also be derived for spherical, cylindrical, and rectangular geometries by using the corresponding Laplace operator and boundary conditions in conjunction with Fick's second law of diffusion:

i = n F A c_j0 √(D_j) / √(π t)

where
i = current, in units of A
n = number of electrons (to reduce/oxidize one molecule of analyte j, for example)
F = Faraday constant, 96485 C/mol
A = area of the (planar) electrode in cm2
c_j0 = initial concentration of the reducible analyte j in mol/cm3
D_j = diffusion coefficient for species j in cm2/s
t = time in s.

Deviations from linearity in the plot of i vs. t^(−1/2) sometimes indicate that the redox event is associated with other processes, such as association of a ligand, dissociation of a ligand, or a change in geometry. In practice, the Cottrell equation simplifies to

i = k t^(−1/2)

where k is the collection of constants for a given system (n, F, A, c_j0, D_j).

See also
Voltammetry
Electroanalytical methods
Limiting current
Anson equation
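The Cottrell current i(t) = n F A c0 √(D/(πt)) is straightforward to evaluate numerically. In this sketch the electrode area, concentration, and diffusion coefficient are hypothetical illustrative values (a 1-electron couple, 0.1 cm2 electrode, 1 mM analyte, D = 1e-5 cm2/s), not values from the article; the t^(−1/2) decay is the point being shown.

```python
import math

def cottrell_current(n, A_cm2, c0_mol_cm3, D_cm2_s, t_s, F=96485.0):
    # i(t) = n F A c0 sqrt(D / (pi t)), in the CGS-style units used above:
    # A in cm^2, c0 in mol/cm^3, D in cm^2/s, t in s -> i in amperes.
    return n * F * A_cm2 * c0_mol_cm3 * math.sqrt(D_cm2_s / (math.pi * t_s))

# Hypothetical parameters: the current falls off as 1/sqrt(t).
for t in (0.1, 1.0, 10.0):
    i = cottrell_current(n=1, A_cm2=0.1, c0_mol_cm3=1e-6, D_cm2_s=1e-5, t_s=t)
    print(f"t = {t:5.1f} s   i = {i * 1e6:6.1f} uA")
```

A plot of i against t^(−1/2) for such data is linear, which is exactly the diagnostic the article mentions: curvature in that plot signals coupled chemical processes.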
https://en.wikipedia.org/wiki/Cyclic%20executive
A cyclic executive is an alternative to a real-time operating system. It is a form of cooperative multitasking, in which there is only one task. The sole task is typically realized as an infinite loop in main(), e.g. in C. The basic scheme is to cycle through a repeating sequence of activities, at a set frequency (AKA time-triggered cyclic executive). For example, consider the example of an embedded system designed to monitor a temperature sensor and update an LCD display. The LCD may need to be written twenty times a second (i.e., every 50 ms). If the temperature sensor must be read every 100 ms for other reasons, we might construct a loop of the following appearance:

int main(void)
{
    while (1)
    {
        // This loop is designed to take 100 ms, meaning
        // all steps add up to 100 ms.
        // Since this is demo code and we don't know how long
        // tempRead or lcdWrite take to execute, we assume
        // they take zero time.
        // As a result, the delays are responsible for the task scheduling / timing.

        // Read temp once per cycle (every 100 ms)
        currTemp = tempRead();

        // Write to LCD twice per cycle (every 50 ms)
        lcdWrite(currTemp);
        delay(50);
        lcdWrite(currTemp);
        delay(50);

        // Now 100 ms (delay(50) + delay(50) + tempRead + lcdWrite + lcdWrite)
        // has passed so we repeat the cycle.
    }
}

The outer 100 ms cycle is called the major cycle. In this case, there is also an inner minor cycle of 50 ms. In this first example the outer versus inner cycles aren't obvious. We can use a counting mechanism to clarify the major and minor cycles.

int main(void)
{
    unsigned int i = 0;
    while (1)
    {
        // This loop is designed to take 50 ms.
        // Since this is demo code and we don't know how long
        // tempRead or lcdWrite take to execute, we assume
        // they take zero time.

        // Since we only want tempRead to execute every 100ms, we use
        // an if statement to check whether
https://en.wikipedia.org/wiki/Geosynchronous%20satellite
A geosynchronous satellite is a satellite in geosynchronous orbit, with an orbital period the same as the Earth's rotation period. Such a satellite returns to the same position in the sky after each sidereal day, and over the course of a day traces out a path in the sky that is typically some form of analemma. A special case of geosynchronous satellite is the geostationary satellite, which has a geostationary orbit – a circular geosynchronous orbit directly above the Earth's equator. Another type of geosynchronous orbit used by satellites is the Tundra elliptical orbit. Geostationary satellites have the unique property of remaining permanently fixed in exactly the same position in the sky as viewed from any fixed location on Earth, meaning that ground-based antennas do not need to track them but can remain fixed in one direction. Such satellites are often used for communication purposes; a geosynchronous network is a communication network based on communication with or through geosynchronous satellites. Definition The term geosynchronous refers to the satellite's orbital period being matched to the rotation of the Earth ("geo-"). Along with this orbital period requirement, to be geostationary as well, the satellite must be placed in an orbit that puts it in the vicinity of the equator. These two requirements make the satellite appear in an unchanging area of visibility when viewed from the Earth's surface, enabling continuous operation from one point on the ground. The special case of a geostationary orbit is the most common type of orbit for communications satellites. If a geosynchronous satellite's orbit is not exactly aligned with the Earth's equator, the orbit is known as an inclined orbit. It will appear (when viewed by someone on the ground) to oscillate daily around a fixed point. As the angle between the orbit and the equator decreases, the magnitude of this oscillation becomes smaller; when the orbit lies entirely over the equator
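The orbital radius implied by matching one sidereal day follows directly from Kepler's third law, r = (GM T²/(4π²))^(1/3). This back-of-the-envelope sketch uses standard published constants for Earth's gravitational parameter and equatorial radius:

```python
import math

# Kepler's third law solved for the orbital radius of a circular orbit:
#   r = (GM * T^2 / (4 * pi^2))^(1/3), with T equal to one sidereal day.
GM_EARTH = 3.986004418e14   # m^3/s^2, Earth's standard gravitational parameter
T_SIDEREAL = 86164.0905     # s, one sidereal day (not the 86400 s solar day)
R_EQUATOR = 6378.137e3      # m, Earth's equatorial radius

r = (GM_EARTH * T_SIDEREAL ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
print(f"orbital radius:          {r / 1e3:.0f} km")               # ~42164 km
print(f"altitude above equator:  {(r - R_EQUATOR) / 1e3:.0f} km")  # ~35786 km
```

The ~35,786 km altitude is the familiar figure for the geostationary belt; any geosynchronous orbit shares this semi-major axis, while inclination and eccentricity determine the analemma traced in the sky.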
https://en.wikipedia.org/wiki/Octacube%20%28sculpture%29
The Octacube is a large, stainless steel sculpture displayed in the mathematics department of Pennsylvania State University in State College, PA. The sculpture represents a mathematical object called the 24-cell or "octacube". Because a real 24-cell is four-dimensional, the artwork is actually a projection into the three-dimensional world. Octacube has very high intrinsic symmetry, which matches features in chemistry (molecular symmetry) and physics (quantum field theory). The sculpture was designed by Adrian Ocneanu, a mathematics professor at Pennsylvania State University. The university's machine shop spent over a year completing the intricate metal-work. Octacube was funded by an alumna in memory of her husband, Kermit Anderson, who died in the September 11 attacks. Artwork The Octacube's metal skeleton measures about in all three dimensions. It is a complex arrangement of unpainted, tri-cornered flanges. The base is a high granite block, with some engraving. The artwork was designed by Adrian Ocneanu, a Penn State mathematics professor. He supplied the specifications for the sculpture's 96 triangular pieces of stainless steel and for their assembly. Fabrication was done by Penn State's machine shop, led by Jerry Anderson. The work took over a year, involving bending and welding as well as cutting. Discussing the construction, Ocneanu said: It's very hard to make 12 steel sheets meet perfectly—and conformally—at each of the 23 vertices, with no trace of welding left. The people who built it are really world-class experts and perfectionists—artists in steel. Because of the reflective metal at different angles, the appearance is pleasantly strange. In some cases, the mirror-like surfaces create an illusion of transparency by showing reflections from unexpected sides of the structure. The sculpture's mathematician creator commented: When I saw the actual sculpture, I had quite a shock. I never imagined the play of light on the surfaces. There are subtle optical effects t
https://en.wikipedia.org/wiki/Ciaccio%27s%20glands
Ciaccio's glands or Wolfring's glands are small tubular accessory lacrimal glands (glandulae lacrimales accessoriae) found in the lacrimal caruncle of the eyelid. These accessory lacrimal glands are located in the upper border of the tarsus, approximately in the middle between the extremities of the tarsal glands. Sometimes they are situated slightly above the tarsus. There are usually 2 to 5 of these glands in the upper eyelid, and their function is to produce tears which are secreted onto the surface of the conjunctiva. They are named after Italian anatomist Giuseppe Vincenzo Ciaccio (1824–1901), who described these glands in 1874. They are sometimes called "Wolfring's glands" after Polish ophthalmologist Emilj von Wolfring (1832-1906), who described them during the same time period as did Ciaccio. Another type of accessory lacrimal gland are "Krause's glands", which are smaller, more numerous than "Ciaccio's glands" and are found along the superior and inferior fornices of the conjunctival sac.
https://en.wikipedia.org/wiki/Net%20interest%20spread
Net interest spread refers to the difference in borrowing and lending rates of financial institutions (such as banks) in nominal terms. It is considered analogous to the gross margin of non-financial companies. Net interest spread is expressed as interest yield on earning assets (any asset, such as a loan, that generates interest income) minus interest rates paid on borrowed funds. Net interest spread is similar to net interest margin; net interest spread expresses the nominal average difference between borrowing and lending rates, without compensating for the fact that the amount of earning assets and borrowed funds may be different. Example For example, a bank has average loans to customers of $100, and earns gross interest income of $6. The interest yield is 6/100 = 6%. A bank takes deposits from customers and pays 1% to those customers. The bank lends its customers money at 6%. The bank's net interest spread is 5%.
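The worked example can be captured in a small function. The $100 deposit base below is an assumption for illustration (the article gives only the 1% rate paid on deposits, not the deposit volume):

```python
def net_interest_spread(interest_income, earning_assets,
                        interest_expense, borrowed_funds):
    # Yield on earning assets minus the rate paid on borrowed funds.
    # Note the two bases may differ, which is why this is only "spread",
    # not net interest margin.
    return interest_income / earning_assets - interest_expense / borrowed_funds

# Article's example: $100 of loans earning $6 gross interest (6% yield),
# deposits paying 1% (assumed here to be $100 of deposits, i.e. $1 of expense).
spread = net_interest_spread(6.0, 100.0, 1.0, 100.0)
print(f"{spread:.0%}")  # 5%
```

Because the two ratios are computed on separate bases, the spread stays 5% even if the bank holds more deposits than loans, which is precisely where it diverges from net interest margin.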
https://en.wikipedia.org/wiki/American%20Society%20for%20Clinical%20Pathology
The American Society for Clinical Pathology (ASCP) is a professional association based in Chicago, Illinois encompassing 130,000 pathologists and laboratory professionals. Founded in 1922, the ASCP provides programs in education, certification and advocacy on behalf of patients, pathologists and lab professionals. In addition, the ASCP publishes numerous textbooks, newsletters and other manuals, and publishes two industry journals: American Journal of Clinical Pathology and LabMedicine. The ASCP also promotes National Medical Laboratory Professionals Week (NMLPW) as a time of recognition for medical laboratory personnel and a chance to celebrate their professionalism and be recognized for their efforts. National Lab Week is held annually during the last full week of April.
https://en.wikipedia.org/wiki/George%20Henry%20Burgess
George Henry Burgess (8 June 1831 – 22 April 1905) was an English American painter, wood engraver and lithographer. In London, he received training in lithography. With two other brothers preceding them, in 1850 Burgess traveled to California in the company of his brother Charles. Once there, the Burgess brothers set up a jewelry and watch repair business in Sonora. Unsuccessful at mining, George spent time sketching the gold fields and mining activity. In 1856, he made the first of three trips to Hawaii, where he painted the royal family and made preparations for lithographic views of Honolulu. In San Francisco, his primary source of income was painting portraits, but he often revisited the Gold Rush theme. Burgess' most well-known work is the massive San Francisco in July, 1849, now located at the Oakland Museum of California. Life and work George Henry Burgess was born in London on 8 June 1831, one of four sons of a prominent London physician. Burgess studied "artistic lithography" at the Somerset House School of Design, and then apprenticed at a commercial lithography firm in London. George's eldest brother, Edward, had journeyed to San Francisco in 1847, eventually setting up a trading business between California and Hawaii. Another brother, Hubert, a professional artist, in 1850 set out for California from New York on news of the California Gold Rush. In the same year, George and his brother Charles, a portrait painter and photographer, traveled over the Great Plains in search of their fortune. After initial gold prospecting attempts, Hubert, Charles, and George Burgess opened a jewelry shop in Sonora. Operating out of a tent, they repaired watches and made gold chains for the miners. Due to local disturbances, the brothers disbanded their operations. Hubert and George continued to hunt together, supplying meat for restaurants and boardinghouses. They also engaged in sketching trips around the Mother Lode area. George's pictures are noteworthy for th
https://en.wikipedia.org/wiki/American%20Vacuum%20Society
AVS: Science and Technology of Materials, Interfaces, and Processing (formerly American Vacuum Society) is a not-for-profit learned society founded in 1953 focused on disciplines related to materials, interfaces, and processing. AVS has approximately 4500 members worldwide from academia, governmental laboratories and industry. AVS is a member society of the American Institute of Physics. AVS publishes through the American Institute of Physics the journals Journal of Vacuum Science and Technology (JVST A and B) and Biointerphases, which are devoted to peer-reviewed articles, and Surface Science Spectra (SSS), which publishes peer-reviewed articles with reference spectra of technological and scientific interest. In 2019 the American Institute of Physics and AVS jointly launched a new journal, AVS Quantum Science.

Organization
AVS is composed of 10 technical divisions, two technical groups, 16 regional chapters, two international chapters and one international affiliate:

Advanced Surface Engineering Division
Applied Surface Science Division
Biomaterial Interfaces Division
Electronic Materials/Photonics Division
Magnetic Interfaces and Nanostructures Division
Nanometer-Scale Science and Technology Division
Plasma Science and Technology Division
Surface Science Division
Thin Film Division (TF)
Vacuum Technology Division

AVS Technical Groups
Manufacturing Science & Technology Technical Group (MSTG)
MEMS and NEMS Technical Group

Conferences
The AVS International Symposium and Exhibition is AVS's flagship conference. The symposium addresses cutting-edge issues associated with materials, processing, and interfaces in the research and manufacturing communities. AVS also sponsors a variety of topical conferences, including the International Conference on Atomic Layer Deposition and the North American Conference on Molecular Beam Epitaxy.

AVS professional awards
Dorothy Hoffman Award
Medard W. Welch Award
John A. Thornton Memorial Award & Lecture
AVS graduate s
https://en.wikipedia.org/wiki/Grid-tie%20inverter
A grid-tie inverter converts direct current (DC) into an alternating current (AC) suitable for injecting into an electrical power grid, normally 120 V RMS at 60 Hz or 240 V RMS at 50 Hz. Grid-tie inverters are used between local electrical power generators (solar panels, wind turbines, hydro-electric systems) and the grid. To inject electrical power efficiently and safely into the grid, grid-tie inverters must accurately match the voltage, frequency and phase of the grid sine wave AC waveform. Payment for injected power Electricity companies, in some countries, pay for electrical power that is injected into the electricity utility grid. Payment is arranged in several ways. With net metering the electricity company pays for the net power injected into the grid, as recorded by a meter in the customer's premises. For example, a customer may consume 400 kilowatt-hours over a month and may return 500 kilowatt-hours to the grid in the same month. In this case the electricity company would pay for the 100 kilowatt-hour balance of power fed back into the grid. In the US, net metering policies vary by jurisdiction. Feed-in tariff, based on a contract with a distribution company or other power authority, is where the customer is paid for electrical power injected into the grid. In the United States, grid-interactive power systems are specified in the National Electric Code (NEC), which also mandates requirements for grid-interactive inverters. Operation Grid-tie inverters convert DC electrical power into AC power suitable for injecting into the electric utility company grid. The grid tie inverter (GTI) must match the phase of the grid and maintain the output voltage slightly higher than the grid voltage at any instant. A high-quality modern grid-tie inverter has a fixed unity power factor, which means its output voltage and current are perfectly lined up, and its phase angle is within 1° of the AC power grid. The inverter has an internal computer that senses the current AC grid
https://en.wikipedia.org/wiki/Krause%27s%20glands
Krause's glands or Krause glands are small, mucous accessory lacrimal glands that are found underneath the eyelid where the upper and lower conjunctivae meet. Their ducts unite into a rather long sinus which opens into the fornix conjunctiva. There are approximately forty Krause glands in the region of the upper eyelid, and around 6 to 8 in the region of the lower lid. The function of these glands is to produce tears which are secreted onto the surface of the conjunctiva. There are rare instances of tumors associated with Krause's glands. They usually occur as retention cysts in cicatricial conditions of the conjunctiva. Krause's glands are named after German anatomist Karl Friedrich Theodor Krause (1797–1868). See also Ciaccio's glands
https://en.wikipedia.org/wiki/Simon%27s%20problem
In computational complexity theory and quantum computing, Simon's problem is a computational problem that is proven to be solved exponentially faster on a quantum computer than on a classical (that is, traditional) computer. The quantum algorithm solving Simon's problem, usually called Simon's algorithm, served as the inspiration for Shor's algorithm. Both problems are special cases of the abelian hidden subgroup problem, which is now known to have efficient quantum algorithms. The problem is set in the model of decision tree complexity or query complexity and was conceived by Daniel R. Simon in 1994. Simon exhibited a quantum algorithm that solves Simon's problem with exponentially fewer queries than the best probabilistic (or deterministic) classical algorithm. In particular, Simon's algorithm uses a linear number of queries and any classical probabilistic algorithm must use an exponential number of queries. This problem yields an oracle separation between the complexity classes BPP (bounded-error classical query complexity) and BQP (bounded-error quantum query complexity). This is the same separation that the Bernstein–Vazirani algorithm achieves, and different from the separation provided by the Deutsch–Jozsa algorithm, which separates P and EQP. Unlike the Bernstein–Vazirani algorithm, Simon's algorithm's separation is exponential. Because this problem assumes the existence of a highly-structured "black box" oracle to achieve its speedup, this problem has little practical value. However, without such an oracle, exponential speedups cannot easily be proven, since this would prove that P is different from PSPACE. Problem description Given a function f : {0,1}^n → {0,1}^n (implemented by a black box or oracle) with the promise that, for some unknown s ∈ {0,1}^n, for all x, y ∈ {0,1}^n, f(x) = f(y) if and only if x ⊕ y ∈ {0^n, s}, where ⊕ denotes bitwise XOR. The goal is to identify s by making as few queries to f as possible. Note that a = b ⊕ s if and only if a ⊕ b = s. Furthermore, for some x and s in {0,1}^n, x ⊕ s is unique (not equa
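To make the promise concrete, one can build a toy oracle classically and recover the hidden string s by brute-force collision search. This is a sketch of the problem structure only, not Simon's quantum algorithm (which finds s with O(n) quantum queries rather than by enumerating inputs); the function names are hypothetical, and bitstrings are represented as Python integers with ^ as bitwise XOR.

```python
import random

def make_simon_oracle(n, s):
    # Build f: {0,1}^n -> {0,1}^n satisfying the promise
    # f(x) = f(y) iff x ^ y in {0, s}: assign one random output per coset {x, x ^ s}.
    table = {}
    outputs = random.sample(range(2 ** n), 2 ** n)  # distinct values, one per coset
    i = 0
    for x in range(2 ** n):
        if x not in table:
            table[x] = outputs[i]
            table[x ^ s] = outputs[i]  # the partner x ^ s shares the output
            i += 1
    return lambda x: table[x]

def find_s_classically(f, n):
    # Brute force: query every input, looking for a collision f(x) = f(x').
    # A colliding pair satisfies x ^ x' = s; no collision means s = 0.
    seen = {}
    for x in range(2 ** n):
        y = f(x)
        if y in seen:
            return seen[y] ^ x
        seen[y] = x
    return 0

n, s = 4, 0b1011
f = make_simon_oracle(n, s)
print(bin(find_s_classically(f, n)))  # 0b1011
```

The classical search above examines up to 2^(n-1) + 1 inputs before a collision is guaranteed, which is the exponential query cost the oracle separation rests on.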