https://en.wikipedia.org/wiki/RADAR%20%28audio%20recorder%29
RADAR (Random Access Digital Audio Recorder) is a product line of professional digital multitrack recorders capable of recording and playing back twenty-four tracks of audio. History The idea for RADAR was born out of Creation Studios, a high-end music recording studio built in North Vancouver by Barry and Jane Anne Henderson. Barry was also Music Products Division Manager at Anatek Microcircuits, a hybrid manufacturer in North Vancouver, BC. Barry and his team developed the Anatek line of MIDI and audio products, including the line of small, MIDI-signal-powered processing accessories called "Pocket Products", which were used worldwide by musicians such as Stevie Wonder. The most famous of the Pocket Products, Pocket Merge, sold close to 10,000 units in 1989, its launch year. In 1991, Barry used Creation Studios as a vehicle to fuel the development of RADAR, the world's first 24-track digital audio hard disk recorder. Creation showed RADAR at the October 1993 AES show in New York. Barry initially marketed RADAR under the Otari brand, and later under his own brand, iZ (iZ Technology Corporation). RADAR is now used worldwide by artists such as U2 and Neil Young, as well as in large commercial facilities such as Disney, CBC, IMAX, and NBC. The first model, the Creation RADAR I, was released in 1993. RADAR I was manufactured in North Vancouver, BC by Creation Technologies (iZ Technology Corporation), but marketed and distributed from 1994 by Otari under an exclusive license. Creation/iZ also produced the RADAR II. iZ Technology terminated Otari's RADAR distribution agreement in 2000. In October 2000, iZ Technology released RADAR 24 under the iZ brand, which was based on BeOS 5. In 2005, iZ released RADAR V. In 2012, iZ released RADAR 6 at the October 2012 AES show in San Francisco. In 2015, iZ released RADAR Studio, the world's first DAW/dedicated recorder in a single box. In 2017, Barry sold t
https://en.wikipedia.org/wiki/Polar%20wind
The polar wind or plasma fountain is a permanent outflow of plasma from the polar regions of Earth's magnetosphere, caused by the interaction between the solar wind and the Earth's atmosphere. The solar wind ionizes gas molecules in the upper atmosphere to such high energy that some of them reach escape velocity and pour into space. A considerable percentage of these ions remain bound inside Earth's magnetic field, where they form part of the radiation belts. The term was coined in 1968 in a pair of articles by Banks and Holzer and by Ian Axford. Since the process by which the ionospheric plasma flows away from the Earth along magnetic field lines is similar to the flow of solar plasma away from the sun's corona (the solar wind), Axford suggested the term "polar wind." The idea for the polar wind originated with the desire to solve the paradox of the terrestrial helium budget. This paradox consists of the fact that helium in the Earth's atmosphere seems to be produced (via radioactive decay of uranium and thorium) faster than it is lost by escaping from the upper atmosphere. The realization that some helium could be ionized, and therefore escape the earth along open magnetic field lines near the magnetic poles (the 'polar wind'), is one possible solution to the paradox. Further research came from the Retarding Ion Mass Spectrometer instrument on the Dynamics Explorer spacecraft, in the 1980s. Recently, the SCIFER sounding rocket was launched into the plasma heating region of the fountain.
https://en.wikipedia.org/wiki/Uniqueness%20quantification
In mathematics and logic, the term "uniqueness" refers to the property of being the one and only object satisfying a certain condition. This sort of quantification is known as uniqueness quantification or unique existential quantification, and is often denoted with the symbols "∃!" or "∃=1". For example, the formal statement ∃!n ∈ ℕ (n + 2 = 5) may be read as "there is exactly one natural number n such that n + 2 = 5". Proving uniqueness The most common technique to prove the unique existence of a certain object is to first prove the existence of the entity with the desired condition, and then to prove that any two such entities (say, a and b) must be equal to each other (i.e. a = b). For example, to show that the equation x + 2 = 5 has exactly one solution, one would first start by establishing that at least one solution exists, namely 3; the proof of this part is simply the verification that the equation below holds: 3 + 2 = 5. To establish the uniqueness of the solution, one would then proceed by assuming that there are two solutions, namely a and b, both satisfying x + 2 = 5. That is, a + 2 = 5 = b + 2. Then since equality is a transitive relation, a + 2 = b + 2. Subtracting 2 from both sides then yields a = b, which completes the proof that 3 is the unique solution of x + 2 = 5. In general, both existence (there exists at least one object) and uniqueness (there exists at most one object) must be proven, in order to conclude that there exists exactly one object satisfying a said condition. An alternative way to prove uniqueness is to prove that there exists an object a satisfying the condition, and then to prove that every object satisfying the condition must be equal to a. Reduction to ordinary existential and universal quantification Uniqueness quantification can be expressed in terms of the existential and universal quantifiers of predicate logic, by defining the formula ∃!x P(x) to mean ∃x (P(x) ∧ ∀y (P(y) → y = x)), which is logically equivalent to ∃x ∀y (P(y) ↔ y = x). An equivalent definition that separates the notions of existence and uniqueness into two clauses, at the expense of brevity, is ∃x P(x) ∧ ∀y ∀z ((P(y) ∧ P(z)) → y = z). Another equivalent defin
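The reduction of "there exists exactly one" to plain existential and universal quantifiers described above can be checked mechanically over a finite domain. The Python sketch below brute-forces that definition; the domain cap of 100 is an arbitrary stand-in for the natural numbers, not part of the source.

```python
# Brute-force check of uniqueness quantification over a finite domain.
# The domain (0..99) is an arbitrary stand-in for the natural numbers.

def exists_unique(domain, pred):
    """Evaluates the expansion of unique existence:
    there is some x with pred(x), and every y with pred(y) equals x."""
    return any(pred(x) and all(not pred(y) or y == x for y in domain)
               for x in domain)

naturals = range(100)

print(exists_unique(naturals, lambda x: x + 2 == 5))  # True: only x = 3
print(exists_unique(naturals, lambda x: x % 2 == 0))  # False: many witnesses
print(exists_unique(naturals, lambda x: x + 1 == 0))  # False: no witness
```

Note how the helper separates the existence clause (`any`) from the at-most-one clause (`all`), mirroring the two-clause definition.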
https://en.wikipedia.org/wiki/Hygiene%20hypothesis
In medicine, the hygiene hypothesis states that early childhood exposure to particular microorganisms (such as the gut flora and helminth parasites) protects against allergies by strengthening the immune system. In particular, a lack of such exposure is thought to lead to poor immune tolerance. The time period for exposure begins before birth and ends at school age. While early versions of the hypothesis referred to microorganism exposure in general, later versions apply to a specific set of microbes that have co-evolved with humans. The updates have been given various names, including the microbiome depletion hypothesis, the microflora hypothesis, and the "old friends" hypothesis. There is a significant amount of evidence supporting the idea that lack of exposure to these microbes is linked to allergies or other conditions, although it is still rejected by many scientists. The term "hygiene hypothesis" has been described as a misnomer because people incorrectly interpret it as referring to their own cleanliness. Having worse personal hygiene, such as not washing hands before eating, only increases the risk of infection without affecting the risk of allergies or immune disorders. Hygiene is essential for protecting vulnerable populations such as the elderly from infections, preventing the spread of antibiotic resistance, and combating emerging infectious diseases such as Ebola or COVID-19. The hygiene hypothesis does not suggest that having more infections during childhood would be an overall benefit. Overview The idea of a link between parasite infection and immune disorders was first suggested in 1968, before the advent of large-scale DNA sequencing techniques. The original formulation of the hygiene hypothesis dates from 1989, when David Strachan proposed that lower incidence of infection in early childhood could be an explanation for the rise in allergic diseases such as asthma and hay fever during the 20th century. The hygiene hypothesis has also been exp
https://en.wikipedia.org/wiki/Nordstr%C3%B6m%27s%20theory%20of%20gravitation
In theoretical physics, Nordström's theory of gravitation was a predecessor of general relativity. Strictly speaking, there were actually two distinct theories proposed by the Finnish theoretical physicist Gunnar Nordström, in 1912 and 1913 respectively. The first was quickly dismissed, but the second became the first known example of a metric theory of gravitation, in which the effects of gravitation are treated entirely in terms of the geometry of a curved spacetime. Neither of Nordström's theories is in agreement with observation and experiment. Nonetheless, the first remains of interest insofar as it led to the second. The second remains of interest both as an important milestone on the road to the current theory of gravitation, general relativity, and as a simple example of a self-consistent relativistic theory of gravitation. As an example, this theory is particularly useful in the context of pedagogical discussions of how to derive and test the predictions of a metric theory of gravitation. Development of the theories Nordström's theories arose at a time when several leading physicists, including Nordström in Helsinki, Max Abraham in Milan, Gustav Mie in Greifswald, Germany, and Albert Einstein in Prague, were all trying to create competing relativistic theories of gravitation. All of these researchers began by trying to suitably modify the existing theory, the field theory version of Newton's theory of gravitation. In this theory, the field equation is the Poisson equation ∇²φ = 4πGρ, where φ is the gravitational potential and ρ is the density of matter, augmented by an equation of motion for a test particle in an ambient gravitational field, which we can derive from Newton's force law and which states that the acceleration of the test particle is given by the gradient of the potential: a = −∇φ. This theory is not relativistic because the equation of motion refers to coordinate time rather than proper time, and because, should the matter in some isolated object sudden
https://en.wikipedia.org/wiki/Distress%20%28medicine%29
In medicine, distress is an aversive state in which a person is unable to completely adapt to difficult situations and their resulting effects and shows maladaptive behaviors. It can be evident in the presence of various phenomena, such as inappropriate social interaction (e.g., aggression, passivity, or withdrawal). Distress is the opposite of eustress, a positive emotion that motivates people. Risk factors Stress can be created by influences such as work, school, peers or co-workers, family and death. Other influences vary by age. People under constant distress are more likely to become sick, mentally or physically. There is a clear dose-response association between psychological distress and major causes of mortality across the full range of distress. Higher education has been linked to a reduction in psychological distress in both men and women, and these effects persist throughout the aging process, not just immediately after receiving education. However, this link does lessen with age. The major mechanism by which higher education reduces stress in men relates more to labor-market resources, whereas in women it relates more to social resources. In the clinic, distress is a patient-reported outcome that has a large impact on patients' quality of life. To assess patient distress, the Hospital Anxiety and Depression Scale (HADS) questionnaire is most commonly used. The score from the HADS questionnaire guides a clinician in recommending lifestyle modifications or further assessment for mental disorders such as depression. Management People often find ways of dealing with distress, in both negative and positive ways. Examples of positive ways are listening to music, calming exercises, coloring, sports and similar healthy distractions. Negative ways can include, but are not limited to, the use of drugs including alcohol, and expression of anger, which are likely to lead to complicated social interactions, thus causing increased distress. See also Intentional inflic
https://en.wikipedia.org/wiki/BioCryst%20Pharmaceuticals
BioCryst Pharmaceuticals, Inc. is an American pharmaceutical company headquartered in Durham, North Carolina. It is a late-stage biotech company that focuses on oral drugs for rare and serious diseases. BioCryst's antiviral drug peramivir (Rapivab) was approved by the FDA in December 2014. It has also been approved in Japan, Korea, and China. History The company was founded in 1986 by Charles E. Bugg and John A. Montgomery. In March 1994, BioCryst became a public company when it completed an initial public offering by listing its shares on the NASDAQ stock exchange. In 2008, the company was named one of the fastest growing companies by Deloitte & Touche in its 2008 list of Technology Fast 500. In October 2010, BioCryst announced its headquarters would move to Durham, North Carolina, where the company has had an office since 2006. In January 2018, BioCryst signed a definitive merger agreement with Idera Pharmaceuticals, with plans for the combined company to change its name and move to Pennsylvania. However, BioCryst shareholders voted down the merger in July. Pipeline BioCryst's core development programs include: Berotralstat (BCX7353), an oral inhibitor of plasma kallikrein for prevention and treatment of hereditary angioedema (HAE). The FDA approved an Expanded Access Program for berotralstat for eligible patients in the United States, and on its PDUFA date of December 3, 2020, the FDA approved berotralstat as the first oral prophylaxis for hereditary angioedema. BCX9930, an oral Factor D inhibitor for the treatment of complement-mediated diseases; the FDA has granted Fast Track designation to BCX9930 for the treatment of paroxysmal nocturnal hemoglobinuria (PNH). BCX9250, an oral ALK-2 inhibitor for treatment of fibrodysplasia ossificans progressiva (FOP). Galidesivir (BCX4430), a broad-spectrum antiviral in advanced development for the treatment of viruses including SARS-CoV-2 (the cause of COVID-19), Ebola,
https://en.wikipedia.org/wiki/Seasonal%20affective%20disorder
Seasonal affective disorder (SAD) is a mood disorder subset in which people who typically have normal mental health throughout most of the year exhibit depressive symptoms at the same time each year. It is commonly, but not always, associated with the reductions or increases in total daily sunlight hours that occur during the summer or winter. Common symptoms include sleeping too much, having little to no energy, and overeating. The condition in the summer can include heightened anxiety. In the DSM-IV and DSM-5, its status as a standalone condition was changed: it is no longer classified as a unique mood disorder, but is now a specifier (called "with seasonal pattern") for recurrent major depressive disorder that occurs at a specific time of the year and fully remits otherwise. Although experts were initially skeptical, this condition is now recognized as a common disorder. The validity of SAD was called into question, however, by a 2016 analysis by the Centers for Disease Control and Prevention in which no links were detected between depression and seasonality or sunlight exposure. In the United States, the percentage of the population affected by SAD ranges from 1.4% in Florida to 9.9% in Alaska. SAD was formally described and named in 1984, by Norman E. Rosenthal and colleagues at the National Institute of Mental Health. History SAD was first systematically reported and named in the early 1980s, by Norman E. Rosenthal, M.D., and his associates at the National Institute of Mental Health (NIMH). Rosenthal was initially motivated by his desire to discover the cause of his own experience of depression during the dark days of the northern US winter. He theorized that the reduction in available natural light during winter was the cause. Rosenthal and his colleagues then documented the phenomenon of SAD in a placebo-controlled study utilizing light therapy. A paper based on this research was published in 1984. Although Rosenthal's ideas were in
https://en.wikipedia.org/wiki/Van%20H.%20Vu
Van H. Vu (Vietnamese: Vũ Hà Văn) is a Vietnamese mathematician, Percey F. Smith Professor of Mathematics at Yale University. Education and career Vu was born in Hanoi, Vietnam, in 1970. He attended special math classes for gifted children at the Chu Van An and Hanoi-Amsterdam high schools. In 1987, he went to Hungary for his undergraduate studies, and in 1994 obtained his M.Sc. in mathematics at the Faculty of Sciences of Eötvös University, Budapest. His thesis supervisor was Tamás Szőnyi. He received his Ph.D. at Yale University in 1998 under the direction of László Lovász. He worked as a postdoc at the IAS and Microsoft Research (1998-2001). He joined the University of California, San Diego as an assistant professor in 2001 and was promoted to full professor in 2005. In Fall 2005, he moved to Rutgers University and stayed there until he joined Yale in Fall 2011. Vu was a member of the IAS on three occasions (1998, 2005, 2007), the last time, in 2007, as the leader of the special program in Arithmetic Combinatorics. Contributions In his PhD thesis, Vu, together with Kim, developed a theory for concentration of measure of polynomials (and non-Lipschitz functions in general). Later, as an application, he established a refinement of Waring's problem. In 2003, Vu and Szemerédi solved the Erdős-Folkman problem, answering the following question: how dense must a set of positive integers be so that every sufficiently large integer can be represented as a subsum? In 2006, Tao and Vu published their book "Additive Combinatorics". Together, they developed the inverse Littlewood-Offord theory for anti-concentration. In 2007, with Johansson and Kahn, Vu solved the Shamir conjecture in random graph theory. Among other results, they established the sharp threshold for the existence of a perfect matching in a random hypergraph. In 2010, Terence Tao and Vu solved the circular law conjecture in random matrix theory, establishing the non-Hermitian analogue of the Wigner semicircle law. In 2011, they
https://en.wikipedia.org/wiki/Population%20dynamics
Population dynamics is the branch of mathematics used to model and study the size and age composition of populations as dynamical systems. History Population dynamics has traditionally been the dominant branch of mathematical biology, which has a history of more than 220 years, although over the last century the scope of mathematical biology has greatly expanded. The beginning of population dynamics is widely regarded as the work of Malthus, formulated as the Malthusian growth model. According to Malthus, assuming that the conditions (the environment) remain constant (ceteris paribus), a population will grow (or decline) exponentially. This principle provided the basis for subsequent predictive theories, such as the demographic studies of Benjamin Gompertz and Pierre François Verhulst in the early 19th century, who refined and adjusted the Malthusian demographic model. A more general model formulation was proposed by F. J. Richards in 1959, and further expanded by Simon Hopkins, in which the models of Gompertz, Verhulst and also Ludwig von Bertalanffy are covered as special cases of the general formulation. The Lotka–Volterra predator-prey equations are another famous example, as are the alternative Arditi–Ginzburg equations. Logistic function Simplified population models usually start with four key variables (four demographic processes): death, birth, immigration, and emigration. Mathematical models used to calculate changes in population demographics and evolution hold the assumption of no external influence. Models can be more mathematically complex where "...several competing hypotheses are simultaneously confronted with the data." For example, in a closed system, where immigration and emigration do not take place, the rate of change in the number of individuals in a population can be described as dN/dt = B - D, where N is the total number of individuals in the specific experimental population being studied, B is the number of births and D is
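The closed-system balance of births and deaths just described, together with the Malthusian (exponential) model and Verhulst's logistic refinement, can be sketched numerically. The growth rate, carrying capacity, and step sizes below are arbitrary illustrative values, not taken from the source.

```python
# Euler-step sketch comparing Malthusian (exponential) growth with the
# logistic refinement of Verhulst. Parameter values are illustrative.

def simulate(rate_fn, n0, dt, steps):
    """Integrate dN/dt = rate_fn(N) with forward Euler steps."""
    n = n0
    history = [n]
    for _ in range(steps):
        n += rate_fn(n) * dt
        history.append(n)
    return history

r = 0.1   # assumed per-capita net growth rate (births minus deaths)
K = 1000  # assumed carrying capacity for the logistic model

malthus = simulate(lambda n: r * n, n0=10, dt=1.0, steps=100)
logistic = simulate(lambda n: r * n * (1 - n / K), n0=10, dt=1.0, steps=100)

# Exponential growth is unbounded; logistic growth saturates near K.
print(round(malthus[-1]), round(logistic[-1]))
```

The contrast shows why Verhulst's adjustment matters: under constant per-capita rates the Malthusian population grows without bound, while the logistic population levels off at the carrying capacity.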
https://en.wikipedia.org/wiki/Pathogen
In biology, a pathogen (Greek: πάθος pathos, "suffering, passion", and -γενής -genēs, "producer of"), in the oldest and broadest sense, is any organism or agent that can produce disease. A pathogen may also be referred to as an infectious agent, or simply a germ. The term pathogen came into use in the 1880s. Typically, the term pathogen is used to describe an infectious microorganism or agent, such as a virus, bacterium, protozoan, prion, viroid, or fungus. Small animals, such as helminths and insects, can also cause or transmit disease. However, these animals are usually referred to as parasites rather than pathogens. The scientific study of microscopic organisms, including microscopic pathogenic organisms, is called microbiology, while parasitology refers to the scientific study of parasites and the organisms that host them. There are several pathways through which pathogens can invade a host. The principal pathways have different episodic time frames, but soil has the longest or most persistent potential for harboring a pathogen. Diseases in humans that are caused by infectious agents are known as pathogenic diseases. Not all diseases are caused by pathogens; other causes include exposure to pollutants (such as black lung from coal dust), genetic disorders (such as sickle cell disease), and autoimmune diseases (such as lupus). Pathogenicity Pathogenicity is the potential disease-causing capacity of pathogens, involving a combination of infectivity (the pathogen's ability to infect hosts) and virulence (the severity of host disease). Koch's postulates are used to establish causal relationships between microbial pathogens and diseases. Whereas meningitis can be caused by a variety of bacterial, viral, fungal, and parasitic pathogens, cholera is only caused by some strains of Vibrio cholerae. Additionally, some pathogens may only cause disease in hosts with an immunodeficiency. These opportunistic infections often involve hospital-acquired infections among patients already combating another condition. Infectivity involves path
https://en.wikipedia.org/wiki/Manchester%20Mark%201
The Manchester Mark 1 was one of the earliest stored-program computers, developed at the Victoria University of Manchester, England from the Manchester Baby (operational in June 1948). Work began in August 1948, and the first version was operational by April 1949; a program written to search for Mersenne primes ran error-free for nine hours on the night of 16/17 June 1949. The machine's successful operation was widely reported in the British press, which used the phrase "electronic brain" in describing it to their readers. That description provoked a reaction from the head of the University of Manchester's Department of Neurosurgery, the start of a long-running debate as to whether an electronic computer could ever be truly creative. The Mark 1 was to provide a computing resource within the university, to allow researchers to gain experience in the practical use of computers, but it very quickly also became a prototype on which the design of Ferranti's commercial version could be based. Development ceased at the end of 1949, and the machine was scrapped towards the end of 1950, replaced in February 1951 by a Ferranti Mark 1, the world's first commercially available general-purpose electronic computer. The computer is especially historically significant because of its pioneering inclusion of index registers, an innovation which made it easier for a program to read sequentially through an array of words in memory. Thirty-four patents resulted from the machine's development, and many of the ideas behind its design were incorporated in subsequent commercial products such as the IBM 701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in scientific roles than in pure mathematics. In 1951, they started development work on Meg, the Mark 1's successor, which would include a floating point unit. It was also called the Manchester Automatic Digital
https://en.wikipedia.org/wiki/Pseudoelasticity
Pseudoelasticity, sometimes called superelasticity, is an elastic (reversible) response to an applied stress, caused by a phase transformation between the austenitic and martensitic phases of a crystal. It is exhibited in shape-memory alloys. Overview Pseudoelasticity arises from the reversible motion of domain boundaries during the phase transformation, rather than just bond stretching or the introduction of defects in the crystal lattice (thus it is not true superelasticity but rather pseudoelasticity). Even if the domain boundaries do become pinned, they may be reversed through heating. Thus, a pseudoelastic material may return to its previous shape (hence, shape memory) after the removal of even relatively high applied strains. One special case of pseudoelasticity is called the Bain Correspondence. This involves the austenite/martensite phase transformation between a face-centered crystal lattice (FCC) and a body-centered tetragonal crystal structure (BCT). Superelastic alloys belong to the larger family of shape-memory alloys. When mechanically loaded, a superelastic alloy deforms reversibly to very high strains (up to 10%) by the creation of a stress-induced phase. When the load is removed, the new phase becomes unstable and the material regains its original shape. Unlike shape-memory alloys, no change in temperature is needed for the alloy to recover its initial shape. Superelastic devices take advantage of their large, reversible deformation and include antennas, eyeglass frames, and biomedical stents. Nickel titanium (Nitinol) is an example of an alloy exhibiting superelasticity. Size effects Recently, there has been interest in discovering materials exhibiting superelasticity at the nanoscale for MEMS (microelectromechanical systems) applications. The ability to control the martensitic phase transformation has already been reported. However, superelastic behavior has been observed to show size effects at the nanoscale. Qualitatively speaking, supere
https://en.wikipedia.org/wiki/BID%20610
BID 610 or Alvis was a British cipher machine used by both British and Canadian governments. It was the first fully transistorised full-duplex online cipher machine used by the British Army. It was introduced in the 1960s.
https://en.wikipedia.org/wiki/Artificial%20enzyme
An artificial enzyme is a synthetic organic molecule or ion that recreates one or more functions of an enzyme. It seeks to deliver catalysis at rates and selectivity observed in naturally occurring enzymes. History Enzyme catalysis of chemical reactions occurs with high selectivity and rate. The substrate is activated in a small part of the enzyme's macromolecule called the active site. There, the binding of a substrate close to functional groups in the enzyme causes catalysis by so-called proximity effects. It is possible to create similar catalysts from small molecules by combining substrate-binding with catalytic functional groups. Classically, artificial enzymes bind substrates using receptors such as cyclodextrin, crown ethers, and calixarene. Artificial enzymes based on amino acids or peptides have expanded the field of artificial enzymes or enzyme mimics. For instance, scaffolded histidine residues mimic certain metalloproteins and enzymes such as hemocyanin, tyrosinase, and catechol oxidase. Artificial enzymes have been designed from scratch via a computational strategy using Rosetta. A December 2014 publication reported active enzymes made from molecules that do not occur in nature. In 2016, a book chapter entitled "Artificial Enzymes: The Next Wave" was published. Nanozymes Nanozymes are nanomaterials with enzyme-like characteristics. They have been explored for applications such as biosensing, bioimaging, tumor diagnosis and therapy, and anti-biofouling. 1990s In 1996 and 1997, Dugan et al. discovered superoxide dismutase (SOD)-mimicking activities of fullerene derivatives. 2000s The term "nanozyme" was coined in 2004 by Flavio Manea, Florence Bodar Houillon, Lucia Pasquato, and Paolo Scrimin. A 2005 review article attributed this term to "analogy with the activity of catalytic polymers (synzymes)", based on the "outstanding catalytic efficiency of some of the functional nanoparticles synthesized". In 2006, nanoceria (CeO2 nanoparticles) was repo
https://en.wikipedia.org/wiki/Triclopyr
Triclopyr (3,5,6-trichloro-2-pyridinyloxyacetic acid) is an organic compound in the pyridine group that is used as a systemic foliar herbicide and fungicide. Uses Triclopyr is used to control broadleaf weeds while leaving grasses and conifers unaffected, or to control rust diseases on crops. Triclopyr is effective on woody plants and is used for brush control on rights-of-way and defoliation of wooded areas. In the USA, it is sold under the trade names Garlon, Remedy, and many others, and in the UK as SBK Brushwood Killer. It is also used for broadleaf weeds, particularly creeping charlie (Glechoma hederacea). It is sold under the trade names Turflon, Weed-B-Gon (purple label), and Brush-B-Gon ("Poison Ivy Killer") for these purposes. It is a major ingredient in Confront, which was withdrawn from most uses due to concerns about compost contamination from its other major ingredient, clopyralid. Environmental effects Triclopyr breaks down in soil with a half-life between 30 and 90 days. It degrades rapidly in water, and remains active in decaying vegetation for about 3 months. The compound is slightly toxic to ducks (LD50 = 1698 mg/kg) and quail (LD50 = 3000 mg/kg). It has been found nontoxic to bees and very slightly toxic to fish (rainbow trout LC50 (96 hr) = 117 ppm). Garlon's fact sheet for its triclopyr ester product indicates that the ester formulation is highly toxic to fish, aquatic plants, and aquatic invertebrates, and should never be used in waterways, wetlands, or other sensitive habitats; this applies only to the triclopyr ester product, not to the triclopyr amine product.
https://en.wikipedia.org/wiki/NIROSETI
NIROSETI (Near-InfraRed Optical Search for Extraterrestrial Intelligence) is an astronomical program to search for artificial signals in the optical (visible) and near-infrared (NIR) wavebands of the electromagnetic spectrum. It is the first dedicated near-infrared SETI experiment. The instrument was created by a collaboration of scientists from the University of California, San Diego, the Berkeley SETI Research Center at the University of California, Berkeley, the University of Toronto, and the SETI Institute. It uses the Anna Nickel 1-m telescope at the Lick Observatory, situated on the summit of Mount Hamilton, east of San Jose, California, USA. The instrument was commissioned (saw its first light) on 15 March 2015, has been operated for more than 150 nights, and is still operational today. Overview The NIROSETI project is based on the assumption that hypothetical communicative extraterrestrials may send out pulsed laser signals in the optical as well as the infrared spectrum. The near infrared offers a possible window for signal transmission, since both interstellar extinction and the Galactic background are lower there than at optical wavelengths. The near-infrared bands remain largely unexplored because instruments capable of capturing short pulses of infrared light have only recently become available. The NIROSETI instrument makes use of the 1-meter optical Nickel telescope located at the Lick Observatory in California to search for near-infrared (laser) transmissions from extraterrestrial communication or technosignatures. The project was funded by the Bill and Susan Bloomfield Foundation and builds on a predecessor, the Lick Optical SETI instrument, operated between 2001 and 2006. Professor Shelley Wright leads the team that built and operates the NIROSETI program. The NIROSETI instrument employs a new generation of near-infrared (900 to 1700 nm) detectors, cooled to −25 °C, that have a high-speed response (>1 GHz) and gain comparable to photomultipl
https://en.wikipedia.org/wiki/Trimethylamine
Trimethylamine (TMA) is an organic compound with the formula N(CH3)3. It is a trimethylated derivative of ammonia. TMA is widely used in industry: it is used in the synthesis of choline, tetramethylammonium hydroxide, plant growth regulators and herbicides, strongly basic anion exchange resins, dye leveling agents, and a number of basic dyes. At higher concentrations it has an ammonia-like odor and can cause necrosis of mucous membranes on contact. At lower concentrations it has the "fishy" odor associated with rotting fish.

Properties

TMA is a colorless, hygroscopic, and flammable tertiary amine. It is a gas at room temperature but is usually sold as a 40% solution in water. It is also sold in pressurized gas cylinders. TMA is a nitrogenous base and can be readily protonated to give the trimethylammonium cation. Trimethylammonium chloride is a hygroscopic colorless solid prepared from trimethylamine and hydrochloric acid.

Reactivity

Trimethylamine is a good nucleophile, and this reactivity is the basis of most of its applications. Trimethylamine is a Lewis base that forms adducts with a variety of Lewis acids.

Production

Trimethylamine is prepared by the reaction of ammonia and methanol employing a catalyst:

3 CH3OH + NH3 → (CH3)3N + 3 H2O

This reaction coproduces the other methylamines, dimethylamine ((CH3)2NH) and methylamine (CH3NH2). Trimethylamine has also been prepared by a reaction of ammonium chloride and paraformaldehyde:

9 (CH2=O)n + 2n NH4Cl → 2n (CH3)3N•HCl + 3n H2O + 3n CO2↑

Applications

Trimethylamine is used in the synthesis of choline, tetramethylammonium hydroxide, plant growth regulators, herbicides, strongly basic anion exchange resins, dye leveling agents and a number of basic dyes. Gas sensors to test for fish freshness detect trimethylamine.

Importance in the history of psychoanalysis

The first dream of his own which Sigmund Freud tried to analyse in detail, when he was developing his theories about the interpretation of dreams, involved a patient
https://en.wikipedia.org/wiki/Mass%20distribution
In physics and mechanics, mass distribution is the spatial distribution of mass within a solid body. In principle, it is relevant also for gases or liquids, but on Earth their mass distribution is almost homogeneous.

Astronomy

In astronomy, mass distribution has a decisive influence on the development of, e.g., nebulae, stars and planets. The mass distribution of a solid defines its center of gravity and influences its dynamical behaviour, e.g. its oscillations and any rotation.

Mathematical modelling

A mass distribution can be modeled as a measure. This allows point masses, line masses, surface masses, as well as masses given by a volume density function. Alternatively the latter can be generalized to a distribution. For example, a point mass is represented by a delta function defined in 3-dimensional space. A mass concentrated on a surface may be represented by a surface density σ, where σ is the mass per unit area. The mathematical modelling can be done by potential theory, by numerical methods (e.g. a great number of mass points), or by theoretical equilibrium figures.

Geology

In geology, the aspects of rock density are involved.

Rotating solids

Rotating solids are affected considerably by the mass distribution, whether they are homogeneous or inhomogeneous; see torque, moment of inertia, wobble, imbalance and stability.

See also
Bouguer plate
Gravity
Mass function
Mass concentration (astronomy)

External links
Mass distribution of the Earth
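The measure-theoretic modelling described above can be made concrete numerically. The sketch below (an illustration of the standard definitions, not code from any source) computes total mass and center of mass of a volume density on a grid, approximating M = ∫ρ dV and R = (1/M)∫r ρ dV by Riemann sums:

```python
def mass_properties(density, cell_volume):
    """Total mass and center of mass of a sampled density field.

    density: dict mapping (x, y, z) grid coordinates to a density value,
    cell_volume: volume of one grid cell.  Approximates
    M = integral of rho dV and R = (1/M) integral of r * rho dV.
    """
    total_mass = 0.0
    moment = [0.0, 0.0, 0.0]
    for (x, y, z), rho in density.items():
        dm = rho * cell_volume          # mass element dm = rho dV
        total_mass += dm
        moment[0] += x * dm
        moment[1] += y * dm
        moment[2] += z * dm
    center = tuple(m / total_mass for m in moment)
    return total_mass, center
```

A point mass appears as the degenerate case of all mass in a single cell, mirroring the delta-function description in the text.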
https://en.wikipedia.org/wiki/DECUS
The Digital Equipment Computer Users' Society (DECUS) was an independent computer user group related to Digital Equipment Corporation (DEC). The Connect User Group Community, formed from the consolidation in May 2008 of DECUS, Encompass, HP-Interex, and ITUG, is Hewlett-Packard's largest user community, representing more than 50,000 participants.

History

DECUS was the Digital Equipment Computer Users' Society, a users' group for Digital Equipment Corporation (DEC) computers. Members included companies and organizations who purchased DEC equipment; many members were application programmers who wrote code for DEC machines or system programmers who managed DEC systems. DECUS was founded in March 1961 by Edward Fredkin. DECUS was legally a part of Digital Equipment Corporation and subsidized by the company; however, it was run by unpaid volunteers. Digital staff members were not eligible to join DECUS, yet were allowed and encouraged to participate in DECUS activities. Digital, in turn, relied on DECUS as an important channel of communication with its customers.

DECUS Software Library

DECUS had a software library which accepted orders from anyone, distributing programs submitted to it by people willing to share. It was organized by processor and operating system, using information submitted by program submitters, who signed releases allowing this and asserting their right to do so. The DECUS library published catalogs of these offerings yearly, though because it had the catalog mastered by an outside firm, it did not have easy ways to retrieve the content of early catalogs (prior to circa 1980) in machine-readable format. Later material was maintained in house and was more easily edited. The charges for copying were somewhat high, reflecting the fact that the copies were made by hand on DECUS equipment.

Activities

There were two DECUS US symposia per year, at which members and DEC employees gave presentations, and could visit an exhibit hall containing many new comp
https://en.wikipedia.org/wiki/Proceptive%20phase
In biology and sexology, the proceptive phase is the initial period in a relationship when organisms are "courting" each other, prior to the acceptive phase when copulation occurs. Behaviors that occur during the proceptive phase depend very much on the species, but may include visual displays, movements, sounds and odors. The term proceptivity was introduced into general sexological use by Frank A. Beach in 1976 and refers to behavior enacted by a female to initiate, maintain, or escalate a sexual interaction. There are large species differences in proceptive behavior. The term has also been used to describe women's roles in human courtship, with a meaning very close to Beach's. A near synonym is proception. The term proceptive phase refers to pre-consummatory, that is, pre-ejaculatory, behavior and focuses attention on the active role played by the female organism in creating, maintaining, and escalating the sexual interaction. See also Mating
https://en.wikipedia.org/wiki/Society%20of%20Broadcast%20Engineers
The Society of Broadcast Engineers (SBE) is a professional organization for engineers in broadcast radio and television. The SBE also offers certification in various radio frequency, video, and audio technology areas for its members.

Background

The organization was founded in 1964. The society elected its first female president, Andrea Cummis, in 2021.

Certifications

Operator Level Certifications
- Certified Radio Operator (CRO)
- Certified Television Operator (CTO)

Broadcast Networking Certifications
- Certified Broadcast Networking Technologist (CBNT)
- Certified Broadcast Networking Engineer (CBNE)

Engineer Level Certifications
- Certified Broadcast Technologist (CBT)
- Certified Audio Engineer (CEA)
- Certified Video Engineer (CEV)
- Certified Broadcast Radio Engineer (CBRE)
- Certified Broadcast Television Engineer (CBTE)
- Certified Senior Broadcast Radio Engineer (CSRE)
- Certified Senior Broadcast Television Engineer (CSTE)
- Certified Professional Broadcast Engineer (CPBE)

Specialist Certifications
- Certified 8-VSB Specialist (8-VSB)
- Certified AM Directional Specialist (AMD)
- Certified Digital Radio Broadcast Specialist (DRB)

Previous Certifications

These certifications are still in use but are no longer issued.
- Certified Senior Broadcast Engineer (CSBE)
- Certified Radio and Television Broadcast Engineer (CBRTE)
- Certified Senior Radio and Television Broadcast Engineer (CSRTE)

See also
- List of post-nominal letters
- Broadcast engineering
https://en.wikipedia.org/wiki/Maternal%20impression
The conception of a maternal impression rests on the belief that a powerful mental (or sometimes physical) influence working on the mother's mind may produce an impression, either general or definite, on the child she is carrying. The child might be said to be "marked" as a result.

Medicine

Maternal impression, according to a long-discredited medical theory, was a phenomenon that explained the existence of birth defects and congenital disorders. The theory stated that an emotional stimulus experienced by a pregnant woman could influence the development of the fetus. For example, it was sometimes supposed that the mother of the Elephant Man was frightened by an elephant during her pregnancy, thus "imprinting" the memory of the elephant onto the gestating fetus. Mental problems, such as schizophrenia and depression, were believed to be a manifestation of similar disordered feelings in the mother. For instance, a pregnant woman who experienced great sadness might imprint depressive tendencies onto the fetus in her womb. The theory of maternal impression was largely abandoned by the 20th century, with the development of modern genetic theory.

Folklore

In folklore, maternal imprinting, or Versehen (a German noun meaning "inadvertence", or as a verb "to provide") as it is usually called, is the belief that a sudden fear of some object or animal in a pregnant woman can cause her child to bear the mark of it. Some of the more vivid examples are given in Vance Randolph's Ozark Superstitions:

"Children are also said to be marked by some sudden fright or unpleasant experience of the mother, and I have myself seen a pop-eyed, big-mouthed idiot whose condition is ascribed to the fact that his mother stepped on a toad several months before his birth. In another case, a large red mark on a baby's cheek was caused by the mother seeing a man shot down at her side, when the discharge of the gun threw some of the blood and brains into her face."

Other explanations claimed that birthma
https://en.wikipedia.org/wiki/KCNN4
Potassium intermediate/small conductance calcium-activated channel, subfamily N, member 4, also known as KCNN4, is a human gene encoding the KCa3.1 protein.

Function

The KCa3.1 protein is part of a potentially heterotetrameric, voltage-independent potassium channel that is activated by intracellular calcium. Activation is followed by membrane hyperpolarization, which promotes calcium influx. The encoded protein may be part of the predominant calcium-activated potassium channel in T-lymphocytes. This gene is similar to other KCNN family potassium channel genes, but it differs enough that it may be considered part of a new subfamily.

History

The channel's activity was first described in 1958 by György Gárdos in human erythrocytes. The channel is also named the Gardos channel after its discoverer.

See also
SK channel
Voltage-gated potassium channel
Senicapoc
https://en.wikipedia.org/wiki/Singing%20sand
Singing sand, also called whistling sand, barking sand or singing dune, is sand that produces sound. The sound emission may be caused by wind passing over dunes or by walking on the sand. Certain conditions have to come together to create singing sand:
- The sand grains have to be round and between 0.1 and 0.5 mm in diameter.
- The sand has to contain silica.
- The sand needs to be at a certain humidity.

The most common frequency emitted seems to be close to 450 Hz. There are various theories about the singing sand mechanism. It has been proposed that the sound frequency is controlled by the shear rate. Others have suggested that the frequency of vibration is related to the thickness of the dry surface layer of sand. The sound waves bounce back and forth between the surface of the dune and the surface of the moist layer, creating a resonance that increases the sound's volume. The noise may be generated by friction between the grains or by the compression of air between them. Other sounds that can be emitted by sand have been described as "roaring" or "booming".

In dunes

Singing sand dunes, an example of the phenomenon of singing sand, produce a sound described as roaring, booming, squeaking, or the "Song of Dunes". This is a natural sound phenomenon of up to 105 decibels, lasting as long as several minutes, that occurs in about 35 desert locations around the world. The sound is similar to a loud low-pitch rumble. It emanates from crescent-shaped dunes, or barchans. The sound emission accompanies a slumping or avalanching movement of sand, usually triggered by wind passing over the dune or by someone walking near the crest. Examples of singing sand dunes include California's Kelso Dunes and Eureka Dunes; AuTrain Beach in Northern Michigan; sugar sand beaches and Warren Dunes in southwestern Michigan; Sand Mountain in Nevada; the Booming Dunes in the Namib Desert, Africa; Porth Oer (also known as Whistling Sands) near Aberdaron in Wales; Indiana Dunes in Indiana;
https://en.wikipedia.org/wiki/Dirichlet%20kernel
In mathematical analysis, the Dirichlet kernel, named after the German mathematician Peter Gustav Lejeune Dirichlet, is the collection of periodic functions defined as

D_n(x) = \sum_{k=-n}^{n} e^{ikx} = \frac{\sin\left(\left(n + \tfrac{1}{2}\right)x\right)}{\sin(x/2)},

where n is any nonnegative integer. The kernel functions are periodic with period 2\pi.

The importance of the Dirichlet kernel comes from its relation to Fourier series. The convolution of D_n(x) with any function f of period 2\pi is the nth-degree Fourier series approximation to f, i.e., we have

(D_n * f)(x) = \frac{1}{2\pi} \int_{-\pi}^{\pi} f(y)\, D_n(x - y)\, dy = \sum_{k=-n}^{n} \hat{f}(k)\, e^{ikx},

where

\hat{f}(k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} f(x)\, e^{-ikx}\, dx

is the kth Fourier coefficient of f. This implies that in order to study convergence of Fourier series it is enough to study properties of the Dirichlet kernel.

L1 norm of the kernel function

Of particular importance is the fact that the L1 norm of D_n on [0, 2\pi] diverges to infinity as n \to \infty. One can estimate that

\frac{1}{2\pi} \int_{-\pi}^{\pi} |D_n(x)|\, dx = \frac{4}{\pi^2} \log n + O(1).

By using a Riemann-sum argument to estimate the contribution in the largest neighbourhood of zero in which D_n is positive, and Jensen's inequality for the remaining part, it is also possible to show a lower bound of the same logarithmic order. This lack of uniform integrability is behind many divergence phenomena for the Fourier series. For example, together with the uniform boundedness principle, it can be used to show that the Fourier series of a continuous function may fail to converge pointwise, in rather dramatic fashion. See convergence of Fourier series for further details.

A proof of the logarithmic divergence uses the bound \sin(x/2) \le x/2 for x \in [0, \pi] (immediate from the Taylor series of the sine), which gives

\int_{0}^{\pi} |D_n(x)|\, dx \ge \int_{0}^{\pi} \frac{\left|\sin\left(\left(n + \tfrac{1}{2}\right)x\right)\right|}{x/2}\, dx \ge \frac{4}{\pi} \sum_{k=1}^{n} \frac{1}{k} = \frac{4}{\pi} H_n,

where H_n = \sum_{k=1}^{n} 1/k are the first-order harmonic numbers, and H_n \sim \log n.

Relation to the periodic delta function

The Dirichlet kernel is a periodic function which becomes the Dirac comb, i.e. the periodic delta function, in the limit n \to \infty. The Fourier coefficients of D_n form the indicator of the frequencies -n, \ldots, n; as n \to \infty these tend to the constant sequence 1, whose associated periodic distribution is the Dirac comb of period 2\pi, which remains invariant under the Fourier transform. Thus D_n must also have converged (in the sense of distributions) to the Dirac comb as n \to \infty. In a different vein, consider \Delta(x) as the identity element for convolution
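The logarithmic growth of the kernel's L1 norm can be checked numerically. The sketch below (a quick numerical illustration, not part of the article) evaluates the Lebesgue constant (1/2π)∫|Dₙ| with a midpoint rule:

```python
import math

def dirichlet_kernel(x, n):
    """D_n(x) = sin((n + 1/2) x) / sin(x / 2); the removable singularity
    at multiples of 2*pi is filled in with the limit value 2n + 1."""
    s = math.sin(x / 2)
    if abs(s) < 1e-12:
        return 2 * n + 1
    return math.sin((n + 0.5) * x) / s

def lebesgue_constant(n, steps=200_000):
    """(1 / 2pi) * integral of |D_n| over [-pi, pi], via the midpoint rule."""
    h = 2 * math.pi / steps
    total = sum(abs(dirichlet_kernel(-math.pi + (k + 0.5) * h, n))
                for k in range(steps))
    return total * h / (2 * math.pi)
```

For n = 1 this reproduces the classical value 1/3 + 2√3/π ≈ 1.436, and the growth from n = 10 to n = 100 is close to (4/π²)·log 10, matching the estimate in the text.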
https://en.wikipedia.org/wiki/Compton%20telescope
A Compton telescope (also known as a Compton camera or Compton imager) is a gamma-ray detector which utilizes Compton scattering to determine the origin of the observed gamma rays. Compton cameras are usually applied to detect gamma rays in the energy range where Compton scattering is the dominant interaction process, from a few hundred keV to several MeV. They are applied in fields such as astrophysics, nuclear medicine, and nuclear threat detection. In astrophysics, the most famous Compton telescope was COMPTEL aboard the Compton Gamma Ray Observatory, which pioneered the observation of the gamma-ray sky in the energy range between 0.75 and 30 MeV. A potential successor is NCT, the Nuclear Compton Telescope.
https://en.wikipedia.org/wiki/Evacuation%20simulation
Evacuation simulation is a method to determine evacuation times for areas, buildings, or vessels. It is based on the simulation of crowd dynamics and pedestrian motion. The distinction between buildings, ships, and vessels on the one hand and settlements and areas on the other hand is important for the simulation of evacuation processes. In the case of the evacuation of a whole district, the transport phase (see emergency evacuation) is usually covered by queueing models (see below). Simulations are not primarily methods for optimization. To optimize the geometry of a building or the procedure with respect to evacuation time, a target function has to be specified and minimized. Accordingly, one or several variables must be identified which are subject to variation.

Classification of models

Modelling approaches in the field of evacuation simulation:
- Cellular automata: discrete, microscopic models, where the pedestrian is represented by a cell state. There exist models for ship evacuation processes, bi-directional pedestrian flows, and general models with bionics aspects.
- Agent-based models: microscopic models, where the pedestrian is represented by an agent. The agents can have human attributes besides their coordinates, and their behavior can integrate a stochastic nature. There exist general models with spatial aspects of pedestrian steps.
- Social force model: a continuous, microscopic model, based on equations from physics.
- Queueing models: macroscopic models which are based on a graph representation of the geometry. The movement of the persons is represented as a flow on this graph.
- Particle swarm optimization models: microscopic models, based on a fitness function which minimizes some properties of the evacuation (distance between pedestrians, distance between pedestrians and exits).
- Fluid-dynamic models: continuous, macroscopic models, where large crowds are modeled with coupled, nonlinear, partial differential equations.

Simulation of evacuations

Buildings (t
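The cellular-automaton approach listed above can be sketched in a few lines. The toy model below (an illustrative sketch, not any published model) builds a BFS "static floor field" of distances to the exit and moves each agent one cell per step toward smaller distances, returning the number of steps until the grid is empty:

```python
from collections import deque

def distance_field(walls, exit_cell, width, height):
    """BFS 'static floor field': distance of every reachable free cell to the exit."""
    dist = {exit_cell: 0}
    queue = deque([exit_cell])
    while queue:
        x, y = queue.popleft()
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < width and 0 <= ny < height \
                    and nxt not in walls and nxt not in dist:
                dist[nxt] = dist[(x, y)] + 1
                queue.append(nxt)
    return dist

def evacuate(agents, walls, exit_cell, width, height):
    """Steps until all agents have reached the exit (greedy, one cell per step).

    Agents closest to the exit move first; a blocked agent waits one step.
    Assumes every agent can reach the exit and none starts on it.
    """
    dist = distance_field(walls, exit_cell, width, height)
    positions = set(agents)
    steps = 0
    while positions:
        steps += 1
        for cell in sorted(positions, key=dist.get):
            x, y = cell
            options = [n for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                       if n in dist and dist[n] < dist[cell] and n not in positions]
            if options:
                target = min(options, key=dist.get)
                positions.discard(cell)
                if target != exit_cell:   # stepping onto the exit = evacuated
                    positions.add(target)
    return steps
```

Queueing is implicit: two agents in a one-cell-wide corridor leave in consecutive steps because the rear agent can only advance once the cell ahead is free.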
https://en.wikipedia.org/wiki/Women%27s%20Museum%20Istanbul
Women's Museum Istanbul, located in Istanbul, Turkey, is an online museum devoted to the role played by women in the city. Launched online in 2012, it is Turkey's first and the world's third city museum dedicated to women. The permanent exhibition of the Women's Museum Istanbul presents the biographies of women who chose a different lifestyle than that which was expected in their times. Short texts accompanying the installations in the permanent exhibition illustrate the social, cultural, economic and political dynamics of each life. The Women's Museum Istanbul is currently in search of a building suitable for the concept it envisages.
https://en.wikipedia.org/wiki/Parchive
Parchive (a portmanteau of parity archive, formally known as the Parity Volume Set Specification) is an erasure-code system that produces par files for checksum verification of data integrity, with the capability to perform data recovery operations that can repair or regenerate corrupted or missing data. Parchive was originally written to solve the problem of reliable file sharing on Usenet, but it can be used for protecting any kind of data from data corruption, disc rot, bit rot, and accidental or malicious damage. Despite the name, Parchive uses more advanced techniques (specifically error-correcting codes) than simplistic parity methods of error detection.

As of 2014, PAR1 is obsolete, PAR2 is mature for widespread use, and PAR3 is a discontinued experimental version developed by MultiPar author Yutaka Sawada. The original SourceForge Parchive project has been inactive since April 30, 2015. A new PAR3 specification has been worked on since April 28, 2019 by PAR2 specification author Michael Nahas. An alpha version of the PAR3 specification was published on January 29, 2022, while the program itself is still being developed.

History

Parchive was intended to increase the reliability of transferring files via Usenet newsgroups. Usenet was originally designed for informal conversations, and the underlying protocol, NNTP, was not designed to transmit arbitrary binary data. Another limitation, which was acceptable for conversations but not for files, was that messages were normally fairly short in length and limited to 7-bit ASCII text. Various techniques were devised to send files over Usenet, such as uuencoding and Base64. Later Usenet software allowed 8-bit extended ASCII, which permitted new techniques like yEnc. Large files were broken up to reduce the effect of a corrupted download, but the unreliable nature of Usenet remained. With the introduction of Parchive, parity files could be created that were then uploaded along with the original data files. If an
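The "simplistic parity" the article contrasts Parchive against can be shown directly. The sketch below (a toy illustration; PAR2 itself uses Reed–Solomon coding and can repair many missing blocks, whereas XOR parity handles only one erasure) recovers a single missing block from one parity block:

```python
def parity_block(blocks):
    """XOR all equal-length blocks together into one parity block."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

def recover_missing(blocks, parity):
    """Rebuild the single block marked None by XOR-ing parity with the survivors.

    Works because XOR-ing everything (survivors + parity) cancels all
    blocks that appear twice, leaving exactly the missing one.
    """
    survivors = [b for b in blocks if b is not None]
    return parity_block(survivors + [parity])
```

Error-correcting codes like those in PAR2 generalize this idea: with k recovery blocks, any k missing data blocks can be regenerated.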
https://en.wikipedia.org/wiki/TeleSign
Telesign is a company based in Marina del Rey, California, United States, providing digital identity and programmable communications APIs to prevent fraud and enable omnichannel engagement. Telesign's risk-scoring API analyzes phone number attributes acquired during the authentication process to assist companies with onboarding new customers securely and maintaining account integrity. Joe Burton is the current CEO of the company.

History

Founded in 2005 as a Communications Platform as a Service company by Darren Berkovitz, Ryan Disraeli and Stacy Stubblefield, the company pioneered two-factor authentication services, a secure supplementary method of preventing illicit access to online accounts. Co-founders Disraeli, Stubblefield and Berkovitz were named USC alumni entrepreneurs of the year. TeleSign was an early project of the technology incubator Curious Minds. As of 2014, the company had raised $49 million in funds. In April 2017, TeleSign announced that it would be acquired by BICS, a Belgium-based company, for $230 million. In December 2021, TeleSign announced it intends to go public via a business combination with North Atlantic Acquisition Corporation (NASDAQ: NAAC).
https://en.wikipedia.org/wiki/Opponent%20process
The opponent process is a color theory that states that the human visual system interprets information about color by processing signals from photoreceptor cells in an antagonistic manner. The opponent-process theory suggests that there are three opponent channels, each comprising an opposing color pair: red versus green, blue versus yellow, and black versus white (luminance). The theory was first proposed in 1892 by the German physiologist Ewald Hering.

Color theory

Complementary colors

When staring at a bright color for a while (e.g. red), then looking away at a white field, an afterimage is perceived, such that the original color will evoke its complementary color (green, in the case of red input). When complementary colors are combined or mixed, they "cancel each other out" and become neutral (white or gray). That is, complementary colors are never perceived as a mixture; there is no "greenish red" or "yellowish blue", despite claims to the contrary. The strongest color contrast a color can have is its complementary color. Complementary colors may also be called "opposite colors" and are understandably the basis of the colors used in the opponent-process theory.

Unique hues

The colors that define the extremes for each opponent channel are called unique hues, as opposed to composite (mixed) hues. Ewald Hering first defined the unique hues as red, green, blue, and yellow, and based them on the concept that these colors could not be perceived simultaneously. For example, a color cannot appear both red and green. These definitions have been experimentally refined and are represented today by average hue angles of 353° (carmine red), 128° (cobalt green), 228° (cobalt blue), and 58° (yellow). Unique hues can differ between individuals and are often used in psychophysical research to measure variations in color perception due to color-vision deficiencies or color adaptation. While there is considerable inter-subject variability when defining unique hues experimentally
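The three-channel structure can be illustrated with a toy numeric encoding. The weights below are purely illustrative (not the physiological cone weightings), but they reproduce the key qualitative claim: red and green drive one channel in opposite directions, so an equal mix cancels to zero rather than reading as "greenish red":

```python
def opponent_channels(r, g, b):
    """Toy opponent encoding of an RGB triple (illustrative weights only)."""
    luminance = (r + g + b) / 3       # black vs. white channel
    red_green = r - g                 # red pushes positive, green negative
    blue_yellow = b - (r + g) / 2     # blue positive, yellow (red+green) negative
    return luminance, red_green, blue_yellow
```

Mixing full red and full green yields red_green = 0 and a negative blue_yellow value, i.e. the mix registers as yellow, mirroring the "cancel each other out" behavior described above.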
https://en.wikipedia.org/wiki/Thermo-mechanical%20fatigue
Thermo-mechanical fatigue (TMF) is the overlay of a cyclical mechanical loading, which leads to fatigue of a material, with a cyclical thermal loading. Thermo-mechanical fatigue is an important consideration when constructing turbine engines or gas turbines.

Failure mechanisms

Three mechanisms act in thermo-mechanical fatigue:
- Creep is the flow of material at high temperatures.
- Fatigue is crack growth and propagation due to repeated loading.
- Oxidation is a change in the chemical composition of the material due to environmental factors. The oxidized material is more brittle and prone to crack creation.

Each factor has more or less of an effect depending on the parameters of loading. In-phase (IP) thermo-mechanical loading (when the temperature and load increase at the same time) is dominated by creep. The combination of high temperature and high stress is the ideal condition for creep. The heated material flows more easily in tension, but cools and stiffens under compression. Out-of-phase (OP) thermo-mechanical loading is dominated by the effects of oxidation and fatigue. Oxidation weakens the surface of the material, creating flaws and seeds for crack propagation. As the crack propagates, the newly exposed crack surface then oxidizes, weakening the material further and enabling the crack to extend. A third case occurs in OP TMF loading when the stress difference is much greater than the temperature difference. Fatigue alone is the driving cause of failure in this case, causing the material to fail before oxidation can have much of an effect.

TMF is still not fully understood. There are many different models that attempt to predict the behavior and life of materials undergoing TMF loading. The two models presented below take different approaches.

Models

There are many different models that have been developed in an attempt to understand and explain TMF. This page will address the two broadest approaches, constitutive and phenomenological.
https://en.wikipedia.org/wiki/Now%20%26%20Zen
Now & Zen, Inc. is an American company, founded by Steve McIntosh in January 1995, that is based in Boulder, Colorado. The Zen Alarm Clock was introduced in early 1996. McIntosh stepped down as CEO in 2012. Patents Now & Zen holds two patents covering both the design and utility aspects of its chiming alarm clocks: U.S. Patent No. Des. 390,121, issued February 3, 1998, and U.S. Patent No. US 6,819,635 B2, issued November 16, 2004. Products All Now & Zen products were invented and designed by founder and former CEO Steve McIntosh. The company produced a number of household products in the beginning, but now only produces acoustic alarm clocks and timers. The Zen Alarm Clock, uses a series of progressive acoustic chimes to wake people gradually. In 2001 the firm introduced a portable, digital version of its chiming alarm clock. In 2005, the firm introduced an alarm clock and timer featuring a six-inch brass bowl-gong, called The Zen Timepiece. Reviews The Zen Alarm Clock was reviewed by The New York Times, The Los Angeles Times, the Good Morning America television show, The Washington Post, and Good Housekeeping Magazine. Criticism Because of The Zen Alarm Clock's New Age positioning, some reviewers have ridiculed it. In his review of the product in The New York Times, reviewer William L. Hamilton wrote: "It is like a monk losing his temper — om to OM! Now! Tranquil, tenacious — the Dalai Lama as drill sergeant". Similarly, Dads Magazine referred to the aesthetics of the triangular shaped version of the clock as a "hippie carpenter contraption," but nevertheless praised the way it woke users gently and gradually. Moreover, despite the moderate success of The Zen Alarm Clock, the company has also had some failures, such as The Affirmation Station, introduced in 1998, which was designed to wake users with their personal affirmations. However, the product failed to gain consumer acceptance and was discontinued after three years on the market.
https://en.wikipedia.org/wiki/Saturated%20model
In mathematical logic, and particularly in its subfield model theory, a saturated model M is one that realizes as many complete types as may be "reasonably expected" given its size. For example, an ultrapower model of the hyperreals is ℵ1-saturated, meaning that every countable descending nested sequence of internal sets has a nonempty intersection.

Definition

Let κ be a finite or infinite cardinal number and M a model in some first-order language. Then M is called κ-saturated if for all subsets A ⊆ M of cardinality less than κ, the model M realizes all complete types over A. The model M is called saturated if it is |M|-saturated, where |M| denotes the cardinality of M. That is, it realizes all complete types over sets of parameters of size less than |M|. According to some authors, a model M is called countably saturated if it is ℵ1-saturated; that is, it realizes all complete types over countable sets of parameters. According to others, it is countably saturated if it is countable and saturated.

Motivation

The seemingly more intuitive notion—that all complete types of the language are realized—turns out to be too weak (and is appropriately named weak saturation, which is the same as 1-saturation). The difference lies in the fact that many structures contain elements that are not definable (for example, any transcendental element of R is, by definition of the word, not definable in the language of fields). However, they still form a part of the structure, so we need types to describe relationships with them. Thus we allow sets of parameters from the structure in our definition of types. This argument allows us to discuss specific features of the model that we may otherwise miss—for example, a bound on a specific increasing sequence cn can be expressed as realizing the type {x > cn : n ∈ ω}, which uses countably many parameters. If the sequence is not definable, this fact about the structure cannot be described using the base language, so a weakly saturated structure may not bound the sequence.
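The increasing-sequence example above can be made concrete with a standard textbook illustration (not taken from this article) of ℵ1-saturation in an ultrapower of the reals:

```latex
% The countable type of an upper bound for all natural numbers:
p(x) \;=\; \{\, x > n \;:\; n \in \mathbb{N} \,\}.
% Every finite subset of p is realized in \mathbb{R} (take x = \max n + 1),
% so p is a consistent type over a countable parameter set.
% An \aleph_1-saturated ultrapower {}^{*}\mathbb{R} therefore realizes p,
% and any realization is an element larger than every natural number,
% i.e.\ an infinite hyperreal.
```

The same pattern explains the opening remark: countable saturation is exactly what guarantees that countable nested families of internal sets (each a "finite approximation" of a type) have a common element.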
https://en.wikipedia.org/wiki/CAdES%20%28computing%29
CAdES (CMS Advanced Electronic Signatures) is a set of extensions to Cryptographic Message Syntax (CMS) signed data making it suitable for advanced electronic signatures.

Description

CMS is a general framework for electronic signatures for various kinds of transactions, such as purchase requisitions, contracts or invoices. CAdES specifies precise profiles of CMS signed data making it compliant with the European eIDAS regulation (Regulation on electronic identification and trust services for electronic transactions in the internal market). The eIDAS regulation enhances and repeals the Electronic Signatures Directive 1999/93/EC. eIDAS has been legally binding in all EU member states since July 2014. An electronic signature that has been created in compliance with eIDAS has the same legal value as a handwritten signature.

An electronic signature technically implemented on the basis of CAdES has the status of an advanced electronic signature. This means that:
- it is uniquely linked to the signatory;
- it is capable of identifying the signatory;
- only the signatory has control of the data used for the signature creation;
- it can be identified if data attached to the signature has been changed after signing.

A resulting property of CAdES is that electronically signed documents can remain valid for long periods, even if the signer or a verifying party later attempts to deny the validity of the signature.

A CAdES-based electronic signature is accepted in court proceedings as evidence, as advanced electronic signatures are legally binding. But it gains higher probative value when enhanced to a qualified electronic signature. To receive that legal standing, it needs to be based on a qualified digital certificate and created by a secure signature creation device ("qualified electronic signature"). The authorship of a statement with a qualified electronic signature cannot be challenged; the statement is non-repudiable.

The document ETSI TS 101 733 Electronic Signature and Infrastructure (ESI) – CMS
https://en.wikipedia.org/wiki/Zoological%20Survey%20of%20India
The Zoological Survey of India (ZSI), founded on 1 July 1916, is a premier Indian organisation in zoological research and studies, now under the Ministry of Environment, Forest and Climate Change of the Government of India, established to promote the survey, exploration and research of the fauna of the country. History The annals of the Zoological Survey of India (ZSI) reflect an eventful beginning for the Survey even before its formal birth and growth. The history of the ZSI begins in the days of the Asiatic Society of Bengal, founded by Sir William Jones on 15 January 1784. The Asiatic Society of Bengal was the mother institution not only of the Indian Museum (1875) but also of institutions like the Zoological Survey of India and the Geological Survey of India. The ZSI's establishment was in fact a fulfillment of the dream of Sir William Jones, the founder of the Asiatic Society of Bengal, whose vision encompassed the entire range of human knowledge. The Asiatic Society had been collecting zoological and geological specimens since 1796 and set up a museum in 1814. Nathaniel Wallich, the first Superintendent of the "Museum of the Asiatic Society", was in charge of the growing collections of geological and zoological specimens; he augmented the animal collections of the Zoological Galleries of the Museum. The genesis of the ZSI came in 1875 with the opening of the Indian Museum. The new museum at its inception comprised only three sections: the Zoological, the Archaeological and the Geological. The zoological collections of the Asiatic Society of Bengal were formally handed over to the board of trustees of the Indian Museum in 1875. The Zoological Section of the Museum steadily expanded during the period from 1875 to 1916, growing into the greatest collection of natural history in Asia. This was due to the care and activity of the Curators of the Asiatic Society of Bengal and the Superintendents of the Indian Museum, viz., John McClelland, Edward Blyth, John Anderson, James Wood-Mason, Alfred William
https://en.wikipedia.org/wiki/List%20of%20PTP%20implementations
Precision Time Protocol (PTP) is a widely adopted protocol for delivery of precise time over a computer network. A complete PTP system includes PTP functionality in network equipment and hosts. PTP may be implemented in hardware, software or a combination of both. PTP is implemented in end systems and in PTP-aware networking hardware. PTP implementations may serve as a source of time for the network (a grandmaster), or operate as a slave, receiving time from and synchronizing to the grandmaster. Some implementations are able to operate as either master or slave. Routers and switches Artel 1G Quarra Switch Artel 10Gig Quarra Switch ACRA CONTROL (now Curtiss-Wright) airborne switches Alcatel-Lucent 7210 Service Access Switch Alcatel-Lucent 7705 Service Aggregation Router Alcatel-Lucent 7750 Service Router Allen-Bradley Stratix 5400, 5410, 5700, 8000 Managed Switches Arista 7050X/X2/X3 Series Switches Arista 7060X/X2 Series Switches Arista 7150 Series Switches Arista 7280E/R/R2 Series Switches Arista 7500E/R/R2 Series Switches Aruba 2930M Series Switches (with WC.16.04 software release) Aruba CX 6300 M series Aruba CX 8360 BitStream Hyperion 300/402/500 Series Switches Brocade 6910 Ethernet Access Switch Dell EMC PowerSwitch S4100-ON Series Switches Dell EMC PowerSwitch S5200-ON Series Switches Cisco 7600 Router Cisco ASR 903 Router Cisco ASR 9000 Router Cisco Catalyst 9300 Switch Cisco CGS 2520 Switch Cisco Industrial Ethernet 3000 Series Switches Cisco Industrial Ethernet 5000 Series Switches Cisco Nexus 3000 Series Switches Cisco Nexus 5000 Series Switches Cisco Nexus 7000 router Cisco Nexus 9000 router Connect Tech Inc. Xtreme/10G Managed Ethernet Switch/Router Crystal Instruments IEEE 1588 Spider-HUB Industrial Ethernet Switch Ericsson Router 6000 series Extreme Networks E4G-200 router Extreme Networks E4G-400 router Fibrolan Falcon-R Class xHaul Switch/Grandmaster Fibrolan Falcon-M Class Switches/Routers Fibrolan Falc
https://en.wikipedia.org/wiki/ERIKA%20Enterprise
ERIKA Enterprise is a real-time operating system (RTOS) kernel for embedded systems, which is OSEK/VDX certified. It is free and open source software released under a GNU General Public License (GPL). The RTOS also includes RT-Druid, an integrated development environment (IDE) based on Eclipse. ERIKA Enterprise implements various conformance classes, including the standard OSEK/VDX conformance classes BCC1, BCC2, ECC1, ECC2, CCCA, and CCCB. Also, ERIKA provides other custom conformance classes named FP (fixed priority), EDF (earliest deadline first scheduling), and FRSH (an implementation of resource reservation protocols). Due to the collaboration with the Tool & Methodologies team of Magneti Marelli Powertrain & Electronics, the automotive kernel (BCC1, BCC2, ECC1, ECC2, multicore, memory protection, and kernel fixed priority with Diab 5.5.1 compiler) is MISRA C 2004 compliant using FlexeLint 9.00h under the configuration suggested by Magneti Marelli. In August 2012 ERIKA Enterprise officially received the OSEK/VDX certification; see below. History ERIKA Enterprise began in the year 2000 with the aim to support multicore devices for the automotive markets. The main milestones are: 2000: support for STMicroelectronics ST10 2001: support for ARM7 2002: support for Janus, a prototype dual ARM7 system for the automotive market 2004: support for Hitachi H8 2005: support for Altera Nios II, with support for partitioning on multicore designs; availability of the RT-Druid code generator 2006: support for Microchip dsPIC 2007: support for Atmel AVR Micaz 2009: announced ERIKA website on TuxFamily 2010: support for TriCore, Freescale S12XS, Freescale PowerPC 5000 PPC MPC5674F Mamba, Microchip PIC24, Microchip PIC32, Lattice MICO32, eSi-RISC 2011: support for Texas Instruments MSP430, Renesas R2xx, Freescale S12G, Freescale PowerPC 5000 PPC MPC5668G Fado 2012: support for ARM Cortex-M, Atmel AVR (Arduino), TI Stellaris Cortex M4, Freescale PowerPC 5000 PPC MP
https://en.wikipedia.org/wiki/Pfaffian%20constraint
In dynamics, a Pfaffian constraint is a way to describe a dynamical system in the form: $\sum_{s=1}^{n} A_{rs}\,du_s + A_r\,dt = 0, \quad r = 1, \ldots, L$ where $L$ is the number of equations in a system of constraints. Holonomic systems can always be written in Pfaffian constraint form. Derivation Given a holonomic system described by a set of holonomic constraint equations $f_r(u_1, u_2, \ldots, u_n, t) = 0, \quad r = 1, \ldots, L$ where $u_1, u_2, \ldots, u_n$ are the $n$ generalized coordinates that describe the system, and where $L$ is the number of equations in a system of constraints, we can differentiate by the chain rule for each equation: $\sum_{s=1}^{n} \frac{\partial f_r}{\partial u_s}\,du_s + \frac{\partial f_r}{\partial t}\,dt = 0, \quad r = 1, \ldots, L.$ By a simple substitution of nomenclature, $A_{rs} = \partial f_r / \partial u_s$ and $A_r = \partial f_r / \partial t$, we arrive at: $\sum_{s=1}^{n} A_{rs}\,du_s + A_r\,dt = 0, \quad r = 1, \ldots, L.$ Examples Pendulum Consider a pendulum. Because of how the motion of the weight is constrained by the arm, the velocity vector $\dot{\mathbf r}$ of the weight must be perpendicular at all times to the position vector $\mathbf r$. Because these vectors are always orthogonal, their dot product must be zero: $\mathbf r \cdot \dot{\mathbf r} = 0.$ Both position and velocity of the mass can be defined in terms of an $x$-$y$ coordinate system: $\mathbf r = x\,\hat{\mathbf x} + y\,\hat{\mathbf y}, \quad \dot{\mathbf r} = \dot x\,\hat{\mathbf x} + \dot y\,\hat{\mathbf y}.$ Simplifying the dot product yields: $x\dot x + y\dot y = 0.$ We multiply both sides by $dt$. This results in the Pfaffian form of the constraint equation: $x\,dx + y\,dy = 0.$ This Pfaffian form is useful, as we may integrate it to solve for the holonomic constraint equation of the system, if one exists. In this case, the integration is rather trivial: $\frac{x^2}{2} + \frac{y^2}{2} = C,$ where $C$ is the constant of integration. And conventionally, we may write: $x^2 + y^2 = L^2.$ The constant is written as a square simply because it must be a positive number; being a physical system, dimensions must all be real numbers. Indeed, $L$ is the length of the pendulum arm. Robotics In robot motion planning, a Pfaffian constraint is a set of $k$ linearly independent constraints linear in velocity, i.e., of the form $A(q)\,\dot q = 0.$ One source of Pfaffian constraints is rolling without slipping in wheeled robots.
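The pendulum example can be checked numerically. The sketch below is a toy verification with an assumed arm length and an arbitrary swing profile (none of these values come from the article): it confirms that position and velocity stay orthogonal, which is the Pfaffian constraint x dx + y dy = 0, and that x^2 + y^2 stays at the constant L^2.

```python
import math

# Toy numerical check of the pendulum's Pfaffian constraint (assumed
# arm length L and an arbitrary smooth swing profile theta(t)).
L = 2.0
theta = lambda t: 0.7 * math.sin(3.0 * t)

def position(t):
    # Bob at arm length L, hanging below the pivot at angle theta.
    return (L * math.sin(theta(t)), -L * math.cos(theta(t)))

def velocity(t, h=1e-6):
    # Central-difference approximation of the time derivative.
    (x1, y1), (x2, y2) = position(t - h), position(t + h)
    return ((x2 - x1) / (2 * h), (y2 - y1) / (2 * h))

for t in (0.0, 0.4, 1.3):
    x, y = position(t)
    vx, vy = velocity(t)
    # Pfaffian form x dx + y dy = 0, i.e. r . dr/dt = 0.
    assert abs(x * vx + y * vy) < 1e-6
    # Integrated (holonomic) form: x^2 + y^2 = L^2.
    assert abs(x * x + y * y - L * L) < 1e-9
print("pendulum constraint holds")
```

Any smooth swing profile would do here; the constraint is a property of the parametrization by the arm, not of the particular motion.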
https://en.wikipedia.org/wiki/Partial%20fractions%20in%20complex%20analysis
In complex analysis, a partial fraction expansion is a way of writing a meromorphic function $f(z)$ as an infinite sum of rational functions and polynomials. When $f(z)$ is a rational function, this reduces to the usual method of partial fractions. Motivation By using polynomial long division and the partial fraction technique from algebra, any rational function can be written as a sum of terms of the form $\frac{a}{(x - b)^k} + p(x)$, where $a$ and $b$ are complex, $k$ is an integer, and $p(x)$ is a polynomial. Just as polynomial factorization can be generalized to the Weierstrass factorization theorem, there is an analogy to partial fraction expansions for certain meromorphic functions. A proper rational function (one for which the degree of the denominator is greater than the degree of the numerator) has a partial fraction expansion with no polynomial terms. Similarly, a meromorphic function $f(z)$ for which $|f(z)|$ goes to 0 as $z$ goes to infinity at least as quickly as $|1/z|$ has an expansion with no polynomial terms. Calculation Let $f(z)$ be a function meromorphic in the finite complex plane with poles at $\lambda_1, \lambda_2, \ldots$ and let $(\Gamma_1, \Gamma_2, \ldots)$ be a sequence of simple closed curves such that: The origin lies inside each curve $\Gamma_k$ No curve passes through a pole of $f$ $\Gamma_k$ lies inside $\Gamma_{k+1}$ for all $k$ $\lim_{k\to\infty} d(\Gamma_k) = \infty$, where $d(\Gamma_k)$ gives the distance from the curve to the origin one more condition of compatibility with the poles $\lambda_k$, described at the end of this section Suppose also that there exists an integer $p$ such that $\lim_{k\to\infty} \oint_{\Gamma_k} \left|\frac{f(z)}{z^{p+1}}\right| |dz| < \infty.$ Writing $\operatorname{PP}(f(z); z = \lambda_k)$ for the principal part of the Laurent expansion of $f$ about the point $\lambda_k$, we have $f(z) = \sum_{k=0}^{\infty} \operatorname{PP}(f(z); z = \lambda_k)$ if $p = -1$. If $p > -1$, then $f(z) = \sum_{k=0}^{\infty} \left(\operatorname{PP}(f(z); z = \lambda_k) + c_{0,k} + c_{1,k}z + \cdots + c_{p,k}z^p\right),$ where the coefficients $c_{j,k}$ are given by $c_{j,k} = \operatorname{Res}_{z=\lambda_k} \frac{f(z)}{z^{j+1}}.$ Here $\lambda_0$ should be set to 0, because even if $f(z)$ itself does not have a pole at 0, the residues of $\frac{f(z)}{z^{j+1}}$ at $z = 0$ must still be included in the sum. Note that in the case of $\lambda_0 = 0$, we can use the Laurent expansion of $f(z)$ about the origin to get $c_{j,0} = \operatorname{Res}_{z=0} \frac{f(z)}{z^{j+1}} = a_j$, so that the polynomial terms contributed are exactly the regular part of the Laurent series up to $z^p$. For the other poles $\lambda_k$ with $k \geq 1$, $\frac{1}{z^{j+1}}$ can be pulled out of the residue calculations: $c_{j,k} = \frac{1}{\lambda_k^{j+1}} \operatorname{Res}_{z=\lambda_k} f(z).$ To avoid issues with convergence, the poles sh
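A concrete illustration of such an expansion (a standard example, not one taken from the text above) is $\pi\cot(\pi z) = \frac{1}{z} + \sum_{n\ge 1} \frac{2z}{z^2 - n^2}$, where each term $\frac{2z}{z^2-n^2} = \frac{1}{z-n} + \frac{1}{z+n}$ pairs the principal parts at the simple poles $z = \pm n$. The identity can be checked numerically:

```python
import cmath

# Numerical check of the classical partial fraction expansion
#   pi * cot(pi z) = 1/z + sum_{n>=1} 2z / (z^2 - n^2).
# (A standard example, not from the article text.)

def cot_expansion(z, terms=100000):
    s = 1 / z
    for n in range(1, terms + 1):
        s += 2 * z / (z * z - n * n)
    return s

z = 0.37 + 0.21j
exact = cmath.pi * cmath.cos(cmath.pi * z) / cmath.sin(cmath.pi * z)
approx = cot_expansion(z)
# The tail of the sum decays like 2|z|/N, so 1e5 terms suffice here.
assert abs(exact - approx) < 1e-4
print("expansion error:", abs(exact - approx))
```

The slow 1/N tail is typical: the raw principal-part sum converges only conditionally, which is why the general theorem adds the polynomial correction terms described above.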
https://en.wikipedia.org/wiki/Nigericin
Nigericin is an antibiotic derived from Streptomyces hygroscopicus. Its isolation was described in the 1950s, and in 1968 its structure was elucidated by X-ray crystallography. The structure and properties of nigericin are similar to those of the antibiotic monensin. Commercially it is obtained as a byproduct, or contaminant, of the fermentation of geldanamycin. It is also called polyetherin A, azalomycin M, helixin C, antibiotic K178, and antibiotic X-464. Nigericin acts as an H+, K+, and Pb2+ ionophore. Most commonly it is an antiporter of H+ and K+. In the past nigericin was used as an antibiotic active against gram-positive bacteria. It inhibits Golgi functions in eukaryotic cells. Its ability to induce K+ efflux also makes it a potent activator of the NLRP3 inflammasome.
https://en.wikipedia.org/wiki/Exponential%20map%20%28discrete%20dynamical%20systems%29
In the theory of dynamical systems, the exponential map can be used as the evolution function of a discrete nonlinear dynamical system. Family The family of exponential functions is called the exponential family. Forms There are many forms of these maps, many of which are equivalent under a coordinate transformation. For example, two of the most common ones are $E_\lambda(z) = \lambda e^{z}$ and $F_\lambda(z) = e^{\lambda z}$. The second one can be mapped to the first using the fact that multiplying $z_{n+1} = e^{\lambda z_n}$ through by $\lambda$ gives $\lambda z_{n+1} = \lambda e^{\lambda z_n}$, so under the substitution $w = \lambda z$ the iteration becomes $w_{n+1} = \lambda e^{w_n}$, which is the first form. The only difference is that, due to multi-valued properties of exponentiation, there may be a few select cases that can only be found in one version. Similar arguments can be made for many other formulas.
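The conjugacy between the two forms can be verified numerically: iterating $F_\lambda$ on $z$ and $E_\lambda$ on $w = \lambda z$ produces orbits related by that substitution at every step. The sketch below uses an arbitrarily chosen $\lambda$ and starting point (assumptions, picked small so the orbit stays bounded):

```python
import cmath

# Check of the conjugacy between the two standard forms: if
# z_{n+1} = exp(lam * z_n) and w_n = lam * z_n, then
# w_{n+1} = lam * exp(w_n).
lam = 0.3 + 0.1j   # assumed parameter
z = 0.5 + 0.2j     # assumed starting point
w = lam * z
for _ in range(20):
    z = cmath.exp(lam * z)       # second form: z -> e^(lam z)
    w = lam * cmath.exp(w)       # first form:  w -> lam e^w
    assert abs(w - lam * z) < 1e-9
print("conjugate orbits agree")
```

For this small $\lambda$ both orbits settle toward an attracting fixed point; for larger parameters the orbits escape, but the step-by-step relation $w_n = \lambda z_n$ still holds.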
https://en.wikipedia.org/wiki/Selman%20A.%20Waksman%20Award%20in%20Microbiology
The Selman A. Waksman Award in Microbiology is awarded by the U.S. National Academy of Sciences "in recognition of excellence in the field of microbiology." Named after Selman Waksman, it was first awarded in 1968. A $5000 prize is included in the honor. List of Selman A. Waksman Award in Microbiology winners Source: National Academy of Sciences Pascale Cossart (2021) For her pioneering contributions to the field of cellular microbiology and her fundamental work uncovering novel mechanisms that govern the interplay between the pathogenic intracellular bacterium Listeria and its mammalian host, as well as her many contributions to supporting microbiology worldwide. Sharon R. Long (2019) For pioneering research defining the molecular mechanisms underlying the important nitrogen-fixing symbiosis between Rhizobium and legumes, research that has had major implications for microbe-host interactions in general. Bernard Roizman (2017) For his many seminal contributions to understanding the mechanisms by which herpes viruses replicate and cause disease. Susan Gottesman (2015) For transforming our understanding of post-transcriptional regulation in bacteria through mechanisms of controlled proteolysis and small RNAs. Jeffrey Gordon (2013) For his pioneering interdisciplinary studies on the human microbiome and for defining the genomic and metabolic foundations of its contributions to health and disease. Carol A. Gross (2011) For her pioneering studies on mechanisms of gene transcription and its control, and for defining the roles of sigma factors during homeostasis and under stress. Jonathan Beckwith (2009) For fundamental contributions to gene regulation, protein targeting and secretion, and disulfide biochemistry, and also for the development of gene fusions as an experimental tool. Richard M. Losick (2007) For discovering alternative bacterial sigma factors and his fundamental contributions to understanding the mechanism of bacterial sporulation. Lucy Sh
https://en.wikipedia.org/wiki/Splice%20%28system%20call%29
splice is a Linux-specific system call that moves data between a file descriptor and a pipe without a round trip to user space. The related vmsplice system call moves or copies data between a pipe and user space. Ideally, splice and vmsplice work by remapping pages and do not actually copy any data, which may improve I/O performance. As linear addresses do not necessarily correspond to contiguous physical addresses, this may not be possible in all cases and on all hardware combinations.

Workings

With splice, one can move data from one file descriptor to another without incurring any copies from user space into kernel space, which is usually required to enforce system security and also to keep a simple interface for processes to read and write to files. splice works by using the pipe buffer. A pipe buffer is an in-kernel memory buffer that is opaque to the user-space process. A user process can splice the contents of a source file into this pipe buffer, then splice the pipe buffer into the destination file, all without moving any data through user space. Linus Torvalds described splice in a 2006 email, which was included in a KernelTrap article.

Origins

The Linux splice implementation borrows some ideas from an original proposal by Larry McVoy in 1998. The splice system calls first appeared in Linux kernel version 2.6.17 and were written by Jens Axboe.

Prototype

    ssize_t splice(int fd_in, loff_t *off_in, int fd_out,
                   loff_t *off_out, size_t len, unsigned int flags);

Some constants that are of interest are:

    /* Splice flags (not laid down in stone yet). */
    #ifndef SPLICE_F_MOVE
    #define SPLICE_F_MOVE     0x01
    #endif
    #ifndef SPLICE_F_NONBLOCK
    #define SPLICE_F_NONBLOCK 0x02
    #endif
    #ifndef SPLICE_F_MORE
    #define SPLICE_F_MORE     0x04
    #endif
    #ifndef SPLICE_F_GIFT
    #define SPLICE_F_GIFT     0x08
    #endif

Example

This is an example of splice in action:

    /* Transfer from disk to a log. */
    int log_blocks (struct log_handle * handle, int fd,
                    loff_t offset, size_t size)
    {
        int
https://en.wikipedia.org/wiki/Paul%20K.%20Guillow%2C%20Inc.
Paul K. Guillow, Inc., commonly known as Guillow's, is an American manufacturer of balsa wood model aircraft kits. The company was founded by Paul K. Guillow in 1926 in Wakefield, Massachusetts, and was originally called NuCraft Toys. Founder Born in 1893, Paul Guillow was a naval aviator during World War I, and returned from Europe with an interest in aviation-related toys. He later went on to graduate from Worcester Polytechnic Institute. Guillow died in 1951. Company history Among the company's earliest products were a card game called The Lindy Flying Game, which was introduced in 1927, and a board game called Crash: The New Airplane Game, which was introduced in 1928. Soon after Charles Lindbergh's famous solo transatlantic flight in 1927, a craze for all things aeronautical swept over America. Guillow's capitalized on that fad by introducing a line of balsa wood model kits. The first line of Guillow's balsa non-flying shelf model kits consisted of twelve different World War I biplane fighters with six-inch wingspans that retailed for 10 cents each. Each kit contained a 3-view plan, balsa wood, cement, two bottles of colored aircraft dope, and a strip of bamboo for wing and landing gear struts; this was considered relatively good value for such toys at that time. In 1933, demand for the kits was high enough to enable Guillow's to move out of the family barn where it had started, and into its present-day location in Wakefield. In the 1940s, the company also supplemented the production of model airplanes with the publication of several books on the construction of flying model planes. During World War II, the supply of balsa wood was diverted to the war effort for the manufacture of rafts and life jackets. Guillow's was forced to use alternative materials like cardboard or pine wood to manufacture the model kits. In the meantime the company also diversified into building target drone aircraft as training aids for gunners. After the war, to meet changing cus
https://en.wikipedia.org/wiki/Ministry%20of%20Agrarian%20Policy%20and%20Food%20%28Ukraine%29
The Ministry of Agrarian Policy and Food () is the central executive authority of Ukraine in charge of the country's agrarian development. It is one of the oldest government agencies of Ukraine. On 29 August 2019 the ministry's functions were taken over by the Ministry of Economic Development, Trade and Agriculture. On 17 December 2020 the ministry was re-established. The ministry's activity is coordinated by the Cabinet of Ukraine. The ministry is the main authority in the system of central government responsible for supervising and implementing national agricultural policy, including policy on agriculture and food security, public policy in the fields of fishery and fishery protection, the use and reproduction of aquatic resources, the regulation of fishery and maritime security, veterinary medicine, species protection, land-related questions, mapping and surveying, forestry and hunting, and surveillance (monitoring) in agriculture. History The precursor of the Ministry of Agrarian Policy and Food of Ukraine was the General Secretariat of Land Affairs, established on June 15, 1917 under the Government (the First Vynnychenko government) of the Central Council of Ukraine. List of names of the precursor of the ministry Ministry of Agriculture of the Ukrainian SSR Ministry of Fruits and Vegetables of the Ukrainian SSR Ministry of Rural Construction of the Ukrainian SSR Ministry of Meat and Milk Industry of the Ukrainian SSR Ministry of Food Industry of the Ukrainian SSR Ministry of Preparations of the Ukrainian SSR State Committee of the Ukrainian SSR on production and technical provision of agriculture Main Administration in gardening, viticulture and wine industry of the Ukrainian SSR State Agro-Industrial Committee of the Ukrainian SSR: 1986 - 1992 (part of the State Agro-Industrial Committee of the USSR) Kolhosps (along with the Council of kolhosps of the Ukrainian SSR) Ministry of Bread products of the Ukrainian SSR Ministry of Amelioration and Water Management of the Ukrainian
https://en.wikipedia.org/wiki/Caenorhabditis%20elegans%20Cer13%20virus
Caenorhabditis elegans Cer13 virus is a species of virus in the genus Semotivirus and the family Belpaoviridae. It exists as retrotransposons in the Caenorhabditis elegans genome.
https://en.wikipedia.org/wiki/Whiplash%20%28decorative%20art%29
The whiplash or whiplash line is a motif of decorative art and design that was particularly popular in Art Nouveau. It is an asymmetrical, sinuous line, often in an ornamental S curve, usually inspired by natural forms such as plants and flowers, which suggests dynamism and movement. It took its name from a woven fabric panel called "Coup de Fouet" ("Whiplash") by the German artist Hermann Obrist (1895) which depicted the stems and roots of the cyclamen flower. The panel was later reproduced by the textile workshop of the Darmstadt Artists Colony. Curling whiplash lines were modelled after natural and vegetal forms, particularly the cyclamen, iris, orchid, thistle, mistletoe, holly, water lily and from the stylized lines of the swan, peacock, dragonfly and butterfly. In architecture, furniture and other decorative arts, the decoration was entirely integrated with the structure. The whiplash lines were frequently interlaced and combined with twists and scrolls to inspire a poetic and romantic association. Femininity and romanticism were represented by the lines of long curling hair intertwined with flowers. Designers such as Henry van de Velde used the whiplash line to create a sense of tension and dynamism. He wrote: "A line is a force like other elementary forces. Several lines put together but opposed act like the presence of multiple forces." Noted designers who used the whiplash line included Aubrey Beardsley, Hector Guimard, Alphonse Mucha and Victor Horta. In the Art Nouveau period, the whiplash line appeared frequently in furniture design, railings and other ornamental iron work, floor tiles, posters and jewelry. It became so common that critics of Art Nouveau ridiculed it as "the noodle style". Origins The twisting and curving lines of the whiplash form have a long history. They are similar to the arabesque design, used particularly in Islamic art, such as the ceramic tiles of the mosque of Samarkand in Central Asia. They featured prominently in the l
https://en.wikipedia.org/wiki/Renninger%20negative-result%20experiment
In quantum mechanics, the Renninger negative-result experiment is a thought experiment that illustrates some of the difficulties of understanding the nature of wave function collapse and measurement in quantum mechanics. The statement is that a particle need not be detected in order for a quantum measurement to occur, and that the lack of a particle detection can also constitute a measurement. The thought experiment was first posed in 1953 by Mauritius Renninger. The non-detection of a particle in one arm of an interferometer implies that the particle must be in the other arm. It can be understood to be a refinement of the paradox presented in the Mott problem. The Mott problem The Mott problem concerns the paradox of reconciling the spherical wave function describing the emission of an alpha ray by a radioactive nucleus, with the linear tracks seen in a cloud chamber. Formulated in 1929 by Sir Nevill Francis Mott and Werner Heisenberg, it was resolved by a calculation done by Mott that showed that the correct quantum mechanical system must include the wave functions for the atoms in the cloud chamber as well as that for the alpha ray. The calculation showed that the resulting probability is non-zero only on straight lines raying out from the decayed atom; that is, once the measurement is performed, the wave-function becomes non-vanishing only near the classical trajectory of a particle. Renninger's negative-result experiment In Renninger's 1960 formulation, the cloud chamber is replaced by a pair of hemispherical particle detectors, completely surrounding a radioactive atom at the center that is about to decay by emitting an alpha ray. For the purposes of the thought experiment, the detectors are assumed to be 100% efficient, so that the emitted alpha ray is always detected. By consideration of the normal process of quantum measurement, it is clear that if one detector registers the decay, then the other will not: a single particle cannot be detected by bot
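The collapse-without-detection idea can be illustrated with a toy calculation (an illustration only, not Renninger's original formulation): a particle in superposition over two detector regions, where a null result at the first detector renormalizes the state onto the second.

```python
# Toy negative-result measurement: a particle in superposition over
# two detector regions with assumed amplitudes a and b.
a, b = 0.6, 0.8
p1, p2 = a * a, b * b
assert abs(p1 + p2 - 1.0) < 1e-12        # normalized state
# Detector 1 stays silent: renormalize the state onto region 2.
# The null result alone forces the particle into the other region,
# even though no detector has fired.
p2_given_null = p2 / (1.0 - p1)
assert abs(p2_given_null - 1.0) < 1e-12
print("null result collapses state onto region 2")
```

In Renninger's hemispherical arrangement the same renormalization happens continuously: each interval of silence at the inner detector reshapes the wave function before any click occurs.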
https://en.wikipedia.org/wiki/Proof%20%28play%29
Proof is a 2000 play by the American playwright David Auburn. Proof was developed at George Street Playhouse in New Brunswick, New Jersey, during the 1999 Next Stage Series of new plays. The play premiered Off-Broadway in May 2000 and transferred to Broadway in October 2000. The play won the 2001 Pulitzer Prize for Drama and the Tony Award for Best Play. Plot The play focuses on Catherine, the daughter of Robert, a recently deceased mathematical genius in his fifties and professor at the University of Chicago, and her struggle with mathematical genius and mental illness. Catherine had cared for her father through a lengthy mental illness. Upon Robert's death, his ex-graduate student Hal discovers a paradigm-shifting proof about prime numbers in Robert's office. The title refers both to that proof and to the play's central question: Can Catherine prove the proof's authorship? Along with demonstrating the proof's authenticity, Catherine also finds herself in a relationship with Hal. Throughout, the play explores Catherine's fear of following in her father's footsteps, both mathematically and mentally and her desperate attempts to stay in control. Act I The play opens with Catherine sitting in the backyard of her large, old house. Robert, her father, reveals a bottle of champagne to help celebrate her 25th birthday. Catherine complains that she hasn't done any worthwhile work in the field of mathematics, at least not to the same level as her father, a well-known math genius. He reassures her that she can still do good work as long as she stops sleeping until noon and wasting time reading magazines. Catherine confesses she is worried about inheriting Robert's inclination towards mental instability. He begins to comfort her but then alludes to a "bad sign" when he points out that he did, in fact, die a week ago. Robert disappears as Catherine dozes off. She awakens when Hal, one of Robert's students, exits the house. He has been studying the hundreds of notebooks Robe
https://en.wikipedia.org/wiki/Timeline%20of%20geometry
The following is a timeline of key developments of geometry: Before 1000 BC ca. 2000 BC – Scotland, carved stone balls exhibit a variety of symmetries including all of the symmetries of Platonic solids. 1800 BC – Moscow Mathematical Papyrus, finding the volume of a frustum 1800 BC – Plimpton 322 contains the oldest reference to the Pythagorean triples. 1650 BC – Rhind Mathematical Papyrus, copy of a lost scroll from around 1850 BC, the scribe Ahmes presents one of the first known approximate values of π at 3.16, the first attempt at squaring the circle, earliest known use of a sort of cotangent, and knowledge of solving first order linear equations 1st millennium BC 800 BC – Baudhayana, author of the Baudhayana Sulba Sutra, a Vedic Sanskrit geometric text, contains quadratic equations, and calculates the square root of 2 correct to five decimal places ca. 600 BC – the other Vedic "Sulba Sutras" ("rule of chords" in Sanskrit) use Pythagorean triples, contain a number of geometrical proofs, and approximate π at 3.16 5th century BC – Hippocrates of Chios utilizes lunes in an attempt to square the circle 5th century BC – Apastamba, author of the Apastamba Sulba Sutra, another Vedic Sanskrit geometric text, makes an attempt at squaring the circle and also calculates the square root of 2 correct to five decimal places 530 BC – Pythagoras studies propositional geometry and vibrating lyre strings; his group also discovers the irrationality of the square root of two. 370 BC – Eudoxus states the method of exhaustion for area determination 300 BC – Euclid in his Elements studies geometry as an axiomatic system, proves the infinitude of prime numbers and presents the Euclidean algorithm; he states the law of reflection in Catoptrics, and he proves the fundamental theorem of arithmetic 260 BC – Archimedes proves that the value of π lies between 3 + 10/71 (approx. 3.1408) and 3 + 1/7 (approx. 3.1429), that the area of a circle is equal to π multiplied by the square
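Some of the numerical claims in the timeline, such as Archimedes' bounds on π, are easy to verify directly. A quick arithmetic check (not part of the timeline itself):

```python
import math

# Quick check of the numerical claims quoted in the timeline.
lower, upper = 3 + 10 / 71, 3 + 1 / 7     # Archimedes' bounds
assert lower < math.pi < upper
assert abs(lower - 3.1408) < 5e-4 and abs(upper - 3.1429) < 5e-4
# A square root of 2 "correct to five decimal places":
assert abs(1.41421 - math.sqrt(2)) < 1e-5
print("bounds verified")
```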
https://en.wikipedia.org/wiki/Seeks
Seeks is a free and open-source project licensed under the GNU Affero General Public License version 3 (AGPL-3.0-or-later). It exists to create an alternative to the current market-leading search engines, driven by user concerns rather than corporate interests. The original manifesto was created by Emmanuel Benazera and Sylvio Drouin and published in October 2006. The project was under active development until April 2014, with both stable releases of the engine and revisions of the source code available for public use. In September 2011, Seeks won an innovation award at the Open World Forum Innovation Awards. The Seeks source code has not been updated since April 28, 2014 and no Seeks nodes have been usable since February 6, 2016. User control Seeks aims to give the control of the ranking of results to the users, as search algorithms are often less accurate than humans. It relies on a distributed collaborative filter to let users personalize and share their preferred results on a search. Also, because of the openness of the source code, users can verify and modify the collaborative filter to fit their needs. Forms Currently Seeks can be used in three main forms: Public meta search engine – These are various individuals or entities that have created publicly accessible instances of the Seeks source code. This is the easiest way to begin using Seeks, as it operates in a similar manner to any other search engine. Web proxy – Based on the popular Privoxy open source code, this allows setting up Seeks to operate as a web proxy which intercepts network requests for search queries and returns Seeks-based results. Web application – This allows setting up an instance of the web search interface on a local system, and more customization than is available when using a public node. Features Results are automatically re-ranked based on user behavior. Results are automatically re-ranked based on behavior by similar users. The software can be run in a distributed, peer-to-pee
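The behaviour-based re-ranking described in the Features section can be sketched as a toy collaborative filter. This is an illustration only, with made-up result names and click counts, not Seeks' actual algorithm:

```python
from collections import Counter

# Toy behaviour-based re-ranking: results that users select more
# often move up the ranking (hypothetical names and counts).
base_ranking = ["result_a", "result_b", "result_c"]
clicks = Counter({"result_c": 5, "result_b": 2})   # observed picks
reranked = sorted(base_ranking, key=lambda r: -clicks[r])
assert reranked == ["result_c", "result_b", "result_a"]
print(reranked)
```

A distributed version would merge click counters from many peers before sorting, which is the role the text assigns to the shared collaborative filter.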
https://en.wikipedia.org/wiki/Large%20sieve
The large sieve is a method (or family of methods and related ideas) in analytic number theory. It is a type of sieve where up to half of all residue classes of numbers are removed, as opposed to small sieves such as the Selberg sieve wherein only a few residue classes are removed. The method has been further heightened by the larger sieve which removes arbitrarily many residue classes. Name Its name comes from its original application: given a set S of integers such that the elements of S are forbidden to lie in a set Ap ⊂ Z/pZ modulo every prime p, how large can S be? Here Ap is thought of as being large, i.e., at least as large as a constant times p; if this is not the case, we speak of a small sieve. History The early history of the large sieve traces back to work of Yu. B. Linnik, in 1941, working on the problem of the least quadratic non-residue. Subsequently Alfréd Rényi worked on it, using probability methods. It was only two decades later, after quite a number of contributions by others, that the large sieve was formulated in a way that was more definitive. This happened in the early 1960s, in independent work of Klaus Roth and Enrico Bombieri. It is also around that time that the connection with the duality principle became better understood. In the mid-1960s, the Bombieri–Vinogradov theorem was proved as a major application of large sieves using estimations of mean values of Dirichlet characters. In the late 1960s and early 1970s, many of the key ingredients and estimates were simplified by Patrick X. Gallagher. Development Large-sieve methods have been developed enough that they are applicable to small-sieve situations as well. Something is commonly seen as related to the large sieve not necessarily in terms of whether it is related to the kind of situation outlined above, but, rather, if it involves one of the two methods of proof traditionally used to yield a large-sieve result: Approximate Plancherel inequality If a set S is ill-distributed modulo p (by vir
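The counting question in the Name section can be illustrated with a toy computation (an illustration of the setup only, not of the analytic method): forbid roughly half the residue classes modulo each of a few small primes and watch the surviving set shrink.

```python
# Toy version of the large-sieve setup: forbid about half of the
# residue classes modulo each of a few small primes and measure how
# fast the surviving set inside [1, N] shrinks.
N, primes = 1000, [3, 5, 7, 11]
survivors = set(range(1, N + 1))
for p in primes:
    forbidden = set(range(p // 2))       # an assumed "large" set A_p
    survivors = {n for n in survivors if n % p not in forbidden}
density = len(survivors) / N
# Heuristically each prime keeps a (p - |A_p|)/p fraction.
expected = 1.0
for p in primes:
    expected *= (p - p // 2) / p
assert abs(density - expected) < 0.05
print(f"surviving density {density:.3f}, heuristic {expected:.3f}")
```

The point of the large sieve is to make such decay quantitative for all primes at once, where the naive product heuristic no longer applies.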
https://en.wikipedia.org/wiki/Sinclair%20QDOS
QDOS is the multitasking operating system found on the Sinclair QL personal computer and its clones. It was designed by Tony Tebby whilst working at Sinclair Research, as an in-house alternative to 68K/OS, which was later cancelled by Sinclair, but released by original authors GST Computer Systems. Its name is not regarded as an acronym and is sometimes written as Qdos in official literature (see also the identically pronounced word kudos). QDOS was implemented in Motorola 68000 assembly language, and on the QL, resided in 48 KB of ROM, consisting of either three 16 KB EPROM chips or one 32 KB and one 16 KB ROM chip. These ROMs also held the SuperBASIC interpreter, an advanced variant of the BASIC programming language with structured programming additions. This also acted as the QDOS command-line interpreter. Facilities provided by QDOS included management of processes (or "jobs" in QDOS terminology), memory allocation, and an extensible "redirectable I/O system", providing a generic framework for filesystems and device drivers. Very basic screen window functionality was also provided. This, and several other features, were never fully implemented in the released versions of QDOS, but were improved in later extensions to the operating system produced by Tebby's own company, QJUMP. Rewritten, enhanced versions of QDOS were also developed, including Laurence Reeves' Minerva and Tebby's SMS2 and SMSQ/E. The last is the most modern variant and is still being improved. Versions QDOS versions were identified by numerical version numbers. However, the QL firmware ROMs as a whole (including SuperBASIC) were given two- or three-letter alphabetic identifiers (returned by the SuperBASIC function VER$). The following versions of QDOS were released (dates are estimated first customer shipments): 0.08: the last pre-production version. 1.00: corresponded to the FB version QL ROMs, released in April 1984. 1.01: corresponded to the PM version ROMs. This was faster and had improved
https://en.wikipedia.org/wiki/Neo-Darwinism
Neo-Darwinism is generally used to describe any integration of Charles Darwin's theory of evolution by natural selection with Gregor Mendel's theory of genetics. It mostly refers to evolutionary theory from either 1895 (for the combinations of Darwin's and August Weismann's theories of evolution) or 1942 ("modern synthesis"), but it can mean any new Darwinian- and Mendelian-based theory, such as the current evolutionary theory. Original use Darwin's theory of evolution by natural selection, as published in 1859, provided a selection mechanism for evolution, but not a trait transfer mechanism. Lamarckism was still a very popular candidate for this. August Weismann and Alfred Russel Wallace rejected the Lamarckian idea of inheritance of acquired characteristics that Darwin had accepted and later expanded upon in his writings on heredity. The basis for the complete rejection of Lamarckism was Weismann's germ plasm theory. Weismann realised that the cells that produce the germ plasm, or gametes (such as sperm and eggs in animals), separate from the somatic cells that go on to make other body tissues at an early stage in development. Since he could see no obvious means of communication between the two, he asserted that the inheritance of acquired characteristics was therefore impossible; a conclusion now known as the Weismann barrier. It is, however, usually George Romanes who is credited with the first use of the word in a scientific context. Romanes used the term to describe the combination of natural selection and Weismann's germ plasm theory that evolution occurs solely through natural selection, and not by the inheritance of acquired characteristics resulting from use or disuse, thus using the word to mean "Darwinism without Lamarckism." Following the development, from about 1918 to 1947, of the modern synthesis of evolutionary biology, the term neo-Darwinian started to be used to refer to that contemporary evolutionary theory. Current meaning Biologists, howev
https://en.wikipedia.org/wiki/John%20Caius
John Caius (born John Kays ; 6 October 1510 – 29 July 1573), also known as Johannes Caius and Ioannes Caius, was an English physician, and second founder of Gonville and Caius College, Cambridge. Biography Early years Caius was born in Norwich and was educated at Norwich School. In 1529, he was admitted as a student at Gonville Hall, Cambridge, founded by Edmund Gonville in 1348, where he seems to have mainly studied divinity. After graduating in 1533, he visited Italy, where he studied under Montanus and Vesalius at Padua. In 1541 he took his degree as a physician at the University of Padua. In 1543 he visited several parts of Italy, Germany and France and then returned to England. Upon his return from Italy he Latinised his surname, which was somewhat fashionable at the time. Career Caius was a physician in London in 1547, and was admitted as a fellow of the College of Physicians, of which he was for many years president. In 1551 he was attending in Shrewsbury when a notable outbreak of sweating sickness occurred in the town; the following year, after his return to London, he published A Boke or Counseill Against the Disease Commonly Called the Sweate, or Sweatyng Sicknesse (1552), which became the main source of knowledge of this disease, now understood to be influenza. In 1557 Caius, at that time physician to Queen Mary, enlarged the foundation of his old college, changed the name from "Gonville Hall" to "Gonville and Caius College", and endowed it with several considerable estates, adding an entire new court at the expense of £1,834. He accepted the mastership of the college 24 January 1559 on the death of Thomas Bacon, and held it until about a month before his own death. He was physician to Edward VI, Queen Mary and Queen Elizabeth. From this position he was dismissed in 1568 on account of his adherence to the Roman Catholic faith. He was incongruously accused both of atheism, and of keeping secretly a collection of ornaments and vestments for Roma
https://en.wikipedia.org/wiki/Grothendieck%20category
In mathematics, a Grothendieck category is a certain kind of abelian category, introduced in Alexander Grothendieck's Tôhoku paper of 1957 in order to develop the machinery of homological algebra for modules and for sheaves in a unified manner. The theory of these categories was further developed in Pierre Gabriel's seminal thesis in 1962. To every algebraic variety X one can associate a Grothendieck category Qcoh(X), consisting of the quasi-coherent sheaves on X. This category encodes all the relevant geometric information about X, and X can be recovered from Qcoh(X) (the Gabriel–Rosenberg reconstruction theorem). This example gives rise to one approach to noncommutative algebraic geometry: the study of "non-commutative varieties" is then nothing but the study of (certain) Grothendieck categories. Definition By definition, a Grothendieck category is an AB5 category with a generator. Spelled out, this means that a Grothendieck category is an abelian category C such that: every (possibly infinite) family of objects in C has a coproduct (also known as direct sum) in C; direct limits of short exact sequences are exact; this means that if a direct system of short exact sequences in C is given, then the induced sequence of direct limits is a short exact sequence as well. (Direct limits are always right-exact; the important point here is that we require them to be left-exact as well.) C possesses a generator, i.e. there is an object G in C such that Hom(G, −) is a faithful functor from C to the category of sets. (In our situation, this is equivalent to saying that every object X of C admits an epimorphism G^(I) → X, where G^(I) denotes a direct sum of copies of G, one for each element of the (possibly infinite) set I.) The name "Grothendieck category" neither appeared in Grothendieck's Tôhoku paper nor in Gabriel's thesis; it came into use in the second half of the 1960s in the work of several authors, including Jan-Erik Roos, Bo Stenström, Ulrich Oberst, and Bodo Pareigis. (Some authors use a different definition in that they don't require the exis
https://en.wikipedia.org/wiki/Crystal%20oscillator%20frequencies
Crystal oscillators can be manufactured for oscillation over a wide range of frequencies, from a few kilohertz up to several hundred megahertz. Many applications call for a crystal oscillator frequency conveniently related to some other desired frequency, so hundreds of standard crystal frequencies are made in large quantities and stocked by electronics distributors. Using frequency dividers, frequency multipliers and phase-locked loop circuits, it is practical to derive a wide range of frequencies from one reference frequency. The UART column shows the highest common baud rate (under 1,000,000) that, assuming a clock pre-divider of 16, resolves to an exact integer baud rate. Some UART variants have fractional dividers, but those are ignored here to simplify the table.
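As a rough illustration of the rule behind the UART column, the sketch below checks which common baud rates a given crystal frequency can hit exactly. Assumptions of mine, not from the table: a ×16 oversampling pre-divider followed by an integer divisor, and an illustrative list of "standard" rates.

```python
def exact_baud_rates(crystal_hz, oversample=16, max_baud=999_999):
    """Return the standard baud rates (under 1,000,000) the crystal can
    produce exactly, i.e. crystal_hz == oversample * divisor * baud
    for some integer divisor."""
    standard = [300, 1200, 2400, 4800, 9600, 19200, 38400,
                57600, 115200, 230400, 460800, 921600]
    return [baud for baud in standard
            if baud <= max_baud and crystal_hz % (oversample * baud) == 0]

# The classic 1.8432 MHz UART crystal divides evenly down to 115200 baud.
rates = exact_baud_rates(1_843_200)
```

This is why oddball frequencies like 1.8432 MHz and 14.7456 MHz exist as stock crystal values: they are exact multiples of 16 × common baud rates.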
https://en.wikipedia.org/wiki/EMBnet
The European Molecular Biology network (EMBnet) is an international scientific network and interest group that aims to enhance bioinformatics services by bringing together bioinformatics expertise and capacity. As of 2011, EMBnet had 37 nodes spread over 32 countries. The nodes include bioinformatics-related university departments, research institutes and national service providers. Operations The main task of most EMBnet nodes is to provide their national scientific community with access to bioinformatics databanks, specialised software and sufficient computing resources and expertise. EMBnet also works in the fields of bioinformatics training and software development. Examples of software created by EMBnet members are EMBOSS, wEMBOSS and UTOPIA. EMBnet represents a wide user group and works closely with database producers such as EMBL's European Bioinformatics Institute (EBI), the Swiss Institute of Bioinformatics (Swiss-Prot) and the Munich Information Center for Protein Sequences (MIPS), in order to provide uniform coverage of services throughout Europe. EMBnet is registered in the Netherlands as a public foundation (Stichting). Since its creation in 1988, EMBnet has evolved from an informal network of individuals in charge of maintaining biological databases into the only worldwide organization bringing bioinformatics professionals together to serve the expanding fields of genetics and molecular biology. Although composed predominantly of academic nodes, EMBnet gains an important added dimension from its industrial members. The success of EMBnet is attracting increasing numbers of organizations outside Europe to join. EMBnet has a tried-and-tested infrastructure to organise training courses, give technical help and help its members interact effectively and respond to the rapidly changing needs of biological research in a way no single institute can. In 2005 the organization created additional types of node to allow more tha
https://en.wikipedia.org/wiki/Life%20Cairn
The Life Cairn is a type of public memorial built around the world for species rendered extinct by human activity. It takes the form of a cairn. The first Life Cairn was raised on Mount Caburn, near Lewes in East Sussex. The inaugural opening ceremony was held on 22 May 2011. A Life Cairn was dedicated to Lonesome George at the Galapagos National Park on 26 July 2012 and an additional Life Cairn is being raised in Stockholm. The Life Cairn memorial programme was started by BBC TV presenter Reverend Peter Owen Jones, known for his TV series Around the World in 80 Faiths and Andreas Kornevall, the Director for the environmental charity, the Earth Restoration Service (Registered Charity No. 1118951).
https://en.wikipedia.org/wiki/Eikonal%20equation
An eikonal equation (from Greek εἰκών, image) is a non-linear first-order partial differential equation that is encountered in problems of wave propagation. The classical eikonal equation in geometric optics is a differential equation of the form |∇u(x)| = n(x), where x lies in an open subset of Rⁿ, n is a positive function, ∇ denotes the gradient, and |·| is the Euclidean norm. The function n is given and one seeks solutions u. In the context of geometric optics, the function n is the refractive index of the medium. More generally, an eikonal equation is an equation of the form H(x, ∇u(x)) = 0, where H is a function of 2n variables. Here the function H is given, and u is the solution. If H(x, p) = |p| − n(x), then the general equation reduces to the classical one. Eikonal equations naturally arise in the WKB method and the study of Maxwell's equations. Eikonal equations provide a link between physical (wave) optics and geometric (ray) optics. One fast computational algorithm to approximate the solution to the eikonal equation is the fast marching method. History The term "eikonal" was first used in the context of geometric optics by Heinrich Bruns. However, the actual equation appears earlier in the seminal work of William Rowan Hamilton on geometric optics. Physical interpretation Continuous shortest-path problems Suppose that Ω is an open set with suitably smooth boundary ∂Ω. The solution u of the eikonal equation |∇u(x)| = 1/f(x), with boundary condition u = q on ∂Ω, can be interpreted as the minimal amount of time required to travel from x to ∂Ω, where f is the speed of travel and q is an exit-time penalty. (Alternatively this can be posed as a minimal cost-to-exit by making the right-hand side C(x)/f(x) and q an exit-cost penalty.) In the special case when f = 1 and q = 0, the solution gives the signed distance from ∂Ω. By assuming that ∇u(x) exists at all points, it is easy to prove that u corresponds to a time-optimal control problem using Bellman's optimality principle and a Taylor expansion. Unfortunately, it is not guaranteed that ∇u exists at all points, and more advanced techniques are necessary to prove this. This led to the d
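To make the travel-time interpretation concrete, here is a minimal Dijkstra-style grid solver in the spirit of, but much cruder than, the fast marching method: it propagates arrival times from source cells at the local speed f. Assumptions of mine: 4-neighbour updates on a uniform grid, which introduces a Manhattan-metric bias that true fast marching removes by using an upwind finite-difference update at each cell.

```python
import heapq
import math

def eikonal_times(speed, sources, h=1.0):
    """Approximate arrival times u with |grad u| = 1/f on a 2-D grid:
    u[i][j] is the least time to reach cell (i, j) from any source cell,
    travelling at the local speed f = speed[i][j] over spacing h."""
    rows, cols = len(speed), len(speed[0])
    u = [[math.inf] * cols for _ in range(rows)]
    heap = []
    for i, j in sources:
        u[i][j] = 0.0
        heapq.heappush(heap, (0.0, i, j))
    while heap:
        t, i, j = heapq.heappop(heap)
        if t > u[i][j]:
            continue  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                # time to cross one grid edge at the neighbour's speed
                nt = t + h / speed[ni][nj]
                if nt < u[ni][nj]:
                    u[ni][nj] = nt
                    heapq.heappush(heap, (nt, ni, nj))
    return u
```

With uniform speed 1 this reduces to grid shortest paths, so the returned times are Manhattan distances from the source set; the fast marching method replaces the per-edge update with a quadratic solve to approximate Euclidean distances instead.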
https://en.wikipedia.org/wiki/Social%20login
Social login is a form of single sign-on using existing information from a social networking service such as Facebook, Twitter or Google, to login to a third party website instead of creating a new login account specifically for that website. It is designed to simplify logins for end users as well as provide more reliable demographic information to web developers. How social login works Social login links accounts from one or more social networking services to a website, typically using either a plug-in or a widget. By selecting the desired social networking service, the user simply uses his or her login for that service to sign on to the website. This, in turn, negates the need for the end user to remember login information for multiple electronic commerce and other websites while providing site owners with uniform demographic information as provided by the social networking service. Many sites which offer social login also offer more traditional online registration for those who either desire it or who do not have an account with a compatible social networking service (and therefore would be precluded from creating an account with the website). Application Social login can be implemented strictly as an authentication system using standards such as OpenID or SAML. For consumer websites that offer social functionality to users, social login is often implemented using the OAuth standard. OAuth is a secure authorization protocol which is commonly used in conjunction with authentication to grant 3rd party applications a "session token" allowing them to make API calls to providers on the user's behalf. Sites using the social login in this manner typically offer social features such as commenting, sharing, reactions and gamification. While social login can be extended to corporate websites, the majority of social networks and consumer-based identity providers allow self-asserted identities. For this reason, social login is generally not used for strict, highly secu
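The first leg of the OAuth flow described above, redirecting the user to the identity provider, can be sketched as follows. All endpoint URLs, client identifiers, and scope names here are hypothetical placeholders, not any real provider's configuration:

```python
import secrets
from urllib.parse import urlencode

def build_authorization_url(auth_endpoint, client_id, redirect_uri, scope):
    """Build the URL a site redirects the user to in an OAuth 2.0
    authorization-code flow. Returns the URL plus the anti-CSRF `state`
    value, which the site must verify again on the callback."""
    state = secrets.token_urlsafe(16)
    params = {
        "response_type": "code",   # authorization-code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,
    }
    return auth_endpoint + "?" + urlencode(params), state

# Hypothetical provider and client values:
url, state = build_authorization_url(
    "https://provider.example/oauth/authorize",
    "demo-client-id",
    "https://myapp.example/callback",
    "profile email",
)
```

After the provider authenticates the user, it redirects back to `redirect_uri` with a one-time code, which the site exchanges server-side for the session token mentioned above.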
https://en.wikipedia.org/wiki/Alliance%20%28taxonomy%29
An alliance is an informal grouping used in biological taxonomy. The term "alliance" is not a taxonomic rank defined in any of the nomenclature codes. It is used for any group of species, genera or tribes to which authors wish to refer, that have at some time provisionally been considered to be closely related. The term is often used for a group that authors are studying in further detail in order to refine the complex taxonomy. For example, a molecular phylogenetics study of the Aerides–Vanda Alliance (Orchidaceae: Epidendroideae) confirmed that the group is monophyletic, and clarified which species belong in each of the 14 genera. In other orchid groups, the various alliances that have been defined do not correspond well to clades. Historically, some 19th century botanical authors used alliance to denote groups that would now be considered orders. This usage is now obsolete, and the ICN (Article 17.2) specifies that such taxa are treated as orders. See also Species aggregate Association (ecology) Bioindicator
https://en.wikipedia.org/wiki/Agile%20contracts
The Agile fixed price is a contractual model agreed upon by suppliers and customers of IT projects that develop software using Agile methods. The model introduces an initial test phase after which budget, due date, and the way of steering the scope within the framework are agreed upon. This differs from traditional fixed-price contracts in that fixed-price contracts usually require a detailed and exact description of the subject matter of the contract in advance. Fixed-price contracts aim at minimizing the potential risk caused by unpredictable, later changes. In contrast, Agile fixed-price contracts simply require a broad description of the entire project instead of a detailed one. In Agile contracts, the supplier and the customer together define their common assumptions in terms of the business value, implementation risks, expenses (effort) and costs. On the basis of these assumptions, an indicative fixed-price scope is agreed upon which is not yet contractually binding. This is followed by the test phase (checkpoint phase), during which the actual implementation begins. At the end of this phase, both parties compare the empirical findings with their initial assumptions. Together, they then decide on the implementation of the entire project and fix the conditions under which changes are allowed to happen. Further aspects of an Agile contract are risk share (both parties divide the additional expenses for unexpected changes equally amongst themselves) or the option of either party leaving the contract at any stage (exit points). Approaches to Agile Contracts Capped Time and Materials Contracts Capped T&M contracts work in the sense of traditional T&M contracts. However, there is an upper limit to how much customers will have to pay. In this way, suppliers benefit when the work finishes early, while customers never pay more than the capped cost limit. Target Cost Contracts In target cost contracts, parties involved with the contracts agree
https://en.wikipedia.org/wiki/Diacetylverrucarol
Diacetylverrucarol is a natural trichothecene produced by the fungus Myrothecium verrucaria. Chemically, it is an acetate derivative of verrucarol. Herbicide use Trichothecenes have been found to cause growth retardation, wilting, chlorosis, and necrosis in some plants. It can work in poor conditions, even where there is very little or no moisture. Diacetylverrucarol is undergoing study for its successful effects at reducing or killing the kudzu weed. It is prepared through liquid fermentation of Myrothecium verrucaria mycelium after which it can be used as a pesticide. This causes very little harm to the plants that the kudzu plant displaces. Results appeared within 24 hours and the kudzu plant's roots were diseased after 14 days. In greenhouse experiments the fungus killed 100 percent of the kudzu seeds, and, in outdoor trials, 90-100 percent of older plants. The fungus is undergoing study to reduce the diacetylverrucarol presence to reduce side effects. Because it is a trichothecene, it has significant toxicity. If swallowed by humans, the initial symptoms are general discomfort, dry eyes, and drowsiness. If a larger dose is consumed, symptoms of a hemorrhagic fever will occur, as well as mental impairment. Other animals are susceptible as well, and can experience growth retardation, reproductive disorders, and vomiting if the substance is consumed.
https://en.wikipedia.org/wiki/Dependency%20ratio
The dependency ratio is an age-population ratio of those typically not in the labor force (the dependent part, ages 0 to 14 and 65+) and those typically in the labor force (the productive part, ages 15 to 64). It is used to measure the pressure on the productive population. Consideration of the dependency ratio is essential for governments, economists, bankers, business, industry, universities and all other major economic segments which can benefit from understanding the impacts of changes in population structure. A low dependency ratio means that there are sufficient people working who can support the dependent population. A lower ratio could allow for better pensions and better health care for citizens. A higher ratio indicates more financial stress on working people and possible political instability. While the strategies of increasing fertility and of allowing immigration, especially of younger working-age people, have been formulas for lowering dependency ratios, future job reductions through automation may impact the effectiveness of those strategies. Formula In published international statistics, the dependent part usually includes those under the age of 15 and over the age of 64. The productive part makes up the population in between, ages 15–64. It is normally expressed as a percentage: (total dependency ratio) = ((number of people aged 0–14) + (number of people aged 65+)) / (number of people aged 15–64) × 100. As the ratio increases there may be an increased burden on the productive part of the population to maintain the upbringing and pensions of the economically dependent. This results in direct impacts on financial expenditures on things like social security, as well as many indirect consequences. The (total) dependency ratio can be decomposed into the child dependency ratio (ages 0–14 over ages 15–64) and the aged dependency ratio (ages 65+ over ages 15–64). Total dependency ratio by regions Projections Below is a table constructed from data provided by the UN Population Division. It shows a historical ratio for the regions shown for the period 1950–2010. Columns to the right show projections of the ratio. Each number
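The decomposition described above can be sketched in code (an illustrative helper of mine; the age cutoffs follow the article's 0–14 / 15–64 / 65+ convention):

```python
def dependency_ratios(population_by_age):
    """Compute total, child and aged dependency ratios (dependents per
    100 working-age people) from a mapping of age -> head count."""
    young   = sum(n for age, n in population_by_age.items() if age < 15)
    working = sum(n for age, n in population_by_age.items() if 15 <= age < 65)
    aged    = sum(n for age, n in population_by_age.items() if age >= 65)
    total_ratio = 100 * (young + aged) / working
    child_ratio = 100 * young / working
    aged_ratio  = 100 * aged / working
    # By construction, child_ratio + aged_ratio == total_ratio.
    return total_ratio, child_ratio, aged_ratio

# Toy population: 20 children, 50 working-age adults, 10 elderly.
total, child, aged = dependency_ratios({10: 20, 40: 50, 70: 10})
```

Here the total ratio is 60 dependents per 100 workers, splitting into a child ratio of 40 and an aged ratio of 20.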
https://en.wikipedia.org/wiki/Activity-based%20proteomics
Activity-based proteomics, or activity-based protein profiling (ABPP) is a functional proteomic technology that uses chemical probes that react with mechanistically related classes of enzymes. Description The basic unit of ABPP is the probe, which typically consists of two elements: a reactive group (RG, sometimes called a "warhead") and a tag. Additionally, some probes may contain a binding group which enhances selectivity. The reactive group usually contains a specially designed electrophile that becomes covalently linked to a nucleophilic residue in the active site of an active enzyme. An enzyme that is inhibited or post-translationally modified will not react with an activity-based probe. The tag may be either a reporter such as a fluorophore or an affinity label such as biotin, or an alkyne or azide for use with the Huisgen 1,3-dipolar cycloaddition (also known as click chemistry). Advantages A major advantage of ABPP is the ability to monitor the availability of the enzyme active site directly, rather than being limited to protein or mRNA abundance. With classes of enzymes such as the serine hydrolases and metalloproteases that often interact with endogenous inhibitors or that exist as inactive zymogens, this technique offers a valuable advantage over traditional techniques that rely on abundance rather than activity. Multidimensional protein identification technology In recent years ABPP has been combined with tandem mass spectrometry, enabling the identification of hundreds of active enzymes from a single sample. This technique, known as ABPP-MudPIT (multidimensional protein identification technology), is especially useful for profiling inhibitor selectivity, as the potency of an inhibitor can be tested against hundreds of targets simultaneously. ABPP was first reported in the 1990s in the study of proteases. See also Mass spectrometry Proteomics Related inhibitors MAFP and DIFP Chemoproteomics
https://en.wikipedia.org/wiki/Communal%20coping
Communal coping is the collective effort of members of a connected network (familial or social) to manage a distressing event (Lyons, Michelson, Sullivan and Coyne, 1998). This definition and the scope of the concept positions communal coping as an offshoot of social support. According to Lyons et al. (1998), the communal coping conceptual framework emerged for two reasons: first, to expand the research that supports the claim that the coping process sometimes requires individual and collective effort (e.g. Fukuyama, 1995); second, the need for a specific framework for investigating the cooperative characteristic of coping. To support the need for a framework which explores the social aspect of coping as a combined effort, the authors argued that the communal coping conceptual framework emphasizes connectedness and reliance on a personal network for coping. Developments to the communal coping framework include the explanation of the complex nature of the communal coping process (Afifi, Helgeson & Krouse, 2006) and specific personal outcomes (Helgeson, Jakubiak, Vleet, & Zajdel, 2018) following a communal coping process. Background Lyons et al. (1998) introduced the communal coping framework. The first model Lyons et al. (1998) proposed mainly distinguished communal coping from the existing perception of coping as an individualistic or prosocial process. The model also provided a lens for examining other aspects of coping such as benefits, costs and influencing factors. Afifi, Hutchinson, and Krouse (2006) noted that among the achievements of the model is that it accounts for the relational process within coping and shifts researchers' focus from treating the phenomenon as a mainly psychological process to treating it as also a relational or communicative one. However, despite the contributions of the model to coping research, some questions still needed answers and a couple of research challenges remained unaddressed. For instance, Afifi et al. (2006) noted some r
https://en.wikipedia.org/wiki/NDPCP
5'-guanosyl-methylene-triphosphate (GDPCP) and 5'-adenosyl-methylene-triphosphate (ADPCP) are analogues of guanosine 5'-triphosphate (GTP) and adenosine 5'-triphosphate (ATP), which store chemical energy from metabolism in the cell. Hydrolysis of nucleoside triphosphates (NTPs) such as ATP and GTP yields energy, inorganic phosphate (Pi or PPi), and either NDP or NMP. GDPCP and ADPCP are not subject to hydrolysis under the same conditions as NTPs; it is this property which makes them useful in experiments in biochemistry and molecular biology. NTPs can be hydrolyzed at the phosphodiester bonds between phosphates, releasing energy and one or more of the three phosphate groups. Additionally, NTPs are inextricable components of some proteins, where their role may be structural and need not involve hydrolysis. In some cases, the presence of an NTP may be required for association of one protein with another, while hydrolysis is necessary for dissociation. GDPCP and ADPCP could be used in such a case, since association can still occur, but hydrolysis-dependent dissociation cannot. GDPCP was used to examine the prokaryotic elongation factor EF-Tu. EF-Tu is required for the elongation phase of protein synthesis (translation). EF-Tu requires GTP in order for the ribosome to bind it, necessary for recruiting an aminoacyl-tRNA. The later dissociation of EF-Tu from the ribosome, however, requires that the GTP first be hydrolyzed to GDP and Pi. GDPCP was used in place of GTP to differentiate between these two steps: the elongation factor could associate, but without hydrolysis, it was effectively stuck.
https://en.wikipedia.org/wiki/Signaling%20Gateway%20%28website%29
Signaling Gateway is a web portal dedicated to signaling pathways powered by the San Diego Supercomputer Center at the University of California, San Diego. It was initiated by a collaboration between the Alliance for Cellular Signaling and Nature. A primary feature is the Molecule Pages database. Molecule Pages Database (online database journal) Signaling Gateway Molecule Pages is a database containing "essential information on more than 8000 mammalian proteins (Mouse and Human) involved in cellular signaling." The content of molecule pages is authored by invited experts and is peer-reviewed. The published pages are citable by digital object identifiers (DOIs). All data in the Molecule Pages are freely available to the public. Data can be exported to PDF, XML, BioPAX/SBPAX and SBML. MIRIAM Registry Details. Some Published Molecule Pages
https://en.wikipedia.org/wiki/Woodin%20cardinal
In set theory, a Woodin cardinal (named for W. Hugh Woodin) is a cardinal number λ such that for all functions f : λ → λ there exists a cardinal κ < λ with {f(β) | β < κ} ⊆ κ and an elementary embedding j : V → M from the Von Neumann universe V into a transitive inner model M with critical point κ and V_{j(f)(κ)} ⊆ M. An equivalent definition is this: λ is Woodin if and only if λ is strongly inaccessible and for all A ⊆ V_λ there exists a κ < λ which is <λ-A-strong. κ being <λ-A-strong means that for all ordinals α < λ, there exists a j : V → M which is an elementary embedding with critical point κ, with j(κ) > α, V_α ⊆ M, and j(A) ∩ V_α = A ∩ V_α. (See also strong cardinal.) A Woodin cardinal is preceded by a stationary set of measurable cardinals, and thus it is a Mahlo cardinal. However, the first Woodin cardinal is not even weakly compact. Consequences Woodin cardinals are important in descriptive set theory. By a result of Martin and Steel, the existence of infinitely many Woodin cardinals implies projective determinacy, which in turn implies that every projective set is Lebesgue measurable, has the Baire property (differs from an open set by a meager set, that is, a set which is a countable union of nowhere dense sets), and the perfect set property (is either countable or contains a perfect subset). The consistency of the existence of Woodin cardinals can be proved using determinacy hypotheses. Working in ZF + AD + DC one can prove that Θ is Woodin in the class of hereditarily ordinal-definable sets. Θ is the first ordinal onto which the continuum cannot be mapped by an ordinal-definable surjection (see Θ (set theory)). Mitchell and Steel showed that, assuming a Woodin cardinal exists, there is an inner model containing a Woodin cardinal in which there is a Δ^1_3-well-ordering of the reals, ◊ holds, and the generalized continuum hypothesis holds. Shelah proved that if the existence of a Woodin cardinal is consistent then it is consistent that the nonstationary ideal on ω₁ is ℵ₂-saturated. Woodin also proved the equiconsistency of the existence of infinitely many Woodin cardinals and the existence of a
https://en.wikipedia.org/wiki/Melanie%20Mitchell
Melanie Mitchell is an American scientist. She is the Davis Professor of Complexity at the Santa Fe Institute. Her major work has been in the areas of analogical reasoning, complex systems, genetic algorithms and cellular automata, and her publications in those fields are frequently cited. She received her PhD in 1990 from the University of Michigan under Douglas Hofstadter and John Holland, for which she developed the Copycat cognitive architecture. She is the author of "Analogy-Making as Perception", essentially a book about Copycat. She has also critiqued Stephen Wolfram's A New Kind of Science and showed that genetic algorithms could find better solutions to the majority problem for one-dimensional cellular automata. She is the author of An Introduction to Genetic Algorithms, a widely known introductory book published by MIT Press in 1996. She is also author of Complexity: A Guided Tour (Oxford University Press, 2009), which won the 2010 Phi Beta Kappa Science Book Award, and Artificial Intelligence: A Guide for Thinking Humans (Farrar, Straus, and Giroux). Life Melanie Mitchell was born and raised in Los Angeles, California. She attended Brown University in Providence, Rhode Island, where she studied physics, astronomy and mathematics. Her interest in artificial intelligence was spurred in college when she read Douglas Hofstadter's Gödel, Escher, Bach. After graduating, she worked as a high school math teacher in New York City. Deciding she "needed to be" in artificial intelligence, Mitchell tracked down Douglas Hofstadter, repeatedly asking to become one of his graduate students. After finding Hofstadter's phone number at MIT, a determined Mitchell made several calls, all of which went unanswered. She was ultimately successful in reaching Hofstadter after calling at 11 p.m., and secured an internship working on the development of Copycat. In the fall of 1984, Mitchell followed Hofstadter to the University of Michigan, submitting a "last minute" applica
https://en.wikipedia.org/wiki/Image-forming%20optical%20system
In optics, an image-forming optical system is a system capable of being used for imaging. The diameter of the aperture of the main objective is a common criterion for comparing optical systems, such as large telescopes. The two traditional optical systems are mirror systems (catoptrics) and lens systems (dioptrics). However, in the late twentieth century, optical fiber was introduced as a technology for transmitting images over long distances. Catoptrics and dioptrics use a focal point that concentrates light onto a specific point, while optical fiber transfers an image from one plane to another without the need for an optical focus. Isaac Newton is reported to have designed what he called a catadioptrical phantasmagoria, which can be interpreted to mean an elaborate structure of both mirrors and lenses. Catoptrics and optical fiber have no chromatic aberration, while dioptrics need to have this error corrected. Newton believed that such correction was impossible, because he thought the path of the light depended only on its color. In 1757 John Dollond was able to create an achromatised dioptric, which was the forerunner of the lenses used in all popular photographic equipment today. Lower-energy X-rays are the highest-energy electromagnetic radiation that can be formed into an image, using a Wolter telescope, of which there are three types. Near infrared is typically the longest wavelength that is handled optically, such as in some large telescopes.
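Dollond's achromatised dioptric rests on the standard thin-lens achromatic doublet condition: a crown and a flint element with powers p1, p2 and Abbe numbers V1, V2 are chosen so that p1/V1 + p2/V2 = 0 while p1 + p2 equals the desired total power. A small sketch, with illustrative (assumed) glass values:

```python
# Thin-lens achromatic doublet sketch. The achromat condition
# p1/V1 + p2/V2 = 0 (with p1 + p2 = p_total) makes the combined focal
# length the same at the two design wavelengths. V1=60 (crown-like) and
# V2=36 (flint-like) are illustrative assumptions, not specific glasses.
def achromat_powers(p_total, v1, v2):
    """Element powers (1/focal length) satisfying the achromat condition."""
    p1 = p_total * v1 / (v1 - v2)
    p2 = -p_total * v2 / (v1 - v2)
    return p1, p2

p1, p2 = achromat_powers(1 / 100.0, v1=60.0, v2=36.0)  # 100 mm doublet
assert abs(p1 + p2 - 1 / 100.0) < 1e-9      # combined power as designed
assert abs(p1 / 60.0 + p2 / 36.0) < 1e-9    # chromatic term cancels
```

Note that the positive element must be stronger than the target power and the negative element partially cancels it; that is the price of cancelling the colour error with refractive elements alone.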
https://en.wikipedia.org/wiki/Sergio%20Goldvarg
Sergio Goldvarg (born 1957) is an Argentinian architect, better known as a car collector, model car collector and model car maker resident in Miami, Florida. His large scale model car collection, with 7,000 model cars in 2003 and 12,000 model cars in 2009, was recognized by Guinness World Records. He and his wife Mariana also produced their own line of model American cars. In 2014, Sergio Goldvarg was inducted into the Model Car Hall of Fame in Las Vegas, which honors pioneers of the model vehicle industry, from designers to entrepreneurs, for their efforts to promote and enhance the hobby. In 2018, the Goldvarg Collection, the scale model car brand created by Sergio Goldvarg, was inducted into the Model Car Hall of Fame as New Brand of the Year.
https://en.wikipedia.org/wiki/P%C3%A1l%20Tur%C3%A1n
Pál Turán (18 August 1910 – 26 September 1976), also known as Paul Turán, was a Hungarian mathematician who worked primarily in extremal combinatorics. In 1940, because of his Jewish origins, he was arrested by the Nazis and sent to a labour camp in Transylvania, later being transferred several times to other camps. While imprisoned, Turán came up with some of his best results, which he was able to publish after the war. Turán had a long collaboration with fellow Hungarian mathematician Paul Erdős, lasting 46 years and resulting in 28 joint papers. Biography Early years Turán was born into a Jewish family in Budapest on 18 August 1910. His outstanding mathematical abilities showed early; already in secondary school he was the best student. During the same period, Turán and Erdős were well-known problem solvers in the journal KöMaL. On 1 September 1930, at a mathematical seminar at the University of Budapest, Turán met Erdős. They would collaborate for 46 years and produce 28 scientific papers together. Turán received a teaching degree at the University of Budapest in 1933. In the same year he published two major scientific papers in the journals of the American and London Mathematical Societies. He received his PhD under Lipót Fejér in 1935 at Eötvös Loránd University. As a Jew, he fell victim to the numerus clausus, and could not get a stable job for several years. He made a living as a tutor, preparing applicants and students for exams. It was not until 1938 that he got a job at a rabbinical training school in Budapest as a teacher's assistant, by which time he already had 16 major scientific publications and an international reputation as one of Hungary's leading mathematicians. He married Edit (Klein) Kóbor in 1939; they had one son, Róbert. In World War II In September 1940 Turán was interned in labour service. As he recalled later, his five years in labour camps eventually saved his life: they saved him from ending up in a concentration camp
https://en.wikipedia.org/wiki/Wigner%27s%20friend
Wigner's friend is a thought experiment in theoretical quantum physics, first published by the physicist Eugene Wigner in 1961, and further developed by David Deutsch in 1985. The scenario involves an indirect observation of a quantum measurement: An observer observes another observer who performs a quantum measurement on a physical system. The two observers then formulate a statement about the physical system's state after the measurement according to the laws of quantum theory. However, in the "orthodox" Copenhagen interpretation, the resulting statements of the two observers contradict each other. This reflects a seeming incompatibility of two laws in the Copenhagen interpretation: the deterministic and continuous time evolution of the state of a closed system and the nondeterministic, discontinuous collapse of the state of a system upon measurement. Wigner's friend is therefore directly linked to the measurement problem in quantum mechanics with its famous Schrödinger's cat paradox. Generalizations and extensions of Wigner's friend have been proposed. Two such scenarios involving multiple friends have been implemented in a laboratory, using photons to stand in for the friends. Original paradox Wigner introduced the thought experiment in a 1961 article "Remarks on the Mind-Body Question". He begins by noting that most physicists in the then-recent past had been thoroughgoing materialists who would insist that "mind" or "soul" are illusory, and that nature is fundamentally deterministic. He argues that quantum physics has changed this situation: All that quantum mechanics purports to provide are probability connections between subsequent impressions (also called "apperceptions") of the consciousness, and even though the dividing line between the observer, whose consciousness is being affected, and the observed physical object can be shifted towards the one or the other to a considerable degree, it cannot be eliminated. Nature of the wave function Going into
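The clash between the two laws described above — continuous unitary evolution versus discontinuous collapse — can be sketched numerically. In the toy model below (an illustrative sketch, not a full treatment), the friend's collapse-based account assigns the system a 50/50 mixture, while Wigner's unitary account assigns the lab an entangled pure state; tracing out the friend's memory shows both accounts agree on the system's statistics, so the disagreement lives only at the level of the global state.

```python
import numpy as np

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

# Friend's account: the measurement collapses the superposition, leaving
# a 50/50 classical mixture of definite outcomes.
rho_friend = 0.5 * np.outer(up, up) + 0.5 * np.outer(down, down)

# Wigner's account: unitary evolution entangles the system with the
# friend's memory, giving the pure state (|0>|0> + |1>|1>)/sqrt(2).
psi = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)
rho_global = np.outer(psi, psi)

# Tracing out the friend's memory reproduces the friend's mixed state.
rho_reduced = rho_global.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
assert np.allclose(rho_reduced, rho_friend)
```

The equality of the reduced states is why no local measurement on the system alone can decide between the two descriptions, which is what makes the extended, multi-friend experiments mentioned above necessary.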
https://en.wikipedia.org/wiki/Potassium%20ascorbate
Potassium ascorbate is a compound with formula KC6H7O6. It is the potassium salt of ascorbic acid (vitamin C) and a mineral ascorbate. As a food additive, it has E number E303 (INS number 303). Although it is not a permitted food additive in the UK, the USA or the EU, it is approved for use in Australia and New Zealand. According to some studies, it has shown strong antioxidant activity and antitumoral properties.
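From the formula KC6H7O6 given above, the molar mass follows directly from standard atomic masses; a quick check:

```python
# Molar mass of potassium ascorbate (KC6H7O6) from standard atomic
# masses in g/mol (values rounded to three decimals).
atomic_mass = {"K": 39.098, "C": 12.011, "H": 1.008, "O": 15.999}
formula = {"K": 1, "C": 6, "H": 7, "O": 6}

molar_mass = sum(atomic_mass[el] * n for el, n in formula.items())
print(f"{molar_mass:.2f} g/mol")  # ≈ 214.21
```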
https://en.wikipedia.org/wiki/Bayesian%20program%20synthesis
In programming languages and machine learning, Bayesian program synthesis (BPS) is a program synthesis technique where Bayesian probabilistic programs automatically construct new Bayesian probabilistic programs. This approach stands in contrast to routine practice in probabilistic programming where human developers manually write new probabilistic programs. The framework Bayesian program synthesis (BPS) has been described as a framework related to and utilizing probabilistic programming. In BPS, probabilistic programs are generated that are themselves priors over a space of probabilistic programs. This strategy allows automatic synthesis of new programs via probabilistic inference and is achieved by the composition of modular component programs. The modularity in BPS allows inference to work on and test smaller probabilistic programs before being integrated into a larger model. This framework can be contrasted with the family of automated program synthesis fields, which include programming by example and programming by demonstration. The goal in such fields is to find the best program that satisfies some constraint. In traditional program synthesis, for instance, verification of logical constraints reduce the state space of possible programs, allowing more efficient search to find an optimal program. Bayesian program synthesis differs both in that the constraints are probabilistic and the output is itself a distribution over programs that can be further refined. Additionally, Bayesian program synthesis can be contrasted to the work on Bayesian program learning, where probabilistic program components are hand-written, pre-trained on data, and then hand assembled in order to recognize handwritten characters. See also Probabilistic programming language
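The core idea — a prior over a space of programs, refined into a posterior by conditioning on data — can be illustrated with a deliberately tiny enumeration sketch. This is an illustrative toy under assumed primitives, not the actual BPS framework: candidate programs are short sequences of primitive operations, the prior favours shorter programs, and input/output examples act as constraints.

```python
import itertools

# Toy "program space": sequences of primitive unary operations.
OPS = {"add1": lambda x: x + 1, "double": lambda x: 2 * x, "square": lambda x: x * x}

def programs(max_len=2):
    """Enumerate programs as sequences of primitive operations."""
    for length in range(1, max_len + 1):
        for seq in itertools.product(OPS, repeat=length):
            yield seq

def run(seq, x):
    for op in seq:
        x = OPS[op](x)
    return x

def posterior(examples, max_len=2):
    """Prior ~ 2^-length; likelihood is 1 if the program fits all examples."""
    scores = {}
    for seq in programs(max_len):
        if all(run(seq, x) == y for x, y in examples):
            scores[seq] = 2.0 ** (-len(seq))
    z = sum(scores.values())
    return {seq: p / z for seq, p in scores.items()}

# Conditioning on (1 -> 3) and (2 -> 5) leaves a distribution over programs.
post = posterior([(1, 3), (2, 5)])
print(post)
```

Real BPS replaces this brute-force enumeration with probabilistic inference over compositional probabilistic programs, but the output has the same shape: a distribution over programs rather than a single best program.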
https://en.wikipedia.org/wiki/Bone%20morphogenetic%20protein%208B
Bone morphogenetic protein 8B is a protein that in humans is encoded by the BMP8B gene. The protein encoded by this gene is a member of the TGF-β superfamily. It has close sequence homology to BMP7 and BMP5 and is believed to play a role in bone and cartilage development. It has been shown to be expressed in the hippocampus of murine embryos. The bone morphogenetic proteins (BMPs) are a family of secreted signaling molecules that can induce ectopic bone growth. Many BMPs are part of the transforming growth factor-beta (TGFB) superfamily. BMPs were originally identified by an ability of demineralized bone extract to induce endochondral osteogenesis in vivo in an extraskeletal site. Based on its expression early in embryogenesis, the BMP encoded by this gene has a proposed role in early development. In addition, the fact that this BMP is closely related to BMP5 and BMP7 has led to speculation of possible bone inductive activity.
https://en.wikipedia.org/wiki/InHour
InHour is a unit of reactivity of a nuclear reactor; the name stands for "inverse hour". One InHour is the amount of reactivity needed to take the reactor from critical to a stable period of one hour, at which the power increases by a factor of e in one hour; more generally, the reactivity in inhours is the inverse of the period expressed in hours. The unit is abbreviated ih or inhr, and reactivity is usually measured with a reactimeter. See also Per cent mille
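The inverse relationship above makes the conversion from a measured stable period to inhours a one-liner:

```python
# Reactivity in inhours from the stable reactor period: one inhour
# corresponds to a one-hour period (power grows by a factor of e per
# hour), and the value in inhours is the inverse of the period in hours.
def reactivity_in_inhours(period_seconds: float) -> float:
    """Convert a stable reactor period (seconds) to reactivity in inhours."""
    period_hours = period_seconds / 3600.0
    return 1.0 / period_hours

print(reactivity_in_inhours(3600))  # 1.0 — a one-hour period is one inhour
print(reactivity_in_inhours(360))   # ≈ 10 — a shorter period means more reactivity
```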
https://en.wikipedia.org/wiki/Coding%20conventions
Coding conventions are a set of guidelines for a specific programming language that recommend programming style, practices, and methods for each aspect of a program written in that language. These conventions usually cover file organization, indentation, comments, declarations, statements, white space, naming conventions, programming practices, programming principles, programming rules of thumb, architectural best practices, and so on. They are guidelines for software structural quality. Programmers are strongly encouraged to follow these guidelines to help improve the readability of their source code and make software maintenance easier. Coding conventions are only applicable to the human maintainers and peer reviewers of a software project. Conventions may be formalized in a documented set of rules that an entire team or company follows, or may be as informal as the habitual coding practices of an individual. Coding conventions are not enforced by compilers. Software maintenance Reducing the cost of software maintenance is the most often cited reason for following coding conventions. In the introductory section on code conventions for the Java programming language, Sun Microsystems offers the following reasoning: Code conventions are important to programmers for a number of reasons: 40%–80% of the lifetime cost of a piece of software goes to maintenance. Hardly any software is maintained for its whole life by the original author. Code conventions improve the readability of the software, allowing engineers to understand new code more quickly and thoroughly. If you ship your source code as a product, you need to make sure it is as well packaged and clean as any other product you create. Quality Software peer review frequently involves reading source code. This type of peer review is primarily a defect detection activity. By definition, only the original author of a piece of code has read the source file before the code is submitted for review. Code that is w
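A small before/after sketch of the kinds of rules conventions cover — naming, white space, and comments. Neither version is taken from any real style guide; both functions are hypothetical and compute the same thing:

```python
# Without conventions: cryptic name, no spacing, no documentation.
def f(a,b):return(a+b)/2

# Following typical conventions: descriptive names, consistent spacing,
# type hints, and a docstring explaining intent.
def average(first: float, second: float) -> float:
    """Return the arithmetic mean of two numbers."""
    return (first + second) / 2

assert f(2, 4) == average(2, 4) == 3.0
```

Both definitions are accepted by the interpreter — which is exactly the point made above: conventions exist for human maintainers and reviewers, not for compilers.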
https://en.wikipedia.org/wiki/Raphidophyte
The raphidophytes, formally known as Raphidophycidae or Raphidophyceae (formerly referred to as Chloromonadophyceae and Chloromonadineae), are a small group of eukaryotic algae that includes both marine and freshwater species. All raphidophytes are unicellular, with large cells (50 to 100 μm) but no cell walls. Raphidophytes possess a pair of flagella, organised such that both originate from the same invagination (or gullet). One flagellum points forwards and is covered in hair-like mastigonemes, while the other points backwards across the cell surface, lying within a ventral groove. Raphidophytes contain numerous ellipsoid chloroplasts, which contain chlorophylls a, c1 and c2. They also make use of accessory pigments including β-carotene and diadinoxanthin. Unlike other heterokontophytes, raphidophytes do not possess the photoreceptive organelle (or eyespot) typical of this group. In terms of ecology, raphidophytes occur as photosynthetic autotrophs across a range of aquatic systems. Freshwater species are more common in acidic waters, such as pools in bogs. Marine species often produce large blooms in summer, particularly in coastal waters. Off the Japanese coast, the resulting red tides often cause disruption to fish farms, although raphidophytes are not usually responsible for toxic blooms. The position of this group varied in former classifications. Some protozoologists treated the chloromonads as an order within the phytoflagellates. Some phycologists classified them with the Xanthophyceae and the Eustigmatophyceae in the division Xanthophyta. Others considered them related to the Chrysophyceae, Dinophyceae, or Cryptophyceae. Currently, raphidophytes are regarded as an independent lineage of algae within the class Raphidomonadea, which also includes the heliozoan group Actinophryida. Taxonomy The classification based on Cavalier-Smith and Scoble 2013 recognizes only one order, Chattonellales. Subclass Raphidophycidae Cavalier-Smith 2013 [Rap
https://en.wikipedia.org/wiki/Linear%20form
In mathematics, a linear form (also known as a linear functional, a one-form, or a covector) is a linear map from a vector space to its field of scalars (often, the real numbers or the complex numbers). If V is a vector space over a field k, the set of all linear functionals from V to k is itself a vector space over k with addition and scalar multiplication defined pointwise. This space is called the dual space of V, or sometimes the algebraic dual space, when a topological dual space is also considered. It is often denoted Hom(V, k) or, when the field k is understood, V*; other notations are also used, such as V′ or V∨. When vectors are represented by column vectors (as is common when a basis is fixed), then linear functionals are represented as row vectors, and their values on specific vectors are given by matrix products (with the row vector on the left). Examples The constant zero function, mapping every vector to zero, is trivially a linear functional. Every other linear functional (such as the ones below) is surjective (that is, its range is all of k). Indexing into a vector: The second element of a three-vector is given by the one-form (0, 1, 0); that is, the second element of (x, y, z) is (0, 1, 0) · (x, y, z) = y. Mean: The mean element of an n-vector is given by the one-form (1/n, 1/n, ..., 1/n); that is, mean(x) = (1/n, ..., 1/n) · (x_1, ..., x_n). Sampling: Sampling with a kernel can be considered a one-form, where the one-form is the kernel shifted to the appropriate location. Net present value of a net cash flow R(t) is given by the one-form w(t) = (1 + i)^(−t), where i is the discount rate; that is, NPV(R) = Σ_t R(t)/(1 + i)^t. Linear functionals in Rn Suppose that vectors in the real coordinate space R^n are represented as column vectors x = (x_1, ..., x_n)^T. For each row vector a = (a_1, ..., a_n) there is a linear functional f_a defined by f_a(x) = a_1 x_1 + ... + a_n x_n, and each linear functional can be expressed in this form. This can be interpreted as either the matrix product or the dot product of the row vector a and the column vector x. Trace of a square matrix The trace of a square matrix is the sum of all elements on its main diagonal. Matrices can be multiplied by scalars and two mat
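The row-vector representation of linear functionals can be checked numerically; a minimal NumPy sketch (the specific vectors are illustrative):

```python
import numpy as np

# Linear functionals on R^3 as row vectors, evaluated by the matrix
# product with the row vector on the left.
x = np.array([5.0, 7.0, 9.0])

# Indexing one-form (0, 1, 0): picks out the second component.
e2 = np.array([0.0, 1.0, 0.0])
assert e2 @ x == 7.0

# Mean one-form (1/n, ..., 1/n).
n = len(x)
mean_form = np.full(n, 1.0 / n)
assert np.isclose(mean_form @ x, 7.0)

# Linearity: f(a*u + b*v) = a*f(u) + b*f(v) for any one-form f.
f = np.array([2.0, -1.0, 0.5])
u, v = np.array([1.0, 0.0, 2.0]), np.array([0.0, 3.0, 1.0])
assert np.isclose(f @ (2 * u + 3 * v), 2 * (f @ u) + 3 * (f @ v))

# The trace is a linear functional on square matrices.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
assert np.trace(A) == 5.0
```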
https://en.wikipedia.org/wiki/Subscriber%20loop%20carrier
A subscriber loop carrier or subscriber line carrier (SLC) provides telephone exchange-like telephone interface functionality. SLC remote terminals are typically located in areas with a high density of telephone subscribers, such as a residential neighborhood, or very rural areas with widely dispersed customers, that are remote from the telephone company's central office (CO). Two or four T1 circuits (depending on the configuration) connect the SLC remote terminal to the central office terminal (COT), in the case of a universal subscriber loop carrier (USLC). An integrated subscriber loop carrier (ISLC) has its T-spans terminating directly in time division switching equipment in the telephone exchange. One system serves up to 96 customers. This configuration is more efficient than the alternative of having separate copper pairs between each service termination point (the subscriber's location) and the central telephone exchange. These systems are generally installed in cabinets that have some form of uninterruptible power supply or other backup battery arrangements, standby generators, and sometimes with additional equipment such as remote DSLAMs. Reliability SLCs have been criticized for reducing the reliability of local loops due to their increased reliance on utility power. Historically, all loop power was provided by the CO and was backed up by battery power and, for longer power outages, stand-by diesel generators housed at the office. However, telephone companies have increasingly been using SLCs, which are notorious for poorly functioning or short-lived battery backup systems, some lasting as little as four hours. Many do not have on-site standby generators, which requires the telephone company to bring out a portable generator before the battery power fails. This may not happen in time if there are obstructions caused by a natural or man-made disaster, causing service outages for anyone served by that unit. Often, the air conditioning units, sump p
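The "up to 96 customers" figure follows from standard T-carrier channel arithmetic: a T1 carries 24 DS0 voice channels, so four T1s provide 4 × 24 = 96 lines. A quick sketch (the unconcentrated one-line-per-DS0 mapping is an assumption; concentrated configurations can serve the same 96 customers over fewer T1s):

```python
# Channel arithmetic behind the 96-customer figure. T1 = 24 DS0 channels
# is the standard North American digital hierarchy; the 96-line system
# size comes from the article.
DS0_PER_T1 = 24

def subscriber_capacity(t1_count: int) -> int:
    """Lines served with one DS0 per subscriber (unconcentrated)."""
    return t1_count * DS0_PER_T1

assert subscriber_capacity(4) == 96   # four T1s: full 96-line system
assert subscriber_capacity(2) == 48   # two T1s without concentration
```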
https://en.wikipedia.org/wiki/Permutationally%20invariant%20quantum%20state%20tomography
Permutationally invariant quantum state tomography (PI quantum state tomography) is a method for the partial determination of the state of a quantum system consisting of many subsystems. In general, the number of parameters needed to describe the quantum mechanical state of a system consisting of N subsystems increases exponentially with N. In the case of a system consisting of N qubits, which is perhaps the most relevant case, 2^(N+1) − 2 real parameters are needed to describe the state vector of a pure state, or 4^N − 1 real parameters are needed to describe the density matrix of a mixed state. Quantum state tomography is a method to determine all these parameters from a series of measurements on many independent and identically prepared systems. Thus, in the case of full quantum state tomography, the number of measurements needed scales exponentially with the number of particles or qubits. For large systems, the determination of the entire quantum state is no longer possible in practice and one is interested in methods that determine only a subset of the parameters necessary to characterize the quantum state that still contains important information about the state. Permutationally invariant quantum tomography is such a method, needing fewer measurements than full state tomography. PI quantum tomography only measures the permutationally invariant part ρ_PI of the density matrix. For the procedure, it is sufficient to carry out local measurements on the subsystems. If the state is close to being permutationally invariant, which is the case in many practical situations, then ρ_PI is close to the density matrix of the system. Even if the state is not permutationally invariant, ρ_PI can still be used for entanglement detection and computing relevant operator expectation values. Thus, the procedure does not assume the permutational invariance of the quantum state. The number of independent real parameters of ρ_PI for N qubits scales as N^3. The number of local measurement settings scales as N^2. Thus, per
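The scaling advantage can be made concrete by counting parameters. The exponential count 4^N − 1 for the full density matrix is standard; the specific cubic and quadratic closed forms used below (binomial coefficients matching the ~N³/6 and ~N²/2 scalings reported by Tóth et al., PRL 105, 250403 (2010)) are assumptions of this sketch, not values stated in the article:

```python
from math import comb

def full_state_params(n):
    """Real parameters of a general N-qubit density matrix: 4^N - 1."""
    return 4**n - 1

def pi_params(n):
    """Assumed closed form ~ N^3/6 for the PI part of the density matrix."""
    return comb(n + 3, 3)

def pi_settings(n):
    """Assumed closed form ~ N^2/2 for the local measurement settings."""
    return comb(n + 2, 2)

# Polynomial versus exponential growth in the number of qubits.
for n in (4, 8, 12):
    print(n, full_state_params(n), pi_params(n), pi_settings(n))
```

Even at N = 12 the full state needs millions of parameters while the permutationally invariant part needs only hundreds, which is why PI tomography remains feasible where full tomography is not.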
https://en.wikipedia.org/wiki/Brazilian%20real
The Brazilian real (pl. reais; sign: R$; code: BRL) is the official currency of Brazil. It is subdivided into 100 centavos. The Central Bank of Brazil is the central bank and the issuing authority. The real replaced the cruzeiro real in 1994, and has ranked as the twentieth most traded currency. History Currencies in use before the current real include: The Portuguese real from the 16th to 18th centuries, with 1,000 réis called the milréis. The old Brazilian real from 1747 to 1942, with 1,000 réis also called the milréis. The first cruzeiro from 1942 to 1967, at 1 cruzeiro = 1 milréis or 1,000 réis. The cruzeiro novo from 1967 to 1970, at 1 cruzeiro novo = 1,000 first cruzeiros. From 1970 it was simply called the (second) cruzeiro and was used until 1986. The cruzado from 1986 to 1989, at 1 cruzado = 1,000 second cruzeiros. The cruzado novo from 1989 to 1990, at 1 cruzado novo = 1,000 cruzados. From 1990, because of the Plano Collor, it was renamed the (third) cruzeiro and was used until 1993. The cruzeiro real (CR$) from 1993 to 1994, at 1 cruzeiro real = 1,000 third cruzeiros. The current real was introduced in 1994 at 1 real = 2,750 cruzeiros reais. The modern real (Portuguese plural reais or English plural reals) was introduced on 1 July 1994, during the presidency of Itamar Franco, when Rubens Ricupero was the Minister of Finance, as part of a broader plan to stabilize the Brazilian economy, known as the Plano Real. The new currency replaced the short-lived cruzeiro real (CR$). The reform included the demonetisation of the cruzeiro real and required a massive banknote replacement. At its introduction, the real was defined to be equal to 1 unidade real de valor (URV, "real value unit"), a non-circulating currency unit. At the same time the URV was defined to be worth 2,750 cruzeiros reais, which was the average exchange rate of the U.S. dollar to the cruzeiro real on that day. As a consequence, the real was worth exactly one U.S. dollar as it was introduced. Co
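The chain of redenominations listed above compounds to a striking total; multiplying the conversion factors shows how many old réis one modern real represents:

```python
from functools import reduce

# Redenomination chain from the history above, newest to oldest.
# Each entry: (conversion, old units per new unit); renamings are 1:1.
steps = [
    ("real -> cruzeiro real", 2750),
    ("cruzeiro real -> third cruzeiro", 1000),
    ("third cruzeiro -> cruzado novo (renaming)", 1),
    ("cruzado novo -> cruzado", 1000),
    ("cruzado -> second cruzeiro", 1000),
    ("second cruzeiro -> cruzeiro novo (renaming)", 1),
    ("cruzeiro novo -> first cruzeiro", 1000),
    ("first cruzeiro -> réis", 1000),
]

reis_per_real = reduce(lambda acc, step: acc * step[1], steps, 1)
print(f"1 real = {reis_per_real:.3e} old réis")  # 2.750e+18
```

One modern real thus corresponds to 2.75 quintillion (2.75 × 10^18) of the old réis, after five factor-of-1,000 cuts and the final 2,750:1 conversion.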
https://en.wikipedia.org/wiki/Barry%20Dwolatzky
Barry Dwolatzky (29 April 1952 – 16 May 2023) was a South African software engineer. He was a professor emeritus at the University of the Witwatersrand's Joburg Centre for Software Engineering, and served on University of the People's computer science advisory board. He was an anti-apartheid activist, and in the late 1980s he joined the African National Congress's armed wing, Umkhonto we Sizwe ("spear of the nation"). Education Dwolatzky completed a Bachelor of Science degree in 1975 and a PhD in 1979, both in electrical engineering at the University of the Witwatersrand. He was a postdoctoral researcher at the University of Manchester Institute of Science and Technology and GEC Marconi. Career In 1989, Dwolatzky joined the University of the Witwatersrand as a senior lecturer, becoming a full professor in 2000. He was a fellow of the South African Institute of Electrical Engineers and the Institute of IT Professionals South Africa (IITPSA). Death Dwolatzky died in Johannesburg on 16 May 2023, at the age of 71.
https://en.wikipedia.org/wiki/Chelonid%20alphaherpesvirus%206
Chelonid alphaherpesvirus 6 (ChHV-6) is a species of virus of uncertain generic placement in the subfamily Alphaherpesvirinae, family Herpesviridae, and order Herpesvirales.
https://en.wikipedia.org/wiki/Stahlberg%20Models
Stahlberg was a Finnish company producing promotional plastic model cars mainly of Swedish Saab and Volvo automobiles usually in scales between 1:18 and 1:25. Stahlberg mainly molded cars from the 1960s to about 1992, though its modern counterpart, Emek continues to make truck models. Other Finnish companies producing similar sized plastic models were Aren, Emek Muovi, Ju Ju, Hot Toys, KMS Myynti, Muovo, Nyrhinen Ky, and Plasto. Volvos and Saabs Stahlberg vehicles were pre-assembled promotional models. Available mainly through car dealerships in Europe and the U.S., models were produced from about 1965 to about 1990. Usually Stahlbergs were packaged in a clear plastic bag (poly-bagged), though later models did come in blue and white boxes. One writer reminisces that when at the Volvo dealer with his father as a child, he remembers the models often would be "hanging on the pegboard behind the service desk". Some standard Swedish models produced by Stahlberg were the Volvo 120–122 series (Amazon), 142 series, 164 series, 200 series, 700 series, and 66 / 300 series (DAFs). Saabs produced were Saab 99, 900 series, and 9000 series. Models were produced in 2-door, 4-door, and station wagon versions. Some models from earlier years are now getting rather hard to find, like the 144 sedan from the late 1960s which featured a bit better detail than later models and 'knobbier' tread tires. These early Volvos like the 164 or the SAAB 99 had lovely two piece wheels with unpainted rims and the silver hubs. Later wheels were accurate single cast copies of the actual designs. Details and model variety Older Stahlbergs from the 1970s were cast in a solid color plastic while later ones had a metallic effect along with swirls and streaks in the plastic. The most common scale of models was 1:22 (about 9 inches long), however 1:32 (about 5 inches long), 1:25, 1:18, and a larger 1:12 scale (about 18 inches long) were also produced. Though Volvo and Saab were the most common S
https://en.wikipedia.org/wiki/Patera%20%28architecture%29
In architecture, a patera is an ornamental circular or elliptical bas-relief disc. The patera is usually used to decorate friezes and walls, and to interrupt moldings; it is also used in furniture-making. It can be carved, incised, inlaid, or even painted. Overview The patera is found in ancient Roman architecture and in almost all later Western styles of architecture. Used in both civil and church architecture, the patera is usually made of marble or Istrian stone. Its diameter typically varies between 20 and 80 cm, while its thickness is around 10 cm. The subject represented in the bas-relief is generally floral or animal, but there are also figures symbolizing trades or people. Although mainly a decorative element, the patera may also perform an apotropaic function, keeping away evil spirits. Gallery
https://en.wikipedia.org/wiki/Salt%20water%20aspiration%20syndrome
Salt water aspiration syndrome is a rare diving disorder suffered by scuba divers who inhale a mist of seawater, usually from a faulty demand valve, causing irritation of the lungs. It is not the same thing as aspiration of salt water as a bulk liquid, i.e. drowning. It can usually be treated by rest for several hours; if severe, medical assessment is required. It was first described by Carl Edmonds. Signs and symptoms Symptoms of salt water aspiration syndrome include a post-dive cough, with or without sputum, usually suppressed during the dive; in serious cases the sputum may be bloodstained, frothy and copious. Over time further symptoms may develop, including rigors, tremors or shivering; nausea or vomiting; hot or cold sensations; dyspnea; cough; sputum; shortness of breath; headaches; malaise; generalised aches; cyanosis; mild fever; and retrosternal chest pain. Diagnosis The condition follows an exposure to breathing through apparatus that could allow aspiration of small quantities of salt water as an aerosol. Typical findings are an immediate cough with sputum followed by a latent period of about two hours on average, respiratory symptoms and signs, reduction in forced expiratory volume and vital capacity, possible radiographic changes, and generalised symptoms of malaise, rigors, generalised aches and headaches, with tachypnea and tachycardia. Differential diagnosis should consider decompression sickness, which can be indicated by the dive profile and breathing gas mixtures, and the presence of other symptoms of decompression sickness. Treatment for DCS is appropriate if any of these indications exist. A rapid beneficial response to breathing 100% oxygen is likely in salt water aspiration syndrome; response to normobaric oxygen is likely to be slower for DCS, which may respond rapidly to recompression. Pulmonary barotrauma is also possible and should be considered. Serious cases of pulmonary barotrauma with pneumothorax, air emboli and surgical emphysema occurring suddenly after a d
https://en.wikipedia.org/wiki/K%C5%8Dsaku%20Yosida
Kōsaku Yosida (1909–1990) was a Japanese mathematician who worked in the field of functional analysis. He is known for the Hille–Yosida theorem concerning C0-semigroups. Yosida studied mathematics at the University of Tokyo, and held posts at Osaka and Nagoya Universities. In 1955, Yosida returned to the University of Tokyo. See also Einar Carl Hille Functional analysis
https://en.wikipedia.org/wiki/Adductor%20tubercle%20of%20femur
The adductor tubercle is a tubercle on the lower extremity of the femur. It is formed where the medial lips of the linea aspera end below at the summit of the medial condyle. It is the insertion point of the tendon of the vertical fibers of the adductor magnus muscle.
https://en.wikipedia.org/wiki/Antitermination
Antitermination is the prokaryotic cell's mechanism for preventing premature termination of RNA synthesis during transcription. It occurs when the RNA polymerase ignores the termination signal and continues elongating its transcript until a second signal is reached. Antitermination provides a mechanism whereby one or more genes at the end of an operon can be switched either on or off, depending on the polymerase either recognizing or not recognizing the termination signal. Antitermination is used by some phages to regulate progression from one stage of gene expression to the next. The lambda gene N codes for an antitermination protein (pN) that is necessary to allow RNA polymerase to read through the terminators located at the ends of the immediate early genes. Another antitermination protein, pQ, is required later in phage infection. pN and pQ act on RNA polymerase as it passes specific sites. These sites are located at different relative positions in their respective transcription units. Antitermination may be a regulated event Antitermination was discovered in bacteriophage infections. A common feature in the control of phage infection is that very few of the phage genes can be transcribed by the bacterial host RNA polymerase. Among these genes, however, are regulators whose products allow the next set of phage genes to be expressed. One of these types of regulator is an antitermination protein. In the absence of the antitermination protein, RNA polymerase terminates at the terminator. When the antitermination protein is present, it continues past the terminator. The best characterized example of antitermination is provided by lambda phage, in which the phenomenon was discovered. It is used at two stages of phage expression. The antitermination protein produced at each stage is specific for the particular transcription units that are expressed at that stage. The host RNA polymerase initially transcribes two genes, which are called the immediate early genes (N
https://en.wikipedia.org/wiki/Moesin
Moesin is a protein that in humans is encoded by the MSN gene. Moesin (for membrane-organizing extension spike protein) is a member of the ERM protein family, which includes ezrin and radixin. ERM proteins appear to function as cross-linkers between plasma membranes and actin-based cytoskeletons. Moesin is localized to filopodia and other membranous protrusions that are important for cell–cell recognition and signaling and for cell movement. Interactions Moesin has been shown to interact with: CD43, ICAM3, neutrophil cytosolic factor 1, neutrophil cytosolic factor 4, VCAM-1, and ezrin (EZR).
https://en.wikipedia.org/wiki/University%20of%20Kent
The University of Kent (formerly the University of Kent at Canterbury, abbreviated as UKC) is a semi-collegiate public research university based in Kent, United Kingdom. The university was granted its royal charter on 4 January 1965, and the following year Princess Marina, Duchess of Kent, was formally installed as the first Chancellor. The university has its main campus north of Canterbury, set within parkland housing over 6,000 students, as well as campuses in Medway and Tonbridge in Kent and European postgraduate centres in Brussels, Athens, Rome and Paris. The university is international, with students of 158 different nationalities and 41% of its academic and research staff coming from outside the United Kingdom. It is a member of the Santander Network of European universities encouraging social and economic development. History Origins A university in the city of Canterbury was first considered in 1947, when an anticipated growth in student numbers led several residents to seek the creation of a new university in Kent. However, the plans never came to fruition. A decade later, both population growth and greater demand for university places led to a reconsideration. In 1959 the Education Committee of Kent County Council explored the creation of a new university, formally accepting the proposal unanimously on 24 February 1960. Two months later the Education Committee agreed to seek a site at or near Canterbury, given the historical associations of the city, subject to the support of Canterbury City Council. By 1962 a site was found at Beverley Farm, straddling the then boundary between the City of Canterbury and the administrative county of Kent. The university's original name, chosen in 1962, was the University of Kent at Canterbury, reflecting the fact that the campus straddled the boundary between the county borough of Canterbury and Kent County Council. At the time it was the normal practice for universities to be named after the town
https://en.wikipedia.org/wiki/Temperature-sensitive%20mutant
Temperature-sensitive mutants are variants of genes that allow normal function of the organism at low temperatures but altered function at higher temperatures. Cold-sensitive mutants are variants of genes that allow normal function of the organism at higher temperatures but altered function at low temperatures. Mechanism Most temperature-sensitive mutations affect proteins and cause loss of protein function at the non-permissive temperature. The permissive temperature is one at which the protein typically can fold properly, or remain properly folded; at higher temperatures, the protein is unstable and ceases to function properly. These mutations are usually recessive in diploid organisms. Temperature-sensitive mutants provide a reversible mechanism for depleting a particular gene product at a chosen stage of growth, simply by changing the growth temperature. Permissive temperature The permissive temperature is the temperature at which a temperature-sensitive mutant gene product takes on a normal, functional phenotype. When a temperature-sensitive mutant is grown under permissive conditions, the mutated gene product behaves normally (meaning that the mutant phenotype is not observed), even though a mutant allele is present. This results in the survival of the cell or organism, as if it were a wild-type strain. In contrast, the non-permissive (or restrictive) temperature is the temperature at which the mutant phenotype is observed. Temperature-sensitive mutations are usually missense mutations that retain the function of an essential gene at a low, permissive temperature, lack that function at a high, non-permissive temperature, and display a hypomorphic (partial loss of gene function) phenotype at an intermediate, semi-permissive temperature. Use in research Temperature-sensitive mutants are useful in biological research. They allow the study of essential processes required for the survival of the cell or organism.
https://en.wikipedia.org/wiki/Edward%20Keonjian
Edward Keonjian (14 August 1909 – 6 September 1999) was a prominent engineer, an early leader in the field of low-power electronics, and is often called the father of microelectronics. In 1954 Keonjian designed the world's first solar-powered, pocket-sized radio transmitter. In 1959 Keonjian designed the first prototype of an integrated circuit. In 1963 he organized the world's first international symposium on low-power electronics. Later, as chief of failure analysis on the Apollo 11 project, Keonjian collaborated with NASA astronaut Neil Armstrong. Early life Edward Keonjian was born in 1909 to an Armenian family in Tiflis, then in the southern part of the Russian Empire and now Tbilisi, the capital of Georgia. He obtained academic degrees in electrical engineering from Leningrad (now St. Petersburg) Institute of Electrical Engineering in 1932. When Leningrad was besieged in World War II, Edward was teaching at the Institute. Millions were perishing from cold and starvation. When he too collapsed from hunger, he was mistaken for dead and placed in a common grave. A woman passing by saw a hand sticking out of this grave. Noticing a slight movement, she realised someone was still alive and rescued him. To her astonishment, it was an old friend. Not long afterward, Edward was evacuated from Leningrad, only to be captured, along with his wife Virginia and young son Edward, Jr., by the Germans and sent to a slave labor camp. His duties at the camp included dismantling aircraft for spare parts. Liberated after World War II, he eventually emigrated to the United States of America with his wife (who died in 1969) and son. He arrived in 1947 penniless, knowing not one word of English, and with no friends or relatives. Nonetheless, he rose to become one of the outstanding scientists and inventors in microelectronics. Personal life He was an avid traveler and a member of the Explorers Club, the Circumnavigators Club, and the Archaeological Institute of America. Edward was married to his first wife Virginia,
https://en.wikipedia.org/wiki/Systematic%20Biology
Systematic Biology is a peer-reviewed scientific journal published by Oxford University Press on behalf of the Society of Systematic Biologists. It covers the theory, principles, and methods of systematics as well as phylogeny, evolution, morphology, biogeography, paleontology, genetics, and the classification of all living things. The journal was established in 1952 as Systematic Zoology and obtained its current title in 1992.