https://en.wikipedia.org/wiki/ATOX1
ATOX1 is a copper metallochaperone protein that is encoded by the ATOX1 gene in humans. In mammals, ATOX1 plays a key role in copper homeostasis as it delivers copper from the cytosol to the transporters ATP7A and ATP7B. Homologous proteins are found in a wide variety of eukaryotes, including Saccharomyces cerevisiae as ATX1, and all contain a conserved metal-binding domain. Function ATOX1 is an abbreviation of the full name Antioxidant Protein 1. The nomenclature stems from initial characterization showing that ATOX1 protected cells from reactive oxygen species. Since then, the primary role of ATOX1 has been established as a copper metallochaperone found in the cytoplasm of eukaryotes. A metallochaperone is a protein with metal trafficking and sequestration roles. As a metal sequestration protein, ATOX1 is capable of binding free metals in vivo, protecting cells from the generation of reactive oxygen species and the mismetallation of metalloproteins. As a metal trafficking protein, ATOX1 is responsible for shuttling copper from the cytosol to the ATPase transporters ATP7A and ATP7B, which move copper to the trans-Golgi network or secretory vesicles. In Saccharomyces cerevisiae, Atx1 delivers Cu(I) to a homologous transporter, Ccc2. The delivery of copper to ATPase transporters is vital for the subsequent insertion of copper into ceruloplasmin, a ferroxidase required for iron metabolism, within the Golgi apparatus. In addition to the metallochaperone function, recent reports have characterized ATOX1 as a cyclin D1 transcription factor. Structure & metal coordination ATOX1 has a ferredoxin-like βαββαβ fold and coordinates Cu(I) via an MXCXXC binding motif located between the first β-sheet and α-helix. The metal-binding motif is largely solvent-exposed in apo-ATOX1, and a conformational change is induced upon coordination to Cu(I). Cu(I) is coordinated in a distorted linear geometry to the sulfurs of cysteine residues, forming a bond angle of 120°. The ov
https://en.wikipedia.org/wiki/Cinematography%20in%20healthcare
The concept of healthcare knowledge transfer using cinematography recognizes that films with carefully crafted and verified content, using graphics, animations and live-action video, can be one of the most efficient ways of transferring knowledge with clarity and speed, to both lay people and healthcare professionals. History of medical cinematography Late 19th century The use of cinematography to enhance healthcare practice and delivery dates back to the late 19th century in Western Europe. Étienne-Jules Marey (a French scientist and physiologist), Eugène-Louis Doyen (a French surgeon), Bolesław Matuszewski (a Polish cameraman; in France his first name was written as Boleslas), and Gheorghe Marinescu (a Romanian neurologist) are some of the pioneers of medical cinematography. In 1888, Doyen had his surgeries captured on film. His films were brought into disrepute by the fact that they were copied and shown at fairgrounds. The resulting social prejudice may explain the slow take-off of medical cinematography. For example, in 1910 one commentator wrote the following regarding Doyen's films: “These pictures savoured of advertisement, and were never popular, save as a side show among the less scientifically inclined members of the profession.” Doyen's work, however, marked the distinction between the concepts of ‘film for entertainment’ versus ‘film for medicine’. In 1893, Marey used the technique to study human physiology and movement. In 1898 Boleslas Matuszewski recorded medical films in Paris (at the time, the world's neurologic capital). At the Salpêtrière and Pitié hospitals he filmed surgeries and cases of people affected by nervous and mental disorders. Gheorghe Marinescu studied under the renowned French professor Jean-Martin Charcot and returned to Bucharest as Chief Physician at Pantelimon Hospital. Between 1898 and 1902 he conducted a cinematographic project, recording and analyzing a series of neurological conditions in patients. He perfec
https://en.wikipedia.org/wiki/Gliwice%20Radio%20Tower
The Gliwice Radio Tower is a transmission tower in the Szobiszowice district of Gliwice, Upper Silesia, Poland. Structure The Gliwice Radio Tower is 111 m tall, with a wooden framework of impregnated Siberian larch linked by brass connectors. It was nicknamed "the Silesian Eiffel Tower" by the local population. The tower has four platforms, at 40.4 m, 55.3 m, 80.0 m and 109.7 m above ground. The top platform measures 2.13 × 2.13 m. A ladder with 365 steps provides access to the top. The tower is the tallest wooden structure in Europe. The tower was originally designed to carry aerials for medium-wave broadcasting, but that transmitter is no longer in service because the final stage is missing. Today, the Gliwice Radio Tower carries multiple transceiver antennas for mobile phone services and a low-power FM transmitter broadcasting on 93.4 MHz. History The tower was erected beginning 1 August 1934 as Sendeturm Gleiwitz (Gleiwitz Radio Tower), when the territory was part of Germany. It was operated by the Reichssender Breslau (formerly the Schlesische Funkstunde broadcasting corporation) of the Reichs-Rundfunk-Gesellschaft radio network. The tower was modeled on the Mühlacker radio transmitter; it replaced a smaller transmitter in Gleiwitz situated nearby on Raudener Straße and went into service on 23 December 1935. On 31 August 1939, the German SS staged a 'Polish' attack on the Gleiwitz radio station, which was later used as justification for the invasion of Poland. The transmission facility was not demolished in World War II. From 4 October 1945 until the inauguration of the new transmitter in Ruda Śląska in 1955, the Gliwice transmitter was used for medium-wave transmissions by the Polish state broadcaster Polskie Radio. After 1955, it was used to jam medium-wave stations (such as Radio Free Europe) broadcasting Polish-language programmes from Western Europe. Transmitted programmes Radio See also Gleiwitz incident List of tallest wooden buildings Trivia The shape of
https://en.wikipedia.org/wiki/Infosys%20Prize
The Infosys Prize is an annual award given to scientists, researchers, engineers and social scientists of Indian origin (not necessarily born in India) by the Infosys Science Foundation, and it ranks among the highest monetary awards in India recognizing research. The prize for each category includes a gold medallion, a citation certificate, and prize money of US$100,000 (or its equivalent in Indian rupees). The prize purse is tax-free in the hands of winners in India. The winners are selected by the jury of their respective categories, headed by the jury chairs. In 2008, the prize was jointly awarded by the Infosys Science Foundation and the National Institute of Advanced Studies for mathematics. The following year, the categories were expanded to Life Sciences, Mathematical Sciences, Physical Sciences and Social Sciences. In 2010, Engineering and Computer Science was added as a category. In 2012, a sixth category, Humanities, was added. Laureates in Engineering and Computer Science The Infosys Prize in Engineering and Computer Science has been awarded annually since 2010. Laureates in Humanities The Infosys Prize in Humanities has been awarded annually since 2012. Laureates in Life Sciences The Infosys Prize in Life Sciences has been awarded annually since 2009. Laureates in Mathematical Sciences The Infosys Prize in Mathematical Sciences has been awarded annually since 2008. Laureates in Physical Sciences The Infosys Prize in Physical Sciences has been awarded annually since 2009. Laureates in Social Sciences The Infosys Prize in Social Sciences has been awarded annually since 2009. Trustees N. R. Narayana Murthy S. Gopalakrishnan K. Dinesh S. D. Shibulal T. V. Mohandas Pai Srinath Batni Nandan Nilekani Controversies Lawrence Liang, a professor of law awarded the Infosys Prize, was found guilty by an internal university inquiry committee of sexually harassing a doctoral student on multiple occasions. Following the adverse finding, prominen
https://en.wikipedia.org/wiki/Frontlight
A frontlight is a means of illuminating a display device, usually a liquid crystal display (LCD), which would otherwise be viewed in ambient light. This improves its performance in poor lighting conditions. An LCD presents an image by absorbing some light passing through it. When an electric field is applied across the crystal, it changes the passing light so it will not pass through a polarization filter. This allows LCDs to operate at low power, as no energy needs to be spent generating light. Many battery-operated electronic devices, including most calculators, use unilluminated LCDs. An unilluminated LCD must be lit from the front. To use ambient light, the liquid crystal itself is sandwiched between a polarization filter and a reflective surface. The mirror makes the display opaque, so it cannot be illuminated from the back. Most often a light source is placed around the perimeter of the LCD. Frontlights are relatively uncommon. Electroluminescent lights present a reflective surface when turned off. This allows for a backlit display which can also be used with ambient light. Such backlights are popular in digital watches. The monochromatic light from an electroluminescent source does not work well with color displays, however. An incandescent frontlight was therefore a popular accessory for the Nintendo Game Boy Color. Devices using a frontlight GP32 FLU (2002) Game Boy Advance SP (2003) Kindle Paperwhite (2013) See also LCD Backlight Liquid crystal displays
https://en.wikipedia.org/wiki/Six-state%20protocol
The six-state protocol (SSP) is a quantum cryptography protocol: a version of BB84 that uses a six-state polarization scheme over three orthogonal bases. Origin The six-state protocol first appeared in the article "Optimal Eavesdropping in Quantum Cryptography with Six States" by Dagmar Bruss in 1998, and was further studied in "Incoherent and coherent eavesdropping in the six-state protocol of quantum cryptography" by Bechmann-Pasquinucci and Nicolas Gisin in 1999. Description "The six-state protocol is a discrete-variable protocol for quantum key distribution that permits tolerating a noisier channel than the BB84 protocol." (2011, Abruzzo). SSP produces a higher rate of errors during attempted eavesdropping, making eavesdropping easier to detect, since an eavesdropper must choose the right basis from three possible bases (Haitjema, 2016). High-dimensional systems have been proven to provide a higher level of security. Implementation The six-state protocol can be implemented without a quantum computer, using only optical technologies. The three conjugate bases of SSP are shown in Picture 1. Alice randomly generates a string of qubits, encodes each of them using one of three bases chosen at random, and sends the string of qubits to Bob through the secured quantum channel. The probability of using any one of the bases equals 1/3. After receiving the string of qubits, Bob also randomly chooses one of the three bases for measuring the state of each qubit. Over a classical channel that is insecure but authenticated, Alice and Bob then communicate and discard every measurement for which Bob used a different basis than the one Alice used for encoding. The qubits for which the encoding basis matched the measurement basis are used to determine the secret key. See also SARG04 E91 – quantum cryptographic communication protocol BB84
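The encode-measure-sift procedure described above can be sketched as a classical toy simulation, assuming an ideal noiseless channel and no eavesdropper; the basis labels X, Y, Z are illustrative stand-ins for the three conjugate bases:

```python
import random

BASES = ("X", "Y", "Z")  # three mutually unbiased (conjugate) bases

def ssp_sift(n, seed=0):
    """Toy six-state-protocol sift: Alice encodes random bits in random bases,
    Bob measures in independently chosen bases; only positions where the two
    bases coincide (probability 1/3) contribute to the raw key."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice(BASES) for _ in range(n)]
    bob_bases = [rng.choice(BASES) for _ in range(n)]
    # Measuring in the wrong basis yields a uniformly random outcome.
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

alice_key, bob_key = ssp_sift(9000)
```

On a noiseless channel the sifted keys agree exactly, and roughly one third of the transmitted qubits survive the sift; any excess disagreement in a real run would signal noise or eavesdropping.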
https://en.wikipedia.org/wiki/Pickled%20carrot
A pickled carrot is a carrot that has been pickled in a brine, vinegar, or other solution and left to ferment for a period of time, either by immersing the carrots in an acidic solution or through souring by lacto-fermentation. Pickled carrots are often served with Vietnamese cuisine, including bánh mì, or as a component in an appetizer. Common variations In Mexico, pickled carrots are known as escabeche carrots. This variation is usually pickled in vinegar with onions, peppers, and spices. They maintain a crunchy carrot texture with flavors of peppers and other seasonings. These spicy, pickled vegetables are traditionally served with a meal. In Vietnamese culture, pickled carrots are served alongside appetizers including Vietnamese egg rolls, or as an ingredient in recipes such as bánh mì and various soups. In Vietnamese-American markets, pickled carrot and daikon are available to buy in bulk, and in Vietnam the two pickled vegetables are sold by wet-market vendors in small plastic bags. They are not canned and should be refrigerated, although they may last for long periods in the refrigerator. Japanese cuisine may use pickled carrots alongside many traditional Japanese meals. Pickled fruits and vegetables (tsukemono) are a quintessential part of the Japanese diet. Pickling in Japan has taken place since before refrigeration, so tsukemono such as pickled carrots were made to last for long periods. The different methods used to make tsukemono vary from simple salting or vinegar brining to other processes involving cultured molds and fermentation. A common method of pickling carrots is called misozuke. Misozuke pickles are made by covering vegetables in miso to give them its salty, miso flavor. Alternatively, pickled carrots may be prepared in the nukazuke fashion — fermented in a mixture of roasted rice bran, salt, konbu, and other ingredients. Nukazuke pickled carrots are usually served alongside set meals. In India, traditional pickled c
https://en.wikipedia.org/wiki/International%20Affective%20Picture%20System
The International Affective Picture System (IAPS) is a database of pictures designed to provide a standardized set of pictures for studying emotion and attention, and it has been widely used in psychological research. The IAPS was developed by the National Institute of Mental Health Center for Emotion and Attention at the University of Florida. In 2005, the IAPS comprised 956 color photographs ranging from everyday objects and scenes − such as household furniture and landscapes − to extremely rare or exciting scenes − such as mutilated bodies and erotic nudes. Normative Ratings An essential property of the IAPS is that the stimulus set is accompanied by a detailed list of average ratings of the emotions elicited by each picture. This enables other researchers to select stimuli eliciting a specific range of emotions for their experiments when using the IAPS. The process of establishing such average ratings for a stimulus set is also referred to as standardization by psychologists. The normative rating procedure for the IAPS is based on the assumption that emotional assessments can be accounted for by the three dimensions of valence, arousal and dominance. Thus, participants taking part in the studies that are conducted to standardize the IAPS are asked to rate how pleasant/unpleasant, how calm/excited and how controlled/in-control they felt when looking at each picture. A graphic rating scale, the Self-Assessment Manikin (SAM), is used for this rating procedure. Original norms The official normative ratings for the IAPS pictures were obtained from a sample of 100 college students (50 women, 50 men, presumably predominantly US-American) who each rated 16 sets of 60 pictures. The rating was carried out in groups using paper-and-pencil versions of the SAMs. Pictures were presented for 6 seconds each; 15 seconds were then given to rate the picture on valence, arousal and dominance. Average valence, arousal and dominance ratings are available for the overall sample, men, and women. Norm
https://en.wikipedia.org/wiki/Fluid%20power
Fluid power is the use of fluids under pressure to generate, control, and transmit power. Fluid power is conventionally subdivided into hydraulics (using a liquid such as mineral oil or water) and pneumatics (using a gas such as compressed air or other gases). Although steam is also a fluid, steam power is usually classified separately from fluid power (implying hydraulics or pneumatics). Compressed-air and water-pressure systems were once used to transmit power from a central source to industrial users over extended geographic areas; fluid power systems today are usually within a single building or mobile machine. Fluid power systems perform work by a pressurized fluid bearing directly on a piston in a cylinder or in a fluid motor. A fluid cylinder produces a force resulting in linear motion, whereas a fluid motor produces torque resulting in rotary motion. Within a fluid power system, cylinders and motors (also called actuators) do the desired work. Control components such as valves regulate the system. Elements A fluid power system has a pump driven by a prime mover (such as an electric motor or internal combustion engine) that converts mechanical energy into fluid energy. Pressurized fluid is controlled and directed by valves into an actuator device such as a hydraulic cylinder or pneumatic cylinder, to provide linear motion, or a hydraulic motor or pneumatic motor, to provide rotary motion or torque. Rotary motion may be continuous or confined to less than one revolution. Hydraulic pumps Dynamic (non-positive-displacement) pumps This type is generally used for low-pressure, high-volume flow applications. Since they are not capable of withstanding high pressures, they see little use in the fluid power field. Their maximum pressure is limited to 250–300 psi (1.7–2.0 MPa). This type of pump is primarily used for transporting fluids from one location to another. Centrifugal and axial-flow propeller pumps are the two most common types of dynamic pumps. Po
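The work-producing principle described above, pressurized fluid bearing on a piston, reduces to force = pressure × piston area. A minimal sketch (the bore size and pressure are illustrative values, not from the text):

```python
import math

def cylinder_force(pressure_pa, bore_m):
    """Force in newtons a fluid cylinder can exert:
    the fluid pressure acting on the circular piston area."""
    area = math.pi * (bore_m / 2) ** 2  # piston face area, m^2
    return pressure_pa * area

# Example: a hypothetical 50 mm bore cylinder at 2.0 MPa
# (roughly the 300 psi upper limit quoted for dynamic pumps).
force_n = cylinder_force(2.0e6, 0.050)
```

At 2.0 MPa a 50 mm piston develops roughly 3.9 kN, which illustrates why even modest pressures acting on modest areas yield large linear forces.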
https://en.wikipedia.org/wiki/Shell%20Processing%20Support
Shell Processing Support (SPS) is a set of formatted files used for exchanging land 3D seismic data. They can also be used for 2D data. History The Shell Processing Support data format was initially defined and used by Shell Internationale Petroleum for the transfer of seismic and positioning data to processing centres. In 1993 the SEG technical standards committee on ancillary data formats adopted SPS as the standard format for exchanging geophysical positioning data. SPS Format The SPS format is a set of three files, called the receiver file, the source file and the cross-reference file, plus an optional fourth file for comments. In addition to the comment file, comments can be entered in each of the three files as part of the header; such lines are indicated by starting with the letter H. Receiver file: The receiver file contains information about the geophones: their type, position (easting, northing, elevation) and ID. Source file: The source file contains information about the seismic source: its position and ID. Cross-reference file: The cross-reference file (also known as the relational file), or X file for short, is a relational file relating the source and the receivers active when a shot occurred. It contains the shot ID and the source and receivers associated with that particular shot ID. Comment file: This file is optional. Any other information regarding the seismic acquisition can be provided in this file for the processing centre. Versions The first version of SPS adopted by the SEG is now referred to as SPS rev0. The most recent version of SPS is SPS rev2.1.
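As a rough illustration of how the three files relate, the sketch below joins miniature source, receiver and cross-reference files through the shot ID. The whitespace-separated fields and record contents here are hypothetical simplifications; the real SPS standard defines fixed column positions, which this sketch does not reproduce:

```python
def parse_records(lines):
    """Skip SPS header lines (which begin with 'H') and split the remaining
    data records into fields. Simplified illustration only: real SPS records
    use a fixed-column layout, not whitespace separation."""
    return [ln.split() for ln in lines if ln and not ln.startswith("H")]

# Hypothetical miniature files: ID, easting, northing, elevation.
source_file = ["H example source file", "S101 500100.0 4200050.0 250.0"]
receiver_file = ["H example receiver file",
                 "R201 500000.0 4200000.0 248.0",
                 "R202 500010.0 4200000.0 248.5"]
# Cross-reference (X) file: shot ID, source ID, then the receivers for that shot.
x_file = ["H example cross reference file", "X1 S101 R201 R202"]

sources = {r[0]: r[1:] for r in parse_records(source_file)}
receivers = {r[0]: r[1:] for r in parse_records(receiver_file)}
shots = {r[0]: (r[1], r[2:]) for r in parse_records(x_file)}
```

The cross-reference file thus acts as the relational glue: given a shot ID, one can look up the source position and every receiver position recorded for that shot.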
https://en.wikipedia.org/wiki/Hydride%20vapour-phase%20epitaxy
Hydride vapour-phase epitaxy (HVPE) is an epitaxial growth technique often employed to produce semiconductors such as GaN, GaAs, InP and their related compounds, in which hydrogen chloride is reacted at elevated temperature with the group-III metals to produce gaseous metal chlorides, which then react with ammonia to produce the group-III nitrides. Carrier gases commonly used include ammonia, hydrogen and various chlorides. HVPE technology can significantly reduce the cost of production compared to the most common method, metal-organic chemical vapour deposition (MOCVD). Cost reduction is achieved through significantly lower consumption of NH3, cheaper source materials than in MOCVD, and reduced capital equipment costs owing to the high growth rate. Developed in the 1960s, it was the first epitaxial method used for the fabrication of single GaN crystals. Hydride vapour-phase epitaxy is the only III–V and III–N semiconductor crystal growth process working close to equilibrium. This means that the condensation reactions exhibit fast kinetics: one observes an immediate reaction to an increase of the vapour-phase supersaturation towards condensation. This property is due to the use of the chloride vapour precursors GaCl and InCl, whose dechlorination frequency is high enough that there is no kinetic delay. A wide range of growth rates, from 1 to 100 micrometres per hour, can then be set as a function of the vapour-phase supersaturation. Another HVPE feature is that growth is governed by surface kinetics: adsorption of gaseous precursors, decomposition of ad-species, desorption of decomposition products, and surface diffusion towards kink sites. This property is of benefit when it comes to selective growth on patterned substrates for the synthesis of objects and structures exhibiting a 3D morphology. The morphology is only dependent on the intrinsic growth anisotropy of the crystals. By setting experimental growth parameters of temperature and composition of
https://en.wikipedia.org/wiki/Great%20Oxidation%20Event
The Great Oxidation Event (GOE) or Great Oxygenation Event, also called the Oxygen Catastrophe, Oxygen Revolution, Oxygen Crisis or Oxygen Holocaust, was a time interval during the Early Earth's Paleoproterozoic Era when the Earth's atmosphere and the shallow ocean first experienced a rise in the concentration of oxygen. This began approximately 2.460–2.426 Ga (billion years) ago, during the Siderian period, and ended approximately 2.060 Ga ago, during the Rhyacian. Geological, isotopic, and chemical evidence suggests that biologically produced molecular oxygen (dioxygen or O2) started to accumulate in Earth's atmosphere and changed it from a weakly reducing atmosphere practically devoid of oxygen into an oxidizing one containing abundant free oxygen, with oxygen levels being as high as 10% of their present atmospheric level by the end of the GOE. The sudden injection of highly reactive free oxygen, toxic to the then-mostly anaerobic biosphere, may have caused the extinction of many existing organisms on Earth — then mostly archaeal colonies that used retinal to utilize green-spectrum light energy and power a form of anoxygenic photosynthesis (see Purple Earth hypothesis). Although the event is inferred to have constituted a mass extinction, due in part to the great difficulty in surveying microscopic organisms' abundances, and in part to the extreme age of fossil remains from that time, the Great Oxidation Event is typically not counted among conventional lists of "great extinctions", which are implicitly limited to the Phanerozoic eon. In any case, isotope geochemistry data from sulfate minerals have been interpreted to indicate a decrease in the size of the biosphere of >80% associated with changes in nutrient supplies at the end of the GOE. The GOE is inferred to have been caused by cyanobacteria, which evolved porphyrin-based photosynthesis, which produces dioxygen as a byproduct. The increasing oxygen level eventually depleted the reducing capacity of ferrous compo
https://en.wikipedia.org/wiki/Data%20lineage
Data lineage includes the data origin, what happens to it, and where it moves over time. Data lineage provides visibility and simplifies tracing errors back to the root cause in a data analytics process. It also enables replaying specific portions or inputs of the data flow for step-wise debugging or regenerating lost output. Database systems use such information, called data provenance, to address similar validation and debugging challenges. Data provenance refers to records of the inputs, entities, systems, and processes that influence data of interest, providing a historical record of the data and its origins. The generated evidence supports forensic activities such as data-dependency analysis, error/compromise detection and recovery, auditing, and compliance analysis. "Lineage is a simple type of why provenance." Data lineage can be represented visually to discover the data flow/movement from its source to destination via various changes and hops on its way in the enterprise environment: how the data gets transformed along the way, how the representation and parameters change, and how the data splits or converges after each hop. A simple representation of data lineage can be shown with dots and lines, where the dots represent data containers for data points and the lines connecting them represent the transformations a data point undergoes between the data containers. Representation broadly depends on the scope of the metadata management and the reference point of interest. Data lineage provides the sources of the data and its intermediate data flow hops from the reference point with backward data lineage, and leads to the final destination's data points and their intermediate data flows with forward data lineage. These views can be combined with end-to-end lineage for a reference point that provides a complete audit trail of that data point of interest from sources to their final destinations. As the data points or hops increase, the complexity of such representation becom
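The dots-and-lines representation and the backward/forward lineage views described above can be sketched as a small directed graph; the container names below are invented for illustration:

```python
from collections import defaultdict

# Edges point in the direction of data flow: upstream container -> downstream.
# Container names are hypothetical, not from any particular lineage tool.
edges = [("raw_orders", "cleaned_orders"),
         ("raw_customers", "cleaned_customers"),
         ("cleaned_orders", "sales_report"),
         ("cleaned_customers", "sales_report")]

downstream = defaultdict(set)
upstream = defaultdict(set)
for src, dst in edges:
    downstream[src].add(dst)
    upstream[dst].add(src)

def trace(start, graph):
    """All containers reachable from `start`: backward lineage when `graph`
    is `upstream`, forward lineage when it is `downstream`."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

`trace("sales_report", upstream)` returns every source the report depends on, while `trace("raw_orders", downstream)` returns everything affected if that source is corrupted; combining both views gives the end-to-end audit trail for a reference point.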
https://en.wikipedia.org/wiki/Thermal%20fluctuations
In statistical mechanics, thermal fluctuations are random deviations of an atomic system from its average state, that occur in a system at equilibrium. All thermal fluctuations become larger and more frequent as the temperature increases, and likewise they decrease as temperature approaches absolute zero. Thermal fluctuations are a basic manifestation of the temperature of systems: A system at nonzero temperature does not stay in its equilibrium microscopic state, but instead randomly samples all possible states, with probabilities given by the Boltzmann distribution. Thermal fluctuations generally affect all the degrees of freedom of a system: There can be random vibrations (phonons), random rotations (rotons), random electronic excitations, and so forth. Thermodynamic variables, such as pressure, temperature, or entropy, likewise undergo thermal fluctuations. For example, for a system that has an equilibrium pressure, the system pressure fluctuates to some extent about the equilibrium value. Only the 'control variables' of statistical ensembles (such as the number of particles N, the volume V and the internal energy E in the microcanonical ensemble) do not fluctuate. Thermal fluctuations are a source of noise in many systems. The random forces that give rise to thermal fluctuations are a source of both diffusion and dissipation (including damping and viscosity). The competing effects of random drift and resistance to drift are related by the fluctuation-dissipation theorem. Thermal fluctuations play a major role in phase transitions and chemical kinetics. Central limit theorem The volume of phase space $\mathcal{V}$, occupied by a system of $2m$ degrees of freedom, is the product of the configuration volume $V$ and the momentum space volume. Since the energy is a quadratic form of the momenta for a non-relativistic system, the radius of momentum space will be $\sqrt{2mE}$, so that the volume of a hypersphere will vary as $(2mE)^{m}$, giving a phase volume of $\mathcal{V} = C\,V\,(2mE)^{m}$, where $C$ is a constant depending upon the
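The statement that a system at nonzero temperature randomly samples its states with Boltzmann probabilities can be illustrated for a discrete two-level system; the energies and temperatures below are arbitrary illustrative values, with energy measured in the same units as kT:

```python
import math

def boltzmann_probs(energies, kT):
    """Occupation probabilities p_i proportional to exp(-E_i / kT)
    for a discrete set of states, normalized by the partition function."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

# Two-level toy system with energies 0 and 1.
hot = boltzmann_probs([0.0, 1.0], kT=10.0)   # high T: both states comparably likely
cold = boltzmann_probs([0.0, 1.0], kT=0.1)   # low T: ground state dominates
```

At high temperature the excited state is occupied almost half the time (large fluctuations); as kT shrinks toward zero the system is pinned in its ground state, matching the claim that fluctuations die out near absolute zero.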
https://en.wikipedia.org/wiki/RBMS3
RNA-binding motif, single-stranded-interacting protein 3 is a protein that in humans is encoded by the RBMS3 gene.
https://en.wikipedia.org/wiki/J-aggregate
A J-aggregate is a type of dye with an absorption band that shifts to a longer wavelength (bathochromic shift) of increasing sharpness (higher absorption coefficient) when it aggregates under the influence of a solvent or additive or concentration as a result of supramolecular self-organisation. The dye can be characterized further by a small Stokes shift with a narrow band. The J in J-aggregate refers to E.E. Jelley who discovered the phenomenon in 1936. The dye is also called a Scheibe aggregate after G. Scheibe who also independently published on this topic in 1937. Scheibe and Jelley independently observed that in ethanol the dye PIC chloride has two broad absorption maxima at around 19,000 cm−1 and 20,500 cm−1 (526 and 488 nm respectively) and that in water a third sharp absorption maximum appears at 17,500 cm−1 (571 nm). The intensity of this band further increases on increasing concentration and on adding sodium chloride. In the oldest aggregation model for PIC chloride the individual molecules are stacked like a roll of coins forming a supramolecular polymer but the true nature of this aggregation phenomenon is still under investigation. Analysis is complicated because PIC chloride is not a planar molecule. The molecular axis can tilt in the stack creating a helix pattern. In other models the dye molecules orient themselves in a brickwork, ladder, or staircase fashion. In various experiments the J-band was found to split as a function of temperature, liquid crystal phases were found with concentrated solutions and CryoTEM revealed aggregate rods 350 nm long and 2.3 nm in diameter. J-aggregate dyes are found with polymethine dyes in general, with cyanines, merocyanines, squaraine and perylene bisimides. Certain π-conjugated macrocycles, reported by Swager and co-workers at MIT, were also found to form J-aggregates and exhibited exceptionally high photoluminescence quantum yields. In 2020, a famous cyanine dye (TDBC) was reported with enhanced photolumines
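The wavenumber-to-wavelength conversions quoted above (e.g. 19,000 cm−1 to 526 nm) follow from the identity wavelength[nm] = 10^7 / wavenumber[cm−1], as a quick check shows:

```python
def wavenumber_to_nm(wavenumber_cm):
    """Convert an absorption maximum in cm^-1 to a wavelength in nm.

    1 cm = 10^7 nm, so lambda[nm] = 10^7 / wavenumber[cm^-1].
    """
    return 1e7 / wavenumber_cm

# The three PIC chloride bands discussed in the text.
bands_nm = {cm: round(wavenumber_to_nm(cm)) for cm in (19000, 20500, 17500)}
```

The rounded results reproduce the 526, 488 and 571 nm values given for the two broad monomer bands and the sharp J-band respectively.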
https://en.wikipedia.org/wiki/Dataflow%20programming
In computer programming, dataflow programming is a programming paradigm that models a program as a directed graph of the data flowing between operations, thus implementing dataflow principles and architecture. Dataflow programming languages share some features of functional languages, and were generally developed in order to bring some functional concepts to a language more suitable for numeric processing. Some authors use the term datastream instead of dataflow to avoid confusion with dataflow computing or dataflow architecture, based on an indeterministic machine paradigm. Dataflow programming was pioneered by Jack Dennis and his graduate students at MIT in the 1960s. Considerations Traditionally, a program is modelled as a series of operations happening in a specific order; this may be referred to as sequential, procedural, control flow (indicating that the program chooses a specific path), or imperative programming. The program focuses on commands, in line with the von Neumann vision of sequential programming, where data is normally "at rest". In contrast, dataflow programming emphasizes the movement of data and models programs as a series of connections. Explicitly defined inputs and outputs connect operations, which function like black boxes. An operation runs as soon as all of its inputs become valid. Thus, dataflow languages are inherently parallel and can work well in large, decentralized systems. State One of the key concepts in computer programming is the idea of state, essentially a snapshot of various conditions in the system. Most programming languages require a considerable amount of state information, which is generally hidden from the programmer. Often, the computer itself has no idea which piece of information encodes the enduring state. This is a serious problem, as the state information needs to be shared across multiple processors in parallel processing machines. Most languages force the programmer to add extra code to indicate which data an
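The fire-when-all-inputs-are-valid rule described above can be sketched with a minimal scheduler; this is a toy sequential interpreter of a dataflow graph, not any particular dataflow language:

```python
def run_dataflow(nodes, initial):
    """Minimal dataflow evaluator. Each node is (inputs, output, function):
    a black box that fires as soon as every one of its named inputs
    holds a value, regardless of the order nodes were listed in."""
    values = dict(initial)
    pending = list(nodes)
    while pending:
        ready = [n for n in pending if all(i in values for i in n[0])]
        if not ready:
            raise RuntimeError("deadlock: no node has all inputs available")
        for node in ready:
            inputs, output, fn = node
            values[output] = fn(*(values[i] for i in inputs))
            pending.remove(node)
    return values

# Compute (a + b) * (a - b). The two inner operations share no data
# dependency, so a real dataflow runtime could execute them in parallel.
graph = [(("a", "b"), "sum", lambda x, y: x + y),
         (("a", "b"), "diff", lambda x, y: x - y),
         (("sum", "diff"), "product", lambda x, y: x * y)]
result = run_dataflow(graph, {"a": 7, "b": 3})
```

Note that the program is just the graph: execution order is derived from data availability rather than from statement order, which is the property that makes dataflow programs inherently parallel.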
https://en.wikipedia.org/wiki/Eccentricity%20%28mathematics%29
In mathematics, the eccentricity of a conic section is a non-negative real number that uniquely characterizes its shape. One can think of the eccentricity as a measure of how much a conic section deviates from being circular. In particular: The eccentricity of a circle is 0. The eccentricity of an ellipse which is not a circle is between 0 and 1. The eccentricity of a parabola is 1. The eccentricity of a hyperbola is greater than 1. The eccentricity of a pair of lines is ∞. Two conic sections with the same eccentricity are similar. Definitions Any conic section can be defined as the locus of points whose distances to a point (the focus) and a line (the directrix) are in a constant ratio. That ratio is called the eccentricity, commonly denoted as e. The eccentricity can also be defined in terms of the intersection of a plane and a double-napped cone associated with the conic section. If the cone is oriented with its axis vertical, the eccentricity is e = sin β / sin α, where β is the angle between the plane and the horizontal and α is the angle between the cone's slant generator and the horizontal. For β = 0 the plane section is a circle, for β = α a parabola. (The plane must not meet the vertex of the cone.) The linear eccentricity of an ellipse or hyperbola, denoted c, is the distance between its center and either of its two foci. The eccentricity can be defined as the ratio of the linear eccentricity c to the semimajor axis a: that is, e = c/a (lacking a center, the linear eccentricity for parabolas is not defined). It is worth noting that a parabola can be treated as an ellipse or a hyperbola, but with one focal point at infinity. Alternative names The eccentricity is sometimes called the first eccentricity to distinguish it from the second eccentricity and third eccentricity defined for ellipses (see below). The eccentricity is also sometimes called the numerical eccentricity. In the case of ellipses and hyperbolas the linear eccentricity is sometimes called the hal
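The ratio definition e = c/a can be checked numerically for an ellipse with semi-axes a ≥ b > 0, using the standard relation c = √(a² − b²) between the semi-axes and the focal distance (a short sketch; the axis lengths are illustrative):

```python
import math

def ellipse_eccentricity(a, b):
    """Eccentricity e = c/a of an ellipse with semi-major axis a and
    semi-minor axis b (a >= b > 0), where c = sqrt(a^2 - b^2) is the
    linear eccentricity (center-to-focus distance)."""
    c = math.sqrt(a * a - b * b)
    return c / a

e_circle = ellipse_eccentricity(5.0, 5.0)   # equal axes: a circle, e = 0
e_ellipse = ellipse_eccentricity(5.0, 3.0)  # c = 4, so e = 4/5 = 0.8
```

As expected, equal semi-axes give e = 0 (a circle), and squashing the ellipse pushes e toward 1, the parabolic limit.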
https://en.wikipedia.org/wiki/USNS%20Relentless
USNS Relentless (T-AGOS-18) was a Stalwart-class modified tactical auxiliary general ocean surveillance ship in service in the United States Navy from 1990 to 1993. Since 1998, she has been in commission in the National Oceanic and Atmospheric Administration (NOAA) fleet as the fisheries research ship NOAAS Gordon Gunter (R 336). Construction The U.S. Navy ordered Relentless from VT Halter Marine, on 20 February 1987. VT Halter Marine laid her down at Moss Point, Mississippi, on 22 April 1988, launched her on 12 May 1989, and delivered her to the U.S. Navy on 12 January 1990. United States Navy service On the day of her delivery, the U.S. Navy placed the ship in non-commissioned service in the Military Sealift Command as USNS Relentless (T-AGOS-18). Like the other Stalwart-class ships, she was designed to collect underwater acoustical data in support of Cold War anti-submarine warfare operations against Soviet Navy submarines using Surveillance Towed Array Sensor System (SURTASS) sonar equipment. She operated with a mixed crew of U.S. Navy personnel and civilian merchant mariners. After the Cold War ended with the collapse of the Soviet Union in late December 1991, the requirement for SURTASS collection declined. The Navy took Relentless out of service on 17 March 1993 and transferred her to the National Oceanic and Atmospheric Administration (NOAA) the same day. She was stricken from the Naval Vessel Register on 20 May 1993. National Oceanic and Atmospheric Administration service NOAA converted the ship into a fisheries research ship and commissioned her into NOAA service as NOAAS Gordon Gunter (R 336) on 28 August 1998. She replaced the decommissioned NOAA fisheries research ship NOAAS Chapman (R 446). Capabilities Gordon Gunter is outfitted for fishing operations employing stern trawling, longlining, plankton tows, dredging, and trap fishing. She is fitted with modern navigation electronics and oceanographic winches, as well as sophisticated sensors and sa
https://en.wikipedia.org/wiki/Kernel%20panic
A kernel panic (sometimes abbreviated as KP) is a safety measure taken by an operating system's kernel upon detecting an internal fatal error in which either it is unable to safely recover or continuing to run the system would have a higher risk of major data loss. The term is largely specific to Unix and Unix-like systems. The equivalent on Microsoft Windows operating systems is a stop error, often called a "blue screen of death". The kernel routines that handle panics, known as panic() in AT&T-derived and BSD Unix source code, are generally designed to output an error message to the console, dump an image of kernel memory to disk for post-mortem debugging, and then either wait for the system to be manually rebooted, or initiate an automatic reboot. The information provided is of a highly technical nature and aims to assist a system administrator or software developer in diagnosing the problem. Kernel panics can also be caused by errors originating outside kernel space. For example, many Unix operating systems panic if the init process, which runs in user space, terminates. History The Unix kernel maintains internal consistency and runtime correctness with assertions as the fault detection mechanism. The basic assumption is that the hardware and the software should perform correctly and a failure of an assertion results in a panic, i.e. a voluntary halt to all system activity. The kernel panic was introduced in an early version of Unix and demonstrated a major difference between the design philosophies of Unix and its predecessor Multics. Multics developer Tom van Vleck recalls a discussion of this change with Unix developer Dennis Ritchie: I remarked to Dennis that easily half the code I was writing in Multics was error recovery code. He said, "We left all that stuff out. If there's an error, we have this routine called panic, and when it is called, the machine crashes, and you holler down the hall, 'Hey, reboot it.'" The original panic() function was essenti
https://en.wikipedia.org/wiki/The%20leans
The leans is the most common type of spatial disorientation for aviators. Through stabilization of the fluid in the semicircular canals, a pilot may perceive straight and level flight while actually in a banked turn. This is caused by a quick return to level flight after a gradual, prolonged turn that the pilot failed to notice. The phenomenon consists of a false perception of angular displacement about the roll axis and therefore becomes an illusion of bank. This illusion is often associated with a vestibulospinal reflex that results in the pilot actually leaning in the direction of the falsely perceived vertical. Other common explanations attribute the leans to deficiencies of both otolith-organ and semicircular-duct sensory mechanisms. Physiology The leans is a type of vestibular illusion in flight which causes spatial disorientation. The process involves the semicircular canals of the vestibular system. The semicircular canals detect angular acceleration. In total, there are three semicircular canals: the anterior, posterior, and lateral canals. Each canal is filled with a fluid called endolymph and each canal arises from a small bag-like structure called the utricle. At the ends of each duct, there is a saclike portion called the ampulla. Inside are hair cells and supporting cells, known collectively as the crista ampullaris. Changing a person's orientation will cause specific ducts to be stimulated, and that stimulation is registered by these hair cells. When the head turns, the canals move, but because of its inertia the endolymph fluid tends to lag and thereby stimulates the hair cells. This stimulation results in awareness of angular acceleration in that plane. After approximately 10–20 seconds, the endolymph velocity matches that of the canal, which stops stimulation of the hair cells, and the person's awareness of rotation is reduced or stopped. In addition, the canals cannot detect rotation rates of approximately 2 degrees per second or lower; this is the detection threshold of the semici
https://en.wikipedia.org/wiki/Permanent%20teeth
Permanent teeth or adult teeth are the second set of teeth formed in diphyodont mammals. In humans and Old World simians, there are thirty-two permanent teeth, consisting of six maxillary and six mandibular molars, four maxillary and four mandibular premolars, two maxillary and two mandibular canines, and four maxillary and four mandibular incisors. Timeline The first permanent tooth usually appears in the mouth at around 5–6 years of age, and the mouth will then be in a transition stage, with both primary (deciduous) teeth and permanent teeth present during the mixed dentition period, until the last primary tooth is lost or shed. The first of the permanent teeth to erupt are the permanent first molars, right behind the last 'milk' molars of the primary dentition. These first permanent molars are important for the correct development of a permanent dentition. By thirteen years of age, 28 of the 32 permanent teeth will usually have appeared. The full permanent dentition is completed much later during the permanent dentition period. The four last permanent teeth, the third molars, usually appear between the ages of 17 and 38 years; they are known as wisdom teeth. Pathology It is possible to have extra, or "supernumerary", teeth. This phenomenon is called hyperdontia and is often erroneously referred to as "a third set of teeth." These teeth may erupt into the mouth or remain impacted in the bone. Hyperdontia is often associated with syndromes such as cleft lip and cleft palate, tricho-rhino-phalangeal syndrome, cleidocranial dysplasia, and Gardner's syndrome. See also Deciduous dentition Tooth development Tooth eruption Teething Dentition
https://en.wikipedia.org/wiki/Transient%20equilibrium
In nuclear physics, transient equilibrium is a situation in which equilibrium is reached by a parent-daughter radioactive isotope pair where the half-life of the daughter is shorter than the half-life of the parent. Contrary to secular equilibrium, the half-life of the daughter is not negligible compared to the parent's half-life. An example of this is a molybdenum-99 generator producing technetium-99m for nuclear medicine diagnostic procedures. Such a generator is sometimes called a cow because the daughter product, in this case technetium-99m, is milked at regular intervals. Transient equilibrium occurs after four half-lives, on average. Activity in transient equilibrium The activity of the daughter is given by the Bateman equation: A_d = A_p(0) × (λ_d / (λ_d − λ_p)) × (e^(−λ_p t) − e^(−λ_d t)) × BR, where A_p and A_d are the activity of the parent and daughter, respectively, T_p and T_d are the half-lives (inverses of the decay constants λ_p and λ_d in the above equation modulo ln(2)) of the parent and daughter, respectively, and BR is the branching ratio. In transient equilibrium, the Bateman equation cannot be simplified by assuming the daughter's half-life is negligible compared to the parent's half-life. The ratio of daughter-to-parent activity is given by: A_d / A_p = (λ_d / (λ_d − λ_p)) × BR. Time of maximum daughter activity In transient equilibrium, the daughter activity increases and eventually reaches a maximum value that can exceed the parent activity. The time of maximum activity is given by: t_max = (1.44 × T_p × T_d / (T_p − T_d)) × ln(T_p / T_d), where T_p and T_d are the half-lives of the parent and daughter, respectively. In the case of the Tc-99m/Mo-99 generator, the time of maximum activity (t_max) is approximately 24 hours, which makes it convenient for medical use. See also Bateman equation Secular equilibrium
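The time-of-maximum-activity relation, t_max = (1.44 T_p T_d / (T_p − T_d)) ln(T_p / T_d), can be checked numerically. The half-life values below (about 66 hours for Mo-99 and 6 hours for Tc-99m) are approximate literature figures supplied for illustration, not taken from the article.

```python
import math

def time_of_max_activity(T_p, T_d):
    """t_max = (1.44 * T_p * T_d / (T_p - T_d)) * ln(T_p / T_d).

    The factor 1.44 ~ 1/ln(2) converts half-lives to mean lifetimes."""
    return 1.44 * T_p * T_d / (T_p - T_d) * math.log(T_p / T_d)

# Mo-99 -> Tc-99m generator, half-lives in hours (approximate values)
print(round(time_of_max_activity(66.0, 6.0), 1))
```

With these inputs the formula gives roughly 23 hours, of the same order as the approximately 24 hours cited above.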
https://en.wikipedia.org/wiki/The%20Contagiousness%20of%20Puerperal%20Fever
The Contagiousness of Puerperal Fever is an essay written by Oliver Wendell Holmes which first appeared in The New England Quarterly Journal of Medicine in 1843. It was later reprinted in the “Medical Essays” in 1855. It is included as Volume 38, Part 5 of the Harvard Classics series. Synopsis In just under 12,000 words Holmes argues forcefully and convincingly that the rampant infection killing women within a few days of childbirth was caused mainly through infection spread by their birth attendants. He also laid down well-thought out and easy to execute behaviors through which the spread of infection could be contained. History Holmes developed an interest in puerperal fever by accident. In 1836 Holmes graduated from Harvard Medical School. He was Professor of Anatomy and Physiology at Dartmouth College from 1838 to 1840. In 1840 Holmes went back to Boston, took up general practice, and joined the Boston Society for Medical Improvement. At one of the meetings of the Society a report was presented about a physician who had performed a post mortem examination of a woman who had died from puerperal fever. Within a week that doctor himself died of infection, seemingly contracted when he was wounded during the autopsy he had conducted on the dead woman. The report stated that during the time interval between when the physician received the wound and then subsequently died from it, all the women whom he had attended during childbirth contracted puerperal fever. This report seems to have convinced Holmes that “The disease, known as Puerperal Fever, is so far contagious as to be frequently carried from patient to patient by physicians and nurses.” The end of the essay included eight steps which birth attendants needed to adhere to in order to prevent the spread of infection from patient to patient, especially infected patients to susceptible women after childbirth. Until this time in history these eight rules were the most definitive standard so far published o
https://en.wikipedia.org/wiki/Radical%20of%20an%20ideal
In ring theory, a branch of mathematics, the radical of an ideal I of a commutative ring is another ideal defined by the property that an element a is in the radical if and only if some power of a is in I. Taking the radical of an ideal is called radicalization. A radical ideal (or semiprime ideal) is an ideal that is equal to its radical. The radical of a primary ideal is a prime ideal. This concept is generalized to non-commutative rings in the Semiprime ring article. Definition The radical of an ideal I in a commutative ring R, denoted by rad(I) or √I, is defined as √I = {r ∈ R : r^n ∈ I for some positive integer n} (note that I ⊆ √I). Intuitively, √I is obtained by taking all roots of elements of I within the ring R. Equivalently, √I is the preimage of the ideal of nilpotent elements (the nilradical) of the quotient ring R/I (via the natural map R → R/I). The latter proves that √I is an ideal. If the radical of I is finitely generated, then some power of √I is contained in I. In particular, if I and J are ideals of a Noetherian ring, then I and J have the same radical if and only if I contains some power of J and J contains some power of I. If an ideal I coincides with its own radical, then I is called a radical ideal or semiprime ideal. Examples Consider the ring Z of integers. The radical of the ideal 4Z of integer multiples of 4 is 2Z. The radical of 5Z is 5Z. The radical of 12Z is 6Z. In general, the radical of mZ is rZ, where r is the product of all distinct prime factors of m, the largest square-free factor of m (see Radical of an integer). In fact, this generalizes to an arbitrary ideal (see the Properties section). Consider the ideal . It is trivial to show (using the basic property ), but we give some alternative methods: The radical corresponds to the nilradical of the quotient ring , which is the intersection of all prime ideals of the quotient ring. This is contained in the Jacobson radical, which is the intersection of all maximal ideals, which are the kernels of homomorphisms to fields. Any ring homomorphism must have in the kernel in order to have a w
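For the ring of integers, the radical of the ideal generated by n is generated by the product of n's distinct prime factors, that is, by the largest square-free divisor of n. The small sketch below (the helper name is my own) computes that generator by trial division:

```python
def radical(n):
    """Product of the distinct prime factors of n.

    This is the largest square-free factor of n, and the generator of
    the radical of the ideal nZ in the integers."""
    n = abs(n)
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            result *= p
            while n % p == 0:
                n //= p          # strip every copy of the prime p
        p += 1
    if n > 1:                    # leftover prime factor above sqrt of the original n
        result *= n
    return result

print(radical(12))   # 12 = 2^2 * 3, so the radical of 12Z is generated by 2 * 3 = 6
```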
https://en.wikipedia.org/wiki/Habitat%20destruction
Habitat destruction (also termed habitat loss and habitat reduction) is the process by which a natural habitat becomes incapable of supporting its native species. The organisms that previously inhabited the site are displaced or dead, thereby reducing biodiversity and species abundance. Habitat destruction is the leading cause of biodiversity loss. Fragmentation and loss of habitat have become one of the most important topics of research in ecology as they are major threats to the survival of endangered species. Activities such as harvesting natural resources, industrial production and urbanization are human contributions to habitat destruction. Pressure from agriculture is the principal human cause. Some others include mining, logging, trawling, and urban sprawl. Habitat destruction is currently considered the primary cause of species extinction worldwide. Environmental factors can contribute to habitat destruction more indirectly. Geological processes, climate change, introduction of invasive species, ecosystem nutrient depletion, water and noise pollution are some examples. Loss of habitat can be preceded by an initial habitat fragmentation. Attempts to address habitat destruction are in international policy commitments embodied by Sustainable Development Goal 15 "Life on Land" and Sustainable Development Goal 14 "Life Below Water". However, the United Nations Environment Programme report on "Making Peace with Nature" released in 2021 found that most of these efforts had failed to meet their internationally agreed upon goals. Impacts on organisms When a habitat is destroyed, the carrying capacity for indigenous plants, animals, and other organisms is reduced so that populations decline, sometimes up to the level of extinction. Habitat loss is perhaps the greatest threat to organisms and biodiversity. Temple (1986) found that 82% of endangered bird species were significantly threatened by habitat loss. Most amphibian species are also threatened by native habit
https://en.wikipedia.org/wiki/Differentiation%20%28journal%29
Differentiation is a peer-reviewed academic journal covering cell differentiation and cell development. It was established in 1973 and is published 10 times per year by Elsevier, on behalf of the International Society of Differentiation. The editor-in-chief is Colin Stewart (Agency for Science, Technology and Research). According to the Journal Citation Reports, the journal has a 2016 impact factor of 2.567.
https://en.wikipedia.org/wiki/List%20of%20circulating%20currencies
There are 180 currencies recognized as legal tender in United Nations (UN) member states, UN General Assembly non-member observer states, partially recognized or unrecognized states, and their dependencies. However, excluding the pegged (fixed exchange rate) currencies, there are only 130 currencies that are independent or pegged to a currency basket. Dependencies and unrecognized states are listed here only if another currency is used on their territory that is different from the one of the state that administers them or has jurisdiction over them. Criteria for inclusion A currency is a kind of money and medium of exchange. Currency includes paper, cotton, or polymer banknotes and metal coins. States generally have a monopoly on the issuing of currency, although some states share currencies with other states. For the purposes of this list, only currencies that are legal tender, including those used in actual commerce or issued for commemorative purposes, are considered "circulating currencies". This includes fractional units that have no physical form but are recognized by the issuing state, such as the United States mill, the Egyptian millieme, and the Japanese rin. Currencies used by non-state entities, like the Sovereign Military Order of Malta, scrips used by private entities, and other private, virtual, and alternative currencies are not under the purview of this list. List of circulating currencies by state or territory Currencies by number of countries/territories See also List of currencies List of historical currencies Money Private currency Exchange rate List of countries by exchange rate regime Notes
https://en.wikipedia.org/wiki/William%20John%20Burchell
William John Burchell (23 July 1781 – 23 March 1863) was an English explorer, naturalist, traveller, artist, and author. His thousands of plant specimens, as well as field journals from his South African expedition, are held by Kew Gardens, and his insect collection by the Oxford University Museum. Early life and education William John Burchell was born in Fulham, London, the son of Matthew Burchell, botanist and owner of Fulham Nursery, and his wife. His father owned nine and a half acres of land adjacent to the gardens of Fulham Palace. Burchell served a botanical apprenticeship at Kew and was elected F.L.S. (Fellow of the Linnean Society) in 1803. At about this time, he became enamoured of Lucia Green of Fulham, but faced strong disapproval from his parents when he broached the idea of an engagement. Career On 7 August 1805, Burchell, at the age of 24, sailed for St. Helena aboard the East Indiaman intending to set up there as a merchant with a partner from London, William Balcombe (1779-1829). After a year of trading, Burchell did not want to continue and dissolved the partnership. Three months later, he accepted a position as schoolmaster on the island and later as official botanist. In 1810, he sailed to the Cape of Good Hope in South Africa on the recommendation of Gen. J.W. Janssens to explore and to add to his botanical collection. Burchell's intended wife had jilted him for the captain of the boat taking her to St. Helena to join him. Landing at Table Bay on 26 November 1810, after stormy weather had prevented a landing for 13 days, Burchell set about planning an expedition into the interior. He left Cape Town in June 1811. Burchell travelled in South Africa through 1815, collecting over 50,000 specimens, and covering more than 7000 km, much over unexplored terrain. He described his journey in Travels in the Interior of Southern Africa, a two-volume work appearing in 1822 and 1824. (It was reprinted in 1967 by C. Struik of Cape Town.) He is believed
https://en.wikipedia.org/wiki/Tunicamycin
Tunicamycin is a mixture of homologous nucleoside antibiotics that inhibits the UDP-HexNAc: polyprenol-P HexNAc-1-P family of enzymes. In eukaryotes, this includes the enzyme GlcNAc phosphotransferase (GPT), which catalyzes the transfer of N-acetylglucosamine-1-phosphate from UDP-N-acetylglucosamine to dolichol phosphate in the first step of glycoprotein synthesis. Tunicamycin blocks N-linked glycosylation (N-glycans) and treatment of cultured human cells with tunicamycin causes cell cycle arrest in G1 phase. It is used as an experimental tool in biology, e.g. to induce unfolded protein response. Tunicamycin is produced by several bacteria, including Streptomyces clavuligerus and Streptomyces lysosuperificus. Tunicamycin homologues have varying molecular weights owing to the variability in fatty acid side chain conjugates. Biosynthesis The biosynthesis of tunicamycins was studied in Streptomyces chartreusis and a proposed biosynthetic pathway was characterized. The bacteria utilize the enzymes in the tun gene cluster (TunA-N) to make tunicamycins. TunA uses the starter unit uridine diphosphate-N-acetyl-glucosamine (UDP-GlcNAc) and catalyzes the dehydration of the 6’ hydroxyl group. First, a Tyr residue in TunA abstracts a proton from the 4’ hydroxyl group, forming a ketone at that position. A hydride is subsequently abstracted from the 4’ carbon by NAD+, forming NADH. The ketone is stabilized by hydrogen bonding from the Tyr residue, and a nearby Thr residue. A glutamate residue then abstracts a proton from the 5’ carbon, pushing the electrons up to form a double bond between the 5’ and 6’ carbon. A nearby cysteine donates a proton to the hydroxyl group as it leaves as water. NADH donates a hydride to the 4’ carbon, reforming a hydroxide in that position and forming UDP-6’-deoxy-5-6-ene-GlcNAc. TunF then catalyzes the epimerization of the intermediate to UDP-6’-deoxy-5-6-ene-GalNAc, changing the 4’ hydroxyl from the equatorial to axial position. The ot
https://en.wikipedia.org/wiki/Guild%20hosting%20service
A guild hosting or clan hosting service is a specialized type of web hosting service designed to support online gaming communities, generally referred to as guilds or clans. They differ from game server hosting in that the focus of such companies is to provide applications and communication tools outside the gaming environments themselves. Guild hosting services address a guild's basic need to have an online presence and allow guild members to communicate with each other outside of the game. While it is possible for any guild to do this on their own, setting up and maintaining a site requires constant maintenance, upgrades and integration of new software. One of the key reasons for the popularity of guild hosting services is their focus on relieving the guild from this overhead and freeing them up to spend more time playing the game. Typical features The services typically offered by such a service include: Public and/or private forums for members to communicate between themselves, or other tools for communications such as instant messaging or chat servers. Tools for tracking the roster of characters that a player might have in an MMORPG. An application for scheduling and organizing raids, tournaments and other gaming events. Applications for tracking treasure, items, or points accrued toward redeeming treasure (often referred to as a DKP system). History Originally, most people who decided to create a website for their guild used bulletin board software such as vBulletin and phpBB on traditional web hosting services. However, as the complexity of online games increased, many guilds sought more advanced management features and turned to specialized services to accommodate their needs. Nevertheless, a considerable base of users still employs the older method, as it can be cheaper and gives them the flexibility to be creative in their efforts and, in some cases, to transfer their guild sites between hosting services. Many of the
https://en.wikipedia.org/wiki/XM%20%28file%20format%29
XM, standing for "extended module", is an audio file type introduced by Triton's FastTracker 2. XM introduced multisampling-capable instruments with volume and panning envelopes, sample looping and basic pattern compression. It also expanded the available effect commands and channels, added 16-bit sample support, and offered an alternative frequency table for portamentos. XM is a common format for many module files. The file format was initially documented by its creator in the file XM.TXT, which accompanied the 2.08 release of FastTracker 2, as well as its latest known beta version: 2.09b. The file, written in 1994 and attributed to Mr.H of Triton (Fredrik Huss), bears the header "The XM module format description for XM files version $0104." This documentation is, however, said to be incomplete and insufficient to properly recreate the behaviour of the original program. The MilkyTracker project has expanded the documentation of the XM file format, in an attempt to replicate not only the behaviour of the original software but also its quirks. Their documentation of the XM file format is available on the project's GitHub repository. OXM (oggmod) is a subformat which compresses the XM samples using Vorbis. Supporting Media Players Windows Media Player – supports .XM files as long as the player version is x86 (32-bit) Cowon jetAudio – A freeware audio player for Windows which supports .XM files XMPlay – A freeware audio player for Windows which supports .XM files Foobar2000 – A freeware audio player for Windows that supports .XM files through a plugin. VLC Media Player – An open-source media player for Windows, Linux, & macOS which supports .XM files MusicBee – A freeware audio player for Windows which supports .XM files
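As a small illustration, the sketch below parses the fixed preamble of an XM file: a 17-byte ID text "Extended Module: ", a 20-byte module name, a 0x1A byte, a 20-byte tracker name, and a little-endian version word (version $0104 is stored as the word 0x0104). The offsets follow the commonly circulated XM.TXT layout and should be treated as an assumption here, not a verified specification.

```python
import struct

def parse_xm_preamble(data):
    """Parse the fixed-size preamble of an XM module file.

    Offsets per the widely circulated XM.TXT description (assumed, not
    verified against every tracker): bytes 0-16 ID text, 17-36 module
    name, 37 = 0x1A, 38-57 tracker name, 58-59 little-endian version."""
    if data[0:17] != b"Extended Module: ":
        raise ValueError("not an XM module")
    name = data[17:37].rstrip(b"\x00 ").decode("ascii", "replace")
    tracker = data[38:58].rstrip(b"\x00 ").decode("ascii", "replace")
    (version,) = struct.unpack_from("<H", data, 58)
    return {
        "name": name,
        "tracker": tracker,
        "version": f"{version >> 8}.{version & 0xFF:02x}",
    }
```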
https://en.wikipedia.org/wiki/MHC%20restriction
MHC-restricted antigen recognition, or MHC restriction, refers to the fact that a T cell can interact with a self-major histocompatibility complex molecule and a foreign peptide bound to it, but will only respond to the antigen when it is bound to a particular MHC molecule. When foreign proteins enter a cell, they are broken into smaller pieces called peptides. These peptides, also known as antigens, can derive from pathogens such as viruses or intracellular bacteria. Foreign peptides are brought to the surface of the cell and presented to T cells by proteins called the major histocompatibility complex (MHC). During T cell development, T cells go through a selection process in the thymus to ensure that the T cell receptor (TCR) will not recognize MHC molecules presenting self-antigens, i.e., that its affinity is not too high: high affinity means the T cell will be autoreactive, while no affinity means it will not bind strongly enough to the MHC. The selection process results in developed T cells with specific TCRs that might only respond to certain MHC molecules but not others. The fact that the TCR will recognize only some MHC molecules but not others contributes to "MHC restriction". The biological rationale of MHC restriction is to prevent the generation of superfluous wandering lymphocytes, thereby saving energy and economizing on cell-building materials. T cells are a type of lymphocyte that plays a significant role in the immune system by activating other immune cells. T cells recognize foreign peptides through T-cell receptors (TCRs) on their surface, and then perform different roles depending on the type of T cell they are in order to defend the host from the foreign peptide, which may have come from pathogens like bacteria, viruses or parasites. Enforcing the restriction that T cells are activated by peptide antigens only when the antigens are bound to self-MHC molecules, MHC restriction adds another dimension to the specificity of T cell receptors so that an antigen is rec
https://en.wikipedia.org/wiki/Full%20spectral%20imaging
Full spectral imaging (FSI) is a form of imaging spectroscopy and is the successor to hyperspectral imaging. Full spectral imaging was developed to improve the capabilities of remote sensing including Earth remote sensing. Data acquisition Whereas hyperspectral imaging acquires data as many contiguous spectral bands, full spectral imaging acquires data as spectral curves. A significant advantage of FSI over hyperspectral imaging is a significant reduction in data rate and volume. FSI extracts and saves only the information that is in the raw data. The information is contained in the shape of the spectral curves. The rate at which data is produced by an FSI system is proportional to the amount of information in the scene/image. Applications Full spectral imaging, along with empirical reflectance retrieval and autonomous remote sensing are the components of the new systems for remote sensing and the successor to the Landsat series of satellites of the Landsat program.
https://en.wikipedia.org/wiki/Zego
The ZEGO ("Zest to go") is a rackmount server platform built by Sony, targeted for the video post-production and broadcast markets. The platform is based on Sony's PlayStation 3 as it features both the Cell Processor as well as the RSX 'Reality Synthesizer'. It is aimed to greatly speed up postproduction work (in particular in the computationally extremely taxing 4K resolution), 3D rendering and video processing. In some respects it is rather similar to IBM's QS20/21/22 blades (such as used in the Roadrunner supercomputer that took the top spot in the Top500 in May 2008), although Sony seems to target the DCC (Digital Content Creation) markets rather than scientific like IBM, which can be seen by the inclusion of the RSX graphics processor in the ZEGO platform. ZEGO runs Fixstars's Yellow Dog Enterprise Linux, which was also Sony's favourite Linux distribution for the PlayStation 3. Architecture The architecture is not identical to the PlayStation 3. One difference is that the BCU-100 has 1 GB XDR RAM instead of the PlayStation 3's 256 MB. Video RAM is missing in Sony's system diagrams, but it is listed as 256 MB (like the PlayStation 3) further down in the tech specs. The XDR memory is shared by both the Cell and RSX. Sony uses the SCC (Super Companion Chip) to handle I/O tasks (HDD, USB 2.0, Gigabit Ethernet and other unspecified I/O); the SCC has its own dedicated memory of 1GB DDR2 as well as a Memory Extension Adapter connected via PCI Express that can hold up to 8 GB. Another option for the single PCI express slot is a Video Display Board with a DVI-I output. Further, the Cell in the BCU-100 offers the full 8 SPUs that Cell is manufactured with, as opposed to the 6 SPUs available in the PlayStation 3, which has one SPU disabled to improve manufacturing yields and one reserved for the system. This gives the BCU-100 an extra 33% potential CPU-power (or 51.2 GFLOPS more). History Sony presented its first ZEGO product, the BCU-100, to the public at SIGGRAPH 20
https://en.wikipedia.org/wiki/Health%20information%20on%20the%20Internet
Health information on the Internet refers to all health-related information communicated through or available on the Internet. Description The Internet is widely used by the general public as a tool for finding health information. In the late 1990s, researchers noted an increase in Internet users' access to health-related content despite the variation in the quality of information, level of accessibility, and overall health literacy. Access to health information does not guarantee understanding, as the health literacy of individuals varies. It is believed that patients who know their medical history may learn and interpret this information in a way that benefits them. This, however, is not always the case, because online health information is not always peer reviewed. Physicians worry that patients who conduct Internet research on their medical history are at risk of being misinformed. In 2013, opinions about the relationship between health care providers and online health information were still being established. According to a 2014 study, "The flow of information has fundamentally changed, and physicians have less control over health information relayed to patients. Not surprisingly, this paradigm shift has elicited varied and sometimes conflicting views about the value of the Internet as a tool to improve health care." Importance of the physician-patient relationship In cases in which a physician has difficulty explaining complicated medical concepts to a patient, that patient may be inclined to seek information on the internet. A consensus exists that patients should have shared decision making, meaning that patients should be able to make informed decisions about the direction of their medical treatment in collaboration with their physician. Rich, educated, and socially advantaged patients may enjoy the benefits of the shared decision-making approach more than those of a lower socioeconomic class or minority status. Patients' naive understanding of their healt
https://en.wikipedia.org/wiki/Boyer%E2%80%93Moore%20majority%20vote%20algorithm
The Boyer–Moore majority vote algorithm is an algorithm for finding the majority of a sequence of elements using linear time and a constant number of words of memory. It is named after Robert S. Boyer and J Strother Moore, who published it in 1981, and is a prototypical example of a streaming algorithm. In its simplest form, the algorithm finds a majority element, if there is one: that is, an element that occurs repeatedly for more than half of the elements of the input. A version of the algorithm that makes a second pass through the data can be used to verify that the element found in the first pass really is a majority. If a second pass is not performed and there is no majority, the algorithm will not detect that no majority exists. In the case that no strict majority exists, the returned element can be arbitrary; it is not guaranteed to be the element that occurs most often (the mode of the sequence). It is not possible for a streaming algorithm to find the most frequent element in less than linear space, for sequences whose number of repetitions can be small. Description The algorithm maintains in its local variables a sequence element m and a counter c, with the counter initially zero. It then processes the elements of the sequence, one at a time. When processing an element x, if the counter is zero, the algorithm stores x as its remembered sequence element m and sets the counter to one. Otherwise, it compares x to the stored element m and either increments the counter (if they are equal) or decrements the counter (otherwise). At the end of this process, if the sequence has a majority, it will be the element stored by the algorithm. This can be expressed in pseudocode as the following steps: Initialize an element m and a counter c with c = 0. For each element x of the input sequence: if c = 0, then assign m = x and c = 1; else if m = x, then assign c = c + 1; else assign c = c − 1. Return m. Even when the input sequence has no majority, the algorithm will report one of the sequence elements as its result. However, it i
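The one-pass vote and the optional verification pass described above can be sketched in Python (an illustrative implementation; the function names are my own, not from the source):

```python
def majority_vote(seq):
    """Boyer-Moore majority vote: one pass, O(1) extra space.

    Returns a candidate that is the majority element if one exists;
    otherwise the returned element is arbitrary.
    """
    m, c = None, 0
    for x in seq:
        if c == 0:          # no remembered element: adopt x
            m, c = x, 1
        elif m == x:        # matches the remembered element
            c += 1
        else:               # differs from the remembered element
            c -= 1
    return m

def majority(seq):
    """Second pass: confirm the candidate is a strict majority."""
    m = majority_vote(seq)
    if m is not None and sum(1 for x in seq if x == m) * 2 > len(seq):
        return m
    return None             # no strict majority exists
```

For example, majority([1, 2, 1, 1, 3, 1]) returns 1, while majority([1, 2, 3]) returns None, illustrating why the verification pass is needed when a majority is not guaranteed.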
https://en.wikipedia.org/wiki/Big-little-big%20lemma
In the mathematics of paper folding, the big-little-big lemma is a necessary condition for a crease pattern with specified mountain folds and valley folds to be able to be folded flat. It differs from Kawasaki's theorem, which characterizes the flat-foldable crease patterns in which a mountain-valley assignment has not yet been made. Together with Maekawa's theorem on the total number of folds of each type, the big-little-big lemma is one of the two main conditions used to characterize the flat-foldability of mountain-valley assignments for crease patterns that meet the conditions of Kawasaki's theorem. Mathematical origami expert Tom Hull calls the big-little-big lemma "one of the most basic rules" for flat foldability of crease patterns. Statement The lemma concerns the angles made by consecutive pairs of creases at a single vertex of the crease pattern. It states that if any one of these angles is a local minimum (that is, smaller than the two angles on either side of it), then exactly one of the two creases bounding the angle must be a mountain fold and exactly one must be a valley fold. Generalization and applications A generalized version of the lemma holds for a sequence of equal angles at a single vertex, surrounded on both sides by a larger angle. For such a sequence, the number of mountain and valley folds bounding any of these angles must either be equal, or differ by one. It can be used as part of a linear time algorithm that tests whether a folding pattern with a single vertex can be folded flat, by repeatedly looking for sequences of angles that obey the lemma and pinching them off, until either getting stuck or reducing the input to two equal angles bounded by two creases of the same type as each other. History In their book Geometric Folding Algorithms, Erik Demaine and Joe O'Rourke credit the lemma to publications of Toshikazu Kawasaki in 1989, and Jacques Justin in 1994.
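The condition in the statement can be checked mechanically at a single vertex. A small Python sketch (the indexing convention, with angles[i] lying between creases i and i+1 modulo n, is an assumption of this illustration, not part of the source):

```python
def satisfies_big_little_big(angles, creases):
    """Check the basic big-little-big condition at one vertex.

    angles[i] is the sector angle between crease i and crease (i+1) % n;
    creases[i] is 'M' (mountain) or 'V' (valley).  Returns False if some
    strict local-minimum angle is bounded by two creases of the same type.
    """
    n = len(angles)
    for i in range(n):
        is_local_min = angles[i] < angles[(i - 1) % n] and angles[i] < angles[(i + 1) % n]
        if is_local_min and creases[i] == creases[(i + 1) % n]:
            return False    # both bounding creases have the same type
    return True
```

For instance, with sector angles [100, 60, 100, 100] the 60° angle is a strict local minimum, so the two creases bounding it must be one mountain and one valley; the lemma is only a necessary condition, so a True result does not by itself imply flat-foldability.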
https://en.wikipedia.org/wiki/Discontinuity%20layout%20optimization
Discontinuity layout optimization (DLO) is an engineering analysis procedure which can be used to directly establish the amount of load that can be carried by a solid or structure prior to collapse. Using DLO the layout of failure planes, or 'discontinuities', in a collapsing solid or structure are identified using mathematical optimization methods (hence the name, 'discontinuity layout optimization'). It is assumed that failure occurs in a ductile or 'plastic' manner. How it works The DLO procedure involves a number of steps, as outlined below. The set of potential discontinuities can include discontinuities which crossover one another, allowing complex failure patterns to be identified (e.g. involving ‘fan’ mechanisms, where many discontinuities radiate from a point). DLO can be formulated in terms of equilibrium relations ('static' formulation) or in terms of displacements ('kinematic' formulation). In the latter case the objective of the mathematical optimization problem is to minimize the internal energy dissipated along discontinuities, subject to nodal compatibility constraints. This can be solved using efficient linear programming techniques and, when combined with an algorithm originally developed for truss layout optimization problems, it has been found that modern computer power can be used to directly search through very large numbers of different failure mechanism topologies (up to approx. 21,000,000,000 different topologies on current generation PCs). A full description of the application of DLO to plane strain problems has been provided by Smith and Gilbert, to masonry arch analysis by Gilbert et al, to slab problems by Gilbert et al, and to 3D problems by Hawksbee et al, and Zhang. DLO vs FEM Whereas with finite element analysis (FEM), a widely used alternative engineering analysis procedure, mathematical relations are formed for the underlying continuum mechanics problem, DLO involves analysis of a potentially much simpler discontinuum problem,
https://en.wikipedia.org/wiki/Cheilanthoideae
Cheilanthoideae is one of the five subfamilies of the fern family Pteridaceae. The subfamily is thought to be monophyletic, but some of the genera into which it has been divided are not, and the taxonomic status of many of its genera and species remains uncertain, with radically different approaches in use . Phylogenic relationships The following phylogram shows a likely relationship between Cheilanthoideae and the other Pteridaceae subfamilies. Although subfamily Cheilanthoideae itself is thought to be monophyletic, many of the genera into which it has been divided (including Cheilanthes, Doryopteris, Notholaena, and Pellaea) have been shown to be polyphyletic. Genera The division of the subfamily Cheilanthoideae into genera and species remains uncertain . Christenhusz et al. (2011), the Pteridophyte Phylogeny Group classification of 2016 (PPG I), and the November 2019 version of the Checklist of Ferns and Lycophytes of the World (World Ferns 8.11) agree on the following genera: Adiantopsis Fée Aleuritopteris Fée Argyrochosma (J.Sm.) Windham Aspidotis (Nutt. ex Hooker) Copel. Astrolepis D.M.Benham & Windham Bommeria E.Fourn. Calciphilopteris Yesilyurt & H.Schneid. Cheilanthes Sw. (nom. cons.) Cheiloplecton Fée Doryopteris J.Sm. (nom. cons.) Gaga Pryer, F.W. Li & Windham Hemionitis L. Lytoneuron (Klotzsch) Yesilyurt Mildella Trev. Myriopteris Fée Notholaena R.Br. Ormopteris J.Sm. Paragymnopteris K.H.Shing Pellaea Link (nom. cons.) Pentagramma Yatsk., Windham & E.Wollenw. Trachypteris André ex Christ. Some other genera that have been included in the subfamily (or split off from genera included in the subfamily) are: Allosorus Bernh. – not included by Christenhusz et al. (2011); accepted in PPG I; World Ferns 8.11 includes it in Aleuritopteris, saying that it was wrongly typified Baja Windham & L.O.George – accepted by World Ferns 8.11 Mickelopteris Fraser-Jenk. – accepted by World Ferns 8.11 Oeosporangium Vis. – included in Allosorus in
https://en.wikipedia.org/wiki/Present%20value
In economics and finance, present value (PV), also known as present discounted value, is the value of an expected income stream determined as of the date of valuation. The present value is usually less than the future value because money has interest-earning potential, a characteristic referred to as the time value of money, except during times of negative interest rates, when the present value will be equal to or more than the future value. Time value can be described with the simplified phrase, "A dollar today is worth more than a dollar tomorrow". Here, 'worth more' means that its value today is greater than its value tomorrow. A dollar today is worth more than a dollar tomorrow because the dollar can be invested and earn a day's worth of interest, making the total accumulate to a value more than a dollar by tomorrow. Interest can be compared to rent. Just as rent is paid to a landlord by a tenant without the ownership of the asset being transferred, interest is paid to a lender by a borrower who gains access to the money for a time before paying it back. By letting the borrower have access to the money, the lender has sacrificed the exchange value of this money, and is compensated for it in the form of interest. The initial amount of borrowed funds (the present value) is less than the total amount of money paid to the lender. Present value calculations, and similarly future value calculations, are used to value loans, mortgages, annuities, sinking funds, perpetuities, bonds, and more. These calculations are used to make comparisons between cash flows that do not occur at the same time, since times and dates must be consistent in order to make comparisons between values. When deciding between projects in which to invest, the choice can be made by comparing respective present values of such projects by means of discounting the expected income streams at the corresponding project interest rate, or rate of return. The project with the highest present value, i.e. that is most valua
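For a single cash flow n periods in the future, the discounting described above reduces to PV = FV / (1 + r)^n; a stream of cash flows is valued by summing the discounted terms. A minimal Python sketch (the rates and cash flows below are illustrative only):

```python
def present_value(future_value, rate, periods):
    """Discount a single future cash flow back to the valuation date."""
    return future_value / (1 + rate) ** periods

def stream_present_value(rate, cash_flows):
    """Present value of a stream of cash flows, one per period,
    the first arriving one period from now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
```

At a 10% periodic rate, a payment of 110 due one period from now has a present value of 100, which is the sense in which "a dollar today is worth more than a dollar tomorrow".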
https://en.wikipedia.org/wiki/Cranial%20fossa
A cranial fossa is formed by the floor of the cranial cavity. There are three distinct cranial fossae: Anterior cranial fossa (fossa cranii anterior), housing the projecting frontal lobes of the brain Middle cranial fossa (fossa cranii media), separated from the posterior fossa by the clivus and the petrous crest housing the temporal lobe Posterior cranial fossa (fossa cranii posterior), between the foramen magnum and tentorium cerebelli, containing the brainstem and cerebellum Additional images See also Fossa (anatomy) Brain anatomy
https://en.wikipedia.org/wiki/Geophilus%20monoporus
Geophilus monoporus is a species of soil centipede in the family Geophilidae found in Tiba, Japan. It grows up to 18 millimeters in length and is named for the single pore at the base of the final leg pair.
https://en.wikipedia.org/wiki/38th%20meridian%20east
The meridian 38° east of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Europe, Asia, Africa, the Indian Ocean, the Southern Ocean, and Antarctica to the South Pole. The 38th meridian east forms a great circle with the 142nd meridian west. From Pole to Pole Starting at the North Pole and heading south to the South Pole, the 38th meridian east passes through:
{| class="wikitable plainrowheaders"
! scope="col" width="125" | Co-ordinates
! scope="col" | Country, territory or sea
! scope="col" | Notes
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Arctic Ocean
| style="background:#b0e0e6;" |
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Queen Victoria Sea
| style="background:#b0e0e6;" |
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Barents Sea
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
| Kola Peninsula
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | White Sea
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
| Onega Peninsula
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | White Sea
| style="background:#b0e0e6;" | Onega Bay
|-
|
! scope="row" |
|
|-
|
! scope="row" |
| Passing just east of Donetsk
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Sea of Azov
| style="background:#b0e0e6;" | Taganrog Bay
|-
|
! scope="row" |
|
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Sea of Azov
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
|
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Black Sea
| style="background:#b0e0e6;" |
|-
|
! scope="row" |
|
|-
|
! scope="row" |
|
|-
|
! scope="row" |
|
|-
|
! scope="row" |
|
|-
| style="background:#b0e0e6;" |
! scope="row" style="background:#b0e0e6;" | Red Sea
| style="background:#b0e0e6;" |
|-
https://en.wikipedia.org/wiki/Ephyriades%20arcas
Ephyriades arcas, the Caribbean duskywing, is a butterfly of the family Hesperiidae. It is found in Central America and the Caribbean, the type specimen being described from Saint Kitts. Description Upper side: Thorax and abdomen black. Wings very dark brownish black, immaculate. Margins entire. Under side: Legs, breast, and abdomen dark brown, but rather lighter than on the upper side, immaculate, except a small white spot on the anterior, placed near the anterior edge towards the tip. Subspecies Ephyriades arcas arcas Ephyriades arcas philemon (Fabricius, 1775) (Cuba, Bahamas)
https://en.wikipedia.org/wiki/Orbiton
Orbitons are one of three quasiparticles, along with holons and spinons, that electrons in solids are able to split into during the process of spin–charge separation, when extremely tightly confined at temperatures close to absolute zero. The electron can always be theoretically considered as a bound state of the three, with the spinon carrying the spin of the electron, the orbiton carrying the orbital location and the holon carrying the charge, but in certain conditions they can become deconfined and behave as independent particles. Overview Orbitons can be thought of as energy stored in an orbital occupancy that can move throughout a material, in other words, an orbital-based excitation. An orbiton propagates through a material as a series of orbital excitations and relaxations of the electrons in a material without changes in either the spin of those electrons or the charge at any point in the material. Electrons, being of like charge, repel each other. As a result, in order to move past each other in an extremely crowded environment, they are forced to modify their behavior. Research published in July 2009 by the University of Cambridge and the University of Birmingham in England showed that electrons could jump from the surface of a metal onto a closely located quantum wire by quantum tunneling, and upon doing so, will separate into two quasiparticles, named spinons and holons by the researchers. The orbiton was predicted theoretically by van den Brink, Khomskii and Sawatzky in 1997–1998. Its experimental observation as a separate quasiparticle was reported in paper sent to publishers in September 2011. The research states that firing a beam of X-ray photons at a single electron in a one-dimensional sample of strontium cuprate will excite the electron into a higher orbital, causing the beam to lose a fraction of its energy in the process before it rebounds. In doing so, the electron is separated into a spinon and an orbiton. See also Condensed matter phys
https://en.wikipedia.org/wiki/Nyquist%20ISI%20criterion
In communications, the Nyquist ISI criterion describes the conditions which, when satisfied by a communication channel (including responses of transmit and receive filters), result in no intersymbol interference or ISI. It provides a method for constructing band-limited functions to overcome the effects of intersymbol interference. When consecutive symbols are transmitted over a channel by a linear modulation (such as ASK, QAM, etc.), the impulse response (or equivalently the frequency response) of the channel causes a transmitted symbol to be spread in the time domain. This causes intersymbol interference because the previously transmitted symbols affect the currently received symbol, thus reducing tolerance for noise. The Nyquist theorem relates this time-domain condition to an equivalent frequency-domain condition. The Nyquist criterion is closely related to the Nyquist–Shannon sampling theorem, with only a differing point of view. Nyquist criterion If we denote the channel impulse response as h(t), then the condition for an ISI-free response can be expressed as: h(nT_s) = 1 for n = 0, and h(nT_s) = 0 for all other integers n, where T_s is the symbol period. The Nyquist theorem says that this is equivalent to: (1/T_s) ∑_k H(f − k/T_s) = 1 for all f, where H(f) is the Fourier transform of h(t). This is the Nyquist ISI criterion. This criterion can be intuitively understood in the following way: frequency-shifted replicas of H(f) must add up to a constant value. This condition is satisfied when the spectrum H(f) has even symmetry, has bandwidth less than or equal to 1/T_s, and its single-sideband has odd symmetry at the cutoff frequency 1/(2T_s). In practice this criterion is applied to baseband filtering by regarding the symbol sequence as weighted impulses (Dirac delta function). When the baseband filters in the communication system satisfy the Nyquist criterion, symbols can be transmitted over a channel with flat response within a limited frequency band, without ISI. Examples of such baseband filters are the raised-cosine filter, or the sinc filter as the ideal case. Der
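The time-domain condition can be verified numerically for the ideal sinc filter mentioned above. A quick Python sketch, assuming a symbol period of T_s = 1 (an illustration, not from the source):

```python
import math

def sinc(t):
    """Normalized sinc: sin(pi*t) / (pi*t), with sinc(0) = 1."""
    return 1.0 if t == 0 else math.sin(math.pi * t) / (math.pi * t)

# With symbol period T_s = 1, h(t) = sinc(t) satisfies the Nyquist ISI
# criterion: h(0) = 1 and h(n) = 0 (to floating-point precision) at
# every other integer n, so neighboring symbols do not interfere at
# the sampling instants.
samples = [sinc(n) for n in range(-5, 6)]
```

The peak sample is 1 and the other integer samples vanish, which is exactly the condition h(nT_s) = δ[n] stated above; the raised-cosine filter trades excess bandwidth for faster decay while preserving these zero crossings.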
https://en.wikipedia.org/wiki/List%20of%20multivariable%20calculus%20topics
This is a list of multivariable calculus topics. See also multivariable calculus, vector calculus, list of real analysis topics, list of calculus topics. Closed and exact differential forms Contact (mathematics) Contour integral Contour line Critical point (mathematics) Curl (mathematics) Current (mathematics) Curvature Curvilinear coordinates Del Differential form Differential operator Directional derivative Divergence Divergence theorem Double integral Equipotential surface Euler's theorem on homogeneous functions Exterior derivative Flux Frenet–Serret formulas Gauss's law Gradient Green's theorem Green's identities Harmonic function Helmholtz decomposition Hessian matrix Hodge star operator Inverse function theorem Irrotational vector field Isoperimetry Jacobian matrix Lagrange multiplier Lamellar vector field Laplacian Laplacian vector field Level set Line integral Matrix calculus Mixed derivatives Monkey saddle Multiple integral Newtonian potential Parametric equation Parametric surface Partial derivative Partial differential equation Potential Real coordinate space Saddle point Scalar field Solenoidal vector field Stokes' theorem Submersion Surface integral Symmetry of second derivatives Taylor's theorem Total derivative Vector field Vector operator Vector potential list Mathematics-related lists Outlines of mathematics and logic Outlines
https://en.wikipedia.org/wiki/Hardy%20Rodenstock
Meinhard Görke, known as Hardy Rodenstock (7 December 1941 – 19 May 2018) was a German publisher and manager of pop and Schlager music, and a prominent wine collector, connoisseur, and trader, with a special interest in old and rare wines. He became famous for his allegedly uncanny ability to track down old and very rare wines, and for arranging extravagant wine tastings featuring these wines. It has been alleged that Rodenstock was the perpetrator of an elaborate wine fraud. In 1992, a German court found that Rodenstock had "knowingly offered adulterated wine" for sale. On appeal, the case was settled out of court. Rare wine tastings From 1980, Rodenstock arranged annual high-profile wine tastings of old and rare wines from his collections to which he invited friends and other prominent people. The tastings would be weekend tastings held at gourmet restaurants, hotels, and resorts, and they featured huge quantities of wine at Rodenstock's expense. The participants included German celebrities and later, expanded to include some of the most prominent international wine critics. The most famous Rodenstock tasting was held from 30 August to 5 September 1998 at Hotel Königshof in Munich, when a tasting of 125 vintages of Château d'Yquem, the oldest of which were of the 1784 vintage, was held. Two eighteenth-century, forty nineteenth-century, and all released twentieth-century vintages of Château d'Yquem up to 1991 were featured in this vertical tasting, which was conducted over the course of a week. The events of the week included five luncheons, seven dinners, and more than 175 other wines. It is most likely the most extensive Yquem tasting to that date and it has been the subject of a book. The exclusive nature of the wine selection featured at Rodenstock's tastings is indicated by the fact that Michael Broadbent, who was considered to be the world's leading authority on old wines, had tasted many of the rarest and oldest wines at Rodenstock's tastings, in particu
https://en.wikipedia.org/wiki/Quadrics%20%28company%29
Quadrics was a supercomputer company formed in 1996 as a joint venture between Alenia Spazio and the technical team from Meiko Scientific. They produced hardware and software for clustering commodity computer systems into massively parallel systems. Their highpoint was in June 2003 when six out of the ten fastest supercomputers in the world were based on Quadrics' interconnect. They officially closed on June 29, 2009. Company history The Quadrics name was first used in 1993 for a commercialized version of the APE100 SIMD parallel computer produced by Alenia Spazio and originally developed by INFN, the Italian National Institute of Nuclear Physics. In 1996, a new Alenia subsidiary, Quadrics Supercomputers World (QSW) was formed, based in Bristol, UK and Rome, Italy, inheriting the Quadrics SIMD product line and the Meiko CS-2 massively parallel supercomputer architecture. In 2002 the company name was shortened to be simply Quadrics. Initially, the new company focussed on the development potential of the CS-2's processor interconnect technology. Their first design was the Elan2 network ASIC, intended for use with the UltraSPARC CPU, attached to it using the Ultra Port Architecture (UPA) system bus. Plans to introduce the Elan2 were later dropped, and a new Elan3 hosted on PCI introduced instead. By the time of its release Elan3 had been re-aimed at the Alpha/PCI market instead, after Quadrics had formed a relationship with Digital Equipment Corporation (DEC). The combination of Quadrics and Alpha 21264 (EV6) microprocessors proved very successful, and Digital/Compaq rapidly became one of the world's largest suppliers of supercomputers. This culminated with the building the largest machine in the US, the 20 TFLOP ASCI Q, installed at Los Alamos National Laboratory during 2002 and 2003. The machine consisted of 2,048 AlphaServer SC nodes (which are based on AlphaServer ES45), each with four 1.25 GHz Alpha 21264A (EV67) microprocessors and two rails of the Quadrics Q
https://en.wikipedia.org/wiki/2008%20Chinese%20heparin%20adulteration
The 2008 Chinese heparin adulteration refers to heparin adulteration incidents that occurred in the United States of America in 2008. Pharmaceutical company Baxter International subcontracted the creation of precursor chemicals of heparin to Scientific Protein Laboratories, an American company with production facilities located in China. Scientific Protein Laboratories then used counterfeit precursors to create the chemicals ordered. Baxter then sold this adulterated heparin in the US, which killed 81 people and left 785 severely injured. This caught the attention of the media and the U.S. Food and Drug Administration, leading to numerous ongoing lawsuits. Overview The raw material for the recalled heparin batches was processed in China from pig intestines by the American pharmaceutical firm Scientific Protein Laboratories. The U.S. Food and Drug Administration was quoted as stating that at least 81 deaths were believed to be linked to a raw heparin ingredient imported from the People's Republic of China, and that they had also received 785 reports of serious injuries associated with the drug's use. According to The New York Times, "problems with heparin reported to the agency include difficulty breathing, nausea, vomiting, excessive sweating and rapidly falling blood pressure that in some cases led to life-threatening shock." In March 2008, due to contamination of the raw heparin stock imported from mainland China, major recalls of heparin, a substance widely used as an injectable anticoagulant, were announced by the U.S. Food and Drug Administration (FDA). Upon investigation of these adverse events by the FDA, academic institutions, and the involved pharmaceutical companies, the contaminant was identified as an "over-sulfated" derivative of chondroitin sulfate, a closely related substance obtained from mammal or fish cartilage and often used as a treatment for arthritis. Since over-sulfated chondroitin is not a naturally occurring molecule, costs a fraction of tr
https://en.wikipedia.org/wiki/2003%20ricin%20letters
The 2003 ricin letters were two ricin-laden letters found on two occasions between October and November 2003. One letter was mailed to the White House and intercepted at a processing facility; another was discovered with no address in South Carolina. A February 2004 ricin incident at the Dirksen Senate Office Building was initially connected to the 2003 letters as well. The letters were sent by an individual who referred to themselves as "Fallen Angel". The sender, who claimed to own a trucking company, expressed anger over changes in federal trucking regulations. As of 2008, no connection between the Fallen Angel letters and the Dirksen building incident has been established. A $100,000 reward was offered in 2004 by the federal law enforcement agencies investigating the case, but to date the reward remains unclaimed. Background Ricin Ricin is a white powder that can be produced as a liquid or a crystal. Ricin is an extremely toxic plant protein that can cause severe allergic reactions, and exposure to small quantities can be fatal. The toxin inhibits the formation of proteins within cells of exposed individuals. The U.S. Centers for Disease Control and Prevention (CDC) states that 500 micrograms is the minimum lethal dose of ricin in humans provided that exposure is from injection or inhalation. Ricin is easily purified from castor-oil manufacturing waste. It has been utilized by various states and organizations as a weapon, being most effective as an assassination weapon, notably in the case of the 1978 assassination of Bulgarian dissident Georgi Markov. Trucking regulations On January 4, 2004 new federal transportation rules took effect which directly affected the over-the-road trucking industry in the United States. The rules took effect with a 60-day grace period and were aimed at reducing fatigue related accidents and fatalities. Called the most far-reaching rule changes in 65 years, the regulations reduced daily allowed driving time from 11 hours to 10.
https://en.wikipedia.org/wiki/Beer%20Barrel%20Man
The Barrelman is a mascot logo used by two baseball teams in Milwaukee nicknamed "Brewers". Introduction The character was first used in the 1940s by the Milwaukee Brewers, a Minor League Baseball team based in Milwaukee, Wisconsin. At the time, he was known as "Owgust". With a beer barrel for a torso and tap for his nose, the Beer Barrel Man embodied the whimsical spirit of the minor leagues in the early to mid-twentieth century. In the 1940s and 1950s, a whole series of Beer Barrel Men were used as logos by the club – pitching, batting, fielding balls and running the bases. The December 1944 issue of Brewer News, the club's newsletter, depicted Owgust in a Santa Claus suit and long white beard. The Beer Barrel Man was used until spring training of 1953, when the Boston Braves displaced the Brewers in Milwaukee. Major Leagues After the Braves moved to Atlanta after the 1965 season, former Braves minority owner Bud Selig announced the formation of a group to bring major league baseball club back to Milwaukee, adopting the batting Beer Barrel Man as his organization's logo. When Selig's group was awarded the bankrupt American League Seattle Pilots franchise, he moved them to Milwaukee and the Beer Barrel Man made a comeback as the first logo of the new Milwaukee Brewers. The Beer Barrel Man was used by the American League club through the 1977 season. Legacy Since then, he has made appearances on stadium giveaways, such as the 1999 Turn Ahead the Clock promotion, and has found new life on Cooperstown Collection merchandise. The Beer Barrel Man was also featured in the winning design for the Brewers' "Design A Youniform" contest in 2013. The contest received nearly 700 entries and the winning design, created by Ben Peters of Richfield, Minnesota, used the Beer Barrelman as the cap logo and sleeve patch. This design was used in exhibitions games on March 22 in Arizona against the Chicago Cubs and once again March 30 in a game at Miller Park in Milwaukee ag
https://en.wikipedia.org/wiki/Vicarious%20Hypothesis
The Vicarious Hypothesis, or hypothesis vicaria, was a planetary hypothesis proposed by Johannes Kepler to describe the motion of Mars. The hypothesis adopted the circular orbit and equant of Ptolemy's planetary model as well as the heliocentrism of the Copernican model. Calculations using the Vicarious Hypothesis did not support a circular orbit for Mars, leading Kepler to propose elliptical orbits as one of three laws of planetary motion in Astronomia Nova. History In 1600, Johannes Kepler met and began working with Tycho Brahe at Benátky, a town north of Prague where Brahe's new observatory was being built. Brahe assigned Kepler the task of modeling the motion of Mars using only data that Brahe had collected himself. Upon the death of Brahe in 1601, all of Brahe's data was willed to Kepler. Brahe's observational data was among the most accurate of his time, which Kepler used in the construction of the Vicarious Hypothesis. Predecessors Ptolemy Claudius Ptolemy's planetary model consisted of a stationary earth surrounded by fixed circles, called deferents, which carried smaller, rotating circles called epicycles. Planets rotated on the epicycles as the epicycles traveled along the deferent. Ptolemy shifted the Earth away from the center of the deferent and introduced another point, the equant, equidistant to the deferent's center on the opposite side of the Earth. The Vicarious Hypothesis uses a circular orbit for Mars and reintroduces a form of the equant to describe the motion of Mars with constant angular speed. Copernicus Nicolaus Copernicus broke from the geocentric model of Ptolemy by placing the Sun at the center of his planetary model. However, Copernicus retained circular orbits for the planets and added an orbit for the Earth, insisting that the Earth revolved around the Sun. The Sun was positioned off-center of the orbits but was still contained within all orbits. Kepler adopted Copernican heliocentrism in the construction of the Vicarious Hyp
https://en.wikipedia.org/wiki/Constrained%20optimization
In mathematical optimization, constrained optimization (in some contexts called constraint optimization) is the process of optimizing an objective function with respect to some variables in the presence of constraints on those variables. The objective function is either a cost function or energy function, which is to be minimized, or a reward function or utility function, which is to be maximized. Constraints can be either hard constraints, which set conditions for the variables that are required to be satisfied, or soft constraints, which have some variable values that are penalized in the objective function if, and based on the extent that, the conditions on the variables are not satisfied. Relation to constraint-satisfaction problems The constrained-optimization problem (COP) is a significant generalization of the classic constraint-satisfaction problem (CSP) model. COP is a CSP that includes an objective function to be optimized. Many algorithms are used to handle the optimization part. General form A general constrained minimization problem may be written as follows: minimize f(x) subject to g_i(x) = c_i for i = 1, …, n and h_j(x) ≥ d_j for j = 1, …, m, where g_i(x) = c_i and h_j(x) ≥ d_j are constraints that are required to be satisfied (these are called hard constraints), and f(x) is the objective function that needs to be optimized subject to the constraints. In some problems, often called constraint optimization problems, the objective function is actually the sum of cost functions, each of which penalizes the extent (if any) to which a soft constraint (a constraint which is preferred but not required to be satisfied) is violated. Solution methods Many constrained optimization algorithms can be adapted to the unconstrained case, often via the use of a penalty method. However, search steps taken by the unconstrained method may be unacceptable for the constrained problem, leading to a lack of convergence. This is referred to as the Maratos effect. Equality constraints Substitution method For very simple problems, say a function of two variables subject to
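The penalty method mentioned under "Solution methods" replaces the constrained problem with an unconstrained one whose objective adds a term penalizing constraint violation. A toy Python illustration (the example problem, step size, and iteration count are my own choices, not a production solver):

```python
def penalized_minimize(mu, steps=500, lr=1e-4):
    """Gradient descent on the penalized objective
    F(x, y) = x^2 + y^2 + mu * (x + y - 1)^2,
    approximating: minimize x^2 + y^2 subject to x + y = 1."""
    x = y = 0.0
    for _ in range(steps):
        viol = x + y - 1.0                        # constraint violation
        # gradient of F: (2x + 2*mu*viol, 2y + 2*mu*viol)
        x, y = x - lr * (2 * x + 2 * mu * viol), y - lr * (2 * y + 2 * mu * viol)
    return x, y

# A larger penalty weight mu enforces the constraint more tightly;
# the true constrained optimum here is (0.5, 0.5).
x, y = penalized_minimize(mu=1000.0)
```

With mu = 1000 the iterate lands near (0.5, 0.5) with a small residual constraint violation; exact feasibility is recovered only in the limit of increasing mu, which is the usual trade-off of penalty methods.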
https://en.wikipedia.org/wiki/Synchronous%20Data%20Link%20Control
Synchronous Data Link Control (SDLC) is a computer communications protocol. It is the layer 2 protocol for IBM's Systems Network Architecture (SNA). SDLC supports multipoint links as well as error correction. It also runs under the assumption that an SNA header is present after the SDLC header. SDLC was mainly used by IBM mainframe and midrange systems; however, implementations exist on many platforms from many vendors. In the United States and Canada, SDLC can be found in traffic control cabinets. In 1975, IBM developed the first bit-oriented protocol, SDLC, from work done for IBM in the early 1970s. This de facto standard was adopted by ISO as High-Level Data Link Control (HDLC) in 1979 and by ANSI as Advanced Data Communication Control Procedures (ADCCP). The latter standards added features such as the Asynchronous Balanced Mode and frame sizes that did not need to be a multiple of eight bits, but also removed some of the procedures and messages (such as the TEST message). SDLC operates independently on each communications link, and can operate on point-to-point, multipoint, or loop facilities, on switched or dedicated, two-wire or four-wire circuits, and with full-duplex and half-duplex operation. A unique characteristic of SDLC is its ability to mix half-duplex secondary stations with full-duplex primary stations on four-wire circuits, thus reducing the cost of dedicated facilities. Intel used SDLC as a base protocol for BITBUS, still popular in Europe as a fieldbus, and included support in several controllers (i8044/i8344, i80152). The 8044 controller is still in production by third-party vendors. Other vendors putting hardware support for SDLC (and the slightly different HDLC) into communication controller chips of the 1980s included Zilog, Motorola, and National Semiconductor. As a result, a wide variety of equipment used it, and it was very common in the mainframe-centric corporate networks that were the norm in the 1980s. The most common
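Bit-oriented protocols in the SDLC/HDLC family delimit frames with the flag sequence 01111110 and use zero-bit insertion ("bit stuffing") so the payload can never imitate the flag. The sketch below is my own illustration of that general technique; real controllers do this in hardware.

```python
# Zero-bit insertion: after five consecutive 1 bits in the frame body, the
# sender inserts a 0; the receiver removes it. This guarantees six 1s in a
# row only ever appear in the 01111110 flag that delimits frames.

FLAG = "01111110"

def stuff(bits: str) -> str:
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == "1" else 0
        if run == 5:          # five 1s in a row: insert a 0
            out.append("0")
            run = 0
    return "".join(out)

def unstuff(bits: str) -> str:
    out, run, i = [], 0, 0
    while i < len(bits):
        b = bits[i]
        out.append(b)
        run = run + 1 if b == "1" else 0
        if run == 5:          # the next bit must be a stuffed 0
            i += 1            # skip the inserted 0
            run = 0
        i += 1
    return "".join(out)

payload = "0111111101111101"
framed = FLAG + stuff(payload) + FLAG
print(framed)
print(unstuff(stuff(payload)) == payload)  # True: stuffing round-trips
```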
https://en.wikipedia.org/wiki/Gummel%E2%80%93Poon%20model
The Gummel–Poon model is a model of the bipolar junction transistor. It was first described in an article published by Hermann Gummel and H. C. Poon at Bell Labs in 1970. The Gummel–Poon model and modern variants of it are widely used in popular circuit simulators such as SPICE. A significant effect that the Gummel–Poon model accounts for is the variation of the transistor's forward and reverse current gains, βF and βR, with the direct current level. When certain parameters are omitted, the Gummel–Poon model reduces to the simpler Ebers–Moll model. Model parameters Spice Gummel–Poon model parameters See also Gummel plot
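When its extra parameters are omitted, the Gummel–Poon model reduces to the Ebers–Moll model mentioned above. A minimal sketch of the Ebers–Moll equations in transport form follows; the parameter values (IS, the betas) are illustrative choices of mine, not from the article.

```python
import math

# Ebers-Moll (transport form): the model the Gummel-Poon model reduces to.
IS = 1e-15       # saturation current, A (illustrative)
BETA_F = 100.0   # forward current gain (illustrative)
BETA_R = 1.0     # reverse current gain (illustrative)
VT = 0.02585     # thermal voltage at ~300 K, V

def collector_current(vbe, vbc):
    icc = IS * (math.exp(vbe / VT) - math.exp(vbc / VT))   # transport current
    return icc - (IS / BETA_R) * (math.exp(vbc / VT) - 1)  # minus b-c diode

def base_current(vbe, vbc):
    return (IS / BETA_F) * (math.exp(vbe / VT) - 1) + \
           (IS / BETA_R) * (math.exp(vbc / VT) - 1)

# Forward-active bias: base-emitter forward, base-collector reverse.
ic = collector_current(0.65, -2.0)
ib = base_current(0.65, -2.0)
print(f"IC = {ic:.3e} A, IC/IB = {ic / ib:.1f}")  # ratio approaches BETA_F
```

Note that in this reduced model the gain IC/IB is essentially constant; capturing its variation with DC current level is exactly what the extra Gummel–Poon parameters add.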
https://en.wikipedia.org/wiki/Sheldon%20spectrum
The Sheldon spectrum is an empirically-observed feature of marine life by which the size of an organism is inversely correlated with its abundance in the ocean. The spectrum is named after Ray Sheldon, a marine ecologist at Canada’s Bedford Institute of Oceanography in Dartmouth, Nova Scotia. Sheldon and colleagues first suggested the existence of the inverse correlation based on seagoing measurements of plankton made with a Coulter counter in the late 1960s, most notably during the first circumnavigation of the Americas aboard the CCGS Hudson. The inverse correlation implies that biomass density as a function of logarithmic body mass is approximately constant over many orders of magnitude. For example, when Sheldon and his colleagues analyzed a plankton sample in a bucket of seawater, they would tend to find that one third of the plankton mass was between 1 and 10 micrometers, another third was between 10 and 100 micrometers, and a third was between 100 micrometers and 1 millimeter. For the biomass in each size class to remain constant, the number of organisms must therefore decrease in almost exact inverse proportion to their body size. Thus, the rule predicts that krill, which are a million times smaller than tuna, are a million times more abundant in the ocean, a prediction which appears to be true. There is strong evidence that human behavior, particularly overfishing and whaling, has modified the Sheldon spectrum for larger species, and it is unknown what long-term effects such global alteration may have.
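The bookkeeping behind the spectrum can be made concrete with a toy calculation; the numbers below are illustrative, not field data.

```python
# If total biomass in each logarithmic (decade) body-mass bin is constant,
# the *number* of organisms per bin must fall in inverse proportion to
# individual body mass.

BIOMASS_PER_BIN = 1000.0  # grams per decade bin (arbitrary)

def abundance(mass_g):
    """Organisms needed in a decade bin whose typical individual weighs mass_g."""
    return BIOMASS_PER_BIN / mass_g

for mass in (1e-6, 1e-3, 1.0, 1e3):  # microbe-sized up to fish-sized, grams
    print(f"{mass:8.0e} g  ->  {abundance(mass):10.3e} individuals")

# A million-fold mass difference gives roughly a million-fold abundance
# difference, as in the krill-versus-tuna prediction in the text:
print(abundance(1e-6) / abundance(1.0))
```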
https://en.wikipedia.org/wiki/Bloch%27s%20principle
Bloch's principle is a philosophical principle in mathematics stated by André Bloch. Bloch states the principle in Latin as: Nihil est in infinito quod non prius fuerit in finito, and explains this as follows: Every proposition in whose statement the actual infinity occurs can always be considered a consequence, almost immediate, of a proposition where it does not occur, a proposition in finite terms. Bloch mainly applied this principle to the theory of functions of a complex variable. Thus, for example, according to this principle, Picard's theorem corresponds to Schottky's theorem, and Valiron's theorem corresponds to Bloch's theorem. Based on his principle, Bloch was able to predict or conjecture several important results, such as Ahlfors's Five Islands theorem, Cartan's theorem on holomorphic curves omitting hyperplanes, and Hayman's result that an exceptional set of radii is unavoidable in Nevanlinna theory. In more recent times several general theorems were proved which can be regarded as rigorous statements in the spirit of the Bloch principle: Zalcman's lemma A family F of functions meromorphic on the unit disc is not normal if and only if there exist: a number r with 0 < r < 1; points z_n with |z_n| < r; functions f_n in F; and numbers rho_n -> 0+, such that f_n(z_n + rho_n ζ) -> g(ζ) spherically uniformly on compact subsets of C, where g is a nonconstant meromorphic function on C. Zalcman's lemma may be generalized to several complex variables. First, define the following: A family F of holomorphic functions on a domain Ω is normal in Ω if every sequence of functions in F contains either a subsequence which converges to a limit function uniformly on each compact subset of Ω, or a subsequence which converges uniformly to infinity on each compact subset. For every function u of class C^2(Ω) define at each point z in Ω a Hermitian form L_z(u)(v) = Σ_{k,l} (∂²u/∂z_k ∂z̄_l)(z) v_k v̄_l and call it the Levi form of the function u at z. If a function f is holomorphic on Ω, set F(z) = L_z(log(1 + |f|²))^{1/2}. This quantity is well defined since the Levi form is nonnegative. In particular, for n = 1 the above formula takes the form F(z) = |f′(z)| / (1 + |f(z)|²) and coincid
https://en.wikipedia.org/wiki/Ellingham%E2%80%93Horton%20graph
In the mathematical field of graph theory, the Ellingham–Horton graphs are two 3-regular graphs on 54 and 78 vertices: the Ellingham–Horton 54-graph and the Ellingham–Horton 78-graph. They are named after Joseph D. Horton and Mark N. Ellingham, their discoverers. These two graphs provide counterexamples to the conjecture of W. T. Tutte that every cubic 3-connected bipartite graph is Hamiltonian. The book thickness of both the Ellingham–Horton 54-graph and the Ellingham–Horton 78-graph is 3, and their queue number is 2. The first counterexample to the Tutte conjecture was the Horton graph, published by . After the Horton graph, a number of smaller counterexamples to the Tutte conjecture were found. Among them are a 92-vertex graph by , a 78-vertex graph by , and the two Ellingham–Horton graphs. The first Ellingham–Horton graph was published by and is of order 78. At that time it was the smallest known counterexample to the Tutte conjecture. The second Ellingham–Horton graph was published by and is of order 54. In 1989, Georges' graph, the smallest currently known non-Hamiltonian 3-connected cubic bipartite graph, was discovered, containing 50 vertices. Gallery
https://en.wikipedia.org/wiki/Plant%20virus
Plant viruses are viruses that affect plants. Like all other viruses, plant viruses are obligate intracellular parasites that do not have the molecular machinery to replicate without a host. Plant viruses can be pathogenic to vascular plants ("higher plants"). Most plant viruses are rod-shaped, with protein discs forming a tube surrounding the viral genome; isometric particles are another common structure. They rarely have an envelope. The great majority have an RNA genome, which is usually small and single stranded (ss), but some viruses have double-stranded (ds) RNA, ssDNA or dsDNA genomes. Although plant viruses are not as well understood as their animal counterparts, one plant virus has become very recognizable: tobacco mosaic virus (TMV), the first virus to be discovered. This and other viruses cause an estimated US$60 billion loss in crop yields worldwide each year. Plant viruses are grouped into 73 genera and 49 families. However, these figures relate only to cultivated plants, which represent only a tiny fraction of the total number of plant species. Viruses in wild plants have not been well-studied, but the interactions between wild plants and their viruses often do not appear to cause disease in the host plants. To transmit from one plant to another and from one plant cell to another, plant viruses must use strategies that are usually different from animal viruses. Most plants do not move, and so plant-to-plant transmission usually involves vectors (such as insects). Plant cells are surrounded by solid cell walls, therefore transport through plasmodesmata is the preferred path for virions to move between plant cells. Plants have specialized mechanisms for transporting mRNAs through plasmodesmata, and these mechanisms are thought to be used by RNA viruses to spread from one cell to another. Plant defenses against viral infection include, among other measures, the use of siRNA in response to dsRNA. Most plant viruses encode a protein to suppress this respo
https://en.wikipedia.org/wiki/Posterior%20tibiofibular%20ligament
The posterior ligament of the lateral malleolus (posterior tibiofibular ligament, posterior inferior ligament) is smaller than the anterior ligament of the lateral malleolus and is disposed in a similar manner on the posterior surface of the syndesmosis. It connects the tibia and fibula on the inferior part of both bones.
https://en.wikipedia.org/wiki/Spreewald%20gherkins
Spreewald gherkins (German: Spreewälder Gurken or Spreewaldgurken) are a specialty pickled cucumber from Brandenburg, which are protected by the EU as a Protected Geographical Indication (PGI). Overview In the 1870s, Theodor Fontane found that the Spreewaldgurke stood at the top of the agricultural products in Brandenburg's Spreewald. The secret of the Spreewald gherkins' special taste remained hidden, even to the satirist Fontane. Certainly, the moist soil, rich in humus, and the climate in the Spreewald also contribute to the good growth in the cucumber fields. The actual reason for the taste, which is considered by connoisseurs to be delicate, is found in their processing. While the process of fermentation in large barrels formerly took several weeks, gherkins today are ready for sale after only one day of processing—whether for mustard gherkins (Senfgurke), gherkins or dill pickles (Gewürzgurke) or pickled cucumbers (Salzgurke). This enormous time saving is achieved by heating, with the addition of caustic soda. The composition of the additional ingredients, however, still remains a well-guarded secret of the approximately twenty picklers. These taste-enhancing ingredients, such as basil, lemon balm, grape leaves, cherry leaves or walnut leaves, give Spreewald gherkins their special sour, spicy taste. After the reunification of Germany in 1990, Spreewald gherkins were one of the few products from East Germany which were still available without interruption. The gherkins also achieved fame in 2003 with the award-winning film Good Bye Lenin! by Wolfgang Becker. In this tragicomedy, Daniel Brühl has great difficulty obtaining the Spreewald gherkins that his sick mother (Katrin Sass) dearly loved and which he absolutely needed to convince her of the continued existence of the (in her view) "ideal East German world". Today, the gherkins can again be obtained under their trademarked name Spreewälder Gurken, which is a Protected Geographical Indication (PGI)
https://en.wikipedia.org/wiki/Molecular%20autoionization
In chemistry, molecular autoionization (or self-ionization) is a chemical reaction between molecules of the same substance to produce ions. If a pure liquid partially dissociates into ions, it is said to be self-ionizing. In most cases the oxidation number on all atoms in such a reaction remains unchanged. Such autoionization can be protic (involving proton transfer), or non-protic. Examples Protic solvents Protic solvents often undergo some autoionization (in this case autoprotolysis): 2 H2O <=> H3O+ + OH- The self-ionization of water is particularly well studied, due to its implications for acid-base chemistry of aqueous solutions. 2 NH3 <=> NH4+ + NH2- 2 H2SO4 <=> H3SO4+ + HSO4- 3 HF <=> H2F+ + HF2- Here proton transfer between two HF molecules combines with homoassociation of the resulting F- and a third HF to form HF2-. Non-protic solvents 2 PF5 <=> PF6- + PF4+ N2O4 <=> NO+ + NO3- Here the nitrogen oxidation numbers change from (+4 and +4) to (+3 and +5). 2 BrF3 <=> BrF2+ + BrF4- These solvents all possess atoms with odd atomic numbers, either nitrogen or a halogen. Such atoms enable the formation of singly charged, nonradical ions (which must have at least one odd atomic number atom), which are the most favorable autoionization products. Protic solvents, mentioned previously, use hydrogen for this role. Autoionization would be much less favorable in solvents such as sulfur dioxide or carbon dioxide, which have only even atomic number atoms. Coordination chemistry Autoionization is not restricted to neat liquids or solids. Solutions of metal complexes exhibit this property. For example, compounds of the type (where X = Cl or Br) are unstable with respect to autoionization forming . See also Ionization Ion association
https://en.wikipedia.org/wiki/Expocode
EXPOCODE, or the "expedition code", is a unique alphanumeric identifier defined by the National Oceanographic Data Center (NODC) of the US. The code defines a standard nomenclature for cruise labels of research vessels and intends to avoid confusion in oceanographic data management. The code was used by international projects (WOCE, CarboOcean) and is considered a de facto standard in the international hydrographic community beginning with the Climate Variability Program (CLIVAR) and the EU-Project Eurofleets. The format of an expocode for an oceanographic cruise is defined in the format NODCYYYYMMDD where: NODC is NOAA's National Oceanographic Data Center's 4-character research vessel identifier, consisting of country and ship code YYYYMMDD is the GMT date when the cruise left port. Example for a cruise of the US research vessel Nathaniel B. Palmer, starting on 2011-02-19: 320620110219 (Code of US = 32, code of Palmer = 06, date when cruise starts 2011-02-19) External links NODC country codes NODC ship codes CLIVAR and Carbon Hydrographic Data Office (CCHDO) Oceanography
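A sketch of splitting an expocode of the NODCYYYYMMDD form into its parts, assuming the 12-character layout described above (4-character NODC vessel identifier, of which two characters are the country code and two the ship code, followed by an 8-digit GMT departure date):

```python
from datetime import datetime

def parse_expocode(code: str) -> dict:
    """Split a NODCYYYYMMDD expocode into country, ship, and departure date."""
    if len(code) != 12 or not code[4:].isdigit():
        raise ValueError(f"not a NODCYYYYMMDD expocode: {code!r}")
    departure = datetime.strptime(code[4:], "%Y%m%d").date()
    return {
        "country": code[0:2],    # NODC country code, e.g. "32" for the US
        "ship": code[2:4],       # NODC ship code within that country
        "departure": departure,  # GMT date the cruise left port
    }

# The Nathaniel B. Palmer example from the text:
print(parse_expocode("320620110219"))
```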
https://en.wikipedia.org/wiki/Mike%20Develin
Michael Lee Develin (born August 27, 1980) is an American mathematician known for his work in combinatorics and discrete geometry. Early life Mike Develin was born in Hobart, Tasmania. He moved to the United States with his Korean mother, living in New York City. He attended Stuyvesant High School, where he was captain of the math team, and entered Harvard University at the age of 16. At 22, he received his PhD from UC Berkeley, writing his dissertation on Topics in Discrete Geometry. He was awarded the 2003 American Institute of Mathematics five-year fellowship. Mathematics Develin is a two-time Putnam Fellow (1997 and 1998). He studied under advisor Bernd Sturmfels at UC Berkeley, and has been noted for work on Stanley's reciprocity theorem and tight spans. His 2004 paper, "Tropical Convexity", with Sturmfels, is regarded as one of the seminal papers of tropical geometry, garnering over 300 citations to date. Facebook Develin worked on data science for Facebook and Instagram from 2011 to 2018. On January 23, 2014, Develin published a satirical note on behalf of Facebook's data science team, predicting the demise of Princeton University, in response to a research paper by Princeton PhD candidates predicting the demise of Facebook. Bridge Develin started playing competitive bridge in 2005. Wins: Manfield Non-Life Master Pairs 2005; Grand National Teams Flight B 2007; South American Junior Championships 2007; Red Ribbon Pairs 2008; 0-10,000 Fast Pairs 2022. Runner-up: North American Pairs Flight C 2006; Mini-Spingold II 2007. Personal life Develin was naturalized as an American citizen in 2010. Develin organized and maintains SimBase, a simulated baseball league with fictitious players, whose inaugural members also included Jeopardy! champion Joon Pahk. Develin occasionally set up a "free advice" table near the San Francisco Ferry Building. He currently resides in Kirkland, Washington.
https://en.wikipedia.org/wiki/Psarolepis
Psarolepis (; psārolepis, from Greek ψαρός 'speckled' and λεπίς 'scale') is a genus of extinct bony fish which lived around 397 to 418 million years ago (Pridoli to Lochkovian stages). Fossils of Psarolepis have been found mainly in South China and were described by paleontologist Xiaobo Yu in 1998. It is not known with certainty to which group Psarolepis belongs, but paleontologists agree that it probably is a basal genus and seems to be close to the common ancestor of lobe-finned and ray-finned fishes. In 2001, paleontologist John A. Long compared Psarolepis with onychodontiform fishes and referred to their relationships. Description Psarolepis had a pair of 'parasymphysial tooth whorls', teeth which extend up at the front of the lower jaw. The head was made of several thick dermal plates and covered with deep pock-marks and large pores. Another trait is a large pectoral spine, just in front of the pectoral fin, extending back from the shoulder girdle, and a dorsal spine located in front of a median fin behind the head, which gives the fish a shark-like form. The pock-marked head of Psarolepis was made of plates containing a layer of porcelain-like cosmine. Because the cosmine layer obscures the suture lines of the skull, it is difficult to study the exact bone structure. The snout was strangely humped and the nostrils were located above the eyes, which were just above the upper jaw. The most spectacular findings were the fin spines. Two are known: one extending back from the shoulder girdle and another which is associated with the dorsal fin. These fin spines are found only in primitive jawed fishes and are apparently absent from the most primitive sharks, but present in abundance in more derived forms. Psarolepis had teeth at the very front of the snout with large fangs on the tooth plate. An outstanding feature is the 'parasymphysial tooth whorls', which place the fish in the order Onychodontida. The premaxilla and the dentary had large inner teeth and irregular array
https://en.wikipedia.org/wiki/Clock%20and%20wavefront%20model
The clock and wavefront model is a model used to describe the process of somitogenesis in vertebrates. Somitogenesis is the process by which somites, blocks of mesoderm that give rise to a variety of connective tissues, are formed. The model describes the splitting off of somites from the paraxial mesoderm as the result of oscillating expression of particular proteins and their gradients. Overview Once the cells of the pre-somitic mesoderm are in place following cell migration during gastrulation, oscillatory expression of many genes begins in these cells as if regulated by a developmental "clock." This has led many to conclude that somitogenesis is coordinated by a "clock and wave" mechanism. More technically, this means that somitogenesis occurs due to the largely cell-autonomous oscillations of a network of genes and gene products which causes cells to oscillate between a permissive and a non-permissive state in a consistently timed fashion, like a clock. These genes include members of the FGF family and the Wnt and Notch pathways, as well as targets of these pathways. The wavefront progresses slowly in an anterior-to-posterior direction. As the wavefront of signaling comes in contact with cells in the permissive state, they undergo a mesenchymal-epithelial transition and pinch off of the more anterior pre-somitic mesoderm, forming a somite boundary and resetting the process for the next somite. In particular, the cyclic activation of the Notch pathway appears to be of great importance in the clock and wavefront model. It has been suggested that the activation of Notch cyclically activates a cascade of genes necessary for the somites to separate from the main paraxial body. This is controlled by different means in different species, such as through a simple negative feedback loop in zebrafish or a more complicated process in which FGF and Wnt clocks affect the Notch clock, as in chicks and mice. Generally speaking, however, the segmentation clock model is highly evolut
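The interplay of a cellular oscillator and a moving front can be caricatured in a few lines. This is my own minimal construction with arbitrary parameter values, not any published model: every cell runs the same clock, and the front "freezes" each cell in whatever phase it has when it passes, so permissive-phase cells come out in regularly spaced blocks.

```python
# Minimal caricature of the clock-and-wavefront idea: stripe width is set by
# wavefront_speed * clock_period / 2, i.e. one somite pair per clock cycle.

CLOCK_PERIOD = 90.0      # minutes per oscillation (order of magnitude only)
WAVEFRONT_SPEED = 1.0    # cell positions per minute (arbitrary units)

def frozen_phase(cell_position):
    # The anterior-to-posterior wavefront reaches this cell at position/speed.
    arrival_time = cell_position / WAVEFRONT_SPEED
    return (arrival_time % CLOCK_PERIOD) / CLOCK_PERIOD  # phase in [0, 1)

def is_boundary_marker(cell_position):
    return frozen_phase(cell_position) < 0.5  # "permissive" half-cycle

stripes = "".join("#" if is_boundary_marker(x) else "." for x in range(270))
print(stripes)  # alternating blocks of 45 cells, one pair per clock cycle
```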
https://en.wikipedia.org/wiki/Dead%20letter%20queue
In message queueing a dead letter queue (DLQ) is a service implementation to store messages that the messaging system cannot or should not deliver. Although implementation-specific, messages can be routed to the DLQ for the following reasons: The message is sent to a queue that does not exist. The maximum queue length is exceeded. The message exceeds the size limit. The message expires because it reached its TTL (time to live). The message is rejected by another queue exchange. The message has been read and rejected too many times. Routing these messages to a dead letter queue enables analysis of common fault patterns and potential software problems. If a message consumer receives a message that it considers invalid, it can instead forward it to an Invalid Message Channel, allowing a separation between application-level faults and delivery failures. Queueing systems that incorporate dead letter queues include Amazon EventBridge, Amazon Simple Queue Service, Apache ActiveMQ, Google Cloud Pub/Sub, HornetQ, Microsoft Message Queuing, Microsoft Azure Event Grid and Azure Service Bus, WebSphere MQ, Solace PubSub+, RabbitMQ, Apache Kafka and Apache Pulsar. See also Poison message Dead letter mail
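One of the policies listed above (a message read and rejected too many times) can be sketched with a toy in-memory queue. This is my own construction for illustration, not the API of any of the products named above.

```python
from collections import deque

# Toy broker: a message that fails delivery max_deliveries times is moved to
# the dead letter queue instead of being redelivered forever.

class Queue:
    def __init__(self, max_deliveries=3, dead_letter_queue=None):
        self.messages = deque()          # (body, delivery_count) pairs
        self.max_deliveries = max_deliveries
        self.dlq = dead_letter_queue     # plain deque acting as the DLQ

    def publish(self, body):
        self.messages.append((body, 0))

    def consume(self, handler):
        """Deliver each message; on failure, retry up to the limit, then dead-letter."""
        while self.messages:
            body, count = self.messages.popleft()
            try:
                handler(body)
            except Exception as reason:
                if count + 1 >= self.max_deliveries:
                    self.dlq.append((body, str(reason)))     # route to DLQ
                else:
                    self.messages.append((body, count + 1))  # redeliver later

dlq = deque()
q = Queue(max_deliveries=3, dead_letter_queue=dlq)
q.publish("ok")
q.publish("poison")

def handler(body):
    if body == "poison":
        raise ValueError("consumer considers this message invalid")

q.consume(handler)
print(list(dlq))  # the poison message lands here after three failed deliveries
```

Inspecting the DLQ afterwards is what enables the fault-pattern analysis the text describes.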
https://en.wikipedia.org/wiki/Gullstrand%E2%80%93Painlev%C3%A9%20coordinates
Gullstrand–Painlevé coordinates are a particular set of coordinates for the Schwarzschild metric – a solution to the Einstein field equations which describes a black hole. The ingoing coordinates are such that the time coordinate follows the proper time of a free-falling observer who starts from far away at zero velocity, and the spatial slices are flat. There is no coordinate singularity at the Schwarzschild radius (event horizon). The outgoing ones are simply the time reverse of ingoing coordinates (the time is the proper time along outgoing particles that reach infinity with zero velocity). The solution was proposed independently by Paul Painlevé in 1921 and Allvar Gullstrand in 1922. It was not explicitly shown until 1933 in Lemaître's paper that these solutions were simply coordinate transformations of the usual Schwarzschild solution, although Einstein immediately believed that to be true. Derivation The derivation of GP coordinates requires defining the following coordinate systems and understanding how data measured for events in one coordinate system is interpreted in another coordinate system. Convention: The units for the variables are all geometrized. Time and mass have units in meters. The speed of light in flat spacetime has a value of 1. The gravitational constant has a value of 1. The metric is expressed in the +−−− sign convention. Schwarzschild coordinates A Schwarzschild observer is a far observer or a bookkeeper. He does not directly make measurements of events that occur in different places. Instead, he is far away from the black hole and the events. Observers local to the events are enlisted to make measurements and send the results to him. The bookkeeper gathers and combines the reports from various places. The numbers in the reports are translated into data in Schwarzschild coordinates, which provide a systematic means of evaluating and describing the events globally. Thus, the physicist can compare and interpret the data inte
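For concreteness, a standard form of the ingoing line element being described (stated here for reference; it does not appear in the truncated text above) is, in the geometrized units and +−−− convention just listed:

```latex
% Ingoing Gullstrand–Painlevé line element, G = c = 1, signature +---,
% with t_r the "raindrop" time coordinate (proper time of the free-faller):
ds^2 = \left(1 - \frac{2M}{r}\right) dt_r^2
     - 2\sqrt{\frac{2M}{r}}\, dt_r\, dr
     - dr^2 - r^2\left(d\theta^2 + \sin^2\theta\, d\varphi^2\right)
```

Setting $dt_r = 0$ leaves $-dr^2 - r^2 d\Omega^2$, exhibiting the flat spatial slices, and the metric components are regular at $r = 2M$, the absence of a coordinate singularity at the horizon noted above.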
https://en.wikipedia.org/wiki/Pork%20and%20beans
Pork and beans is a culinary dish that uses pork and beans as its main ingredients. Numerous variations exist, usually with more specific names, such as Fabada Asturiana, Olla podrida, or American canned pork and beans. American canned pork and beans Although the time and place of the first appearance of American canned pork and beans is unclear, the dish was well established in the American diet by the mid-19th century. The 1832 cookbook The American Frugal Housewife lists only three ingredients for pork and beans: a quart of beans, a pound of salt pork, and pepper. Commercially canned pork and beans were introduced in the United States sometime around 1880. According to the 1975 Better Homes and Garden Heritage Cookbook, canned pork and beans was the first convenience food. Today, the dish is "an American canned classic, [and] is recognized by American consumers generally as an article of commerce that contains very little pork." The recipe for American commercially canned pork and beans varies slightly from company to company, but generally consists of rehydrated navy beans packed in tomato sauce (usually made from concentrate and which may incorporate starch, sugar, salt, and seasoning), with very small chunks of salt pork or rendered pork fat. The ingredients are cooked, packed into hermetically sealed containers, and processed by heat to assure preservation. See also Baked beans Beans and franks Boston baked beans Cowboy beans Feijoada Cassoulet List of pork dishes List of stews
https://en.wikipedia.org/wiki/Culturomics%20%28microbiology%29
Culturomics is the high-throughput cell culture of bacteria that aims to comprehensively identify strains or species in samples obtained from tissues such as the human gut or from the environment. This approach was conceived as an alternative, complementary method to metagenomics, which relies on the presence of homologous sequences to identify new bacteria. Due to the limited phylogenetic information available on bacteria, metagenomic data generally contains large amounts of "microbial dark matter", sequences of unknown origin. Culturomics fills some of these gaps, with the added advantage of enabling the functional study of the generated cultures. Its main drawback is that many bacterial species remain effectively uncultivable until their growth conditions are better understood. Therefore, optimization of the culturomics approach has been done by improving culture conditions. Unlike metagenomics, which relies on direct shotgun sequencing or 16S rRNA gene sequencing, culturomics is based on matrix-assisted laser desorption/ionization–time-of-flight (MALDI-TOF) mass spectrometry. However, culturomics also uses 16S rRNA sequencing to identify new species. See also Genomics Microbiota
https://en.wikipedia.org/wiki/Megamax%20C
Megamax C is a K&R C-based development system originally written for Macintosh and ported to the Atari ST and Apple IIGS computers. Sold by Megamax, Inc., based in Richardson, Texas, the package includes a one-pass compiler, linker, text editor, resource construction kit, and documentation. Megamax C was written by Michael Bunnell with Eric Parker providing the linker and most of the standard library. A circa-1988 version of the compiler was renamed Laser C, while the company remained Megamax. In the early days of the Atari ST, Megamax C was the primary competitor to the Alcyon C compiler from Digital Research included in the official developer kit from Atari Corporation, and the documentation covers Atari-specific features. The company advertised that Megamax C could be used on a 520 ST with a single floppy drive. The ST version includes the source code and assets for Megaroids, a clone of the Asteroids video game, written by Mike Bunnell with sound effects by Mitch Bunnell. Technical details On both the Atari ST and Macintosh, the size of a compiled module is limited to 32K of code, and arrays have the same 32K restriction. The limitation stems from a requirement on the Macintosh which was carried over to the Atari. This is despite the Motorola 68000 CPU in both machines having a 24-bit address range. Reception According to a review of the Atari ST version in Antic by Mike Fleishman, Megamax C compiled a small benchmark program six times faster than Digital Research's compiler. In a comparison of C compilers for the Atari ST, STart magazine wrote, "For a development compiler, Megamax C is, without question, the best available on the Atari. It will reduce your compile/test turn-around time by at least a factor of five." They also pointed out that the $200 price may be steep for hobbyists and students. The compiler was used for development by Batteries Included and FTL Games.
https://en.wikipedia.org/wiki/ATP-binding%20domain%20of%20ABC%20transporters
In molecular biology, the ATP-binding domain of ABC transporters is a water-soluble domain of transmembrane ABC transporters. ABC transporters belong to the ATP-Binding Cassette superfamily, which uses the hydrolysis of ATP to translocate a variety of compounds across biological membranes. ABC transporters are minimally constituted of two conserved regions: a highly conserved ATP binding cassette (ABC) and a less conserved transmembrane domain (TMD). These regions can be found on the same protein or on two different ones. Most ABC transporters function as a dimer and therefore are constituted of four domains, two ABC modules and two TMDs. Biological function ABC transporters are involved in the export or import of a wide variety of substrates ranging from small ions to macromolecules. The major function of ABC import systems is to provide essential nutrients to bacteria. They are found only in prokaryotes and their four constitutive domains are usually encoded by independent polypeptides (two ABC proteins and two TMD proteins). Prokaryotic importers require additional extracytoplasmic binding proteins (one or more per system) for function. In contrast, export systems are involved in the extrusion of noxious substances, the export of extracellular toxins and the targeting of membrane components. They are found in all living organisms and in general the TMD is fused to the ABC module in a variety of combinations. Some eukaryotic exporters encode the four domains on the same polypeptide chain. Amino acid sequence The ABC module (approximately two hundred amino acid residues) is known to bind and hydrolyze ATP, thereby coupling transport to ATP hydrolysis in a large number of biological processes. The cassette is duplicated in several subfamilies. Its primary sequence is highly conserved, displaying a typical phosphate-binding loop, Walker A, and a magnesium binding site, Walker B. Besides these two regions, three other conserved motifs are present in the ABC cassett
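The Walker A (P-loop) motif mentioned above is commonly summarized by the consensus pattern G-x(4)-G-K-[ST], where "x" is any residue. A sketch of scanning a protein sequence for it follows; the example sequence is invented for illustration, not a real ABC-module fragment.

```python
import re

# Canonical Walker A / P-loop consensus: G, four arbitrary residues, G, K,
# then serine or threonine.
WALKER_A = re.compile(r"G.{4}GK[ST]")

def find_walker_a(sequence: str):
    """Return (position, matched residues) for each Walker A hit."""
    return [(m.start(), m.group()) for m in WALKER_A.finditer(sequence)]

# Hypothetical fragment containing one canonical P-loop:
fragment = "MLEILGPSGSGKSTLLNAL"
print(find_walker_a(fragment))  # one match at position 5
```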
https://en.wikipedia.org/wiki/Aurifeuillean%20factorization
In number theory, an aurifeuillean factorization, named after Léon-François-Antoine Aurifeuille, is a factorization of certain integer values of the cyclotomic polynomials. Because cyclotomic polynomials are irreducible polynomials over the integers, such a factorization cannot come from an algebraic factorization of the polynomial. Nevertheless, certain families of integers coming from cyclotomic polynomials have factorizations given by formulas applying to the whole family, as in the examples below. Examples Numbers of the form a^4 + 4b^4 have the following factorization (Sophie Germain's identity): a^4 + 4b^4 = (a^2 - 2ab + 2b^2)(a^2 + 2ab + 2b^2). Setting a = 1 and b = 2^k, one obtains the following aurifeuillean factorization of Phi_4(2^(2k+1)) = 2^(4k+2) + 1, where Phi_4(x) = x^2 + 1 is the fourth cyclotomic polynomial: 2^(4k+2) + 1 = (2^(2k+1) - 2^(k+1) + 1)(2^(2k+1) + 2^(k+1) + 1). Numbers of the form a^6 + 27b^6 have the following factorization, where the first factor (a^2 + 3b^2) is the algebraic factorization of a sum of two cubes: a^6 + 27b^6 = (a^2 + 3b^2)(a^2 - 3ab + 3b^2)(a^2 + 3ab + 3b^2). Setting a = 1 and b = 3^k, one obtains the following factorization of 3^(6k+3) + 1: 3^(6k+3) + 1 = (3^(2k+1) + 1)(3^(2k+1) - 3^(k+1) + 1)(3^(2k+1) + 3^(k+1) + 1). Here, the first of the three terms in the factorization is 3^(2k+1) + 1 and the remaining two terms provide an aurifeuillean factorization of Phi_6(3^(2k+1)), where Phi_6(x) = x^2 - x + 1. Numbers of the form b^n - 1 or b^n + 1, or their factors Phi_n(b), where b = s^2 * t with square-free t > 1, have an aurifeuillean factorization if and only if one of the following conditions holds: t ≡ 1 (mod 4) and n ≡ t (mod 2t), for b^n - 1; or t ≡ 2, 3 (mod 4) and n ≡ t (mod 2t), for b^n + 1. Thus, when b = s^2 * t with square-free t, and n is congruent to t modulo 2t, then if t is congruent to 1 mod 4, b^n - 1 has an aurifeuillean factorization; otherwise, b^n + 1 has an aurifeuillean factorization. When the number is of a particular form (the exact expression varies with the base), aurifeuillean factorization may be used, which gives a product of two or three numbers. The following equations give aurifeuillean factors for the Cunningham project bases as a product of F, L and M: If we let L = C − D, M = C + D, the aurifeuillean factorizations for b^n ± 1 of the form F * (C − D) * (C + D) = F * L * M with the bases 2 ≤ b ≤ 24 (perfect powers excluded, since a power of b^n is also a power of b) are: (for the coefficients of the polynomials for all square-free bases up to 199 and up
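Sophie Germain's identity with a = 1 and b = 2^k gives the aurifeuillean factorization 2^(4k+2) + 1 = (2^(2k+1) - 2^(k+1) + 1)(2^(2k+1) + 2^(k+1) + 1); a quick numerical check:

```python
# Verify the base-2 aurifeuillean factorization for small k.

def aurifeuillean_2(k):
    L = 2 ** (2 * k + 1) - 2 ** (k + 1) + 1
    M = 2 ** (2 * k + 1) + 2 ** (k + 1) + 1
    return L, M

for k in range(1, 6):
    n = 2 ** (4 * k + 2) + 1
    L, M = aurifeuillean_2(k)
    assert L * M == n
    print(f"2^{4 * k + 2} + 1 = {L} * {M}")
# k = 1 gives 2^6 + 1 = 65 = 5 * 13.
```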
https://en.wikipedia.org/wiki/Davenport%E2%80%93Schinzel%20sequence
In combinatorics, a Davenport–Schinzel sequence is a sequence of symbols in which the number of times any two symbols may appear in alternation is limited. The maximum possible length of a Davenport–Schinzel sequence is bounded by the number of its distinct symbols multiplied by a small but nonconstant factor that depends on the number of alternations that are allowed. Davenport–Schinzel sequences were first defined in 1965 by Harold Davenport and Andrzej Schinzel to analyze linear differential equations. Since then, these sequences and their length bounds have also become a standard tool in discrete geometry and in the analysis of geometric algorithms. Definition A finite sequence U = u1, u2, ..., um is said to be a Davenport–Schinzel sequence of order s if it satisfies the following two properties: No two consecutive values in the sequence are equal to each other. If x and y are two distinct values occurring in the sequence, then the sequence does not contain a subsequence ... x, ..., y, ..., x, ..., y, ... consisting of s + 2 values alternating between x and y. For instance, the sequence 1, 2, 1, 3, 1, 3, 2, 4, 5, 4, 5, 2, 3 is a Davenport–Schinzel sequence of order 3: it contains alternating subsequences of length four, such as ... 1, ... 2, ... 1, ... 2, ... (which appears in four different ways as a subsequence of the whole sequence), but it does not contain any alternating subsequences of length five. If a Davenport–Schinzel sequence of order s includes n distinct values, it is called an (n,s) Davenport–Schinzel sequence, or a DS(n,s)-sequence. Length bounds The maximum length of a DS(n,s)-sequence has been analyzed asymptotically in the limit as n goes to infinity, with the assumption that s is a fixed constant, and nearly tight bounds are known for all s. Let λs(n) denote the length of the longest DS(n,s)-sequence. The best bounds known on λs involve the inverse Ackermann function α(n) = min { m | A(m,m) ≥ n }, where A is the Ackermann function. Due to the very rapid
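The definition can be tested directly, if inefficiently: check that no two consecutive values are equal, and that no pair of distinct symbols alternates s + 2 or more times. A minimal sketch (quadratic in the number of symbols, for illustration only):

```python
from itertools import combinations, groupby

def is_ds_sequence(seq, s):
    """True if seq is a Davenport-Schinzel sequence of order s."""
    # Condition 1: no two consecutive values are equal.
    if any(a == b for a, b in zip(seq, seq[1:])):
        return False
    # Condition 2: for each pair of distinct symbols, the longest
    # alternation is the number of maximal runs in the subsequence
    # restricted to that pair; it must be at most s + 1.
    for x, y in combinations(set(seq), 2):
        restricted = [v for v in seq if v in (x, y)]
        alternation = len([key for key, _ in groupby(restricted)])
        if alternation >= s + 2:
            return False
    return True
```

With the example from the text, [1, 2, 1, 3, 1, 3, 2, 4, 5, 4, 5, 2, 3] passes for order 3 but fails for order 2, since pairs such as (1, 2) alternate four times.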
https://en.wikipedia.org/wiki/PRR29
PRR29 (proline-rich protein 29) is a protein encoded by the PRR29 gene located in humans on chromosome 17 at 17q23. Its function is not fully understood. Its name is derived from the chain of 5 proline amino acids located toward the end of the protein. The primary domain within the sequence of this protein is known as DUF4587. It is reported to have high levels of expression in tissues pertaining to the circulatory system and the immune system. It is hypothesized that PRR29 is a nuclear protein that facilitates communication between the nucleus and the mitochondria. The gene is also commonly known as C17orf72. The gene has a size of 5961 base pairs and contains five exons. Gene PRR29 is located on the long arm of chromosome 17 (17q23.3), starting at 63998344 and ending at 64004305. The gene spans 5961 base pairs and is oriented on the plus strand. Genes SNHG25 and LOC105371858 neighbor PRR29 on chromosome 17. The gene ICAM2 is located on the negative strand, directly opposite PRR29. The human gene for PRR29, also referred to as C17orf72, spans 6 exons and is located at the genomic coordinates chr17:63,998,351-64,002,516 on the positive DNA strand (hg38). It is 4,166 base pairs in length including introns. After the introns have been removed, the length is shortened to 3,611 base pairs. mRNA The gene has 12 common splice variants and one unspliced form. The longest transcribed mRNA is made up of 3048 base pairs, and the protein translated from this mRNA is 189 amino acids long. Protein General properties Homo sapiens PRR29 has several protein isoforms, with the longest being 236 amino acids. PRR29 has a predicted isoelectric point of 5.23 and a predicted molecular weight of 26.1 kilodaltons. PRR29 is characterized by a larger than average proportion of prolines (19.1%) and a smaller than average proportion of asparagines (0.4%). The PRR29 protein is estimated to have a molecular weight of 20.7 kDa. Its isoelectric point is predicted to lie at 4.83. Domains PRR
https://en.wikipedia.org/wiki/Chamaecydin
Chamaecydin is a chemical compound with the molecular formula C30H40O3. It is made up of three six-membered rings and two five-membered rings and has one polar hydroxyl functional group. It is well preserved in the rock record and is only found in a specific family of conifers, the swamp cypress subfamily. The presence and abundance of chamaecydin in the rock record can reveal environmental changes in ancient biomes. Background Notable properties The molecule has a mass of 448.298 daltons, and contains two carbonyl groups and three double bonds. It is a hexacarboxylic triterpene. The 2D structure is shown in Figure 1. The physical properties are not well known; the melting point of chamaecydin is 197–198 °C. Its crystal structure is orthorhombic. Chamaecydin shows significant antifeedant activity against the larvae of Spodoptera litura and has an antifeedant index (AFI) of 0.44. The index ranges from 1 to −1, with 1 being the most powerful antifeedant. It is calculated by comparing the amount of area lost in the treated disk to the amount of area lost in the control disk, or (C − T)/(C + T), where C is control and T is treated. Preservation Chamaecydin is a biomarker for certain species of conifer trees. Once living organisms die, the organic molecules they biosynthesized often undergo various chemical transformations in the soil, and thus usually retain only the basic structures of the molecules that were synthesized. These modified molecules are biomarkers, but can often only be used as chemical tracers for a wide group of organisms. Chamaecydin is rare because it is a polar molecule that is found perfectly preserved millions of years later, and can therefore be used to trace specific species. Despite being a polar compound, chamaecydin is likely preserved because it is found trapped in resinous plant material, where it is prevented from bonding to kerogen. In the paleorecord, it is found in clayey sediments, which prevent further oxidation. Chamaecydin is found in conce
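The antifeedant index formula above reduces to a one-line function; a minimal sketch (the argument names are illustrative):

```python
def antifeedant_index(control_area_lost, treated_area_lost):
    """AFI = (C - T) / (C + T); ranges from -1 to 1, where 1 means
    the treated disk lost no area at all (strongest antifeedant)."""
    c, t = control_area_lost, treated_area_lost
    return (c - t) / (c + t)

# Example: equal feeding on both disks gives AFI = 0 (no effect).
assert antifeedant_index(1.0, 1.0) == 0.0
```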
https://en.wikipedia.org/wiki/Charge-shift%20bond
In theoretical chemistry, the charge-shift bond is a proposed new class of chemical bond that sits alongside the three familiar families of covalent, ionic, and metallic bonds, in which electrons are respectively shared, transferred, or delocalized. The charge-shift bond derives its stability from the resonance of ionic forms rather than the covalent sharing of electrons, which is often depicted as electron density located between the bonded atoms. A feature of the charge-shift bond is that the predicted electron density between the bonded atoms is low. It has long been known from experiment that the accumulation of electric charge between the bonded atoms is not necessarily a feature of covalent bonds. An example where charge-shift bonding has been used to explain the low electron density found experimentally is the central bond between the inverted tetrahedral carbon atoms in [1.1.1]propellanes. Theoretical calculations on a range of molecules have indicated that a charge-shift bond is present, a striking example being fluorine, F2, which is normally described as having a typical covalent bond. The charge-shift bond (CSB) has also been shown to exist at the cation-anion interface of protic ionic liquids (PILs). The same authors have also shown how CSB character in PILs correlates with their physicochemical properties. Valence bond description The valence bond view of chemical bonding, which owes much to the work of Pauling, is familiar to many, if not all, chemists. The basis of Pauling's description of the chemical bond is that an electron pair bond involves the mixing, or resonance, of one covalent and two ionic structures. In bonds between two atoms of the same element, homonuclear bonds, Pauling assumed that the ionic structures make no appreciable contribution to the overall bonding. This assumption followed on from published calculations for the hydrogen molecule in 1933 by Weinbaum and by James and Coolidge that showed that the contribution of ionic forms amounted to only a small
https://en.wikipedia.org/wiki/Ciliogenesis
Ciliogenesis is defined as the building of the cell's antenna (primary cilia) or extracellular fluid mediation mechanism (motile cilium). It includes the assembly and disassembly of the cilia during the cell cycle. Cilia are important organelles of cells and are involved in numerous activities such as cell signaling, processing developmental signals, and directing the flow of fluids such as mucus over and around cells. Due to the importance of these cell processes, defects in ciliogenesis can lead to numerous human diseases related to non-functioning cilia. Ciliogenesis may also play a role in the development of left/right handedness in humans. Cilia formation Ciliogenesis occurs through an ordered set of steps. First, the basal bodies from centrioles must migrate to the surface of the cell and attach to the cortex. Along the way, the basal bodies attach to membrane vesicles and the basal body/membrane vesicle complex fuses with the plasma membrane of the cell. Fusion with the plasma membrane is likely what forms the membrane of the cilia. The alignment of the forming cilia is determined by the original positioning and orientation of the basal bodies. Once the alignment is determined, axonemal microtubules extend from the basal body and go beneath the developing ciliary membrane, forming the cilia. Proteins must be synthesized in the cytoplasm of the cell and cannot be synthesized within cilia. For the cilium to elongate, proteins must be selectively imported from the cytoplasm into the cilium and transported to the tip of the cilium by intraflagellar transport (IFT). Once the cilium is completely formed, it continues to incorporate new tubulin at the tip of the cilia. However, the cilium does not elongate further, because older tubulin is simultaneously degraded. This requires an active mechanism that maintains ciliary length. Impairments in these mechanisms can affect the motility of the cell and cell signaling between cells. Ciliogenesis types Mo
https://en.wikipedia.org/wiki/Ipsectrace
ipsectrace is a software tool designed by Wayne Schroeder to help profile IPsec connections in a packet capture (pcap) file. The program uses a command line interface: the user points it at a pcap capture and it reports on the IPsec traffic the capture contains. It is somewhat inspired by tcptrace, which takes the same pcap input. Ipsectrace is only available for the Linux operating system. It is coded in C++ and is licensed under the GPL, effectively allowing anyone to modify and redistribute it. Although its main purpose is to monitor IPsec traffic, ipsectrace can be used to crack extra layers of security brought about by VPN implementations of security such as IPsec and SSH, whereas programs such as Anger, Deceit, and Ettercap can be used to infiltrate PPTP security.
https://en.wikipedia.org/wiki/BCMP%20network
In queueing theory, a discipline within the mathematical theory of probability, a BCMP network is a class of queueing network for which a product-form equilibrium distribution exists. It is named after the authors of the paper where the network was first described: Baskett, Chandy, Muntz, and Palacios. The theorem is a significant extension to a Jackson network allowing virtually arbitrary customer routing and service time distributions, subject to particular service disciplines. The paper is well known, and the theorem was described in 1990 as "one of the seminal achievements in queueing theory in the last 20 years" by J. Michael Harrison and Ruth J. Williams. Definition of a BCMP network A network of m interconnected queues is known as a BCMP network if each of the queues is of one of the following four types: FCFS discipline where all customers have the same negative exponential service time distribution. The service rate can be state dependent, so write μi(j) for the service rate of queue i when its queue length is j. Processor sharing queues. Infinite-server queues. LCFS with pre-emptive resume (work is not lost). In the final three cases, service time distributions must have rational Laplace transforms; that is, the Laplace transform must be the ratio of two polynomials. Also, the following conditions must be met: external arrivals to node i (if any) form a Poisson process; a customer completing service at queue i will either move to some new queue j with (fixed) probability Pij or leave the system with probability 1 − Σj Pij, which is non-zero for some subset of the queues. Theorem For a BCMP network of m queues which is open, closed or mixed in which each queue is of type 1, 2, 3 or 4, the equilibrium state probabilities are given by π(x1, x2, ..., xm) = C π1(x1) π2(x2) ... πm(xm), where C is a normalizing constant chosen to make the equilibrium state probabilities sum to 1 and πi(xi) represents the equilibrium distribution for queue i. Proof The original proof of the theorem was given by checking that the independent balance equations were satis
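The product form can be illustrated on the simplest possible case, an open tandem of two type-1 (FCFS, exponential) queues, where each queue contributes an independent geometric factor. A minimal sketch with arbitrary illustrative rates, not a general BCMP solver:

```python
# Sketch: product-form equilibrium for an open tandem of two FCFS
# exponential queues (the simplest BCMP/Jackson case).
# Rates below are illustrative; both utilizations must be < 1.

lam, mu1, mu2 = 1.0, 2.0, 3.0
rho1, rho2 = lam / mu1, lam / mu2   # per-queue utilizations

def pi(n1, n2):
    """Equilibrium probability of state (n1, n2): a product of
    per-queue geometric terms, as the theorem guarantees."""
    return (1 - rho1) * rho1**n1 * (1 - rho2) * rho2**n2

# Probabilities over a large truncated state space sum to ~1,
# so the normalizing constant C here is simply 1.
total = sum(pi(i, j) for i in range(200) for j in range(200))
```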
https://en.wikipedia.org/wiki/Stereotype%20embodiment%20theory
Stereotype embodiment theory (SET) is a theoretical model first posited by psychologist Becca Levy to explain the process by which age stereotypes influence the health of older adults. There are multiple well-documented effects of age stereotypes on a number of cognitive and physical outcomes (including memory, cardiovascular reactivity, and longevity). SET explains these findings according to a three-step process: Age stereotypes are internalized from the host culture at a young age. At some point, these age stereotypes become "self-stereotypes" about oneself as an aging individual. These self-stereotypes are then consciously and unconsciously activated to exert their effects on individual health. Underlying these three steps are SET's four main theoretical premises. According to Levy (2009): "The theory has four components: The stereotypes (a) become internalized across the lifespan, (b) can operate unconsciously, (c) gain salience from self-relevance, and (d) utilize multiple pathways." Although this theory was developed to explain the operation of age stereotypes across the lifespan, it may also explain how other types of self-stereotypes operate, such as race stereotypes among African Americans and gender stereotypes among women. Theoretical premises Internalization of stereotypes across the life span Age stereotypes are internalized starting in early childhood. This process of early internalization is facilitated by the absence of the resistance that usually arises when stereotypes are relevant to the personal identity of those exposed to them. In North America and Europe, these stereotypes tend to be negative. This process continues on into early adulthood, where the acceptance and invocation of negative age stereotypes may represent short-term benefits in the form of greater social and economic resources being allocated to younger, rather than older, adults. Hence, younger adult workers tend to assume that older adult workers are less productive tha
https://en.wikipedia.org/wiki/Zealandia%20%28personification%29
Zealandia is a national personification of New Zealand. In her stereotypical form, Zealandia appears as an evidently Western European woman who is similar in dress and appearance to Britannia. Britannia is said to be the mother of Zealandia. History As a direct reference to the United Kingdom and the old world, she brought a sense of history and classical respectability to the colony during its formative years as a young nation. Zealandia appeared on postage stamps, posters, cartoons, war memorials, and New Zealand government publications, most commonly during the first half of the 20th century. Zealandia was a commonly used symbol of the New Zealand Centennial Exhibition, which was held in Wellington in 1939 and 1940. Four large Zealandia statues exist in New Zealand towns or cities: one in Waimate, one in Palmerston, one in Symonds Street, Auckland, and one inside the Auckland War Memorial Museum. The first two (in stone) are Second Boer War memorials, and the Symonds Street statue (in bronze) is a New Zealand Wars memorial. Some smaller statues exist in other museums and in private hands. Postage stamps Zealandia also featured on one penny definitive postage stamps issued in 1901 and 1909, during the reigns of Queen Victoria and Edward VII, the period in which New Zealand went from being a colony to a dominion, and was also depicted on a stamp featuring the coat of arms issued in 1929. Coat of arms The woman who appears on the left side of the coat of arms of New Zealand is Zealandia. Apart from the coat of arms, Zealandia is seldom depicted in works today, or indeed referred to.
https://en.wikipedia.org/wiki/Local%20oscillator
In electronics, a local oscillator (LO) is an electronic oscillator used with a mixer to change the frequency of a signal. This frequency conversion process, also called heterodyning, produces the sum and difference frequencies from the frequency of the local oscillator and frequency of the input signal. Processing a signal at a fixed frequency gives a radio receiver improved performance. In many receivers, the function of local oscillator and mixer is combined in one stage called a "converter" - this reduces the space, cost, and power consumption by combining both functions into one active device. Applications Local oscillators are used in the superheterodyne receiver, the most common type of radio receiver circuit. They are also used in many other communications circuits such as modems, cable television set top boxes, frequency division multiplexing systems used in telephone trunklines, microwave relay systems, telemetry systems, atomic clocks, radio telescopes, and military electronic countermeasure (antijamming) systems. In satellite television reception, the microwave frequencies used from the satellite down to the receiving antenna are converted to lower frequencies by a local oscillator and mixer mounted at the antenna. This allows the received signals to be sent over a length of cable that would otherwise have unacceptable signal loss at the original reception frequency. In this application, the local oscillator is of a fixed frequency and the down-converted signal frequency is variable. Performance requirements Application of local oscillators in a receiver design requires care to ensure no spurious signals are radiated. Such signals can cause interference in the operation of other receivers. The performance of a signal processing system depends on the characteristics of the local oscillator. The local oscillator must produce a stable frequency with low harmonics. Stability must take into account temperature, voltage, and mechanical drift as factors
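An ideal mixer's sum and difference frequencies follow directly from the product-to-sum identity cos(a)cos(b) = ½cos(a − b) + ½cos(a + b); a small numeric check with arbitrary example frequencies:

```python
import math

# Sketch: mixing (multiplying) a 7 Hz input with a 10 Hz local oscillator
# yields components at the difference (3 Hz) and sum (17 Hz) frequencies.
# The frequencies and sample times below are illustrative.

f_in, f_lo = 7.0, 10.0
for t in [0.0, 0.123, 0.5, 0.918]:
    mixed = math.cos(2*math.pi*f_in*t) * math.cos(2*math.pi*f_lo*t)
    expected = (0.5 * math.cos(2*math.pi*(f_lo - f_in)*t)
                + 0.5 * math.cos(2*math.pi*(f_lo + f_in)*t))
    assert abs(mixed - expected) < 1e-9
```

A superheterodyne receiver keeps only one of the two products (the intermediate frequency) with a filter, which is why processing can then happen at a fixed frequency.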
https://en.wikipedia.org/wiki/Soil%20consolidation
Soil consolidation refers to the mechanical process by which soil changes volume gradually in response to a change in pressure. This happens because soil is a two-phase material, comprising soil grains and pore fluid, usually groundwater. When soil saturated with water is subjected to an increase in pressure, the high volumetric stiffness of water compared to the soil matrix means that the water initially absorbs all the change in pressure without changing volume, creating excess pore water pressure. As water diffuses away from regions of high pressure due to seepage, the soil matrix gradually takes up the pressure change and shrinks in volume. The theoretical framework of consolidation is therefore closely related to the concepts of effective stress and hydraulic conductivity. The first modern theoretical models were proposed a century ago, following two different approaches, by Karl Terzaghi and Paul Fillunger. Terzaghi's model, based on the diffusion equation, is currently the most widely used in engineering practice. In the narrow sense, "consolidation" refers strictly to this delayed volumetric response to pressure change due to gradual movement of water. Some publications also use "consolidation" in the broad sense, to refer to any process by which soil changes volume due to a change in applied pressure. This broader definition encompasses the overall concepts of soil compaction, subsidence, and heave. Some types of soil, mainly those rich in organic matter, show significant creep, whereby the soil changes volume slowly at constant effective stress over a longer time-scale than consolidation due to the diffusion of water. To distinguish between the two mechanisms, "primary consolidation" refers to consolidation due to dissipation of excess water pressure, while "secondary consolidation" refers to the creep process. The effects of consolidation are most conspicuous where a building sits over a layer of soil with low stiffness and low permeability, s
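Terzaghi's model of primary consolidation is mathematically a diffusion equation for the excess pore water pressure, du/dt = cv * d²u/dz². A minimal explicit finite-difference sketch; all numerical values are illustrative assumptions, not taken from the text:

```python
# Sketch: explicit finite-difference solution of Terzaghi's 1D
# consolidation (diffusion) equation for excess pore pressure u,
# with freely draining (u = 0) boundaries at top and bottom.

cv = 1e-7                 # coefficient of consolidation, m^2/s (assumed)
H = 1.0                   # layer thickness, m (assumed)
n = 21                    # grid points
dz = H / (n - 1)
dt = 0.4 * dz * dz / cv   # explicit scheme is stable for cv*dt/dz^2 <= 0.5

u = [100.0] * n           # initial uniform excess pore pressure, kPa
u[0] = u[-1] = 0.0        # drained boundaries

for _ in range(2000):     # march in time; the pressure dissipates
    u = ([0.0]
         + [u[i] + cv * dt / dz**2 * (u[i+1] - 2*u[i] + u[i-1])
            for i in range(1, n - 1)]
         + [0.0])

# After enough steps the excess pressure has drained almost to zero.
assert max(u) < 1.0
```

As the excess pore pressure decays, the load is transferred to the soil skeleton as effective stress, which is what drives the settlement.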
https://en.wikipedia.org/wiki/GPS%20Block%20IIIF
GPS Block IIIF, or GPS III Follow On (GPS IIIF), is the second set of GPS Block III satellites, consisting of up to 22 space vehicles. The United States Air Force began the GPS Block IIIF acquisition effort in 2016. On 14 September 2018, a manufacturing contract with options worth up to $7.2 billion was awarded to Lockheed Martin. The 22 satellites in Block IIIF are projected to start launching in 2027, with launches estimated to last through at least 2034. System Enhancements Engineering efforts for Block IIIF satellites began upon contract award in 2016—a full 16 years after the government approved entry into the initial modernization efforts for GPS III in 2000. As a result, GPS Block IIIF introduces a number of improvements and novel capabilities compared to all previous GPS satellite blocks. System Improvements Redesigned Nuclear Detonation Detection System Payload Block IIIF satellites host a redesigned U.S. Nuclear Detonation Detection System (USNDS) solution that is both smaller and lighter. The USNDS is a worldwide system of space-based sensors and ground processing equipment designed to detect, identify, locate, characterize, and report nuclear detonations in the earth's atmosphere and in space. Fully-Digital Navigation Payload GPS IIIF satellites are the first to feature a 100% digital navigation payload. The fully-digital navigation payload first introduced by Block IIIF (SV11+) produces improved accuracy, better reliability, and stronger signals compared to the 70% digital navigation payload used by GPS Block III (SV01-SV10). Improved Satellite Bus GPS IIIF-03 and beyond (GPS III SV13+) will incorporate the Lockheed Martin LM2100 Combat Bus, an improvement on the LM2100M bus used in GPS III SV01 through SV12. The LM2100 Combat Bus provides improved resiliency to cyber attacks, as well as improved spacecraft power, propulsion, and electronics. Novel Capabilities Energetic Charged Particle Sensor GPS IIIF satellites will be the first GP
https://en.wikipedia.org/wiki/Fracton
A fracton is a collective quantized vibration on a substrate with a fractal structure. Fractons are the fractal analog of phonons. Phonons are the result of applying translational symmetry to the potential in a Schrödinger equation. Fractal self-similarity can be thought of as a symmetry somewhat comparable to translational symmetry. Translational symmetry is symmetry under displacement or change of position, and fractal self-similarity is symmetry under change of scale. The quantum mechanical solutions to such a problem in general lead to a continuum of states with different frequencies. In other words, a fracton band is comparable to a phonon band. The vibrational modes are restricted to part of the substrate and are thus not fully delocalized, unlike phonon vibrational modes. Instead, there is a hierarchy of vibrational modes that encompass smaller and smaller parts of the substrate.
https://en.wikipedia.org/wiki/6-Phosphogluconic%20acid
6-Phosphogluconic acid (6-phosphogluconate) is an intermediate in the pentose phosphate pathway and the Entner–Doudoroff pathway. It is formed by 6-phosphogluconolactonase, and acted upon by phosphogluconate dehydrogenase to produce ribulose 5-phosphate. It may also be acted upon by 6-phosphogluconate dehydratase to produce 2-keto-3-deoxy-6-phosphogluconate.
https://en.wikipedia.org/wiki/Notholaena%20hookeri
Notholaena hookeri is a species name, which may refer to: Argyrochosma chilensis, described in 1856 as Notholaena hookeri E.J.Lowe Notholaena standleyi, described in 1879 as Notholaena hookeri D.C.Eaton
https://en.wikipedia.org/wiki/Telbivudine
Telbivudine is an antiviral drug used in the treatment of hepatitis B infection. It is marketed by Swiss pharmaceutical company Novartis under the trade names Sebivo (European Union) and Tyzeka (United States). Clinical trials have shown it to be significantly more effective than lamivudine or adefovir, and less likely to cause resistance. However, the HBV signature resistance mutation M204I (a change from methionine to isoleucine at position 204 in the reverse transcriptase domain of the hepatitis B polymerase) or L180M+M204V have been associated with telbivudine resistance. Telbivudine is a synthetic thymidine β-L-nucleoside analogue; it is the L-isomer of thymidine. Telbivudine impairs hepatitis B virus (HBV) DNA replication by leading to chain termination. It differs from the natural nucleotide only with respect to the location of the sugar and base moieties, taking on a levorotatory configuration versus the dextrorotatory configuration of the natural deoxynucleosides. It is taken orally in a dose of 600 mg once daily, with or without food. Telbivudine has no in vitro activity against HIV-1, and in a case series of three HIV-HBV co-infected patients, telbivudine did not produce sustained HIV-1 virologic suppression or induce any resistance mutations in HIV-1. Phase III clinical trials suggested that telbivudine put patients at greater risk for myopathy and peripheral neuropathy than the comparator drug lamivudine. The FDA required a risk evaluation and mitigation strategy (REMS) aiming to increase awareness of peripheral neuropathy by requiring distribution of a medication guide. In 2016, Novartis posted a discontinuation notice. Efficacy or safety concerns were not cited as rationale for discontinuation, but rather "availability of alternative medications"; presumably this refers to tenofovir disoproxil, which became available as a generic medication in 2017, and is a safe and effective treatment for chronic HBV infection.
https://en.wikipedia.org/wiki/Long%20Josephson%20junction
In superconductivity, a long Josephson junction (LJJ) is a Josephson junction which has one or more dimensions longer than the Josephson penetration depth λ_J. This definition is not strict. In terms of the underlying model, a short Josephson junction is characterized by the Josephson phase φ(t), which is only a function of time, but not of coordinates, i.e. the Josephson junction is assumed to be point-like in space. In contrast, in a long Josephson junction the Josephson phase can be a function of one or two spatial coordinates, i.e., φ(x,t) or φ(x,y,t). Simple model: the sine-Gordon equation The simplest and the most frequently used model which describes the dynamics of the Josephson phase in an LJJ is the so-called perturbed sine-Gordon equation. For the case of a 1D LJJ it looks like: λ_J^2 φ_xx − ω_p^−2 φ_tt − sin φ = ω_c^−1 φ_t − j/j_c, where subscripts x and t denote partial derivatives with respect to x and t, λ_J is the Josephson penetration depth, ω_p is the Josephson plasma frequency, ω_c is the so-called characteristic frequency and j is the bias current density normalized to the critical current density j_c. In the above equation, the r.h.s. is considered as a perturbation. Usually for theoretical studies one uses the normalized sine-Gordon equation: φ_xx − φ_tt − sin φ = α φ_t − γ, where the spatial coordinate is normalized to the Josephson penetration depth λ_J, time is normalized to the inverse plasma frequency ω_p^−1, the parameter α is the dimensionless damping parameter (related to the McCumber-Stewart parameter β_c), and, finally, γ = j/j_c is a normalized bias current. Important solutions Small amplitude plasma waves: φ(x,t) = A exp[i(kx − ωt)]. Soliton (aka fluxon, Josephson vortex): φ(x,t) = 4 arctan exp[±(x − ut)/√(1 − u^2)]. Here x, t and u are the normalized coordinate, normalized time and normalized velocity. The physical velocity is normalized to the so-called Swihart velocity, which represents a typical unit of velocity and is equal to the unit of space divided by the unit of time, λ_J ω_p.
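The fluxon solution, φ(x,t) = 4 arctan exp[(x − ut)/√(1 − u²)] in normalized units, can be checked numerically against the unperturbed normalized sine-Gordon equation φ_xx − φ_tt − sin φ = 0 (α = γ = 0); a short finite-difference sketch with an arbitrary velocity:

```python
import math

# Sketch: verify that the fluxon phi(x, t) = 4*arctan(exp((x - u*t)/g)),
# g = sqrt(1 - u^2), solves phi_xx - phi_tt - sin(phi) = 0, using
# central second differences at a few arbitrary sample points.

u = 0.6                      # normalized velocity, |u| < 1 (illustrative)
g = math.sqrt(1 - u*u)       # Lorentz-like contraction factor

def phi(x, t):
    return 4.0 * math.atan(math.exp((x - u*t) / g))

h = 1e-4                     # finite-difference step
for x, t in [(0.0, 0.0), (0.5, 0.2), (-1.0, 0.7)]:
    phi_xx = (phi(x + h, t) - 2*phi(x, t) + phi(x - h, t)) / h**2
    phi_tt = (phi(x, t + h) - 2*phi(x, t) + phi(x, t - h)) / h**2
    residual = phi_xx - phi_tt - math.sin(phi(x, t))
    assert abs(residual) < 1e-5
```

The kink interpolates between φ = 0 far to the left and φ = 2π far to the right, carrying exactly one flux quantum, which is why it is called a fluxon.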
https://en.wikipedia.org/wiki/Dionysodorus
Dionysodorus of Caunus (c. 250 BC – c. 190 BC) was an ancient Greek mathematician. Life and work Little is known about the life of Dionysodorus. Pliny the Elder writes about a Dionysodorus who measured the earth's circumference; however, he is probably the one from Pontus and different from the one from Caunus, as Strabo differentiates between the two mathematicians. Dionysodorus is remembered for solving the cubic equation by means of the intersection of a rectangular hyperbola and a parabola. Eutocius credits Dionysodorus with the method of cutting a sphere into a given ratio, as described by him. Heron mentions a work by Dionysodorus entitled On the Tore, in which the volume of a torus is calculated and found to be equal to the area of the generating circle multiplied by the circumference of the circle created by tracing the center of the generating circle as it rotates about the torus's axis of revolution. Dionysodorus used Archimedes' methods to prove this result. It is also likely that this Dionysodorus was the inventor of a conical sundial. Pliny's mention tells of an inscription placed on his tomb, addressed to the world above, stating that he had been to the centre of the earth and found it 42 thousand stadia distant. Pliny calls this a striking instance of Greek vanity; but this figure compares well with the modern measurement.
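The torus volume rule credited to Dionysodorus, the area of the generating circle multiplied by the circumference traced by its center, agrees with the modern formula V = 2π²Rr²; a minimal check with illustrative radii:

```python
import math

# Sketch: torus volume as (area of generating circle) * (circumference
# traced by its center), the rule attributed to Dionysodorus.

def torus_volume(r, R):
    """Volume of a torus: tube radius r, center of tube at distance R
    from the axis of revolution."""
    return (math.pi * r**2) * (2 * math.pi * R)

# Agrees with the standard closed form V = 2 * pi^2 * R * r^2.
assert abs(torus_volume(1.0, 3.0) - 2 * math.pi**2 * 3.0) < 1e-12
```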
https://en.wikipedia.org/wiki/Sonic%20artifact
In sound and music production, a sonic artifact, or simply artifact, refers to sonic material that is accidental or unwanted, resulting from the editing or manipulation of a sound. Types Because there are always technical restrictions in the way a sound can be recorded (in the case of acoustic sounds) or designed (in the case of synthesised or processed sounds), sonic errors often occur. These errors are termed artifacts (or sound/sonic artifacts), and may be pleasing or displeasing. A sonic artifact is sometimes a type of digital artifact, and in some cases is the result of data compression (not to be confused with dynamic range compression, which may also create sonic artifacts). Often an artifact is deliberately produced for creative reasons, for example to introduce a change in the timbre of the original sound or to create a sense of cultural or stylistic context. A well-known example is the overdriving of an electric guitar or electric bass signal to produce a clipped, distorted guitar tone or fuzz bass. Editing processes that deliberately produce artifacts often involve technical experimentation. A good example of the deliberate creation of sonic artifacts is the addition of grainy pops and clicks to a recent recording in order to make it sound like a vintage vinyl record. Flanging and distortion were originally regarded as sonic artifacts; as time passed they became a valued part of pop music production methods, and flanging is now added to electric guitar and keyboard parts. Other magnetic tape artifacts include wow, flutter, saturation, hiss, noise, and print-through. It is valid to consider the genuine surface noise, such as the pops and clicks audible when a vintage vinyl recording is played back or recorded onto another medium, as sonic artifacts, although not all sonic artifacts carry a sense of "past"; the broader sense is that of "by-product". Other vinyl record artifacts include turntable rumble, ticks, crackles and groove ec
https://en.wikipedia.org/wiki/Kristina%20Reiss
Kristina Reiss (born 1952) is a German mathematics educator. She is professor of mathematics education and dean of education at the Technical University of Munich, where she holds the Heinz Nixdorf Chair of Mathematics Education. Education and career Reiss studied mathematics beginning in 1971 at Heidelberg University, completing her doctorate (Dr. rer. nat.) there in 1980. Her dissertation, Eine allgemeinere Kennzeichnung der sporadischen einfachen Gruppe von Rudvalis, concerned group theory and was supervised by Zvonimir Janko. She worked as a researcher at the Karlsruhe University of Education from 1980 until 1991, when she became a professor of mathematics at the Stuttgart Technology University of Applied Sciences. She moved in 1992 to the University of Flensburg, in 1997 to the University of Oldenburg, in 2002 to the University of Augsburg, and in 2005 to the Technical University of Munich (TUM). At TUM, she was initially a professor of mathematics and computer science education, and has held the Heinz Nixdorf Chair of Mathematics Education since 2009. Since 2014 she has been dean of the TUM School of Education. Recognition In 2011 Reiss joined Acatech, the German Academy of Science and Engineering.
https://en.wikipedia.org/wiki/List%20of%20WHO%20fungal%20priority%20pathogens
WHO fungal priority pathogens are groups of pathogenic fungi that the World Health Organization deems in need of global attention. The list has three priority groups. In decreasing order of concern, they are: critical, high, and medium.

Critical group
Cryptococcus neoformans
Candida auris
Aspergillus fumigatus
Candida albicans

High group
Nakaseomyces glabrata (formerly Candida glabrata)
Histoplasma spp.:
  Histoplasma capsulatum
  Histoplasma capsulatum var. capsulatum
  Histoplasma capsulatum var. duboisii
  Histoplasma capsulatum var. farciminosum
  Histoplasma duboisii
  Histoplasma muris
eumycetoma causative agents, including:
  Acremonium falciform
  Acremonium kiliense
  Acremonium recifei
  Aspergillus flavus
  Aspergillus nidulans
  Cladophialophora bantiana
  Cladophialophora mycetomatis
  Curvularia geniculata
  Curvularia lunata
  Cylindrocarpon cyanescens
  Exophiala jeanselmei
  Falciformispora senegalensis
  Fusarium moniliforme
  Fusarium solani
  Glenospora clapieri
  Leptosphaeria senegalensis
  Leptosphaeria tompkinsii
  Madurella grisea
  Madurella mycetomatis
  Microsporum audouinii
  Microsporum canis
  Neotestudina rosatii
  Phaeoacremonium parasiticum
  Phialophora cyanescens
  Phialophora verrucosa
  Pseudoallescheria boydii (Scedosporium apiospermum)
  Pyrenochaeta mackinonii
  Pyrenochaeta romeroi
  Trichophyton rubrum
  Zopfia rosatii
Mucorales (includes 13 families, 56 genera, 300 species)
Fusarium spp. (see List of Fusarium species)
Candida tropicalis
Candida parapsilosis

Medium group
Scedosporium spp., including:
  Scedosporium americanum
  Scedosporium apiospermum
  Scedosporium aurantiacum
  Scedosporium boydii
  Scedosporium cereisporum
  Scedosporium deficiens
  Scedosporium dehoogii
  Scedosporium desertorum
  Scedosporium magalhaesii
  Scedosporium minutisporum
  Scedosporium prolificans
  Scedosporium rarisporum
  Scedosporium sanyaense
Lomentospora prolificans
Coccidioides spp., including:
  Coccidioides immitis
  Coccidioides posadasii
Pichia kudriavzeveii (formerly Candida krusei)
Cryptococcus gattii
Talaromyces marneffei
Pn
https://en.wikipedia.org/wiki/Radioimmunoassay
A radioimmunoassay (RIA) is an immunoassay that uses radiolabeled molecules in a stepwise formation of immune complexes. A RIA is a very sensitive in vitro assay technique used to measure concentrations of substances, usually antigen concentrations (for example, hormone levels in blood), by use of antibodies. Although the RIA technique is extremely sensitive and extremely specific, and requires specialized equipment, it remains among the least expensive methods to perform such measurements. It requires special precautions and licensing, since radioactive substances are used. In contrast, an immunoradiometric assay (IRMA) is an immunoassay that uses radiolabeled molecules but in an immediate rather than stepwise way. A radioallergosorbent test (RAST) is an example of a radioimmunoassay; it is used to detect the causative allergen for an allergy.

Method
Classically, to perform a radioimmunoassay, a known quantity of an antigen is made radioactive, frequently by labeling it with gamma-radioactive isotopes of iodine, such as 125-I, or with tritium attached to tyrosine. This radiolabeled antigen is then mixed with a known amount of antibody for that antigen, and as a result the two specifically bind to one another. Then, a sample of serum from a patient containing an unknown quantity of that same antigen is added. This causes the unlabeled (or "cold") antigen from the serum to compete with the radiolabeled ("hot") antigen for antibody binding sites. As the concentration of "cold" antigen increases, more of it binds to the antibody, displacing the radiolabeled variant and reducing the ratio of antibody-bound radiolabeled antigen to free radiolabeled antigen. The bound antigens are then separated, and the radioactivity of the free (unbound) antigen remaining in the supernatant is measured using a gamma counter. This value is then compared to a standardised calibration curve to work out the concentration of the unlabelled antigen in the patient serum sample. RIAs c
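The final calibration-curve step can be sketched as simple interpolation between standards. The sketch below (plain Python; the standard concentrations and count values are invented for illustration, not real assay data) reads an unknown concentration off a curve of measured counts versus known standards, where counts fall as cold antigen displaces the tracer:

```python
def interpolate_concentration(counts, standards):
    """Linearly interpolate a concentration from a calibration curve.

    standards: list of (concentration, counts) pairs from samples of
    known concentration; counts decrease as concentration increases,
    because cold antigen displaces the radiolabeled tracer.
    """
    # Sort by counts so we can walk along the measured axis.
    pts = sorted(standards, key=lambda p: p[1])
    for (c_lo, cnt_lo), (c_hi, cnt_hi) in zip(pts, pts[1:]):
        if cnt_lo <= counts <= cnt_hi:
            frac = (counts - cnt_lo) / (cnt_hi - cnt_lo)
            return c_lo + frac * (c_hi - c_lo)
    raise ValueError("counts outside the calibrated range")

# Hypothetical standards: higher concentration -> fewer bound counts.
standards = [(0.0, 9000), (1.0, 7000), (5.0, 4000), (10.0, 2000)]

# A patient sample measuring 5500 counts falls between the 1.0 and
# 5.0 standards, so its concentration is read off between them.
unknown = interpolate_concentration(5500, standards)
```

Real assays fit a nonlinear (e.g. four-parameter logistic) curve to the standards rather than interpolating linearly, but the read-off principle is the same.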
https://en.wikipedia.org/wiki/Marcel%27s%20Quantum%20Kitchen
Marcel's Quantum Kitchen is an American television program broadcast by the Syfy channel. The first episode premiered on March 22, 2011, at 10 pm EST. The series follows Marcel Vigneron, of Top Chef fame, and his new molecular gastronomy catering company; each episode features the development of unique dishes for a client's event and the event itself. In creating the dishes, Vigneron draws inspiration from the client and the purpose of the event, which has resulted in dishes such as a hash-brown bird's nest with tomato-foam egg and liquor-filled bonbon engagement rings in a passionfruit marshmallow pillow box. Due to low ratings, the show was cancelled after six episodes.

Team
Marcel Vigneron - executive chef and molecular gastronomist.
Jarrid Masse - sous chef and fabricator of Vigneron's serving displays.
Devon Espinosa - sous chef and mixologist; friend of Marcel's from culinary school.
Robyn Wilson - prep cook with experience in the catering industry.
Sally Camacho - pastry chef (additional member called in for complex desserts), specializing in chocolate and sugar work, wedding cakes, bonbons, and plated desserts.
Katsuya Fukushima - chef (additional member called in for extra help) and protégé of José Andrés, as well as Marcel's friend; specializes in avant-garde cooking techniques.

Episodes
https://en.wikipedia.org/wiki/List%20of%20OpenCL%20applications
The following is a list of computer programs built to take advantage of the OpenCL or WebCL heterogeneous compute framework.

Graphics
ACDSee
Adobe Photoshop
Affinity Photo
Capture One
Blurate
darktable
FAST: imaging Medical
GIMP
HALCON by MVTec
Helicon Focus
ImageMagick
Musemage
Pathfinder, GPU-based font rasterizer
PhotoScan
seedimg

CAD and 3D modelling
Autodesk Maya
Blender (GPU rendering with NVIDIA CUDA and OptiX & AMD OpenCL)
Houdini
LuxRender
Mandelbulber

Audio, video, and multimedia
AlchemistXF
CUETools
DaVinci Resolve by Blackmagic Design
FFmpeg (has a number of OpenCL filters)
gr-fosphor (GNU Radio block for RTSA-like spectrum visualization)
HandBrake
Final Cut Pro X
KNLMeansCL: denoise plugin for AviSynth
Libav
OpenCV
RealFlow Hybrido2
Sony Catalyst
Vegas Pro by Magix Software GmbH
vReveal by MotionDSP
Total Media Theatre by ArcSoft
x264
x265 (h.265/HEVC possible)

Web (including WebCL)
Google Chrome (experimental)
Mozilla Firefox (experimental)

Office
Collabora Online
LibreOffice Calc

Games
Military Operations, operational-level real-time strategy game where the complete army is simulated in real time using OpenCL
Planet Explorers, uses OpenCL to calculate the voxels
BeamNG.drive, going to use OpenCL for the physics engine
Leela Zero, open-source replication of AlphaGo Zero using OpenCL for neural network computation

Scientific computing
Advanced Simulation Library (ASL)
AMD Compute Libraries:
  clBLAS, complete set of BLAS level 1, 2 & 3 routines
  clSPARSE, routines for sparse matrices
  clFFT, FFT routines
  clRNG, random number generators MRG31k3p, MRG32k3a, LFSR113, and Philox-4×32-10
ArrayFire: parallel computing with an easy-to-use API with JIT compiler (open source)
BEAGLE, Bayesian and Maximum Likelihood phylogenetics library
BigDFT
BOINC
Bolt, STL-compatible library for creating accelerated data parallel applications
Bullet
CLBlast: tuned clBlas
clMAGMA, OpenCL
https://en.wikipedia.org/wiki/Wine%20for%20the%20Confused
Wine for the Confused is a documentary hosted by John Cleese. It is a light-hearted introduction to wine for novices. Cleese guides viewers through the basics of wine types and grape varieties, wine making, wine tasting and terminology, and buying and storing wines, through direct narrative and interviews with wine makers and wine sellers. The film runs 92 minutes and includes visits to wineries in California. It concludes with a large group conducting a blind wine tasting. One result was that most tasters could not distinguish between red wine and white wine. Another was that most tasters rated an inexpensive wine equal in taste to an expensive prestige wine, and both of these outscored the rest of the mid-priced and high-priced wines in the blind test.