https://en.wikipedia.org/wiki/Server%20Efficiency%20Rating%20Tool
The Server Efficiency Rating Tool (SERT) is a performance analysis tool that is specifically designed to address the requirements of the Environmental Protection Agency's ENERGY STAR for Servers v2.0 specification. The SERT Beta 1 was introduced in September 2011. Several SPEC member companies contributed to the development of the SERT, including AMD, Dell, Fujitsu, HPE, Intel, IBM, and Microsoft.
https://en.wikipedia.org/wiki/Digital%20Devil%20Story%3A%20Megami%20Tensei
Digital Devil Story: Megami Tensei refers to two distinct role-playing video games based on a trilogy of science fantasy novels by Japanese author Aya Nishitani. One version was developed by Atlus and published by Namco in 1987 for the Famicom—Atlus would go on to create further games in the Megami Tensei franchise. A separate version for personal computers was developed and published by Telenet Japan with assistance from Atlus during the same year. The story sees Japanese high school students Akemi Nakajima and Yumiko Shirasagi combat the forces of Lucifer, unleashed by a demon summoning program created by Nakajima. The gameplay features first-person dungeon crawling and turn-based battles or negotiation with demons in the Famicom version, and a journey through a hostile labyrinth as Nakajima featuring real-time combat in the Telenet version. Development on both versions of the video game began as part of a multimedia expansion of Nishitani's book series. Nishitani was deeply involved with the design and scenario. The gameplay mechanics in Atlus' role-playing version of the game were based on the Wizardry series, but with an added demon negotiation system considered revolutionary for the time. Atlus and Telenet Japan worked on their projects simultaneously, playing against genre expectations for their respective platforms. The Famicom version proved the more popular with both critics and players, leading to the development of the 1990 Famicom sequel Digital Devil Story: Megami Tensei II. An enhanced port of both games for the Super Famicom was released in 1995. Gameplay The Famicom version of Digital Devil Story: Megami Tensei is a traditional role-playing video game in which the player takes control of a party composed of two humans and a number of demons. The party explores a large dungeon using a first-person perspective. The human characters use a variety of weapons and items, with the primary weapons being swords and guns. The items, which can range from h
https://en.wikipedia.org/wiki/Algorithmic%20Puzzles
Algorithmic Puzzles is a book of puzzles based on computational thinking. It was written by computer scientists Anany and Maria Levitin, and published in 2011 by Oxford University Press. Topics The book begins with a "tutorial" introducing classical algorithm design techniques including backtracking, divide-and-conquer algorithms, and dynamic programming, methods for the analysis of algorithms, and their application in example puzzles. The puzzles themselves are grouped into three sets of 50 puzzles, in increasing order of difficulty. A final two chapters provide brief hints and more detailed solutions to the puzzles, with the solutions forming the majority of pages of the book. Some of the puzzles are well known classics, some are variations of known puzzles making them more algorithmic, and some are new. They include: Puzzles involving chessboards, including the eight queens puzzle, knight's tours, and the mutilated chessboard problem Balance puzzles River crossing puzzles The Tower of Hanoi Finding the missing element in a data stream The geometric median problem for Manhattan distance Audience and reception The puzzles in the book cover a wide range of difficulty, and in general do not require more than a high school level of mathematical background. William Gasarch notes that grouping the puzzles only by their difficulty and not by their themes is actually an advantage, as it provides readers with fewer clues about their solutions. Reviewer Narayanan Narayanan recommends the book to any puzzle aficionado, or to anyone who wants to develop their powers of algorithmic thinking. Reviewer Martin Griffiths suggests another group of readers, schoolteachers and university instructors in search of examples to illustrate the power of algorithmic thinking. Gasarch recommends the book to any computer scientist, evaluating it as "a delight".
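One of the listed topics, finding the missing element in a data stream, gives the flavor of such puzzles. A minimal illustrative sketch (written for this summary, not taken from the book):

```python
def find_missing(stream, n):
    """Return the one missing value from a stream of the integers 1..n.

    Uses the closed-form sum n(n+1)/2, so the stream is read once and
    only O(1) extra memory is needed.
    """
    expected = n * (n + 1) // 2
    return expected - sum(stream)

# Example: the integers 1..6 with 4 missing.
print(find_missing([1, 2, 3, 5, 6], 6))  # -> 4
```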
https://en.wikipedia.org/wiki/Jitter
In electronics and telecommunications, jitter is the deviation from true periodicity of a presumably periodic signal, often in relation to a reference clock signal. In clock recovery applications it is called timing jitter. Jitter is a significant, and usually undesired, factor in the design of almost all communications links. Jitter can be quantified in the same terms as all time-varying signals, e.g., root mean square (RMS) or peak-to-peak displacement. Also, like other time-varying signals, jitter can be expressed in terms of spectral density. Jitter period is the interval between two times of maximum effect (or minimum effect) of a signal characteristic that varies regularly with time. Jitter frequency, the more commonly quoted figure, is its inverse. ITU-T G.810 classifies deviations at frequencies below 10 Hz as wander and those at or above 10 Hz as jitter. Jitter may be caused by electromagnetic interference and crosstalk with carriers of other signals. Jitter can cause a display monitor to flicker, affect the performance of processors in personal computers, introduce clicks or other undesired effects in audio signals, and cause loss of transmitted data between network devices. The amount of tolerable jitter depends on the affected application. Metrics For clock jitter, there are three commonly used metrics: Absolute jitter The absolute difference in the position of a clock's edge from where it would ideally be. Maximum time interval error (MTIE) Maximum error committed by a clock under test in measuring a time interval for a given period of time. Period jitter (a.k.a. cycle jitter) The difference between any one clock period and the ideal or average clock period. Period jitter tends to be important in synchronous circuitry such as digital state machines where the error-free operation of the circuitry is limited by the shortest possible clock period (average period less maximum cycle jitter), and the performance of the circuitry is s
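As an illustration of the clock-jitter metrics above, here is a short sketch (illustrative only; the function and variable names are not from any standard) that computes period jitter, RMS jitter, and peak-to-peak jitter from a list of measured clock-edge timestamps:

```python
import numpy as np

def jitter_metrics(edge_times, nominal_period):
    """Compute simple clock-jitter figures from rising-edge timestamps (seconds)."""
    periods = np.diff(edge_times)              # measured cycle lengths
    period_jitter = periods - nominal_period   # deviation of each cycle from ideal
    return {
        "rms_period_jitter": float(np.sqrt(np.mean(period_jitter ** 2))),
        "peak_to_peak_jitter": float(period_jitter.max() - period_jitter.min()),
        "max_period_jitter": float(np.abs(period_jitter).max()),
    }

# Example: a 100 MHz clock (10 ns nominal period) with a little Gaussian timing noise.
rng = np.random.default_rng(0)
ideal_edges = np.arange(1000) * 10e-9
noisy_edges = ideal_edges + rng.normal(0, 20e-12, size=1000)
print(jitter_metrics(noisy_edges, 10e-9))
```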
https://en.wikipedia.org/wiki/LibreCMC
LibreCMC is a Linux-libre distribution for computers with minimal resources, such as the Ben NanoNote, ath9k-based Wi-Fi routers, and other hardware with emphasis on free software. Based on OpenWrt, the project aims for compliance with the GNU Free System Distribution Guidelines (GNU FSDG) and to ensure that the project continues to meet these requirements set forth by the Free Software Foundation (FSF). LibreCMC does not support ac (Wi-Fi 5) or ax (Wi-Fi 6) due to a lack of free chipsets. As of 2020, releases do not utilize codenames anymore. The acronym "CMC" in the libreCMC name stands for "Concurrent Machine Cluster". History On April 23, 2014, libreCMC's first public release was mentioned in a Trisquel Linux forum. On September 4, 2014, the Free Software Foundation (FSF) added libreCMC to its list of endorsed distributions. Shortly afterwards, on September 12, 2014, the FSF awarded their Respects Your Freedom (RYF) Certification to a new router pre-installed with libreCMC. On May 2, 2015, libreCMC merged with the LibreWRT project. LibreWRT, initially developed as a case study, was listed by the website prism-break.org as one of the alternatives to proprietary firmware, but today the website lists libreCMC. On March 10, 2016, the FSF awarded their RYF certification to a new router pre-installed with libreCMC. On March 29, 2017, libreCMC began its first release based upon the LEDE (Linux Embedded Development Environment) 17.01 codebase. On January 3, 2020, libreCMC began its first release based upon the OpenWrt 19.07 codebase. Release history List of supported hardware LibreCMC supports the following devices: Buffalo (Melco subsidiary) WZR-HP-G300NH WHR-HP-G300NH Netgear WNDR3800: v1.x TP-Link TL-MR3020: v1 TL-WR741ND: v1 - v2, v4.20 - v4.27 TL-WR841ND: v5.x, v8.x, v9.x, v10.x, v11.x, v12.x TL-WR842ND: v1, v2 TL-WR1043ND: v1.x, v2.x, v3.x, v4.x, v5.x ThinkPenguin TPE-NWIFIROUTER2 TPE-R1100 TPE-R1200 TPE-R1300 TPE-R140
https://en.wikipedia.org/wiki/Alfred%20Y.%20Cho
Alfred Yi Cho (; born July 10, 1937) is a Chinese-American electrical engineer, inventor, and optical engineer. He is the Adjunct Vice President of Semiconductor Research at Alcatel-Lucent's Bell Labs. He is known as the "father of molecular beam epitaxy", a technique he developed at that facility in the late 1960s. He is also the co-inventor, with Federico Capasso, of quantum cascade lasers at Bell Labs in 1994. Cho was elected a member of the National Academy of Engineering in 1985 for his pioneering development of a molecular beam epitaxy technique, leading to unique semiconductor layer device structures. Biography Cho was born in Beiping. He went to Hong Kong in 1949 and had his secondary education at Pui Ching Middle School there. Cho holds B.S., M.S. and Ph.D. degrees in electrical engineering from the University of Illinois. He joined Bell Labs in 1968. He is a member of the National Academy of Sciences and the National Academy of Engineering, as well as a Fellow of the American Physical Society, the Institute of Electrical and Electronics Engineers, the American Philosophical Society, and the American Academy of Arts and Sciences. In June 2007 he was honoured with the U.S. National Medal of Technology, the highest honor awarded by the President of the United States for technological innovation. Cho received the award for his contributions to the invention of molecular beam epitaxy (MBE) and his work to commercialize the process. He already has many awards to his name, including: the American Physical Society's International Prize for New Materials in 1982, the Solid State Science and Technology Medal of the Electrochemical Society in 1987, the World Materials Congress Award of ASM International in 1988, the Gaede-Langmuir Award of the American Vacuum Society in 1988, the IRI Achievement Award of the Industrial Research Institute in 1988, the New Jersey Governor's Thomas Alva Edison Science Award in 1990, the International Crystal Growth Award of the
https://en.wikipedia.org/wiki/Rhodothermus%20marinus
Rhodothermus marinus is a species of bacteria. It is obligately aerobic, moderately halophilic, thermophilic, Gram-negative and rod-shaped, about 0.5 μm in diameter and 2-2.5 μm long.
https://en.wikipedia.org/wiki/Metabolomics
Metabolomics is the scientific study of chemical processes involving metabolites, the small molecule substrates, intermediates, and products of cell metabolism. Specifically, metabolomics is the "systematic study of the unique chemical fingerprints that specific cellular processes leave behind", the study of their small-molecule metabolite profiles. The metabolome represents the complete set of metabolites in a biological cell, tissue, organ, or organism, which are the end products of cellular processes. Messenger RNA (mRNA), gene expression data, and proteomic analyses reveal the set of gene products being produced in the cell, data that represents one aspect of cellular function. Conversely, metabolic profiling can give an instantaneous snapshot of the physiology of that cell, and thus, metabolomics provides a direct "functional readout of the physiological state" of an organism. There are indeed quantifiable correlations between the metabolome and the other cellular ensembles (genome, transcriptome, proteome, and lipidome), which can be used to predict metabolite abundances in biological samples from, for example mRNA abundances. One of the ultimate challenges of systems biology is to integrate metabolomics with all other -omics information to provide a better understanding of cellular biology. History The concept that individuals might have a "metabolic profile" that could be reflected in the makeup of their biological fluids was introduced by Roger Williams in the late 1940s, who used paper chromatography to suggest characteristic metabolic patterns in urine and saliva were associated with diseases such as schizophrenia. However, it was only through technological advancements in the 1960s and 1970s that it became feasible to quantitatively (as opposed to qualitatively) measure metabolic profiles. The term "metabolic profile" was introduced by Horning, et al. in 1971 after they demonstrated that gas chromatography-mass spectrometry (GC-MS) could be used to me
https://en.wikipedia.org/wiki/Robertson%20Centre%20for%20Biostatistics
The Robertson Centre for Biostatistics is a specialised biostatistical research centre in Glasgow, Scotland. It is part of the College of Medical, Veterinary and Life Sciences and the Institute of Health and Wellbeing at the University of Glasgow. All scales of research are carried out at the centre from multi-site clinical trials to small scale research projects. The centre also has interests in the development of novel informatics solutions for clinical research, statistical issues in epidemiology and health economic evaluation. History The centre led the WOSCOP study (New England Journal of Medicine 1995; 333:1301-7) which found that treatment with Pravastatin significantly reduced the risk of myocardial infarction and the risk of death from cardiovascular causes without adversely affecting the risk of death from noncardiovascular causes in men with moderate hypercholesterolaemia and no history of myocardial infarction. The Robertson Centre joined with the Glasgow Clinical Research Facility and Greater Glasgow and Clyde NHS R&D division in November 2007 to create a UKCRN registered Clinical Trials Unit - the Glasgow Clinical Trials Unit.
https://en.wikipedia.org/wiki/InterNetNews
InterNetNews (INN) is a Usenet news server package, originally released by Rich Salz in 1991, and presented at the Summer 1992 USENIX conference in San Antonio, Texas. It was the first news server with integrated NNTP functionality. While previous servers processed articles individually or in batches, innd is a single continuously running process that receives articles from the network, files them, and records what remote hosts should receive them. Readers can access articles directly from the disk in the same manner as B News and C News, but an included program, called nnrpd, also serves newsreaders that employ NNTP. A later improvement was the Cyclical News Filesystem (CNFS), which sequentially stores articles in large on-disk buffers. This method, implemented by Scott Fritchie, greatly increased performance by eliminating the operating system overhead needed to deal with thousands of individual article files. James Brister's innfeed program was also added to the package. Like innd, innfeed operates continuously to feed articles out to other servers, while the earlier innxmit processed them in batches. This combination allows articles to be received and redistributed with virtually no latency, and has substantially changed the nature of Usenet interaction by reducing the time for messages to be posted, read across the network and answered from hours or days to seconds or minutes. A similar earlier program, called nntplink, provided a comparable function, but it was produced independently. INN is under active development. The package is maintained by volunteers, and development is hosted by the Internet Systems Consortium. INN is currently maintained by Russ Allbery and the ISC.
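For readers unfamiliar with the NNTP reader protocol that nnrpd speaks, here is a minimal client sketch using Python's standard nntplib module (present up to Python 3.12, removed in 3.13); the server hostname and newsgroup are placeholders, not an actual INN installation:

```python
import nntplib

# Connect to an NNTP reader port (hostname is a placeholder).
with nntplib.NNTP("news.example.org") as server:
    # Select a group; count/first/last are article numbers reported by the server.
    resp, count, first, last, name = server.group("comp.lang.python")
    print(f"{name}: {count} articles ({first}-{last})")

    # Fetch overview data for the last few articles and print their subjects.
    resp, overviews = server.over((last - 4, last))
    for art_num, over in overviews:
        print(art_num, over.get("subject"))
```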
https://en.wikipedia.org/wiki/Skewb%20Ultimate
The Skewb Ultimate, originally marketed as the Pyraminx Ball, is a twelve-sided puzzle derivation of the Skewb, produced by German toy-maker Uwe Mèffert. Most versions of this puzzle are sold with six different colors of stickers attached, with opposite sides of the puzzle having the same color; however, some early versions of the puzzle have a full set of 12 colors. Description The Skewb Ultimate is made in the shape of a dodecahedron, like the Megaminx, but cut differently. Each face is cut into four parts, two equal and two unequal. Each cut is a deep cut: it bisects the puzzle. This results in eight smaller corner pieces and six larger "edge" pieces. The object of the puzzle is to scramble the colors, and then restore them to the original configuration. Solutions At first glance, the Skewb Ultimate appears to be much more difficult to solve than the other Skewb puzzles, because of its uneven cuts which cause the pieces to move in a way that may seem irregular or strange. Mathematically speaking, however, the Skewb Ultimate has exactly the same structure as the Skewb Diamond. The solution for the Skewb Diamond can be used to solve this puzzle, by identifying the Diamond's face pieces with the Ultimate's corner pieces, and the Diamond's corner pieces with the Ultimate's edge pieces. The only additional trick here is that the Ultimate's corner pieces (equivalent to the Diamond's face pieces) are sensitive to orientation, and so may require an additional algorithm for orienting them after being correctly placed. Similarly, the Skewb Ultimate is mathematically identical to the Skewb, by identifying corners with corners, and the Skewb's face centers with the Ultimate's edges. The solution of the Skewb can be used directly to solve the Skewb Ultimate. The only addition is that the edge pieces of the Skewb Ultimate are sensitive to orientation, and may require an additional algorithm to orient them after being placed correctly. Number of combinations The Skewb Ult
https://en.wikipedia.org/wiki/Shottsuru
Shottsuru (塩魚汁) is a pungent regional Japanese fish sauce similar to the Thai nam pla. The authentic version is made from the fish known as the hatahata (Arctoscopus japonicus or sailfin sandfish), and its production is associated with the Akita region. See also List of fish sauces
https://en.wikipedia.org/wiki/Carl%20Gustav%20Jacob%20Jacobi
Carl Gustav Jacob Jacobi (; ; 10 December 1804 – 18 February 1851) was a German mathematician who made fundamental contributions to elliptic functions, dynamics, differential equations, determinants, and number theory. His name is occasionally written as Carolus Gustavus Iacobus Iacobi in his Latin books, and his first name is sometimes given as Karl. Biography Jacobi was born of Ashkenazi Jewish parentage in Potsdam on 10 December 1804. He was the second of four children of banker Simon Jacobi. His elder brother Moritz von Jacobi would also become known later as an engineer and physicist. He was initially home schooled by his uncle Lehman, who instructed him in the classical languages and elements of mathematics. In 1816, the twelve-year-old Jacobi went to the Potsdam Gymnasium, where students were taught all the standard subjects: classical languages, history, philology, mathematics, sciences, etc. As a result of the good education he had received from his uncle, as well as his own remarkable abilities, after less than half a year Jacobi was moved to the senior year despite his young age. However, as the University would not accept students younger than 16 years old, he had to remain in the senior class until 1821. He used this time to advance his knowledge, showing interest in all subjects, including Latin, Greek, philology, history and mathematics. During this period he also made his first attempts at research, trying to solve the quintic equation by radicals. In 1821 Jacobi went to study at Berlin University, where he initially divided his attention between his passions for philology and mathematics. In philology he participated in the seminars of Böckh, drawing the professor's attention with his talent. Jacobi did not follow a lot of mathematics classes at the time, finding the level of mathematics taught at Berlin University too elementary. He continued, instead, with his private study of the more advanced works of Euler, Lagrange and Laplace. By 1823 he u
https://en.wikipedia.org/wiki/Beeswax
Beeswax (also known as cera alba) is a natural wax produced by honey bees of the genus Apis. The wax is formed into scales by eight wax-producing glands in the abdominal segments of worker bees, which discard it in or at the hive. The hive workers collect and use it to form cells for honey storage and larval and pupal protection within the beehive. Chemically, beeswax consists mainly of esters of fatty acids and various long-chain alcohols. Beeswax has been used since prehistory as the first plastic, as a lubricant and waterproofing agent, in lost wax casting of metals and glass, as a polish for wood and leather, for making candles, as an ingredient in cosmetics and as an artistic medium in encaustic painting. Beeswax is edible, having similarly negligible toxicity to plant waxes, and is approved for food use in most countries and in the European Union under the E number E901. However, due to its inability to be broken down by the human digestive system, it has insignificant nutritional value. Production Beeswax is formed by worker bees, which secrete it from eight wax-producing mirror glands on the inner sides of the sternites (the ventral shield or plate of each segment of the body) on abdominal segments 4 to 7. The sizes of these wax glands depend on the age of the worker, and after many daily flights, these glands gradually begin to atrophy. The new wax is initially glass-clear and colorless, becoming opaque after chewing and being contaminated with pollen by the hive worker bees, becoming progressively yellower or browner by incorporation of pollen oils and propolis. The wax scales are about across and thick, and about 1100 are needed to make a gram of wax. Worker bees use the beeswax to build honeycomb cells. For the wax-making bees to secrete wax, the ambient temperature in the hive must be . The book Beeswax Production, Harvesting, Processing and Products suggests of beeswax is sufficient to store of honey. Another study estimated that of wax c
https://en.wikipedia.org/wiki/Voigt%20notation
In mathematics, Voigt notation or Voigt form in multilinear algebra is a way to represent a symmetric tensor by reducing its order. There are a few variants and associated names for this idea: Mandel notation, Mandel–Voigt notation and Nye notation are others found. Kelvin notation is a revival by Helbig of old ideas of Lord Kelvin. The differences here lie in certain weights attached to the selected entries of the tensor. Nomenclature may vary according to what is traditional in the field of application. For example, a 2×2 symmetric tensor X has only three distinct elements, the two on the diagonal and the other being off-diagonal. Thus it can be expressed as the vector . As another example: The stress tensor (in matrix notation) is given as In Voigt notation it is simplified to a 6-dimensional vector: The strain tensor, similar in nature to the stress tensor—both are symmetric second-order tensors—is given in matrix form as Its representation in Voigt notation is where , , and are engineering shear strains. The benefit of using different representations for stress and strain is that the scalar invariance is preserved. Likewise, a three-dimensional symmetric fourth-order tensor can be reduced to a 6×6 matrix. Mnemonic rule A simple mnemonic rule for memorizing Voigt notation is as follows: Write down the second order tensor in matrix form (in the example, the stress tensor) Strike out the diagonal Continue on the third column Go back to the first element along the first row. Voigt indexes are numbered consecutively from the starting point to the end (in the example, the numbers in blue). Mandel notation For a symmetric tensor of second rank only six components are distinct, the three on the diagonal and the others being off-diagonal. Thus it can be expressed, in Mandel notation, as the vector The main advantage of Mandel notation is to allow the use of the same conventional operations used with vectors, for example: A symmetric tensor o
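The displayed formulas were lost in extraction; the following is a reconstruction of the conventional Voigt and Mandel mappings (standard notation, not necessarily the article's exact symbols):

```latex
% Symmetric 2x2 tensor reduced to a 3-vector
X = \begin{pmatrix} x_{11} & x_{12} \\ x_{12} & x_{22} \end{pmatrix}
\;\longrightarrow\;
\langle x_{11},\, x_{22},\, x_{12} \rangle .

% Stress tensor in Voigt notation
\boldsymbol{\sigma} = \begin{pmatrix}
\sigma_{11} & \sigma_{12} & \sigma_{13}\\
\sigma_{12} & \sigma_{22} & \sigma_{23}\\
\sigma_{13} & \sigma_{23} & \sigma_{33}
\end{pmatrix}
\;\longrightarrow\;
\tilde{\boldsymbol{\sigma}} =
\left( \sigma_{11},\ \sigma_{22},\ \sigma_{33},\ \sigma_{23},\ \sigma_{13},\ \sigma_{12} \right)^{\mathsf T} .

% Strain tensor: the shear entries carry a factor of 2 (the engineering shear strains)
\tilde{\boldsymbol{\epsilon}} =
\left( \epsilon_{11},\ \epsilon_{22},\ \epsilon_{33},\ 2\epsilon_{23},\ 2\epsilon_{13},\ 2\epsilon_{12} \right)^{\mathsf T} .

% Mandel notation: off-diagonal entries weighted by \sqrt{2} for both stress and strain
\tilde{\boldsymbol{\sigma}}_{M} =
\left( \sigma_{11},\ \sigma_{22},\ \sigma_{33},\ \sqrt{2}\,\sigma_{23},\ \sqrt{2}\,\sigma_{13},\ \sqrt{2}\,\sigma_{12} \right)^{\mathsf T} .
```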
https://en.wikipedia.org/wiki/Pyraclostrobin
Pyraclostrobin is a quinone outside inhibitor (QoI)-type fungicide used in agriculture. Among the QoIs, it lies within the strobilurin chemical class. Use Pyraclostrobin is used to protect Fragaria, Rubus idaeus, Vaccinium corymbosum, Ribes rubrum, Ribes uva-crispa, blackberry (various Rubus spp.), and Pistacia vera. Target pathogens Pyraclostrobin is used against Botrytis cinerea and Alternaria alternata. Resistance Resistant populations have been identified in: Botrytis cinerea on Fragaria in the Carolinas, conferred by the G143A mutation in the partial cytochrome b (CYTB) gene. Botrytis cinerea on Fragaria, Rubus idaeus, Vaccinium corymbosum, Ribes rubrum, Ribes uva-crispa, and blackberry (various Rubus spp.) in Northern Germany. Botrytis cinerea on Fragaria in Florida. Alternaria alternata on Pistacia vera in California. Geography of use United States Pyraclostrobin was widely used throughout the United States, but especially in the Upper Midwest. Off-target toxicity Although toxic and recommended to be avoided by humans, pyraclostrobin is of low and temporary toxicity; it is essentially an irritant of the eyes and skin. It does cause some degree of reproductive and developmental failure in mammals but does not absorb well through the skin. It is likely to bioaccumulate in aquatic organisms. Residues in diet Pyraclostrobin does not accumulate in foods to a significant degree. Biodegradability Pyraclostrobin is described by one source as not very biodegradable, and by another as possibly significantly biodegradable.
https://en.wikipedia.org/wiki/Processing%20amplifier
A processing amplifier, commonly called a ProcAmp, is used to alter, adjust or clean up video or audio signal components or parameters in real time. Form factor Broadcast professionals prefer hardware rack-mountable ProcAmps, which help them make video broadcast-safe by correcting video inconsistencies. They may also be chip-based, as part of other larger multi-purpose devices in professional environments. Software ProcAmps are also available as code embedded in media players like Windows Media Player, VLC, KMPlayer, or in codecs like ffdshow. Software ProcAmps can process media either on the CPU or GPU. Video ProcAmp Video ProcAmps can be used for processing standard-definition 525/30 (NTSC) or 625/25 (PAL) signals, as well as high-definition video signals. ProcAmps can process video signals ranging from analog composite to SDI video signals. Common ProcAmp controls: Brightness (Luminance) Contrast (Gain) Saturation (Amplitude) Hue (Phase) Common ProcAmp features: Regenerate sync and color burst Adjust sync amplitude Boost low light level video Reduce video wash out Chroma clipping See also Video processing Broadcast-safe
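A software ProcAmp applies the controls listed above as simple per-pixel operations on luma and chroma. A rough sketch (illustrative only; it uses a crude luma/colour-difference split and is not modelled on any particular player or codec):

```python
import numpy as np

def proc_amp(frame, brightness=0.0, contrast=1.0, saturation=1.0, hue_deg=0.0):
    """Apply basic ProcAmp controls to an RGB frame (float array in [0, 1]).

    brightness: offset added to luma; contrast: gain applied around mid-grey;
    saturation: gain on the chroma components; hue_deg: rotation of the chroma
    vector in degrees (analogous to adjusting subcarrier phase).
    """
    rgb = frame.astype(np.float64)
    # Rough BT.601-style luma and two colour-difference components.
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    u = rgb[..., 2] - y   # blue difference
    v = rgb[..., 0] - y   # red difference

    y = (y - 0.5) * contrast + 0.5 + brightness          # contrast (gain) and brightness
    theta = np.deg2rad(hue_deg)                          # hue = phase rotation of chroma
    u, v = (u * np.cos(theta) - v * np.sin(theta),
            u * np.sin(theta) + v * np.cos(theta))
    u, v = u * saturation, v * saturation                # saturation = chroma amplitude

    # Invert the luma/difference split back to RGB and clip to legal range.
    r = y + v
    b = y + u
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)
```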
https://en.wikipedia.org/wiki/Reactive%20empirical%20bond%20order
The reactive empirical bond-order (REBO) model is a function for calculating the potential energy of covalent bonds and the interatomic force. In this model, the total potential energy of the system is a sum of nearest-neighbour pair interactions which depend not only on the distance between atoms but also on their local atomic environment. A parametrized bond order function is used to describe chemical pair bonded interactions. The early formulation and parametrization of REBO for carbon systems were done by Tersoff in 1988, based on the work of Abell. Tersoff's model could describe single, double and triple bond energies in carbon structures such as hydrocarbons and diamond. A significant step was taken by Brenner in 1990. He extended Tersoff's potential function to radical and conjugated hydrocarbon bonds by introducing two additional terms into the bond order function. Compared to classical first-principles and semi-empirical approaches, the REBO model is less time-consuming, since only the 1st- and 2nd-nearest-neighbour interactions are considered. This advantage of computational efficiency is especially helpful for large-scale atomic simulations (from 1,000 to 1,000,000 atoms). In recent years, the REBO model has been widely used in studies concerning the mechanical and thermal properties of carbon nanotubes. Despite numerous successful applications of the first-generation REBO potential function, several of its drawbacks have been reported, e.g. its form is too restrictive to simultaneously fit equilibrium distances, energies, and force constants for all types of C-C bonds, the possibility of modeling processes involving energetic atomic collisions is limited because both Morse-type terms go to finite values when the atomic distance decreases, and the neglect of a separate pi bond contribution leads to problems with the overbinding of radicals and a poor treatment of conjugacy. To overcome these drawbacks, an extension of Brenner's potential was proposed by St
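The general shape of such bond-order potentials, a repulsive pair term minus a bond-order-weighted attractive term summed over nearest neighbours, can be sketched as follows. This is a toy illustration of the functional form only: the Morse-type parameters are placeholders and the bond orders are supplied as inputs rather than computed from angles and coordination as in the published REBO parametrization.

```python
import math

def v_repulsive(r, d_e=6.0, s=1.3, beta=1.5, r_e=1.4):
    """Morse-type repulsive pair term (placeholder parameters, eV / angstrom)."""
    return d_e / (s - 1.0) * math.exp(-math.sqrt(2.0 * s) * beta * (r - r_e))

def v_attractive(r, d_e=6.0, s=1.3, beta=1.5, r_e=1.4):
    """Morse-type attractive pair term (placeholder parameters)."""
    return d_e * s / (s - 1.0) * math.exp(-math.sqrt(2.0 / s) * beta * (r - r_e))

def pair_energy(r, bond_order):
    """Energy of one bond: repulsion minus bond-order-weighted attraction."""
    return v_repulsive(r) - bond_order * v_attractive(r)

def total_energy(positions, bond_orders, cutoff=2.0):
    """Sum pair energies over neighbours within a cutoff. bond_orders[(i, j)]
    stands in for the environment-dependent b_ij of the real model."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            if r < cutoff:
                e += pair_energy(r, bond_orders.get((i, j), 1.0))
    return e
```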
https://en.wikipedia.org/wiki/Blowing%20agent
A blowing agent is a substance which is capable of producing a cellular structure via a foaming process in a variety of materials that undergo hardening or phase transition, such as polymers, plastics, and metals. They are typically applied when the blown material is in a liquid stage. The cellular structure in a matrix reduces density, increasing thermal and acoustic insulation, while increasing relative stiffness of the original polymer. Blowing agents (also known as 'pneumatogens') or related mechanisms to create holes in a matrix producing cellular materials, have been classified as follows: Physical blowing agents include CFCs (however, these are ozone depletants, banned by the Montreal Protocol of 1987), HCFCs (replaced CFCs, but are still ozone depletants, therefore being phased out), hydrocarbons (e.g. pentane, isopentane, cyclopentane), and liquid CO2. The bubble/foam-making process is irreversible and endothermic, i.e. it needs heat (e.g. from a melt process or the chemical exotherm due to cross-linking), to volatilize a liquid blowing agent. However, on cooling the blowing agent will condense, i.e. a reversible process. Chemical blowing agents include isocyanate and water for polyurethane, azodicarbonamide for vinyl, hydrazine and other nitrogen-based materials for thermoplastic and elastomeric foams, and sodium bicarbonate for thermoplastic foams. Gaseous products and other byproducts are formed by a chemical reaction of the chemical blowing agent, promoted by the heat of the foam production process or a reacting polymer's exothermic heat. Since the blowing reaction occurs forming low molecular weight compounds acting as the blowing gas, additional exothermic heat is also released. Powdered titanium hydride is used as a foaming agent in the production of metal foams, as it decomposes to form hydrogen gas and titanium at elevated temperatures. Zirconium(II) hydride is used for the same purpose. Once formed the low molecular weight compounds will neve
https://en.wikipedia.org/wiki/Journal%20of%20Recreational%20Mathematics
The Journal of Recreational Mathematics was an American journal dedicated to recreational mathematics, started in 1968. It had generally been published quarterly by the Baywood Publishing Company, until it ceased publication with the last issue (volume 38, number 2) published in 2014. The initial publisher (of volumes 1–5) was Greenwood Periodicals. Harry L. Nelson was primary editor for five years (volumes 9 through 13, excepting volume 13, number 4, when the initial editor returned as lead) and Joseph Madachy, the initial lead editor and editor of a predecessor called Recreational Mathematics Magazine which ran during the years 1961 to 1964, was the editor for many years. Charles Ashbacher and Colin Singleton took over as editors when Madachy retired (volume 30 number 1). The final editors were Ashbacher and Lamarr Widmer. The journal has from its inception also listed associate editors, one of whom was Leo Moser. The journal contains: Original articles Book reviews Alphametics And Solutions To Alphametics Problems And Conjectures Solutions To Problems And Conjectures Proposer's And Solver's List For Problems And Conjectures Indexing The journal is indexed in: Academic Search Premier Book Review Index International Bibliography of Periodical Literature International Bibliography of Book Reviews Readers' Guide to Periodical Literature The Gale Group
https://en.wikipedia.org/wiki/Magnetocardiography
Magnetocardiography (MCG) is a technique to measure the magnetic fields produced by electrical currents in the heart using extremely sensitive devices such as the superconducting quantum interference device (SQUID). If the magnetic field is measured using a multichannel device, a map of the magnetic field is obtained over the chest; from such a map, using mathematical algorithms that take into account the conductivity structure of the torso, it is possible to locate the source of the activity. For example, sources of abnormal rhythms or arrhythmia may be located using MCG. History The first MCG measurements were made by Baule and McFee using two large coils placed over the chest, connected in opposition to cancel out the relatively large magnetic background. Heart signals were indeed seen, but were very noisy. The next development was by David Cohen, who used a magnetically shielded room to reduce the background, and a smaller coil with better electronics; the heart signals were now less noisy, allowing a magnetic map to be made, verifying the magnetic properties and source of the signal. However, the use of an inherently noisy coil detector discouraged widespread interest in the MCG. The turning point came with the development of the sensitive detector called the SQUID (superconducting quantum interference device) by James Zimmerman. The combination of this detector and Cohen's new shielded room at MIT allowed the MCG signal to be seen as clearly as the conventional electrocardiogram, and the publication of this result by Cohen et al. marked the real beginning of magnetocardiography (as well as biomagnetism generally). Magnetocardiography is used in various laboratories and clinics around the world, both for research on the normal human heart, and for clinical diagnosis. Clinical implementation MCG technology has been implemented in hospitals in Germany. The MCG system, CS MAG II of Biomagnetik Park GmbH, was installed at Coburg Hospital in 2013. The CS-MAG III
https://en.wikipedia.org/wiki/WISPr
WISPr (pronounced "whisper") or Wireless Internet Service Provider roaming is a draft protocol submitted to the Wi-Fi Alliance that allows users to roam between wireless internet service providers in a fashion similar to that which allows cellphone users to roam between carriers. A RADIUS server is used to authenticate the subscriber's credentials. It covers best practices for authenticating users via 802.1X or the Universal Access Method (UAM), the latter being another name for browser-based login at a captive portal hotspot. It requires that RADIUS be used for AAA and defines the required RADIUS attributes. For authentication by smart-clients, Appendix D defines the Smart Client to Access Gateway Interface Protocol, which is an XML-based protocol for authentication. Smart-client software (and devices that use it) use this so-called WISPr XML to seamlessly login to HotSpots without the need for the user to interact with a captive portal. The draft WISPr specification is no longer available from the Wi-Fi Alliance. It was submitted in a manner that does not conform with current IPR policies within the Wi-Fi Alliance. Intel and others have started a similar proposal — IRAP, which has now been rolled into ETSI Telecommunications and Internet converged Services and Protocols for Advanced Networking (TISPAN); TS 183 019 and TS 183 020. The WISPr 2.0 specification was published by the Wireless Broadband Alliance in March 2010. See also IEEE 802.11 Wireless Internet service provider ETSI
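A smart client works by finding the WISPr XML embedded in the captive portal's response and then posting credentials to the advertised LoginURL. A rough sketch of the discovery step (the XML shown is a hand-written illustration of the general shape of a WISPr redirect message, not copied from the specification, and the URLs are placeholders):

```python
import re
import xml.etree.ElementTree as ET

# Hand-written example of the kind of XML a WISPr-capable gateway embeds
# (often inside an HTML comment) in its captive-portal redirect page.
portal_page = """<!--
<WISPAccessGatewayParam>
  <Redirect>
    <MessageType>100</MessageType>
    <ResponseCode>0</ResponseCode>
    <LoginURL>https://hotspot.example.net/login</LoginURL>
  </Redirect>
</WISPAccessGatewayParam>
-->"""

# Pull the XML fragment out of the surrounding HTML/comment markup and parse it.
match = re.search(r"<WISPAccessGatewayParam>.*</WISPAccessGatewayParam>",
                  portal_page, re.DOTALL)
root = ET.fromstring(match.group(0))
login_url = root.findtext("Redirect/LoginURL")
print("Smart client would POST its credentials to:", login_url)
```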
https://en.wikipedia.org/wiki/Freshers%27%20flu
Freshers' flu is a name commonly given to a battery of illnesses contracted by new students (freshers) during their first few weeks at a university or college of further education; common symptoms include fever, sore throat, severe headache, coughing and general discomfort. The illness may or may not be actual flu and is often simply a bad cold. Causes The most likely cause is the convergence of large numbers of people arriving from all over the world; this is a particularly elevated risk due to the COVID-19 pandemic. The poor diet and heavy consumption of alcohol during freshers' week is also reported as a cause for many of the illnesses contracted during this time. "Stress, which may be induced by tiredness, combined with a poor diet, late nights and too much alcohol, can weaken the immune system and be a recipe for ill health. All this can make students and staff working with the students more susceptible to infections within their first weeks of term." In addition to this, nearly all university academic years in the UK commence around the end of September or beginning of October, which "marks the start of the annual flu season". The increased susceptibility to illness from late nights, heavy alcohol consumption and stress peaks 2–4 weeks after arrival at university and happens to coincide with the seasonal surge in the outbreaks of colds and flu in the Northern Hemisphere. Other effects As well as the usual viral effects, freshers' flu can also have some psychological effects. These effects arise because the stress of leaving home and other consequences of being independent, not to mention various levels of homesickness and the effort of making new friends, can further weaken the immune system, increasing susceptibility to illness. See also Freshman 15
https://en.wikipedia.org/wiki/Troll%20doll
A troll doll (Danish: Gjøltrold) is a type of plastic doll with furry up-combed hair depicting a troll, also known as a Dam doll after their creator, Danish woodcutter Thomas Dam. The inspiration came from trolls in old Scandinavian folklore. The toys are also known as good luck trolls, or gonk trolls in the United Kingdom. The dolls were first created in the late 1950s, and introduced in 1959, before becoming one of the United States' biggest toy fads in the early 1960s. They became briefly popular again during the 1970s through the 1990s and were copied by several manufacturers under different names. During the 1990s, several video games and a video show were created based on troll dolls. In 2003, the Dam company restored the United States copyrights for the trolls, stopping unlicensed production. In 2005, the Dam company licensed the brand to DIC Entertainment, who attempted to modernize the brand by creating a cartoon under the name Trollz, but the show only lasted one season. The failed cartoon also led to a lawsuit, followed by a counter-claim lawsuit. In 2013, the brand was bought by DreamWorks Animation, with an animated feature film called Trolls being released in 2016, a sequel being released in 2020, and another sequel set to be released in 2023. Toy history Troll dolls were created in 1959 by Danish fisherman and woodcutter Thomas Dam. Dam could not afford a Christmas gift for his young daughter Lila and carved the doll from his imagination. Other children in the Danish town of Gjøl saw the doll and wanted one. Dam's company Dam Things began producing the dolls in plastic under the name Good Luck Trolls. The dolls became popular in several European countries during the early 1960s, shortly before they were introduced in the United States. They became one of the United States' biggest toy fads from the autumn of 1963 to 1965. The originals were of the highest quality, also called Dam dolls and featuring sheep wool hair and glass eyes. Their sudden popula
https://en.wikipedia.org/wiki/Tar%20water
Tar-water was a medieval medicine consisting of pine tar and water. As it was foul-tasting, it slowly dropped in popularity, but was revived in the Victorian era. It was used both as a tonic and as a substitute meant to wean people off "strong spirits". Both these uses were originally advocated by the philosopher George Berkeley (1685–1753), who lauded it in his tract Siris: A Chain of Philosophical Reflections and Inquiries, Concerning the Virtues of Tar Water (1744). It was regarded by medical experts as quackery. History The use of tar water is mentioned in the second chapter of Charles Dickens's (1812–1870) Great Expectations (1861). Young Pip and his brother-in-law, Joe, were often force-fed it by Mrs. Joe, Pip's elder sister, whether they were ill or not, as a sort of cruel punishment. The physician Cadwallader Colden (1688–1776) extolled the virtues of pine resin steeped in water. This concoction was also called "tar water". George Berkeley suggested that tar from pine or fir be stirred for three or four minutes with an equal quantity of water and the mixture allowed to stand for 48 hours. At this time, the separated water is drawn off to be drunk, at the rate of one half-pint night and morning "on an empty stomach". Fresh water is added to the unused portion and again stirred to provide more of the preparation, until the mixture becomes too weak. Explorer Henry Ellis (1721–1806) praises tar water as "the only powerful and prevailing medicine" against scurvy during his 1746 voyage to Hudson's Bay (although his editor, James Lind, notes that "a want of greens and herbage" was the chief cause of the outbreak). In Memoirs of Vidocq (1828) by Eugène François Vidocq (1775–1857), Lieutenant Fleuriot, who suffers from second-degree consumption, is said to have been advised by General Sarrazin to take tar-water in aid of his battle against the ailment: In the introduction of his Journal of A Voyage to Lisbon (1749), English author Henry Fielding (1707–1
https://en.wikipedia.org/wiki/Lung%20float%20test
The lung float test, also called the hydrostatic test or docimasia, is a controversial autopsy procedure used in determining whether lungs have undergone respiration. It has historically been employed in cases of suspected infanticide to help determine whether or not an infant was stillborn. In the test, lungs that float in water are thought to have been aerated, while those that sink are presumed to indicate an absence of air. The test is not infallible and many factors can cause the test to give false positive or negative results. Decomposition may result in postmortem gas formation, allowing a non-aerated lung to float. During labor, air can be introduced to a deceased infant's lungs while moving through the birth canal. Lungs exposed to air do not always float. Große Ostendorf et al. showed the procedure gave a false result in 2% of cases. In a 1997 paper, J.J. Moar emphasises the risk of misdiagnosing live birth, writing that the "majority of new born infants seen at autopsy show signs of varying degrees of decomposition, as they are often found in garbage, wrapped in newspaper or plastic bags, or lying in an open field. Even microscopic putrefaction can cause unexpanded lungs to float, when gas formation may not be macroscopically apparent. Naturally, any attempts at resuscitation may partially expand the lungs of a new born infant, leading to further difficulty in establishing live birth." The difference between the lungs of a foetus and those of an infant was noted by the Ancient Greek physician Galen. The lung float test was described in the 1670s by Hungarian botanist Károly Rayger and first performed in 1681. German physician Johannes Schreyer performed a lung float test in 1690. The application of the lung float test to determine breathing and live birth has many spurious medico-legal considerations. In South Africa a foetus must have breathed to be considered a person under the law. Lung float tests are also used by forensic pathologists to determin
https://en.wikipedia.org/wiki/Applied%20ecology
Applied ecology is a sub-field within ecology that considers the application of the science of ecology to real-world (usually management) questions. It is also described as a scientific field that focuses on the application of concepts, theories, models, or methods of fundamental ecology to environmental problems. Concept Applied ecology is an integrated treatment of the ecological, social, and biotechnological aspects of natural resource conservation and management. Applied ecology typically focuses on geomorphology, soils, and plant communities as the underpinnings for vegetation and wildlife (both game and non-game) management. Applied ecology includes all disciplines that are related to human activities so that it does not only cover agriculture, forestry and fisheries but also global change. It has two study categories. The first involves the outputs or those fields that address the use and management of the environment, particularly for its ecosystem services and exploitable resources. The second are the inputs or those that are concerned with the management strategies or human influences on the ecosystem or biodiversity. The discipline is often linked to ecological management on the grounds that the effective management of natural ecosystems depends on ecological knowledge. It often uses an ecological approach to solve problems of specific parts of the environment, which can involve the comparison of plausible options (e.g. best management options). The role of applied science in agricultural production has been brought into greater focus as fluctuations in global food production feed through into prices and availability to consumers. Approaches Applied ecologists often use one or more of the following approaches, namely, observation, experimentation, and modeling. For example, a wildlife preservation project could involve: observational studies of the wildlife ecology; experiments to understand causal relationships; and the application of modeling to
https://en.wikipedia.org/wiki/Coded%20mark%20inversion
In telecommunication, coded mark inversion (CMI) is a non-return-to-zero (NRZ) line code. It encodes zero bits as a half bit time of zero followed by a half bit time of one, while one bits are encoded as a full bit time at a constant level. The level used for one bits alternates each time a one is coded. This is vaguely reminiscent of, but quite different from, Miller encoding, which also uses half-bit and full-bit pulses, but additionally uses the half-one/half-zero combination and arranges them so that the signal always spends at least a full bit time at a particular level before transitioning again. CMI doubles the bitstream frequency, when compared to its simple NRZ equivalent, but allows easy and reliable clock recovery. See also Manchester code
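A small encoder sketch makes the rule concrete (illustrative Python written for this summary; each half bit time is represented by one list element):

```python
def cmi_encode(bits):
    """Encode a bit sequence with coded mark inversion.

    A zero bit becomes the fixed half-bit pair (0, 1); a one bit becomes a
    constant level for the full bit time, and that level alternates between
    0 and 1 on successive one bits.
    """
    out = []
    one_level = 0                           # level for the next one bit
    for b in bits:
        if b == 0:
            out += [0, 1]                   # zero: half time low, half time high
        else:
            out += [one_level, one_level]   # one: constant level for the full bit
            one_level ^= 1                  # alternate level for the next one bit
    return out

# 1 0 1 1 0 -> [0, 0, 0, 1, 1, 1, 0, 0, 0, 1]
print(cmi_encode([1, 0, 1, 1, 0]))
```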
https://en.wikipedia.org/wiki/Magnetic%20reconnection
Magnetic reconnection is a physical process occurring in electrically conducting plasmas, in which the magnetic topology is rearranged and magnetic energy is converted to kinetic energy, thermal energy, and particle acceleration. Magnetic reconnection involves plasma flows at a substantial fraction of the Alfvén wave speed, which is the fundamental speed for mechanical information flow in a magnetized plasma. The concept of magnetic reconnection was developed in parallel by researchers working in solar physics and in the interaction between the solar wind and magnetized planets. This reflects the bidirectional nature of reconnection, which can either disconnect formerly connected magnetic fields or connect formerly disconnected magnetic fields, depending on the circumstances. Ron Giovanelli is credited with the first publication invoking magnetic energy release as a potential mechanism for particle acceleration in solar flares. Giovanelli proposed in 1946 that solar flares stem from the energy obtained by charged particles influenced by induced electric fields within close proximity of sunspots. In the years 1947-1948, he published more papers further developing the reconnection model of solar flares. In these works, he proposed that the mechanism occurs at points of neutrality (weak or null magnetic field) within structured magnetic fields. James Dungey is credited with first use of the term “magnetic reconnection” in his 1950 PhD thesis, to explain the coupling of mass, energy and momentum from the solar wind into Earth's magnetosphere. The concept was published for the first time in a seminal paper in 1961. Dungey coined the term "reconnection" because he envisaged field lines and plasma moving together in an inflow toward a magnetic neutral point (2D) or line (3D), breaking apart and then rejoining again but with different magnetic field lines and plasma, in an outflow away from the magnetic neutral point or line. In the meantime, the first theoretical fram
https://en.wikipedia.org/wiki/Irma%20board
Irma board, originally spelled IRMA board, refers to a brand of coaxial interface cards for PCs and Macintosh computers used to enable 3270 emulator programs to connect to IBM mainframe computers. IRMA boards were used to connect PCs and Macs to IBM 3274 terminal controllers. IRMA boards supported both Control Unit Terminal (CUT) and Distributed Function Terminal (DFT) modes, although the latter required additional software; DFT mode supported multiple simultaneous mainframe sessions. IRMA boards were invented by Technical Analysis Corp. (TAC), which was acquired by Digital Communications Associates, Inc. (DCA), who manufactured and marketed the Irma products from 1982 on. DCA of Alpharetta, Georgia, was acquired in 1994 by Attachmate of Bellevue, Washington. A board with all the capabilities of what would eventually be called IRMA was originally developed in-house by Amdahl Corp in 1977, but it was not actively marketed by Amdahl. See also IBM 3270 PC Avatar Technologies, Inc. (née 3R Computers), makers of the Mac Mainframe line of products allowing IBM 3270 emulation on the Macintosh SE and II Terminal emulator
https://en.wikipedia.org/wiki/Unitarity%20gauge
In theoretical physics, the unitarity gauge or unitary gauge is a particular choice of a gauge fixing in a gauge theory with a spontaneous symmetry breaking. In this gauge, the scalar fields responsible for the Higgs mechanism are transformed into a basis in which their Goldstone boson components are set to zero. In other words, the unitarity gauge makes the manifest number of scalar degrees of freedom minimal. The gauge was introduced to particle physics by Steven Weinberg in the context of the electroweak theory. In electroweak theory, the degrees of freedom in a unitarity gauge are the massive spin-1 W+, W− and Z bosons with three polarizations each, the photon with two polarizations, and the scalar Higgs boson. The unitarity gauge is usually used in tree-level calculations. For loop calculations, other gauge choices such as the 't Hooft–Feynman gauge often reduce the mathematical complexity of the calculation.
https://en.wikipedia.org/wiki/Plasma%20etching
Plasma etching is a form of plasma processing used to fabricate integrated circuits. It involves a high-speed stream of glow discharge (plasma) of an appropriate gas mixture being shot (in pulses) at a sample. The plasma source, known as etch species, can be either charged (ions) or neutral (atoms and radicals). During the process, the plasma generates volatile etch products at room temperature from the chemical reactions between the elements of the material etched and the reactive species generated by the plasma. Eventually the atoms of the shot element embed themselves at or just below the surface of the target, thus modifying the physical properties of the target. Mechanisms Plasma generation A plasma is a highly energetic state in which many processes can occur. These processes are driven by electrons and atoms. To form the plasma, electrons have to be accelerated to gain energy. Highly energetic electrons transfer energy to atoms by collisions. Three different processes can occur because of these collisions: Excitation Dissociation Ionization Different species are present in the plasma, such as electrons, ions, radicals, and neutral particles. These species constantly interact with each other. Plasma etching can be divided into two main types of interaction: generation of chemical species interaction with the surrounding surfaces Without a plasma, all those processes would occur at a higher temperature. There are different ways to change the plasma chemistry and obtain different kinds of plasma etching or plasma deposition. One of the excitation techniques to form a plasma is RF excitation with a power source at 13.56 MHz. The mode of operation of the plasma system changes if the operating pressure changes. It also differs for different structures of the reaction chamber. In the simple case, the electrode structure is symmetrical, and the sample is placed upon the grounded electrode. Influences on the process The key to de
https://en.wikipedia.org/wiki/Cavalieri%27s%20principle
In geometry, Cavalieri's principle, a modern implementation of the method of indivisibles, named after Bonaventura Cavalieri, is as follows: 2-dimensional case: Suppose two regions in a plane are included between two parallel lines in that plane. If every line parallel to these two lines intersects both regions in line segments of equal length, then the two regions have equal areas. 3-dimensional case: Suppose two regions in three-space (solids) are included between two parallel planes. If every plane parallel to these two planes intersects both regions in cross-sections of equal area, then the two regions have equal volumes. Today Cavalieri's principle is seen as an early step towards integral calculus, and while it is used in some forms, such as its generalization in Fubini's theorem and layer cake representation, results using Cavalieri's principle can often be shown more directly via integration. In the other direction, Cavalieri's principle grew out of the ancient Greek method of exhaustion, which used limits but did not use infinitesimals. History Cavalieri's principle was originally called the method of indivisibles, the name it was known by in Renaissance Europe. Cavalieri developed a complete theory of indivisibles, elaborated in his Geometria indivisibilibus continuorum nova quadam ratione promota (Geometry, advanced in a new way by the indivisibles of the continua, 1635) and his Exercitationes geometricae sex (Six geometrical exercises, 1647). While Cavalieri's work established the principle, in his publications he denied that the continuum was composed of indivisibles in an effort to avoid the associated paradoxes and religious controversies, and he did not use it to find previously unknown results. In the 3rd century BC, Archimedes, using a method resembling Cavalieri's principle, was able to find the volume of a sphere given the volumes of a cone and cylinder in his work The Method of Mechanical Theorems. In the 5th century AD, Zu Chongzhi and
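The Archimedean application mentioned above can be sketched in modern notation (a standard reconstruction, not quoted from the article): compare a hemisphere of radius r with a cylinder of radius and height r from which an inverted cone has been removed; the cross-sections at each height h above the base have equal areas, so by Cavalieri's principle the solids have equal volumes.

```latex
% Cross-section of the hemisphere at height h (a disc):
A_{\text{hemisphere}}(h) = \pi\left(r^{2} - h^{2}\right)

% Cross-section of the cylinder minus the cone at the same height (an annulus):
A_{\text{cyl}-\text{cone}}(h) = \pi r^{2} - \pi h^{2}

% Equal cross-sections at every height, hence equal volumes:
V_{\text{hemisphere}} = \pi r^{2}\cdot r - \tfrac{1}{3}\pi r^{2}\cdot r = \tfrac{2}{3}\pi r^{3},
\qquad
V_{\text{sphere}} = \tfrac{4}{3}\pi r^{3}.
```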
https://en.wikipedia.org/wiki/Vacuum%20engineering
Vacuum engineering is the field of engineering that deals with the practical use of vacuum in industrial and scientific applications. Vacuum may improve the productivity and performance of processes otherwise carried out at normal air pressure, or may make possible processes that could not be done in the presence of air. Vacuum engineering techniques are widely applied in materials processing such as drying or filtering, chemical processing, application of metal coatings to objects, manufacture of electron devices and incandescent lamps, and in scientific research. Vacuum techniques vary depending on the desired vacuum pressure to be achieved. For a "rough" vacuum, above 100 pascals of pressure, conventional methods of analysis, materials, pumps and measuring instruments can be used, whereas ultrahigh vacuum systems use specialized equipment to achieve pressures below one-millionth of a pascal. At such low pressures, even metals may emit enough gas to cause serious contamination. Design and mechanism Vacuum systems usually consist of gauges, vapor jet pumps, vapor traps and valves, along with other piping. A vessel operating under vacuum may be any of several types, such as a processing tank, steam simulator, particle accelerator, or any other enclosed chamber maintained at less than atmospheric gas pressure. Since a vacuum is created in an enclosed chamber, the ability to withstand external atmospheric pressure is the usual design consideration. Because of the risk of buckling or collapse, the outer shell of a vacuum chamber is carefully evaluated, and any sign of deterioration is corrected by increasing the thickness of the shell. The main materials used for vacuum design are usually mild steel, stainless steel, and aluminum. Other materials, such as glass, are used for gauge glasses, view ports, and sometimes electrical insulation. The interior of
https://en.wikipedia.org/wiki/Fundamental%20pair%20of%20periods
In mathematics, a fundamental pair of periods is an ordered pair of complex numbers that defines a lattice in the complex plane. This type of lattice is the underlying object with which elliptic functions and modular forms are defined. Definition A fundamental pair of periods is a pair of complex numbers $\omega_1, \omega_2$ such that their ratio $\omega_2 / \omega_1$ is not real. If considered as vectors in $\mathbb{R}^2$, the two are not collinear. The lattice generated by $\omega_1$ and $\omega_2$ is $\Lambda = \{ m \omega_1 + n \omega_2 : m, n \in \mathbb{Z} \}$. This lattice is also sometimes denoted as $\Lambda(\omega_1, \omega_2)$ to make clear that it depends on $\omega_1$ and $\omega_2$. It is also sometimes denoted by $\Omega$ or $\Omega(\omega_1, \omega_2)$, or simply by $(\omega_1, \omega_2)$. The two generators $\omega_1$ and $\omega_2$ are called the lattice basis. The parallelogram with vertices $0, \omega_1, \omega_2, \omega_1 + \omega_2$ is called the fundamental parallelogram. While a fundamental pair generates a lattice, a lattice does not have any unique fundamental pair; in fact, an infinite number of fundamental pairs correspond to the same lattice. Algebraic properties A number of properties, listed below, can be seen. Equivalence Two pairs of complex numbers $(\omega_1, \omega_2)$ and $(\alpha_1, \alpha_2)$ are called equivalent if they generate the same lattice: that is, if $\Lambda(\omega_1, \omega_2) = \Lambda(\alpha_1, \alpha_2)$. No interior points The fundamental parallelogram contains no further lattice points in its interior or boundary. Conversely, any pair of lattice points with this property constitute a fundamental pair, and furthermore, they generate the same lattice. Modular symmetry Two pairs $(\omega_1, \omega_2)$ and $(\alpha_1, \alpha_2)$ are equivalent if and only if there exists a $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ with integer entries $a$, $b$, $c$ and $d$ and determinant $ad - bc = \pm 1$ such that $\begin{pmatrix} \alpha_1 \\ \alpha_2 \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} \omega_1 \\ \omega_2 \end{pmatrix}$, that is, so that $\alpha_1 = a \omega_1 + b \omega_2$ and $\alpha_2 = c \omega_1 + d \omega_2$. This matrix belongs to $\mathrm{GL}(2, \mathbb{Z})$, the group of integer matrices with determinant $\pm 1$; the matrices of determinant $+1$ form the closely related modular group. This equivalence of lattices can be thought of as underlying many of the properties of elliptic functions (especially the Weierstrass elliptic function) and modular forms. Topological properties The abelian group $\mathbb{Z}^2$ maps the complex plane into the fundamental parallelogram. That is, every point $z \in \mathbb{C}$ can be written as $z = p + m \omega_1 + n \omega_2$ for integers $m, n$, with a point $p$ in the fundamental parallelogram. Since this mapping identifies opposite sides of the parallelogram as being the same, the f
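As a concrete illustration of the equivalence just described (using the symbols $\omega_1, \omega_2$ as reconstructed above), the pairs $(1, i)$ and $(1, 1 + i)$ generate the same square lattice and are related by an integer matrix of determinant 1:

```latex
\begin{pmatrix} \alpha_1 \\ \alpha_2 \end{pmatrix}
  = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}
    \begin{pmatrix} \omega_1 \\ \omega_2 \end{pmatrix},
\qquad (\omega_1, \omega_2) = (1,\, i), \quad (\alpha_1, \alpha_2) = (1,\, 1 + i),
\qquad \det \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = 1 .
```

Indeed $\mathbb{Z} + \mathbb{Z} i = \mathbb{Z} + \mathbb{Z}(1 + i)$, since $i = (1 + i) - 1$.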
https://en.wikipedia.org/wiki/DLVO%20theory
The DLVO theory (named after Boris Derjaguin and Lev Landau, Evert Verwey and Theodoor Overbeek) explains the aggregation and kinetic stability of aqueous dispersions quantitatively and describes the force between charged surfaces interacting through a liquid medium. It combines the effects of the van der Waals attraction and the electrostatic repulsion due to the so-called double layer of counterions. The electrostatic part of the DLVO interaction is computed in the mean field approximation in the limit of low surface potentials - that is, when the potential energy of an elementary charge on the surface is much smaller than the thermal energy scale, $k_B T$. For two spheres of radius $a$, each having a charge $Z$ (expressed in units of the elementary charge), separated by a center-to-center distance $r$ in a fluid of dielectric constant $\epsilon_r$ containing a number density $n$ of monovalent ions, the electrostatic potential takes the form of a screened-Coulomb or Yukawa potential, $\beta U(r) = Z^2 \lambda_B \left( \frac{e^{\kappa a}}{1 + \kappa a} \right)^2 \frac{e^{-\kappa r}}{r}$, where $\lambda_B$ is the Bjerrum length, $U(r)$ is the potential energy, $e \approx 2.71828$ is Euler's number, $\kappa$ is the inverse of the Debye–Hückel screening length ($\kappa^{-1}$), given by $\kappa^2 = 4 \pi \lambda_B n$, and $\beta^{-1} = k_B T$ is the thermal energy scale at absolute temperature $T$. Overview DLVO theory is a theory of colloidal dispersion stability in which zeta potential is used to explain that as two particles approach one another their ionic atmospheres begin to overlap and a repulsion force is developed. In this theory, two forces are considered to impact on colloidal stability: Van der Waals forces and electrical double layer forces. The total potential energy is described as the sum of the attraction potential and the repulsion potential. When two particles approach each other, electrostatic repulsion increases and the interference between their electrical double layers increases. However, the Van der Waals attraction also increases as they get closer. At each distance, the net potential energy is found by subtracting the smaller value from the larger value. At very close distance
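A minimal numerical sketch of the screened-Coulomb (Yukawa) form reconstructed above. The function names and the example values are illustrative only, and the expression assumes the low-potential, monovalent-ion limit described in the article:

```python
import math

def bjerrum_length(eps_r, T=298.15):
    """Bjerrum length lambda_B = e^2 / (4 pi eps0 eps_r kB T), in metres."""
    e = 1.602176634e-19       # elementary charge, C
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    kB = 1.380649e-23         # Boltzmann constant, J/K
    return e**2 / (4 * math.pi * eps0 * eps_r * kB * T)

def dlvo_electrostatic(r, a, Z, n_ions, eps_r=78.5, T=298.15):
    """Dimensionless pair potential beta*U(r) between two spheres of radius a (m),
    charge Z (elementary charges), centre-to-centre distance r (m), in a solvent
    containing a total number density n_ions (1/m^3) of monovalent ions."""
    lb = bjerrum_length(eps_r, T)
    kappa = math.sqrt(4 * math.pi * lb * n_ions)   # inverse Debye screening length
    return Z**2 * lb * (math.exp(kappa * a) / (1 + kappa * a))**2 * math.exp(-kappa * r) / r

# Illustrative numbers: 50 nm spheres, 100 charges, ~1 mM monovalent salt
# (1 mol/m^3 of each ion species, so 2 * Avogadro per m^3 in total).
print(dlvo_electrostatic(r=120e-9, a=50e-9, Z=100, n_ions=2 * 6.022e23))
```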
https://en.wikipedia.org/wiki/Layer%20%28electronics%29
A layer is the deposition of molecules on a substrate or base (glass, ceramic, semiconductor, or plastic/bioplastic). High-temperature substrates include stainless steel and polyimide film (expensive) and PET (cheap). A depth of less than one micrometre is generally called a thin film, while a depth greater than one micrometre is called a coating. A web is a flexible substrate. See also Coating Heterojunction Nanoparticle Organic electronics Passivation Printed electronics Sputtering Thermal annealing Transparent conducting oxide
https://en.wikipedia.org/wiki/Noble%20rot
Noble rot (; ; ; ) is the beneficial form of a grey fungus, Botrytis cinerea, affecting wine grapes. Infestation by Botrytis requires moist conditions, but if the weather stays wet, the damaging form, "grey rot", can destroy crops of grapes. Grapes typically become infected with Botrytis when they are ripe. If they are then exposed to drier conditions and become partially raisined, this form of infection is known as noble rot. Grapes picked at a certain point during infestation can produce particularly fine and concentrated sweet wine. Wines produced by this method are known as botrytized wines. Origins According to Hungarian legend, the first aszú (a wine using botrytised grapes) was made by Laczkó Máté Szepsi in 1630. However, mention of wine made from botrytised grapes appears before this in the Nomenklatura of Fabricius Balázs Sziksai, which was completed in 1576. A recently discovered inventory of aszú predates this reference by five years. When vineyard classification began in 1730 in the Tokaj region, one of the gradings given to the various terroirs centered on their potential to develop Botrytis cinerea. There is a popular story that the practice originated independently in Germany in 1775, where the Riesling producers at Schloss Johannisberg (Geisenheim, in the Rheingau region) traditionally awaited the say-so of the estate owner, Heinrich von Bibra, Bishop of Fulda, before cutting their grapes. In this year (so the legend goes), the abbey messenger was robbed en route to delivering the order to harvest and the cutting was delayed for three weeks, time enough for the botrytis to take hold. The grapes were presumed worthless and given to local peasants, who produced a surprisingly good, sweet wine which subsequently became known as Spätlese, or late harvest wine. In the following few years, several different classes of increasing must weight were introduced, and the original Spätlese was further elaborated, first into Auslese in 1787 and later Eiswein in
https://en.wikipedia.org/wiki/Assisted%20migration%20of%20forests%20in%20North%20America
Assisted migration is the movement of populations or species by humans from one territory to another in response to climate change. This is the definition offered in a nontechnical document published by the United States Forest Service in 2023, suggesting that this form of climate adaptation "could be a proactive, pragmatic tool for building climate resilience in our landscapes." Programs for assisted migration of forests in North America have been created by public and indigenous governmental bodies, private forest owners, and land trusts. They have been researching, testing, evaluating, and sometimes implementing forest assisted migration projects as a form of adaptation to climate change. Assisted migration in the forestry context differs from assisted migration as originally proposed in the context of conservation biology, where it is regarded as a management tool for helping endangered species cope with the need for climate adaptation. The focus in forestry is mitigating climate change's negative effects on the health and productivity of working forests. Forestry assisted migration is already underway in North America because of the rapidly changing climate and the forestry industry's reseeding practices. It is now standard practice for governmental and industrial harvests of trees to be followed by the planting of seeds or seedlings in the harvested areas. Hence, an opportunity automatically arises post-harvest to select seeds (and sometimes different species of trees) from areas with climates that are expected to arrive in the harvested sites decades into the future. The government of British Columbia in Canada was the first federated state on the continent to make the decision to change their seed transfer guidelines accordingly in 2009. Longer distance forms of assisted migration were not, however, considered prior to climate modelling and within-forest evidence of the increasing pace of climate change. Serious discussion and debate ensued in the forestr
https://en.wikipedia.org/wiki/Lossless%20predictive%20audio%20compression
Lossless predictive audio compression (LPAC) is an improved lossless audio compression algorithm developed by Tilman Liebchen, Marcus Purat and Peter Noll at the Institute for Telecommunications, Technical University Berlin (TU Berlin), to compress PCM audio in a lossless manner, unlike conventional audio compression algorithms which are lossy. It is no longer developed, because an advanced version of it has become an official standard under the name MPEG-4 Audio Lossless Coding. See also Monkey's Audio (APE) Free Lossless Audio Codec (FLAC) Lossless Transform Audio Compression (LTAC) True Audio (TTA) External links Lossless Predictive Audio Compression (LPAC) The basic principles of lossless audio data compression (TTA) The Lossless Audio Blog Lossless Audio News & Information Site.
https://en.wikipedia.org/wiki/Qubes%20OS
Qubes OS is a security-focused desktop operating system that aims to provide security through isolation. Isolation is provided through the use of virtualization technology. This allows the segmentation of applications into secure virtual machines called qubes. Virtualization services in Qubes OS are provided by the Xen hypervisor. The runtimes of individual qubes are generally based on a shared system of underlying operating system templates. Templates provide a single, immutable root file system which can be shared by multiple qubes. This approach has two major benefits. First, updates to a given template are automatically "inherited" by all qubes based on it. Second, shared templates can dramatically reduce storage requirements compared to separate VMs with a full operating system install per secure domain. The base installation of Qubes OS provides a number of officially supported templates based on the Fedora and Debian Linux distributions. Alternative community-supported templates include Whonix, Ubuntu, Arch Linux, CentOS, and Gentoo. Users may also create their own templates. Operating systems like Qubes OS are referred to in academia as Converged Multi-Level Secure (MLS) Systems. Other similar systems have been proposed, and SecureView and VMware vSphere are commercial competitors. Security goals Qubes implements a Security by Isolation approach. The assumption is that there can be no perfect, bug-free desktop environment: such an environment comprises millions of lines of code and billions of software/hardware interactions. One critical bug in any of these interactions may be enough for malicious software to take control of a machine. To secure a desktop using Qubes OS, the user takes care to isolate various environments, so that if one of the components gets compromised, the malicious software would get access to only the data inside that environment. In Qubes OS, the isolation is provided in two dimensions: hardware controllers can be isolated i
https://en.wikipedia.org/wiki/Minimum%20Information%20Required%20About%20a%20Glycomics%20Experiment
The Minimum Information Required About a Glycomics Experiment (MIRAGE) initiative is part of the Minimum Information Standards and specifically applies to guidelines for reporting (describing metadata) on a glycomics experiment. The initiative is supported by the Beilstein Institute for the Advancement of Chemical Sciences. The MIRAGE project focuses on the development of publication guidelines for interaction and structural glycomics data as well as the development of data exchange formats. The project was launched in 2011 in Seattle and began with a description of the aims of the MIRAGE project. Organization The MIRAGE Commission consists of three groups which tightly interact with each other. The advisory board consists of leading scientists in glycobiology, who, for example, critically review the outcomes of the working group and promote the reporting guidelines within the community. The working group seeks external consultation and directly interacts with the glycomics community. The group members carry out defined subprojects (e.g. development and revision of guidelines) by focusing on specific research areas to fulfill the overall aims of the MIRAGE project. The co-ordination team links the subprojects from the working group together and passes the outcomes to the advisory board for review. Reporting guidelines The following reporting guidelines were developed and published: MIRAGE MS guidelines for reporting mass spectrometry-based glycan analysis. These guidelines are based on the MIAPE guideline template, i.e. MIAPE-MS version 2.24. MIRAGE Sample preparation guidelines, which are considered a common basis for any further MIRAGE reporting guidelines in order to keep the requirements for data analysis short and consistent. MIRAGE Glycan microarray guidelines: for the comprehensive description of glycan array experiments, reporting guidelines for glycan microarray analysis have been developed. In order to assist the authors to reporting in
https://en.wikipedia.org/wiki/List%20of%20computer%20system%20manufacturers
A computer system is a nominally complete computer that includes the hardware, operating system (main software), and the means to use peripheral equipment needed and used for full or mostly full operation. Such systems may constitute personal computers (including desktop computers, portable computers, laptops, all-in-ones, and more), mainframe computers, minicomputers, servers, and workstations, among other classes of computing. The following is a list of notable manufacturers and sellers of computer systems, both present and past. Current Inactive See also Market share of personal computer vendors List of computer hardware manufacturers List of laptop brands and manufacturers List of touch-solution manufacturers Notes
https://en.wikipedia.org/wiki/Superpartient%20ratio
In mathematics, a superpartient ratio, also called superpartient number or epimeric ratio, is a rational number that is greater than one and is not superparticular. The term has fallen out of use in modern pure mathematics, but continues to be used in music theory and in the historical study of mathematics. Superpartient ratios were written about by Nicomachus in his treatise Introduction to Arithmetic. Overview Mathematically, a superpartient number is a ratio of the form (n + a)/n, where a is greater than 1 (a > 1) and is also coprime to n. Ratios of the form (n + 1)/n are also greater than one and fully reduced, but are called superparticular ratios and are not superpartient. Etymology "Superpartient" comes from Greek ἐπιμερής epimeres "containing a whole and a fraction," literally "superpartient". See also Mathematics of musical scales Further reading Partch, Harry (1979). Genesis of a Music, p. 68.
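A small check of the definition above; the helper name below is illustrative only. A ratio p/q > 1 in lowest terms is superparticular when the numerator exceeds the denominator by exactly 1, and superpartient otherwise:

```python
from math import gcd

def classify_ratio(p, q):
    """Classify a ratio p/q (> 1) as 'superparticular', 'superpartient', or neither."""
    if q <= 0 or p <= q:
        return "not greater than one"
    if gcd(p, q) != 1:
        p, q = p // gcd(p, q), q // gcd(p, q)   # reduce to lowest terms first
    if p - q == 1:
        return "superparticular"        # of the form (n + 1)/n, e.g. 3/2
    return "superpartient"              # of the form (n + a)/n with a > 1, e.g. 5/3

print(classify_ratio(3, 2))   # superparticular
print(classify_ratio(5, 3))   # superpartient
print(classify_ratio(8, 6))   # reduces to 4/3 -> superparticular
```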
https://en.wikipedia.org/wiki/Dominated%20convergence%20theorem
In measure theory, Lebesgue's dominated convergence theorem provides sufficient conditions under which almost everywhere convergence of a sequence of functions implies convergence in the L1 norm. Its power and utility are two of the primary theoretical advantages of Lebesgue integration over Riemann integration. In addition to its frequent appearance in mathematical analysis and partial differential equations, it is widely used in probability theory, since it gives a sufficient condition for the convergence of expected values of random variables. Statement Lebesgue's dominated convergence theorem. Let $(f_n)$ be a sequence of complex-valued measurable functions on a measure space $(S, \Sigma, \mu)$. Suppose that the sequence converges pointwise to a function $f$ and is dominated by some integrable function $g$ in the sense that $|f_n(x)| \le g(x)$ for all numbers n in the index set of the sequence and all points $x \in S$. Then f is integrable (in the Lebesgue sense) and $\lim_{n \to \infty} \int_S |f_n - f| \, d\mu = 0$, which also implies $\lim_{n \to \infty} \int_S f_n \, d\mu = \int_S f \, d\mu$. Remark 1. The statement "g is integrable" means that the measurable function $g$ is Lebesgue integrable; i.e. $\int_S |g| \, d\mu < \infty$. Remark 2. The convergence of the sequence and domination by $g$ can be relaxed to hold only $\mu$-almost everywhere provided the measure space $(S, \Sigma, \mu)$ is complete or $f$ is chosen as a measurable function which agrees $\mu$-almost everywhere with the $\mu$-almost everywhere existing pointwise limit. (These precautions are necessary, because otherwise there might exist a non-measurable subset of a set $N \in \Sigma$ with $\mu(N) = 0$, hence $f$ might not be measurable.) Remark 3. If $\mu(S) < \infty$, the condition that there is a dominating integrable function $g$ can be relaxed to uniform integrability of the sequence (fn), see Vitali convergence theorem. Remark 4. While $f_n$ is Lebesgue integrable, it is not in general Riemann integrable. For example, take fn to be defined on $[0, 1]$ so that it is 1/n at rational numbers and zero everywhere else (on the irrationals). The series (fn) converges pointwise to 0, so f is identically zero, but $f_n$ is not Riemann integrable, since its image in every finite interval is $\{0, 1/n\}$ and thus the upper and
https://en.wikipedia.org/wiki/International%20Ideographs%20Core
International Ideographs Core (IICore) is a subset of up to ten thousand CJK Unified Ideographs characters, which can be implemented on devices with limited memory and capability, where it is not feasible to implement the full ISO 10646/Unicode standard. History The IICore subset was initially proposed at the 21st meeting of the Ideographic Rapporteur Group (IRG) in Guilin, held 17–20 November 2003, and was subsequently approved at the group's 22nd meeting in Chengdu in May 2004. See also Chinese character encoding Han unification
https://en.wikipedia.org/wiki/Odontochelys
Odontochelys semitestacea (meaning "toothed turtle with a half-shell") is a Late Triassic relative of turtles. Before Pappochelys was discovered and Eunotosaurus was redescribed, Odontochelys was considered the oldest undisputed member of Pantestudines (i.e. a stem-turtle). It is the only known species in the genus Odontochelys and the family Odontochelyidae. Discovery Odontochelys semitestacea was first described from three 200-million-year-old specimens excavated in Triassic deposits in Guizhou, China. The locale of its discovery at one time was the Nanpanjiang Trough basin, a shallow marine environment surrounded on three sides by land. These deposits preserve an ecosystem known as the Guanling biota, which was dominated by marine reptiles. Description Odontochelys differed grossly from modern turtles. Modern turtles have a horny beak without teeth in their mouth. In contrast, Odontochelys fossils were found to have had teeth embedded in their upper and lower jaws. One of the most striking features of turtles, both modern and prehistoric alike, are their dorsal shells, forming an armored carapace over the body of the animal. Odontochelys only possessed the bottom portion of a turtle's armor, the plastron. It did not yet have a solid carapace as most other turtles do. Instead of a solid carapace, Odontochelys possessed broadened ribs like those of modern turtle embryos that still have not started developing the ossified plates of a carapace. It reached nearly in total body length. Aside from the presence of teeth and the absence of a solid carapace, a few other skeletal traits distinguish Odontochelys as basal compared to other turtles, extant and otherwise. The point of articulation between the dorsal ribs and the vertebrae is decidedly different in Odontochelys than in later turtles. In a comparison of skull proportions, the skull of Odontochelys is far more elongated pre-orbitally (in front of the eyes) compared to other turtles. The tail of the prehistori
https://en.wikipedia.org/wiki/Sister%20Mary%20Joseph%20nodule
In medicine, the Sister Mary Joseph nodule (sometimes Sister Mary Joseph node or Sister Mary Joseph sign) refers to a palpable nodule bulging into the umbilicus as a result of metastasis of a malignant cancer in the pelvis or abdomen. Sister Mary Joseph nodules can be painful to palpation. A periumbilical mass is not always a Sister Mary Joseph nodule. Other conditions that can cause a palpable periumbilical mass include umbilical hernia, infection, and endometriosis. Medical imaging, such as abdominal ultrasound, may be used to distinguish a Sister Mary Joseph nodule from another kind of mass. Gastrointestinal malignancies account for about half of underlying sources (most commonly gastric cancer, colonic cancer or pancreatic cancer, mostly of the tail and body of the pancreas), and men are even more likely to have an underlying cancer of the gastrointestinal tract. Gynecological cancers account for about 1 in 4 cases (primarily ovarian cancer and also uterine cancer). Nodules will also, rarely, originate from appendix cancer spillage and pseudomyxoma peritonei. Unknown primary tumors and rarely, urinary or respiratory tract malignancies can cause umbilical metastases. How exactly the metastases reach the umbilicus remains largely unknown. Proposed mechanisms for the spread of cancer cells to the umbilicus include direct transperitoneal spread, via the lymphatics which run alongside the obliterated umbilical vein, hematogenous spread, or via remnant structures such as the falciform ligament, median umbilical ligament, or a remnant of the vitelline duct. Sister Mary Joseph nodule is associated with multiple peritoneal metastases and a poor prognosis. History Sister Mary Joseph Dempsey (born Julia Dempsey) was a Catholic nun and surgical assistant of William J. Mayo at St. Mary's Hospital in Rochester, Minnesota from 1890 to 1915. She drew Mayo's attention to the phenomenon, and he published an article about it in 1928. The eponymous term Sister Mary Joseph nodul
https://en.wikipedia.org/wiki/Institute%20of%20Genetics%20and%20Genomics%20of%20Geneva
The Institute of Genetics and Genomics of Geneva, also known as iGE3, is a research institute in Geneva, Switzerland. The institute is affiliated with the University of Geneva and focuses on conducting biomedical research and teaching based on genetic and genomic scientific analysis. The abbreviation "iGE3" was devised by its former director Stylianos Antonarakis by combining "institute" and "GEnetics, GEnomics, GEneva". The institute's inauguration ceremony was held on February 8, 2012 and as of October 2020 the institute is directed by Emmanouil Dermitzakis.
https://en.wikipedia.org/wiki/System%20Module
System Modules (originally known as System Building Blocks; the name was changed around 1961) are a DEC modular digital logic family which preceded the later FLIP CHIPs. They connect to the units they are plugged into via a set of 22 gold-plated discrete pins along one edge. They use transistor inverter circuits, with the transistors operating saturated, to avoid dependence on tight tolerances; they use -3V and 0V as logic levels. Intended for prototyping as well as production, they include design features intended to avoid damage. They are provided with design advice which includes loading rules and wiring instructions. They were available in several compatible speed lines: 4000-Series: the second series, nominally 500 kHz, but some 1 MHz 1000-Series: the original series, nominally 5 MHz 6000-Series: higher speeds, nominally 10 MHz 8000-Series: very high speeds, nominally 30 MHz In addition, special modules were available for purposes such as Input/Output (I/O) converters (to standard internal voltages), bus drivers, lamp and solenoid drivers, A/D conversion, relays, core memory drivers, etc. Larger assemblies which are part of the same family provide core memory testing devices. There are also power supplies, mounting panels with slots for the modules, cabinets to hold groups of mounting panels, indicator light panels, etc.
https://en.wikipedia.org/wiki/IFT%20Research%20%26%20Development%20Award
The IFT Research & Development Award has been awarded since 1997. It has been awarded by the Institute of Food Technologists (IFT) to scientists who have made recent and significant research and development contributions to the understanding of food science, food technology, or nutrition. Award winners receive a USD 3000 honorarium and a plaque from IFT. Winners
https://en.wikipedia.org/wiki/Biopolymer
Biopolymers are natural polymers produced by the cells of living organisms. Like other polymers, biopolymers consist of monomeric units that are covalently bonded in chains to form larger molecules. There are three main classes of biopolymers, classified according to the monomers used and the structure of the biopolymer formed: polynucleotides, polypeptides, and polysaccharides. The Polynucleotides, RNA and DNA, are long polymers of nucleotides. Polypeptides include proteins and shorter polymers of amino acids; some major examples include collagen, actin, and fibrin. Polysaccharides are linear or branched chains of sugar carbohydrates; examples include starch, cellulose, and alginate. Other examples of biopolymers include natural rubbers (polymers of isoprene), suberin and lignin (complex polyphenolic polymers), cutin and cutan (complex polymers of long-chain fatty acids), melanin, and polyhydroxyalkanoates (PHAs). In addition to their many essential roles in living organisms, biopolymers have applications in many fields including the food industry, manufacturing, packaging, and biomedical engineering. Biopolymers versus synthetic polymers A major defining difference between biopolymers and synthetic polymers can be found in their structures. All polymers are made of repetitive units called monomers. Biopolymers often have a well-defined structure, though this is not a defining characteristic (example: lignocellulose): The exact chemical composition and the sequence in which these units are arranged is called the primary structure, in the case of proteins. Many biopolymers spontaneously fold into characteristic compact shapes (see also "protein folding" as well as secondary structure and tertiary structure), which determine their biological functions and depend in a complicated way on their primary structures. Structural biology is the study of the structural properties of biopolymers. In contrast, most synthetic polymers have much simpler and more random (or st
https://en.wikipedia.org/wiki/SPECvirt
SPECvirt_sc2010 is a computer benchmark that evaluates the performance of a server computer for virtualization. It is available from the Standard Performance Evaluation Corporation (SPEC). It was introduced in July 2010. The SPECvirt_sc2010 benchmark measures the maximum number of workloads that a platform can simultaneously run while maintaining specific quality of service metrics. Each workload, called a tile, consists of a specific set of virtual machines. In addition to generating results that show performance, the benchmark can also be used to generate performance per watt results. SPEC withdrew support for SPECvirt_sc2010 on February 26, 2014; its successor is SPECvirt_sc2013. See also Performance per watt VMmark
https://en.wikipedia.org/wiki/Chandler%20Robbins%20Award
The ABA Chandler Robbins Award for Education/Conservation is an award given by the American Birding Association to an individual who has made significant contributions either to the education of birders or to bird conservation and the "management or preservation of habitats on which birds and birding depends." The award may also recognize efforts in both fields. One of five awards presented by the ABA for contributions to ornithology, the award is named in honor of Chandler Robbins, who himself advanced both conservation and education. Robbins is author of an influential field guide to birds and the architect of the North American Breeding Bird Survey. The award was first bestowed on Ted Lee Eubanks. List of recipients Since the award's inception in 2000, there have been 17 recipients. See also List of ornithology awards
https://en.wikipedia.org/wiki/Mirror%20TV
A mirror TV or TV mirror is a television that can change into a mirror. Mirror TVs are often used to save space or hide electronics in bathrooms, bedrooms and living rooms. Mirror TVs can be integrated into interior designs, including in smart homes, hotels, offices, gyms, and spas. A mirror TV consists of special semi-transparent mirror glass with an LCD TV behind the mirrored surface. The mirror is carefully polarized to allow an image to transfer through the mirror, such that when the TV is off, the device looks like a mirror. Placement of a mirror TV is important to ensure both good mirror reflection and television picture quality. A space with high levels of lighting is optimal for reflection when the TV looks like a mirror, while low levels of light are ideal for TV viewing. Experts recommend using block-out blinds in bright rooms, such as those with large windows and skylights, when watching television on a mirror TV during the day to reduce the amount of reflection when the TV is on. TV viewing is not affected by reflection on the mirror TV in the evenings. Some manufacturers offer high-end input and output options for entire-home A/V integration.
https://en.wikipedia.org/wiki/Permittivity
In electromagnetism, the absolute permittivity, often simply called permittivity and denoted by the Greek letter ε (epsilon), is a measure of the electric polarizability of a dielectric. A material with high permittivity polarizes more in response to an applied electric field than a material with low permittivity, thereby storing more energy in the material. In electrostatics, the permittivity plays an important role in determining the capacitance of a capacitor. In the simplest case, the electric displacement field D resulting from an applied electric field E is D = εE. More generally, the permittivity is a thermodynamic function of state. It can depend on the frequency, magnitude, and direction of the applied field. The SI unit for permittivity is farad per meter (F/m). The permittivity is often represented by the relative permittivity εr which is the ratio of the absolute permittivity ε and the vacuum permittivity ε0, that is, εr = ε/ε0. This dimensionless quantity is also often and ambiguously referred to as the permittivity. Another common term encountered for both absolute and relative permittivity is the dielectric constant which has been deprecated in physics and engineering as well as in chemistry. By definition, a perfect vacuum has a relative permittivity of exactly 1 whereas at standard temperature and pressure, air has a relative permittivity of κair ≈ 1.0006. Relative permittivity is directly related to electric susceptibility (χ) by χ = εr − 1, otherwise written as ε = εr ε0 = (1 + χ) ε0. The term "permittivity" was introduced in the 1880s by Oliver Heaviside to complement Thomson's (1872) "permeability". Formerly written as p, the designation with ε has been in common use since the 1950s. Units The SI unit for permittivity is farad per meter (F/m or F·m−1). Explanation In electromagnetism, the electric displacement field represents the distribution of electric charges in a given medium resulting from the presence of an electric field E. This distribution includes charge migration and electric dipol
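A short numerical illustration of the role of permittivity in capacitance, using the ideal parallel-plate formula C = ε0 εr A / d; the plate dimensions and dielectric value below are illustrative only:

```python
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def parallel_plate_capacitance(eps_r, area_m2, separation_m):
    """Capacitance C = eps0 * eps_r * A / d of an ideal parallel-plate capacitor."""
    return EPS0 * eps_r * area_m2 / separation_m

# 1 cm^2 plates, 0.1 mm apart: vacuum gap vs. a dielectric with eps_r = 4.5
c_vacuum = parallel_plate_capacitance(1.0, 1e-4, 1e-4)
c_dielectric = parallel_plate_capacitance(4.5, 1e-4, 1e-4)
print(f"{c_vacuum * 1e12:.2f} pF -> {c_dielectric * 1e12:.2f} pF")
```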
https://en.wikipedia.org/wiki/Wine%20fault
A wine fault is a sensory-associated (organoleptic) characteristic of a wine that is unpleasant, and may include elements of taste, smell, or appearance, elements that may arise from a "chemical or a microbial origin", where particular sensory experiences (e.g., an off-odor) might arise from more than one wine fault. Wine faults may result from poor winemaking practices or storage conditions that lead to wine spoilage. In the case of a chemical origin, many compounds causing wine faults are already naturally present in wine, but at insufficient concentrations to be of issue, and in fact may impart positive characters to the wine; however, when the concentration of such compounds exceeds a sensory threshold, they replace or obscure desirable flavors and aromas that the winemaker wants the wine to express. The ultimate result is that the quality of the wine is reduced (less appealing, sometimes undrinkable), with consequent impact on its value (M. Baldy: "The University Wine Course", Third Edition, pp. 37–39, 69–80, 134–140, The Wine Appreciation Guild, 2009). There are many underlying causes of wine faults, including poor hygiene at the winery, excessive or insufficient exposure of the wine to oxygen, excessive or insufficient exposure of the wine to sulphur, overextended maceration of the wine either pre- or post-fermentation, faulty fining, filtering and stabilization of the wine, the use of dirty oak barrels, over-extended barrel aging and the use of poor quality corks. Outside of the winery, other factors within the control of the retailer or end user of the wine can contribute to the perception of flaws in the wine. These include poor storage of the wine that exposes it to excessive heat and temperature fluctuations as well as the use of dirty stemware during wine tasting that can introduce materials or aromas to what was previously a clean and fault-free wine. Differences between flaws and faults In wine tasting, there is
https://en.wikipedia.org/wiki/List%20of%20largest%20non-human%20primates
This is a list of large extant primate species (excluding humans) that can be ordered by average weight or height range. There is no fixed definition of a large primate; it is typically assessed empirically. Primates exhibit the highest levels of sexual dimorphism amongst mammals; therefore, the maximum body dimensions included in this list generally refer to male specimens. Mandrills and baboons are monkeys; the rest of the species on this list are apes. Typically, Old World monkeys (paleotropical) are larger than New World monkeys (neotropical); the reasons for this are not entirely understood but several hypotheses have been generated. As a rule, primate brains are "significantly larger" than those of other mammals with similar body sizes. Until well into the 19th century, juvenile orangutans were taken from the wild and died in short order, eventually leading naturalists to mistakenly assume that the living specimens they briefly encountered and skeletons of adult orangutans were entirely different species. Largest non-human primates See also Largest wild canids List of largest land carnivorans Monkey Great apes List of heaviest land mammals Largest mammals Sexual dimorphism in non-human primates
https://en.wikipedia.org/wiki/ISO/IEC%2029119
ISO/IEC/IEEE 29119 Software and systems engineering -- Software testing is a series of five international standards for software testing. First developed in 2007 and released in 2013, the standard "defines vocabulary, processes, documentation, techniques, and a process assessment model for testing that can be used within any software development lifecycle." History and revisions Development of the set of ISO/IEC/IEEE 29119 software testing standards began in May 2007, based on existing standards such as the Institute of Electrical and Electronics Engineers's IEEE 829 (test documentation), and IEEE 1008 (unit testing); and the BSI Group's BS 7925-1 (vocabulary) and -2 (software components). At first the International Organization for Standardization (ISO) had no working group with significant software testing experience, so the ISO created WG26, which by 2011 was represented by more than 20 different countries. Initially four sections were developed for the standard: Concepts and definitions (1), Test processes (2), Test documentation (3), and Test techniques (4). A fifth part concerning process assessment was considered for addition, ultimately becoming ISO/IEC 33063:2015, which ties to 29119-2's test processes. The actual fifth part of 29119 was published in November 2016 concerning the concept of keyword-driven testing. , no major revisions have occurred to the five parts of the standard. These parts are, from most recent to oldest: ISO/IEC/IEEE 29119-5:2016, Part 5: Keyword-driven testing, published in November 2016 ISO/IEC/IEEE 29119-4:2015, Part 4: Test techniques, published in December 2015 ISO/IEC/IEEE 29119-3:2013, Part 3: Test documentation, published in September 2013 ISO/IEC/IEEE 29119-2:2013, Part 2: Test processes, published in September 2013 ISO/IEC/IEEE 29119-1:2013, Part 1: Concepts and definitions, published in September 2013 Structure and contents ISO/IEC/IEEE 29119-1:2013, Part 1: Concepts and definitions ISO/IEC/IEEE 29119 Part 1 fa
https://en.wikipedia.org/wiki/Interferon%20type%20I
The type-I interferons (IFN) are cytokines which play essential roles in inflammation, immunoregulation, tumor cells recognition, and T-cell responses. In the human genome, a cluster of thirteen functional IFN genes is located at the 9p21.3 cytoband over approximately 400 kb including coding genes for IFNα (IFNA1, IFNA2, IFNA4, IFNA5, IFNA6, IFNA7, IFNA8, IFNA10, IFNA13, IFNA14, IFNA16, IFNA17 and IFNA21), IFNω (IFNW1), IFNɛ (IFNE), IFNк (IFNK) and IFNβ (IFNB1), plus 11 IFN pseudogenes. Interferons bind to interferon receptors. All type I IFNs bind to a specific cell surface receptor complex known as the IFN-α receptor (IFNAR) that consists of IFNAR1 and IFNAR2 chains. Type I IFNs are found in all mammals, and homologous (similar) molecules have been found in birds, reptiles, amphibians and fish species. Sources and functions IFN-α and IFN-β are secreted by many cell types including lymphocytes (NK cells, B-cells and T-cells), macrophages, fibroblasts, endothelial cells, osteoblasts and others. They stimulate both macrophages and NK cells to elicit an anti-viral response, involving IRF3/IRF7 antiviral pathways, and are also active against tumors. Plasmacytoid dendritic cells have been identified as being the most potent producers of type I IFNs in response to antigen, and have thus been coined natural IFN producing cells. IFN-ω is released by leukocytes at the site of viral infection or tumors. IFN-α acts as a pyrogenic factor by altering the activity of thermosensitive neurons in the hypothalamus thus causing fever. It does this by binding to opioid receptors and eliciting the release of prostaglandin-E2 (PGE2). A similar mechanism is used by IFN-α to reduce pain; IFN-α interacts with the μ-opioid receptor to act as an analgesic. In mice, IFN-β inhibits immune cell production of growth factors, thereby slowing tumor growth, and inhibits other cells from producing vessel-producing growth factors, thereby blocking tumor angiogenesis and hindering the tumour fr
https://en.wikipedia.org/wiki/GPS%20week%20number%20rollover
The GPS week number rollover is a phenomenon that happens every 1,024 weeks, which is about 19.6 years. The Global Positioning System (GPS) broadcasts a date, including a week number counter that is stored in only ten binary digits, whose range is therefore 0–1,023. After 1,023, an integer overflow causes the internal value to roll over, changing to zero again. Software that is not coded to anticipate the rollover to zero may stop working or could be moved back in time by 20 or 40 years. GPS is not only used for positioning, but also for accurate time. Time is used to accurately synchronize payment operations, broadcasters, and mobile operators. 1999 occurrence The first rollover took place midnight (UTC) August 21 to 22, 1999. NavCen issued an advisory prior to the rollover stating that some devices would not tolerate the rollover. Because of the relatively limited use of GPS during the 1999 rollover, disruption was minor. 2019 occurrence The second rollover occurred on the night of April 6 to 7, 2019, when GPS Week 2,047, represented as 1,023 in the counter, advanced and rolled over to 0 within the counter. The United States Department of Homeland Security, the International Civil Aviation Organization, and others issued a warning about this event. Effects Products known to have been affected by the 2019 rollover include Honeywell's flight management and navigation software that caused delays for a KLM flight and cancellations for numerous flights in China because the technicians failed to patch the software. Furthermore, the New York City Wireless Network (NYCWiN), a private network for New York City's municipal services, crashed. Other products that were affected by the rollover include cellphones that were sold in 2013 or earlier, certain types of older Vaisala radiosonde groundstations, suspending launches at some stations for up to two weeks, NOAA's weather buoys, many scientific instruments, and consumer GPS navigation devices. Prior to return to
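A minimal sketch of how a receiver can resolve the ambiguity created by the 10-bit week counter: given a trusted reference date (for example, the firmware build date), pick the 1,024-week epoch that places the broadcast week number at or after that reference. Function and variable names are illustrative.

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)  # start of GPS week 0
WEEKS_PER_ROLLOVER = 1024                               # 10-bit week counter

def resolve_gps_week(broadcast_week, reference_date):
    """Return the full GPS week number for a 10-bit broadcast week (0-1023),
    assuming the true date is not earlier than reference_date."""
    ref_week = (reference_date - GPS_EPOCH).days // 7
    rollovers = (ref_week - broadcast_week + WEEKS_PER_ROLLOVER - 1) // WEEKS_PER_ROLLOVER
    return broadcast_week + max(rollovers, 0) * WEEKS_PER_ROLLOVER

# A receiver that knows it was built after 2015 interprets broadcast week 10 correctly:
full_week = resolve_gps_week(10, datetime(2015, 1, 1, tzinfo=timezone.utc))
print(full_week, GPS_EPOCH + timedelta(weeks=full_week))  # week 2058, a date in mid-2019
```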
https://en.wikipedia.org/wiki/Mitogen-activated%20protein%20kinase%209
Mitogen-activated protein kinase 9 is an enzyme that in humans is encoded by the MAPK9 gene. Function The protein encoded by this gene is a member of the MAP kinase family. MAP kinases act as an integration point for multiple biochemical signals, and are involved in a wide variety of cellular processes such as proliferation, differentiation, transcription regulation and development. This kinase targets specific transcription factors, and thus mediates immediate-early gene expression in response to various cell stimuli. It is most closely related to MAPK8; both are involved in UV radiation-induced apoptosis, thought to be related to the cytochrome c-mediated cell death pathway. This gene and MAPK8 are also known as c-Jun N-terminal kinases. This kinase blocks the ubiquitination of tumor suppressor p53, and thus it increases the stability of p53 in nonstressed cells. Studies of this gene's mouse counterpart suggest a key role in T-cell differentiation. Four alternatively spliced transcript variants encoding distinct isoforms have been reported. Interactions Mitogen-activated protein kinase 9 has been shown to interact with: Grb2, MAPK8IP1, MAPK8IP2, MAPK8IP3, p53, and TOB1.
https://en.wikipedia.org/wiki/CUBRID
CUBRID ( "cube-rid") is an open-source SQL-based relational database management system (RDBMS) with object extensions developed by CUBRID Corp. for OLTP. The name CUBRID is a combination of the two words cube and bridge, cube standing for a space for data and bridge standing for data bridge. License policy CUBRID has a separate license for its server engine and its interfaces. The server engine adopts the Apache License 2.0, which allows distribution, modification, and acquisition of the source code. CUBRID APIs and GUI tools have the Berkeley Software Distribution license in which there is no obligation of opening derivative works. The reason of adopting two separate license systems is to provide complete freedom to Independent software vendors (ISV) to develop and distribute CUBRID-based applications. Architecture The feature that distinguishes CUBRID database from other relational database systems is its 3-tier client-server architecture which consists of the database server, the connection broker and the application layer. Database server The database server is the component of the CUBRID database management system which is responsible for storage operations and statement execution. A CUBRID database server instance can mount and use a single database, making inter-database queries impossible. However, more than one instance can run on a machine. Unlike other solutions, the database server does not compile queries itself, but executes queries precompiled in a custom access specification language. Connection broker The CUBRID connection broker's main roles are: management of client application connections caching and relaying information (e.g. query results) query syntax analysis, optimization and execution plan generation Also, a local object pool enables some parts of the execution to be deferred from the database server (e.g. tuple insertion and deletion, DDL statements), thus lowering the database server load. Since the connection broker is not b
https://en.wikipedia.org/wiki/Transaction-level%20modeling
Transaction-level modeling (TLM) is an approach to modelling complex digital systems by using electronic design automation software. TLM language (TLML) is a hardware description language, usually, written in C++ and based on SystemC library. TLMLs are used for modelling where details of communication among modules are separated from the details of the implementation of functional units or of the communication architecture. It's used for modelling of systems that involve complex data communication mechanisms. Components such as buses or FIFOs are modeled as channels, and are presented to modules using SystemC interface classes. Transaction requests take place by calling interface functions of these channel models, which encapsulate low-level details of the information exchange. At the transaction level, the emphasis is more on the functionality of the data transfers – what data are transferred to and from what locations – and less on their actual implementation, that is, on the actual protocol used for data transfer. This approach makes it easier for the system-level designer to experiment, for example, with different bus architectures (all supporting a common abstract interface) without having to recode models that interact with any of the buses, provided these models interact with the bus through the common interface. However, the application of transaction-level modeling is not specific to the SystemC language and can be used with other languages. The concept of TLM first appears in system level language and modeling domain. Transaction-level models are used for high-level synthesis of register-transfer level (RTL) models for a lower-level modelling and implementation of system components. RTL is usually represented by a hardware description language source code (e.g. VHDL, SystemC, Verilog). History In 2000, Thorsten Grötker, R&D Manager at Synopsys was preparing a presentation on the communication mechanism in what was to become the SystemC 2.0 standard, a
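TLM models are normally written in SystemC/C++; the snippet below is only a language-agnostic illustration (in Python, with invented names) of the idea described above: modules exchange whole transactions by calling interface functions of a channel model, rather than driving the individual wires of a bus protocol.

```python
from collections import deque

class FifoChannel:
    """A channel model exposing transaction-level interface functions."""
    def __init__(self):
        self._queue = deque()
    def write(self, transaction):   # interface function: send one whole transaction
        self._queue.append(transaction)
    def read(self):                 # interface function: receive one transaction, or None
        return self._queue.popleft() if self._queue else None

class Producer:
    def __init__(self, channel):
        self.channel = channel
    def run(self):
        # The module only states *what* is transferred; the channel hides *how*.
        for addr, data in [(0x10, 0xAB), (0x14, 0xCD)]:
            self.channel.write({"addr": addr, "data": data})

class Consumer:
    def __init__(self, channel):
        self.channel = channel
    def run(self):
        while (txn := self.channel.read()) is not None:
            print(f"received write of {txn['data']:#x} to {txn['addr']:#x}")

bus = FifoChannel()
Producer(bus).run()
Consumer(bus).run()
```

Because the producer and consumer only depend on the channel's write/read interface, the channel model could be swapped for a detailed bus model without recoding either module, which is the experimentation benefit the article describes.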
https://en.wikipedia.org/wiki/List%20of%20LDAP%20software
The following is a list of software programs that can communicate with and/or host directory services via the Lightweight Directory Access Protocol (LDAP). Client software Cross-platform Admin4 - an open source LDAP browser and directory client for Linux, OS X, and Microsoft Windows, implemented in Python. Apache Directory Server/Studio - an LDAP browser and directory client for Linux, OS X, and Microsoft Windows, and as a plug-in for the Eclipse development environment. FusionDirectory, a web application under license GNU General Public License developed in PHP for managing LDAP directory and associated services. JXplorer - a Java-based browser that runs in any operating environment. JXWorkBench - a Java-based plugin to JXplorer that includes LDAP reporting using the JasperReports reporting engine. LDAP Account Manager - a PHP based webfrontend for managing various account types in an LDAP directory. phpLDAPadmin - a web-based LDAP administration tool for creating and editing LDAP entries in any LDAP server. LDAP User Manager - A simple PHP interface to add LDAP users and groups. Also has a self-service password change feature. Designed to be run as a Docker container. SLAMD - an open source load generation software suite, for testing multiple application protocols, including LDAP. Also contains tools for creating test data and test scripts. RoundCube - an open source and free PHP IMAP client with support with LDAP based address books. GOsa² - provides a powerful framework for managing accounts and systems in LDAP databases web2ldap, a web application under license Apache License 2.0 developed in Python for managing LDAP directories. OpenDJ - a Java-based LDAP server and directory client that runs in any operating environment, under license CDDL LDAP Explorer - a VS Code extension to browse LDAP servers Linux/UNIX Evolution - the contacts part of GNOME's PIM can query LDAP servers. KAddressBook - the address book application for KDE, capabl
https://en.wikipedia.org/wiki/Address%20Resolution%20Protocol
The Address Resolution Protocol (ARP) is a communication protocol used for discovering the link layer address, such as a MAC address, associated with a given internet layer address, typically an IPv4 address. This mapping is a critical function in the Internet protocol suite. ARP was defined in 1982 by , which is Internet Standard STD 37. ARP has been implemented with many combinations of network and data link layer technologies, such as IPv4, Chaosnet, DECnet and Xerox PARC Universal Packet (PUP) using IEEE 802 standards, FDDI, X.25, Frame Relay and Asynchronous Transfer Mode (ATM). In Internet Protocol Version 6 (IPv6) networks, the functionality of ARP is provided by the Neighbor Discovery Protocol (NDP). Operating scope The Address Resolution Protocol is a request-response protocol. Its messages are directly encapsulated by a link layer protocol. It is communicated within the boundaries of a single subnetwork and is never routed. Packet structure The Address Resolution Protocol uses a simple message format containing one address resolution request or response. The packets are carried at the data link layer of the underlying network as raw payload. In the case of Ethernet, a EtherType value is used to identify ARP frames. The size of the ARP message depends on the link layer and network layer address sizes. The message header specifies the types of network in use at each layer as well as the size of addresses of each. The message header is completed with the operation code for request (1) and reply (2). The payload of the packet consists of four addresses, the hardware and protocol address of the sender and receiver hosts. The principal packet structure of ARP packets is shown in the following table which illustrates the case of IPv4 networks running on Ethernet. In this scenario, the packet has 48-bit fields for the sender hardware address (SHA) and target hardware address (THA), and 32-bit fields for the corresponding sender and target protocol addresses
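A sketch of the IPv4-over-Ethernet ARP request layout described above, packed with Python's struct module. The MAC and IP addresses are placeholders; in a real frame this 28-byte payload would follow an Ethernet header carrying the ARP EtherType (0x0806).

```python
import struct
import socket

def build_arp_request(sender_mac, sender_ip, target_ip):
    """Build the 28-byte ARP payload for an IPv4-over-Ethernet request."""
    htype = 1                      # hardware type: Ethernet
    ptype = 0x0800                 # protocol type: IPv4
    hlen, plen = 6, 4              # MAC is 48 bits, IPv4 address is 32 bits
    oper = 1                       # operation code: 1 = request, 2 = reply
    sha = bytes.fromhex(sender_mac.replace(":", ""))   # sender hardware address
    spa = socket.inet_aton(sender_ip)                  # sender protocol address
    tha = b"\x00" * 6                                  # target hardware address (unknown)
    tpa = socket.inet_aton(target_ip)                  # target protocol address
    return struct.pack("!HHBBH6s4s6s4s", htype, ptype, hlen, plen, oper,
                       sha, spa, tha, tpa)

packet = build_arp_request("02:00:00:00:00:01", "192.0.2.1", "192.0.2.2")
print(len(packet), packet.hex())   # 28 bytes
```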
https://en.wikipedia.org/wiki/Maximal%20entropy%20random%20walk
Maximal entropy random walk (MERW) is a popular type of biased random walk on a graph, in which transition probabilities are chosen accordingly to the principle of maximum entropy, which says that the probability distribution which best represents the current state of knowledge is the one with largest entropy. While standard random walk chooses for every vertex uniform probability distribution among its outgoing edges, locally maximizing entropy rate, MERW maximizes it globally (average entropy production) by assuming uniform probability distribution among all paths in a given graph. MERW is used in various fields of science. A direct application is choosing probabilities to maximize transmission rate through a constrained channel, analogously to Fibonacci coding. Its properties also made it useful for example in analysis of complex networks, like link prediction, community detection, robust transport over networks and centrality measures. Also in image analysis, for example for detecting visual saliency regions, object localization, tampering detection or tractography problem. Additionally, it recreates some properties of quantum mechanics, suggesting a way to repair the discrepancy between diffusion models and quantum predictions, like Anderson localization. Basic model Consider a graph with vertices, defined by an adjacency matrix : if there is an edge from vertex to , 0 otherwise. For simplicity assume it is an undirected graph, which corresponds to a symmetric ; however, MERW can also be generalized for directed and weighted graphs (for example Boltzmann distribution among paths instead of uniform). We would like to choose a random walk as a Markov process on this graph: for every vertex and its outgoing edge to , choose probability of the walker randomly using this edge after visiting . Formally, find a stochastic matrix (containing the transition probabilities of a Markov chain) such that for all and for all . Assuming this graph is connected
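A short numerical sketch of the MERW construction on an undirected graph: the transition probabilities follow from the leading eigenvalue λ and eigenvector ψ of the adjacency matrix as S_ij = (A_ij/λ)(ψ_j/ψ_i), which makes every path of a given length between two fixed vertices equally probable. Names and the example graph are illustrative.

```python
import numpy as np

def merw_transition_matrix(A):
    """Maximal entropy random walk transition matrix for a connected,
    undirected graph with symmetric 0/1 adjacency matrix A."""
    A = np.asarray(A, dtype=float)
    eigvals, eigvecs = np.linalg.eigh(A)          # symmetric -> real spectrum
    lam = eigvals[-1]                             # largest (Perron-Frobenius) eigenvalue
    psi = np.abs(eigvecs[:, -1])                  # corresponding positive eigenvector
    S = (A / lam) * psi[np.newaxis, :] / psi[:, np.newaxis]   # S_ij = A_ij * psi_j / (lam * psi_i)
    return S

# Path graph on 4 vertices: MERW biases the walk toward the centre,
# unlike the standard walk, which is uniform over a vertex's neighbours.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
S = merw_transition_matrix(A)
print(S.round(3))
print(S.sum(axis=1))   # each row sums to 1, as required of a stochastic matrix
```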
https://en.wikipedia.org/wiki/Elias%20omega%20coding
Elias ω coding or Elias omega coding is a universal code encoding the positive integers developed by Peter Elias. Like Elias gamma coding and Elias delta coding, it works by prefixing the positive integer with a representation of its order of magnitude in a universal code. Unlike those other two codes, however, Elias omega recursively encodes that prefix; thus, they are sometimes known as recursive Elias codes. Omega coding is used in applications where the largest encoded value is not known ahead of time, or to compress data in which small values are much more frequent than large values. To encode a positive integer N: Place a "0" at the end of the code. If N = 1, stop; encoding is complete. Prepend the binary representation of N to the beginning of the code. This will be at least two bits, the first bit of which is a 1. Let N equal the number of bits just prepended, minus one. Return to Step 2 to prepend the encoding of the new N. To decode an Elias omega-encoded positive integer: Start with a variable N, set to a value of 1. If the next bit is a "0" then stop. The decoded number is N. If the next bit is a "1" then read it plus N more bits, and use that binary number as the new value of N. Go back to Step 2. Examples Omega codes can be thought of as a number of "groups". A group is either a single 0 bit, which terminates the code, or two or more bits beginning with 1, which is followed by another group. The first few codes are shown below. Included is the so-called implied distribution, describing the distribution of values for which this coding yields a minimum-size code; see Relationship of universal codes to practical compression for details. The encoding for 1 googol, 10100, is 11 1000 101001100 (15 bits of length header) followed by the 333-bit binary representation of 1 googol, which is 10010 01001001 10101101 00100101 10010100 11000011 01111100 11101011 00001011 00100111 10000100 11000100 11001110 00001011 11110011 10001010 11001110 01000000 1000111
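A direct transcription of the encoding and decoding procedures listed above into Python; the functions work on bit strings, with bits taken most significant first.

```python
def elias_omega_encode(n):
    """Encode a positive integer as an Elias omega bit string."""
    assert n >= 1
    code = "0"                              # step 1: terminating zero at the end
    while n > 1:                            # step 2: stop once N == 1
        binary = format(n, "b")             # step 3: prepend binary representation of N
        code = binary + code
        n = len(binary) - 1                 # step 4: new N = number of bits prepended, minus one
    return code

def elias_omega_decode(bits):
    """Decode one Elias omega code from a bit string; returns (value, bits_consumed)."""
    n, pos = 1, 0
    while bits[pos] == "1":                 # a leading 1 starts another group
        group = bits[pos:pos + n + 1]       # read that bit plus N more bits ...
        n = int(group, 2)                   # ... and use them as the new N
        pos += len(group)
    return n, pos + 1                       # the final "0" terminates the code

for value in [1, 2, 3, 4, 16, 100, 1_000_000]:
    code = elias_omega_encode(value)
    assert elias_omega_decode(code)[0] == value
    print(value, code)
```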
https://en.wikipedia.org/wiki/ALTQ
ALTQ (ALTernate Queueing) is the network scheduler for Berkeley Software Distribution. ALTQ provides queueing disciplines, and other components related to quality of service (QoS), required to realize resource sharing. It is most commonly implemented on BSD-based routers. ALTQ is included in the base distribution of FreeBSD, NetBSD, and DragonFly BSD, and was integrated into the pf packet filter of OpenBSD but later replaced by a new queueing subsystem (it was deprecated with OpenBSD 5.5 release, and completely removed with 5.6 in 2014). With ALTQ, packets can be assigned to queues for the purpose of bandwidth control. The scheduler defines the algorithm used to decide which packets get delayed, dropped or sent out immediately. There are five schedulers currently supported in the FreeBSD implementation of ALTQ: — Class-based Queueing. Queues attached to an interface build a tree, thus each queue can have further child queues. Each queue can have a priority and a bandwidth assigned. Priority mainly controls the time packets take to get sent out, while bandwidth has primarily effects on throughput. — Controlled Delay. Attempts to combat bufferbloat. — Fair Queuing. Attempts to fairly distribute bandwidth among all connections. — Hierarchical Fair Service Curve. Queues attached to an interface build a tree, thus each queue can have further child queues. Each queue can have a priority and a bandwidth assigned. Priority mainly controls the time packets take to get sent out, while bandwidth has primarily effects on throughput. — Priority Queueing. Queues are flat attached to the interface, thus, queues cannot have further child queues. Each queue has a unique priority assigned, ranging from 0 to 15. Packets in the queue with the highest priority are processed first. See also Traffic shaping KAME project
https://en.wikipedia.org/wiki/Davisson%E2%80%93Germer%20Prize
The Davisson–Germer Prize in Atomic or Surface Physics is an annual prize that has been awarded by the American Physical Society since 1965. The recipient is chosen for "outstanding work in atomic physics or surface physics". The prize is named after Clinton Davisson and Lester Germer, who first measured electron diffraction, and as of 2007 it is valued at $5,000. Recipients 2023: Feng Liu 2022: David S. Weiss 2021: Michael F. Crommie 2020: Klaas Bergmann 2019: Randall M. Feenstra 2018: 2017: and Stephen Kevan 2016: Randall G. Hulet 2015: and 2014: Nora Berrah 2013: Geraldine L. Richmond 2012: Jean Dalibard 2011: Joachim Stohr 2010: Chris H. Greene 2009: and Krishnan Raghavachari 2008: 2007: 2006: 2005: Ernst G. Bauer 2004: 2003: Rudolf M. Tromp 2002: Gerald Gabrielse 2001: Donald M. Eigler 2000: William Happer 1999: Steven Gwon Sheng Louie 1998: Sheldon Datz 1997: Jerry D. Tersoff 1996: 1995: Max G. Lagally 1994: Carl Weiman [sic] 1993: 1992: 1991: 1990: David Wineland 1989: 1988: John L. Hall 1987: 1986: Daniel Kleppner 1985: 1984: and 1983: E. W. Plummer 1982: Llewellyn H. Thomas 1981: Robert Gomer 1980: Alexander Dalgarno 1979: and 1978: Vernon Hughes 1977: Walter Kohn and 1976: Ugo Fano 1975: and Homer D. Hagstrum 1974: Norman Ramsey 1972: Erwin Wilhelm Müller 1970: Hans Dehmelt 1967: Horace Richard Crane 1965: Source: See also List of physics awards
https://en.wikipedia.org/wiki/Spanish%20Statistics%20and%20Operations%20Research%20Society
The Sociedad de Estadística e Investigación Operativa (SEIO, Statistics and Operations Research Society) is the professional non-profit society for the scientific fields of Statistics and Operations Research in Spain. It was founded in 1962 and it is dedicated to the development, improvement and promotion of the methods and applications of Statistics and Operations Research in its widest possible sense. The society has an ultimate goal of putting Statistics and Operations Research to the service of science and society. The society is recognized by the International Federation of Operational Research Societies and its subgrouping, the Association of European Operational Research Societies, as the main national society for Operations Research in its country. SEIO is also member of CIMPA, Centre International de Mathématiques Pures et Appliquées, Confederación de Sociedades Científicas de España (COSCE), and of the European Mathematical Society (EMS). History SEIO was created in 1962 initially as an Operations Research society. In 1976, the fields of Statistics and Computer Science were included in the society. Since 1962 the society has had the following presidents: Sixto Ríos García Pilar Ibarrola Muñoz Daniel Peña Sánchez de Rivera Jesús T. Pastor Ciurana Domingo Morales González Leandro Pardo Llorente Emilio Carrizosa Priego Begoña Vitoriano Villanueva Governance SEIO is governed by a three-year term president who manages the association with an executive council and two vice-presidents, one for statistics and one for operations research. There are also two academic councils, one per discipline. A president elect supports the president during one year. On a proposal of the members, within SEIO some working groups can be formed, based on either geographical or methodological and application affinity. The headquarters of SEIO is at the Universidad Complutense de Madrid, Facultad de Matemáticas, Despacho 502, 3 Plaza de Ciencias, 28040 Mad
https://en.wikipedia.org/wiki/Analog%20front-end
An analog front-end (AFE or analog front-end controller AFEC) is a set of analog signal conditioning circuitry that uses sensitive analog amplifiers, often operational amplifiers, filters, and sometimes application-specific integrated circuits for sensors, radio receivers, and other circuits to provide a configurable and flexible electronics functional block needed to interface a variety of sensors to an antenna, analog-to-digital converter or, in some cases, to a microcontroller. A radio frequency AFE is used in radio receivers, known as an RF front end. Modules AFE hardware modules are used as interface sensors for many kinds of analog and digital systems, providing hardware modularity. For example, Texas Instruments markets health monitoring AFEs as the ADS1298, AFE4400 and AFE4490. Atmel markets analog front-ends for smart meters. Analog Devices markets a CN0209 product for test and measurement applications.
https://en.wikipedia.org/wiki/Acid%20guanidinium%20thiocyanate-phenol-chloroform%20extraction
Acid guanidinium thiocyanate-phenol-chloroform extraction (abbreviated AGPC) is a liquid–liquid extraction technique in biochemistry. It is widely used in molecular biology for isolating RNA (as well as DNA and protein in some cases). This method may take longer than a column-based system such as the silica-based purification, but has higher purity and the advantage of high recovery of RNA: an RNA column is typically unsuitable for purification of short (<200 nucleotides) RNA species, such as siRNA, miRNA, gRNA and tRNA. It was originally devised by Piotr Chomczynski and Nicoletta Sacchi, who published their protocol in 1987. The reagent is sold by Sigma-Aldrich by the name TRI Reagent; by Invitrogen under the name TRIzol; by Bioline as Trisure; and by Tel-Test as STAT-60. How it works This method relies on phase separation by centrifugation of a mixture of the aqueous sample and a solution containing water-saturated phenol and chloroform, resulting in an upper aqueous phase and a lower organic phase (mainly phenol). Guanidinium thiocyanate, a chaotropic agent, is added to the organic phase to aid in the denaturation of proteins (such as those that strongly bind nucleic acids or those that degrade RNA). The nucleic acids (RNA and/or DNA) partition into the aqueous phase, while protein partitions into the organic phase. The pH of the mixture determines which nucleic acids get purified. Under acidic conditions (pH 4-6), DNA partitions into the organic phase while RNA remains in the aqueous phase. Under neutral conditions (pH 7-8), both DNA and RNA partition into the aqueous phase. In a last step, the nucleic acids are recovered from the aqueous phase by precipitation with 2-propanol. The precipitate is then washed with ethanol, and the pellet is briefly air-dried and dissolved in TE buffer or RNase-free water. Guanidinium thiocyanate denatures proteins, including RNases, and separates rRNA from ribosomal proteins, while phenol, isopropanol and water are solvents with po
https://en.wikipedia.org/wiki/Multi-attribute%20utility
In decision theory, a multi-attribute utility function is used to represent the preferences of an agent over bundles of goods either under conditions of certainty about the results of any potential choice, or under conditions of uncertainty. Preliminaries A person has to decide between two or more options. The decision is based on the attributes of the options. The simplest case is when there is only one attribute, e.g.: money. It is usually assumed that all people prefer more money to less money; hence, the problem in this case is trivial: select the option that gives you more money. In reality, there are two or more attributes. For example, a person has to select between two employment options: option A gives him $12K per month and 20 days of vacation, while option B gives him $15K per month and only 10 days of vacation. The person has to decide between (12K,20) and (15K,10). Different people may have different preferences. Under certain conditions, a person's preferences can be represented by a numeric function. The article ordinal utility describes some properties of such functions and some ways by which they can be calculated. Another consideration that might complicate the decision problem is uncertainty. Although there are at least four sources of uncertainty - the attribute outcomes, and a decisionmaker's fuzziness about: a) the specific shapes of the individual attribute utility functions, b) the aggregating constants' values, and c) whether the attribute utility functions are additive, these terms being addressed presently - uncertainty henceforth means only randomness in attribute levels. This uncertainty complication exists even when there is a single attribute, e.g.: money. For example, option A might be a lottery with 50% chance to win $2, while option B is to win $1 for sure. The person has to decide between the lottery <2:0.5> and the lottery <1:1>. Again, different people may have different preferences. Again, under certain conditions the pr
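Under an additive model, the employment example above can be evaluated numerically. The Python sketch below uses invented weights and single-attribute utility functions purely for illustration; an additive form is only valid under appropriate independence conditions, and nothing here is taken from the original text beyond the two options themselves.

```python
# Hypothetical additive utility: u(salary, vacation) = w1*u1(salary) + w2*u2(vacation).
# The weights and the single-attribute utilities are invented for illustration only.

def u_salary(k_per_month):          # scaled to [0, 1] over an assumed 0-20K range
    return min(k_per_month / 20.0, 1.0)

def u_vacation(days):               # scaled to [0, 1] over an assumed 0-30 day range
    return min(days / 30.0, 1.0)

def utility(salary_k, vacation_days, w_salary=0.6, w_vacation=0.4):
    return w_salary * u_salary(salary_k) + w_vacation * u_vacation(vacation_days)

option_a = utility(12, 20)   # 0.6*0.60 + 0.4*0.667 ~= 0.627
option_b = utility(15, 10)   # 0.6*0.75 + 0.4*0.333 ~= 0.583
print("prefer A" if option_a > option_b else "prefer B")   # this agent prefers A
```

With different weights (for example, a heavier weight on salary) the same two options can rank the other way, which is exactly the point that different people may have different preferences.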
https://en.wikipedia.org/wiki/Pinstripes
Pinstripes are a pattern of very thin stripes of any color running in parallel. The pattern is often found in fashion. The pinstripe is often compared to the similar chalk stripe. Pinstripes are very thin, often in width, and are created with one single-warp yarn. Although found mostly in men's suits, any type of fabric can be pinstriped. Pinstripes were originally worn only on suit pants but upon being adopted in America during the 20th century they were also used on suit jackets. The Chicago Cubs' baseball uniforms have had pinstripes since 1907 and they are recognized as the first Major League Baseball team to incorporate pinstriping into a baseball uniform. Many other former and current Major League Baseball teams—including the Florida Marlins, Minnesota Twins, Montreal Expos, Colorado Rockies, New York Mets, New York Yankees, Chicago White Sox, Detroit Tigers, Philadelphia Phillies, Houston Astros and the San Diego Padres—later adopted pinstripes on their own uniforms. The Yankees, in particular, are associated with the pattern. This was later carried over into the National Basketball Association, with teams like the Chicago Bulls, Charlotte Hornets and Orlando Magic incorporating pinstripes into their uniforms. In baseball lingo, the term "wearing pinstripes" has become synonymous with being a member of the Yankees, despite other teams using them in their uniforms.
https://en.wikipedia.org/wiki/John%20Price%20Wetherill%20Medal
The John Price Wetherill Medal was an award of the Franklin Institute. It was established with a bequest given by the family of John Price Wetherill (1844–1906) on April 3, 1917. On June 10, 1925, the Board of Managers voted to create a silver medal, to be awarded for "discovery or invention in the physical sciences" or "new and important combinations of principles or methods already known". The legend on the first medal read: "for discovery, invention, or development in the physical sciences". The John Price Wetherill Medal was last awarded in 1997. As of 1998 all of the endowed medals previously awarded by the Franklin Institute were reorganized as the Benjamin Franklin Medals. Recipients 1926 Frank Twyman, Wagner Electric Corporation 1927 Carl Ethan Akeley, North East Appliances Inc. 1928 Albert S. Howell, Frank E. Ross 1929 Gustave Fast, William H. Mason, Johannes Ruths 1930 Charles S. Chrisman, William N. Jennings 1931 Thomas Tarvin Gray, Arthur J. Mason, Edwin G. Steele, Walter L. Steele, Henry M. Sutton, Edward C. Wente 1932 Halvor O. Hem, Monroe Calculating Machine Company, Carl George Munters, Baltzar von Platen, Frank Wenner 1933 Henry S. Hulbert, Industrial Brownhoist Corporation, Koppers Company, Francis C. McMath, Robert R. McMath 1934 E. Newton Harvey, Alfred L. Loomis, Johannes B. Ostermeier 1935 F. Hope-Jones, Francis Ferdinand Lucas, Robert E. Naumburg, William H. Shortt, James Edmond Shrader, Louis Bryant Tuckermann, Henry Ellis Warren 1936 Albert L. Marsh 1939 William Albert Hyde 1940 Laurens Hammond, Edward Ernst Kleinschmidt, Howard L. Krum 1941 Harold Stephen Black 1943 Robert Howland Leach 1944 Richard C. DuPont, Willem Fredrik Westendorp 1946 Lewis A. Rodert 1947 Kenneth S. M. Davidson 1948 Wendell Frederick Hess 1949 Edgar Collins Bain, Thomas L. Fawick, Harlan D. Fowler 1950 Donald William Kerst, Sigurd Varian, Russell Varian 1951 Samuel C. Collins, Reid Berry Gray, Gaylord W. Penney 1952 Martin E. Nordberg, Har
https://en.wikipedia.org/wiki/Foxfire
Foxfire, also called fairy fire and chimpanzee fire, is the bioluminescence created by some species of fungi present in decaying wood. The bluish-green glow is attributed to a luciferase, an oxidative enzyme, which emits light as it reacts with a luciferin. The phenomenon has been known since ancient times, with its source determined in 1823. Description Foxfire is the bioluminescence created by some species of fungi present in decaying wood. It occurs in a number of species, including Panellus stipticus, Omphalotus olearius and Omphalotus nidiformis. The bluish-green glow is attributed to luciferin, which emits light after oxidation catalyzed by the enzyme luciferase. Some believe that the light attracts insects to spread spores, or acts as a warning to hungry animals, like the bright colors exhibited by some poisonous or unpalatable animal species. Although generally very dim, in some cases foxfire is bright enough to read by. History The oldest recorded documentation of foxfire is from 382 B.C., by Aristotle, whose notes refer to a light that, unlike fire, was cold to the touch. The Roman thinker Pliny the Elder also mentioned glowing wood in olive groves. Foxfire was used to illuminate the needles on the barometer and the compass of Turtle, an early submarine. This is commonly thought to have been suggested by Benjamin Franklin; a reading of the correspondence from Benjamin Gale, however, shows that Benjamin Franklin was only consulted for alternative forms of lighting when the cold temperatures rendered the foxfire inactive. After many more literary references to foxfire by early scientists and naturalists, its cause was discovered in 1823. The glow emitted from wooden support beams in mines was examined, and it was found that the luminescence came from fungal growth. The "fox" in foxfire may derive from the Old French word , meaning "false", rather than from the name of the animal. The association of foxes with such fires is widespread, however, and occ
https://en.wikipedia.org/wiki/Haruki%27s%20Theorem
Haruki's Theorem says that given three circles, each pair of which intersects at two points, the segments connecting the inner intersection points to the outer ones satisfy $\frac{a \cdot c \cdot e}{b \cdot d \cdot f} = 1$, where $a, b, c, d, e, f$ are the measures of the segments connecting the inner and outer intersection points.
https://en.wikipedia.org/wiki/Nucleotide%20diversity
Nucleotide diversity is a concept in molecular genetics which is used to measure the degree of polymorphism within a population. One commonly used measure of nucleotide diversity was first introduced by Nei and Li in 1979. This measure is defined as the average number of nucleotide differences per site between two DNA sequences in all possible pairs in the sample population, and is denoted by $\pi$. An estimator for $\pi$ is given by: $\hat{\pi} = \frac{n}{n-1}\sum_{ij} x_i x_j \pi_{ij} = \frac{n}{n-1}\sum_{i=2}^{n}\sum_{j=1}^{i-1} 2 x_i x_j \pi_{ij}$, where $x_i$ and $x_j$ are the respective frequencies of the $i$th and $j$th sequences, $\pi_{ij}$ is the number of nucleotide differences per nucleotide site between the $i$th and $j$th sequences, and $n$ is the number of sequences in the sample. The term $\frac{n}{n-1}$ in front of the sums guarantees an unbiased estimator, which does not depend on how many sequences you sample. Nucleotide diversity is a measure of genetic variation. It is usually associated with other statistical measures of population diversity, and is similar to expected heterozygosity. This statistic may be used to monitor diversity within or between ecological populations, to examine the genetic variation in crops and related species, or to determine evolutionary relationships. Nucleotide diversity can be calculated by examining the DNA sequences directly, or may be estimated from molecular marker data, such as Random Amplified Polymorphic DNA (RAPD) data and Amplified Fragment Length Polymorphism (AFLP) data. Software
DnaSP — DNA Sequence Polymorphism, is a software package for the analysis of nucleotide polymorphism from aligned DNA sequence data.
MEGA, Molecular Evolutionary Genetics Analysis, is a software package used for estimating rates of molecular evolution, as well as generating phylogenetic trees, and aligning DNA sequences. Available for Windows, Linux and Mac OS X (since ver. 5.x).
Arlequin3 software can be used for calculations of nucleotide diversity and a variety of other statistical tests for intra-population and inter-population analyses. Available for Windows.
Variscan R package
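When every sampled sequence is counted once (so each frequency is 1/n), the estimator above reduces to the plain average of the pairwise per-site differences. A minimal Python sketch of that special case, assuming aligned, gap-free sequences of equal length; the function name and sample data are illustrative.

```python
from itertools import combinations

def nucleotide_diversity(sequences):
    """Average number of nucleotide differences per site over all pairs of
    sampled sequences (each sequence counted once, i.e. x_i = 1/n)."""
    length = len(sequences[0])
    pair_diffs = [
        sum(a != b for a, b in zip(s1, s2)) / length    # pi_ij: differences per site
        for s1, s2 in combinations(sequences, 2)
    ]
    # With x_i = 1/n the n/(n-1)-corrected estimator collapses to this plain mean.
    return sum(pair_diffs) / len(pair_diffs)

sample = ["ACGTACGT", "ACGTACGA", "ACGAACGT"]
print(round(nucleotide_diversity(sample), 4))   # 0.1667: 1, 1 and 2 differences over 8 sites
```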
https://en.wikipedia.org/wiki/Nambu%E2%80%93Jona-Lasinio%20model
In quantum field theory, the Nambu–Jona-Lasinio model (or more precisely: the Nambu and Jona-Lasinio model) is a complicated effective theory of nucleons and mesons constructed from interacting Dirac fermions with chiral symmetry, paralleling the construction of Cooper pairs from electrons in the BCS theory of superconductivity. The "complicatedness" of the theory has become more natural as it is now seen as a low-energy approximation of the still more basic theory of quantum chromodynamics, which does not work perturbatively at low energies. Overview The model is much inspired by the different field of solid state theory, particularly from the BCS breakthrough of 1957. The first inventor of the Nambu–Jona-Lasinio model, Yoichiro Nambu, also contributed essentially to the theory of superconductivity, i.e., by the "Nambu formalism". The second inventor was Giovanni Jona-Lasinio. The common paper of the authors that introduced the model appeared in 1961. A subsequent paper included chiral symmetry breaking, isospin and strangeness. At the same time, the same model was independently considered by Soviet physicists Valentin Vaks and Anatoly Larkin. The model is quite technical, although based essentially on symmetry principles. It is an example of the importance of four-fermion interactions and is defined in a spacetime with an even number of dimensions. It is still important and is used primarily as an effective although not rigorous low energy substitute for quantum chromodynamics. The dynamical creation of a condensate from fermion interactions inspired many theories of the breaking of electroweak symmetry, such as technicolor and the top-quark condensate. Starting with the one-flavor case first, the Lagrangian density is $\mathcal{L} = i\bar{\psi}\gamma^\mu\partial_\mu\psi + \frac{\lambda}{4}\left[(\bar{\psi}\psi)(\bar{\psi}\psi) - (\bar{\psi}\gamma_5\psi)(\bar{\psi}\gamma_5\psi)\right]$ or, equivalently, $\mathcal{L} = i\bar{\psi}\gamma^\mu\partial_\mu\psi + \lambda(\bar{\psi}_L\psi_R)(\bar{\psi}_R\psi_L)$. The terms proportional to $\lambda$ are an attractive four-fermion interaction, which parallels the BCS theory phonon exchange interaction. The global symmetry of the model is U(1)Q×U(1)χ where Q is the ordinary charge of the Dirac f
https://en.wikipedia.org/wiki/Co-fermentation
Co-fermentation is the practice in winemaking of fermenting two or more fruits at the same time when producing a wine. This differs from the more common practice of blending separate wine components into a cuvée after fermentation. While co-fermentation in principle could be practiced for any mixture of grape varieties or other fruits, it is today more common for red wines produced from a mixture of red grape varieties and a smaller proportion of white grape varieties. Co-fermentation is an old practice going back to the now uncommon practice of having field blends (mixed plantations of varieties) in vineyards, and the previous practice in some regions (such as Rioja and Tuscany) of using a small proportion of white grapes to "soften" some red wines which tended to have harsh tannins when produced with the winemaking methods of the time. It is believed that the practice may also have been adopted because it was found empirically to give deeper and better colour to wines, which is due to improved co-pigmentation resulting from some components in white grapes. Use today The only classical Old World wine region where co-fermentation is still widely practiced is now the Côte-Rôtie appellation of northern Rhône, while the use of white varieties in red Rioja and Tuscany wine has more or less disappeared. In Côte-Rôtie, the red variety Syrah and the aromatic white variety Viognier (up to 20% is allowed, but 5–10% is more common) must be co-fermented, if Viognier is used. The reason why Viognier has been kept in Côte-Rôtie (while for example the white grapes Marsanne and Roussanne are hardly found any more in red Hermitage or other red Rhône wines where they are allowed) is that it adds signature floral aromas to the wines. The popularity of Côte-Rôtie has led to New World interpretations of this blend, most notably Australian Shiraz-Viognier blends, which are also produced by co-fermentation. The reason why co-fermentation is not more widely practiced is that it "locks
https://en.wikipedia.org/wiki/Streaming%20data
Streaming data is data that is continuously generated by different sources. Such data should be processed incrementally using stream processing techniques without having access to all of the data. In addition, it should be considered that concept drift may happen in the data, which means that the properties of the stream may change over time. It is usually used in the context of big data, in which it is generated by many different sources at high speed. Data streaming can also be explained as a technology used to deliver content to devices over the internet, and it allows users to access the content immediately, rather than having to wait for it to be downloaded. Big data is forcing many organizations to focus on storage costs, which brings interest to data lakes and data streams. A data lake refers to the storage of a large amount of unstructured and semi-structured data, and is useful due to the increase of big data as it can be stored in such a way that firms can dive into the data lake and pull out what they need at the moment they need it. A data stream, by contrast, can perform real-time analysis on streaming data; it differs from a data lake in the speed and continuous nature of the analysis, without having to store the data first. Characteristics and consequences In digital innovation management theories, five characteristics of digital innovative technologies are mentioned: homogenization and decoupling, modularity, connectivity, digital traces and programmability. Before these characteristics are explained and further elaborated with different examples of data streaming, it is important to understand the difference between digitalization and digitizing. The latter describes encoding from analog information to a digital format, such as light that enters the lens of a camera and transforms to a digital format/image (Yoo et al. 2012). Digitalization, by contrast, refers to a more socio-technical process, where digitized techniques are applied to broader social and institutional context
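Incremental processing means each element is folded into a running result as it arrives, without the whole stream ever being stored. A minimal Python sketch of that idea (the running mean is just an illustrative statistic, not a reference to any particular streaming framework):

```python
def running_mean(stream):
    """Consume an (unbounded) iterable one element at a time and yield the
    mean seen so far; no element is kept after it has been processed."""
    count, total = 0, 0.0
    for value in stream:
        count += 1
        total += value
        yield total / count

# Works on a generator, i.e. data that is produced continuously:
readings = (x * 0.5 for x in range(1, 7))       # stand-in for a live source
print(list(running_mean(readings)))             # [0.5, 0.75, 1.0, 1.25, 1.5, 1.75]
```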
https://en.wikipedia.org/wiki/Greg%20Fahy
Gregory M. Fahy is a California-based cryobiologist, biogerontologist, and businessman. He is Vice President and Chief Scientific Officer at Twenty-First Century Medicine, Inc, and has co-founded Intervene Immune, a company developing clinical methods to reverse immune system aging. He is the 2022–2023 president of the Society for Cryobiology. Education A native of California, Fahy holds a Bachelor of Science degree in Biology from the University of California, Irvine and a PhD in pharmacology and cryobiology from the Medical College of Georgia in Augusta. He currently serves on the board of directors of two organizations and as a referee for numerous scientific journals and funding agencies, and holds 35 patents on cryopreservation methods, aging interventions, transplantation, and other topics. Career Fahy is the world's foremost expert in organ cryopreservation by vitrification. Fahy introduced the modern successful approach to vitrification for cryopreservation in cryobiology and he is widely credited, along with William F. Rall, for introducing vitrification into the field of reproductive biology. In 2005, as a keynote speaker at the annual Society for Cryobiology meeting, Fahy announced that Twenty-First Century Medicine had successfully cryopreserved a rabbit kidney at −130 °C by vitrification and transplanted it into a rabbit after rewarming, with subsequent long-term life support by the vitrified-rewarmed kidney as the sole kidney. This research breakthrough was later published in the peer-reviewed journal Organogenesis. Fahy is also a biogerontologist and is the originator and Editor-in-Chief of The Future of Aging: Pathways to Human Life Extension, a multi-authored book on the future of biogerontology. He currently serves on the editorial boards of Rejuvenation Research and the Open Geriatric Medicine Journal and served for 16 years as a Director of the American Aging Association and for 6 years as the editor of AGE News, the organization
https://en.wikipedia.org/wiki/Snow%20Mountain%20garlic
Snow Mountain garlic (also known as Allium sativum L., Allium schoenoprasum, and Kashmiri garlic) is a subspecies of garlic which is found in the mountainous regions of Jammu and Kashmir. It grows well in the western Himalayas at altitudes of up to , in temperatures as low as , and with very little oxygen. In Hindi, it is known as ek pothi lahsun.
https://en.wikipedia.org/wiki/Pyraminx%20Crystal
The Pyraminx Crystal (also called a Chrysanthemum puzzle) is a dodecahedral puzzle similar to the Rubik's Cube and the Megaminx. It is manufactured by Uwe Mèffert and has been sold in his puzzle shop since 2008. The puzzle was originally called the Brilic, and was first made in 2006 by Aleh Hladzilin, a member of the Twisty Puzzles Forum. It is not to be confused with the Pyraminx, which was also invented and is sold by Meffert. History The Pyraminx Crystal was patented in Europe on July 16, 1987. The patent number is DE8707783U. In late 2007, due to requests by puzzle fans worldwide, Uwe Mèffert began manufacturing the puzzle. The puzzles were first shipped in February 2008. There are two 12-color versions, one with the black body commonly used for the Rubik's Cube and its variations, and one with a white body. The puzzle company QJ started manufacturing this puzzle in 2010, leading Meffert's Puzzles to file a lawsuit against QJ. The Pyraminx Crystal ran out of stock fairly quickly, and became a collector's puzzle. In October 2011, a new set was created with some slight improvements to the quality. Description The puzzle consists of a dodecahedron sliced in such a way that each slice cuts through the centers of five different pentagonal faces. This cuts the puzzle into 20 corner pieces and 30 edge pieces, with 50 pieces in total. Each face consists of five corners and five edges. When a face is turned, these pieces and five additional edges move with it. Each corner is shared by 3 faces, and each edge is shared by 2 faces. By alternately rotating adjacent faces, the pieces may be permuted. The goal of the puzzle is to scramble the colors, and then return it to its original state. Solutions The puzzle is essentially a deeper-cut version of the Megaminx, and the same algorithms used for solving the Megaminx's corners may be used to solve the corners on the Pyraminx Crystal. The edge pieces can then be permuted by a simple 4-twist algorithm, R L' R' L, tha
https://en.wikipedia.org/wiki/Eosinophilic%20myocarditis
Eosinophilic myocarditis is inflammation in the heart muscle that is caused by the infiltration and destructive activity of a type of white blood cell, the eosinophil. Typically, the disorder is associated with hypereosinophilia, i.e. an eosinophil blood cell count greater than 1,500 per microliter (normal 100 to 400 per microliter). It is distinguished from non-eosinophilic myocarditis, which is heart inflammation caused by other types of white blood cells, i.e. lymphocytes and monocytes, as well as the respective descendants of these cells, NK cells and macrophages. This distinction is important because the eosinophil-based disorder is due to a particular set of underlying diseases and its preferred treatments differ from those for non-eosinophilic myocarditis. Eosinophilic myocarditis is often viewed as a disorder that has three progressive stages. The first stage of eosinophilic myocarditis involves acute inflammation and cardiac cell necrosis (i.e. areas of dead cells); it is dominated by symptoms characterized as the acute coronary syndrome such as angina, heart attack and/or congestive heart failure. The second stage is a thrombotic stage wherein the endocardium (i.e. interior wall) of the diseased heart forms blood clots which break off, travel in, and block blood through systemic or pulmonary arteries; this stage may dominate the initial presentation in some individuals. The third stage is a fibrotic stage wherein scarring replaces damaged heart muscle tissue to cause a clinical presentation dominated by a poorly contracting heart and cardiac valve disease. Perhaps less commonly, eosinophilic myocarditis, eosinophilic thrombotic myocarditis, and eosinophilic fibrotic myocarditis are viewed as three separate but sequentially linked disorders in a spectrum of disorders termed eosinophilic cardiac diseases. The focus here is on eosinophilic myocarditis as a distinct disorder separate from its thrombotic and fibrotic sequelae. Eosinophilic myocarditis is a ra
https://en.wikipedia.org/wiki/Andrew%20D.%20Gordon
Andrew D. Gordon is a British computer scientist employed by Microsoft Research. His research interests include programming language design, formal methods, concurrency, cryptography, and access control. Biography Gordon earned a Ph.D. from the University of Cambridge in 1992. Until 1997 Gordon was a Research Fellow at the University of Cambridge Computer Laboratory. He then joined the Microsoft Research laboratory in Cambridge, England, where he is a principal researcher in the Programming Principles and Tools group. He also holds a professorship at the University of Edinburgh. Research Gordon is one of the designers of Concurrent Haskell, a functional programming language with explicit primitives for concurrency. He is the co-designer with Martin Abadi of spi calculus, an extension of the π-calculus for formalized reasoning about cryptographic systems. He and Luca Cardelli invented the ambient calculus for reasoning about mobile code. With Moritz Y. Becker and Cédric Fournet, Gordon also designed SecPAL, a Microsoft specification language for access control policies. Awards and honours Gordon's Ph.D. thesis, Functional Programming and Input/Output, won the 1993 Distinguished Dissertation Award of the British Computer Society. His 2000 paper on the ambient calculus subject with Luca Cardelli, "Anytime, Anywhere: Modal Logics for Mobile Ambients", won the 2010 SIGPLAN Most Influential POPL Paper Award.
https://en.wikipedia.org/wiki/Tear%20gas
Tear gas, also known as a lachrymator agent or lachrymator, sometimes colloquially known as "mace" after the early commercial self-defense spray, is a chemical weapon that stimulates the nerves of the lacrimal gland in the eye to produce tears. In addition, it can cause severe eye and respiratory pain, skin irritation, bleeding, and blindness. Common lachrymators both currently and formerly used as tear gas include pepper spray (OC gas), PAVA spray (nonivamide), CS gas, CR gas, CN gas (phenacyl chloride), bromoacetone, xylyl bromide and Mace (a branded mixture). While lachrymatory agents are commonly deployed for riot control by law enforcement and military personnel, their use in warfare is prohibited by various international treaties. During World War I, increasingly toxic and deadly lachrymatory agents were used. The short and long-term effects of tear gas are not well studied. The published peer-reviewed literature consists of lower quality evidence that does not establish causality. More rigorous research is needed. Exposure to tear gas agents may produce numerous short-term and long-term health effects, including development of respiratory illnesses, severe eye injuries and diseases (such as traumatic optic neuropathy, keratitis, glaucoma, and cataracts), dermatitis, damage of cardiovascular and gastrointestinal systems, and death, especially in cases with exposure to high concentrations of tear gas or application of the tear gases in enclosed spaces. Effects Tear gas generally consists of aerosolized solid or liquid compounds (bromoacetone or xylyl bromide), not gas. Tear gas works by irritating mucous membranes in the eyes, nose, mouth and lungs. It causes crying, sneezing, coughing, difficulty breathing, pain in the eyes, and temporary blindness. With CS gas, symptoms of irritation typically appear after 20 to 60 seconds of exposure and commonly resolve within 30 minutes of leaving (or being removed from) the area. Risks As with all non-lethal or les
https://en.wikipedia.org/wiki/Behavioral%20ethics
Behavioural ethics is a new field of social scientific research that seeks to understand how people actually behave when confronted with ethical dilemmas. It refers to behaviour that is judged according to generally accepted norms of behaviour. Behavioural ethics has led to the development of ethical models such as the so-called "bystander intervention" model, which describes ethical behaviour as far harder to display because of what we learn from social institutions such as family, school, and religion. Here, intervening in an ethically challenging situation means that an individual must go through several steps, and failure to complete all of them means a failure to behave ethically. Behavioural ethics in different fields Behavioural ethics and education In ethics teaching and research, behavioural ethics is arguably the "next big thing" because its research agenda has generated much previously unavailable knowledge on why and how people choose and act when confronted with ethical questions. Based on the extant body of ethics course books and course plans from fields such as medicine, teaching, accounting, and journalism, "moral reasoning" - along with associated skills - is often an established objective. Behavioural ethics, however, is distinguished from the concept of moral reasoning because ethical behaviour is primarily driven by a diverse set of intuitive processes over which individuals have little conscious control. Behavioural ethics calls for a model of ethics in education that focuses not on directly modelling good ethical reasoning but on the way people think clearly and impartially about ethical problems. Behavioural Ethics and Rational Actor Model Philosophical views about morality have traditionally been supported by theoretical reasoning and introspection, with at best passing reference to actual human behaviour. Models of human morality advanced by behavioural ethics are based on the fact that morality is a new and still developing quality of the evolution
https://en.wikipedia.org/wiki/Quadrature%20mirror%20filter
In digital signal processing, a quadrature mirror filter is a filter whose magnitude response is the mirror image around $\pi/2$ of that of another filter. Together these filters, first introduced by Croisier et al., are known as the quadrature mirror filter pair. A filter $H_1(z)$ is the quadrature mirror filter of $H_0(z)$ if $H_1(z) = H_0(-z)$. The filter responses are symmetric about $\Omega = \pi/2$: $\left|H_1(e^{i\Omega})\right| = \left|H_0(e^{i(\pi-\Omega)})\right|$. In audio/voice codecs, a quadrature mirror filter pair is often used to implement a filter bank that splits an input signal into two bands. The resulting high-pass and low-pass signals are often reduced by a factor of 2, giving a critically sampled two-channel representation of the original signal. The analysis filters are often related by the following formula in addition to the quadrature mirror property: $\left|H_0(e^{i\Omega})\right|^2 + \left|H_1(e^{i\Omega})\right|^2 = 1$, where $\Omega$ is the frequency, and the sampling rate is normalized to $2\pi$. This is known as the power complementary property. In other words, the power sum of the high-pass and low-pass filters is equal to 1. Orthogonal wavelets – the Haar wavelets and related Daubechies wavelets, Coiflets, and some developed by Mallat, are generated by scaling functions which, with the wavelet, satisfy a quadrature mirror filter relationship. Relationship to other filter banks The earliest wavelets were based on expanding a function in terms of rectangular steps, the Haar wavelets. This is usually a poor approximation, whereas Daubechies wavelets are among the simplest but most important families of wavelets. A linear filter that is zero for "smooth" signals, given a record of points is defined as It is desirable to have it vanish for a constant, so taking the order , for example, And to have it vanish for a linear ramp, so that A linear filter will vanish for any , and this is all that can be done with a fourth-order wavelet. Six terms will be needed to vanish a quadratic curve, and so on, given the other constraints to be included. Next an accompanying filter may be defined as This filter responds in an exactly opposite manne
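The defining relations above can be checked numerically. The Python sketch below builds a QMF pair from the Haar low-pass prototype via h1[n] = (-1)^n * h0[n] and verifies the mirror symmetry about Omega = pi/2; the choice of filter and its normalization (the power sum equals 2 here rather than 1) are illustrative assumptions, not part of the original description.

```python
import numpy as np

# Haar low-pass prototype and its quadrature mirror, h1[n] = (-1)^n * h0[n].
h0 = np.array([1.0, 1.0]) / np.sqrt(2)
h1 = ((-1.0) ** np.arange(len(h0))) * h0

# Sample the magnitude responses on [0, pi].
w = np.linspace(0.0, np.pi, 512)
basis = np.exp(-1j * np.outer(w, np.arange(len(h0))))   # DTFT basis vectors
H0 = np.abs(basis @ h0)
H1 = np.abs(basis @ h1)

# Mirror-image property: |H1(e^{jw})| = |H0(e^{j(pi - w)})|.
assert np.allclose(H1, H0[::-1])
# Power complementarity (equal to 2 with this particular normalization).
assert np.allclose(H0**2 + H1**2, 2.0)
```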
https://en.wikipedia.org/wiki/LGM-35%20Sentinel
The LGM-35 Sentinel, also known as the Ground Based Strategic Deterrent (GBSD), is a future American land-based intercontinental ballistic missile system (ICBM) currently in the early stages of development. It is slated to replace Minuteman III missiles, currently stationed in North Dakota, Wyoming, Montana, and Nebraska from 2029 through 2075. In 2020 the Department of the Air Force awarded defense contractor Northrop Grumman a $13.3 billion sole-source contract for development of the LGM-35 after Boeing withdrew its proposal. Northrop Grumman's subcontractors on the LGM-35 include Lockheed Martin, General Dynamics, Bechtel, Honeywell, Aerojet Rocketdyne, Parsons, Textron, and others. Name According to the United States Air Force website, the L in LGM is the Department of Defense designation for silo-launched; G means surface attack; and "M" stands for guided missile. History In 2010, the ICBM Coalition, legislators from states that house nuclear missiles, told President Obama they would not support ratification of the New START treaty with Russia unless Obama agreed to revamp the US nuclear triad: nuclear weapons that could be launched from land, sea, and air. In a written statement, President Obama agreed to "modernize or replace" all three legs of the triad. A request for proposal for development and maintenance of a next-generation nuclear ICBM was made by the US Air Force Nuclear Weapons Center in July 2016. The GBSD would replace the Minuteman III, which was first deployed in 1970, in the land-based portion of the US nuclear triad. The new missiles, to be phased in over a decade from the late 2020s, are estimated over a fifty-year life cycle to cost around $264 billion. Boeing and Northrop Grumman competed for the contract. In August 2017, the Air Force awarded three-year development contracts to Boeing and Northrop Grumman for $349 million and $329 million, respectively. One of these companies was to be selected to produce a ground-based nuclear ICBM i
https://en.wikipedia.org/wiki/Dave%20Bayer
David Allen Bayer (born November 29, 1955) is an American mathematician known for his contributions in algebra and symbolic computation and for his consulting work in the movie industry. He is a professor of mathematics at Barnard College, Columbia University. Education and career Bayer was educated at Swarthmore College as an undergraduate, where he attended a course on combinatorial algorithms given by Herbert Wilf. During that semester, Bayer related several original ideas to Wilf on the subject. These contributions were later incorporated into the second edition of Wilf and Albert Nijenhuis' influential book Combinatorial Algorithms, with a detailed acknowledgement by its authors. Bayer subsequently earned his Ph.D. at Harvard University in 1982 under the direction of Heisuke Hironaka with a dissertation entitled The Division Algorithm and the Hilbert Scheme. He joined Columbia University thereafter. Bayer is the son of Joan and Bryce Bayer, the inventor of the Bayer filter. Contributions Bayer has worked in various areas of algebra and symbolic computation, including Hilbert functions, Betti numbers, and linear programming. He has written a number of highly cited papers in these areas with other notable mathematicians, including Bernd Sturmfels, Jeffrey Lagarias, Persi Diaconis, Irena Peeva, and David Eisenbud. Bayer is one of ten individuals cited in the white paper published by the pseudonymous Satoshi Nakamoto describing the technological underpinnings of Bitcoin. He is cited as a co-author, along with Stuart Haber and W. Scott Stornetta, of a paper to improve on a system for tamper-proofing timestamps by incorporating Merkle trees. Consulting Bayer was a mathematics consultant for the film A Beautiful Mind, the biopic of John Nash, and also had a cameo as one of the "Pen Ceremony" professors.
https://en.wikipedia.org/wiki/Derrick%20Lonsdale
Derrick Lonsdale (born April 22, 1924) is an American pediatrician and researcher into the benefits of certain nutrients in preventing disease and psychotic behavior. He is a Fellow of the American College of Nutrition (FACN), and also a Fellow of the American College for Advancement in Medicine (FACAM). Lonsdale is known for his research into thiamine and his controversial theory that thiamine deficiency is widespread among Americans and predictive of anti-social behavior. Positions Lonsdale was a practitioner in pediatrics at the Cleveland Clinic for 20 years. He became Head of the Section of Biochemical Genetics at the Clinic. In 1982, Lonsdale retired from the Cleveland Clinic and joined the Preventive Medicine Group to specialize in nutrient-based therapy. He is also on the Scientific Research Advisory Committee of the American College for Advancement in Medicine and is an editor of their Journal. Research work Lonsdale hypothesizes that healing comes from the body itself rather than from external medical interventions. Lonsdale has studied the use of nutrients to prevent diseases. He is particularly interested in Vitamin B1, also known as thiamine. The World Health Organisation has cited three of Lonsdale's thiamine deficiency papers on Sudden Infant Death Syndrome. Lonsdale has spoken at orthomolecular medicine conferences. Autism Lonsdale led an uncontrolled study on the treatment of autism spectrum children with thiamine (Lonsdale, Shamberger & Audhya, "Treatment of autism spectrum children with thiamine tetrahydrofurfuryl disulfide: a pilot study", Neuroendocrinol Lett, 2002, 23:302-308). He also led a study (uncontrolled) of secretin, which he and Shamberger say led to an improvement in behaviour and bowel control of autistic children in his study. Both of these studies are controversial because they link nutrition with autism. Analysing the findings in the latter study, autism researchers say that while
https://en.wikipedia.org/wiki/Alpine%20tundra
Alpine tundra is a type of natural region or biome that does not contain trees because it is at high elevation, with an associated harsh climate. As the latitude of a location approaches the poles, the threshold elevation for alpine tundra gets lower until it reaches sea level, and alpine tundra merges with polar tundra. The high elevation causes an adverse climate, which is too cold and windy to support tree growth. Alpine tundra transitions to sub-alpine forests below the tree line; stunted forests occurring at the forest-tundra ecotone are known as krummholz. With increasing elevation it ends at the snow line where snow and ice persist through summer. Alpine tundra occurs in mountains worldwide. The flora of the alpine tundra is characterized by dwarf shrubs close to the ground. The cold climate of the alpine tundra is caused by adiabatic cooling of air, and is similar to polar climate. Geography Alpine tundra occurs at high enough altitude at any latitude. Portions of montane grasslands and shrublands ecoregions worldwide include alpine tundra. Large regions of alpine tundra occur in the North American Cordillera and parts of the northern Appalachian Mountains in North America, the Alps and Pyrenees of Europe, the Himalaya and Karakoram of Asia, the Andes of South America, the Eastern Rift mountains of Africa, the Snowy Mountains of Australia, the South Island of New Zealand, and the Scandinavian mountains. Alpine tundra occupies high-mountain summits, slopes, and ridges above timberline. Aspect plays a role as well; the treeline often occurs at higher elevations on warmer equator-facing slopes. Because the alpine zone is present only on mountains, much of the landscape is rugged and broken, with rocky, snowcapped peaks, cliffs, and talus slopes, but also contains areas of gently rolling to almost flat topography. Averaging over many locations and local microclimates, the treeline rises when moving 1 degree south from 70 to 50°N, and per degree from 50 to
https://en.wikipedia.org/wiki/Baking
Baking is a method of preparing food that uses dry heat, typically in an oven, but can also be done in hot ashes, or on hot stones. The most common baked item is bread, but many other types of foods can be baked. Heat is gradually transferred "from the surface of cakes, cookies, and pieces of bread to their center. As heat travels through, it transforms batters and doughs into baked goods and more with a firm dry crust and a softer center". Baking can be combined with grilling to produce a hybrid barbecue variant by using both methods simultaneously, or one after the other. Baking is related to barbecuing because the concept of the masonry oven is similar to that of a smoke pit. Baking has traditionally been performed at home for day-to-day meals and in bakeries and restaurants for local consumption. When production was industrialized, baking was automated by machines in large factories. The art of baking remains a fundamental skill and is important for nutrition, as baked goods, especially bread, are a common and important food, both from an economic and cultural point of view. A person who prepares baked goods as a profession is called a baker. On a related note, a pastry chef is someone who is trained in the art of making pastries, cakes, desserts, bread, and other baked goods. Foods and techniques All types of food can be baked, but some require special care and protection from direct heat. Various techniques have been developed to provide this protection. In addition to bread, baking is used to prepare cakes, pastries, pies, tarts, quiches, cookies, scones, crackers, pretzels, and more. These popular items are known collectively as "baked goods," and are often sold at a bakery, which is a store that carries only baked goods, or at markets, grocery stores, farmers markets or through other venues. Meat, including cured meats, such as ham can also be baked, but baking is usually reserved for meatloaf, smaller cuts of whole meats, or whole meats that contain s
https://en.wikipedia.org/wiki/James%20Joule%20Medal%20and%20Prize
The James Joule Medal and Prize is awarded by the Institute of Physics. It was established in 2008, and was named in honour of James Prescott Joule, British physicist and brewer. The award is made for distinguished contributions to applied physics. The medal is silver and is accompanied by a prize of £1000. The medal gained international recognition in 2018 when it was awarded to Sri Lankan scientist Ravi Silva of University of Surrey, whose work in part led to the establishment of the Sri Lanka Institute of Nanotechnology (SLINTec). Recipients The following persons have received this medal: 2022 Michael Holynski, for distinguished contributions to the development of quantum sensors 2021 Bajram Zeqiri, for development of acoustic measurement techniques and sensors 2020 Richard Bowtell, for new hardware and techniques for biomedical imaging 2019 Robert Hadfield, for infrared single photon detection technology 2018 Ravi Silva, for carbon nanomaterials 2017 Henry Snaith, for metal-halide perovskite solar cells 2015 Judith Driscoll, for strongly correlated oxides 2013 Paul French, for fluorescence-lifetime imaging microscopy 2011 Donald D Arnone, for terahertz radiation research 2009 Jenny Nelson, for theoretical analysis of photovoltaic materials 2008 David Parker, for positron emission particle tracking See also Institute of Physics Awards List of physics awards List of awards named after people
https://en.wikipedia.org/wiki/Bonini%27s%20paradox
Bonini's paradox, named after Stanford business professor Charles Bonini, explains the difficulty in constructing models or simulations that fully capture the workings of complex systems (such as the human brain). Statements In modern discourse, the paradox was articulated by John M. Dutton and William H. Starbuck: "As a model of a complex system becomes more complete, it becomes less understandable. Alternatively, as a model grows more realistic, it also becomes just as difficult to understand as the real-world processes it represents." This paradox may be used by researchers to explain why complete models of the human brain and thinking processes have not been created and will undoubtedly remain difficult for years to come. This same paradox was observed earlier in a quote by philosopher-poet Paul Valéry (1871–1945): "Ce qui est simple est toujours faux. Ce qui ne l'est pas est inutilisable". ("If it's simple, it's always false. If it's not, it's unusable.") Also, the same topic has been discussed by Richard Levins in his classic essay "The Strategy of Model Building in Population Biology", in stating that complex models have 'too many parameters to measure, leading to analytically insoluble equations that would exceed the capacity of our computers, but the results would have no meaning for us even if they could be solved'. Related issues Bonini's paradox can be seen as a case of the map–territory relation: simpler maps are less accurate though more useful representations of the territory. An extreme form is given in the fictional stories Sylvie and Bruno Concluded and "On Exactitude in Science", which imagine a map of a scale of 1:1 (the same size as the territory), which is precise but unusable, illustrating one extreme of Bonini's paradox. Isaac Asimov's fictional science of "Psychohistory" in his Foundation series also faces this dilemma; Asimov even had one of his psychohistorians discuss the paradox.
https://en.wikipedia.org/wiki/Halobiforma
Halobiforma (common abbreviation: Hbf.) is a genus of halophilic archaea of the family Natrialbaceae. Phylogeny The currently accepted taxonomy is based on the List of Prokaryotic names with Standing in Nomenclature (LPSN) and National Center for Biotechnology Information (NCBI). See also List of Archaea genera
https://en.wikipedia.org/wiki/Zellballen
A zellballen is a small nest of chromaffin cells or chief cells with pale eosinophilic staining. Zellballen are separated into groups by segmenting bands of fibrovascular stroma, and are surrounded by supporting sustentacular cells. A zellballen pattern is diagnostic for paraganglioma or pheochromocytoma. Zellballen is German for "ball of cells".
https://en.wikipedia.org/wiki/Oxygen%20enhancement%20ratio
The oxygen enhancement ratio (OER) or oxygen enhancement effect in radiobiology refers to the enhancement of the therapeutic or detrimental effect of ionizing radiation due to the presence of oxygen. This so-called oxygen effect is most notable when cells are exposed to an ionizing radiation dose. The OER is traditionally defined as the ratio of radiation doses during lack of oxygen compared to no lack of oxygen for the same biological effect. This may give varying numerical values depending on the chosen biological effect. Additionally, OER may be presented in terms of hyperoxic environments and/or with altered oxygen baseline, complicating the significance of this value. The maximum OER depends mainly on the ionizing density or LET of the radiation. Radiation with higher LET and higher relative biological effectiveness (RBE) has a lower OER in mammalian cell tissues. The value of the maximum OER varies from about 1–4. The maximum OER ranges from about 2–4 for low-LET radiations such as X-rays, beta particles and gamma rays, whereas the OER is unity for high-LET radiations such as low energy alpha particles. Uses in medicine The effect is used in medical physics to increase the effect of radiation therapy in oncology treatments. Additional oxygen abundance creates additional free radicals and increases the damage to the target tissue. In solid tumors the inner parts become less oxygenated than normal tissue and up to three times higher dose is needed to achieve the same tumor control probability as in tissue with normal oxygenation. Explanation of the Oxygen Effect The best known explanation of the oxygen effect is the oxygen fixation hypothesis, which postulates that oxygen "fixes" radical-induced DNA damage, making it permanent. Recently, it has been posited that the oxygen effect involves radiation exposures of cells causing their mitochondria to produce greater amounts of reactive oxygen species. See also Radiation therapy Radiobiology Health
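In symbols, the traditional definition above compares the dose required under hypoxia with the dose required under normal oxygenation to produce the same biological effect; a sketch of the usual textbook form (the subscripts are illustrative labels, not a quotation from a specific source):

```latex
\mathrm{OER} \;=\; \frac{D_{\text{hypoxic}}}{D_{\text{aerated}}}
\qquad \text{(doses producing the same biological effect)}
```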