https://en.wikipedia.org/wiki/Capped%20square%20antiprismatic%20molecular%20geometry
In chemistry, the capped square antiprismatic molecular geometry describes the shape of compounds where nine atoms, groups of atoms, or ligands are arranged around a central atom, defining the vertices of a gyroelongated square pyramid. The gyroelongated square pyramid is a square pyramid with a square antiprism connected to the square base. In this respect, it can be seen as a "capped" square antiprism (a square antiprism with a pyramid erected on one of the square faces). It is very similar to the tricapped trigonal prismatic molecular geometry, and there is some dispute over the specific geometry exhibited by certain molecules. Examples include a compound sometimes described as having a capped square antiprismatic geometry, although its geometry is most often described as tricapped trigonal prismatic, and a lanthanum(III) complex with a La–La bond.
https://en.wikipedia.org/wiki/Doubly%20special%20relativity
Doubly special relativity (DSR) – also called deformed special relativity or, by some, extra-special relativity – is a modified theory of special relativity in which there is not only an observer-independent maximum velocity (the speed of light), but also an observer-independent maximum energy scale (the Planck energy) and/or a minimum length scale (the Planck length). This contrasts with other Lorentz-violating theories, such as the Standard-Model Extension, where Lorentz invariance is instead broken by the presence of a preferred frame. The main motivation for this theory is that the Planck energy should be the scale where as yet unknown quantum gravity effects become important and, due to invariance of physical laws, this scale should remain fixed in all inertial frames. History The first attempts to modify special relativity by introducing an observer-independent length were made by Pavlopoulos (1967), who estimated this length at about . In the context of quantum gravity, Giovanni Amelino-Camelia (2000) introduced what is now called doubly special relativity, by proposing a specific realization of preserving invariance of the Planck length . This was reformulated by Kowalski-Glikman (2001) in terms of an observer-independent Planck mass. A different model, inspired by that of Amelino-Camelia, was proposed in 2001 by João Magueijo and Lee Smolin, who also focused on the invariance of Planck energy. It was realized that there are, indeed, three kinds of deformation of special relativity that allow one to achieve an invariance of the Planck energy: either as a maximum energy, as a maximal momentum, or both. DSR models are possibly related to loop quantum gravity in 2+1 dimensions (two space, one time), and it has been conjectured that a relation also exists in 3+1 dimensions. The motivation for these proposals is mainly theoretical, based on the following observation: The Planck energy is expected to play a fundamental role in a theory of quantum gravity; setting
https://en.wikipedia.org/wiki/Cleavage%20%28crystal%29
Cleavage, in mineralogy and materials science, is the tendency of crystalline materials to split along definite crystallographic structural planes. These planes of relative weakness are a result of the regular locations of atoms and ions in the crystal, which create smooth repeating surfaces that are visible both under the microscope and to the naked eye. If bonds in certain directions are weaker than others, the crystal will tend to split along the weakly bonded planes. These flat breaks are termed "cleavage". The classic example of cleavage is mica, which cleaves in a single direction along the basal pinacoid, making the layers seem like pages in a book. In fact, mineralogists often refer to "books of mica". Diamond and graphite provide examples of cleavage. Each is composed solely of a single element, carbon. In diamond, each carbon atom is bonded to four others in a tetrahedral pattern with short covalent bonds. The planes of weakness (cleavage planes) in a diamond are in four directions, following the faces of the octahedron. In graphite, carbon atoms are contained in layers in a hexagonal pattern where the covalent bonds are shorter (and thus even stronger) than those of diamond. However, each layer is connected to the other with a longer and much weaker van der Waals bond. This gives graphite a single direction of cleavage, parallel to the basal pinacoid. So weak is this bond that it is broken with little force, giving graphite a slippery feel as layers shear apart. As a result, graphite makes an excellent dry lubricant. While all single crystals will show some tendency to split along atomic planes in their crystal structure, if the differences between one direction or another are not large enough, the mineral will not display cleavage. Corundum, for example, displays no cleavage. Types of cleavage Cleavage forms parallel to crystallographic planes: Basal, pinacoidal, or planar cleavage occurs when there is only one cleavage plane. Talc has basal cleavage.
https://en.wikipedia.org/wiki/Robert%20Kennedy%20%28chemist%29
Robert Travis Kennedy (born 1962) is an American chemist specializing in bioanalytical chemistry including liquid chromatography, capillary electrophoresis, and microfluidics. He is currently the Hobart H. Willard Distinguished University Professor of Chemistry and the chair of the department of chemistry at the University of Michigan. He holds joint appointments with the Department of Pharmacology and the Department of Macromolecular Science and Engineering. Kennedy is an Associate Editor of Analytical Chemistry. Early life and education Kennedy was born on November 11, 1962, in Sault Ste. Marie, Michigan. He earned a Bachelor of Science degree in chemistry at the University of Florida in 1984 and a Ph.D. from the University of North Carolina-Chapel Hill (UNC) in 1988 while working under James Jorgenson. He was an NSF post-doctoral fellow at UNC from 1989 to 1991 with R. Mark Wightman. Academic career and research interests Kennedy became a professor of chemistry at the University of Florida in 1991. After 11 years, he moved to the University of Michigan. He has graduated approximately 70 graduate students. Kennedy’s research focuses on developing analytical instrumentation and methods that can help solve biological problems. He is considered a leader in the field of analytical chemistry, and an expert in endocrinology, neurochemistry, and high-throughput analysis. Major contributions to analytical chemistry include affinity probe capillary electrophoresis, in vivo neurochemical measurements, and ultra-high pressure liquid chromatography. He has been a Lilly Analytical Research Fellow, Alfred P. Sloan Fellow, NSF Presidential Faculty Fellow, and AAAS Fellow. Honors and awards Martin Medal (2019) Ralph N. Adams Award in Bioanalytical Chemistry (2016) ACS Award in Chromatography (2017) CASSS Award for Outstanding Achievements in Separation Science (2017) Marcel Golay Award for Lifetime Achievement in Capillary Chromatography (2012) Eastern Analytical Symposium Aw
https://en.wikipedia.org/wiki/Type%20VI%20secretion%20system
The type VI secretion system (T6SS) is a molecular machine used by a wide range of Gram-negative bacterial species to transport effectors from the interior (cytoplasm or cytosol) of a bacterial cell across the cellular envelope into an adjacent target cell. While it is often reported that the T6SS was discovered in 2006 by researchers studying the causative agent of cholera, Vibrio cholerae, the first study demonstrating that T6SS genes encode a protein export apparatus was actually published in 2004, in a study of protein secretion by the fish pathogen Edwardsiella tarda. Since then, it has been estimated that at least a quarter of all pathogenic and non-pathogenic proteobacterial genomes encode a T6SS, including pathogens of animals, plants, and humans, as well as soil, environmental or marine bacteria. Genes encoding the T6SS are commonly found chromosomally, but can also be harboured in mobile genetic elements and on plasmids, mediating their transfer and increasing genetic diversity. While most of the early studies of type VI secretion focused on its role in the pathogenesis of higher organisms, it is now known to function primarily in interbacterial antagonism. Structure and mechanism The T6SS is thought to resemble an inverted phage extending outward from the bacterial cell surface. It consists of 14 proteins that assemble into three sub-complexes: a phage tail-like tubule, a phage baseplate-like structure, and a cell-envelope-spanning membrane complex. These three subcomplexes work together to transport proteins across the bacterial cell envelope and into a target cell through a contractile mechanism. Phage tail-like The phage tail-like component of the T6SS is a dynamic tubular structure that undergoes cycles of assembly and disassembly. It can be up to 600 nm long, and has been visualized extending across the bacterial cytoplasm in electron micrographs. The tubules consist of repeating units of the proteins TssB and TssC (VipA/VipB) arranged as a sheath ar
https://en.wikipedia.org/wiki/Lettering
Lettering is an umbrella term that covers the art of drawing letters, instead of simply writing them. Lettering is considered an art form, where each letter in a phrase or quote acts as an illustration. Each letter is created with attention to detail and has a unique role within a composition. Lettering is created as an image, with letters that are meant to be used in a unique configuration. Lettering words do not always translate into alphabets that can later be used in a typeface, since they are created with a specific word in mind. Examples Lettering includes that used for purposes of blueprints and comic books, as well as decorative lettering such as sign painting and custom graphics; for instance, on posters, for a letterhead or business wordmark, lettering in stone, lettering for advertisements, tire lettering, fileteado, graffiti, or on chalkboards. Lettering may be drawn, incised, applied using stencils, or created in a digital medium with a stylus or a vector program. Lettering that was not created using digital tools is commonly referred to as hand-lettering. In the past, almost all decorative lettering other than that on paper was created as custom or hand-painted lettering. The use of fonts in place of lettering has increased due to new printing methods, phototypesetting, and digital typesetting, which allow fonts to be printed at any desired size. Lettering has been particularly important in Islamic art, due to the Islamic practice of avoiding depictions of sentient beings generally and of Muhammad in particular, and instead using representations in the form of Islamic calligraphy, including hilyes, or artforms based on written descriptions of Muhammad. More recently, there has been an influx of aspiring artists attempting hand-lettering with brush pens and digital mediums. Some popular styles are sans serif, serif, cursive/script, vintage, blackletter ("gothic") calligraphy, graffiti, and creative lettering. Related artforms Lettering can be confused
https://en.wikipedia.org/wiki/Anaphase%20lag
Anaphase lag is a consequence of an event during cell division where sister chromatids do not properly separate from each other because of improper spindle formation. The chromosome or chromatid does not properly migrate during anaphase and the daughter cells will lose some genetic information. It is one of many causes of aneuploidy. This event can occur during both meiosis and mitosis with unique repercussions. In either case, anaphase lag will cause one daughter cell to receive a complete set of chromosomes while the other lacks one paired set of chromosomes, creating a form of monosomy. Whether the cell survives depends on which sister chromatid was lost and the background genomic state of the cell. The passage of abnormal numbers of chromosomes will have unique consequences with regard to mosaicism and development as well as the progression and heterogeneity of cancers. Mechanisms There are two notable mechanisms that cause anaphase lag, each of which is characterized by merotelic attachments of kinetochores to the microtubules responsible for chromatid separation. Merotelic attachments occur when a single centromere kinetochore attaches to microtubules originating from both spindle poles of the dividing cell. The merotelic attachments can occur in two ways: centrosome spindle attachments from both poles on the same chromatid kinetochore or the formation of a third centrosome whose microtubule spindles attach to a chromatid kinetochore. Because the chromatid is being pulled in two opposing directions or away from the correct centriole, it cannot migrate to the mass of segregated chromatids at either pole. If the migration is significantly delayed, the reformation of nuclei will begin to occur without a full complement of chromosomes. This nuclear envelope formation is also seen for the lone lagging sister chromatid, forming a micronucleus. The micronucleus has the capacity to persist in the daughter cell but with abnormal replication and maintenance machi
https://en.wikipedia.org/wiki/Scalar%20%28physics%29
In physics, scalars (or scalar quantities) are physical quantities that are unaffected by changes to a vector space basis (i.e., a coordinate system transformation). Scalars are often accompanied by units of measurement, as in "10 cm". Examples of scalar quantities are mass, distance, charge, volume, time, speed, and the magnitude of physical vectors in general (such as velocity). A change of a vector space basis changes the description of a vector in terms of the basis used but does not change the vector itself, while a scalar is unchanged by such a transformation. In classical physics, like Newtonian mechanics, rotations and reflections preserve scalars, while in relativity, Lorentz transformations or space-time translations preserve scalars. The term "scalar" has its origin in the multiplication of vectors by a unitless scalar, which is a uniform scaling transformation. Relationship with the mathematical concept A scalar in physics is also a scalar in mathematics, as an element of a mathematical field used to define a vector space. For example, the magnitude (or length) of an electric field vector is calculated as the square root of its absolute square (the inner product of the electric field with itself); so, the inner product's result is an element of the mathematical field for the vector space in which the electric field is described. As the vector space in this example and usual cases in physics is defined over the mathematical field of real numbers or complex numbers, the magnitude is also an element of the field, so it is mathematically a scalar. Since the inner product is independent of any vector space basis, the electric field magnitude is also physically a scalar. The mass of an object is unaffected by a change of vector space basis so it is also a physical scalar, described by a real number as an element of the real number field. Since a field is a vector space with addition defined based on vector addition and multiplication defined as scalar multiplicat
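The basis-independence of a vector's magnitude can be checked numerically. In this short sketch (the field components are arbitrary illustrative values, not from any source), rotating the coordinate basis changes the vector's components but leaves the magnitude, a scalar, unchanged:

```python
import numpy as np

# A hypothetical electric-field vector described in some basis.
E = np.array([3.0, 4.0, 0.0])

# Magnitude = square root of the inner product of the field with itself.
magnitude = np.sqrt(np.dot(E, E))  # 5.0

# Change of basis: rotate the coordinate axes by 90 degrees about z.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

E_rotated = R @ E  # components change: approximately [-4, 3, 0]

# The magnitude computed in the new description is the same scalar.
magnitude_rotated = np.sqrt(np.dot(E_rotated, E_rotated))  # 5.0
```

The components transform with the basis while the inner product, and hence the magnitude, does not, which is exactly what makes the magnitude a physical scalar.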
https://en.wikipedia.org/wiki/MAJC
MAJC (Microprocessor Architecture for Java Computing) was a Sun Microsystems multi-core, multithreaded, very long instruction word (VLIW) microprocessor design from the mid-to-late 1990s. Originally called the UltraJava processor, the MAJC processor was targeted at running Java programs, whose "late compiling" allowed Sun to make several favourable design decisions. The processor was released in two commercial graphics cards from Sun. Lessons learned regarding multithreading on a multi-core processor provided a basis for later OpenSPARC implementations such as the UltraSPARC T1. Design elements Move instruction scheduling to the compiler Like other VLIW designs, notably Intel's IA-64 (Itanium), MAJC attempted to improve performance by moving several expensive operations out of the processor and into the related compilers. In general, VLIW designs attempt to eliminate the instruction scheduler, which often represents a relatively large amount of the overall processor's transistor budget. With this portion of the CPU removed to software, those transistors can be used for other purposes, often to add additional functional units to process more instructions at once, or to increase the amount of cache memory to reduce the amount of time spent waiting for data to arrive from the much slower main memory. Although MAJC shared these general concepts, it was unlike other VLIW designs, and processors in general, in a number of specific details. Generalized functional units Most processors include a number of separate "subprocessors" known as functional units that are tuned to operate on a particular type of data. For instance, a modern CPU typically has two or three functional units dedicated to processing integer data and logic instructions, known as ALUs, while other units handle floating-point numbers, the FPUs, or multimedia data, SIMD. MAJC instead used a single multi-purpose functional unit which could process any sort of data. In theory this approach meant t
https://en.wikipedia.org/wiki/Vibrio%20cholerae
Vibrio cholerae is a species of Gram-negative, facultatively anaerobic, comma-shaped bacteria. The bacteria naturally live in brackish or salt water, where they attach themselves easily to the chitin-containing shells of crabs, shrimp, and other shellfish. Some strains of V. cholerae are pathogenic to humans and cause a deadly disease called cholera, which can be contracted through the consumption of undercooked or raw marine life. V. cholerae was first described by Félix-Archimède Pouchet in 1849 as some kind of protozoa. Filippo Pacini correctly identified it as a bacterium, and the scientific name is adopted from him. The bacterium as the cause of cholera was discovered by Robert Koch in 1884. Sambhu Nath De isolated the cholera toxin and demonstrated the toxin as the cause of cholera in 1959. The bacterium has a flagellum at one pole and several pili throughout its cell surface. It undergoes respiratory and fermentative metabolism. Two serogroups, called O1 and O139, are responsible for cholera outbreaks. Infection is mainly through drinking contaminated water, and is therefore linked to sanitation and hygiene. When ingested, it invades the intestinal mucosa, which can cause diarrhea and vomiting in a host within several hours to 2–3 days of ingestion. Oral rehydration solution and antibiotics such as fluoroquinolones and tetracyclines are the common treatment methods. V. cholerae has two circular chromosomes. One chromosome produces the cholera toxin (CT), a protein that causes profuse, watery diarrhea (known as "rice-water stool"). But the chromosome does not directly code for the toxin, as the genes for cholera toxin are carried by CTXphi (CTXφ), a temperate bacteriophage (virus). The virus only produces the toxin when inserted into the bacterial DNA. Quorum sensing in V. cholerae is well studied; it activates host immune signaling and prolongs host survival by limiting the bacterial intake of nutrients, such as tryptophan, which is further converted to serotonin. As such, quorum s
https://en.wikipedia.org/wiki/Peer%20Distributed%20Transfer%20Protocol
The Peer Distributed Transfer Protocol is an Internet file transfer protocol for distributing files from a central server across a peer-to-peer network. It is conceptually similar to BitTorrent but allows for streaming media. The protocol has been assigned port 6086 by the Internet Assigned Numbers Authority. The primary implementation is DistribuStream. External links PDTP protocol web site PowerPoint Presentation Network protocols
https://en.wikipedia.org/wiki/Set%20TSP%20problem
In combinatorial optimization, the set TSP, also known as the generalized TSP, group TSP, One-of-a-Set TSP, Multiple Choice TSP or Covering Salesman Problem, is a generalization of the traveling salesman problem (TSP), in which it is required to find a shortest tour in a graph that visits each of a collection of specified subsets of the vertices (that is, at least one vertex from each subset). The subsets of vertices can be assumed to be disjoint, since the case of overlapping subsets can be reduced to the case of disjoint ones. The ordinary TSP is a special case of the set TSP in which all subsets to be visited are singletons. Therefore, the set TSP is also NP-hard. There is a transformation from an instance of the set TSP to an instance of the standard asymmetric TSP. The idea is to connect each subset into a directed cycle with edges of zero weight, and to inherit the outgoing edges from the original graph, shifted by one vertex backwards along this cycle. The salesman, when visiting a vertex v in some subset, walks around the cycle for free and exits it from the vertex preceding v by an outgoing edge corresponding to an outgoing edge of v in the original graph. The set TSP has many applications in path planning problems. For example, a two-vehicle cooperative routing problem can be transformed into a set TSP, and tight lower bounds to the Dubins TSP and the generalized Dubins path problem can be computed by solving a set TSP. Illustration from the cutting stock problem The one-dimensional cutting stock problem, as applied in the paper and plastic film industries, involves cutting jumbo rolls into smaller ones. This is done by generating cutting patterns, typically to minimise waste. Once such a solution has been produced, one may seek to minimise the knife changes, by re-sequencing the patterns (up and down in the figure), or moving rolls left or right within each pattern. These moves do not affect the waste of the solution. In the above figure, patterns (width no more than 198) are rows; knife changes are indicated b
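The zero-weight-cycle reduction to asymmetric TSP can be sketched in a few lines of Python. This is a minimal sketch under assumed encodings (the instance is a dict of directed edge weights and a list of disjoint vertex subsets; the function name is illustrative): each subset is joined into a free cycle, and every outgoing edge of a vertex is re-attached to that vertex's cycle predecessor.

```python
def set_tsp_to_atsp(weights, subsets):
    """Transform a set TSP instance into an asymmetric TSP instance.

    weights: dict mapping directed edges (u, v) to their weight.
    subsets: list of disjoint vertex lists, each of which must be visited.
    Returns the weight dict of the equivalent ATSP instance.
    """
    new_weights = {}
    prev = {}  # predecessor of each vertex along its subset's cycle
    for cycle in subsets:
        for i, v in enumerate(cycle):
            # zero-weight edge from each vertex to its cycle successor
            # (for a singleton subset this is a harmless self-loop)
            new_weights[(v, cycle[(i + 1) % len(cycle)])] = 0
            prev[v] = cycle[i - 1]
    # shift each original outgoing edge of u back to u's cycle predecessor:
    # the salesman walks the cycle for free and exits just before u using
    # an edge that carries the cost of u's original outgoing edge
    for (u, v), w in weights.items():
        new_weights[(prev[u], v)] = w
    return new_weights
```

For instance, with subsets {1, 2} and {3} and an original edge (1, 3) of weight 5, the transformed instance has zero-weight edges 1→2 and 2→1 and the weight 5 moved onto the edge 2→3, so an ATSP tour exits the cycle at vertex 2 exactly as if it had departed from vertex 1.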
https://en.wikipedia.org/wiki/The%20Equidistribution%20of%20Lattice%20Shapes%20of%20Rings%20of%20Integers%20of%20Cubic%2C%20Quartic%2C%20and%20Quintic%20Number%20Fields
The Equidistribution of Lattice Shapes of Rings of Integers of Cubic, Quartic, and Quintic Number Fields: An Artist's Rendering is a mathematics book by Piper Harron (also known as Piper H), based on her Princeton University doctoral thesis of the same title. It has been described as "feminist", "unique", "honest", "generous", and "refreshing". Thesis and reception Harron was advised by Fields Medalist Manjul Bhargava, and her thesis deals with the properties of number fields, specifically the shape of their rings of integers. Harron and Bhargava showed that, viewed as a lattice in real vector space, the ring of integers of a random number field does not have any special symmetries. Rather than simply presenting the proof, Harron intended for the thesis and book to explain both the mathematics and the process (and struggle) that was required to reach this result. The writing is accessible and informal, and the book features sections targeting three different audiences: laypeople, people with general mathematical knowledge, and experts in number theory. Harron intentionally departs from the typical academic format as she is writing for a community of mathematicians who "do not feel that they are encouraged to be themselves". Unusually for a mathematics thesis, Harron intersperses her rigorous analysis and proofs with cartoons, poetry, pop-culture references, and humorous diagrams. Science writer Evelyn Lamb, in Scientific American, expresses admiration for Harron for explaining the process behind the mathematics in a way that is accessible to non-mathematicians, especially "because as a woman of color, she could pay a higher price for doing it." Mathematician Philip Ording calls her approach to communicating mathematical abstractions "generous". Her thesis went viral in late 2015, especially within the mathematical community, in part because of the prologue which begins by stating that "respected research math is dominated by men of a certain attitude". Harron had
https://en.wikipedia.org/wiki/List%20of%20feeding%20behaviours
Feeding is the process by which organisms, typically animals, obtain food. Terminology often uses either the suffixes -vore, -vory, or -vorous from Latin vorare, meaning "to devour", or -phage, -phagy, or -phagous from Greek φαγεῖν, meaning "to eat". Evolutionary history The evolution of feeding is varied with some feeding strategies evolving several times in independent lineages. In terrestrial vertebrates, the earliest forms were large amphibious piscivores 400 million years ago. While amphibians continued to feed on fish and later insects, reptiles began exploring two new food types, other tetrapods (carnivory), and later, plants (herbivory). Carnivory was a natural transition from insectivory for medium and large tetrapods, requiring minimal adaptation (in contrast, a complex set of adaptations was necessary for feeding on highly fibrous plant materials). Evolutionary adaptations The specialization of organisms towards specific food sources is one of the major causes of evolution of form and function, such as: mouth parts and teeth, such as in whales, vampire bats, leeches, mosquitos, predatory animals such as felines and fishes, etc. distinct forms of beaks in birds, such as in hawks, woodpeckers, pelicans, hummingbirds, parrots, kingfishers, etc. specialized claws and other appendages, for apprehending or killing (including fingers in primates) changes in body colour for facilitating camouflage, disguise, setting up traps for prey, etc. changes in the digestive system, such as the system of stomachs of herbivores, commensalism and symbiosis Classification By mode of ingestion There are many modes of feeding that animals exhibit, including: Filter feeding: obtaining nutrients from particles suspended in water Deposit feeding: obtaining nutrients from particles suspended in soil Fluid feeding: obtaining nutrients by consuming other organisms' fluids Bulk feeding: obtaining nutrients by eating all of an organism. Ram feeding and suction feeding: in
https://en.wikipedia.org/wiki/Cauchy%E2%80%93Born%20rule
The Cauchy–Born rule or Cauchy–Born approximation is a basic hypothesis used in the mathematical formulation of solid mechanics which relates the movement of atoms in a crystal to the overall deformation of the bulk solid. It states that in a crystalline solid subject to a small strain, the positions of the atoms within the crystal lattice follow the overall strain of the medium. The currently accepted form is Max Born's refinement of Cauchy's original hypothesis, which was used to derive the equations satisfied by the Cauchy stress tensor. The approximation generally holds for face-centered and body-centered cubic crystal systems. For complex lattices such as diamond, however, the rule has to be modified to allow for internal degrees of freedom between the sublattices. The approximation can then be used to obtain bulk properties of crystalline materials such as the stress–strain relationship. For crystalline bodies of finite size, the effect of surface stress is also significant. However, the standard Cauchy–Born rule cannot deduce the surface properties. To overcome this limitation, Park et al. (2006) proposed a surface Cauchy–Born rule. Several modified forms of the Cauchy–Born rule have also been proposed to cater to crystalline bodies having special shapes. Arroyo & Belytschko (2002) proposed an exponential Cauchy–Born rule for modeling mono-layered crystalline sheets as two-dimensional continuum shells. Kumar et al. (2015) proposed a helical Cauchy–Born rule for modeling slender bodies (such as nano and continuum rods) as special Cosserat continuum rods.
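Symbolically, a standard statement of the rule reads as follows (the symbols here are the conventional continuum-mechanics ones, chosen for illustration rather than taken from a specific source): if F is the deformation gradient of the bulk, each atom at undeformed lattice position X moves to

```latex
% Cauchy--Born rule: deformed position of a lattice site
\mathbf{x} \;=\; \mathbf{F}\,\mathbf{X},
\qquad
\mathbf{F} \;=\; \frac{\partial \mathbf{x}}{\partial \mathbf{X}}.
% For complex (multi-atom) lattices, each sublattice acquires an
% additional internal shift vector p, chosen to minimize the energy:
% x = F X + p
```

For complex lattices such as diamond, each sublattice acquires an additional internal shift vector p, determined by relaxing the energy; this is the modification for internal degrees of freedom between the sublattices.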
https://en.wikipedia.org/wiki/Forest%20reproductive%20material
Forest reproductive material is a part of a tree that can be used for reproduction, such as a seed, cutting or seedling. Artificial regeneration, carried out through seeding or planting, typically involves transferring forest reproductive material to a particular site from other locations, while natural regeneration relies on genetic material that is already available on the site. Technical opportunities and challenges to ensure the quality and quantity of forest reproductive material can be found in the activities of identification, selection, procurement, propagation, conservation, improvement and sustained production of reproductive material. The use of low-quality or poorly adapted forest reproductive material can have a very negative impact on the vitality and resilience of a forest. In Europe, much of the material used for artificial regeneration is produced and transferred within a single country. However, forest reproductive material, usually in the form of seeds or cuttings, is increasingly traded across national borders, especially within the European Union. Forest reproductive material and climate change As a result of climate change, leading to increasing temperatures, some parts of the current distribution ranges of forest trees are expected to become unsuitable, while new areas may become suitable for many species at higher latitudes or altitudes. This will most likely increase the future demand for imported forest reproductive material as forest managers and owners try to identify tree species and provenances that will be able to grow on their land under new climatic conditions. In particular, forest reproductive material with high plasticity will be increasingly useful for this purpose.
https://en.wikipedia.org/wiki/Bright%20Computing
Bright Computing, Inc. is a developer of software for deploying and managing high-performance computing (HPC) clusters, Kubernetes clusters, and OpenStack private clouds in on-premises data centers as well as in the public cloud. History Bright Computing was founded in 2009 by Matthijs van Leeuwen, who spun the company out of ClusterVision, which he had co-founded with Alex Ninaber and Arijan Sauer. Ninaber and van Leeuwen had worked together at Compusys in the UK, which was one of the first companies to commercially build HPC clusters. They left Compusys in 2002 to start ClusterVision in the Netherlands, after determining there was a growing market for building and managing supercomputer clusters using off-the-shelf hardware components and open-source software, tied together with their own customized scripts. ClusterVision also provided delivery and installation support services for HPC clusters at universities and government entities. In 2004, Martijn de Vries joined ClusterVision and began development of cluster management software. The software was made available to customers in 2008, under the name ClusterVisionOS v4. In 2009, Bright Computing was spun out of ClusterVision. ClusterVisionOS was renamed Bright Cluster Manager, and van Leeuwen was named Bright Computing’s CEO. In February 2016, Bright appointed Bill Wagner as chief executive officer. Matthijs van Leeuwen became chief strategy officer, and then left the company and board of directors in 2018. In January 2022, Bright was acquired by Nvidia. Customers Early customers included Boeing, Sandia National Laboratories, Virginia Tech, Hewlett Packard, NSA, and Drexel University. Many early customers were introduced through resellers, including SICORP, Cray, Dell, and Advanced HPC. As of 2019, the company had more than 700 customers, including more than fifty Fortune 500 companies. Products and services Bright Cluster Manager for HPC lets customers deploy and manage complete clusters. It provides management for the h
https://en.wikipedia.org/wiki/RKM%20code
The RKM code, also referred to as the "letter and numeral code for resistance and capacitance values and tolerances", the "letter and digit code for resistance and capacitance values and tolerances", or informally as "R notation", is a notation to specify resistor and capacitor values defined in the international standard IEC 60062 (formerly IEC 62) since 1952. Other standards, including DIN 40825 (1973), BS 1852 (1975), IS 8186 (1976), and EN 60062 (1993), have also adopted it. The updated IEC 60062:2016, amended in 2019, comprises the most recent release of the standard. Overview Originally also meant as a part-marking code, this shorthand notation is widely used in electrical engineering to denote the values of resistors and capacitors in circuit diagrams and in the production of electronic circuits (for example in bills of material and in silk screens). This method avoids overlooking the decimal separator, which may not be rendered reliably on components or when duplicating documents. The standards also define a color code for fixed resistors. Part value code For brevity, the notation does not always specify the unit (ohm or farad) explicitly; instead it relies on implicit knowledge from the usage of specific letters either only for resistors or for capacitors, the case used (uppercase letters are typically used for resistors, lowercase letters for capacitors), a part's appearance, and the context. The notation also avoids using a decimal separator and replaces it with a letter associated with the prefix symbol for the particular value. This is not only for brevity (for example when printed on the part or PCB), but also to circumvent the problem that decimal separators tend to "disappear" when photocopying printed circuit diagrams. The code letters are loosely related to the corresponding SI prefix, but there are several exceptions, where the capitalization differs or alternative letters are used. For example, 8K2 indicates a resistor value of 8.2 kΩ. Additiona
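As a worked example of the scheme, the multiplier letter doubles as the decimal separator, so decoding is a simple string substitution. This is a minimal sketch covering only the common uppercase resistor letters (the function name and scope are illustrative, not part of the standard):

```python
def parse_rkm(code: str) -> float:
    """Parse a resistor RKM code (e.g. '8K2') into a value in ohms.

    Minimal sketch: handles only the common uppercase multiplier
    letters R (x1), K (x1e3), M (x1e6), G (x1e9), T (x1e12).
    """
    multipliers = {"R": 1.0, "K": 1e3, "M": 1e6, "G": 1e9, "T": 1e12}
    for letter, mult in multipliers.items():
        if letter in code:
            # the letter stands in for the decimal separator
            integer, _, frac = code.partition(letter)
            return float((integer or "0") + "." + (frac or "0")) * mult
    # no letter present: treat the code as a plain number of ohms
    return float(code)
```

So "8K2" parses to 8200 Ω, "4R7" to 4.7 Ω, and "R47" to 0.47 Ω, matching the rule that the letter both selects the multiplier and marks where the decimal separator would sit.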
https://en.wikipedia.org/wiki/Trachylepis%20tschudii
Trachylepis tschudii is an enigmatic skink, purportedly from Peru. First described in 1845 on the basis of a single specimen, it may be the same as the Noronha skink (T. atlantica) from Fernando de Noronha, off northeastern Brazil. T. tschudii represents one of two doubtful records of the otherwise African genus Trachylepis on mainland South America; the other is T. maculata from Guyana. The only specimen, the holotype, is mostly brownish above, with dark and light spots, and white below. The snout-to-vent length is 83 mm (3.3 in). Several features of the scales align it with Trachylepis over the related American genus Mabuya. Taxonomy In 1845, Swiss zoologist Johann Jakob von Tschudi described the new species Trachylepis (Xystrolepis) punctata among other species he had collected in Peru. The species was recorded as being from the "forest region" (Amazonia) of Peru and was known from a single specimen, the holotype. In 1887, G.A. Boulenger placed it at an uncertain position within the genus Mabuia, which included Tschudi's Trachylepis. In a 1907 reappraisal of some of Tschudi's reptiles and amphibians, J. Roux redescribed punctata under the name "Mabuia punctata", but did not comment on its affinities. In 1935, E.R. Dunn reviewed some American Mabuya and commented that he was unable to tell the identity of punctata, but that it probably was not a true Mabuya. Writing in 1946, H. Travassos considered Tschudi's punctata to be identical to the Noronha skink (then known as Mabuya punctata), a species otherwise known only from Fernando de Noronha, a small archipelago off northeastern Brazil. On the basis of its geographic origin, J. Peters and R. Donoso-Barros preferred to place it with one of the Mabuya species of Amazonia and classified it as a junior synonym of Mabuya mabouya. In reviewing the nomenclature of the Noronha skink, P. Mausfeld and D. Vrcibradic noted in 2002 that Tschudi's name punctata was preoccupied within Mabuya, making it unavailable for use as a
https://en.wikipedia.org/wiki/Louis%20Essen
Louis Essen OBE FRS (6 September 1908 – 24 August 1997) was an English physicist whose most notable achievements were in the precise measurement of time and the determination of the speed of light. He was a critic of Albert Einstein's theory of relativity, particularly as it related to time dilation. Early work Born in Nottingham, Essen earned his degree in physics from the University of London in 1928, having studied at University College Nottingham. He started work at the National Physical Laboratory (NPL) the following year, under D. W. Dye, investigating the potential of tuning forks and quartz crystal oscillators for precise time measurement. His research led to his development of the quartz ring clock in 1938, the clock soon becoming a standard for time measurement at observatories throughout the world. The speed of light During World War II, Essen worked on radar and developed a number of instruments, including the cavity resonance wavemeter. It was this work that suggested to Essen the possibility of a more precise measurement of the speed of light. In 1946, in collaboration with A.C. Gordon-Smith, he used a microwave cavity, of precisely known dimensions, and exploited his expertise in time-measurement to establish the frequency for a variety of its normal modes. As the wavelength of the modes was known from the geometry of the cavity and from electromagnetic theory, knowledge of the associated frequencies enabled a calculation of the speed of light. Their result, 299,792±3 km/s, was substantially greater than that from the prevailing sequence of optical measurements that had begun around the start of the 20th century, and Essen had to withstand some fierce criticism and disbelief. Even NPL director Sir Charles Galton Darwin, while supporting the work, observed that Essen would get the correct result once he had perfected the technique. Moreover, W.W. Hansen at Stanford University had used a similar technique and obtained a measurement which was more cons
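The cavity method can be illustrated with a small calculation. For a rectangular cavity with interior dimensions a × b × d, electromagnetic theory gives the resonant frequencies as f = (c/2)·√((l/a)² + (m/b)² + (n/d)²), which can be inverted for c once a mode frequency is measured. A sketch with made-up numbers (these are not Essen's actual cavity dimensions or frequencies):

```python
import math

def speed_of_light_from_cavity(f_hz, l, m, n, a, b, d):
    """Infer c from a rectangular cavity resonance:
    f = (c/2) * sqrt((l/a)^2 + (m/b)^2 + (n/d)^2), solved for c.
    Dimensions a, b, d in metres; l, m, n are mode indices."""
    return 2 * f_hz / math.sqrt((l / a) ** 2 + (m / b) ** 2 + (n / d) ** 2)

# Hypothetical example: a 0.15 m cubic cavity whose (1,1,0) mode is
# measured near 1.413 GHz recovers a value close to 299,792 km/s.
c = speed_of_light_from_cavity(1.413245e9, 1, 1, 0, 0.15, 0.15, 0.15)
print(f"{c:.0f} m/s")
```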
https://en.wikipedia.org/wiki/Latency%20oriented%20processor%20architecture
Latency oriented processor architecture is the microarchitecture of a microprocessor designed to serve a serial computing thread with a low latency. This is typical of most central processing units (CPU) being developed since the 1970s. These architectures, in general, aim to execute as many instructions as possible belonging to a single serial thread, in a given window of time; however, the time to execute a single instruction completely from fetch to retire stages may vary from a few cycles to even a few hundred cycles in some cases. Latency oriented processor architectures are the opposite of throughput-oriented processors which concern themselves more with the total throughput of the system, rather than the service latencies for all individual threads that they work on. Flynn's taxonomy Typically, latency oriented processor architectures execute a single task operating on a single data stream, and so they are SISD under Flynn's taxonomy. Latency oriented processor architectures might also include SIMD instruction set extensions such as Intel MMX and SSE; even though these extensions operate on large data sets, their primary goal is to reduce overall latency. Implementation techniques There are many architectural techniques employed to reduce the overall latency for a single computing task. These typically involve adding additional hardware in the pipeline to serve instructions as soon as they are fetched from memory or instruction cache. A notable characteristic of these architectures is that a significant area of the chip is used up in parts other than the Execution Units themselves. This is because the intent is to bring down the time required to complete a 'typical' task in a computing environment. A typical computing task is a serial set of instructions, where there is a high dependency on results produced by the previous instructions of the same task. Hence, it makes sense that the microprocessor will be spending its time doing many other tasks other tha
https://en.wikipedia.org/wiki/Movile%20Cave
Movile Cave () is a cave near Mangalia, Constanța County, Romania discovered in 1986 by Cristian Lascu a few kilometers from the Black Sea coast. It is notable for its unique groundwater ecosystem abundant in hydrogen sulfide and carbon dioxide, but low in oxygen. Life in the cave has been separated from the outside for the past 5.5 million years and it is based completely on chemosynthesis rather than photosynthesis. Similar caves where life partly or fully depends on chemosynthesis have been found in Ein-Nur Cave and Ayalon Cave (Israel), Frasassi Caves (Italy), Melissotrypa Cave (Greece), Tashan Cave (Iran), caves in the Sharo-Argun Valley in the Caucasus Mountains, Lower Kane Cave, Cesspool Cave (USA), and Villa Luz Cave (Mexico). Description Movile Cave is a network of paths in limestone that are approximately long, with portions that are partially or fully submerged by hydrothermal waters. The temperature of the air and water is a constant 21°C (70°F) and the relative humidity is about 100%. The cave is closed to the general public, and only a few researchers are permitted inside each year, to minimize disturbance to the delicate ecosystem. Chemical environment The air in the cave is very different from the outer atmosphere. The level of oxygen is only a third to half of the concentration found in open air (7–10% O2 in the cave atmosphere, compared to 21% O2 in air), and about one hundred times more carbon dioxide (2–3.5% CO2 in the cave atmosphere, versus 0.04% CO2 in air). It also contains 1–2% methane (CH4) and both the air and waters of the cave contain high concentrations of hydrogen sulfide (H2S) and ammonia (NH3). The water in the lake only contains dissolved oxygen for the first centimeter, at most, and in some places only the first millimeter. Deeper down the lake water becomes completely anoxic. Biology The cave is known to contain 57 animal species, among them leeches, spiders, pseudoscorpions, woodlice, a centipede, a water scorpion (Nepa an
https://en.wikipedia.org/wiki/High%20Assurance%20Internet%20Protocol%20Encryptor
A High Assurance Internet Protocol Encryptor (HAIPE) is a Type 1 encryption device that complies with the National Security Agency's HAIPE IS (formerly the HAIPIS, the High Assurance Internet Protocol Interoperability Specification). The cryptography used is Suite A and Suite B, also specified by the NSA as part of the Cryptographic Modernization Program. HAIPE IS is based on IPsec with additional restrictions and enhancements. One of these enhancements includes the ability to encrypt multicast data using a "preplaced key" (see definition in List of cryptographic key types). This requires loading the same key on all HAIPE devices that will participate in the multicast session in advance of data transmission. A HAIPE is typically a secure gateway that allows two enclaves to exchange data over an untrusted or lower-classification network. Examples of HAIPE devices include: L3Harris Technologies' Encryption Products KG-245X 10Gbit/s (HAIPE IS v3.1.2 and Foreign Interoperable), KG-245A fully tactical 1 Gbit/s (HAIPE IS v3.1.2 and Foreign Interoperable) RedEagle ViaSat's AltaSec Products KG-250, and KG-255 [1 Gbit/s] General Dynamics Mission Systems TACLANE Products FLEX (KG-175F) 10G (KG-175X) Nano (KG-175N) Airbus Defence & Space ECTOCRYP Transparent Cryptography Three of these devices are compliant to the HAIPE IS v3.0.2 specification while the remaining devices use the HAIPE IS version 1.3.5, which has a couple of notable limitations: limited support for routing protocols or open network management. A HAIPE is an IP encryption device, looking up the destination IP address of a packet in its internal Security Association Database (SAD) and picking the encrypted tunnel based on the appropriate entry. For new communications, HAIPEs use the internal Security Policy Database (SPD) to set up new tunnels with the appropriate algorithms and settings. Due to lack of support for modern commercial routing protocols the HAIPEs often must be preprogrammed with sta
https://en.wikipedia.org/wiki/Chlorophycean%20mitochondrial%20code
The chlorophycean mitochondrial code (translation table 16) is a genetic code found in the mitochondria of Chlorophyceae. Code    AAs = FFLLSSSSYY*LCC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG Starts = -----------------------------------M----------------------------  Base1 = TTTTTTTTTTTTTTTTCCCCCCCCCCCCCCCCAAAAAAAAAAAAAAAAGGGGGGGGGGGGGGGG  Base2 = TTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGG  Base3 = TCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAG Bases: adenine (A), cytosine (C), guanine (G) and thymine (T) or uracil (U). Amino acids: Alanine (Ala, A), Arginine (Arg, R), Asparagine (Asn, N), Aspartic acid (Asp, D), Cysteine (Cys, C), Glutamic acid (Glu, E), Glutamine (Gln, Q), Glycine (Gly, G), Histidine (His, H), Isoleucine (Ile, I), Leucine (Leu, L), Lysine (Lys, K), Methionine (Met, M), Phenylalanine (Phe, F), Proline (Pro, P), Serine (Ser, S), Threonine (Thr, T), Tryptophan (Trp, W), Tyrosine (Tyr, Y), Valine (Val, V) Differences from the standard code The only difference from the standard code is that TAG, normally a stop codon, here codes for leucine (Leu, L), as can be read from the twelfth position of the AAs line above. Systematic range and comments This code is used by Chlorophyceae and the chytridiomycete fungus Spizellomyces punctatus. See also List of genetic codes
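The four strings above are enough to reconstruct the full codon table programmatically; a short sketch:

```python
# Build translation table 16 from the NCBI-style strings quoted above.
aas   = "FFLLSSSSYY*LCC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
base1 = "TTTTTTTTTTTTTTTTCCCCCCCCCCCCCCCCAAAAAAAAAAAAAAAAGGGGGGGGGGGGGGGG"
base2 = "TTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGG"
base3 = "TCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAG"

table16 = {b1 + b2 + b3: aa for b1, b2, b3, aa in zip(base1, base2, base3, aas)}

print(table16["TAG"])  # 'L' -- leucine here, a stop codon in the standard code
print(table16["TAA"])  # '*' -- still a stop codon
```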
https://en.wikipedia.org/wiki/Promoter%20based%20genetic%20algorithm
The promoter based genetic algorithm (PBGA) is a genetic algorithm for neuroevolution developed by F. Bellas and R.J. Duro in the Integrated Group for Engineering Research (GII) at the University of A Coruña, in Spain. It evolves variable size feedforward artificial neural networks (ANN) that are encoded into sequences of genes for constructing a basic ANN unit. Each of these blocks is preceded by a gene promoter acting as an on/off switch that determines if that particular unit will be expressed or not. PBGA basics The basic unit in the PBGA is a neuron with all of its inbound connections as represented in the following figure: The genotype of a basic unit is a set of real valued weights followed by the parameters of the neuron and preceded by an integer valued field that determines the promoter gene value and, consequently, the expression of the unit. By concatenating units of this type we can construct the whole network. With this encoding, information that is not expressed is still carried by the genotype during evolution but is shielded from direct selective pressure, thus maintaining diversity in the population, which has been a design premise for this algorithm. Therefore, a clear difference is established between the search space and the solution space, permitting information learned and encoded into the genotypic representation to be preserved by disabling promoter genes. Results The PBGA was originally presented within the field of autonomous robotics, in particular in the real time learning of environment models of the robot. It has been used inside the Multilevel Darwinist Brain (MDB) cognitive mechanism developed in the GII for on-line learning in real robots. In another paper it is shown how the application of the PBGA together with an external memory that stores the successfully obtained world models is an optimal strategy for adaptation in dynamic environments. Recently, the PBGA has provided results that outperform
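A toy illustration of the promoter-gated encoding described above (class and field names are my own invention, not taken from the original papers):

```python
import random

class Unit:
    """One basic PBGA-style unit: a promoter switch plus a neuron's
    inbound weights and bias. Illustrative sketch only."""
    def __init__(self, n_inputs):
        self.promoter = random.random() < 0.5         # on/off expression switch
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
        self.bias = random.uniform(-1, 1)

def express(genotype):
    """Phenotype = only the units whose promoter gene is switched on.
    Silenced units stay in the genotype, shielded from selection."""
    return [u for u in genotype if u.promoter]

genotype = [Unit(3) for _ in range(8)]
network = express(genotype)
print(len(genotype), "units carried,", len(network), "expressed")
```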
https://en.wikipedia.org/wiki/Centuria%20Insectorum
Centuria Insectorum (Latin, "one hundred insects") is a 1763 taxonomic work by Carl Linnaeus, and defended as a thesis by Boas Johansson; which of the two men should for taxonomic purposes be credited with its authorship has been the subject of some controversy. It includes descriptions of 102 new insect and crustacean species that had been sent to Linnaeus from British America, Suriname, Java and other locations. Most of the new names included in Centuria Insectorum are still in use, although a few have been sunk into synonymy, and one was the result of a hoax: a common brimstone butterfly with spots painted on was described as the new "species" Papilio ecclipsis. Publications The contents of the work were published twice, under two slightly different titles. Centuria Insectorum rariorum ("one hundred rare insects") was published as a standalone thesis, while Centuria Insectorum was published as part of Linnaeus' series Amoenitates Academicae ("academic delights"). Both bear the date June 23, 1763, although the latter was printed later, in September 1763. Authorship Since the work was a thesis presented and defended by one of Linnaeus' students, Boas Johansson (1742–1809) from Kalmar, it has been argued that authorship of the taxa named in it should be assigned to Johansson. The authorship, however, has been the subject of some controversy. Several lines of argument have been used to suggest that Linnaeus should be considered the author. The role of the person defending the thesis at Swedish universities at the time was to prove his command of Latin, and responsibility for the text of the thesis rested mainly, if not entirely, with the professor. Linnaeus appeared to consider himself the author, referring in his later works to Centuria Insectorum without including an abbreviation for the author, as he did for works written by other people. Works presented by students of other taxonomists of the era (such as Carl Peter Thunberg, Adam Afzelius and Elias Magnus Fries) are generally credited to their supervisors, and not the students themselves. Finally, most zoologists, a
https://en.wikipedia.org/wiki/Dual%20code
In coding theory, the dual code of a linear code C ⊆ F_q^n of length n is the linear code C^⊥ = {x ∈ F_q^n : ⟨x, c⟩ = 0 for all c ∈ C}, where ⟨x, c⟩ = x_1c_1 + … + x_nc_n is a scalar product. In linear algebra terms, the dual code is the annihilator of C with respect to the bilinear form ⟨·, ·⟩. The dimension of C and its dual always add up to the length n: dim C + dim C^⊥ = n. A generator matrix for the dual code is a parity-check matrix for the original code and vice versa. The dual of the dual code is always the original code. Self-dual codes A self-dual code is one which is its own dual. This implies that n is even and dim C = n/2. If a self-dual code is such that each codeword's weight is a multiple of some constant c > 1, then it is of one of the following four types: Type I codes are binary self-dual codes which are not doubly even. Type I codes are always even (every codeword has even Hamming weight). Type II codes are binary self-dual codes which are doubly even. Type III codes are ternary self-dual codes. Every codeword in a Type III code has Hamming weight divisible by 3. Type IV codes are self-dual codes over F4. These are again even. Codes of types I, II, III, or IV exist only if the length n is a multiple of 2, 8, 4, or 2 respectively. If a self-dual code has a generator matrix of the form G = (I_k | A), then the dual code has generator matrix H = (−A^T | I_k), where I_k is the k × k identity matrix and A^T is the transpose of A.
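The generator-matrix relationship can be checked numerically over GF(2), where −Aᵀ = Aᵀ; the relation G = [I | A] ⇒ H = [−Aᵀ | I] in fact holds for any systematic linear code, not only self-dual ones. A sketch using the [7,4] Hamming code as an example:

```python
# If a binary code has generator matrix G = [I_k | A], the dual code has
# generator matrix H = [-A^T | I_{n-k}], i.e. [A^T | I_{n-k}] over GF(2).
# The matrix A below is that of the [7,4] Hamming code.
A = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]
k, r = len(A), len(A[0])                     # k = 4 message bits, r = n - k = 3

G = [[int(i == j) for j in range(k)] + A[i] for i in range(k)]
H = [[A[i][j] for i in range(k)] + [int(j == m) for m in range(r)]
     for j in range(r)]

# Every row of G must be orthogonal (mod 2) to every row of H.
ok = all(sum(x * y for x, y in zip(g, h)) % 2 == 0 for g in G for h in H)
print("G orthogonal to H over GF(2):", ok)   # True
```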
https://en.wikipedia.org/wiki/Assembly%20language
In computer programming, assembly language (alternatively assembler language or symbolic machine code), often referred to simply as assembly and commonly abbreviated as ASM or asm, is any low-level programming language with a very strong correspondence between the instructions in the language and the architecture's machine code instructions. Assembly language usually has one statement per machine instruction (1:1), but constants, comments, assembler directives, symbolic labels of, e.g., memory locations, registers, and macros are generally also supported. The first assembly code in which a language is used to represent machine code instructions is found in Kathleen and Andrew Donald Booth's 1947 work, Coding for A.R.C.. Assembly code is converted into executable machine code by a utility program referred to as an assembler. The term "assembler" is generally attributed to Wilkes, Wheeler and Gill in their 1951 book The Preparation of Programs for an Electronic Digital Computer, who, however, used the term to mean "a program that assembles another program consisting of several sections into a single program". The conversion process is referred to as assembly, as in assembling the source code. The computational step when an assembler is processing a program is called assembly time. Because assembly depends on the machine code instructions, each assembly language is specific to a particular computer architecture. Sometimes there is more than one assembler for the same architecture, and sometimes an assembler is specific to an operating system or to particular operating systems. Most assembly languages do not provide specific syntax for operating system calls, and most assembly languages can be used universally with any operating system, as the language provides access to all the real capabilities of the processor, upon which all system call mechanisms ultimately rest. In contrast to assembly languages, most high-level programming languages are generally portable ac
https://en.wikipedia.org/wiki/Tacheometry
Tacheometry (from Greek for "quick measure") is a system of rapid surveying, by which the horizontal and vertical positions of points on the earth's surface relative to one another are determined without using a chain or tape, or a separate levelling instrument. Instead of the pole normally employed to mark a point, a staff similar to a level staff is used. This is marked with heights from the base or foot, and is graduated according to the form of tacheometer in use. The horizontal distance S is inferred from the vertical angle subtended between two well-defined points on the staff and the known distance 2L between them. Alternatively, S may be obtained from readings of the staff indicated by two fixed stadia wires in the diaphragm (reticle) of the telescope. The difference of height Δh is computed from the angle of depression z or angle of elevation α of a fixed point on the staff and the horizontal distance S already obtained. The azimuth angle is determined in the usual way. Thus, all the measurements requisite to locate a point both vertically and horizontally with reference to the point where the tacheometer is centred are determined by an observer at the instrument without any assistance beyond that of a person to hold the level staff. The ordinary methods of surveying with a theodolite, chain, and levelling instrument are fairly satisfactory when the ground is relatively clear of obstructions and not very precipitous, but they become extremely cumbersome when the ground is covered with bush, or broken up by ravines. Chain measurements then become slow and liable to considerable error; the levelling, too, is carried on at great disadvantage in point of speed, though without serious loss of accuracy. These difficulties led to the introduction of tacheometry. In western countries, tacheometry is primarily of historical interest in surveying, as professional measurement nowadays is usually carried out using total stations and recorded using data collectors. Location positions
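The two reductions described above are simple trigonometry. A sketch under stated assumptions: the subtense relation used here, S = L·cot(θ/2) for two targets a known 2L apart, is the standard one, and the specific numbers are illustrative only.

```python
import math

def horizontal_distance(L, theta):
    """S = L / tan(theta/2), from the vertical angle theta subtended at the
    instrument by two staff targets a known distance 2L apart."""
    return L / math.tan(theta / 2)

def height_difference(S, alpha):
    """Delta h = S * tan(alpha) for elevation angle alpha."""
    return S * math.tan(alpha)

S = horizontal_distance(L=1.0, theta=math.radians(1.146))   # 2 m on the staff
dh = height_difference(S, math.radians(3.0))
print(f"S = {S:.1f} m, delta h = {dh:.2f} m")
```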
https://en.wikipedia.org/wiki/Applied%20spectroscopy
Applied spectroscopy is the application of various spectroscopic methods for the detection and identification of different elements or compounds to solve problems in fields like forensics, medicine, the oil industry, atmospheric chemistry, and pharmacology. Spectroscopic methods A common spectroscopic method for analysis is Fourier transform infrared spectroscopy (FTIR), where chemical bonds can be detected through their characteristic infrared absorption frequencies or wavelengths. These absorption characteristics make infrared analyzers an invaluable tool in geoscience, environmental science, and atmospheric science. For instance, atmospheric gas monitoring has been facilitated by the development of commercially available gas analyzers which can distinguish between carbon dioxide, methane, carbon monoxide, oxygen, and nitric oxide. Ultraviolet (UV) spectroscopy is used where strong absorption of UV radiation occurs in a substance. Such groups are known as chromophores and include aromatic groups, conjugated system of bonds, carbonyl groups and so on. Nuclear magnetic resonance spectroscopy detects hydrogen atoms in specific environments, and complements both infrared (IR) spectroscopy and UV spectroscopy. The use of Raman spectroscopy is growing for more specialist applications. There are also derivative methods such as infrared microscopy, which allows very small areas to be analyzed in an optical microscope. One method of elemental analysis that is important in forensic analysis is energy-dispersive X-ray spectroscopy (EDX) performed in the environmental scanning electron microscope (ESEM). The method involves analysis of back-scattered X-rays from the sample as a result of interaction with the electron beam. Automated EDX is further used in a range of automated mineralogy techniques for identification and textural mapping. Sample preparation In all three spectroscopic methods, the sample usually needs to be present in solution, which may present problems
https://en.wikipedia.org/wiki/International%20Conference%20on%20Nitride%20Semiconductors
The International Conference on Nitride Semiconductors (ICNS) is a major academic conference and exhibition in the field of group III nitride research. It has been held biennially since 1995. Since the second conference in 1997, hosting of the event has rotated between the Asian, European and North American continents. The ICNS and the International Workshop on Nitride Semiconductors (IWN) are held in alternating years, both covering similar subject areas. ICNS-9 was held at the Scottish Exhibition and Conference Centre in Glasgow, Scotland, on 10–15 July 2011. Keynote speakers included Professor Umesh Mishra (University of California Santa Barbara and Transphorm) and Professor Hiroshi Amano (Meijo University, Nagoya). ICNS-10 was held in Washington, D.C., United States on 25–30 August 2013, ICNS-11 was held in Beijing, China on 30 August–4 September 2015, and ICNS-12 took place in Strasbourg, France on 24–28 July 2017. ICNS-13 was held on 7–12 July 2019 in Seattle, Washington, United States, and was chaired by Alan Doolittle (Georgia Institute of Technology, USA). Conference list See also Gallium nitride Indium nitride Aluminium nitride
https://en.wikipedia.org/wiki/Volodymyr%20Savchenko%20%28writer%29
Vladimir Ivanovich Savchenko was a Soviet Ukrainian science fiction writer and engineer. Born on February 15, 1933, in Poltava, he studied at the Moscow Power Engineering Institute and was an electronics engineer. Savchenko, who wrote in Russian, published his first short stories in the late 1950s, and his first novel (Black Stars) in 1960. His works were often self-published. Savchenko also authored several texts about physics and engineering, including the article "Sixteen New Formulas of Physics and Cosmology," which he considered to be his most important scientific text. Savchenko's works have been published in 29 countries and have been translated into many of the world's languages. He was found dead on January 19, 2005, in Kyiv. He was 71 years old. Biography Savchenko was a graduate of the Moscow Power Engineering Institute. He worked at the V.M. Glushkov Institute of Cybernetics in Kyiv. His first publication, "Toward the Stars" (1955), identified the author as an advocate of science fiction that was interested in exploring the heuristic potential of the personality. In 1956, Savchenko's story "The Awakening of Professor Bern" was published. Of publications in his native Ukrainian language, the most well-known is the story "The Ghost of Time" (1964). In the novel Black Stars, Savchenko investigated the boundaries of traditional science, putting forward original hypotheses. In particular, Savchenko's 1959 novel The Second Expedition to the Strange Planet (known in English as The Second Oddball Expedition) explored the political nuances revealed by contact with crystalline forms of life. Savchenko positioned himself as an adherent of the cybernetic view of society and the living organism, consistently developing different aspects of the process of self-discovery. After the 1967 publication of the novel Self-Discovery, in which Savchenko warned about the ethical problems involved in the creation of clones, Savchenko occupied the leading
https://en.wikipedia.org/wiki/Max%20Abraham
Max Abraham (26 March 1875 – 16 November 1922) was a German physicist known for his work on electromagnetism and his opposition to the theory of relativity. Biography Abraham was born in Danzig, Imperial Germany (now Gdańsk in Poland) to a family of Jewish merchants. His father was Moritz Abraham and his mother was Selma Moritzsohn. Attending the University of Berlin, he studied under Max Planck. He graduated in 1897. For the next three years, Abraham worked as Planck's assistant. From 1900 to 1909, Abraham worked at Göttingen as a privatdozent, an unpaid lecturing position. Abraham developed his theory of the electron in 1902, in which he hypothesized that the electron was a perfect sphere with a charge divided evenly around its surface. Abraham's model competed with those developed by Hendrik Lorentz (1899, 1904) and Albert Einstein (1905), which seem to have become more widely accepted; nevertheless, Abraham never gave up his model, since he considered it to be based on "common sense". Abraham was a staunch opponent of the theory of relativity. In 1909 Abraham travelled to the United States to accept a position at the University of Illinois, but ended up returning to Göttingen after a few months. He was later invited to Italy by Tullio Levi-Civita, and found work as the professor of rational mechanics at the Politecnico di Milano until 1914. When World War I started, Abraham was forced to return to Germany. During this time he worked on the theory of radio transmission. After the war, he still was not allowed back into Milan, so until 1921 he worked at Stuttgart as the professor of physics at the Technische Hochschule. After his work at Stuttgart, Abraham accepted the position of chair in Aachen; however, before he started his work there he was diagnosed with a brain tumor. He died on 16 November 1922 in Munich, Germany. After his death, Max Born and Max von Laue wrote about him in an obituary: He loved his absolute aether, his field equations,
https://en.wikipedia.org/wiki/Franco-British%20Nuclear%20Forum
The first meeting of the Franco–British Nuclear Forum was held in Paris in November 2007, chaired by the Minister for Energy and the French Industry Minister. The working groups are focusing on specific areas for collaboration. A follow-up meeting on the issue in London was planned for March 2008, but did not take place. See also Nuclear power in the United Kingdom Nuclear power in France External links BBC: France and UK boost nuclear ties
https://en.wikipedia.org/wiki/Endocannabinoid%20system
The endocannabinoid system (ECS) is a biological system composed of endocannabinoids, which are endogenous lipid-based retrograde neurotransmitters that bind to cannabinoid receptors (CBRs), and cannabinoid receptor proteins that are expressed throughout the vertebrate central nervous system (including the brain) and peripheral nervous system. The endocannabinoid system remains under preliminary research, but may be involved in regulating physiological and cognitive processes, including fertility, pregnancy, pre- and postnatal development, various activities of the immune system, appetite, pain-sensation, mood, and memory, and in mediating the pharmacological effects of cannabis. The ECS plays an important role in multiple aspects of neural functions, including the control of movement and motor coordination, learning and memory, emotion and motivation, addictive-like behavior and pain modulation, among others. Two primary cannabinoid receptors have been identified: CB1, first cloned (or isolated) in 1990; and CB2, cloned in 1993. CB1 receptors are found predominantly in the brain and nervous system, as well as in peripheral organs and tissues, and are the main molecular target of the endogenous partial agonist, anandamide (AEA), as well as of exogenous THC, the best-known active component of cannabis. The endocannabinoid 2-arachidonoylglycerol (2-AG), which was found to be two to three orders of magnitude more abundant in the mammalian brain than AEA, acts as a full agonist at both CB receptors. Basic overview The endocannabinoid system, broadly speaking, includes: The endogenous arachidonate-based lipids, anandamide (N-arachidonoylethanolamide) and 2-AG, besides other N-acylethanolamines (NAEs); these are known as "endocannabinoids" and are physiological ligands for the cannabinoid receptors. Endocannabinoids are all eicosanoids. The enzymes that synthesize and degrade the endocannabinoids, such as fatty acid amide hydrolase or monoacylglycerol lipase. The cannabinoid recep
https://en.wikipedia.org/wiki/Coating
A coating is a covering that is applied to the surface of an object, usually referred to as the substrate. The purpose of applying the coating may be decorative, functional, or both. Coatings may be applied as liquids, gases or solids, e.g. powder coatings. Paints and lacquers are coatings that mostly serve a dual purpose of protecting the substrate and being decorative, although some artists' paints are only for decoration, and the paint on large industrial pipes is for preventing corrosion and for identification, e.g. blue for process water, red for fire-fighting control. Functional coatings may be applied to change the surface properties of the substrate, such as adhesion, wettability, corrosion resistance, or wear resistance. In other cases, e.g. semiconductor device fabrication (where the substrate is a wafer), the coating adds a completely new property, such as a magnetic response or electrical conductivity, and forms an essential part of the finished product. A major consideration for most coating processes is that the coating is to be applied at a controlled thickness, and a number of different processes are in use to achieve this control, ranging from a simple brush for painting a wall, to some very expensive machinery applying coatings in the electronics industry. A further consideration for 'non-all-over' coatings is that control is needed as to where the coating is to be applied. A number of these non-all-over coating processes are printing processes. Many industrial coating processes involve the application of a thin film of functional material to a substrate, such as paper, fabric, film, foil, or sheet stock. If the substrate starts and ends the process wound up in a roll, the process may be termed "roll-to-roll" or "web-based" coating. A roll of substrate, when wound through the coating machine, is typically called a web. Applications Coating applications are diverse and serve many purposes. Coatings can be both decorative and have other functions. A
https://en.wikipedia.org/wiki/MyGrid
The myGrid consortium produces and uses a suite of tools designed to “help e-Scientists get on with science and get on with scientists”. The tools support the creation of e-laboratories and have been used in domains as diverse as systems biology, social science, music, astronomy, multimedia and chemistry. The consortium is led by Carole Goble of the Department of Computer Science at the University of Manchester, UK. Tools produced and used by myGrid Tools developed by the myGrid consortium include: The Taverna workbench, for designing, editing and executing scientific workflows myExperiment, for sharing workflows and related data BioCatalogue, a public registry of Web services for life scientists Seek, produced in collaboration with SysMO-DB (Systems Biology of Micro-Organisms DataBase), for finding, sharing and exchanging data, models and processes in systems biology MethodBox, for browsing datasets and sharing knowledge RightField, for sharing the meaning of your data by embedding ontology annotation in spreadsheets The Kidney and Urinary Pathway Database (KUPKB) Workflows for Ever (wf4ever), for scientific workflow preservation History The consortium has had three distinct phases: Phase 1 The consortium was formed in 2001, bringing together collaborators at the Universities of Manchester, Southampton, Newcastle, Nottingham and Sheffield, the European Molecular Biology Laboratory–European Bioinformatics Institute (EMBL-EBI) in Cambridge, and industrial partners GlaxoSmithKline, Merck KGaA, AstraZeneca, Sun Microsystems, IBM, GeneticXchange, Epistemics and Cerebra (formerly Network Inference). The UK Engineering and Physical Sciences Research Council funded the first phase of the project with £3.5 million. To date, Grid development has focused on the basic issues of storage, computation and resource management needed to make a global scientific community's information and tools accessible in a high-performance environment. However, from an e-Science viewpoint, the purpose of th
https://en.wikipedia.org/wiki/Section%20formula
In coordinate geometry, the section formula is used to find the ratio in which a line segment is divided by a point internally or externally. It is used to find the centroid, incenter and excenters of a triangle. In physics, it is used to find the center of mass of systems, equilibrium points, etc. Internal divisions If a point P (lying on AB) divides the line segment AB joining the points $A(x_1, y_1)$ and $B(x_2, y_2)$ in the ratio $m:n$, then $P = \left( \frac{m x_2 + n x_1}{m + n}, \frac{m y_2 + n y_1}{m + n} \right).$ The ratio $m:n$ can also be written as $k:1$, where $k = \tfrac{m}{n}$. So, the coordinates of the point dividing the line segment joining $A(x_1, y_1)$ and $B(x_2, y_2)$ are $\left( \frac{k x_2 + x_1}{k + 1}, \frac{k y_2 + y_1}{k + 1} \right).$ External divisions If a point P (lying on the extension of AB) divides AB externally in the ratio $m:n$, then $P = \left( \frac{m x_2 - n x_1}{m - n}, \frac{m y_2 - n y_1}{m - n} \right).$ Midpoint formula The midpoint of a line segment divides it internally in the ratio $1:1$. Applying the section formula for internal division with $m = n = 1$ gives $M = \left( \frac{x_1 + x_2}{2}, \frac{y_1 + y_2}{2} \right).$ Centroid The centroid of a triangle is the intersection of the medians and divides each median in the ratio $2:1$. Let the vertices of the triangle be $A(x_1, y_1)$, $B(x_2, y_2)$ and $C(x_3, y_3)$. So, the median from point A will intersect BC at its midpoint $\left( \frac{x_2 + x_3}{2}, \frac{y_2 + y_3}{2} \right)$. Using the section formula with ratio $2:1$, the centroid becomes $\left( \frac{x_1 + x_2 + x_3}{3}, \frac{y_1 + y_2 + y_3}{3} \right).$ In 3 dimensions Let A and B be two points with Cartesian coordinates $(x_1, y_1, z_1)$ and $(x_2, y_2, z_2)$, and let P be a point on the line through A and B. If P divides AB internally in the ratio $m:n$, then the section formula gives the coordinates of P as $\left( \frac{m x_2 + n x_1}{m + n}, \frac{m y_2 + n y_1}{m + n}, \frac{m z_2 + n z_1}{m + n} \right).$ If, instead, P divides AB externally in the ratio $m:n$, its coordinates are $\left( \frac{m x_2 - n x_1}{m - n}, \frac{m y_2 - n y_1}{m - n}, \frac{m z_2 - n z_1}{m - n} \right).$ In vectors The position vector of a point P dividing the line segment joining the points A and B, whose position vectors are $\vec{a}$ and $\vec{b}$, in the ratio $m:n$ internally, is given by $\frac{m \vec{b} + n \vec{a}}{m + n}$; in the ratio $m:n$ externally, it is given by $\frac{m \vec{b} - n \vec{a}}{m - n}$. See also Cross-section Distance formula Midpoint formula
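The internal and external division formulas can be checked numerically; the sketch below (function names are illustrative, not from the source) implements both, along with the midpoint and centroid special cases:

```python
def section_point(a, b, m, n, external=False):
    """Point dividing segment AB in the ratio m:n.

    Internal division: P = (m*B + n*A) / (m + n).
    External division: P = (m*B - n*A) / (m - n), requiring m != n.
    Works for coordinate tuples of any dimension (2D, 3D, ...).
    """
    if external:
        if m == n:
            raise ValueError("external division requires m != n")
        return tuple((m * bi - n * ai) / (m - n) for ai, bi in zip(a, b))
    return tuple((m * bi + n * ai) / (m + n) for ai, bi in zip(a, b))

def midpoint(a, b):
    # The midpoint is the special case of internal division in the ratio 1:1.
    return section_point(a, b, 1, 1)

def centroid(a, b, c):
    # The centroid divides each median in the ratio 2:1 from the vertex.
    mid_bc = midpoint(b, c)                 # midpoint of side BC
    return section_point(a, mid_bc, 2, 1)   # 2:1 from A toward that midpoint
```

For example, `centroid((0, 0), (6, 0), (0, 6))` gives `(2.0, 2.0)`, matching the formula $\left(\frac{x_1+x_2+x_3}{3}, \frac{y_1+y_2+y_3}{3}\right)$.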
https://en.wikipedia.org/wiki/Splanchnic
Splanchnic is usually used to describe organs in the abdominal cavity. It is used when describing: Splanchnic tissue Splanchnic organs - including the stomach, small intestine, large intestine, pancreas, spleen, liver, and may also include the kidney. Splanchnic nerves Splanchnic mesoderm Splanchnic circulation – the circulation of the gastrointestinal tract originating at the celiac trunk, the superior mesenteric artery and the inferior mesenteric artery. History and etymology The term derives from , meaning "inward parts, organs". The term "splanchnologia" is used for grouping in Nomina Anatomica, but not in Terminologia Anatomica. It includes most of the structures usually considered "internal organs", but not all (for example, the heart is excluded).
https://en.wikipedia.org/wiki/Greenberg%E2%80%93Hastings%20cellular%20automaton
The Greenberg–Hastings cellular automaton (abbrev. GH model) is a three-state, two-dimensional cellular automaton (abbrev. CA) named after James M. Greenberg and Stuart Hastings, designed to model excitable media. One advantage of a CA model is ease of computation. The model can be understood quite well using simple "hand" calculations, not involving a computer. Another advantage is that, at least in this case, one can prove a theorem characterizing those initial conditions which lead to repetitive behavior. Informal description As in a typical two-dimensional cellular automaton, consider a rectangular grid, or checkerboard pattern, of "cells". It can be finite or infinite in extent. Each cell has a set of "neighbors". In the simplest case, each cell has four neighbors, those being the cells directly above or below or to the left or right of the given cell, like this:

  b
b a b
  b

The b's are all of the neighbors of the a, and the a is one of the neighbors of each of the b's. At each "time" t = 0, 1, 2, 3, ..., each cell is assigned one of three "states", typically called "resting" (or "quiescent"; see excitable medium), "excited", or "refractory". The assignment of states for all cells is arbitrary for t = 0, and then at subsequent times the state of each cell is determined by the following rules. 1. If a cell is in the excited state at time t, then it is in the refractory state at time t+1. 2. If a cell is in the refractory state at time t, then it is in the resting state at time t+1. 3. If a given cell is in the resting state at time t and at least one of its neighbors is in the excited state at time t, then the given cell is in the excited state at time t+1; otherwise (no neighbor is excited at time t) it remains in the resting state at time t+1. In this way the whole grid of cells advances from its initial state
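The three update rules translate directly into code. The sketch below is a minimal implementation on a finite grid (the numeric state encoding 0 = resting, 1 = excited, 2 = refractory is a choice made here, not fixed by the model):

```python
RESTING, EXCITED, REFRACTORY = 0, 1, 2

def gh_step(grid):
    """One synchronous update of the Greenberg-Hastings rules on a finite
    rectangular grid with 4-cell (von Neumann) neighborhoods."""
    rows, cols = len(grid), len(grid[0])
    new = [[RESTING] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            s = grid[i][j]
            if s == EXCITED:            # rule 1: excited -> refractory
                new[i][j] = REFRACTORY
            elif s == REFRACTORY:       # rule 2: refractory -> resting
                new[i][j] = RESTING
            else:                       # rule 3: resting cell
                neighbors = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                excited = any(
                    0 <= a < rows and 0 <= b < cols and grid[a][b] == EXCITED
                    for a, b in neighbors
                )
                new[i][j] = EXCITED if excited else RESTING
    return new
```

Iterating `gh_step` from a seed containing an excited cell adjacent to a refractory one produces the rotating waves characteristic of excitable media; a single excited cell in a resting field simply emits one expanding ring and dies out.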
https://en.wikipedia.org/wiki/Binodal
In thermodynamics, the binodal, also known as the coexistence curve or binodal curve, denotes the condition at which two distinct phases may coexist. Equivalently, it is the boundary between the set of conditions in which it is thermodynamically favorable for the system to be fully mixed and the set of conditions in which it is thermodynamically favorable for it to phase separate. In general, the binodal is defined by the condition at which the chemical potential of each solution component is equal in the two phases. The extremum of a binodal curve in temperature coincides with that of the spinodal curve and is known as a critical point. Binary systems In binary (two-component) mixtures, the binodal can be determined at a given temperature by drawing a common tangent line to the free energy curve.
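The common-tangent construction can be made concrete with a model free energy. For a symmetric regular-solution (Flory–Huggins-like) mixture, f(φ) = φ ln φ + (1−φ) ln(1−φ) + χ φ(1−φ) — a model chosen here purely for illustration — the common tangent is horizontal by symmetry, so the binodal compositions solve f′(φ) = ln(φ/(1−φ)) + χ(1−2φ) = 0 with φ ≠ 1/2, which exists only above the critical point χ = 2. A bisection sketch:

```python
import math

def binodal_symmetric(chi, tol=1e-10):
    """Binodal compositions of a symmetric binary mixture with free energy
    f(phi) = phi*ln(phi) + (1-phi)*ln(1-phi) + chi*phi*(1-phi).
    The common tangent is horizontal by symmetry, so the coexisting
    compositions satisfy f'(phi) = 0 with phi != 1/2 (requires chi > 2)."""
    def fprime(phi):
        return math.log(phi / (1.0 - phi)) + chi * (1.0 - 2.0 * phi)
    if chi <= 2.0:
        return (0.5, 0.5)  # at or below the critical point: fully miscible
    lo, hi = 1e-12, 0.5 - 1e-12   # search the left branch only
    # f'(lo) < 0 and f' > 0 just below 1/2, so bisect for the sign change
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if fprime(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    phi = 0.5 * (lo + hi)
    return (phi, 1.0 - phi)   # the two coexisting compositions
```

For χ = 3 this yields coexisting compositions near φ ≈ 0.07 and φ ≈ 0.93; asymmetric mixtures require solving the full two-equation common-tangent condition instead.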
https://en.wikipedia.org/wiki/Exchange%20ActiveSync
Exchange ActiveSync (commonly known as EAS) is a proprietary protocol designed for the synchronization of email, contacts, calendar, tasks, and notes from a messaging server to a smartphone or other mobile devices. The protocol also provides mobile device management and policy controls. The protocol is based on XML. The mobile device communicates over HTTP or HTTPS. Usage Originally branded as AirSync and only supporting Microsoft Exchange Servers and Pocket PC devices, Microsoft now licenses the technology widely for synchronization between groupware and mobile devices in a number of competing collaboration platforms, including: GroupWise, with the Novell GroupWise Mobility Services software; Lotus Notes, with IBM Notes Traveler; Mailsite; MDaemon Email Server; and Google, in paid Google Apps for Work subscriptions from 2013. In addition to support on Windows Phone, EAS client support is included on Android, iOS, BlackBerry 10 smartphones and the BlackBerry PlayBook tablet computer. Beyond on-premises installations of Exchange, the various personal and enterprise hosted services from Microsoft also utilize EAS, including Outlook.com and Office 365. The built-in email application for the Windows 8 desktop, the Mail app, also supports the protocol. Apart from the above, EAS client support is not included on macOS in the native Apple Mail app. History 1.0 The first version of EAS (called AirSync at the time) was a part of Mobile Information Server (MIS) 2002. This version of EAS communicated over Web-based Distributed Authoring and Versioning (WebDAV) to Exchange 2000 servers, syncing email, contacts, and calendar. It allowed users to select a folder list to sync, but only for email folders (not contacts or calendars). This initial version of EAS had the user’s device “pull” data down rather than have the server “push” new information as soon as it became available. 2.0 EAS 2.0 shipped in Exchange Server 2003. This version of the protocol was dev
https://en.wikipedia.org/wiki/Sowing
Sowing is the process of planting seeds. An area or object that has had seeds planted in it will be described as a sowed or sown area. Plants which are usually sown Among the major field crops, oats, wheat, and rye are sown, grasses and legumes are seeded, and maize and soybeans are planted. In planting, wider rows (generally 75 cm (30 in) or more) are used, and the intent is to have precise, even spacing between individual seeds in the row; various mechanisms have been devised to count out individual seeds at exact intervals. Depth of sowing In sowing, little if any soil is placed over the seeds; seeds can generally be sown into the soil at a planting depth of about 2–3 times the size of the seed. Sowing types and patterns For hand sowing, several sowing types exist; these include: Flat sowing Ridge sowing Wide bed sowing Several patterns for sowing may be used together with these types; these include: Rows that are indented at the even rows, so that the seeds are placed in a crossed pattern – this method is much better, as more light may fall on the seedlings as they come out Symmetrical grid pattern – using the pattern described in The Garden of Cyrus Types of sowing Hand sowing Hand sowing (or planting) is the process of casting handfuls of seed over prepared ground, or broadcasting (from which the technological term is derived). Usually, a drag or harrow is employed to incorporate the seed into the soil. Though labor-intensive for any but small areas, this method is still used in some situations. Practice is required to sow evenly and at the desired rate. A hand seeder can be used for sowing, though it is less of a help than it is for the smaller seeds of grasses and legumes. Hand sowing may be combined with pre-sowing in seed trays. This allows the plants to come to strength indoors during cold periods (e.g. spring in temperate countries). Seed drill In agriculture, most seed is now sown using a seed drill, which offers greater
https://en.wikipedia.org/wiki/Klebsiella%20pneumoniae
Klebsiella pneumoniae is a Gram-negative, non-motile, encapsulated, lactose-fermenting, facultative anaerobic, rod-shaped bacterium. It appears as a mucoid lactose fermenter on MacConkey agar. Although found in the normal flora of the mouth, skin, and intestines, it can cause destructive changes to human and animal lungs if aspirated, specifically to the alveoli, resulting in bloody, brownish or yellow colored jelly-like sputum. In the clinical setting, it is the most significant member of the genus Klebsiella of the Enterobacteriaceae. K. oxytoca and K. rhinoscleromatis have also been demonstrated in human clinical specimens. In recent years, Klebsiella species have become important pathogens in nosocomial infections. It naturally occurs in the soil, and about 30% of strains can fix nitrogen in anaerobic conditions. As a free-living diazotroph, its nitrogen-fixation system has been much studied, and is of agricultural interest, as K. pneumoniae has been demonstrated to increase crop yields in agricultural conditions. It is closely related to K. oxytoca, from which it is distinguished by being indole-negative and by its ability to grow on melezitose but not 3-hydroxybutyrate. History The genus Klebsiella was named after the German microbiologist Edwin Klebs (1834–1913). It is also known as Friedländer's bacillus in honor of Carl Friedländer, a German pathologist, who proposed that this bacterium was the etiological factor for the pneumonia seen especially in immunocompromised individuals such as people with chronic diseases or alcoholics. Community-acquired pneumonia caused by Klebsiella pneumoniae may occasionally be called Friedländer's pneumonia. Epidemiology Illness most commonly affects middle-aged and older men with debilitating diseases, and affects men more often than women. This patient population is believed to have impaired respiratory host defenses, and includes persons with diabetes, alcoholism, malignancy, liver disease, chronic obstructive pulmonary diseases, glucocorticoid therapy
https://en.wikipedia.org/wiki/Goldstine%20theorem
In functional analysis, a branch of mathematics, the Goldstine theorem, named after Herman Goldstine, is stated as follows: Goldstine theorem. Let $X$ be a Banach space; then the image of the closed unit ball $B \subseteq X$ under the canonical embedding into the closed unit ball $B''$ of the bidual space $X''$ is a weak*-dense subset. The conclusion of the theorem is not true for the norm topology, which can be seen by considering the Banach space of real sequences that converge to zero, the space $c_0$, and its bidual space $\ell^\infty$. Proof Lemma For all $x'' \in B''$, $\varphi_1, \ldots, \varphi_n \in X'$ and $\delta > 0$, there exists an $x \in (1+\delta)B$ such that $\varphi_i(x) = x''(\varphi_i)$ for all $1 \leq i \leq n$. Proof of lemma By the surjectivity of the map $x \mapsto (\varphi_1(x), \ldots, \varphi_n(x))$ onto its image it is possible to find $x \in X$ with $\varphi_i(x) = x''(\varphi_i)$ for $1 \leq i \leq n$. Now let $Y = \bigcap_i \ker \varphi_i$. Every element of $(x + Y) \cap (1+\delta)B$ satisfies the required conditions, and so it suffices to show that the intersection is nonempty. Assume for contradiction that it is empty. Then $\operatorname{dist}(x, Y) \geq 1+\delta$ and by the Hahn–Banach theorem there exists a linear form $\varphi \in X'$ such that $\varphi|_Y = 0$, $\varphi(x) \geq 1+\delta$ and $\|\varphi\|_{X'} = 1$. Then $\varphi \in \operatorname{span}\{\varphi_1, \ldots, \varphi_n\}$ and therefore $1+\delta \leq \varphi(x) = x''(\varphi) \leq \|\varphi\|_{X'} \|x''\|_{X''} \leq 1$, which is a contradiction. Proof of theorem Fix $x'' \in B''$, $\varphi_1, \ldots, \varphi_n \in X'$ and $\epsilon > 0$. Examine the set $U = \{y'' \in X'' : |(x'' - y'')(\varphi_i)| < \epsilon,\ 1 \leq i \leq n\}$. Let $J : X \to X''$ be the embedding defined by $J(x) = \mathrm{Ev}_x$, where $\mathrm{Ev}_x(\varphi) = \varphi(x)$ is the evaluation-at-$x$ map. Sets of the form $U$ form a base for the weak* topology, so density follows once it is shown that $J(B) \cap U \neq \emptyset$ for all such $U$. The lemma above says that for any $\delta > 0$ there exists an $x \in (1+\delta)B$ such that $x''(\varphi_i) = \varphi_i(x)$ for $1 \leq i \leq n$, and in particular $\mathrm{Ev}_x \in U$. Since $J(B) \subseteq B''$, we have $\mathrm{Ev}_x \in J((1+\delta)B) \cap U$. We can scale to get $\frac{1}{1+\delta} \mathrm{Ev}_x \in J(B)$. The goal is to show that for a sufficiently small $\delta > 0$ we have $\frac{1}{1+\delta} \mathrm{Ev}_x \in J(B) \cap U$. Directly checking, one has $\left| \left[ x'' - \tfrac{1}{1+\delta} \mathrm{Ev}_x \right](\varphi_i) \right| = \left| \varphi_i(x) - \tfrac{1}{1+\delta} \varphi_i(x) \right| = \tfrac{\delta}{1+\delta} |\varphi_i(x)|$. Note that one can choose $M$ sufficiently large so that $\|\varphi_i\|_{X'} \leq M$ for all $1 \leq i \leq n$. Note as well that $\|x\| \leq 1+\delta$. If one chooses $\delta$ so that $\delta M < \epsilon$, then $\tfrac{\delta}{1+\delta} |\varphi_i(x)| \leq \tfrac{\delta}{1+\delta} \|\varphi_i\|_{X'} \|x\| \leq \delta M < \epsilon$. Hence one gets $\frac{1}{1+\delta} \mathrm{Ev}_x \in J(B) \cap U$ as desired. See also
https://en.wikipedia.org/wiki/Phenylsilatrane
Phenylsilatrane is a convulsant chemical which has been used as a rodenticide. Phenylsilatrane and some of its analogs with 4-substituents of H, CH3, Cl, Br, and CSi(CH3)3 are highly toxic to mice. They have been observed in the laboratory to inhibit the 35S-tert-butylbicyclophosphorothionate (TBPS) binding site (GABA-gated chloride channel) of mouse brain membranes. See also Atrane GABAA receptor negative allosteric modulator GABAA receptor § Ligands Chlorophenylsilatrane
https://en.wikipedia.org/wiki/CPAchecker
CPAchecker is a framework and tool for formal software verification and program analysis of C programs. Some of its ideas and concepts, for example lazy abstraction, were inherited from the software model checker BLAST. CPAchecker is based on the idea of configurable program analysis, a concept that allows the expression of both model checking and program analysis in one formalism. When executed, CPAchecker performs a reachability analysis, i.e., it checks whether a certain state, which violates a given specification, can potentially be reached. One application of CPAchecker is the verification of Linux device drivers. Achievements CPAchecker came first in two categories (Overall and ControlFlowInteger) in the 1st Competition on Software Verification (2012), held at TACAS 2012 in Tallinn. CPAchecker came first (category Overall) in the 2nd Competition on Software Verification (2013), held at TACAS 2013 in Rome. Architecture CPAchecker operates on a control-flow automaton (CFA); before a given C program can be analyzed by the CPA algorithm, it is transformed into a CFA. A CFA is a directed graph whose edges represent either assumptions or assignments and whose nodes represent program locations.
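In its simplest form, a reachability check over a CFA is a graph search from the entry location to an error location. The sketch below is a generic forward-reachability loop over such a graph — not CPAchecker's actual code or API — that ignores edge labels; a real analysis like CPAchecker tracks abstract states per location, not just the locations themselves:

```python
from collections import deque

def reachable_error(cfa_edges, entry, error_locations):
    """Forward reachability on a control-flow automaton given as a list of
    (source, target) location pairs. Returns True iff some error location
    is reachable from the entry location."""
    successors = {}
    for src, dst in cfa_edges:
        successors.setdefault(src, []).append(dst)
    seen, frontier = {entry}, deque([entry])
    while frontier:
        loc = frontier.popleft()
        if loc in error_locations:
            return True          # a specification violation is reachable
        for nxt in successors.get(loc, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False                 # error locations are unreachable
```

Because edge labels are ignored, this over-approximates: it may report an error location reachable even when the assumptions along every path to it are contradictory, which is exactly the imprecision that abstraction refinement in tools like CPAchecker is designed to remove.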
https://en.wikipedia.org/wiki/Uteroglobin
Uteroglobin, or blastokinin, also known as secretoglobin family 1A member 1 (SCGB1A1), is a protein that in humans is encoded by the SCGB1A1 gene. SCGB1A1 is the founding member of the secretoglobin family of small, secreted, disulfide-bridged dimeric proteins found only in mammals. This antiparallel disulfide-linked homodimeric protein is multifunctional and is found in various tissues under various names, such as: uteroglobin (UG, UGB), uteroglobin-like antigen (UGL), blastokinin, club-cell secretory protein (CCSP), Clara-cell 16 kD protein (17 kD in rat/mice), club-cell-specific 10 kD protein (CC10), human protein 1, urine protein 1 (UP-1), polychlorinated biphenyl-binding protein (PCB-BP), human club cell phospholipid-binding protein (hCCPBP), secretoglobin 1A member 1 (SCGB1A1). This protein is specifically expressed in club cells in the lungs. Function The precise physiological role of uteroglobin is not yet known. Putative functions are: Immunomodulation Progesterone binding: weak in some animals, especially weak in humans. (Note: UGB is itself a progesterone-induced gene in the endometrium in lagomorphs) Inhibits phospholipase A2 in vitro Binds phosphatidylcholine, phosphatidylinositol Binds to fibronectin: Uteroglobin knockout mice on the inbred C57Bl6 strain develop a Goodpasture's syndrome-like glomerulopathy due to fibronectin binding of IgA, which might potentially be prevented by uteroglobin replacement. However, contrary to the animal-model claims, human genetic data might suggest that the effect is indirect Uteroglobin knockout mice on the inbred 129 strain appear to have a healthy phenotype (no glomerulopathy development), but show physiological differences in their responses to respiratory challenges. The phenotypes exhibited by these mice include decreased bioaccumulation of biphenyls, increased susceptibility and increased IL-13 and IL-6 following hyperoxic challenge, and changes in club cell morphology. Target of polychlorinated biphenyl (PCB) bindin
https://en.wikipedia.org/wiki/HAIFA%20construction
The HAIFA construction (hash iterative framework) is a cryptographic structure used in the design of hash functions. It is one of the modern alternatives to the Merkle–Damgård construction, avoiding its weaknesses like length extension attacks. The construction was designed by Eli Biham and Orr Dunkelman in 2007. Three of the 14 second round candidates in the NIST hash function competition were based on HAIFA constructions (BLAKE, SHAvite-3, ECHO). Other hash functions based on it are LAKE, Sarmal, SWIFFTX and HNF-256. The construction of Skein (Unique Block Iteration) is similar to HAIFA. Another alternative construction is the sponge construction.
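The defining feature of HAIFA is that each compression-function call also receives the number of bits hashed so far and a salt, which blocks straightforward length-extension and fixed-point attacks on Merkle–Damgård. The sketch below illustrates only this iteration structure; the stand-in compression function built from SHA-256, the padding, and all constants are choices made here for illustration and correspond to none of the real HAIFA-based designs:

```python
import hashlib

def compress(chaining, block, bits_so_far, salt):
    """Stand-in compression function C(h, M, #bits, salt).
    HAIFA prescribes this interface, not this particular construction."""
    data = chaining + block + bits_so_far.to_bytes(8, "big") + salt
    return hashlib.sha256(data).digest()

def haifa_hash(message, salt=b"\x00" * 8, iv=b"\x01" * 32, block_size=64):
    """Toy HAIFA-style iteration: every call to the compression function
    is bound to the running bit count and the salt."""
    # Pad with 0x80 then zeros up to a multiple of block_size; the message
    # length is additionally bound into each call via the bit counter.
    padded = message + b"\x80"
    padded += b"\x00" * (-len(padded) % block_size)
    h, bits = iv, 0
    for i in range(0, len(padded), block_size):
        # counter: message bits consumed through this block (capped at total)
        bits = min(len(message) * 8, bits + block_size * 8)
        h = compress(h, padded[i:i + block_size], bits, salt)
    return h
```

Because the counter and salt enter every call, two messages of different lengths (or the same message under different salts) never present identical inputs to the compression function, which is the property HAIFA adds over plain Merkle–Damgård chaining.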
https://en.wikipedia.org/wiki/Lipophilic%20efficiency
Lipophilic efficiency (LiPE), sometimes referred to as ligand-lipophilicity efficiency (LLE), is a parameter used in drug design and drug discovery to evaluate the quality of research compounds, linking potency and lipophilicity in an attempt to estimate druglikeness. For a given compound, LiPE is defined as the pIC50 (or pEC50) of interest minus the LogP of the compound. In practice, calculated values such as cLogP or calculated LogD are often used instead of the measured LogP or LogD. LiPE is used to compare compounds of different potencies (pIC50s) and lipophilicities (LogP). High potency (a high value of pIC50) is a desirable attribute in drug candidates, as it reduces the risk of non-specific, off-target pharmacology at a given concentration. When associated with low clearance, high potency also allows for a low total dose, which lowers the risk of idiosyncratic drug reactions. On the other hand, LogP is an estimate of a compound's overall lipophilicity, a value that influences its behavior in a range of biological processes relevant to drug discovery, such as solubility, permeability through biological membranes, hepatic clearance, lack of selectivity and non-specific toxicity. For oral drugs, a LogP value between 2 and 3 is often considered optimal to achieve a compromise between permeability and first-pass clearance. LiPE captures both values in a single parameter, and empirical evidence suggests that quality drug candidates have a high LiPE (>6); this value corresponds to a compound with a pIC50 of 8 and a LogP of 2. Plotting LogP against pIC50 for a range of compounds allows ranking of series and individual compounds. An alternative equation uses the logarithm of the ratio of potency (measured as binding energy) and the partition coefficient to compute a lipophilic ligand efficiency index (LE) with a different scale. The following review discusses LiPE in the context of other compound efficiency metrics.
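The definition is a one-line computation once potency is on the log scale; the following sketch uses illustrative numbers, not measured data:

```python
import math

def pic50_from_ic50_nM(ic50_nM):
    """Convert an IC50 in nanomolar to pIC50 = -log10(IC50 in molar)."""
    return -math.log10(ic50_nM * 1e-9)

def lipe(pic50, logp):
    """Lipophilic efficiency: LiPE = pIC50 - LogP (cLogP or LogD are
    often substituted in practice)."""
    return pic50 - logp

# A compound with IC50 = 10 nM (pIC50 = 8) and LogP = 2 sits exactly at
# the empirical "quality" threshold LiPE = 6 mentioned above.
example = lipe(pic50_from_ic50_nM(10), 2.0)
```

Holding LiPE constant while lowering LogP traces the diagonal bands seen when LogP is plotted against pIC50, which is what makes the plot useful for ranking compound series.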
https://en.wikipedia.org/wiki/KCNK17
Potassium channel subfamily K member 17 is a protein that in humans is encoded by the KCNK17 gene. This gene encodes K2P17.1, one of the members of the superfamily of potassium channel proteins containing two pore-forming P domains. This open channel, primarily expressed in the pancreas, is activated at alkaline pH. See also Tandem pore domain potassium channel
https://en.wikipedia.org/wiki/Roger%20Jones%20%28mathematician%29
Roger L. Jones is an American mathematician specializing in harmonic analysis and ergodic theory. Biography He obtained a B.S. in mathematics in 1971 from University at Albany, SUNY, and a Ph.D. in mathematics in 1974 from Rutgers University, with thesis Inequalities for the Ergodic Maximal Function written under the direction of Richard Floyd Gundy. He has recently retired from a professorship in mathematics at DePaul University in Chicago. There he taught everything from remedial math to graduate-level courses. During his tenure at DePaul, Roger published numerous research papers in math, was awarded an excellence in teaching award, chaired the DePaul University Mathematics Department, and was awarded National Science Foundation grants related to teaching mathematics. He has also worked with the Chicago Public Schools on improving math instruction. Roger was honored for his research work at the International Conference on Harmonic Analysis and Ergodic theory that was held in the name of Roger and his colleague Marshall Ash. Roger has since retired from teaching at DePaul and moved to Northern Wisconsin, where he teaches mathematics at Conserve School. Appointments 1974-1977: DePaul University, Assistant Professor 1977-1984: DePaul University, Associate Professor 1982-1985: DePaul University, Chairman: Department of Mathematics 1984-2004: DePaul University, Professor 2004–present: DePaul University, Professor Emeritus Professional memberships Mathematical Association of America American Mathematical Society Publications
https://en.wikipedia.org/wiki/Command%20and%20Control%20%28book%29
Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety is a 2013 nonfiction book by Eric Schlosser about the history of nuclear weapons systems and accidents involving nuclear weapons in the United States. Incidents Schlosser discusses in the book include the 1980 Damascus Titan missile explosion, the 1966 Palomares B-52 crash, and the 1961 Goldsboro B-52 crash. It was a finalist for the 2014 Pulitzer Prize for History. A documentary film based on the book aired as an episode of American Experience on PBS in early 2017. Critical reception A review in The New York Times described it as a "disquieting but riveting" book and Schlosser as a "better reporter than policy analyst". Speaking of the book, domestic security adviser Lee H. Hamilton said, "The lesson of this powerful and disturbing book is that the world's nuclear arsenals are not as safe as they should be. We should take no comfort in our skill and good fortune in preventing a nuclear catastrophe, but urgently extend our maximum effort to assure that a nuclear weapon does not go off by accident, mistake, or miscalculation."
https://en.wikipedia.org/wiki/Seed%20orchard
A seed orchard is an intensively managed plantation of specifically arranged trees for the mass production of genetically improved seeds to create plants, or seeds for the establishment of new forests. General Seed orchards are a common method of mass multiplication for transferring genetically improved material from breeding populations to production populations (forests) and in this sense are often referred to as "multiplication" populations. A seed orchard is often composed of grafts (vegetative copies) of selected genotypes, but seedling seed orchards also occur, mainly to combine the orchard with progeny testing. Seed orchards are the strong link between breeding programs and plantation establishment. They are designed and managed to produce seeds of superior genetic quality compared to those obtained from seed production areas, seed stands, or unimproved stands. Material and connection with breeding population In first-generation seed orchards, the parents usually are phenotypically selected trees. In advanced-generation seed orchards, the seed orchards harvest the benefits generated by tree breeding, and the parents may be selected among the tested clones or families. It is efficient to synchronise the productive life cycle of the seed orchards with the cycle time of the breeding population. In the seed orchard, the trees can be arranged in a design to keep related individuals or cloned copies apart from each other. Seed orchards are the delivery vehicle for genetic improvement programs, where the trade-off between genetic gain and diversity is the most important concern. The genetic gain of seed orchard crops depends primarily on the genetic superiority of the orchard parents, the gametic contribution to the resultant seed crops, and pollen contamination from outside the seed orchards. Genetic diversity of seed orchard crops Seed production and gene diversity are important aspects when using improved materials like seed orchard crops. Seed orchard crops
https://en.wikipedia.org/wiki/Vector%20processor
In computing, a vector processor or array processor is a central processing unit (CPU) that implements an instruction set where its instructions are designed to operate efficiently and effectively on large one-dimensional arrays of data called vectors. This is in contrast to scalar processors, whose instructions operate on single data items only, and in contrast to some of those same scalar processors having additional single instruction, multiple data (SIMD) or SWAR Arithmetic Units. Vector processors can greatly improve performance on certain workloads, notably numerical simulation and similar tasks. Vector processing techniques also operate in video-game console hardware and in graphics accelerators. Vector machines appeared in the early 1970s and dominated supercomputer design through the 1970s into the 1990s, notably the various Cray platforms. The rapid fall in the price-to-performance ratio of conventional microprocessor designs led to a decline in vector supercomputers during the 1990s. History Early research and development Vector processing development began in the early 1960s at the Westinghouse Electric Corporation in their Solomon project. Solomon's goal was to dramatically increase math performance by using a large number of simple coprocessors under the control of a single master Central processing unit (CPU). The CPU fed a single common instruction to all of the arithmetic logic units (ALUs), one per cycle, but with a different data point for each one to work on. This allowed the Solomon machine to apply a single algorithm to a large data set, fed in the form of an array. In 1962, Westinghouse cancelled the project, but the effort was restarted by the University of Illinois at Urbana–Champaign as the ILLIAC IV. Their version of the design originally called for a 1 GFLOPS machine with 256 ALUs, but, when it was finally delivered in 1972, it had only 64 ALUs and could reach only 100 to 150 MFLOPS. Nevertheless, it showed that the basic concept wa
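The scalar-versus-vector contrast can be sketched with strip mining, the standard technique for mapping a long loop onto fixed-length vector registers: instead of issuing one scalar instruction per element, the loop is processed in strips, each strip corresponding to a single vector instruction. The sketch below models this in plain Python (the strip length `VLEN = 8` is an assumed hardware parameter, and each strip's multiply-add stands in for one vector instruction):

```python
VLEN = 8  # assumed hardware vector register length, in elements

def saxpy_strip_mined(a, x, y):
    """Strip-mined SAXPY (out = a*x + y): the loop is processed in chunks
    of VLEN elements, the way a vector processor issues one vector
    instruction per strip instead of one scalar instruction per element."""
    n = len(x)
    out = []
    for start in range(0, n, VLEN):
        strip_x = x[start:start + VLEN]
        strip_y = y[start:start + VLEN]
        # one "vector instruction": a multiply-add across the whole strip
        # (the final strip may be shorter than VLEN, as on real hardware
        # with a vector-length register or masking)
        out.extend(a * xi + yi for xi, yi in zip(strip_x, strip_y))
    return out
```

On a real vector machine the per-strip work is a single pipelined instruction, so instruction fetch and decode costs are amortized over VLEN elements — the source of the performance advantage on the regular numerical workloads described above.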
https://en.wikipedia.org/wiki/Polony%20%28biology%29
Polony is a contraction of "polymerase colony": a small colony of DNA. Polonies are discrete clonal amplifications of a single DNA molecule, grown in a gel matrix. This approach greatly improves the signal-to-noise ratio. Polonies can be generated using several techniques, including solid-phase polymerase chain reaction (PCR) in polyacrylamide gels. However, other, earlier patented technologies, such as that from Manteia Predictive Medicine (acquired by Solexa), which generate DNA on a solid-phase surface by bridge amplification, are generally referred to as "clusters". The terminology and distinction between 'polony' and 'cluster' have become confused recently. Growth of clonal copies of DNA on bead surfaces remains generically unnamed, although some also seek to call this technique a "polony" method. The concept of localizing and analyzing regions containing clonal nucleic acid populations was first described in patents by Brown et al. (assigned to Genomic Nanosystems); however, these are in liquid phase. Clusters are distinct in that they are based on solid-phase amplification of single DNA molecules where the DNA has been covalently attached to a surface. This technology, initially coined "DNA colony generation", was invented and developed in late 1996 at Glaxo Wellcome's Geneva Biomedical Research Institute (GBRI) by Dr Pascal Mayer and Dr Laurent Farinelli, and was publicly presented for the first time in 1998. It was finally brought to market by Solexa (Bentley et al.).
https://en.wikipedia.org/wiki/The%20Hobbit%20%281982%20video%20game%29
The Hobbit is an illustrated text adventure computer game released in 1982 for the ZX Spectrum home computer and based on the 1937 book The Hobbit, by J. R. R. Tolkien. It was developed at Beam Software by Philip Mitchell and Veronika Megler and published by Melbourne House. It was later converted to most home computers available at the time including the Commodore 64, BBC Micro, and Oric computers. By arrangement with the book publishers, a copy of the book was included with each game sold. The parser was very advanced for the time and used a subset of English called Inglish. When it was released, most adventure games used simple verb-noun parsers (allowing for simple phrases like "get lamp"), but Inglish allowed the player to type advanced sentences such as "ask Gandalf about the curious map then take sword and kill troll with it". The parser was complex and intuitive, introducing pronouns, adverbs ("viciously attack the goblin"), punctuation and prepositions and allowing the player to interact with the game world in ways not previously possible. Gameplay Many locations are illustrated by an image, based on originals designed by Kent Rees. On the tape version, to save space, each image was stored in a compressed format by storing outline information and then flood filling the enclosed areas on the screen. The slow CPU speed meant that it would take up to several seconds for each scene to draw. The disk-based versions of the game used pre-rendered, higher-quality images. The game has an innovative text-based physics system, developed by Veronika Megler. Objects, including the characters in the game, have a calculated size, weight, and solidity. Objects can be placed inside other objects, attached together with rope and damaged or broken. If the main character is sitting in a barrel and this barrel is then picked up and thrown through a trapdoor, the player would go through. Unlike other works of interactive fiction, the game is also in real time, insofar as a p
https://en.wikipedia.org/wiki/Speech%20technology
Speech technology relates to the technologies designed to duplicate and respond to the human voice. They have many uses. These include aid to the voice-disabled, the hearing-disabled, and the blind, along with communication with computers without a keyboard. They enhance game software and aid in marketing goods or services by telephone. The subject includes several subfields: Speech synthesis Speech recognition Speaker recognition Speaker verification Speech encoding Multimodal interaction See also Communication aids Language technology Speech interface guideline Speech processing Speech Technology (magazine) External links Speech processing
https://en.wikipedia.org/wiki/Legendre%20polynomials
In mathematics, Legendre polynomials, named after Adrien-Marie Legendre (1782), are a system of complete and orthogonal polynomials with a vast number of mathematical properties and numerous applications. They can be defined in many ways, and the various definitions highlight different aspects as well as suggest generalizations and connections to different mathematical structures and physical and numerical applications. Closely related to the Legendre polynomials are associated Legendre polynomials, Legendre functions, Legendre functions of the second kind, Big q-Legendre polynomials, and associated Legendre functions. Definition by construction as an orthogonal system In this approach, the polynomials are defined as an orthogonal system with respect to the weight function w(x) = 1 over the interval [−1, 1]. That is, P_n(x) is a polynomial of degree n, such that ∫_{−1}^{1} P_m(x) P_n(x) dx = 0 whenever m ≠ n. With the additional standardization condition P_n(1) = 1, all the polynomials can be uniquely determined. We then start the construction process: P_0(x) = 1 is the only correctly standardized polynomial of degree 0. P_1(x) must be orthogonal to P_0, leading to P_1(x) = x, and P_2(x) is determined by demanding orthogonality to P_0 and P_1, and so on. P_n is fixed by demanding orthogonality to all P_m with m < n. This gives n conditions, which, along with the standardization P_n(1) = 1, fixes all n + 1 coefficients in P_n(x). With work, all the coefficients of every polynomial can be systematically determined, leading to the explicit representation in powers of x given below. This definition of the P_n's is the simplest one. It does not appeal to the theory of differential equations. Second, the completeness of the polynomials follows immediately from the completeness of the powers 1, x, x², x³, …. Finally, by defining them via orthogonality with respect to the most obvious weight function on a finite interval, it sets up the Legendre polynomials as one of the three classical orthogonal polynomial systems. The other two are the Laguerre polynomials, which are orthogonal over the half line [0, ∞), and the Hermite polynomials, orthog
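The construction above can be carried out numerically. A standard route (a sketch; the function name is illustrative) is Bonnet's recursion formula, (k + 1) P_{k+1}(x) = (2k + 1) x P_k(x) − k P_{k−1}(x), which produces exactly the correctly standardized polynomials with P_n(1) = 1:

```python
def legendre(n, x):
    """Evaluate the Legendre polynomial P_n at x via Bonnet's recurrence:
    (k + 1) P_{k+1}(x) = (2k + 1) x P_k(x) - k P_{k-1}(x),
    starting from P_0(x) = 1 and P_1(x) = x."""
    if n == 0:
        return 1.0
    p_prev, p_curr = 1.0, x  # P_0, P_1
    for k in range(1, n):
        p_prev, p_curr = p_curr, ((2 * k + 1) * x * p_curr - k * p_prev) / (k + 1)
    return p_curr

# Standardization P_n(1) = 1, and a known value: P_2(x) = (3x^2 - 1)/2.
print(legendre(5, 1.0))   # 1.0
print(legendre(2, 0.5))   # (3*0.25 - 1)/2 = -0.125
```

The recurrence visibly preserves the standardization: substituting x = 1 turns the right-hand side into ((2k + 1) − k) P_k(1) = (k + 1) P_k(1), so P_{k+1}(1) = P_k(1) = 1 by induction.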
https://en.wikipedia.org/wiki/Cytometry
Cytometry is the measurement of the number and characteristics of cells. Variables that can be measured by cytometric methods include cell size, cell count, cell morphology (shape and structure), cell cycle phase, DNA content, and the existence or absence of specific proteins on the cell surface or in the cytoplasm. Cytometry is used to characterize and count blood cells in common blood tests such as the complete blood count. In a similar fashion, cytometry is also used in cell biology research and in medical diagnostics to characterize cells in a wide range of applications associated with diseases such as cancer and AIDS. Cytometric devices Image cytometers Image cytometry is the oldest form of cytometry. Image cytometers operate by statically imaging a large number of cells using optical microscopy. Prior to analysis, cells are commonly stained to enhance contrast or to detect specific molecules by labeling them with fluorochromes. Traditionally, cells are viewed within a hemocytometer to aid manual counting. Since the introduction of the digital camera in the mid-1990s, the automation level of image cytometers has steadily increased. This has led to the commercial availability of automated image cytometers, ranging from simple cell counters to sophisticated high-content screening systems. Flow cytometers Due to the early difficulties of automating microscopy, the flow cytometer has been the dominant cytometric device since the mid-1950s. Flow cytometers operate by aligning single cells using flow techniques. The cells are characterized optically or by the use of an electrical impedance method called the Coulter principle. To detect specific molecules when optically characterized, cells are in most cases stained with the same type of fluorochromes that are used by image cytometers. Flow cytometers generally provide less data than image cytometers, but have a significantly higher throughput. Cell sorters Cell sorters are flow cytometers capable of sorting ce
https://en.wikipedia.org/wiki/Carolyn%20S.%20Gordon
Carolyn S. Gordon (born 1950) is a mathematician and Benjamin Cheney Professor of Mathematics at Dartmouth College. She is best known for giving a negative answer to the question "Can you hear the shape of a drum?" in her work with David Webb and Scott A. Wolpert. She is a Chauvenet Prize winner and a 2010 Noether Lecturer. Early life and education Gordon received her Bachelor of Science degree from Purdue University. She entered graduate studies at Washington University in St. Louis, earning her Doctor of Philosophy in mathematics in 1979. Her doctoral advisor was Edward Nathan Wilson, and her thesis was on isometry groups of homogeneous manifolds. She completed a postdoc at the Technion Israel Institute of Technology and held positions at Lehigh University and Washington University in St. Louis. Career Gordon is best known for her work in isospectral geometry, for which hearing the shape of a drum is the prototypical example. In 1966 Mark Kac asked whether the shape of a drum could be determined by the sound it makes (that is, whether a Riemannian manifold is determined by the spectrum of its Laplace–Beltrami operator). John Milnor observed that a theorem due to Witt implied the existence of a pair of 16-dimensional tori that have the same spectrum but different shapes. However, the problem in two dimensions remained open until 1992, when Gordon, with coauthors Webb and Wolpert, constructed a pair of regions in the Euclidean plane that have different shapes but identical eigenvalues (see figure on right). In further work, Gordon and Webb produced convex isospectral domains in the hyperbolic plane and in Euclidean space. Gordon has written or coauthored over 30 articles on isospectral geometry, including work on isospectral closed Riemannian manifolds with a common Riemannian covering. These isospectral Riemannian manifolds have the same local geometry but different topology. They can be found using the "Sunada method", due to Toshikazu Sunada. In 1993
https://en.wikipedia.org/wiki/Matrix%20variate%20beta%20distribution
In statistics, the matrix variate beta distribution is a generalization of the beta distribution. If is a positive definite matrix with a matrix variate beta distribution, and are real parameters, we write (sometimes ). The probability density function for is: Here is the multivariate beta function: where is the multivariate gamma function given by Theorems Distribution of matrix inverse If then the density of is given by provided that and . Orthogonal transform If and is a constant orthogonal matrix, then Also, if is a random orthogonal matrix which is independent of , then , distributed independently of . If is any constant , matrix of rank , then has a generalized matrix variate beta distribution, specifically . Partitioned matrix results If and we partition as where is and is , then defining the Schur complement as gives the following results: is independent of has an inverted matrix variate t distribution, specifically Wishart results Mitra proves the following theorem which illustrates a useful property of the matrix variate beta distribution. Suppose are independent Wishart matrices . Assume that is positive definite and that . If where , then has a matrix variate beta distribution . In particular, is independent of . See also Matrix variate Dirichlet distribution
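The Wishart result above suggests a direct way to sample from the distribution. The sketch below reconstructs the construction behind Mitra's theorem (the explicit formula is not given in the text above, so the form U = (S1 + S2)^(-1/2) S1 ((S1 + S2)^(-1/2))' and the parameter choices are assumptions to be checked against the theorem statement); Wishart matrices with identity scale are drawn as G G' for a Gaussian matrix G:

```python
import numpy as np

def wishart_identity(p, n, rng):
    """Draw S ~ W_p(n, I) as G @ G.T with G a (p x n) standard normal matrix."""
    g = rng.standard_normal((p, n))
    return g @ g.T

def spd_inv_sqrt(a):
    """Inverse symmetric square root of a symmetric positive definite matrix,
    computed via its eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return v @ np.diag(1.0 / np.sqrt(w)) @ v.T

def matrix_beta_sample(p, n1, n2, rng):
    """One draw with a matrix variate beta distribution, via the (assumed)
    Wishart construction U = (S1+S2)^(-1/2) S1 ((S1+S2)^(-1/2))'."""
    s1 = wishart_identity(p, n1, rng)
    s2 = wishart_identity(p, n2, rng)
    r = spd_inv_sqrt(s1 + s2)
    return r @ s1 @ r.T

rng = np.random.default_rng(0)
u = matrix_beta_sample(p=3, n1=8, n2=10, rng=rng)
eigvals = np.linalg.eigvalsh(u)  # all eigenvalues lie strictly in (0, 1)
```

Since S1 and S2 are both positive definite, U is "sandwiched" between the zero matrix and the identity, which is the matrix analogue of a scalar beta variable lying in (0, 1).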
https://en.wikipedia.org/wiki/Nassi%E2%80%93Shneiderman%20diagram
A Nassi–Shneiderman diagram (NSD) in computer programming is a graphical design representation for structured programming. This type of diagram was developed in 1972 by Isaac Nassi and Ben Shneiderman, who were both graduate students at Stony Brook University. These diagrams are also called structograms, as they show a program's structures. Overview Following a top-down design, the problem at hand is reduced into smaller and smaller subproblems, until only simple statements and control flow constructs remain. Nassi–Shneiderman diagrams reflect this top-down decomposition in a straightforward way, using nested boxes to represent subproblems. Consistent with the philosophy of structured programming, Nassi–Shneiderman diagrams have no representation for a GOTO statement. Nassi–Shneiderman diagrams are only rarely used for formal programming: their abstraction level is close to structured program code, and modifications used to require the whole diagram to be redrawn, although graphic editors have since removed that limitation. They clarify algorithms and high-level designs, which makes them useful in teaching. They were included in Microsoft Visio and dozens of other software tools, such as the German EasyCODE. In Germany, Nassi–Shneiderman diagrams were standardised in 1985 as DIN 66261. They are still used in German introductions to programming, for example Böttcher and Kneißl's introduction to C, Baeumle-Courth and Schmidt's introduction to C and Kirch's introduction to C#. Nassi–Shneiderman diagrams can also be used in technical writing. Diagrams Process blocks: the process block represents the simplest of steps and requires no analysis. When a process block is encountered, the action inside the block is performed and we move on to the next block. Branching blocks: there are two types of branching blocks. First is the simple True/False or Yes/No branching block, which offers the program two paths to take depending on whether or not a condition has been fulfilled. These blocks can be u
https://en.wikipedia.org/wiki/Progressively%20measurable%20process
In mathematics, progressive measurability is a property in the theory of stochastic processes. A progressively measurable process, while defined quite technically, is important because it implies that the stopped process is measurable. Being progressively measurable is a strictly stronger property than the notion of being an adapted process. Progressively measurable processes are important in the theory of Itô integrals. Definition Let (Ω, F, ℙ) be a probability space; (X, 𝒜) be a measurable space, the state space; {F_t | t ≥ 0} be a filtration of the sigma algebra F; Z : [0, ∞) × Ω → X be a stochastic process (the index set could be [0, T] or ℕ instead of [0, ∞)); Borel([0, t]) be the Borel sigma algebra on [0, t]. The process Z is said to be progressively measurable (or simply progressive) if, for every time t, the map [0, t] × Ω → X defined by (s, ω) ↦ Z_s(ω) is Borel([0, t]) ⊗ F_t-measurable. This implies that Z is F_t-adapted. A subset P ⊆ [0, ∞) × Ω is said to be progressively measurable if the process Z_s(ω) := 1_P(s, ω) is progressively measurable in the sense defined above, where 1_P is the indicator function of P. The set of all such subsets P forms a sigma algebra on [0, ∞) × Ω, denoted by Prog, and a process is progressively measurable in the sense of the previous paragraph if, and only if, it is Prog-measurable. Properties It can be shown that L²(B), the space of stochastic processes for which the Itô integral with respect to Brownian motion B is defined, is the set of equivalence classes of Prog-measurable processes in L²([0, T] × Ω). Every adapted process with left- or right-continuous paths is progressively measurable. Consequently, every adapted process with càdlàg paths is progressively measurable. Every measurable and adapted process has a progressively measurable modification.
https://en.wikipedia.org/wiki/Atari%20Mindlink
The Atari Mindlink is an unreleased video game controller for the Atari 2600, originally intended for release in 1984. The Mindlink was unique in its headband form factor: it controls the game by reading the myoneural signal voltage from the player's forehead. The player's forehead movements are read by infrared sensors and transferred as movement in the game. Specially supported games are similar to those that use the paddle controller, but with the Mindlink controller instead. Three games were in development for the Mindlink at the time of its cancellation: Bionic Breakthrough, Telepathy, and Mind Maze. Bionic Breakthrough is a Breakout clone controlled with the Mindlink. Mind Maze uses the Mindlink to mimic ESP, pretending to predict what is printed on cards. Testing showed that players frequently got headaches from moving their eyebrows to play the game. None of these games were ever released in any other form.
https://en.wikipedia.org/wiki/Web%20data%20services
Web data services refers to service-oriented architecture (SOA) applied to data sourced from the World Wide Web and the Internet as a whole. Web data services enable maximal mashup, reuse, and sharing of structured data (such as relational tables), semi-structured information (such as Extensible Markup Language (XML) documents), and unstructured information (such as RSS feeds, content from Web applications, commercial data from online business sources). In a Web data services environment, applications may subscribe to and consume information, provide and publish information for others to consume, or both. Applications that can serve as a consumer-subscriber and/or provider-publisher of Web data services include mobile computing, Web portals, enterprise portals, online business software, social media, and social networks. Web data services may support business-to-consumer (B2C) and business-to-business (B2B) information-sharing requirements. Increasingly, enterprises are including Web data services in their SOA implementations, as they integrate mashup-style user-driven information sharing into business intelligence, business process management, predictive analytics, content management, and other applications, according to industry analysts. To speed development of Web data services, enterprises can deploy technologies that ease discovery, extraction, movement, transformation, cleansing, normalization, joining, consolidation, access, and presentation of disparate information types from diverse internal sources (such as data warehouses and customer relationship management (CRM) systems) and external sources (such as commercial market data aggregators). Web data services build on industry-standard protocols, interfaces, formats, and integration patterns, such as those used for SOA, Web 2.0, Web-Oriented Architecture, and Representational State Transfer (REST). In addition to operating over the public Internet, Web data services may run solely within corporate intrane
https://en.wikipedia.org/wiki/Instituto%20de%20F%C3%ADsica%20Corpuscular
The Instituto de Física Corpuscular (IFIC, English: Institute for Corpuscular Physics) is a joint centre of the CSIC and the University of Valencia dedicated to experimental and theoretical research in the fields of particle physics, nuclear physics, cosmology, astroparticles and medical physics. It is located at the scientific park of the University of Valencia, in Paterna (Valencia, Spain). History In the autumn of 1950 Prof. Joaquin Catalá formed a group in Valencia to study atomic nuclei and elementary particles using nuclear emulsions. He had first worked in Bristol with C. F. Powell, who received the Nobel Prize in Physics in 1950 for using this technique to detect particles in cosmic rays. Prof. Catalá's group first operated as a Local Division of the Instituto de Óptica Daza de Valdés, belonging to CSIC, and specialized in photo-nuclear studies. One of Catalá's students, Fernando Senent, who later became Professor and director of the Institute, produced the first Spanish thesis in experimental particle and nuclear physics. At the beginning of 1960 the Institute received its present name, IFIC, Instituto de Física Corpuscular. IFIC is hence one of the oldest Spanish institutes in experimental physics, and the oldest studying particle and nuclear physics.
https://en.wikipedia.org/wiki/%CE%A0%20pad
The Π pad (pi pad) is a specific type of attenuator circuit in electronics whereby the topology of the circuit is formed in the shape of the Greek capital letter pi (Π). Attenuators are used in electronics to reduce the level of a signal. They are also referred to as pads due to their effect of padding down a signal by analogy with acoustics. Attenuators have a flat frequency response attenuating all frequencies equally in the band they are intended to operate. The attenuator has the opposite task of an amplifier. The topology of an attenuator circuit will usually follow one of the simple filter sections. However, there is no need for more complex circuitry, as there is with filters, due to the simplicity of the frequency response required. Circuits are required to be balanced or unbalanced depending on the geometry of the transmission lines they are to be used with. For radio frequency applications, the format is often unbalanced, such as coaxial. For audio and telecommunications, balanced circuits are usually required, such as with the twisted pair format. The Π pad is intrinsically an unbalanced circuit. However, it can be converted to a balanced circuit by placing half the series resistance in the return path. Such a circuit is called a box section because the circuit is formed in the shape of a box. Terminology An attenuator is a form of a two-port network with a generator connected to one port and a load connected to the other. In all of the circuits given below it is assumed that the generator and load impedances are purely resistive (though not necessarily equal) and that the attenuator circuit is required to perfectly match to these. The symbols used for these impedances are; the impedance of the generator the impedance of the load Popular values of impedance are 600Ω in telecommunications and audio, 75Ω for video and dipole antennae, and 50Ω for RF. The voltage transfer function, A, is, While the inverse of this is the loss, L, of t
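As a worked example of pad design, the standard design equations for a symmetric Π pad matched to equal source and load impedances can be coded directly (a sketch; these equations are not derived in the text above, the function name is illustrative, and K denotes the voltage ratio of the attenuation):

```python
def pi_pad(z0, loss_db):
    """Resistor values for a symmetric pi attenuator matched to z0 on both ports.

    With K = 10**(loss_db / 20) the voltage ratio of the attenuation:
        R_shunt  = z0 * (K + 1) / (K - 1)     (each of the two shunt arms)
        R_series = z0 * (K**2 - 1) / (2 * K)  (the single series arm)
    """
    k = 10 ** (loss_db / 20)
    r_shunt = z0 * (k + 1) / (k - 1)
    r_series = z0 * (k * k - 1) / (2 * k)
    return r_shunt, r_series

# A 6 dB pad in a 50 ohm system: roughly 150.5 ohm shunt, 37.4 ohm series.
r_shunt, r_series = pi_pad(50.0, 6.0)
```

These values agree with published attenuator tables for a 6 dB, 50 Ω pad; for a balanced box section, the series resistance would be split in half between the two legs as described above.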
https://en.wikipedia.org/wiki/Epson%20Equity
The Epson Equity series of IBM-compatible personal computers was manufactured from 1985 until the early 1990s by Epson Inc. Epson was well known for its dot matrix printers at the time, and the Equity series represented its entry into the growing PC compatible market. The Equity I was the first system introduced, equipped with an Intel 8088 CPU and one or two 5.25" floppy disk drives. The original Equity I was a no-frills offering: it ran at the PC's standard 4.77 MHz clock rate, came with 256 KB of RAM (expansion above 512 KB required an expansion board), displayed CGA video, had few available expansion slots and only two half-height drive bays, and lacked a socket for an 8087 math coprocessor. Subsequent versions, the Equity I+ and Apex 100, upped the clock rate to 10 MHz and the standard RAM to 640 KB, supported 3.5-inch floppy drives and hard disks, sported an 8087 socket, and had an "MGA" (Multi-Graphics Adapter) card offering a Hercules-compatible monochrome mode and a new 120x200 eight-color mode. Epson bundled some utility programs that offered decent turnkey functionality for novice users. The Equity was a reliable and compatible design for half the price of a similarly configured IBM PC. Epson often promoted sales by bundling one of its printers with it at cost. The Equity I sold well enough to warrant continuing the Equity line with the follow-on Equity II, Equity III, and others based on the i386SX. Models Equity I (8088) Equity Ie (8086, only known IBM clone with MCGA video alongside its European model PSE-30) Equity I+, Equity Apex 100 Equity II (V30) Equity II+ (80286) Equity III (80286) Equity IIe (80286) Equity III+ (80286) Equity LT (V30) Equity 286 Equity 386SX Equity 386DX See also Epson ActionNote
https://en.wikipedia.org/wiki/Word-representable%20graph
In the mathematical field of graph theory, a word-representable graph is a graph that can be characterized by a word (or sequence) whose entries alternate in a prescribed way. In particular, if the vertex set of the graph is V, one should be able to choose a word w over the alphabet V such that letters a and b alternate in w if and only if the pair ab is an edge in the graph. (Letters a and b alternate in w if, after removing from w all letters but the copies of a and b, one obtains a word abab... or a word baba....) For example, the cycle graph labeled by a, b, c and d in clockwise direction is word-representable because it can be represented by abdacdbc: the pairs ab, bc, cd and ad alternate, but the pairs ac and bd do not. The word w is G's word-representant, and one says that w represents G. The smallest (by the number of vertices) non-word-representable graph is the wheel graph W5, which is the only non-word-representable graph on 6 vertices. The definition of a word-representable graph works both in labelled and unlabelled cases, since any labelling of a graph is equivalent to any other labelling. Also, the class of word-representable graphs is hereditary. Word-representable graphs generalise several important classes of graphs such as circle graphs, 3-colorable graphs and comparability graphs. Various generalisations of the theory of word-representable graphs accommodate representation of any graph. History Word-representable graphs were introduced by Sergey Kitaev in 2004 based on joint research with Steven Seif on the Perkins semigroup, which has played an important role in semigroup theory since 1960. The first systematic study of word-representable graphs was undertaken in a 2008 paper by Kitaev and Artem Pyatkin, starting the development of the theory. One of the key contributors to the area is Magnús M. Halldórsson. To date, more than 35 papers have been written on the subject, and the core of the book by Sergey Kitaev and Vadim Lozin is devoted to the th
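The alternation condition is easy to check mechanically. The sketch below (helper names invented for illustration) recovers the edge set of the graph represented by a word, and reproduces the cycle example from above:

```python
from itertools import combinations

def alternate(word, a, b):
    """True if a and b alternate in word, i.e. the restriction of word to
    the letters a and b reads abab... or baba...."""
    sub = [ch for ch in word if ch in (a, b)]
    return all(x != y for x, y in zip(sub, sub[1:]))

def represented_graph(word):
    """Edge set of the graph represented by `word`: ab is an edge
    if and only if a and b alternate in the word."""
    letters = sorted(set(word))
    return {(a, b) for a, b in combinations(letters, 2) if alternate(word, a, b)}

# The 4-cycle a-b-c-d from the article, represented by "abdacdbc":
edges = represented_graph("abdacdbc")
# edges == {('a', 'b'), ('a', 'd'), ('b', 'c'), ('c', 'd')}
```

Running this on "abdacdbc" confirms the article's claim: ab, bc, cd and ad alternate, while ac (restriction aacc) and bd (restriction bddb) do not.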
https://en.wikipedia.org/wiki/On%20the%20roof%20gang
The On the Roof Gang (sometimes written On-the-Roof-Gang and abbreviated OTRG) was a group of United States Navy cryptologists and radiomen during World War II who are seen as the forerunners of U.S. Navy cryptology and cryptanalysis. One hundred fifty Sailors and 26 Marines worked on the roof of the Navy Department building in Washington, D.C. from 1928 to 1941. History The On the Roof Gang was a school for radiomen and cryptologists who would go on to deploy on ships and to overseas bases and collect foreign signals intelligence (SIGINT) and communications intelligence (COMINT) to monitor the movements and operations, and intercept the message traffic, of foreign navies. In 1928 the Chief of Naval Operations understood that a group of formally trained operators was needed in the Pacific Fleet to monitor Japanese naval communications. There existed a small cadre of self-taught operators in the Pacific theater, and two of them were selected to become instructors in the "On the Roof Gang." Chief Radioman Harry Kidder and Chief Radioman Dorman Chauncey instructed the early classes. Initial graduates were sent to ground stations in the Pacific to monitor the Japanese. Later the cryptologists eventually began to serve on board ships. Modern Impact Naval Network Warfare Command (NETWARCOM) honors Navy and Marine cryptologists with the "On-the-Roof-Gang" Award. The award recognizes lifetime accomplishments in the field of cryptography. Part of the area formerly occupied by the Naval Building is now home to the Vietnam Memorial. See also Defense Language Aptitude Battery (The test taken to become a CTI) Defense Language Proficiency Tests (The tests taken to assess the skill level of CTIs)
https://en.wikipedia.org/wiki/HotChalk
HotChalk was an education technology company founded in September 2004. HotChalk ran an online community application designed for grade school teachers, students, and parents. In August 2007, McGraw-Hill partnered with HotChalk to make McGraw-Hill training and certification tools available to HotChalk users. NBC partnered with HotChalk as well, to distribute NBC news archives to supplement educational materials. HotChalk was founded by Edward M. Fields; the company's last CEO was Rob Wrubel. The company drew scrutiny from the U.S. Department of Education in the mid-2010s regarding HotChalk's relationship with Concordia University of Portland, Oregon. A federal prosecutor alleged that the university's $160 million deal with HotChalk violated a law that prohibits incentives for recruitment and the outsourcing of more than half an educational program to an unaccredited party. The investigation was settled out of court for $1 million with no admissions of wrongdoing. In November 2020, Noodle acquired HotChalk.
https://en.wikipedia.org/wiki/The%20Banksias
The Banksias, by Celia Rosser, is a three-volume series of monographs containing paintings of every Banksia species. Its publication represented the first time such a large genus had been entirely painted by a single botanical artist. It has been described as "one of the outstanding botanical works of this century." The paintings themselves are watercolours on Arches rag paper. The three volumes comprise plates reproduced using offset printing, and bound in green leather. Alex George wrote the accompanying text. Rosser began working on the series in 1974. Volume I of The Banksias, containing 24 plates, was published in 1981. The edition comprised 730 books and 100 portfolios. Volume II, published in 1988, also contained 24 plates and was also released in an edition of 730 books and 100 portfolios. Volume III, completed in 2000, contained 28 plates, and was released in an edition of 530 books and 300 portfolios. Since the publication of Volume III, a new Banksia, B. rosserae, has been described; Rosser subsequently painted it and released a set of prints. In 2007, the genus Dryandra was transferred to Banksia, so there are now a great many Banksia species that have not been painted by Rosser. Each volume of The Banksias has been presented to the Queen by the Australian Government as a gift of the Australian people. Largely on the basis of her work for The Banksias, Rosser was awarded a Medal of the Order of Australia in 1995 for her contribution to botanical art, and the Jill Smythies Award for botanical art in 1997.
https://en.wikipedia.org/wiki/Iterative%20Stencil%20Loops
Iterative Stencil Loops (ISLs) are a class of numerical data processing solutions which update array elements according to some fixed pattern, called a stencil. They are most commonly found in computer simulations, e.g. for computational fluid dynamics in the context of scientific and engineering applications. Other notable examples include solving partial differential equations, the Jacobi kernel, the Gauss–Seidel method, image processing and cellular automata. The regular structure of the arrays sets stencil techniques apart from other modeling methods such as the finite element method. Most finite difference codes which operate on regular grids can be formulated as ISLs. Definition ISLs perform a sequence of sweeps (called timesteps) through a given array. Generally this is a 2- or 3-dimensional regular grid. The elements of the arrays are often referred to as cells. In each timestep, all array elements are updated. Using neighboring array elements in a fixed pattern (the stencil), each cell's new value is computed. In most cases boundary values are left unchanged, but in some cases (e.g. LBM codes) those need to be adjusted during the computation as well. Since the stencil is the same for each element, the pattern of data accesses is repeated. More formally, we may define ISLs as a 5-tuple (I, S, S⁰, s, T) with the following meaning: I is the index set. It defines the topology of the array. S is the (not necessarily finite) set of states, one of which each cell may take on on any given timestep. S⁰ : I → S defines the initial state of the system at time 0. s is the stencil itself and describes the actual shape of the neighborhood. There are l elements in the stencil. T : Sˡ → S is the transition function which is used to determine a cell's new state, depending on its neighbors. Since I is a k-dimensional integer interval, the array will always have the topology of a finite regular grid. The array is also called simulation space and individual cells are identified by their index c ∈ I. The stenc
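A minimal concrete instance of an ISL is the 2-D five-point Jacobi kernel mentioned above. The sketch below (function name and grid values are illustrative) performs timestep sweeps over a regular grid, computing each interior cell's new value from its four neighbours while leaving boundary values unchanged:

```python
def jacobi_sweep(grid):
    """One timestep of a 2-D five-point Jacobi stencil: each interior cell
    becomes the average of its four neighbours; boundary cells are unchanged."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]  # write into a copy: all reads see time t
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j]
                                + grid[i][j - 1] + grid[i][j + 1])
    return new

# Heat-flow toy example: hot top edge, repeated sweeps relax the interior.
grid = [[0.0] * 5 for _ in range(5)]
grid[0] = [100.0] * 5  # fixed boundary condition
for _ in range(50):
    grid = jacobi_sweep(grid)
```

Writing into a fresh array is what makes this Jacobi rather than Gauss–Seidel: every read in a sweep sees the previous timestep, so the update order of the cells does not affect the result.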
https://en.wikipedia.org/wiki/Hyperplasia
Hyperplasia (from ancient Greek ὑπέρ huper 'over' + πλάσις plasis 'formation'), or hypergenesis, is an enlargement of an organ or tissue caused by an increase in the amount of organic tissue that results from cell proliferation. It may lead to the gross enlargement of an organ, and the term is sometimes confused with benign neoplasia or benign tumor. Hyperplasia is a common preneoplastic response to stimulus. Microscopically, cells resemble normal cells but are increased in numbers. Sometimes cells may also be increased in size (hypertrophy). Hyperplasia is different from hypertrophy in that the adaptive cell change in hypertrophy is an increase in the size of cells, whereas hyperplasia involves an increase in the number of cells. Causes Hyperplasia may be due to any number of causes, including proliferation of basal layer of epidermis to compensate skin loss, chronic inflammatory response, hormonal dysfunctions, or compensation for damage or disease elsewhere. Hyperplasia may be harmless and occur on a particular tissue. An example of a normal hyperplastic response would be the growth and multiplication of milk-secreting glandular cells in the breast as a response to pregnancy, thus preparing for future breast feeding. Perhaps the most interesting and potent effect insulin-like growth factor 1 (IGF) has on the human body is its ability to cause hyperplasia, which is an actual splitting of cells. By contrast, hypertrophy is what occurs, for example, to skeletal muscle cells during weight training and is simply an increase in the size of the cells. With IGF use, one is able to cause hyperplasia which actually increases the number of muscle cells present in the tissue. Weight training enables these new cells to mature in size and strength. It is theorized that hyperplasia may also be induced through specific power output training for athletic performance, thus increasing the number of muscle fibers instead of increasing the size of a single fiber. Mechanism Hype
https://en.wikipedia.org/wiki/Dimension%20doubling%20theorem
In probability theory, the dimension doubling theorems are two results about the Hausdorff dimension of the image of a set under Brownian motion. At their core, both statements say that the dimension of a set under a Brownian motion doubles almost surely. The first result is due to Henry P. McKean Jr. and is hence called McKean's theorem (1955). The second theorem is a refinement of McKean's result and is called Kaufman's theorem (1969), since it was proven by Robert Kaufman. Dimension doubling theorems For a d-dimensional Brownian motion (B(t))_{t≥0} and a set A ⊆ [0, ∞), we define the image of A under B, i.e. B(A) := {B(t) : t ∈ A} ⊆ ℝᵈ. McKean's theorem Let (B(t))_{t≥0} be a Brownian motion in dimension d ≥ 2. Let A ⊆ [0, ∞), then dim B(A) = 2 dim A, ℙ-almost surely. Kaufman's theorem Let (B(t))_{t≥0} be a Brownian motion in dimension d ≥ 2. Then ℙ-almost surely, for any set A ⊆ [0, ∞), we have dim B(A) = 2 dim A. Difference of the theorems The difference between the theorems is the following: in McKean's result, the ℙ-null sets on which the statement fails depend on the choice of A. Kaufman's result, on the other hand, holds for all choices of A simultaneously. This means Kaufman's theorem can also be applied to random sets A. Literature
https://en.wikipedia.org/wiki/Line%20Printer%20Daemon%20protocol
The Line Printer Daemon protocol/Line Printer Remote protocol (or LPD, LPR) is a network printing protocol for submitting print jobs to a remote printer. The original implementation of LPD was in the Berkeley printing system in the BSD UNIX operating system; the LPRng project also supports that protocol. The Common Unix Printing System (or CUPS), which is more common on modern Linux distributions and also found on Mac OS X, supports LPD as well as the Internet Printing Protocol (IPP). Commercial solutions are available that also use Berkeley printing protocol components, where more robust functionality and performance is necessary than is available from LPR/LPD (or CUPS) alone (such as might be required in large corporate environments). The LPD Protocol Specification is documented in RFC 1179. Usage A server for the LPD protocol listens for requests on TCP port 515. A request begins with a byte containing the request code, followed by the arguments to the request, and is terminated by an ASCII LF character. An LPD printer is identified by the IP address of the server machine and the queue name on that machine. Many different queue names may exist in one LPD server, with each queue having unique settings. Note that the LPD queue name is case sensitive. Some modern implementations of LPD on network printers might ignore the case or queue name altogether and send all jobs to the same printer. Others have the option to automatically create a new queue when a print job with a new queue name is received. This helps to simplify the setup of the LPD server. Some companies (e.g. D-Link in model DP-301P+) have a tradition of calling the queue name “lpt1” or “LPT1”. A printer that supports LPD/LPR is sometimes referred to as a "TCP/IP printer" (TCP/IP is used to establish connections between printers and clients on a network), although that term would be equally applicable to a printer that supports the Internet Printing Protocol. See also Lp (Unix) LPRng Le
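The request framing described above is simple enough to sketch (an illustrative helper, not a full LPD client; the queue names are examples): one command-code byte, space-separated operands, and a terminating ASCII LF.

```python
def lpd_request(code, *operands):
    """Build an LPD daemon command per RFC 1179: a single command-code byte,
    operands separated by single spaces, terminated by an ASCII LF."""
    body = b" ".join(op.encode("ascii") for op in operands)
    return bytes([code]) + body + b"\n"

# Command 03 is "send queue state (short)"; the operand is the queue name,
# which, as noted above, is case sensitive.
req = lpd_request(3, "lp")
# req == b"\x03lp\n", the bytes a client would write to TCP port 515
```

A real client would open a TCP connection to port 515 on the print server, send these bytes, and read the server's response; subcommands within a job transfer (control file, data file) follow the same code-byte/operands/LF shape.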
https://en.wikipedia.org/wiki/International%20Young%20Physicists%27%20Tournament
The International Young Physicists' Tournament (IYPT), sometimes referred to as the “Physics World Cup”, is a scientific competition between teams of secondary school students. It mimics, as closely as possible, real-world scientific research and the process of presenting and defending the results obtained. Participants have almost a year to work on 17 open-ended inquiry problems that are published yearly in late July. Many of the problems involve easy-to-reproduce phenomena that present unexpected behaviour. The aim of the solutions is not to calculate or reach “the correct answer”, as there is no such notion here. The Tournament is instead conclusions-oriented: participants have to design and perform experiments, and to draw conclusions argued from the experiments’ outcomes. The competition itself is not a pen-and-paper competition but an enactment of a scientific discussion (or a defence of a thesis) in which participants take the roles of Reporter, Opponent and Reviewer, thus learning about peer review early on in their school years. Discussion-based sessions are called Physics Fights, and the performances of the teams are judged by expert physicists. Teams can take quite different routes to tackle the same problem. As long as they stay within the broadly defined statement of the problem, all routes are legitimate, and teams are judged according to the depth reached by their investigations. The IYPT is a week-long event in which currently around 150 international pre-university contestants participate. IYPT is associated with the European Physical Society (EPS), and in 2013 IYPT was awarded the medal of the International Union of Pure and Applied Physics (IUPAP) "in recognition of its inspiring and wide-ranging contribution to physics education that has touched many lives and countries, over the past 25 years". Tournament structure The most important structural parts of the IYPT are the Physics Fights. There are 5 Selective Fights and one Final F
https://en.wikipedia.org/wiki/J.%20Robert%20Oppenheimer%20Memorial%20Prize
The J. Robert Oppenheimer Memorial Prize and Medal was awarded by the Center for Theoretical Studies, University of Miami, from 1969 until 1984. Established in memory of US physicist J. Robert Oppenheimer, the award consisted of a medal, a certificate and a $1,000 honorarium. It was awarded for "outstanding contributions to the theoretical natural sciences [...] during the preceding decade". The acceptance speech for the inaugural award to Dirac was published as The Development of Quantum Theory (1971). Recipients
1969 – Paul Dirac
1970 – Freeman Dyson
1971 – Abdus Salam
1972 – Robert Serber
1973 – Steven Weinberg
1974 – Edwin Ernest Salpeter
1975 – Nicholas Kemmer
1976 – Yoichiro Nambu
1977 – Feza Gursey and Sheldon Glashow
1978 – Jocelyn Bell Burnell
1979 – Abraham Pais
1980 – Richard H. Dalitz
1981 – Frederick Reines
1982 – Maurice Goldhaber and Robert E. Marshak
1983 – Victor F. Weisskopf
1984 – John Archibald Wheeler
See also List of physics awards
https://en.wikipedia.org/wiki/Xerox%20Alto
The Xerox Alto is a computer that was designed from its inception to support an operating system based on a graphical user interface (GUI), later using the desktop metaphor. The first machines were introduced on 1 March 1973, a decade before mass-market GUI machines became available. The Alto is contained in a relatively small cabinet and uses a custom central processing unit (CPU) built from multiple SSI and MSI integrated circuits. Each machine cost tens of thousands of dollars despite its status as a personal computer. Only small numbers were built initially, but by the late 1970s, about 1,000 were in use at various Xerox laboratories, and about another 500 in several universities. Total production was about 2,000 systems. The Alto became well known in Silicon Valley and its GUI was increasingly seen as the future of computing. In 1979, Steve Jobs arranged a visit to Xerox PARC, during which Apple Computer personnel would receive demonstrations of Xerox technology in exchange for Xerox being able to purchase stock options in Apple. After two visits to see the Alto, Apple engineers used the concepts in developing the Apple Lisa and Macintosh systems. Xerox eventually commercialized a heavily modified version of the Alto concepts as the Xerox Star, first introduced in 1981. A complete office system including several workstations, storage and a laser printer cost as much as $100,000, and like the Alto, the Star had little direct impact on the market. History The first computer with a graphical operating system, the Alto built on earlier graphical interface designs. It was conceived in 1972 in a memo written by Butler Lampson, inspired by the oN-Line System (NLS) developed by Douglas Engelbart and Dustin Lindberg at SRI International (SRI). Of further influence was the PLATO education system developed at the Computer-based Education Research Laboratory at the University of Illinois. The Alto was designed mostly by Charles P. Thacker. Industrial Design and manufa
https://en.wikipedia.org/wiki/Agrifirm
Agrifirm is a cooperative enterprise in which more than 10,000 Dutch farmers and horticulturalists have combined their purchasing power. Agrifirm in its current form was founded in 2010 through a succession of mergers of regional cooperatives. The enterprise operates as a link for farmers and is currently active throughout the entire Netherlands. The cooperative also has business units in Belgium, Germany, France, Spain, Romania, Hungary, Poland, Serbia, Ukraine, Brazil, Uruguay and China. The head office is located in Apeldoorn, Netherlands. History Agrifirm till 2011 Agrifirm grew out of the Cooperative Agricultural Bank and Trading Company (Coöperatieve Landbouwersbank en Handelsvereeniging), established on 13 April 1909 with the purpose of taking advantage of joint purchasing and sales of agricultural products and equipment. The office was located in Meppel. Later, it was expanded with a feed mill, grain silos and fertilizer sheds. It also operated as a bank and ran a plant for artificial insemination. The original function of the company was expanded through several mergers in the 1960s and 1970s. At that time the name of the company was Cooperative Agricultural Bank Meppel (Coöperatieve Landbouwersbank Meppel - CLM). In the 1980s the bank division became an independent operating unit as Rabobank Meppel and surroundings. In 1990 a merger among CLM, Aceco (Groningen) and CAF (Leeuwarden) resulted in the Purchase and Sales Cooperative Meppel (Aan- en verkoop Coöperatie Meppel - ACM). The "old" Agrifirm - which covered two-thirds of the Netherlands in terms of providing agricultural products and services (excluding the southern Netherlands) - was established through the merger between ACM and Cavo Latuco (in Utrecht) in 2002. Agrifirm has existed in its current form since 1 June 2010, following the merger between the Agrifirm cooperative (in Meppel) and Cehave Landbouwbelang (in Veghel); the latter operated in the southern Netherlands. The head
https://en.wikipedia.org/wiki/LearnedLeague
LearnedLeague is a web-based, invitation-only global quiz league operated by Seattle-based software engineer Shayne Bushfield under the pseudonym "Thorsten A. Integrity". As of August 2023, it has over 28,000 members worldwide. Structure Players are organized into leagues with nonspecific geographic designations like "Central" and "Frontier". Players in each league are then sorted into "rundles" based on past performance (all first-time players begin in special rookie rundles). A promotion-and-relegation system is used: a player can move up to higher rundles by finishing at or near the top of a lower one, or move down to lower rundles by finishing at or near the bottom of an upper one. The top players in each league compete annually for the title of LearnedLeague Champion. Gameplay Regular season Each calendar year is divided into four seasons. Each season includes 25 match days—essentially one per U.S. business day. Players are paired against each other each day during the season and compete in a six-question trivia match, with questions from 18 categories ranging from world history, science, and geography to lifestyle, food/drink, and television. Each player attempts to answer as many questions correctly as possible ("offense") and assigns point values to each question ("defense"). Players must assign one question a value of 3 points, two questions values of 2 points, two questions values of 1 point, and one question a value of 0 points (allowing a maximum score per player of 9 points). A player's opponent will get the assigned point value if they answer correctly. Since the past performance of all players based on subject matter is openly available, defense is an important factor in gameplay. Answers must be submitted by 10 PM Pacific Time. Results from the previous day along with the new set of questions are released each Match Day by midnight Pacific time. MiniLeagues and One-Day Specials Between regular seasons, a number of optional multiday and single-day
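The scoring scheme above is simple enough to sketch in code. This is an illustration of the stated rules only (the function names are mine, not LearnedLeague's): a player distributes the values 3, 2, 2, 1, 1, 0 across the six questions, and the opponent earns the assigned value for each question answered correctly.

```python
from collections import Counter

# Point values a player must distribute across the six questions when
# playing defense: one 3, two 2s, two 1s, and one 0 (maximum total of 9).
REQUIRED_VALUES = Counter({3: 1, 2: 2, 1: 2, 0: 1})

def match_score(defense, correct):
    """Points an opponent earns in one match.

    defense -- list of six ints, a permutation of [3, 2, 2, 1, 1, 0]
    correct -- list of six bools, True where the opponent answered correctly
    """
    if Counter(defense) != REQUIRED_VALUES:
        raise ValueError("defense must assign exactly the values 3,2,2,1,1,0")
    return sum(v for v, ok in zip(defense, correct) if ok)

# An opponent who answers every question correctly scores the maximum of 9:
print(match_score([3, 2, 2, 1, 1, 0], [True] * 6))  # 9
# Effective defense: the opponent only got the 3- and 0-point questions right.
print(match_score([3, 2, 2, 1, 1, 0], [True, False, False, False, False, True]))  # 3
```

The second example shows why defense matters: the same two correct answers can be worth anywhere from 1 to 5 points depending on how the opponent assigned values.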
https://en.wikipedia.org/wiki/Death%20by%20vending%20machine
Vending machines being rocked or tilted have been known to cause serious injury and death when the heavy machines fall over. Users may rock machines in order to obtain free products, release stuck products, or obtain change ("Soft-drink machine crushes woman who kicked it", Sun Journal (Associated Press), 1 January 1992). The U.S. Consumer Product Safety Commission found in a 1995 study that at least 37 deaths and 113 injuries had occurred due to falling vending machines from 1978 to 1995. This resulted in a voluntary campaign from vending machine manufacturers to warn that rocking or tilting the machines could cause serious injury or death, including placing warning labels on all machines (Morford, Mark, "Death By Vending Machine / Warning: Large heavy appliances can be hazardous to your health", San Francisco Chronicle, 18 July 2001). The U.S. military started putting warning labels on machines in the late 1980s after a number of incidents on military installations ("Newest safety tip: Tilting a vending machine can be hazardous to your health", Toledo Blade, 2 January 1996). The vast majority of injuries and deaths have happened to men. The argument that death by a vending machine is more likely to occur than something like winning the Powerball lottery has drawn more attention to these unusual deaths. One 2012 report states that the odds of winning Powerball are 1 in 175 million, versus 1 in 112 million of getting killed by a vending machine. The deaths have also at times been associated with "Darwin Awards".
https://en.wikipedia.org/wiki/Divide%20and%20choose
Divide and choose (also Cut and choose or I cut, you choose) is a procedure for fair division of a continuous resource, such as a cake, between two parties. It involves a heterogeneous good or resource ("the cake") and two partners who have different preferences over parts of the cake. The protocol proceeds as follows: one person ("the cutter") cuts the cake into two pieces; the other person ("the chooser") selects one of the pieces; the cutter receives the remaining piece. The procedure has been used since ancient times to divide land, cake and other resources between two parties. Currently, there is an entire field of research, called fair cake-cutting, devoted to various extensions and generalizations of cut-and-choose. History Divide and choose is mentioned in the Bible, in the Book of Genesis (chapter 13). When Abraham and Lot come to the land of Canaan, Abraham suggests that they divide it among them. Then Abraham, coming from the south, divides the land to a "left" (northern) part and a "right" (southern) part, and lets Lot choose. Lot chooses the eastern part which contains Sodom and Gomorrah, and Abraham is left with the western part which contains Beer Sheva, Hebron, Bethel, and Shechem. The United Nations Convention on the Law of the Sea applies a procedure similar to divide-and-choose for allocating areas in the ocean among countries. A developed state applying for a permit to mine minerals from the ocean must prepare two areas of approximately similar value, let the UN authority choose one of them for reservation to developing states, and get the other area for mining:"Each application... shall cover a total area... sufficiently large and of sufficient estimated commercial value to allow two mining operations... of equal estimated commercial value... Within 45 days of receiving such data, the Authority shall designate which part is to be reserved solely for the conduct of activities by the Authority through the Enterprise or in association with devel
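The protocol's fairness guarantee can be illustrated with a small simulation. This is a sketch under simplifying assumptions of my own (not from the article): the cake is modelled as discrete slices, each party's valuation is a list of slice values normalised to sum to 1, the cutter picks the split point that equalises the two pieces as nearly as possible in the cutter's own eyes, and the chooser takes the piece they value more. With a truly divisible cake the cutter can equalise exactly, so each party receives at least half by their own measure; the discrete version approximates this.

```python
def divide_and_choose(cutter_vals, chooser_vals):
    """Return (cutter_share, chooser_share), each measured by the
    recipient's own valuation of the piece they end up with."""
    total = sum(cutter_vals)
    # The cutter chooses the cut point making the two pieces as equal
    # as possible according to the cutter's own valuation.
    best_k, best_gap = 0, float("inf")
    for k in range(len(cutter_vals) + 1):
        left = sum(cutter_vals[:k])
        gap = abs(left - (total - left))
        if gap < best_gap:
            best_k, best_gap = k, gap
    k = best_k
    # The chooser takes whichever piece they value more.
    left_c, right_c = sum(chooser_vals[:k]), sum(chooser_vals[k:])
    if left_c >= right_c:
        return sum(cutter_vals[k:]), left_c
    return sum(cutter_vals[:k]), right_c

# The cutter values slices uniformly; the chooser only cares about the ends.
c, h = divide_and_choose([0.25] * 4, [0.4, 0.1, 0.1, 0.4])
print(c, h)  # 0.5 0.5 -- each party gets at least half by their own measure
```

Note that the division is proportional (each party gets at least half by their own valuation) and envy-free for two players, which is exactly what makes cut-and-choose the canonical starting point for fair cake-cutting.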
https://en.wikipedia.org/wiki/Delone%20set
In the mathematical theory of metric spaces, ε-nets, ε-packings, ε-coverings, uniformly discrete sets, relatively dense sets, and Delone sets (named after Boris Delone) are several closely related definitions of well-spaced sets of points, and the packing radius and covering radius of these sets measure how well-spaced they are. These sets have applications in coding theory, approximation algorithms, and the theory of quasicrystals. Definitions If (M,d) is a metric space, and X is a subset of M, then the packing radius of X is half of the infimum of distances between distinct members of X. If the packing radius is r, then open balls of radius r centered at the points of X will all be disjoint from each other, and each open ball centered at one of the points of X with radius 2r will be disjoint from the rest of X. The covering radius of X is the infimum of the numbers r such that every point of M is within distance r of at least one point in X; that is, it is the smallest radius such that closed balls of that radius centered at the points of X have all of M as their union. An ε-packing is a set X of packing radius ≥ ε/2 (equivalently, minimum distance ≥ ε), an ε-covering is a set X of covering radius ≤ ε, and an ε-net is a set that is both an ε-packing and an ε-covering. A set is uniformly discrete if it has a nonzero packing radius, and relatively dense if it has a finite covering radius. A Delone set is a set that is both uniformly discrete and relatively dense; thus, every ε-net is Delone, but not vice versa. Construction of ε-nets As the most restrictive of the definitions above, ε-nets are at least as difficult to construct as ε-packings, ε-coverings, and Delone sets. However, whenever the points of M have a well-ordering, transfinite induction shows that it is possible to construct an ε-net N, by including in N every point for which the infimum of distances to the set of earlier points in the ordering is at least ε. For finite sets of points in a Euclidean s
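For finite point sets in the Euclidean plane, the definitions above can be checked directly, and the greedy construction (keep each point whose distance to all previously kept points is at least ε) is the finite, well-ordered case of the transfinite-induction argument. A minimal illustrative sketch:

```python
import math
from itertools import combinations

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def packing_radius(X):
    """Half the infimum (here: minimum) distance between distinct points of X."""
    return min(dist(p, q) for p, q in combinations(X, 2)) / 2

def covering_radius(X, M):
    """Smallest r such that every point of the ambient set M is within r of X."""
    return max(min(dist(m, x) for x in X) for m in M)

def greedy_eps_net(M, eps):
    """Scan M in order, keeping each point at distance >= eps from all
    points kept so far; the result is an eps-packing, and every rejected
    point is within eps of a kept one, so it is also an eps-covering."""
    net = []
    for m in M:
        if all(dist(m, x) >= eps for x in net):
            net.append(m)
    return net

# Ambient space: a 10x10 integer grid.
M = [(i, j) for i in range(10) for j in range(10)]
N = greedy_eps_net(M, 3.0)
assert packing_radius(N) >= 1.5      # a 3-packing: pairwise distances >= 3
assert covering_radius(N, M) <= 3.0  # a 3-covering of the grid
```

The two assertions together verify that the greedy result is a 3-net of the grid, hence (being uniformly discrete and relatively dense) a Delone set within this finite model.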
https://en.wikipedia.org/wiki/C%C3%A9cile%20La%20Grenade
Dame Cécile Ellen Fleurette La Grenade (born 30 December 1952) is a Grenadian food scientist who has served as Governor-General of Grenada since 7 May 2013. Early life and career La Grenade was born in La Borie, located in Saint George Parish, Grenada. She is the third of five daughters born to Allan A. La Grenade, a civil servant, and Sibyl Sylvester-La Grenade, a registered nurse. Her maternal grandmother, Mary Louise "Eva" Ollivierre-Sylvester, was the first woman in the British Windward Islands to serve on the legislative council of her country, after being elected in 1952, less than a year after universal suffrage. She is a paternal first cousin of Maurice Bishop, the Prime Minister of Grenada during the 1979–83 People's Revolutionary Government. La Grenade is a food scientist trained in the United States. She holds a bachelor's degree in chemistry from the University of the West Indies, as well as a master's degree and doctorate in food science from the University of Maryland at College Park. She became a member of the Caribbean Export Development Agency's Steering Committee, serving as the OECS Private Sector Representative, in 2006. In 2007 she was appointed Chairman of the Public Service Commission by Governor-General Sir Daniel Williams, a position she held until 2010. Governor-General of Grenada La Grenade's appointment as governor-general of Grenada was announced during the same week as the death of Grenada's first female governor, Dame Hilda Bynoe. Bynoe was the goddaughter of La Grenade's maternal grandfather, Dr Cyril I. St Bernard Sylvester, an educator and public servant. La Grenade was made a Dame Grand Cross of the Order of Saint Michael and Saint George by Queen Elizabeth II on 11 July 2014. As governor-general, La Grenade open
https://en.wikipedia.org/wiki/Wet-bulb%20temperature
The wet-bulb temperature (WBT) is the temperature read by a thermometer covered in water-soaked (water at ambient temperature) cloth (a wet-bulb thermometer) over which air is passed. At 100% relative humidity, the wet-bulb temperature is equal to the air temperature (dry-bulb temperature); at lower humidity the wet-bulb temperature is lower than dry-bulb temperature because of evaporative cooling. The wet-bulb temperature is defined as the temperature of a parcel of air cooled to saturation (100% relative humidity) by the evaporation of water into it, with the latent heat supplied by the parcel. A wet-bulb thermometer indicates a temperature close to the true (thermodynamic) wet-bulb temperature. The wet-bulb temperature is the lowest temperature that can be reached under current ambient conditions by the evaporation of water only. Even heat-adapted people cannot carry out normal outdoor activities past a wet-bulb temperature of , equivalent to a heat index of . A reading of – equivalent to a heat index of – is considered the theoretical human survivability limit for up to six hours of exposure. Intuition If a thermometer is wrapped in a water-moistened cloth, it will behave differently. The drier and less humid the air is, the faster the water will evaporate. The faster water evaporates, the lower the thermometer's temperature will be relative to air temperature. Water can evaporate only if the air around it can absorb more water. This is measured by comparing how much water is in the air to the maximum that could be in the air—the relative humidity. 0% means the air is completely dry, and 100% means the air contains all the water it can hold in the present circumstances and it cannot absorb any more water (from any source). This is part of the cause of apparent temperature in humans. The drier the air, the more moisture it can take up beyond what is already in it, and the easier it is for extra water to evaporate. The result is that sweat evaporates more
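The wet-bulb temperature can be estimated from dry-bulb temperature and relative humidity. One widely used empirical fit (Stull, 2011), which is an approximation I am adding here rather than a formula from the article, is valid roughly for temperatures of −20 to 50 °C and relative humidities of about 5 to 99%:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    t_c (deg C) and relative humidity rh_pct (in percent, 0-100),
    using Stull's 2011 empirical fit."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

print(round(wet_bulb_stull(20.0, 50.0), 1))   # ~13.7: evaporative cooling pulls it below the dry bulb
print(round(wet_bulb_stull(25.0, 100.0), 1))  # ~25.0: at saturation, wet bulb ~= dry bulb
```

The two examples show the behaviour described above: at 100% relative humidity the wet-bulb temperature essentially equals the air temperature, while drier air gives a markedly lower wet-bulb reading.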
https://en.wikipedia.org/wiki/Forbidden%20Flowers
Forbidden Flowers by Nancy Friday is a book which explores women's sexual fantasies. It can be read as a feminist analysis of the development of women's fantasies against a background of sexual liberation, or simply as a series of candid, erotic fantasies. It is part of a series of three books beginning in 1968 with My Secret Garden, with this second book, its sequel, begun in 1973. It compares how the fantasies have changed over the five-year period, and notes women's reactions to the first book. A third book, Women on Top, was published in 1991. In addition to this series, Friday has written a number of other books examining male and female sexuality. These are entitled My Mother/My Self, Men in Love, Jealousy, and The Power of Beauty. The origins and uses of women's fantasies Forbidden Flowers is divided into two parts, the first part examining where sexual fantasies come from, looking at the influences of childhood and adolescence, women looking at men, and sexual frustration. The second part looks at the uses of these fantasies, in daydreaming and masturbation, during sex, and making fantasies come true. Each of the eight chapters is illustrated by examples of fantasies from women who have written to Friday.
https://en.wikipedia.org/wiki/Cubic%20graph
In the mathematical field of graph theory, a cubic graph is a graph in which all vertices have degree three. In other words, a cubic graph is a 3-regular graph. Cubic graphs are also called trivalent graphs. A bicubic graph is a cubic bipartite graph. Symmetry In 1932, Ronald M. Foster began collecting examples of cubic symmetric graphs, forming the start of the Foster census. Many well-known individual graphs are cubic and symmetric, including the utility graph, the Petersen graph, the Heawood graph, the Möbius–Kantor graph, the Pappus graph, the Desargues graph, the Nauru graph, the Coxeter graph, the Tutte–Coxeter graph, the Dyck graph, the Foster graph and the Biggs–Smith graph. W. T. Tutte classified the symmetric cubic graphs by the smallest integer number s such that each two oriented paths of length s can be mapped to each other by exactly one symmetry of the graph. He showed that s is at most 5, and provided examples of graphs with each possible value of s from 1 to 5. Semi-symmetric cubic graphs include the Gray graph (the smallest semi-symmetric cubic graph), the Ljubljana graph, and the Tutte 12-cage. The Frucht graph is one of the five smallest cubic graphs without any symmetries: it possesses only a single graph automorphism, the identity automorphism. Coloring and independent sets According to Brooks' theorem every connected cubic graph other than the complete graph K4 has a vertex coloring with at most three colors. Therefore, every connected cubic graph other than K4 has an independent set of at least n/3 vertices, where n is the number of vertices in the graph: for instance, the largest color class in a 3-coloring has at least this many vertices. According to Vizing's theorem every cubic graph needs either three or four colors for an edge coloring. A 3-edge-coloring is known as a Tait coloring, and forms a partition of the edges of the graph into three perfect matchings. By Kőnig's line coloring theorem every bicubic graph has a Tait colorin
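Both the 3-regularity condition and the independent-set consequence of Brooks' theorem are easy to verify computationally on a small example. The sketch below builds the Petersen graph from its standard construction (outer 5-cycle, inner pentagram, five spokes) using only the standard library:

```python
from itertools import combinations

# The Petersen graph: outer 5-cycle on vertices 0-4, inner pentagram on
# 5-9, and spokes joining i to i+5.
edges = {frozenset(e) for e in
         [(i, (i + 1) % 5) for i in range(5)]            # outer cycle
         + [(5 + i, 5 + (i + 2) % 5) for i in range(5)]  # inner pentagram
         + [(i, i + 5) for i in range(5)]}               # spokes

# Every vertex has degree 3, so the graph is cubic (3-regular).
degree = {v: sum(v in e for e in edges) for v in range(10)}
assert all(d == 3 for d in degree.values())

# Brooks' theorem implies a connected cubic graph other than K4 has an
# independent set of at least n/3 vertices; brute force confirms it here.
def is_independent(S):
    return all(frozenset(p) not in edges for p in combinations(S, 2))

best = max((S for r in range(11) for S in combinations(range(10), r)
            if is_independent(S)), key=len)
print(len(best))  # 4, comfortably above 10/3
```

The brute-force search over all 2^10 vertex subsets is only feasible because the example is tiny; for larger cubic graphs the n/3 bound is obtained constructively from a 3-coloring, as the text explains.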
https://en.wikipedia.org/wiki/Zfp28%20zinc%20finger%20protein
ZFP28 zinc finger protein is a protein that in humans is encoded by the ZFP28 gene.
https://en.wikipedia.org/wiki/Hypersonic%20effect
The hypersonic effect is a phenomenon reported in a controversial scientific study by Tsutomu Oohashi et al., which claims that, although humans cannot consciously hear ultrasound (sounds at frequencies above approximately 20 kHz), the presence or absence of those frequencies has a measurable effect on their physiological and psychological reactions. Numerous other studies have contradicted the portion of the results relating to the subjective reaction to high-frequency audio, finding that people who have "good ears" listening to Super Audio CDs and high resolution DVD-Audio recordings on high fidelity systems capable of reproducing sounds up to 30 kHz cannot tell the difference between high resolution audio and the normal CD sampling rate of 44.1 kHz. Favoring evidence In research published in 2000 in the Journal of Neurophysiology, researchers described a series of objective and subjective experiments in which subjects were played music, sometimes containing high-frequency components (HFCs) above 25 kHz and sometimes not. The subjects could not consciously tell the difference, but when played music with the HFCs they showed differences measured in two ways: EEG monitoring of their brain activity showed statistically significant enhancement in alpha-wave activity, and the subjects preferred the music with the HFCs. No effect was detected on listeners in the study when only the ultrasonic (frequencies higher than 24 kHz) portion of the test material was played for test subjects; the demonstrated effect was only present when comparing full-bandwidth to bandwidth-limited material. It is a common understanding in psychoacoustics that the ear cannot respond to sounds at such high frequency via an air-conduction pathway, so one question that this research raised was: does the hypersonic effect occur via the "ordinary" route of sound travelling through the air passage in the ear, or in some other way? A peer-reviewed study in 2006 seemed to confirm the second of these
https://en.wikipedia.org/wiki/Oleiharenicola
Oleiharenicola is a genus of bacteria from the family of Opitutaceae.
https://en.wikipedia.org/wiki/Pere%20Alberch
Pere Alberch Vie (2 November 1954, Badalona – 13 March 1998, Madrid) was a Spanish naturalist, biologist and embryologist. He was a professor at Harvard University from 1980 to 1989, and director of the Museo Nacional de Ciencias Naturales, Madrid. He studied in the United States, earning a bachelor's degree from the University of Kansas (1976) and a PhD from the University of California, Berkeley (1980). Biography In 1976 he graduated after studying Biology and Environmental Sciences at the University of Kansas. Four years later he obtained a PhD in Zoology at the University of California. Between 1980 and 1989 he worked as both a biology professor at Harvard University and as a curator of herpetology at the Museum of Comparative Zoology. He worked as an editor for magazines such as Trends in Ecology and Evolution (since 1993), Biodiversity Letters (since 1992), Journal of Theoretical Biology (since 1985) and Journal of Evolutionary Biology (1986-1991). In 1989 he returned to Spain as a research professor for the Spanish National Research Council and carried out an important role as director of the Museo Nacional de Ciencias Naturales. In 1998 the Cavanilles Institute of Biodiversity and Evolutional Biology, a new research center located in Valencia, was interested in including Alberch to its staff. He died on March 13, 1998, at the age of 43. Works
https://en.wikipedia.org/wiki/DockNET
DockNET is an intelligent control system for cargo handling developed by Portsystem in Sweden. The control system was created to analyze, communicate, supervise, direct, record and document everything that happens in and around a loading-door opening in a modern cargo terminal. Cargo terminals often suffer damage to loading houses and vehicles when trucks drive into and away from the loading house. This is usually not simply driver error, but a lack of communication between the driver and the loading dock. DockNET addresses this lack of communication and, during operation, secures both the working environment and the cargo handling. The result is increased efficiency, with technology making everything around the loading space measurable. See also Logistics
https://en.wikipedia.org/wiki/Johnny%20Canuck
Johnny Canuck is a Canadian cartoon hero and superhero who was created as a political cartoon in 1869 and was later re-invented as a Second World War action hero in 1942. The Vancouver Canucks, a professional ice hockey team in the National Hockey League (NHL), currently use a hockey playing "Johnny Canuck" logo as one of their team logos. In addition, the Vancouver Canucks' American Hockey League affiliate, the Abbotsford Canucks, use it as their main logo. Political cartoon Johnny Canuck is a fictional lumberjack and a national personification of Canada. He first appeared in early political cartoons dating to 1869 where he was portrayed as a younger cousin of the United States' Uncle Sam and Britain's John Bull. Dressed as a habitant, farmer, logger, rancher or soldier, he was characterized as wholesome and simple-minded and was often depicted resisting the bullying of John Bull or Uncle Sam. He appeared regularly in editorial cartoons for 30 years before declining in usage in the early twentieth century. Comic book hero The character re-emerged during World War II in the February 1942 issue of Bell Features' Dime Comics #1. Cartoonist Leo Bachle created the character as a teenager, apparently on a challenge from a Bell executive. Initially, Johnny Canuck had no superpowers. Johnny Canuck's cartoon exploits helped Canada fight against Nazism. Like Captain America, he met Adolf Hitler and almost single-handedly ended the war. The use of such stock figures diminished in popularity after World War II, but in 1975, a new comic book character, Captain Canuck, emerged. Created by Richard Comely (who at the time was unaware of the earlier Johnny Canuck character), Captain Canuck was a costumed superhero rather than just a hero, and he wore red and white tights and bore a red maple leaf emblazoned on the forehead of his mask. In 1995, Canada Post issued a series of Canadian postage stamps celebrating Canada's comic-book superheroes. Johnny Canuck is depicted as he app
https://en.wikipedia.org/wiki/Meta-regression
Meta-regression is defined to be a meta-analysis that uses regression analysis to combine, compare, and synthesize research findings from multiple studies while adjusting for the effects of available covariates on a response variable. A meta-regression analysis aims to reconcile conflicting studies or corroborate consistent ones; a meta-regression analysis is therefore characterized by the collated studies and their corresponding data sets—whether the response variable is study-level (or equivalently aggregate) data or individual participant data (or individual patient data in medicine). A data set is aggregate when it consists of summary statistics such as the sample mean, effect size, or odds ratio. On the other hand, individual participant data are in a sense raw in that all observations are reported with no abridgment and therefore no information loss. Aggregate data are easily compiled through internet search engines and therefore not expensive. However, individual participant data are usually confidential and are only accessible within the group or organization that performed the studies. Although meta-analysis for observational data is also under extensive research, the literature still largely centers around combining randomized controlled trials (RCTs). In RCTs, a study typically includes a trial that consists of arms. An arm refers to a group of participants who received the same therapy, intervention, or treatment. A meta-analysis with some or all studies having more than two arms is called network meta-analysis, indirect meta-analysis, or a multiple treatment comparison. Despite also being an umbrella term, meta-analysis sometimes implies that all included studies have strictly two arms each—same two treatments in all trials—to distinguish itself from network meta-analysis. A meta-regression can be classified in the same way—meta-regression and network meta-regression—depending on the number of distinct treatments in the regression analysis. Meta-analy
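An aggregate-data meta-regression can be sketched with a minimal example. This is an illustrative fixed-effect sketch of my own, not the method of any particular package: each study contributes an effect size and its variance, studies are weighted by inverse variance, and the effect sizes are regressed on one study-level covariate by weighted least squares.

```python
def meta_regression(effects, variances, covariate):
    """Fixed-effect meta-regression of study-level effect sizes on a
    single study-level covariate via inverse-variance weighted least
    squares:  effect_i ~ b0 + b1 * covariate_i,  weight_i = 1/variance_i.
    Returns the intercept and slope (b0, b1)."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, covariate))
    swy = sum(wi * y for wi, y in zip(w, effects))
    swxx = sum(wi * x * x for wi, x in zip(w, covariate))
    swxy = sum(wi * x * y for wi, x, y in zip(w, covariate, effects))
    b1 = (sw * swxy - swx * swy) / (sw * swxx - swx ** 2)
    b0 = (swy - b1 * swx) / sw
    return b0, b1

# Hypothetical aggregate data: five trials whose effect sizes happen to
# follow effect = 0.10 + 0.02 * dose exactly, with differing precisions.
dose = [0.0, 5.0, 10.0, 15.0, 20.0]
effect = [0.10 + 0.02 * d for d in dose]
var = [0.04, 0.02, 0.05, 0.01, 0.03]
b0, b1 = meta_regression(effect, var, dose)
print(round(b0, 3), round(b1, 3))  # 0.1 0.02
```

Real meta-regressions would add a random-effects (between-study heterogeneity) component and standard errors for the coefficients; the sketch shows only the core weighting-and-regression step that distinguishes meta-regression from an unweighted fit.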
https://en.wikipedia.org/wiki/Alinea%20%28restaurant%29
Alinea is a restaurant in Chicago, Illinois, United States. In 2010, Alinea was awarded three stars by the Michelin Guide. Since Grace closed on December 20, 2017, Alinea has been the only Chicago restaurant with three Michelin stars. History The restaurant opened on May 4, 2005, and takes its name from the symbol alinea, which is featured as a logo. Co-owner Nick Kokonas wrote of the restaurant's name: Alinea literally means "off the line". The restaurant's symbol, more commonly known as the pilcrow, indicates the beginning of a new train of thought, or a new paragraph. There's a double meaning: on one hand, Alinea claims to represent a new train of thought about food, but as a restaurant, everything still has to come "off the line". In October 2008, chef and owner Grant Achatz and co-author Kokonas published Alinea, a hardcover coffee-table book featuring more than 100 of the restaurant's recipes. In January 2016, the Alinea Group, the owner of Alinea, bought Moto restaurant in Chicago. On January 1, 2016, Alinea closed temporarily for renovations. The restaurant planned to operate pop-up restaurants worldwide before reopening on May 20, 2016, after an extensive remodel and overhaul of the menu. In May 2016, Alinea and its chef and owner Grant Achatz were featured in the Netflix show Chef's Table, which highlighted the innovative food of Alinea and Achatz's journey to renowned chef. In 2020, Alinea served diners from a rooftop when all indoor dining was closed in Illinois during the coronavirus pandemic. The menu included a canapé shaped like the SARS‑CoV‑2 virus. Alinea Group co-owner Nick Kokonas stated the appetizer was "meant to provoke discomfort, conversation, and awareness", but some diners described it as "tacky", "disrespectful", and "insensitive". Awards and honors Alinea received the AAA Five Diamond Award, the highest level of recognition given by the AAA, from 2007 to 2017. It ranked ninth on the S. Pellegrino World's 50 Bes
https://en.wikipedia.org/wiki/Dyne%3Abolic
dyne:bolic GNU/Linux is a Live CD/DVD distribution based on the Linux kernel. It is shaped by the needs of media activists, artists, and creators to be a practical tool with a focus on multimedia production, and it delivers a large assortment of applications. It allows manipulation and broadcast of both sound and video, with tools to record, edit, encode, and stream. In addition to multimedia-specific programs, dyne:bolic also provides word processors and common desktop computing tools. Termed "Rastasoft" by its author, it is based entirely on free software and, as such, is recommended and endorsed by the GNU Project. dyne:bolic is created by volunteers and by its author and maintainer, Jaromil, who also included multimedia tools such as MusE, HasciiCam, and FreeJ in the distribution.

Live CD/DVD

dyne:bolic is intended to be used as a Live CD/DVD. It does not require installation to a hard drive, and it attempts to recognize most devices and peripherals (sound, video, TV, etc.) automatically. It is designed to work with older and slower computers, with its kernel optimized for low latency and performance, making the distribution suitable for audio and video production and for turning PCs into full media stations. For that reason, the included software is sometimes not the newest version available.

Modules

dyne:bolic can be extended by downloading extra modules, such as development tools or common software like OpenOffice.org. These are SquashFS files placed in the directory of a dock (see below) or on a burnt CD, and they are automatically integrated at boot.

System requirements

Basic system requirements for versions 1.x and 2.x were relatively low: a PC with a Pentium or AMD K5 (i586) class CPU, 64 MB of RAM, and an IDE CD-ROM drive is sufficient. Some versions of dyne:bolic 1.x were ported by co-developer Smilzo for use on the Xbox game console, and multiple Xbox installations could be clustered. Console installation and clustering are not supported by version 2.x and up.

Vers
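The module mechanism described above amounts to dropping a SquashFS image into a directory that the system scans at boot. The sketch below illustrates the idea in a temporary directory; the `dyne/modules` path and the `.dyne` file name are illustrative assumptions, not the distribution's documented layout.

```shell
#!/bin/sh
# Illustrative sketch only: dyne:bolic integrates any SquashFS module it
# finds in the dock directory at boot. The directory layout used here
# (dyne/modules) and the module file name are assumptions for this demo.
DOCK="${TMPDIR:-/tmp}/dyne_dock_demo/dyne/modules"
mkdir -p "$DOCK"

# Simulate downloading an extra module (e.g. development tools) by
# creating a placeholder file where the real SquashFS image would go.
touch "$DOCK/devtools.dyne"

# At boot, dyne:bolic would scan this directory and mount each module;
# here we simply list what would be picked up.
ls "$DOCK"
```

No installation step is involved: removing the file from the dock removes the module on the next boot, which fits the distribution's no-hard-drive-install design.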
https://en.wikipedia.org/wiki/Upstream%20activating%20sequence
An upstream activating sequence or upstream activation sequence (UAS) is a cis-acting regulatory sequence. It is distinct from the promoter and increases the expression of a neighbouring gene. Because of its essential role in activating transcription, the upstream activating sequence is often considered analogous in function to the enhancer in multicellular eukaryotes. Upstream activating sequences are a crucial part of induction, enhancing the expression of the protein of interest through increased transcriptional activity. The upstream activating sequence is found immediately upstream of a minimal promoter (TATA box) and serves as a binding site for transactivators. If the transcriptional transactivator does not bind to the UAS in the proper orientation, transcription cannot begin. To understand the function of an upstream activating sequence, it helps to see its role in the cascade of events that leads to transcription activation. The pathway begins when activators bind to their target at the UAS, recruiting a mediator. A TATA-binding protein subunit of a transcription factor then binds to the TATA box, recruiting additional transcription factors. The mediator then recruits RNA polymerase II to the pre-initiation complex. Once transcription is initiated, RNA polymerase II is released from the complex and transcription proceeds.

Examples

GAL1-GAL10 intergenic region (UAS)

The ability of the GAL1-GAL10 intergenic region to bind the GAL4 protein is utilised in the GAL4/UAS technique for controlled gene mis-expression in Drosophila. This is the most popular form of binary expression in Drosophila melanogaster, a system which has been adapted for many uses and has helped make Drosophila melanogaster one of the most genetically tractable multicellular organisms. In this technique, four related binding sites between the GAL10 and GAL1 loci in Saccharomyces cerevisiae serve as an upstream activating sequence (UAS) element through GAL4 binding. Several studies have been conducted with Saccharo
https://en.wikipedia.org/wiki/Experimenter%27s%20regress
In science, experimenter's regress refers to a loop of dependence between theory and evidence: to judge whether evidence is erroneous, we must rely on theory-based expectations, and to judge the value of competing theories, we rely on evidence. Cognitive bias affects experiments, and experiments determine which theory is valid. This issue is particularly important in new fields of science, where there is no community consensus on the relative merits of competing theories, and where sources of experimental error are not well known.

In a true scientific process, no consensus exists and none can exist, since the process is conducted scientifically in the pursuit of knowledge. If any party involved in the process stands to personally lose or gain from the result, the process will be flawed and unscientific. In a true scientific process, a theory is formed after a scientist, amateur or professional, has observed a phenomenon and asked "why?". The theory is the answer the scientist constructs, using logic and reason, to explain the phenomenon. The scientist then designs experiments to test the theory incrementally, and the theory is corroborated or refuted through repeatable and legitimate experimentation. Legitimate scientific experiments conducted by the person who formulated the theory seek to prove the theory false rather than true, specifically to counter the effects of bias.

If experimenter's regress acts as a positive feedback system, it can be a source of pathological science. An experimenter's strong belief in a new theory produces confirmation bias, and any biased evidence they obtain then strengthens their belief in that particular theory. Neither individual researchers nor entire scientific communities are immune to this effect: see N-rays and polywater. Experimenter's regress is a typical relativistic phenomenon in the Empirical Programme of Relativism (EPOR). EPOR is very much co