https://en.wikipedia.org/wiki/List%20of%20Croatian%20flags
For more information on the main flags, see the article Flag of Croatia. This is a list of flags which have been, or are still today, used in Croatia or by Croatians and Croats. It covers the following sections: modern flags (the national flag and standard; military flags of the army, navy, air force and coast guard; and the flags of the police and the Security and Intelligence Agency); subnational flags (municipality flags); political flags; flags of ethnic groups; historical flags (historical national flags, royal standards, coronation standards, historical city flags, historical regional flags, and medieval flags, including the flags of the Republic of Ragusa); and other flags (flag proposals, flags of Croatian people in other countries, and burgees of Croatia). Notes
https://en.wikipedia.org/wiki/Page%20address%20register
A page address register (PAR) contains the physical addresses of pages currently held in the main memory of a computer system. PARs are used in order to avoid excessive use of an address table in some operating systems. A PAR may check a page's number against all entries in the PAR simultaneously, allowing it to retrieve the page's physical address quickly. A PAR is used by a single process and is only used for pages which are frequently referenced (though these pages may change as the process's behaviour changes in accordance with the principle of locality). An example of a computer which made use of PARs is the Atlas. See also Translation Lookaside Buffer (TLB) Virtual memory Computer memory
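As a rough illustration of the behaviour described above, the sketch below (hypothetical class and method names, not any real OS or hardware interface) models a PAR as a small associative store: a hit returns the cached physical address at once, a miss signals that the full page table must be consulted.

```python
# Hypothetical sketch of a page address register (PAR). In hardware the
# requested page number is compared against every entry simultaneously;
# here a dictionary keyed by page number stands in for that parallel compare.

class PageAddressRegister:
    def __init__(self, capacity=8):
        self.capacity = capacity
        self.entries = {}  # page number -> physical frame address

    def lookup(self, page_number):
        # Models the simultaneous compare: hit returns the frame, miss None.
        return self.entries.get(page_number)

    def load(self, page_number, frame_address):
        # On a miss the OS would consult the full page table, then cache the
        # frequently referenced page here (evicting the oldest entry in this
        # toy version).
        if len(self.entries) >= self.capacity:
            self.entries.pop(next(iter(self.entries)))
        self.entries[page_number] = frame_address

par = PageAddressRegister()
par.load(0x2A, 0x9000)
assert par.lookup(0x2A) == 0x9000   # hit: physical address found quickly
assert par.lookup(0x2B) is None     # miss: fall back to the page table
```

The eviction policy here is arbitrary; a real PAR would be refilled according to the process's reference pattern, per the principle of locality.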
https://en.wikipedia.org/wiki/Donald%20I.%20Williamson
Donald Irving Williamson (8 January 1922, in Alnham, England – 29 January 2016, in Port Erin, Isle of Man) was a British planktologist and carcinologist. Education Williamson gained his first degree from the Newcastle division of Durham University in 1942. He earned his PhD from the same university in 1948, and a DSc from Newcastle University in 1972. He worked at the Port Erin Marine Laboratory of the University of Liverpool from 1948 to 1997, and published on Irish Sea plankton, crustacean behaviour and taxonomy, and crustacean larvae. Works He also published speculative works on hybridisation in evolution: Larvae and Evolution (1992, a book foreworded by Lynn Margulis and Alfred I. Tauber), The Origins of Larvae (2003, a revised and extended edition of Larvae and Evolution, not to be confused with his 2007 article of the same title published in the magazine American Scientist), and some articles on the same subject. In Larvae and Evolution Williamson developed a controversial hypothesis proposing the acquisition of larval stages in some marine organisms by hybridisation between two distant animal species (a speciation process referred to as hybridogenesis by Williamson). The fraction of the genome of one of the contributor species would be restricted to leading the developmental program of a newly acquired larva, whereas the genome of the other contributor would drive the development of most of the adult anatomical structures. During the following years he would generalise his theory to other animal groups featuring a holometabolous development. According to Williamson, these successful hybridisations would most likely occur in organisms with external fertilisation or male gamete dispersal. He acknowledges in his work Larvae and Evolution to have borrowed the idea of hybridogenesis from the well-known process of interspecific hybridisation that takes place in plants. Hybrid plants generated from phylogenetically distant species can often give rise to new species if
https://en.wikipedia.org/wiki/Detection%20of%20genetically%20modified%20organisms
The detection of genetically modified organisms in food or feed is possible by biochemical means. It can either be qualitative, showing which genetically modified organism (GMO) is present, or quantitative, measuring in what amount a certain GMO is present. Being able to detect a GMO is an important part of GMO labeling, as without detection methods the traceability of GMOs would rely solely on documentation. Polymerase chain reaction (PCR) The polymerase chain reaction (PCR) is a biochemistry and molecular biology technique for isolating and exponentially amplifying a fragment of DNA, via enzymatic replication, without using a living organism. It enables the detection of specific strands of DNA by making millions of copies of a target genetic sequence. The target sequence is essentially photocopied at an exponential rate, and simple visualisation techniques can make the millions of copies easy to see. The method works by pairing the targeted genetic sequence with custom-designed complementary bits of DNA called primers. In the presence of the target sequence, the primers match with it and trigger a chain reaction. DNA replication enzymes use the primers as docking points and start doubling the target sequences. The process is repeated over and over again by sequential heating and cooling until doubling and redoubling has multiplied the target sequence several million-fold. The millions of identical fragments are then purified in a slab of gel, dyed, and can be seen with UV light. Because of this extreme sensitivity, however, the method is prone to contamination. Irrespective of the variety of methods used for DNA analysis, only PCR in its different formats has been widely applied in GMO detection/analysis and generally accepted for regulatory compliance purposes. Detection methods based on DNA rely on the complementarity of the two strands of the DNA double helix, which hybridize in a sequence-specific manner. The DNA of a GMO consists of several elements that govern its functioning. The elements are promoter sequence, str
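The doubling arithmetic behind the "several million-fold" amplification can be sketched in a few lines of Python. The `efficiency` parameter is an illustrative simplification of my own, not part of the source text; real per-cycle efficiencies fall below the ideal 1.0.

```python
def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Copies of the target sequence after `cycles` rounds of amplification.

    `efficiency` is the fraction of templates duplicated per cycle
    (1.0 = perfect doubling); real reactions run somewhat below this.
    """
    return initial_copies * (1 + efficiency) ** cycles

# A single target molecule after 30 ideal thermal cycles:
print(int(pcr_copies(1, 30)))   # 2**30 = 1073741824 copies, over a billion
```

Thirty cycles is a typical run length, which is why even trace amounts of a target sequence (or of a contaminant) become easily visible after gel purification and dyeing.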
https://en.wikipedia.org/wiki/Transplastomic%20plant
A transplastomic plant is a genetically modified plant in which genes are inactivated, modified, or new foreign genes are inserted into the DNA of plastids such as the chloroplast, instead of into nuclear DNA. Currently, the majority of transplastomic plants are a result of chloroplast manipulation, due to poor expression in other plastids. However, the technique has been successfully applied to the chromoplasts of tomatoes. Chloroplasts in plants are thought to have originated from the engulfment of a photosynthetic bacterium (a cyanobacterial ancestor) by a eukaryote. There are many advantages to chloroplast DNA manipulation because of this bacterial origin: for example, the ability to introduce multiple genes (operons) in a single step instead of many steps, and the simultaneous expression of many genes with its bacterial gene expression system. Other advantages include the ability to obtain organic products such as proteins at a high concentration, and the fact that production of these products is not affected by epigenetic regulation. The reason for product synthesis at high concentrations is that a single plant cell can potentially carry up to 100 chloroplasts. If all these plastids are transformed, all of them can express the introduced foreign genes. This may be advantageous compared to transformation of the nucleus, because the nucleus typically contains only one or two copies of a gene. The advantages provided by chloroplast DNA manipulation have generated growing interest in this field of research and development, particularly in agricultural and pharmaceutical applications. However, there are some limitations to chloroplast DNA manipulation, such as the inability to manipulate cereal crop DNA material and the poor expression of foreign DNA in non-green plastids mentioned before. In addition, the lack of post-translational modification capability such as glycosylation in plastids may make some human-related protein expression difficult. Nevertheless, much pro
https://en.wikipedia.org/wiki/Nancy%20Leveson
Nancy G. Leveson is an American specialist in system and software safety and a Professor of Aeronautics and Astronautics at MIT, United States. Leveson gained her degrees (in computer science, mathematics and management) from UCLA, including her PhD in 1980. Previously she worked at University of California, Irvine and the University of Washington as a faculty member. She has studied safety-critical systems such as the Traffic Collision Avoidance System (TCAS) for the avoidance of midair collisions between aircraft and problems with the Therac-25 radiation therapy machine. Leveson has been editor of the journal IEEE Transactions on Software Engineering. She has held memberships in the ACM, IEEE Computer Society, System Safety Society, and AIAA. Biography Leveson is Professor of Aeronautics and Astronautics and also Professor of Engineering Systems at MIT. Prof. Leveson conducts research on the topics of system safety, software safety, software and system engineering, and human-computer interaction. In 1999, she received the ACM Allen Newell Award for outstanding computer science research and in 1995 the AIAA Information Systems Award for "developing the field of software safety and for promoting responsible software and system engineering practices where life and property are at stake." She was elected a member of the National Academy of Engineering (NAE) in 2000 for contributions to software safety. She has published over 200 research papers and is author of two books, "Safeware: System Safety and Computers" published in 1995 by Addison-Wesley and "Engineering a Safer World" published in 2012 by MIT Press. She consults extensively in many industries on the ways to prevent accidents. In 2005, she received the ACM Sigsoft Outstanding Research Award. She developed the STPA (System Theoretic Process Analysis) and STAMP (System Theoretic Accident Model and Processes) methodologies for accident analysis. In 2020, she received the IEEE Medal for Environmental and
https://en.wikipedia.org/wiki/Alpha%20recursion%20theory
In recursion theory, α recursion theory is a generalisation of recursion theory to subsets of admissible ordinals α. An admissible set is closed under Σ1(Lα) functions, where Lξ denotes a rank of Gödel's constructible hierarchy. α is an admissible ordinal if Lα is a model of Kripke–Platek set theory. In what follows α is considered to be fixed. The objects of study in α recursion are subsets of α. These sets are said to have some properties: A set A ⊆ α is said to be α-recursively-enumerable if it is Σ1-definable over Lα, possibly with parameters from Lα in the definition. A is α-recursive if both A and α ∖ A (its relative complement in α) are α-recursively-enumerable. It's of note that α-recursive sets are members of Lα+1 by definition of L. Members of Lα are called α-finite and play a similar role to the finite numbers in classical recursion theory. Members of Lα+ are called α-arithmetic. There are also some similar definitions for functions mapping α to α: A function mapping α to α is α-recursively-enumerable, or α-partial recursive, iff its graph is Σ1-definable on (Lα, ∈). A function mapping α to α is α-recursive iff its graph is Δ1-definable on (Lα, ∈). Additionally, a function mapping α to α is α-arithmetical iff there exists some n ∈ ω such that the function's graph is Σn-definable on (Lα, ∈). Additional connections between recursion theory and α recursion theory can be drawn, although explicit definitions may not have yet been written to formalize them: The functions Δ0-definable in (Lα, ∈) play a role similar to that of the primitive recursive functions. We say R is a reduction procedure if it is α-recursively-enumerable and every member of R is of the form ⟨H, J, K⟩ where H, J, K are all α-finite. A is said to be α-recursive in B if there exist reduction procedures R0, R1 such that: K ∈ A ↔ ∃H ∃J [⟨H, J, K⟩ ∈ R0 ∧ H ⊆ B ∧ J ⊆ α ∖ B], and K ∈ α ∖ A ↔ ∃H ∃J [⟨H, J, K⟩ ∈ R1 ∧ H ⊆ B ∧ J ⊆ α ∖ B]. If A is recursive in B this is written A ≤α B. By this definition A is recursive in ∅ (the empty set) if and only if A is recursive. However, A being recursive in B is not equivalent to A being Σ1(Lα[B]). We say A is regular if ∀β < α: A ∩ β ∈ Lα, or in other words if every initial portion of A is α-finite.
https://en.wikipedia.org/wiki/Bioelectromagnetics%20%28journal%29
Bioelectromagnetics is a peer-reviewed scientific journal published by Wiley-Liss that specializes in articles about the biological effects of, and applications of, electromagnetic fields in biology and medicine. It is the official journal of the Bioelectromagnetics Society, the European Bioelectromagnetics Association, and the Society for Physical Regulation in Biology and Medicine. Abstracting and indexing The journal is abstracted and indexed in MEDLINE, searchable via PubMed, and indexed in Index Medicus. According to the Journal Citation Reports, the journal has a 2020 impact factor of 2.010. See also Bioelectrochemistry (journal)
https://en.wikipedia.org/wiki/Tre%20recombinase
Tre recombinase is an experimental enzyme that in lab tests has removed DNA inserted by HIV from infected cells. Through selective mutation, Cre recombinase, which recognizes loxP sites, is modified to recognize HIV long terminal repeats (loxLTR) instead. As a result, instead of performing Cre-Lox recombination, the new enzyme performs recombination at HIV provirus sites. The structure of Tre in complex with loxLTR has been resolved, allowing the roles of individual mutations to be analyzed.
https://en.wikipedia.org/wiki/Phreatic%20zone
The phreatic zone, saturated zone, or zone of saturation, is the part of an aquifer, below the water table, in which virtually all pores and fractures are saturated with water. Above the water table is the unsaturated or vadose zone. The phreatic zone's size, color, and depth may fluctuate with changes of season, and during wet and dry periods. Depending on the characteristics of soil particles, their packing and porosity, the boundary of a saturated zone can be stable or unstable, exhibiting fingering patterns known as the Saffman–Taylor instability. Predicting the onset of stable versus unstable drainage fronts is of some importance in modelling phreatic zone boundaries (see "Dynamics of Drainage and Viscous Fingering," Transport in Porous Media); zones "behind" the drainage front lie on the 'dry' (low-viscosity) side, typically above or beyond the 'wet' zone. See also Index: Aquifer articles
https://en.wikipedia.org/wiki/Junction%20temperature
Junction temperature, short for transistor junction temperature, is the highest operating temperature of the actual semiconductor in an electronic device. In operation, it is higher than the case temperature and the temperature of the part's exterior. The difference is equal to the amount of heat transferred from the junction to the case multiplied by the junction-to-case thermal resistance. Microscopic effects Various physical properties of semiconductor materials are temperature dependent. These include the diffusion rate of dopant elements, carrier mobilities and the thermal production of charge carriers. At the low end, sensor diode noise can be reduced by cryogenic cooling. On the high end, the resulting increase in local power dissipation can lead to thermal runaway that may cause transient or permanent device failure. Maximum junction temperature calculation Maximum junction temperature (sometimes abbreviated TJMax) is specified in a part's datasheet and is used when calculating the necessary case-to-ambient thermal resistance for a given power dissipation. This in turn is used to select an appropriate heat sink if applicable. Other cooling methods include thermoelectric cooling and coolants. In modern processors from manufacturers such as Intel, AMD and Qualcomm, the core temperature is measured by a network of sensors. Every time the temperature-sensing network determines that a rise above the specified junction temperature is imminent, measures such as clock gating, clock stretching, clock speed reduction and others (commonly referred to as thermal throttling) are applied to prevent the temperature from rising further. If the applied mechanisms do not compensate enough for the processor to stay below the maximum junction temperature, the device may shut down to prevent permanent damage. An estimation of the chip-junction temperature TJ can be obtained from the equation TJ = TA + (RθJA × PD), where: TA = ambient temperature for the package [°C], RθJA = junction-to-ambient thermal resistance
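The junction-temperature estimate described above, TJ = TA + RθJA × PD, is simple enough to check numerically. The figures below are illustrative and not taken from any particular datasheet.

```python
def junction_temperature(t_ambient_c, r_theta_ja_c_per_w, power_w):
    # TJ = TA + RthetaJA * PD: ambient temperature plus the rise caused by
    # dissipated power flowing through the junction-to-ambient resistance.
    return t_ambient_c + r_theta_ja_c_per_w * power_w

# Illustrative numbers: 25 degC ambient, 40 degC/W package, 2 W dissipated.
tj = junction_temperature(t_ambient_c=25.0, r_theta_ja_c_per_w=40.0, power_w=2.0)
print(tj)            # 105.0 (degC)

TJ_MAX = 125.0       # an assumed datasheet TJMax
assert tj <= TJ_MAX  # within the rated maximum; no throttling expected
```

Run in reverse, the same relation gives the maximum allowable RθJA for a target TJMax, which is exactly the heat-sink selection calculation the text mentions.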
https://en.wikipedia.org/wiki/Ames%20process
The Ames process is a process by which pure uranium metal is obtained. It can be achieved by mixing any of the uranium halides (commonly uranium tetrafluoride) with magnesium metal powder or aluminium metal powder. History The Ames process was used on August 3, 1942, by a group of chemists led by Frank Spedding and Harley Wilhelm at the Ames Laboratory as part of the Manhattan Project. It is a type of thermite-based purification, which was patented in 1895 by German chemist Hans Goldschmidt. Development of the Ames process came at a time of increased research into mass uranium-metal production. The desire for increased production was motivated by a fear of Nazi Germany's developing nuclear weapons before the Allies. The process originally involved mixing powdered uranium tetrafluoride and powdered magnesium together. This mixture was placed inside an iron pipe that was welded shut on one side and capped shut on another side. This container, called a "bomb" by Spedding, was placed into a furnace. When heated to a temperature of , the contents of the container reacted violently, leaving a 35-gram ingot of pure uranium metal. The process was quickly scaled up; by October 1942 the "Ames Project" was producing metal at a rate of per week. The uranium tetrafluoride and magnesium were sealed in a refractory-lined reactor vessel, still referred to as a "bomb". The thermite reaction was initiated by furnace heating the assembly to ; the large difference in density between slag and metal allowed complete separation in the liquid state, yielding slag-free metal. By July 1943, the production rate exceeded of uranium metal per month. Approximately 1000 tons of uranium ingots were produced at Ames before the process was transferred to industry. The Ames project received the Army-Navy "E" Award for Excellence in Production on October 12, 1945, signifying 2.5 years of excellence in industrial production of metallic uranium as a vital war material. Iowa State University is un
https://en.wikipedia.org/wiki/Sociophysiology
Sociophysiology is the "interplay between society and physical functioning" (Freund 1988: 856) involving "collaboration of two neighboring sciences: physiology and sociology" (Mauss 1936: 373). In other words, sociophysiology is physiological sociology, a special science that studies the physiological side of human (and other animals') interrelations (Zeliony 1912: 405–406). Interdisciplinary field of research In addition to having been termed an "interdisciplinary area for research, an area which demonstrates the concomitant relationship between physiology and social behavior" (Di Mascio et al. 1955: 4), sociophysiology may also be described as "social ethology" and "social energetics" (Waxweiler 1906: 62). That is, the "physiology of reactive phenomena caused by the mutual excitations of individuals of the same species" (Waxweiler 1906: 62). The interdisciplinary nature of sociophysiology largely entails a "synthesis of psychophysiology and social interaction" (Adler 2002: 884) such that a "socio-psycho-biological study" (Mauss 1936: 386) of "biologico-sociological phenomena" (Mauss 1936: 385) may ensue. Such "socio-psycho-biological study" has uncovered a "sharing of physiology between people involved in a meaningful interaction" (Adler 2002: 884), as well as "mutually responsive physiologic engagement having normative function in maintaining social cohesion and well-being in higher social animals" (Adler 2002: 885). This "mutually responsive physiologic engagement" brings into play the "close links uniting social phenomena to the biological phenomena from which they immediately derive" (Solvay 1906: 26). Interpersonal physiology Furthermore, sociophysiology explores the "intimate relationship and mutual regulation between social and physiological systems that is especially vital in human groups" (Barchas 1986: 210). In other words, sociophysiology studies the "physio- and psycho-energetic phenomena at the basis of social groupings" (Solvay 1906: 25). Along th
https://en.wikipedia.org/wiki/List%20of%20M.A.S.K.%20toys%20%26%20characters
The following is a list of characters and vehicles from the M.A.S.K. media franchise, including the toyline and its television adaptation. The toyline lasted longer than the cartoon series. Toylines There were five different lines of toys released. Some packaging was altered for the European market to make the line seem less violent, such as revising box art so that vehicles' weapons were not shown firing, or, in several cases, changing vehicle names entirely. Additionally, Europe received four adventure packs that were not released in North America, as well as several extra action-figure two-packs with redecoed figures. M.A.S.K. members and their vehicles M.A.S.K. (short for Mobile Armored Strike Kommand) is the titular organization of protagonists who fight the forces of V.E.N.O.M. Among its known members are: V.E.N.O.M. members and their vehicles V.E.N.O.M. (short for Vicious Evil Network Of Mayhem) is a criminal organization against which M.A.S.K. fights. V.E.N.O.M.'s primary goal was obtaining money through robbery, extortion, counterfeiting, and kidnapping, or by attempting to steal historical artifacts. Among its known members are: Other toys Notes
https://en.wikipedia.org/wiki/List%20of%20optimization%20software
Given a transformation between input and output values, described by a mathematical function f, optimization deals with generating and selecting the best solution from some set of available alternatives, by systematically choosing input values from within an allowed set, computing the output of the function, and recording the best output values found during the process. Many real-world problems can be modeled in this way. For example, the inputs can be design parameters of a motor and the output the power consumption, or the inputs can be business choices and the output the obtained profit. An optimization problem, in this case a minimization problem, can be represented in the following way. Given: a function f : A → R from some set A to the real numbers. Search for: an element x0 in A such that f(x0) ≤ f(x) for all x in A. In continuous optimization, A is some subset of the Euclidean space Rn, often specified by a set of constraints, equalities or inequalities that the members of A have to satisfy. In combinatorial optimization, A is some subset of a discrete space, like binary strings, permutations, or sets of integers. The use of optimization software requires that the function f is defined in a suitable programming language and connected at compile or run time to the optimization software. The optimization software will deliver input values in A, the software module realizing f will deliver the computed value f(x) and, in some cases, additional information about the function like derivatives. In this manner, a clear separation of concerns is obtained: different optimization software modules can be easily tested on the same function f, or a given optimization software can be used for different functions f. The following tables provide a list of notable optimization software organized according to license and business model type. Free and open-source software Applications Name | License | Description ADMB | BSD | nonl
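The generate-and-select loop and the separation of concerns described above can be sketched with a deliberately simple optimizer (an exhaustive grid search of my own choosing, not any listed package): the optimizer only delivers input values from A and records f's outputs, knowing nothing about f's internals.

```python
import math

def minimize_grid(f, grid):
    """Toy optimizer: evaluate f at every point of `grid` (the allowed set A)
    and record the best (lowest) output seen, as in the loop described in
    the text. The optimizer treats f as a black box."""
    best_x, best_val = None, math.inf
    for x in grid:
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Separation of concerns: the same optimizer works on any function f.
f = lambda x: (x - 1.5) ** 2 + 0.25        # stand-in "power consumption" model
grid = [i / 10 for i in range(-50, 51)]    # a discretized allowed set A
x0, fx0 = minimize_grid(f, grid)
print(x0, fx0)   # 1.5 0.25
```

Real optimization software replaces the exhaustive loop with far smarter search strategies (and may request derivatives of f), but the interface between optimizer and function module is the same.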
https://en.wikipedia.org/wiki/Daniel%20Kane%20%28mathematician%29
Daniel Mertz Kane (born 1986) is an American mathematician. He is an associate professor with a joint position in the Mathematics Department and the Computer Science and Engineering Department at the University of California, San Diego. Early life and education Kane was born in Madison, Wisconsin, to Janet E. Mertz and Jonathan M. Kane, professors of oncology and of mathematics and computer science, respectively. He attended Wingra School, a small alternative K-8 school in Madison that focuses on self-guided education. By 3rd grade, he had mastered K through 9th-grade mathematics. Starting at age 13, he took honors math courses at the University of Wisconsin–Madison and did research under the mentorship of Ken Ono while dual enrolled at Madison West High School. He earned gold medals in the 2002 and 2003 International Mathematical Olympiads. Prior to his 17th birthday, he resolved an open conjecture proposed years earlier by Andrews and Lewis; for this research, he was named Fellow Laureate of the Davidson Institute for Talent Development. He graduated Phi Beta Kappa from the Massachusetts Institute of Technology in 2007 with two bachelor's degrees, one in mathematics with computer science and the other in physics. While at MIT, Kane was one of four people since 2003 (and one of eight in the history of the competition) to be named a four-time Putnam Fellow in the William Lowell Putnam Mathematical Competition. He also won the 2007 Morgan Prize and competed as part of the MIT team in the Mathematical Contest in Modeling four times, earning the highest score three times and winning the Ben Fusaro Award in 2004, INFORMS Award in 2006, and SIAM Award in 2007. He also won the Machtey Award as an undergraduate in 2005, with Tim Abbott and Paul Valiant, for the best student-authored paper at the Symposium on Foundations of Computer Science that year, on the complexity of two-player win-loss games. Kane received his doctorate in mathematics from Harvard University in 2
https://en.wikipedia.org/wiki/Chemical%20transport%20reaction
In chemistry, a chemical transport reaction describes a process for purification and crystallization of non-volatile solids. The process is also responsible for certain aspects of mineral growth from the effluent of volcanoes. The technique is distinct from chemical vapor deposition, which usually entails decomposition of molecular precursors and which gives conformal coatings. The technique, which was popularized by Harald Schäfer, entails the reversible conversion of nonvolatile elements and chemical compounds into volatile derivatives. The volatile derivative migrates throughout a sealed reactor, typically a sealed and evacuated glass tube heated in a tube furnace. Because the tube is under a temperature gradient, the volatile derivative reverts to the parent solid and the transport agent is released at the end opposite to that at which it originated (see next section). The transport agent is thus catalytic. The technique requires that the two ends of the tube (which contains the sample to be crystallized) be maintained at different temperatures. So-called two-zone tube furnaces are employed for this purpose. The method derives from the Van Arkel–de Boer process, which was used for the purification of titanium and vanadium and uses iodine as the transport agent. Cases of the exothermic and endothermic reactions of the transporting agent Transport reactions are classified according to the thermodynamics of the reaction between the solid and the transporting agent. When the reaction is exothermic, the solid of interest is transported from the cooler end (which can be quite hot) of the reactor to the hot end, where the equilibrium constant is less favorable and the crystals grow. The reaction of molybdenum dioxide with the transporting agent iodine is an exothermic process, thus the MoO2 migrates from the cooler end (700 °C) to the hotter end (900 °C): MoO2 + I2 ⇌ MoO2I2, ΔHrxn < 0 (exothermic). Using 10 milligrams of iodine for 4 grams of the solid, the proc
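Why an exothermic transport reaction moves the solid toward the hot end follows from the van 't Hoff relation: for ΔH < 0 the equilibrium constant falls with temperature, so the volatile complex forms at the cool end and decomposes (depositing crystals) at the hot end. A minimal sketch, with an assumed illustrative ΔH rather than measured data for the MoO2/I2 system:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def k_ratio(delta_h_j_per_mol, t1_k, t2_k):
    # van 't Hoff: ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1)
    return math.exp(-delta_h_j_per_mol / R * (1 / t2_k - 1 / t1_k))

# Assumed, illustrative enthalpy for an exothermic complex-forming reaction
# (negative dH, as for MoO2 + I2 <=> MoO2I2):
dH = -100e3  # J/mol
ratio = k_ratio(dH, t1_k=700 + 273, t2_k=900 + 273)
print(ratio < 1)  # True: K is smaller at the 900 degC end, so the complex
                  # decomposes there and MoO2 crystals grow at the hot end
```

Flipping the sign of ΔH (an endothermic reaction) makes the ratio exceed 1, which is why endothermic transport runs in the opposite direction, from hot to cold.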
https://en.wikipedia.org/wiki/Advanced%20level%20mathematics
Advanced Level (A-Level) Mathematics is a qualification of further education taken in the United Kingdom (and occasionally other countries as well). In the UK, A-Level exams are traditionally taken by 17–18-year-olds after a two-year course at a sixth form or college. Advanced Level Further Mathematics is often taken by students who wish to study a mathematics-based degree at university, or related degree courses such as physics or computer science. Like other A-level subjects, mathematics has been assessed in a modular system since the introduction of Curriculum 2000, whereby each candidate must take six modules, with the best achieved score in each of these modules (after any retake) contributing to the final grade. Most students will complete three modules in one year, which constitute an AS-level qualification in their own right, and will complete the A-level course the following year with three more modules. The system in which mathematics is assessed changed for students starting courses in 2017 (as part of the A-level reforms first introduced in 2015): the reformed specifications have reverted to a linear structure, with exams taken only at the end of the course in a single sitting. In addition, whereas schools could previously choose freely between Statistics, Mechanics and Discrete Mathematics (also known as Decision Mathematics) modules, with the ability to specialise in one branch of applied mathematics, in the new specifications both Mechanics and Statistics are compulsory, with Discrete Mathematics offered exclusively as an option to students pursuing a Further Mathematics course. The first assessment opportunities for the new specification are 2018 and 2019 for A-levels in Mathematics and Further Mathematics, respectively. 2000s specification Prior to the 2017 reform, the basic A-Level course consisted of six modules, four pure modules (C1, C2, C3, and C4) and two applied modules in Statistics, Mechanics
https://en.wikipedia.org/wiki/Doubly%20connected%20edge%20list
The doubly connected edge list (DCEL), also known as the half-edge data structure, is a data structure used to represent an embedding of a planar graph in the plane, and polytopes in 3D. This data structure provides efficient manipulation of the topological information associated with the objects in question (vertices, edges, faces). It is used in many algorithms of computational geometry to handle polygonal subdivisions of the plane, commonly called planar straight-line graphs (PSLG). For example, a Voronoi diagram is commonly represented by a DCEL inside a bounding box. This data structure was originally suggested by Muller and Preparata for representations of 3D convex polyhedra. Later, a somewhat different data structure was suggested, but the name "DCEL" was retained. For simplicity, only connected graphs are considered here; however, the DCEL structure may be extended to handle disconnected graphs as well by introducing dummy edges between disconnected components. Data structure A DCEL is more than just a doubly linked list of edges. In the general case, a DCEL contains a record for each edge, vertex and face of the subdivision. Each record may contain additional information; for example, a face may contain the name of the area. Each edge usually bounds two faces and it is, therefore, convenient to regard each edge as two "half-edges" (represented by the two edges with opposite directions, between two vertices, in the picture on the right). Each half-edge is "associated" with a single face and thus has a pointer to that face. All half-edges associated with a face are ordered clockwise or counter-clockwise around it. For example, in the picture on the right, all half-edges associated with the middle face (i.e. the "internal" half-edges) are counter-clockwise. A half-edge has a pointer to the next half-edge and previous half-edge of the same face. To reach the other face, we can go to the twin of the half-edge and then traverse the other face. Each half-edge also has a pointer t
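The half-edge records described above can be sketched in a few lines. This is a minimal illustration with names of my own choosing (only one face's ring is built, and the twin pointers are left unset), not a full DCEL implementation.

```python
class HalfEdge:
    __slots__ = ("origin", "twin", "next", "prev", "face")
    def __init__(self, origin):
        self.origin = origin  # vertex this half-edge starts at
        self.twin = self.next = self.prev = self.face = None

def make_face(vertices, face_name):
    """Link one ring of half-edges bounding a face (twins omitted here)."""
    edges = [HalfEdge(v) for v in vertices]
    n = len(edges)
    for i, e in enumerate(edges):
        e.next = edges[(i + 1) % n]   # next half-edge of the same face
        e.prev = edges[(i - 1) % n]   # previous half-edge of the same face
        e.face = face_name            # pointer to the associated face
    return edges[0]

def face_vertices(start):
    """Walk the `next` pointers around a face until back at the start."""
    out, e = [], start
    while True:
        out.append(e.origin)
        e = e.next
        if e is start:
            return out

h = make_face(["A", "B", "C"], face_name="inner")
print(face_vertices(h))   # ['A', 'B', 'C']
```

In a complete DCEL each half-edge's `twin` would point to the oppositely directed half-edge of the adjacent face, which is how the traversal "go to the twin, then walk the other face" works.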
https://en.wikipedia.org/wiki/Self-focusing
Self-focusing is a non-linear optical process induced by the change in refractive index of materials exposed to intense electromagnetic radiation. A medium whose refractive index increases with the electric field intensity acts as a focusing lens for an electromagnetic wave characterized by an initial transverse intensity gradient, as in a laser beam. The peak intensity of the self-focused region keeps increasing as the wave travels through the medium, until defocusing effects or medium damage interrupt this process. Self-focusing of light was discovered by Gurgen Askaryan. Self-focusing is often observed when radiation generated by femtosecond lasers propagates through many solids, liquids and gases. Depending on the type of material and on the intensity of the radiation, several mechanisms produce variations in the refractive index which result in self-focusing: the main cases are Kerr-induced self-focusing and plasma self-focusing. Kerr-induced self-focusing Kerr-induced self-focusing was first predicted in the 1960s and experimentally verified by studying the interaction of ruby lasers with glasses and liquids. Its origin lies in the optical Kerr effect, a non-linear process which arises in media exposed to intense electromagnetic radiation, and which produces a variation of the refractive index as described by the formula n = n0 + n2·I, where n0 and n2 are the linear and non-linear components of the refractive index, and I is the intensity of the radiation. Since n2 is positive in most materials, the refractive index becomes larger in the areas where the intensity is higher, usually at the centre of a beam, creating a focusing density profile which potentially leads to the collapse of the beam on itself. Self-focusing beams have been found to naturally evolve into a Townes profile regardless of their initial shape. Self-focusing occurs if the radiation power is greater than the critical power Pcr = α·λ²/(4π·n0·n2), where λ is the radiation wavelength in vacuum and α is a constant which
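The critical-power expression Pcr = α·λ²/(4π·n0·n2) is easy to evaluate numerically. The sketch below uses α ≈ 1.8962 (the value associated with a Townes profile) and approximate literature-style values for fused silica at 800 nm; both are assumptions for illustration, not data from the text.

```python
import math

def critical_power(wavelength_m, n0, n2_m2_per_w, alpha=1.8962):
    # Pcr = alpha * lambda^2 / (4 * pi * n0 * n2)
    # alpha ~ 1.8962 is the Townes-profile value (assumed here).
    return alpha * wavelength_m**2 / (4 * math.pi * n0 * n2_m2_per_w)

# Illustrative, approximate values for fused silica at 800 nm:
p_cr = critical_power(800e-9, n0=1.45, n2_m2_per_w=2.7e-20)
print(f"{p_cr / 1e6:.1f} MW")  # on the order of a few megawatts
```

Megawatt-scale peak powers are routine for femtosecond pulses, which is why self-focusing is so commonly observed when such lasers propagate through transparent solids.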
https://en.wikipedia.org/wiki/Antibiosis
Antibiosis is a biological interaction between two or more organisms that is detrimental to at least one of them; it can also be an antagonistic association between an organism and the metabolic substances produced by another. Examples of antibiosis include the relationship between antibiotics and bacteria or animals and disease-causing pathogens. The study of antibiosis and its role in antibiotics has led to the expansion of knowledge in the field of microbiology. Molecular processes such as cell wall synthesis and recycling, for example, have become better understood through the study of how antibiotics affect beta-lactam development through the antibiosis relationship and interaction of the particular drugs with the bacteria subjected to the compound. Antibiosis is typically studied in host plant populations and extends to the insects which feed upon them. "Antibiosis resistance affects the biology of the insect so pest abundance and subsequent damage is reduced compared to that which would have occurred if the insect was on a susceptible crop variety. Antibiosis resistance often results in increased mortality or reduced longevity and reproduction of the insect." During a study of antibiosis, it was determined that the means to achieving effective antibiosis is remaining still. "When you give antibiotic-producing bacteria a structured medium, they affix to substrate, grow clonally, and produce a 'no man's land,' absent competitors, where the antibiotics diffuse outward." Antibiosis is most effective when resources are neither plentiful nor sparse, performing best at intermediate levels of resource availability. See also Antibiotic Biological pest control Biotechnology Symbiosis
https://en.wikipedia.org/wiki/J.%20H.%20van%20Lint
Jacobus Hendricus ("Jack") van Lint (1 September 1932 – 28 September 2004) was a Dutch mathematician, professor at the Eindhoven University of Technology, of which he was rector magnificus from 1991 till 1996. He gained his Ph.D. from Utrecht University in 1957 under the supervision of Fred van der Blij. He was professor of mathematics at Eindhoven University of Technology from 1959 to 1997. He was appointed a full professor at Eindhoven University of Technology at the age of 26 years. His field of research was initially number theory, but he worked mainly in combinatorics and coding theory. Van Lint was honored with a great number of awards. He became a member of Royal Netherlands Academy of Arts and Sciences in 1972, received four honorary doctorates, was an honorary member of the Royal Netherlands Mathematics Society (Koninklijk Wiskundig Genootschap), and received a Knighthood. Books Coding Theory, 1971. Combinatorial Theory Seminar Eindhoven University of Technology, 1974 Introduction to Coding Theory, Springer, Graduate Texts in Mathematics, 1982, 3rd. edition 1999. with Peter Cameron: Designs, Graphs, Codes and their Links, London Mathematical Society Lecture Notes, Cambridge University Press, 1980. with Richard M. Wilson: A Course in Combinatorics, Cambridge University Press, 1992. with Gerard van der Geer: Introduction to Coding theory and Algebraic Geometry, Birkhäuser, 1988. See also Seidel adjacency matrix Seidel switching
https://en.wikipedia.org/wiki/Karl%20August%20Folkers
Karl August Folkers (September 1, 1906 – December 7, 1997) was an American biochemist who made major contributions to the isolation and identification of bioactive natural products. Career Folkers graduated from the College of Liberal Arts and Sciences at the University of Illinois in 1928. In 1986, the institution awarded him its Alumni Achievement Award. His career was mainly spent at Merck. He played a prominent role in the isolation of vitamin B12 in 1947, which is one of the most structurally complex of the vitamins. As part of a Merck Pharmaceuticals research team, Folkers, Fern P. Rathe, and Edward Anthony Kaczka were the first to isolate the antibiotic cathomycin in 1955. His team also isolated the antibiotic cycloserine. In 1958 his Merck team determined the structure of coenzyme Q10. He later served as director of the Institute of Biomedical Research at the University of Texas at Austin, where he was also Ashbel Smith Professor of Chemistry. In recognition of his scientific contributions, he received the Perkin Medal in 1960, the William H. Nichols Medal in 1967, the Priestley Medal in 1986, and the National Medal of Science in 1990.
https://en.wikipedia.org/wiki/Bitcrusher
A bitcrusher is an audio effect that produces distortion by reducing the resolution or bandwidth of digital audio data. The resulting quantization noise may produce a "warmer" sound impression, or a harsh one, depending on the amount of reduction. Methods A typical bitcrusher uses two methods to reduce audio fidelity: sample rate reduction and resolution reduction. Sample rate reduction Digital audio is composed of a rapid series of numeric samples that encode the changing amplitude of an audio waveform. To accurately represent a wideband waveform of substantial duration, digital audio requires a large number of samples at a high sample rate. The higher the rate, the more accurate the waveform; a lower rate requires the source analog signal to be low-pass filtered to limit the maximum frequency component in the signal, or else high-frequency components of the signal will be aliased. Specifically, the frequency of sampling (a.k.a. the sample rate) must be at least twice the maximum frequency component in the signal; this maximum signal frequency of one-half the sampling frequency is called the Nyquist limit. Though it is a common misconception that the sample rate affects the "smoothness" of the digitally represented waveform, this is not true; sampling theory guarantees that up to the maximum signal frequency supported by the sample rate (i.e. the Nyquist limit), the digital (discrete) signal will exactly represent the analog (continuous-wave) source, except for the distortion of quantization noise resulting from the finite precision of the individual samples. The original signal can be exactly reconstructed simply by passing the discrete signal through an ideal low-pass filter (with a perfect vertical cutoff profile). However, as an ideal filter is impossible to build, a real filter, with a gradual transition between the passband and the stopband, must be used, with the consequence that it is impossible to accurately record all frequencies right up t
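The two methods can be sketched in a few lines of Python. This toy implementation — sample-and-hold decimation for the rate reduction plus uniform requantization for the resolution reduction — is a generic illustration of the technique, not any particular plugin's algorithm:

```python
import math

def bitcrush(samples, bit_depth=8, rate_divisor=4):
    """Crush audio in [-1, 1]: requantize to bit_depth bits and emulate
    sample-rate reduction by holding each kept sample for rate_divisor steps."""
    levels = 2 ** bit_depth
    out = []
    held = 0.0
    for i, s in enumerate(samples):
        if i % rate_divisor == 0:      # crude rate reduction: sample-and-hold
            held = s
        q = round((held + 1.0) / 2.0 * (levels - 1))  # map [-1, 1] to integer levels
        out.append(q / (levels - 1) * 2.0 - 1.0)      # map back to [-1, 1]
    return out

# 440 Hz sine at 44.1 kHz, crushed to 8 bits and 1/4 of the original sample rate
sine = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(256)]
crushed = bitcrush(sine, bit_depth=8, rate_divisor=4)
```

The staircase pattern produced by the hold, and the coarse amplitude grid produced by the requantization, are exactly the aliasing and quantization-noise artifacts the effect is prized for.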
https://en.wikipedia.org/wiki/Cannon%27s%20algorithm
In computer science, Cannon's algorithm is a distributed algorithm for matrix multiplication for two-dimensional meshes first described in 1969 by Lynn Elliot Cannon. It is especially suitable for computers laid out in an N × N mesh. While Cannon's algorithm works well in homogeneous 2D grids, extending it to heterogeneous 2D grids has been shown to be difficult. The main advantage of the algorithm is that its storage requirements remain constant and are independent of the number of processors. The Scalable Universal Matrix Multiplication Algorithm (SUMMA) is a more practical algorithm that requires less workspace and overcomes the need for a square 2D grid. It is used by the ScaLAPACK, PLAPACK, and Elemental libraries. Algorithm overview When multiplying two n×n matrices A and B, we need n×n processing nodes p arranged in a 2D grid. Initially pi,j is responsible for ai,j and bi,j. // PE(i , j) k := (i + j) mod N; a := a[i][k]; b := b[k][j]; c[i][j] := 0; for (l := 0; l < N; l++) { c[i][j] := c[i][j] + a * b; concurrently { send a to PE(i, (j + N − 1) mod N); send b to PE((i + N − 1) mod N, j); } with { receive a' from PE(i, (j + 1) mod N); receive b' from PE((i + 1) mod N, j ); } a := a'; b := b'; } We need to select k in every iteration for every Processor Element (PE) so that processors don't access the same data for computing . Therefore processors in the same row / column must begin summation with different indexes. If for example PE(0,0) calculates in the first step, PE(0,1) chooses first. The selection of k := (i + j) mod n for PE(i,j) satisfies this constraint for the first step. In the first step we distribute the input matrices between the processors based on the previous rule. In the next iterations we choose a new k' := (k + 1) mod n for every processor. This way every processor will continue accessing different values of the matrices. The ne
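The pseudocode above can be checked with a small sequential simulation. In the sketch below, each "processor" PE(i, j) is just an array cell, and the sends/receives are modelled by circular index shifts (a toy for checking the data movement, not an MPI implementation):

```python
def cannon_multiply(A, B):
    """Sequential simulation of Cannon's algorithm, one matrix element per virtual PE."""
    N = len(A)
    # Initial skew: PE(i, j) holds a[i][(i+j) mod N] and b[(i+j) mod N][j],
    # matching k := (i + j) mod N in the pseudocode.
    a = [[A[i][(i + j) % N] for j in range(N)] for i in range(N)]
    b = [[B[(i + j) % N][j] for j in range(N)] for i in range(N)]
    C = [[0] * N for _ in range(N)]
    for _ in range(N):
        for i in range(N):
            for j in range(N):
                C[i][j] += a[i][j] * b[i][j]   # local multiply-accumulate
        # Every PE sends a one step left and b one step up (circular shifts).
        a = [[a[i][(j + 1) % N] for j in range(N)] for i in range(N)]
        b = [[b[(i + 1) % N][j] for j in range(N)] for i in range(N)]
    return C

# Verify against a naive triple-loop product
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
naive = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
assert cannon_multiply(A, B) == naive  # [[19, 22], [43, 50]]
```

After the initial skew, PE(i, j) multiplies A[i][(i+j+t) mod N] by B[(i+j+t) mod N][j] at step t, so over N steps every PE accumulates the full dot product — which is why the storage per processor stays constant.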
https://en.wikipedia.org/wiki/Common%20reference%20string%20model
In cryptography, the common reference string (CRS) model captures the assumption that a trusted setup exists in which all involved parties get access to the same string crs taken from some distribution D. Schemes proven secure in the CRS model are secure given that the setup was performed correctly. The common reference string model is a generalization of the common random string model, in which D is the uniform distribution of bit strings. The CRS model has been shown to be equivalent to the reference string model and the public parameters model. The CRS model has applications in the study of non-interactive zero-knowledge proofs and universal composability.
https://en.wikipedia.org/wiki/Small-bias%20sample%20space
In theoretical computer science, a small-bias sample space (also known as -biased sample space, -biased generator, or small-bias probability space) is a probability distribution that fools parity functions. In other words, no parity function can distinguish between a small-bias sample space and the uniform distribution with high probability, and hence, small-bias sample spaces naturally give rise to pseudorandom generators for parity functions. The main useful property of small-bias sample spaces is that they need far fewer truly random bits than the uniform distribution to fool parities. Efficient constructions of small-bias sample spaces have found many applications in computer science, some of which are derandomization, error-correcting codes, and probabilistically checkable proofs. The connection with error-correcting codes is in fact very strong since -biased sample spaces are equivalent to -balanced error-correcting codes. Definition Bias Let be a probability distribution over . The bias of with respect to a set of indices is defined as where the sum is taken over , the finite field with two elements. In other words, the sum equals if the number of ones in the sample at the positions defined by is even, and otherwise, the sum equals . For , the empty sum is defined to be zero, and hence . ϵ-biased sample space A probability distribution over is called an -biased sample space if holds for all non-empty subsets . ϵ-biased set An -biased sample space that is generated by picking a uniform element from a multiset is called -biased set. The size of an -biased set is the size of the multiset that generates the sample space. ϵ-biased generator An -biased generator is a function that maps strings of length to strings of length such that the multiset is an -biased set. The seed length of the generator is the number and is related to the size of the -biased set via the equation . Connection with epsilon-balanced error-correcting codes T
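The bias definition can be made concrete by brute force for tiny n. The sketch below enumerates all non-empty index subsets (exponential in n, so purely illustrative): the full cube {0,1}³ has zero bias against every parity, while the even-weight multiset below is maximally biased against the full parity x₀⊕x₁⊕x₂.

```python
from itertools import combinations, product

def bias(samples, S):
    """Bias of the uniform distribution over the multiset `samples` w.r.t. index
    set S: |Pr[parity of bits in S is 0] - Pr[parity of bits in S is 1]|."""
    diff = sum(1 if sum(x[i] for i in S) % 2 == 0 else -1 for x in samples)
    return abs(diff) / len(samples)

def max_bias(samples, n):
    """Largest bias over all non-empty S; the space is eps-biased iff this <= eps."""
    return max(bias(samples, S)
               for r in range(1, n + 1)
               for S in combinations(range(n), r))

cube = list(product([0, 1], repeat=3))                 # all of {0,1}^3: 0-biased
small = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]   # even-weight strings only
print(max_bias(cube, 3), max_bias(small, 3))           # 0.0 1.0
```

The second multiset fools every proper-subset parity perfectly but is caught by the full parity — exactly the kind of distinguisher a genuinely small-bias sample space must also defeat.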
https://en.wikipedia.org/wiki/Discovery%20Investigations
The Discovery Investigations were a series of scientific cruises and shore-based investigations into the biology of whales in the Southern Ocean. They were funded by the British Colonial Office and organised by the Discovery Committee in London, which was formed in 1918. They were intended to provide the scientific background to stock management of the commercial Antarctic whale fishery. The work of the Investigations contributed hugely to our knowledge of the whales, the krill they fed on, and the oceanography of their habitat, while charting the local topography, including Atherton Peak. The investigations continued until 1951, with the final report being published in 1980. Specimens collected during the cruises are collectively known as the Discovery Collections. Laboratory Shore-based work on South Georgia took place in the marine laboratory, Discovery House, built in 1925 at King Edward Point and occupied until 1931. The scientists lived and worked in the building, travelling half a mile or so across King Edward Cove to the whaling station at Grytviken to work on whales as they were brought ashore by commercial whaling ships. Ships Vessels used were: RRS Discovery from 1924 to 1931 RRS William Scoresby from 1927 to 1945 or later RRS Discovery II from 1929 to 1951 Reports Results of the investigations were printed in the Discovery Reports. This was a series of many small reports, published in 38 volumes by the Cambridge University Press, and latterly the Institute of Oceanographic Sciences. Many were printed as individual reports rather than in large volumes. List of the Discovery Reports Books The Discovery Investigations are described in the following books, all of which were out of print in 2008:
https://en.wikipedia.org/wiki/Negative%20database
In database security, a negative database is a database that saves attributes that cannot be associated with a certain entry. A negative database contains a huge amount of simulated data alongside the actual data. Anyone who gains access to such a database retrieves both the actual and the negative data sets, even if they steal the entire database. For example, instead of storing just the personal details members have, one stores the personal details that members don't have. Negative databases can prevent inappropriate queries and inferences while still supporting allowable operations. Under this scenario, it is desirable that the database support only the allowable queries while protecting the privacy of individual records, say from inspection by an insider. Collection of negative data has been referred to as "negative sousveillance":
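A toy version over fixed-width bit strings makes the idea concrete. Real negative-database schemes store a compressed representation of the complement rather than enumerating it (the complement is exponentially large), so the sketch below is only an illustration of the membership logic:

```python
from itertools import product

def negative_database(positive, n):
    """Toy negative database: explicitly store every n-bit string NOT in the
    positive set. (Real schemes compress this; enumeration is illustrative.)"""
    universe = {"".join(bits) for bits in product("01", repeat=n)}
    return universe - set(positive)

def is_member(entry, ndb):
    # An entry is in the hidden positive set iff it is absent from the negative DB.
    return entry not in ndb

positive = {"0010", "1011"}          # the records we actually want to protect
ndb = negative_database(positive, 4)  # 14 of the 16 possible 4-bit strings
```

Membership queries still work (`is_member("0010", ndb)` is true), but an attacker holding only `ndb` faces the complement, which is designed in real schemes to be hard to invert back to the positive set.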
https://en.wikipedia.org/wiki/Carcinogenic%20bacteria
Cancer bacteria are bacterial infectious organisms that are known or suspected to cause cancer. While cancer-associated bacteria have long been considered to be opportunistic (i.e., infecting healthy tissues after cancer has already established itself), there is some evidence that bacteria may be directly carcinogenic. The strongest evidence to date involves the bacterium H. pylori and its role in gastric cancer. Oncoviruses are viral agents that are similarly suspected of causing cancer. Known to cause cancer Helicobacter pylori colonizes the human stomach and duodenum. In some cases it can cause stomach cancer and MALT lymphoma. Animal models have demonstrated Koch's third and fourth postulates for the role of Helicobacter pylori in the causation of stomach cancer. The mechanism by which H. pylori causes cancer may involve chronic inflammation, or the direct action of some of its virulence factors; for example, CagA has been implicated in carcinogenesis. Speculative links A number of bacteria have associations with cancer, although their possible role in carcinogenesis is unclear. Salmonella Typhi has been linked to gallbladder cancer but may also be useful in delivering chemotherapeutic agents for the treatment of melanoma, colon and bladder cancer. Bacteria found in the gut may be related to colon cancer, but the picture may be complicated by the role of chemoprotective probiotics. Microorganisms and their metabolic byproducts, or the impact of chronic inflammation, may also be linked to oral cancers. The relationship between cancer and bacteria may be complicated by different individuals reacting in different ways to different cancers. History In 1890, the Scottish pathologist William Russell reported circumstantial evidence for the bacterial cause of cancer. In 1926, Canadian physician Thomas Glover reported that he could consistently isolate a specific bacterium from the neoplastic tissues of animals and humans. One review summarized Glover's report a
https://en.wikipedia.org/wiki/Pancreatic%20ribonuclease%20family
Pancreatic ribonuclease family (, RNase, RNase I, RNase A, pancreatic RNase, ribonuclease I, endoribonuclease I, ribonucleic phosphatase, alkaline ribonuclease, ribonuclease, gene S glycoproteins, Ceratitis capitata alkaline ribonuclease, SLSG glycoproteins, gene S locus-specific glycoproteins, S-genotype-assocd. glycoproteins, ribonucleate 3'-pyrimidino-oligonucleotidohydrolase) is a superfamily of pyrimidine-specific endonucleases found in high quantity in the pancreas of certain mammals and of some reptiles. Specifically, the enzymes are involved in endonucleolytic cleavage of 3'-phosphomononucleotides and 3'-phosphooligonucleotides ending in C-P or U-P with 2',3'-cyclic phosphate intermediates. Ribonuclease can unwind the RNA helix by complexing with single-stranded RNA; the complex arises by an extended multi-site cation-anion interaction between lysine and arginine residues of the enzyme and phosphate groups of the nucleotides. Notable family members Bovine pancreatic ribonuclease is the best-studied member of the family and has served as a model system in work related to protein folding, disulfide bond formation, protein crystallography and spectroscopy, and protein dynamics. The human genome contains 8 genes that share the structure and function with bovine pancreatic ribonuclease, with 5 additional pseudo-genes. The structure and dynamics of these enzymes are related to their diverse biological functions. Other proteins belonging to the pancreatic ribonuclease superfamily include: bovine seminal vesicle and brain ribonucleases; kidney non-secretory ribonucleases; liver-type ribonucleases; angiogenin, which induces vascularisation of normal and malignant tissues; eosinophil cationic protein, a cytotoxin and helminthotoxin with ribonuclease activity; and frog liver ribonuclease and frog sialic acid-binding lectin. The sequence of pancreatic ribonucleases contains four conserved disulfide bonds and three amino acid residues involved in the catalytic activi
https://en.wikipedia.org/wiki/5%27-3%27%20exoribonuclease%202
5'-3' Exoribonuclease 2 (XRN2), also known as Dhm1-like protein, is an exoribonuclease enzyme that in humans is encoded by the XRN2 gene. The human gene encoding XRN2 shares similarity with the mouse Dhm1 gene and the yeast Dhp1 (Schizosaccharomyces pombe) and RAT1 (Saccharomyces) genes. The yeast gene is involved in homologous recombination and RNA metabolism, such as RNA synthesis, RNA trafficking and termination. Complementation studies show that Dhm1 has a similar function in mouse as Dhp1. Function Human XRN2 is involved in the torpedo model of transcription termination. The C. elegans homologue, XRN-2, is involved in the degradation of certain mature miRNAs and their dislodging from miRISC. In yeast, the Rat1 protein has also been shown to be involved in the torpedo transcription termination model. When a polyadenylation site has been detected on the nascent RNA and cleaved by the RNA polymerase II, the Rtt103 factor recruits Rat1 and attaches it to the free end. The exonuclease activity of Rat1 degrades the RNA strand and halts transcription upon catching up with the polymerase. See also Xrn1
https://en.wikipedia.org/wiki/Oligonucleotidase
Oligonucleotidase (, oligoribonuclease) is an exoribonuclease derived from Flammulina velutipes. This enzyme catalyses the following chemical reaction 3'-end directed exonucleolytic cleavage of viral RNA-DNA hybrid
https://en.wikipedia.org/wiki/The%20Planet%20Internet%20Services
The Planet was a privately held dedicated server company based in Texas. In May 2006, the company merged with Everyone's Internet, which used the EV1 Servers brand. In 2010, they merged with SoftLayer. All services provided by both companies were then operated under the SoftLayer name. History The Planet's support system was called Orbit and was located at orbit.theplanet.com. Prior to the merger, The Planet operated under several different brands. They included: "Server Matrix", which served the low-price end of the market; "Total Control", which featured servers with complete remote control, such as Dell DRAC and Remote Console capabilities. Orbit was the main way customers knew they were dealing with The Planet as a company rather than with one of the different brands it operated. The Planet then sold servers almost exclusively through its own website, into which the different brands had been unified. On November 10, 2010, GI Partners announced that the merger of The Planet and SoftLayer was effective. On November 16, 2010, The Planet was rebranded SoftLayer as part of the merger. On June 4, 2013, IBM announced its acquisition of SoftLayer. Everyone's Internet Everyone's Internet was originally a Houston, Texas-based internet service provider. It was formed on October 6, 1998 by Robert A. Marsh, Roy Marsh III, and Randy Williams. Its service was available nationwide. Beginning in 2000, Everyone's Internet's focus shifted toward Web hosting through its EV1 Servers subsidiary, which operated in the dedicated server hosting market. At its peak in 2006, EV1 Servers hosted over 30,000 servers. In May 2006, private equity firm GI Partners bought a controlling investment in Everyone's Internet. At the same time, Everyone's Internet announced that it was merging with The Planet, in which GI Partners had invested. After GI Partners gained control of EV1 Servers, Doug Erwin of GI Partners became its CEO. In October 2006, Everyone's Internet
https://en.wikipedia.org/wiki/Nuclease%20S1
Nuclease S1 () is an endonuclease enzyme that splits single-stranded DNA (ssDNA) and RNA into oligo- or mononucleotides. This enzyme catalyses the following chemical reaction Endonucleolytic cleavage to 5'-phosphomononucleotide and 5'-phosphooligonucleotide end-products Although its primary substrate is single-stranded, it can also occasionally introduce single-stranded breaks in double-stranded DNA or RNA, or DNA-RNA hybrids. The enzyme hydrolyses single-stranded regions in duplex DNA such as loops or gaps. It also cleaves a strand opposite a nick on the complementary strand. It has no sequence specificity. Well-known versions include S1 found in Aspergillus oryzae (yellow koji mold) and Nuclease P1 found in Penicillium citrinum. Members of the S1/P1 family are found in both prokaryotes and eukaryotes and are thought to be associated with programmed cell death and also with tissue differentiation. Furthermore, they are secreted extracellularly, that is, outside of the cell. Their function and distinguishing features give them potential for exploitation in the field of biotechnology. Nomenclature Alternative names include endonuclease S1 (Aspergillus), single-stranded-nucleate endonuclease, deoxyribonuclease S1, Aspergillus nuclease S1, Neurospora crassa single-strand specific endonuclease, S1 nuclease, single-strand endodeoxyribonuclease, single-stranded DNA specific endonuclease, single-strand-specific endodeoxyribonuclease, single strand-specific DNase and Aspergillus oryzae S1 nuclease. Structure Most nucleases with EC 3.1.30.1 activity are homologous to each other in a protein domain family called Nuclease S1/P1. Members of this family, including P1 and S1, are glycoproteins with several distinguishing features: they require three zinc ion cofactors, contain common active-site motifs, and require an acidic pH for catalysis; they contain three glycans bound to the amino acid asparagine via N-glycosylation; and two Disul
https://en.wikipedia.org/wiki/ASK%20Group
ASK Group, Inc., formerly ASK Computer Systems, Inc., was a producer of business and manufacturing software. It is best remembered for its Manman enterprise resource planning (ERP) software and for Sandra Kurtzig, the company's founder and one of the early female pioneers in the computer industry. At its peak, ASK had 91 offices in 15 countries before Computer Associates acquired the company in 1994. Beginning and growth (1972–1982) ASK was started in 1972 by Sandra Kurtzig in California. She left her job as a marketing specialist at General Electric and invested $2,000 of her savings to start the company in the apartment she shared with her HP salesman husband. At first, the firm built software for a variety of business applications. ASK was incorporated in 1974. In 1978, Kurtzig came up with ASK's most significant product, named Manman (originally "MaMa"), a contraction of manufacturing management. Manman was an ERP program that ran on Hewlett-Packard HP-3000 minicomputers. Manman helped manufacturing companies plan materials purchases, production schedules, and other administrative functions on a scale that was previously possible only on large, costly mainframe computers. Manman initially had a five-figure software price and was aimed at small and medium-sized manufacturers. Small companies desiring the least expensive implementation could use the software on a time-sharing contract. During the era when Manman was only running on HP-3000 systems, ASK would buy systems at a discount and resell them "with its programs for $125,000 to $300,000" as turnkey systems. The name ASK initially stood for Arie and Sandra Kurtzig, "although he is not an employee." Somewhat later, with her husband working for Hewlett-Packard (HP) and the software subsequently marketed both for HP's computers and for those sold by Digital Equipment Corporation (DEC), Kurtzig said that the "A" was for Associates. Manman was an enormous success and quickly came to dominate the
https://en.wikipedia.org/wiki/Anamorphosis%20%28biology%29
Anamorphosis or anamorphogenesis is the process of postembryonic development and moulting in Arthropoda that results in the addition of abdominal body segments, even after sexual maturity. Examples of this mode of development occur in proturans and millipedes. Protura hatch with only 8 abdominal segments and add the remaining 3 in subsequent moults. These new segments arise behind the last abdominal segment, but in front of the telson. In myriapods, euanamorphosis is when the addition of new segments continues during each moult, without there being a fixed number of segments for the adult; teloanamorphosis is when moulting ceases once the adult has reached a fixed number of segments; and hemianamorphosis is when a fixed number of segments is reached, after which moulting continues with segments only growing in size, not number.
https://en.wikipedia.org/wiki/High-resolution%20computed%20tomography
High-resolution computed tomography (HRCT) is a type of computed tomography (CT) with specific techniques to enhance image resolution. It is used in the diagnosis of various health problems, though most commonly for lung disease, by assessing the lung parenchyma. On the other hand, HRCT of the temporal bone is used to diagnose various middle ear diseases such as otitis media and cholesteatoma, and for evaluations after ear operations. Technique HRCT is performed using a conventional CT scanner. However, imaging parameters are chosen so as to maximize spatial resolution: a narrow slice width is used (usually 1–2 mm), a high spatial resolution image reconstruction algorithm is used, field of view is minimized, so as to minimize the size of each pixel, and other scan factors (e.g. focal spot) may be optimized for resolution at the expense of scan speed. Depending on the suspected diagnosis, the scan may be performed in both inspiration and expiration. In inspiration, images are taken in the prone position. In expiratory HRCT, the scan is taken in the supine position (face up). As HRCT's aim is to assess a generalized lung disease, the test is conventionally performed by taking thin sections which are 10–40 mm apart from each other. The result is a few images that should be representative of the lungs in general, but that cover only approximately one tenth of the lungs. Intravenous contrast agents are not used for HRCT as the lung inherently has very high contrast (soft tissue against air), and the technique itself is unsuitable for assessment of the soft tissues and blood vessels, which are the major targets of contrast agents. Impact of modern CT technology The technique of HRCT was developed with relatively slow CT scanners, which did not make use of multi-detector (MDCT) technology. The parameters of scan duration, z-axis resolution and coverage were interdependent. To cover the chest in a reasonable time period with a conventional chest CT scan required thick sectio
https://en.wikipedia.org/wiki/Fetal%20protein
Fetal proteins are proteins present at high levels during the fetal stage of development. Often related proteins assume similar roles after birth or in the embryo, in which case the fetal varieties are called fetal isoforms. Sometimes, the genes coding fetal isoforms occur adjacent to their adult homologues in the genome, and in those cases a locus control region often coordinates the transition from fetal to adult forms. In other cases fetal isoforms can be produced by alternate splicing using fetal exons to produce proteins that differ in only a portion of their amino acid sequence. In some situations the continuing expression of fetal forms can reveal the presence of a disease condition or serve as a treatment for diseases such as sickle cell anemia. Some well known examples include: Alpha-fetoprotein (AFP), the predominant serum protein of the fetus which gives way to albumin in the adult. AFP is categorized as an oncofetal protein because it is also found in tumors. Fetal hemoglobin, the fetal version of hemoglobin. Fetal Troponin T and Troponin I isoforms. Fetal hemoglobin is found in a subset of erythrocytes called F-cells. It is a tetrameric protein with 2 alpha and 2 gamma subunits; adult hemoglobin, by contrast, has 2 alpha and 2 beta subunits. Fetal hemoglobin is coded by a gene on chromosome 11. The gamma subunit of fetal hemoglobin contains a neutral and nonpolar amino acid at position 136, unlike the beta subunit of adult hemoglobin. Because of this, the protein has a different structure from the adult protein, which helps in fetal development. The main function of fetal hemoglobin is to transfer oxygen from the pregnant person to the fetus during gestation. Fetal hemoglobin is vital in this system because it has a high affinity for oxygen. Fetal hemoglobin can be used to screen for pregnancy complications in the fetus and pregnant person. Fetal hemoglobin can also be used to treat sickle cell anemia. This hemoglobin is less likely to be a
https://en.wikipedia.org/wiki/Palpebral%20%28bone%29
The palpebral bone is a small dermal bone found in the region of the eye socket in a variety of animals, including crocodilians and ornithischian dinosaurs. It is also known as the adlacrimal or supraorbital, although the latter term should not be confused with the supraorbital in osteichthyan fishes. In ornithischians, the palpebral can form a prong that projects from the front upper corner of the orbit. It is large in heterodontosaurids, basal ornithopods such as Thescelosaurus (as Bugenasaura) and Dryosaurus, and basal ceratopsians such as Archaeoceratops; in these animals, the prong is elongate and would have stuck out and over the eye like a bony eyebrow. As paleoartist Gregory S. Paul has noted, elongate palpebrals would have given their owners fierce-looking "eagle eyes". In such cases, the expanded palpebral may have functioned to shade the eye.
https://en.wikipedia.org/wiki/Mentha%20%C3%97%20gracilis
Mentha × gracilis (syn. Mentha × gentilis L.; syn. Mentha cardiaca (S.F. Gray) Bak.) is a hybrid mint species within the genus Mentha, a sterile hybrid between Mentha arvensis (cornmint) and Mentha spicata (native spearmint). It is cultivated for its essential oil, used to flavour spearmint chewing gum. It is known by the common names of gingermint, redmint and Scotchmint in Europe, and as Scotch spearmint in North America. History Gingermint is a naturally occurring hybrid indigenous throughout the overlapping native regions of cornmint and spearmint in Europe and Asia. It was first introduced to North America by a gardener in Wisconsin in 1908; due to the Scottish origin of the variety and its similarity in flavour to spearmint, it is known there as Scotch spearmint. From Wisconsin it spread as a crop throughout the US midwest and later to the Pacific northwest states of Washington, Oregon, and Idaho, where the majority of global Scotch spearmint production is now concentrated. In 1990 it was brought from the Pacific Northwest to southern Alberta and Saskatchewan in Canada, which has become the second largest production region supplying about 25% of the North American market. Cultivation As a sterile hybrid gingermint does not set seed; it is instead propagated by plant cuttings from the runners of healthy mints. It is most commonly cultivated for steam distillation of its essential oil. Production is concentrated in North America north of the 41st parallel; below the 40th parallel north summer day lengths are insufficiently long to produce quality essential oil. In the Pacific northwest Scotch spearmint oil, along with native spearmint oil, is protected by a marketing board for farmers. In 2000, 89.4 tonnes of spearmint oil were produced in the US Midwest, 420.9 tonnes in the Pacific northwest, and 167.9 tonnes in Canada. An additional 10 tonnes were produced in India and limited quantities were produced in France and Argentina. Diseases Verticillium wilt is a
https://en.wikipedia.org/wiki/Pyrolysis%E2%80%93gas%20chromatography%E2%80%93mass%20spectrometry
Pyrolysis–gas chromatography–mass spectrometry is a method of chemical analysis in which the sample is heated to decomposition to produce smaller molecules that are separated by gas chromatography and detected using mass spectrometry. How it works Pyrolysis is the thermal decomposition of materials in an inert atmosphere or a vacuum. The sample is put into direct contact with a platinum wire, or placed in a quartz sample tube, and rapidly heated to 600–1000 °C. Depending on the application, even higher temperatures are used. Three different heating techniques are used in practical pyrolyzers: an isothermal furnace, inductive heating (Curie-point filament), and resistive heating using platinum filaments. Large molecules cleave at their weakest bonds, producing smaller, more volatile fragments. These fragments can be separated by gas chromatography. Pyrolysis GC chromatograms are typically complex because a wide range of different decomposition products is formed. The data can either be used as a fingerprint to prove material identity, or the GC/MS data can be used to identify individual fragments and so obtain structural information. To increase the volatility of polar fragments, various methylating reagents can be added to a sample before pyrolysis. Besides the use of dedicated pyrolyzers, pyrolysis GC of solid and liquid samples can be performed directly inside programmable temperature vaporizer (PTV) injectors that provide quick heating (up to 60 °C/s) and high maximum temperatures of 600–650 °C. This is sufficient for many pyrolysis applications. The main advantage is that no dedicated instrument has to be purchased and pyrolysis can be performed as part of routine GC analysis. In this case quartz GC inlet liners can be used. Quantitative data can be acquired, and good results of derivatization inside the PTV injector have been published as well. Applications Pyrolysis gas chromatography is useful for the identification of involatile compounds. These materials include polymeric mat
https://en.wikipedia.org/wiki/Young%E2%80%93Simpson%20syndrome
Young–Simpson syndrome (YSS) is a rare congenital disorder with symptoms including hypothyroidism, heart defects, facial dysmorphism, cryptorchidism in males, hypotonia, intellectual disability, and postnatal growth retardation. Other symptoms include transient hypothyroidism, macular degeneration, and torticollis. The condition was discovered in 1987 and the name arose from the individuals who first reported the syndrome. An individual with YSS has been identified as having symptoms similar to those of a related condition known as Ohdo Blepharophimosis syndrome, showing that it is quite difficult to diagnose the correct condition based on the symptoms present. Some doctors therefore consider these syndromes to be the same. Signs and symptoms Ohdo syndrome, SBBYSS variant, typically presents early in life, with most diagnoses occurring at birth or during infancy. Patients with the SBBYSS variant of Ohdo syndrome are characterized by hypothyroidism, heart defects, facial dysmorphism, hypotonia, intellectual disability, and postnatal growth retardation. Skeletal Underdeveloped patellae (kneecaps) are the most common skeletal symptom associated with the syndrome. Additionally, abnormally long thumbs and great toes as well as dental abnormalities are also common among patients. Joint stiffness involving the hips, knees, and ankles is common, which impairs movement among patients. Polydactyly, camptodactyly, clinodactyly, brachydactyly, syndactyly, club feet, and abnormalities of the spine and/or ribs may affect patients with the syndrome. Facial Facial signs and symptoms are the most distinguishable among patients with this syndrome. Patients with Ohdo syndrome, SBBYS variant, are characterized by non-expressive, mask-like faces. Additionally, patients may have features such as broad nasal bridges, a nose with a rounded tip, narrowing of the eye opening, and prominent cheeks. Patients with the syndrome may also have abnormalities of the lacrimal glands and may be born with an opening in the
https://en.wikipedia.org/wiki/Morita%20conjectures
The Morita conjectures in general topology are certain problems about normal spaces, now solved in the affirmative. The conjectures, formulated by Kiiti Morita in 1976, asked: If X × Y is normal for every normal space Y, is X a discrete space? If X × Y is normal for every normal P-space Y, is X metrizable? If X × Y is normal for every normal countably paracompact space Y, is X metrizable and sigma-locally compact? The answers were believed to be affirmative. Here a normal P-space Y is characterised by the property that the product X × Y with every metrizable X is normal; thus the conjecture was that the converse holds. Keiko Chiba, Teodor C. Przymusiński, and Mary Ellen Rudin proved conjecture (1) and showed that conjectures (2) and (3) cannot be proven false under the standard ZFC axioms for mathematics (specifically, that the conjectures hold under the axiom of constructibility V=L). Fifteen years later, Zoltán Tibor Balogh succeeded in showing that conjectures (2) and (3) are true. Notes
https://en.wikipedia.org/wiki/OpenPlaG
openPlaG is a PHP-based function graph plotter for use on websites. It was first released in April 2006. In June 2007 its source code was published under the GNU GPL license. PlaG is an abbreviation for Plot a Graph. The current version 3.2 of openPlaG allows the display of up to three function graphs, together with their derivatives and integrals. It can compute several different functions, with a focus on a large function variety and on probability functions. Settings for a graph can be saved and loaded, and a user-defined formula can be substituted into the plotted expressions. It has an instruction page, which explains the use of the plotter and the function syntax. About 180 functions are predefined. These belong to the categories of basic functions, trigonometric and hyperbolic functions, non-differentiable functions, probability functions, special functions, programmable functions, iterations and fractals, and differential and integral equations. See also List of information graphics software
https://en.wikipedia.org/wiki/Gluten-related%20disorders
Gluten-related disorders is the term for the diseases triggered by gluten, including celiac disease (CD), non-celiac gluten sensitivity (NCGS), gluten ataxia, dermatitis herpetiformis (DH) and wheat allergy. The umbrella category has also been referred to as gluten intolerance, though a multi-disciplinary physician-led study, based in part on the 2011 International Celiac Disease Symposium, concluded that the use of this term should be avoided due to a lack of specificity. Gluten is a group of proteins, such as prolamins and glutelins, stored with starch in the endosperm of various cereal (grass) grains. In recent decades, gluten-related disorders have been increasing in frequency in different geographic areas. The increase might be explained by the popularity of the Western diet, the expanded reach of the Mediterranean diet (which also includes grains with gluten), the growing replacement of rice by wheat in many countries, the development in recent years of new types of wheat with a higher amount of cytotoxic gluten peptides, and the higher content of gluten in bread and bakery products, due to the reduction of dough fermentation time. However, a 2020 study by the Leibniz-Institute for Food Systems Biology casts doubt on the idea that modern wheat has higher gluten levels. From a seed bank, they grew and analyzed 60 wheat cultivars from between 1891 and 2010 and found no changes in albumin/globulin and gluten contents over time. "Overall, the harvest year had a more significant effect on protein composition than the cultivar. At the protein level, we found no evidence to support an increased immuno-stimulatory potential of modern winter wheat." Types The following classification of gluten-related disorders was announced in 2011 by a panel of experts in London, and published in February 2012: Autoimmune disorders: celiac disease, dermatitis herpetiformis, gluten ataxia Non-autoimmune, non-allergic: disorder with unknown cause, likely immune-modulated: non-celiac gluten sensitivit
https://en.wikipedia.org/wiki/Sandboxie
Sandboxie is an open-source OS-level virtualization solution for Microsoft Windows. It is a sandboxing solution that creates an isolated operating environment in which applications can run without permanently modifying the local system. This virtual environment allows for controlled testing of untrusted programs and web surfing. After various ownership transitions (Sophos acquired Invincea which acquired Sandboxie from the original author, Ronen Tzur), Sophos eventually dropped support and released the code as open-source. The day after the Sophos announcement, a third-party developer known as David Xanatos forked the open-source project and expanded it later with Sandboxie Plus. History Sandboxie was initially released in 2004 as a tool for sandboxing Internet Explorer. Over time, the program was expanded to support other web browsers and eventually, arbitrary apps. In December 2013, Invincea announced the acquisition of Sandboxie. The original developer Ronen Tzur further announced he would no longer be involved with the program. In February 2017, Sophos announced the acquisition of Invincea. Invincea posted an assurance in Sandboxie's website that for the time being Sandboxie's development and support would continue as normal. Version 4.02 introduced support for Windows 64-bit with the exception of Windows XP Professional x64 Edition, which was never supported. Windows XP SP3 and Windows Vista SP2 were supported up to version 5.22, after which their support was dropped. In September 2019, Sandboxie version 5.31.4 was released under a freeware license "with plans to transition it to an open source tool". The previous commercial license still applied to customers with active licenses until their license expired. Downtime In April 2019, the official site was shut down, preventing downloads, installations and purchases, which prompted the creation of a temporary forum in the company's own domain. In May 2019, the official site returned with the original forum
https://en.wikipedia.org/wiki/Gromov%27s%20inequality%20for%20complex%20projective%20space
In Riemannian geometry, Gromov's optimal stable 2-systolic inequality is the inequality stsys_2(g)^n ≤ n! vol(g), valid for an arbitrary Riemannian metric g on the complex projective space CP^n, where the optimal bound is attained by the symmetric Fubini–Study metric, providing a natural geometrisation of quantum mechanics. Here stsys_2 is the stable 2-systole, which in this case can be defined as the infimum of the areas of rational 2-cycles representing the class of the complex projective line CP^1 in 2-dimensional homology. The inequality first appeared as Theorem 4.36 in Gromov's work. The proof of Gromov's inequality relies on the Wirtinger inequality for exterior 2-forms. Projective planes over division algebras In the special case n = 2, Gromov's inequality becomes stsys_2(g)^2 ≤ 2 vol(g). This inequality can be thought of as an analog of Pu's inequality for the real projective plane RP^2, sys_1(g)^2 ≤ (π/2) vol(g). In both cases, the boundary case of equality is attained by the symmetric metric of the projective plane. Meanwhile, in the quaternionic case, the symmetric metric on the quaternionic projective plane HP^2 is not its systolically optimal metric. In other words, the manifold HP^2 admits Riemannian metrics with a higher systolic ratio than for its symmetric metric. See also Loewner's torus inequality Pu's inequality Gromov's inequality (disambiguation) Gromov's systolic inequality for essential manifolds Systolic geometry
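The optimality claim can be checked directly against the Fubini–Study metric. With the standard normalisation in which a projective line has area π (a routine computation, sketched here only for orientation):

```latex
% Fubini--Study metric on CP^n, normalised so that a projective line has area pi:
\mathrm{stsys}_2\bigl(g_{\mathrm{FS}}\bigr) = \pi , \qquad
\mathrm{vol}\bigl(g_{\mathrm{FS}}\bigr) = \frac{\pi^n}{n!} ,
\qquad\text{so}\qquad
\mathrm{stsys}_2\bigl(g_{\mathrm{FS}}\bigr)^{\,n} = \pi^n = n!\,\mathrm{vol}\bigl(g_{\mathrm{FS}}\bigr).
```

Thus equality holds for the Fubini–Study metric, which is why the bound is described as optimal.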
https://en.wikipedia.org/wiki/List%20of%20Six%20Sigma%20software%20packages
There are generally four classes of software used to support the Six Sigma process improvement protocol: Analysis tools, which are used to perform statistical or process analysis; Program management tools, used to manage and track a corporation's entire Six Sigma program; DMAIC and Lean online project collaboration tools for local and global teams; Data Collection tools that feed information directly into the analysis tools and significantly reduce the time spent gathering data. Analysis tools Notes
https://en.wikipedia.org/wiki/Junctional%20epithelium
The junctional epithelium (JE) is that epithelium which lies at, and in health also defines, the base of the gingival sulcus. The probing depth of the gingival sulcus is measured by a calibrated periodontal probe. In a healthy-case scenario, the probe is gently inserted, slides by the sulcular epithelium (SE), and is stopped by the epithelial attachment (EA). However, the probing depth of the gingival sulcus may be considerably different from the true histological gingival sulcus depth. Location The junctional epithelium, a nonkeratinized stratified squamous epithelium, lies immediately apical to the sulcular epithelium, which lines the gingival sulcus from the base to the free gingival margin, where it interfaces with the epithelium of the oral cavity. The gingival sulcus is bounded by the enamel of the crown of the tooth and the sulcular epithelium. Immediately apical to the base of the pocket, and coronal to the most coronal of the gingival fibers is the junctional epithelium. The JE attaches to the surface of the tooth by way of the EA with hemidesmosomes and is, on average, roughly 1 mm in width in the apico-coronal dimension, constituting about one half of the biologic width. The attachment of the JE to the tooth surface can occur on enamel, cementum, or dentin. The position of the EA on the tooth surface is initially on the cervical half of the anatomical crown when the tooth first becomes functional after tooth eruption. Origin Junctional epithelium is derived from the reduced enamel epithelium (REE) during tooth development. Before the eruption of the tooth and after enamel maturation, the ameloblasts secrete a basal lamina on the tooth surface that serves as a part of the primary EA. As the tooth actively erupts, the coronal part of the fused and surrounding epithelium peels back off the crown. The ameloblasts also develop hemidesmosomes for the primary EA and become firmly attached to the enamel surface. However, the cervical part of the fused tissue r
https://en.wikipedia.org/wiki/Tinfoil%20Hat%20Linux
Tinfoil Hat Linux (THL) was a compact, security-focused Linux distribution developed by The Shmoo Group. The first version (1.000) was released in February 2002. By 2013, it had become a low-priority project. Its image files and source are available in gzip format. THL runs on PCs with an Intel 80386 or better and at least 8 MB of RAM. The distribution fits on a single HD floppy disk. The small footprint provides additional benefits beyond making the system easy to understand and verify: the computer need not even have a hard drive, making it easier to "sanitize" the computer after use. The logo of Tinfoil Hat is Tux, the Linux mascot, wearing a tinfoil hat. The Shmoo Group website says "It started as a secure, single floppy, bootable Linux distribution for storing PGP keys and then encrypting, signing, and wiping files. At some point, it became an exercise in over-engineering." Security features Tinfoil Hat uses a number of measures to defeat hardware and software surveillance methods such as keystroke logging, video cameras, and TEMPEST: Encryption — GNU Privacy Guard (GPG) public key cryptography software is included in THL. Data retrieval — All temporary files are created on an encrypted RAMDisk that is destroyed on shutdown. Even the GPG key file information can be stored encrypted on the floppy. Keystroke monitoring — THL has GPG Grid, a wrapper for GPG that lets the user use a video game-style character entry system instead of typing in a passphrase. Keystroke loggers get a set of grid points instead of a passphrase. Power usage and other side-channel attacks — Under the Paranoid options, a copy of GPG runs in the background generating keys and encrypting random documents. This makes it harder to determine when real encryption is taking place. Even reading the screen over the user's shoulder is very hard when Tinfoil Hat is switched to paranoid mode, which sets the screen to a very low contrast. Applications THL can
https://en.wikipedia.org/wiki/BC%20Geographical%20Names
The BC Geographical Names (formerly BC Geographical Names Information System or BCGNIS) is a geographic name web service and database for the Canadian province of British Columbia run by the Base Mapping and Geomatic Services Branch of the Integrated Land Management Bureau. The database contains official names and spellings of towns, mountains, rivers, lakes, and other geographic places. The database often records other useful information, such as the history of geographic names and their historical use.
https://en.wikipedia.org/wiki/Decimal%20computer
Decimal computers are computers which can represent numbers and addresses in decimal as well as providing instructions to operate on those numbers and addresses directly in decimal, without conversion to a pure binary representation. Some also had a variable wordlength, which enabled operations on numbers with a large number of digits. Decimal computers were common from the early machines through the 1960s and into the 1970s. Using decimal directly saved the need to convert from decimal to binary for input and output and offered a significant speed improvement over binary machines that performed these conversions using subroutines. This allowed otherwise low-end machines to offer practical performance for roles like accounting and bookkeeping, and many low and mid-range systems of the era were decimal based. During the 1970s, microprocessors with decimal instructions became common in electronic calculators, cash registers and similar roles, especially in the 8-bit era. The rapid improvements in general performance of binary machines eroded the value of decimal operations. One of the last major new designs to support it was the Motorola 68000, which shipped in 1980. More recently, IBM added decimal support to their POWER6 designs to allow them to directly support programs written for 1960s platforms like the System/360. With that exception, most modern designs have little or no decimal support. Early computers Early computers that were exclusively decimal include the ENIAC, IBM NORC, IBM 650, IBM 1620, IBM 7070, UNIVAC Solid State 80. In these machines, the basic unit of data was the decimal digit, encoded in one of several schemes, including binary-coded decimal (BCD), bi-quinary and two-out-of-five code. Except for the IBM 1620 and 1710, these machines used word addressing. When non-numeric characters were used in these machines, they were encoded as two decimal digits. Other early computers were character oriented, providing instructions for performing arith
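The digit encodings mentioned above can be sketched in a few lines. This is an illustrative example only, not the behavior of any particular machine: BCD stores each decimal digit in four bits, while a two-out-of-five code uses five bits per digit with exactly two bits set, which lets hardware detect any single-bit error.

```python
def bcd_encode(n):
    """Encode a decimal integer as space-separated 4-bit BCD digit groups."""
    return " ".join(format(int(d), "04b") for d in str(n))

def bcd_decode(bits):
    """Decode space-separated 4-bit BCD groups back to an integer."""
    return int("".join(str(int(group, 2)) for group in bits.split()))

def is_two_out_of_five(code):
    """A valid two-out-of-five codeword has exactly two of its five bits set,
    so any single-bit error produces an invalid (detectable) codeword."""
    return len(code) == 5 and code.count("1") == 2

assert bcd_encode(1962) == "0001 1001 0110 0010"
assert bcd_decode("0001 1001 0110 0010") == 1962
assert is_two_out_of_five("01010") and not is_two_out_of_five("01110")
```

Because every digit occupies a fixed number of bits, no binary conversion is needed on input or output, which is the speed advantage described above.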
https://en.wikipedia.org/wiki/Agent%20detection
Agent detection is the inclination for animals, including humans, to presume the purposeful intervention of a sentient or intelligent agent in situations that may or may not involve one. Evolutionary origins It is believed that humans evolved agent detection as a survival strategy. In situations where one is unsure of the presence of an intelligent agent (such as an enemy or a predator), there is survival value in assuming its presence so that precautions can be taken. For example, if a human came across an indentation in the ground that might be a lion's footprint, it is advantageous to err on the side of caution and assume that the lion is present. Psychologists Kurt Gray and Daniel Wegner wrote: Role in religion Some scientists believe that the belief in acting gods is an evolutionary by-product of agent detection. A spandrel is a non-adaptive trait formed as a side effect of an adaptive trait. The psychological trait in question is "if you hear a twig snap in the forest, some sentient force is probably behind it". This trait helps to prevent the primate from being murdered or eaten as food. However this hypothetical trait could remain in modern humans: thus some evolutionary psychologists theorize that "even if the snapping was caused by the wind, modern humans are still inclined to attribute the sound to a sentient agent; they call this person a god". Gray and Wegner also said that agent detection is likely to be a "foundation for human belief in God" but "simple overattribution of agency cannot entirely account for the belief in God..." because the human ability to form a theory of mind and what they refer to as "existential theory of mind" are also required to "give us the basic cognitive capacity to conceive of God." According to Justin L. Barrett, having a scientific explanation for mental phenomena does not mean we should stop believing in them. "Suppose science produces a convincing account for why I think my wife loves me — should I then stop belie
https://en.wikipedia.org/wiki/Electrophoresis%20%28journal%29
Electrophoresis is a peer-reviewed scientific journal covering all aspects of electrophoresis, including new or improved analytical and preparative methods, development of theory, and innovative applications of electrophoretic methods in the study of proteins, nucleic acids, and other compounds. Abstracting and indexing The journal is abstracted and indexed in: According to the Journal Citation Reports, the journal has a 2020 impact factor of 3.535, ranking it 27th out of 87 journals in the category "Chemistry, Analytical" and 29th out of 78 in the category "Biochemical Research Methods".
https://en.wikipedia.org/wiki/List%20decoding
In coding theory, list decoding is an alternative to unique decoding of error-correcting codes for large error rates. The notion was proposed by Elias in the 1950s. The main idea behind list decoding is that the decoding algorithm, instead of outputting a single possible message, outputs a list of possibilities, one of which is correct. This allows for handling a greater number of errors than that allowed by unique decoding. The unique decoding model in coding theory, which is constrained to output a single valid codeword from the received word, could not tolerate a greater fraction of errors. This resulted in a gap between the error-correction performance for stochastic noise models (proposed by Shannon) and the adversarial noise model (considered by Richard Hamming). Since the mid-1990s, significant algorithmic progress by the coding theory community has bridged this gap. Much of this progress is based on a relaxed error-correction model called list decoding, wherein the decoder outputs a list of codewords for worst-case pathological error patterns where the actual transmitted codeword is included in the output list. In the case of typical error patterns, though, the decoder outputs a unique single codeword, given a received word, which is almost always the case (however, this is not known to be true for all codes). The improvement here is significant in that the error-correction performance doubles. This is because now the decoder is not confined by the half-the-minimum-distance barrier. This model is very appealing because having a list of codewords is certainly better than just giving up. The notion of list-decoding has many interesting applications in complexity theory. The way the channel noise is modeled plays a crucial role in that it governs the rate at which reliable communication is possible. There are two main schools of thought in modeling the channel behavior: Probabilistic noise model studied by Shannon in which the channel noise is modeled precisely in the
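The contrast between unique decoding and list decoding can be sketched with a brute-force decoder on a toy code (the code and the radii below are illustrative, not from any real decoder): a code of minimum distance 3 uniquely corrects one error, but at radius 2 the decoder may return several candidates, one of which is the transmitted codeword.

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length words."""
    return sum(x != y for x, y in zip(a, b))

# Toy code: each of 2 information bits repeated 3 times -> length 6,
# minimum distance 3, so unique decoding corrects only floor((3-1)/2) = 1 error.
code = [(x,) * 3 + (y,) * 3 for x, y in product((0, 1), repeat=2)]

def list_decode(received, e):
    """Return every codeword within Hamming distance e of the received word."""
    return [c for c in code if hamming(c, received) <= e]

r = (0, 0, 0, 1, 1, 0)   # received word with at least one error
assert list_decode(r, 1) == [(0, 0, 0, 1, 1, 1)]                      # unique decoding
assert list_decode(r, 2) == [(0, 0, 0, 0, 0, 0), (0, 0, 0, 1, 1, 1)]  # list of 2
```

At radius 1 the decoder commits to a single codeword; at radius 2 it returns a short list containing both plausible transmissions, which is the relaxation list decoding exploits.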
https://en.wikipedia.org/wiki/FYVE%2C%20RhoGEF%20and%20PH%20domain%20containing
FYVE, RhoGEF and PH domain containing (FGD) is a gene family consisting of: FGD1 FGD2 FGD3 FGD4 Type 1 is associated with Aarskog-Scott syndrome. See also Guanine nucleotide exchange factor
https://en.wikipedia.org/wiki/FGD1
FYVE, RhoGEF and PH domain-containing protein 1 (FGD1), also known as faciogenital dysplasia 1 protein (FGDY), zinc finger FYVE domain-containing protein 3 (ZFYVE3), or Rho/Rac guanine nucleotide exchange factor FGD1 (Rho/Rac GEF), is a protein that in humans is encoded by the FGD1 gene that lies on the X chromosome. Orthologs of the FGD1 gene are found in dog, cow, mouse, rat, and zebrafish, and also budding yeast and C. elegans. It is a member of the FYVE, RhoGEF and PH domain containing family. FGD1 is a guanine-nucleotide exchange factor (GEF) that can activate the Rho GTPase Cdc42. It localizes preferentially to the trans-Golgi network (TGN) of mammalian cells and regulates, for example, the secretory transport of bone-specific proteins from the Golgi complex. Thus Cdc42 and FGD1 regulate secretory membrane trafficking that occurs especially during bone growth and mineralization in humans. FGD1 promotes nucleotide exchange on the GTPase Cdc42, a key player in the establishment of cell polarity in all eukaryotic cells. The GEF activity of FGD1, which activates Cdc42, is harbored in its DH domain and causes the formation of filopodia, enabling the cells to migrate. FGD1 also activates the c-Jun N-terminal kinase (JNK) signaling cascade, important in cell differentiation and apoptosis. It also promotes the transition through G1 during the cell cycle and causes tumorigenic transformation of NIH/3T3 fibroblasts. The FGD1 gene is located on the short arm of the X-chromosome and is essential for normal mammalian embryonic development. Mouse embryos that carried experimentally introduced mutations in the FGD1 gene had skeletal abnormalities affecting bone size, cartilage growth, vertebrae formation and distal extremities. These severe phenotypes are consistent with a lack of Cdc42 activity, as it controls membrane traffic as well as the organization of the actin cytoskeleton. Mutations in the FGD1 gene that cause the production of non-functional proteins are respon
https://en.wikipedia.org/wiki/Flory%20convention
In polymer science, the Flory convention is a convention for labelling rotational isomers of polymers. It is named after Nobel Prize-winning chemist Paul Flory. The convention states that for a given bond, when the dihedral angle formed between the previous and subsequent bonds projected on the plane normal to the bond is 0 degrees, the state is labelled as "trans", and when the angle is 180 degrees, the state is labelled as "cis".
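The labelling rule can be sketched as a small classifier. The tolerance band and the intermediate "gauche" label are illustrative additions for angles away from the exact values; the convention itself only fixes trans at 0° and cis at 180°.

```python
def flory_label(angle_deg, tol=30.0):
    """Label a dihedral angle per the Flory convention: trans near 0 degrees,
    cis near 180 degrees. The tolerance and the 'gauche' label for
    intermediate states are illustrative choices, not part of the convention."""
    a = angle_deg % 360.0
    a = min(a, 360.0 - a)          # fold the angle into [0, 180]
    if a <= tol:
        return "trans"
    if a >= 180.0 - tol:
        return "cis"
    return "gauche"

assert flory_label(5) == "trans"
assert flory_label(175) == "cis"
assert flory_label(-178) == "cis"
assert flory_label(120) == "gauche"
```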
https://en.wikipedia.org/wiki/Binary%20integer%20decimal
The IEEE 754-2008 standard includes decimal floating-point number formats in which the significand and the exponent (and the payloads of NaNs) can be encoded in two ways, referred to as binary encoding and decimal encoding. Both formats break a number down into a sign bit s, an exponent q (between qmin and qmax), and a p-digit significand c (between 0 and 10^p − 1). The value encoded is (−1)^s × 10^q × c. In both formats the range of possible values is identical, but they differ in how the significand c is represented. In the decimal encoding, it is encoded as a series of p decimal digits (using the densely packed decimal (DPD) encoding). This makes conversion to decimal form efficient, but requires a specialized decimal ALU to process. In the binary integer decimal (BID) encoding, it is encoded as a binary number. Format Using the fact that 2^10 = 1024 is only slightly more than 10^3 = 1000, 3n-digit decimal numbers can be efficiently packed into 10n binary bits. However, the IEEE formats have significands of 3n+1 digits, which would generally require 10n+4 binary bits to represent. This would not be efficient, because only 10 of the 16 possible values of the additional 4 bits are needed. A more efficient encoding can be designed using the fact that the exponent range is of the form 3×2^k, so the exponent never starts with 11. Using the Decimal32 encoding (with a significand of 3×2+1 = 7 decimal digits) as an example (e stands for exponent, m for mantissa, i.e. significand):
If the significand starts with 0mmm, omitting the leading 0 bit lets the significand fit into 23 bits:
s 00eeeeee (0)mmm mmmmmmmmmm mmmmmmmmmm
s 01eeeeee (0)mmm mmmmmmmmmm mmmmmmmmmm
s 10eeeeee (0)mmm mmmmmmmmmm mmmmmmmmmm
If the significand starts with 100m, omitting the leading 100 bits lets the significand fit into 21 bits. The exponent is shifted over 2 bits, and the bit pair 11 shows that this form is being used:
s 1100eeeeee (100)m mmmmmmmmmm mmmmmmmmmm
s 1101eeeeee (100)m mmmmmm
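The two decimal32 layouts above can be sketched as a pack/unpack pair. This is a simplified illustration of the field packing only: special values (infinities, NaNs) and the exponent bias are ignored, and q below is the already-biased 8-bit exponent field.

```python
def bid32_pack(sign, q, c):
    """Pack (-1)**sign * 10**q * c using the two BID decimal32 layouts.
    q is the biased exponent field (0 <= q < 192, so its top two bits are
    never 11); c is the significand (0 <= c <= 9_999_999)."""
    assert 0 <= q < 192 and 0 <= c <= 9_999_999
    if c < 1 << 23:                         # significand starts with 0mmm
        return sign << 31 | q << 23 | c
    # significand starts with binary 100m: drop the implicit 100 prefix
    # and flag this layout with the leading bit pair 11
    return sign << 31 | 0b11 << 29 | q << 21 | (c & ((1 << 21) - 1))

def bid32_unpack(x):
    """Recover (sign, q, c) from a 32-bit BID pattern (finite values only)."""
    sign = x >> 31
    if (x >> 29) & 0b11 == 0b11:            # second form: restore the 100 prefix
        return sign, (x >> 21) & 0xFF, (1 << 23) | (x & ((1 << 21) - 1))
    return sign, (x >> 23) & 0xFF, x & ((1 << 23) - 1)

assert bid32_unpack(bid32_pack(0, 101, 1234567)) == (0, 101, 1234567)  # 23-bit form
assert bid32_unpack(bid32_pack(1, 57, 9000000)) == (1, 57, 9000000)    # 100m form
```

Round-tripping works for every legal significand because all values from 2^23 up to 9 999 999 begin with the bits 100 when written in 24 bits, so the prefix can always be dropped and restored.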
https://en.wikipedia.org/wiki/Phosphatidylglycerol
Phosphatidylglycerol is a glycerophospholipid found in pulmonary surfactant and in the plasma membrane where it directly activates lipid-gated ion channels. The general structure of phosphatidylglycerol consists of a L-glycerol 3-phosphate backbone ester-bonded to either saturated or unsaturated fatty acids on carbons 1 and 2. The head group substituent glycerol is bonded through a phosphomonoester. It is the precursor of surfactant and its presence (>0.3) in the amniotic fluid of the newborn indicates fetal lung maturity. Approximately 98% of alveolar wall surface area is due to the presence of type I cells, with type II cells producing pulmonary surfactant covering around 2% of the alveolar walls. Once surfactant is secreted by the type II cells, it must be spread over the remaining type I cellular surface area. Phosphatidylglycerol is thought to be important in spreading of surfactant over the Type I cellular surface area. The major surfactant deficiency in premature infants relates to the lack of phosphatidylglycerol, even though it comprises less than 5% of pulmonary surfactant phospholipids. It is synthesized by head group exchange of a phosphatidylcholine enriched phospholipid using the enzyme phospholipase D. Biosynthesis Phosphatidic acid reacts with CTP, producing CDP-diacylglycerol, with loss of pyrophosphate. Glycerol-3-phosphate reacts with CDP-diacylglycerol to form phosphatidylglycerol phosphate, while CMP is released. The phosphate group is hydrolysed forming phosphatidylglycerol. Phosphatidylglycerol combines with CDP-DAG forming cardiolipin releasing CMP by the action of cardiolipin synthase. Two phosphatidylglycerols form cardiolipin, the constituent molecule of the mitochondrial inner membrane. In eukaryotic mitochondria phosphatidylglycerol is converted to cardiolipin by reacting with a molecule of cytidine diphosphate diglyceride in a reaction catalyzed by cardiolipin synthase (Hostetler KY, van den Bosch H, van Deenen LL. The mechanism o
https://en.wikipedia.org/wiki/Wheat%20berry
A wheat berry, or wheatberry, is a whole wheat kernel, composed of the bran, germ, and endosperm, without the husk. Botanically, it is a type of fruit called a caryopsis. Wheat berries have a tan to reddish-brown color and are available as either a . They are often added to salads or baked into bread to add a chewy texture. If wheat berries are milled, whole-wheat flour is produced. Wheat berries are the primary ingredient in an Eastern European Christmas porridge called kutya. In France, cooked durum wheat berries are commonly eaten as a side dish instead of rice or corn. This side dish is often called ebly, from the name of the first brand of prepared wheat berries. See also Cuccìa, a Sicilian wheat berry dish Bulgur, another whole wheat preparation Frumenty, a dish made with boiled wheat berries Graham flour Borș, a fermented drink made from sprouted grain
https://en.wikipedia.org/wiki/Transaction%20Processing%20over%20XML
Transaction Processing over XML (TPoX) is a computing benchmark for XML database systems. As a benchmark, TPoX is used for the performance testing of database management systems that are capable of storing, searching, modifying and retrieving XML data. The goal of TPoX is to allow database designers, developers and users to evaluate the performance of XML database features, such as the XML query languages XQuery and SQL/XML, XML storage, XML indexing, XML Schema support, XML updates, transaction processing and logging, and concurrency control. TPoX includes XML update tests based on the XQuery Update Facility. The TPoX benchmark exercises the processing of data-centric XML, in contrast to content- or document-centric XML. TPoX was originally developed and tested by IBM and Intel, but became an open source project on SourceForge in January 2007. TPoX 1.1 was released in June 2007. TPoX 2.0 was released in July 2009. The TPoX benchmark package contains the following: XML Schemas that define the XML data used in the benchmark. An XML data generation tool to generate an arbitrary number of XML documents with well-defined value distributions and referential integrity across documents. The XML data is generated conforming to industry schema such as FIXML to model real-world applications. Workloads which are executed on the generated data. A workload is a set of transactions. A transaction can be a query in XQuery or SQL/XML notation or an insert, update or delete operation. A Java application which acts as a workload driver. It is configurable and can spawn 1 to n parallel threads to simulate concurrent database users. Each user connects to the database and executes a random sequence of transactions defined in the workload. Parameter markers in the transactions are replaced by real values that are drawn from random value distributions. The workload driver collects and reports performance metrics, such as the transaction throughput as well as minimum, maximum and
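The workload-driver pattern described above can be sketched in a few lines. The real TPoX driver is a configurable Java application that talks to a database; everything here is an illustrative stand-in (the transaction names and stand-in functions are hypothetical): each simulated user executes a random sequence of transactions, and the driver aggregates throughput.

```python
import random
import threading
import time
from collections import Counter

def run_user(workload, n_txns, stats, lock):
    """One simulated database user: execute a random sequence of transactions."""
    for _ in range(n_txns):
        name, txn = random.choice(workload)
        txn()                      # in TPoX this would issue an XQuery/SQL/XML statement
        with lock:
            stats[name] += 1

def drive(workload, users=4, n_txns=100):
    """Spawn `users` parallel threads and report per-transaction counts and
    overall throughput (transactions per second)."""
    stats, lock = Counter(), threading.Lock()
    threads = [threading.Thread(target=run_user, args=(workload, n_txns, stats, lock))
               for _ in range(users)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    elapsed = time.perf_counter() - start
    return stats, sum(stats.values()) / elapsed

# Stand-in transactions; a real workload would connect to the database.
workload = [("get_order", lambda: None), ("insert_customer", lambda: None)]
stats, tps = drive(workload)
assert sum(stats.values()) == 4 * 100
```

The lock-protected counter mirrors the metric collection the description mentions; a real driver would also track per-transaction minimum, maximum and average response times.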
https://en.wikipedia.org/wiki/ATP2A2
ATP2A2, also known as sarcoplasmic/endoplasmic reticulum calcium ATPase 2 (SERCA2), is an ATPase associated with Darier's disease and acrokeratosis verruciformis. This gene encodes one of the SERCA Ca(2+)-ATPases, which are intracellular pumps located in the sarcoplasmic or endoplasmic reticula of muscle cells. This enzyme catalyzes the hydrolysis of ATP coupled with the translocation of calcium from the cytosol to the sarcoplasmic reticulum lumen, and is involved in calcium sequestration associated with muscular excitation and contraction. Alternative splicing results in multiple transcript variants encoding different isoforms.
https://en.wikipedia.org/wiki/Mobile%20identity%20management
Mobile identity is a development of online authentication and digital signatures, where the SIM card of one’s mobile phone works as an identity tool. Mobile identity enables legally binding authentication and transaction signing for online banking, payment confirmation, corporate services, and consuming online content. The user's certificates are maintained on the telecom operator's SIM card and in order to use them, the user has to enter a personal, secret PIN code. When using mobile identity, no separate card reader is needed, as the phone itself already performs both functions. In contrast to other approaches, the mobile phone in conjunction with a mobile signature-enabled SIM card aims to offer the same security and ease of use as for example smart cards in existing digital identity management systems. Smart card-based digital identities can only be used in conjunction with a card reader and a PC. In addition, distributing and managing the cards can be logistically difficult, exacerbated by the lack of interoperability between services relying on such a digital identity. There are a number of private company stakeholders that have an inherent interest in setting up a mobile signature service infrastructure to offer mobile identity services. These stakeholders are mobile network operators and, to a certain extent, financial institutions or service providers with an existing large customer base, that could leverage the use of mobile signatures across several applications. By country Finland The Finnish government has supervised the deployment of a common derivative of the ETSI-based mobile signature service standard, thus allowing the Finnish mobile operators to offer mobile signature services. The Finnish government certificate authority (CA) also issues the certificates that link the digital keys on the SIM card to the person’s real world identity. Islamic Republic of Iran Through national mobile register program Iranian customs administration and ministr
https://en.wikipedia.org/wiki/AT91CAP
AT91CAP (AT91CAP Customizable Atmel Microcontrollers) is a family of Atmel microcontrollers based on the 32-bit RISC microprocessors from ARM. They include a block of metal-programmable logic gates (MP Block) that can be personalized by the application developer. The MP Block can contain one or more additional processor cores, additional peripherals or interfaces, or application-specific logic such as a GPS correlator. CAP products feature embedded SRAM and ROM memories and an external bus for additional memories including flash memory, together with a number of peripherals and standard communications and networking interfaces. This qualifies them as system-on-a-chip devices. External interfaces include USB, CAN, Ethernet, SPI, USART and ADC. A DMA controller provides direct communication channels between external interfaces and memories, increasing data throughput with minimal processor intervention. Peripherals include counter/timers, power-on reset generators, voltage regulators and advanced interrupt controller. This enhances the real time performance of the processor. A power management controller keeps power consumption to a minimum by powering down unused peripherals and interfaces, and enabling the processor to be put in standby mode. The AT91CAP comes in both ARM7 and ARM9 versions. The CAP design flow emphasizes parallel hardware/software development. An FPGA-based emulation board enables the hardware and software of the application under development to be thoroughly tested at close to full operational speed in order to validate the functionality of the device.
https://en.wikipedia.org/wiki/Mobile%20signature%20roaming
In mobile telecommunications technology, the concept of mobile signature roaming means that an access point (AP) should be able to get a mobile signature from any end-user, even if the AP and the end-user have not contracted a commercial relationship with the same MSSP. Otherwise, an AP would have to establish commercial terms with as many MSSPs as possible, and this might be a cost burden. This means that a mobile signature transaction issued by an application provider should be able to reach the appropriate MSSP, and this should be transparent to the AP. Mobile signature roaming itself requires commercial agreements between the entities that facilitate it. In this respect, we assume that various entities (including MSSPs) will join in order to define common commercial terms and rules corresponding to a mobile signature roaming service. This is the concept of a mobile signature roaming service. Entities involved Acquiring Entity (AE): an entity performing this role is one of the entry points of the mesh, and handles commercial agreements with APs. The entry point in the mesh may be, for instance, an MSSP, or an aggregator of application providers in the context of particular communities of interest (e.g. payment associations, banks, MNOs etc.). That is why this more abstract role is defined. An Acquiring Entity implements the Web Service Interface specified in TS 102 204 [8]; Home MSSP (HMSSP): this is the MSSP that is able to deal with the current end-user and the current transaction; Routing Entity (RE): any entity that facilitates the communication between the AE and the Home MSSP; Attribute Provider: this role is described by Liberty Alliance [3]. One or several mesh members may undertake this role and store relevant attributes in order to facilitate the discovery of the Home MSSP by other mesh members; Identity Issuer: an entity that is able to make a link between a mobile signature and an end user's identity. Within a PKI system, this is typically the ce
https://en.wikipedia.org/wiki/Dynamic%20stochastic%20general%20equilibrium
Dynamic stochastic general equilibrium modeling (abbreviated as DSGE, or DGE, or sometimes SDGE) is a macroeconomic method often employed by monetary and fiscal authorities for policy analysis, for explaining historical time-series data, and for forecasting. DSGE econometric modelling applies general equilibrium theory and microeconomic principles in a tractable manner to model economic phenomena, such as economic growth and business cycles, as well as policy effects and market shocks. Terminology As a practical matter, people often use the term "DSGE models" to refer to a particular class of econometric, quantitative models of business cycles or economic growth called real business cycle (RBC) models. Considered classically quantitative, DSGE models were initially proposed by Kydland & Prescott, and Long & Plosser; Charles Plosser described RBC models as a precursor of DSGE modeling. DSGE models constitute the predominant framework of macroeconomic analysis through their coherent combination of micro-foundations and the optimising behaviour of rational agents. DSGE models are multi-faceted, allowing for a comprehensive analysis of macroeconomic effects; their defining characteristics, reflected in the name, are as follows: Dynamic: The effect of current choices on future uncertainty makes the models dynamic and assigns a certain relevance to the expectations of agents in forming macroeconomic outcomes. Stochastic: The models take into consideration the transmission of random shocks into the economy and the subsequent economic fluctuations. General: referring to the entire economy as a whole (within the model), in that price levels and output levels are determined jointly, as opposed to a partial equilibrium, where price levels are taken as given and only output levels are determined within the model economy. Equilibrium: Subscribing to the Walrasian, General Competitive
https://en.wikipedia.org/wiki/Electronic%20differential
In automotive engineering the electronic differential is a form of differential, which provides the required torque for each driving wheel and allows different wheel speeds. It is used in place of the mechanical differential in multi-drive systems. When cornering, the inner and outer wheels rotate at different speeds, because the inner wheels describe a smaller turning radius. The electronic differential uses the steering wheel command signal and the motor speed signals to control the power to each wheel so that all wheels are supplied with the torque they need. Functional description The classical automobile drivetrain is composed of a single internal combustion engine providing torque to one or more driving wheels. The most common solution is to use a mechanical device to distribute torque to the wheels. This mechanical differential allows different wheel speeds when cornering. With the emergence of electric vehicles, new drivetrain configurations are possible. Multi-drive systems become easy to implement due to the large power density of electric motors. These systems, usually with one motor per driving wheel, need an additional top-level controller which performs the same task as a mechanical differential. The ED scheme has several advantages over a mechanical differential: simplicity - it avoids additional mechanical parts such as a gearbox or clutch; independent torque for each wheel allows additional capabilities (e.g., traction control, stability control); reconfigurable - it is reprogrammable in order to include new features or be tuned according to the driver’s preferences; allows distributed regenerative braking; the torque is not limited by the wheel with the least traction, as it is with a mechanical differential; faster response times; accurate knowledge of traction torque per wheel. Applications Several applications of this technology have proven successful and have increased vehicle performance. The application range is wide and includes the h
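The wheel-speed computation described above can be illustrated with a simple kinematic (Ackermann-style) model: the steering command implies a turning radius, and each driven wheel is assigned a reference speed proportional to the radius it travels about the turn centre. This is a hedged sketch under a simplified single-track geometry, not any production controller; the function name, parameter names, and default dimensions are assumptions.

```python
import math

def wheel_speed_refs(v, steering_angle, wheelbase=2.5, track=1.5):
    """Kinematic reference speeds for the two driven wheels of one axle.

    v              vehicle speed at the axle centre (m/s)
    steering_angle steering command (rad); > 0 taken to mean turning left
    wheelbase      distance between axles (m) -- illustrative default
    track          distance between the two wheels (m) -- illustrative default
    Returns (v_left, v_right) in m/s.  Straight ahead, both equal v.
    """
    if abs(steering_angle) < 1e-9:
        return v, v                       # no speed difference when straight
    # Turn radius of the axle centre from the single-track (bicycle) model.
    radius = wheelbase / math.tan(abs(steering_angle))
    inner = v * (radius - track / 2) / radius   # inner wheel: smaller radius
    outer = v * (radius + track / 2) / radius   # outer wheel: larger radius
    # Turning left means the left wheel is the inner wheel.
    return (inner, outer) if steering_angle > 0 else (outer, inner)
```

By construction the two references average back to the vehicle speed, which is the behaviour a mechanical differential provides passively.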
https://en.wikipedia.org/wiki/Cepaea
Cepaea is a genus of large air-breathing land snails, terrestrial pulmonate gastropod molluscs in the family Helicidae. The shells are often brightly colored and patterned with brown stripes. The two species in this genus, C. nemoralis and C. hortensis, are widespread and common in Western and Central Europe and have been introduced to North America. Both have been influential model species for ongoing studies of genetics and natural selection. Like many Helicidae, these snails use love darts during mating. Species For a long time, four species were classified in the genus Cepaea. However, molecular phylogenetic studies suggested that two of them should be placed in the genera Macularia and Caucasotachea, which are not immediate relatives either of Cepaea or each other: Cepaea hortensis (O. F. Müller, 1774) – white-lipped snail or garden banded snail Cepaea nemoralis (Linnaeus, 1758) – brown-lipped snail or grove snail Cepaea sylvatica (Draparnaud, 1801), now Macularia sylvatica Cepaea vindobonensis (Férussac, 1821), now Caucasotachea vindobonensis Interspecific relations The range of C. hortensis extends further north than that of C. nemoralis in Scotland and Scandinavia and it is the only one of the two species in Iceland. Likewise in the Swiss Alps C. hortensis is found as high as 2050 m, but C. nemoralis only up to 1600 m. Conversely, the southern edge of the range lies further north in C. hortensis; unlike C. nemoralis it does not occur in Italy, and in Spain it has a more restricted distribution (in the north-east corner). Where the ranges overlap C. hortensis prefers cooler sites with longer and damper vegetation. But the two species often co-occur at a site, in which situation the densities of both affect each other's growth, fecundity and mortality. However, they differ somewhat in their behaviour: C. hortensis is more active at lower temperatures, aestivates higher on the vegetation and is more diurnal, although this appears to be independent of wh
https://en.wikipedia.org/wiki/Megan%20and%20Morag
Megan and Morag, two domestic sheep, were the first mammals to have been successfully cloned from differentiated cells. They are not to be confused with Dolly the sheep, which was the first animal to be successfully cloned from an adult somatic cell, or Polly the sheep, which was the first cloned and transgenic animal. Megan and Morag, like Dolly and Polly, were cloned at the Roslin Institute in Edinburgh, Scotland in 1995. Background The team at the Roslin Institute were seeking a way to modify the genetic constitution of sheep and cattle more effectively than the hit-and-miss method that was the only one available at the time – microinjection. In microinjection, DNA is injected into the pronuclei of fertilized oocytes. However, only a small proportion of the animals will integrate the injected DNA into their genome, and in the rare cases that they do integrate this new genetic information, the pattern of expression of the injected gene, due to the random integration, is very variable. The team chose to combine two approaches – microinjection and embryonic stem cells. In order to achieve this they decided to try to transfer the nucleus from one cell to another and stimulate this new cell to grow and become an animal, a process known as nuclear transfer. The team at the Roslin Institute tried to make immortalized and undifferentiated embryonic stem cell lines in sheep, but failed. As a result, they decided to work with cultured blastocyst cells. The nuclear material of these blastocyst cells would be transferred into an unfertilized sheep egg cell, an oocyte from which the nucleus had been removed. To optimize the chances of successful nuclear transfer, they put the cultured cells into a state of quiescence, similar to the state of the unfertilized egg cell. Nuclear transfer was done using electrical stimuli both to fuse the cultured cell with the enucleated egg and to kick-start embryonic development. From 244 nucle
https://en.wikipedia.org/wiki/Nanmu
Nanmu () is a precious wood that is unique to China and South Asia, and was historically used for boat building, architectural woodworking, furniture and sculptural carving in China. Ming Dynasty-era writings describe this wood as a superior, durable softwood. A recent excavation of a tomb in Lija village in Jing'an County, Jiangxi Province, found 47 coffins made of nanmu wood that are reported to be about 2500 years old, dating back to the Eastern Zhou Dynasty period and belonging to the Dongyi State of Xu. The trees that produce nanmu wood are evergreens with long, straight trunks which grow to 35 meters in height and one meter in diameter. More than 30 varieties exist, which are found south of the Yangtze River. There are also nanmu trees on Hainan Island and in Vietnam. Yangmu nanmu is found in Sichuan. Zinan nanmu is found in southeast and south central China. Zinanmu nanmu is found in Jiangsu, Anhui, Zhenjiang. Zhennan nanmu is found in Guizhou and Sichuan. The finest of all the nanmu woods is Hongmao Nanmu (Phoebe hungmaoensis) from Hainan Island. Nanmu wood comes from several species of tree, including: Litsea cubeba Machilus nanmu Phoebe hungmaoensis Phoebe zhennan It is trees of the Phoebe genus, however, that produce the highest grades of nanmu wood. Nanmu is a knotty wood that frequently shows a wavy or quilted grain figure. It does not react much to humidity and temperature in the way of expansion or contraction, and makes superior furniture which tends not to loosen or crack because of changes in climate. Nanmu woods that are lighter in color and have loose grain are considered inferior. Nanmu was used in architectural woodworking and boatbuilding due to its resistance to decay. The wood dries with little splitting or warping. After drying, the wood is of medium density and does not change shape. Nanmu can be sanded to a mirror finish. The highest grade of nanmu wood has a bright golden color, a pleasant fragrance, and exhibits impress
https://en.wikipedia.org/wiki/Layered%20double%20hydroxides
Layered double hydroxides (LDH) are a class of ionic solids characterized by a layered structure with the generic layer sequence [AcB Z AcB]n, where c represents layers of metal cations, A and B are layers of hydroxide (OH−) anions, and Z are layers of other anions and neutral molecules (such as water). Lateral offsets between the layers may result in longer repeating periods. The intercalated anions (Z) are weakly bound, often exchangeable; their intercalation properties have scientific interest and industrial applications. LDHs occur in nature as minerals, as byproducts of metabolism of certain bacteria, and also unintentionally in man-made contexts, such as the products of corrosion of metal objects. Structure and formulas LDHs can be seen as derived from hydroxides of divalent cations (d) with the brucite (Mg(OH)2) layer structure [AdB AdB]n, by cation (c) replacement (Mg2+ → Al3+), or by cation oxidation (Fe2+ → Fe3+ in the case of green rust, derived from Fe(OH)2), in the metallic divalent (d) cation layers, so as to give them an excess positive electric charge; and intercalation of extra anion layers (Z) between the hydroxide layers (A,B) to neutralize that charge, resulting in the structure [AcB Z AcB]n. LDHs can be formed with a wide variety of anions in the intercalated layers (Z), such as Cl−, Br−, NO3−, CO32−, SO42− and SeO42−. This structure is unusual in solid state chemistry, since many materials with similar structure (such as montmorillonite and other clay minerals) have negatively charged main metal layers (c) and positive ions in the intercalated layers (Z). In the most studied class of LDHs, the positive layer (c) consists of divalent (M2+) and trivalent (M3+) cations, and can be represented by the formula [M2+(1−x) M3+(x) (OH)2]x+ [(Xn−)(x/n) · y H2O]x−, where Xn− is the intercalating anion (or anions). Most commonly, M2+ = Ca2+, Mg2+, Mn2+, Fe2+, Co2+, Ni2+, Cu2+ or Zn2+, and M3+ is a trivalent cation, possibly of the same element. Fixed-composition phases have been shown to exist over the rang
https://en.wikipedia.org/wiki/Bombieri%20norm
In mathematics, the Bombieri norm, named after Enrico Bombieri, is a norm on homogeneous polynomials with coefficients in R or C (there is also a version for non-homogeneous univariate polynomials). This norm has many remarkable properties, the most important being listed in this article. Bombieri scalar product for homogeneous polynomials To start with the geometry, the Bombieri scalar product for homogeneous polynomials with N variables can be defined as follows using multi-index notation: by definition different monomials are orthogonal, so that ⟨X^α|X^β⟩ = 0 if α ≠ β, while by definition ⟨X^α|X^α⟩ = α!/|α|!. In the above definition and in the rest of this article the following notation applies: if α = (α1, …, αN), write |α| = α1 + ⋯ + αN and α! = α1!⋯αN! and X^α = X1^α1 ⋯ XN^αN. Bombieri inequality The fundamental property of this norm is the Bombieri inequality: let P, Q be two homogeneous polynomials respectively of degree d°(P) and d°(Q) with N variables, then, the following inequality holds: (d°(P))! (d°(Q))! / (d°(P) + d°(Q))! · ‖P‖² ‖Q‖² ≤ ‖P·Q‖² ≤ ‖P‖² ‖Q‖². Here the Bombieri inequality is the left hand side of the above statement, while the right side means that the Bombieri norm is an algebra norm. Giving the left hand side alone would be meaningless without that constraint, because in that case we could achieve the same result with any norm by multiplying the norm by a well chosen factor. This multiplicative inequality implies that the product of two polynomials is bounded from below by a quantity that depends on the multiplicand polynomials. Thus, this product cannot be arbitrarily small. This multiplicative inequality is useful in metric algebraic geometry and number theory. Invariance by isometry Another important property is that the Bombieri norm is invariant by composition with an isometry: let P, Q be two homogeneous polynomials of degree d with N variables and let h be an isometry of R^N (or C^N). Then we have ⟨P∘h|Q∘h⟩ = ⟨P|Q⟩. When P = Q this implies ‖P∘h‖ = ‖P‖. This result follows from a nice integral formulation of the scalar product: ⟨P|Q⟩ = c(N,d) ∫_S P(Z) conj(Q(Z)) dσ(Z), where S is the unit sphere of C^N with its canonical measure dσ, and c(N,d) is a normalizing constant depending only on N and d. Other inequalities Let P be a homogeneous polynomial of degree d with N variab
https://en.wikipedia.org/wiki/Progressive%20Graphics%20File
PGF (Progressive Graphics File) is a wavelet-based bitmapped image format that employs lossless and lossy data compression. PGF was created to improve upon and replace the JPEG format. It was developed at the same time as JPEG 2000 but with a focus on speed over compression ratio. PGF can operate at higher compression ratios without taking more encoding/decoding time and without generating the characteristic "blocky and blurry" artifacts of the original DCT-based JPEG standard. It also allows more sophisticated progressive downloads. Color models PGF supports a wide variety of color models: Grayscale with 1, 8, 16, or 31 bits per pixel Indexed color with palette size of 256 RGB color image with 12, 16 (red: 5 bits, green: 6 bits, blue: 5 bits), 24, or 48 bits per pixel ARGB color image with 32 bits per pixel L*a*b color image with 24 or 48 bits per pixel CMYK color image with 32 or 64 bits per pixel Technical discussion PGF claims to achieve improved compression quality over JPEG while adding or improving features such as scalability. Its compression performance is similar to that of the original JPEG standard. Very low and very high compression rates (including lossless compression) are also supported in PGF. The ability of the design to handle a very large range of effective bit rates is one of the strengths of PGF. For example, to reduce the number of bits for a picture below a certain amount, the advisable thing to do with the first JPEG standard is to reduce the resolution of the input image before encoding it — something that is ordinarily not necessary for that purpose when using PGF, because of its wavelet scalability properties. The PGF process chain contains the following four steps: Color space transform (in case of color images) Discrete Wavelet Transform Quantization (in case of lossy data compression) Hierarchical bit-plane run-length encoding Color components transformation Initially, images have to be transformed from the RGB color space to ano
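As an illustration of the discrete wavelet transform step in the process chain above, here is a one-level 1-D Haar transform and its exact inverse. This is only a sketch of the averaging/differencing idea behind wavelet coding; the actual PGF codec uses its own integer, multi-level, two-dimensional filters.

```python
def haar_1d(signal):
    """One level of the Haar wavelet transform on an even-length sequence.

    Each pair of samples becomes one approximation coefficient (their
    average) and one detail coefficient (half their difference).
    Returns (approximation, detail) lists, each half the input length.
    """
    assert len(signal) % 2 == 0, "even length required"
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def haar_1d_inverse(approx, detail):
    """Exactly invert haar_1d (perfect, lossless reconstruction)."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out
```

In a lossy mode, small detail coefficients would be quantized away before entropy coding; in a lossless mode the transform is inverted exactly, as above.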
https://en.wikipedia.org/wiki/Shapley%E2%80%93Shubik%20power%20index
The Shapley–Shubik power index was formulated by Lloyd Shapley and Martin Shubik in 1954 to measure the powers of players in a voting game. The index often reveals surprising power distribution that is not obvious on the surface. The constituents of a voting system, such as legislative bodies, executives, shareholders, individual legislators, and so forth, can be viewed as players in an n-player game. Players with the same preferences form coalitions. Any coalition that has enough votes to pass a bill or elect a candidate is called winning, and the others are called losing. Based on Shapley value, Shapley and Shubik concluded that the power of a coalition was not simply proportional to its size. The power of a coalition (or a player) is measured by the fraction of the possible voting sequences in which that coalition casts the deciding vote, that is, the vote that first guarantees passage or failure. The power index is normalized between 0 and 1. A power of 0 means that a coalition has no effect at all on the outcome of the game; and a power of 1 means a coalition determines the outcome by its vote. Also the sum of the powers of all the players is always equal to 1. There are some algorithms for calculating the power index, e.g., dynamic programming techniques, enumeration methods and Monte Carlo methods. Since Shapley and Shubik have published their paper, several axiomatic approaches have been used to mathematically study the Shapley–Shubik power index, with the anonymity axiom, the null player axiom, the efficiency axiom and the transfer axiom being the most widely used. However, these have been criticised, especially the transfer axiom, which has led to other axioms being proposed as a replacement. Examples Suppose decisions are made by majority rule in a body consisting of A, B, C, D, who have 3, 2, 1 and 1 votes, respectively. The majority vote threshold is 4. There are 4! = 24 possible orders for these members to vote: For each voting sequence the piv
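The definition above — over all orderings of the players, count how often each player's vote is the one that first reaches the quota — can be computed by direct enumeration for small bodies. A minimal sketch (the function and variable names are illustrative):

```python
from fractions import Fraction
from itertools import permutations

def shapley_shubik(weights, quota):
    """Shapley–Shubik power index by enumerating all voting orders.

    weights: dict mapping player -> number of votes
    quota:   number of votes needed to pass
    Returns a dict mapping player -> exact index as a Fraction.
    Enumeration is O(n!), which is fine for small bodies like the
    four-member example in the text.
    """
    players = list(weights)
    pivot_counts = dict.fromkeys(players, 0)
    n_orders = 0
    for order in permutations(players):
        total = 0
        for player in order:
            total += weights[player]
            if total >= quota:      # this vote first guarantees passage
                pivot_counts[player] += 1
                break
        n_orders += 1
    return {p: Fraction(c, n_orders) for p, c in pivot_counts.items()}
```

Applied to the example body (A, B, C, D with 3, 2, 1, 1 votes and quota 4), the enumeration over the 24 orders gives A an index of 1/2 and B, C, D an index of 1/6 each, and the indices sum to 1 as required.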
https://en.wikipedia.org/wiki/Tkemali
Tkemali (Georgian: ტყემალი) is a Georgian sauce primarily made of cherry plum, sometimes alucha or other varieties of plum. Both red and green varieties of plum are used. The flavor of the sauce varies, but generally tends to be pungently tart. To lower the tartness level, occasionally sweeter types of plums are added during preparation. Traditionally, besides plum the following ingredients are used: garlic, pennyroyal, cumin, coriander, dill, chili pepper and salt. Tkemali is used for fried or grilled meat, poultry and potato dishes, and has a place in Georgian cuisine similar to the one ketchup has in the United States. It can be made at home, but is also mass-produced by several Georgian and Russian companies. See also List of plum dishes List of dips List of sauces
https://en.wikipedia.org/wiki/Institute%20of%20Mathematics
This is a list of Institutes of Mathematics or Mathematical Institutes. Americas American Institute of Mathematics Clay Mathematics Institute, Cambridge, Massachusetts Centre de Recherches Mathématiques, at the Université de Montréal Center for Mathematical Modeling, at the University of Chile Centro de Investigación en Matemáticas, Guanajuato, Guanajuato in Mexico Courant Institute of Mathematical Sciences, at New York University Fields Institute, at the University of Toronto Institute for Advanced Study, in Princeton, New Jersey Institute for Mathematics and its Applications, at the University of Minnesota Institute for Pure and Applied Mathematics, at the University of California, Los Angeles Instituto Nacional de Matemática Pura e Aplicada, Rio de Janeiro, Brazil Mathematical Sciences Research Institute, at the University of California, Berkeley PPGMAp, at the Universidade Federal do Rio Grande do Sul in Brazil Europe Brunel Institute of Computational Mathematics, in Uxbridge, UK Central Economic Mathematical Institute, at the Russian Academy of Sciences Centre de Recerca Matemàtica, at the Autonomous University of Barcelona Centrum Wiskunde & Informatica, at Science Park, Amsterdam CoMPLEX, at University College London Fachinformationszentrum Karlsruhe, Germany Hamilton Mathematics Institute, at Trinity College, Dublin, Ireland Hausdorff Center for Mathematics, Bonn, Germany Institut de Mathématiques de Toulouse, France Institute for Experimental Mathematics, at the University of Duisburg-Essen in Germany Institute of Mathematics (National Academy of Sciences of Belarus) Institute of Mathematics of the National Academy of Sciences of Ukraine Institute of Mathematics and its Applications, a UK society The Institute of Mathematics and Computer Science, University of Latvia Institute of Mathematics and Informatics (Bulgarian Academy of Sciences) Institute of Mathematics of National Academy of Sciences of Armenia Institute of Mathema
https://en.wikipedia.org/wiki/Internet%20of%20things
The Internet of things (IoT) describes devices with sensors, processing ability, software and other technologies that connect and exchange data with other devices and systems over the Internet or other communications networks. The Internet of things encompasses electronics, communication and computer science engineering. Internet of things has been considered a misnomer because devices do not need to be connected to the public internet, they only need to be connected to a network, and be individually addressable. The field has evolved due to the convergence of multiple technologies, including ubiquitous computing, commodity sensors, and increasingly powerful embedded systems, as well as machine learning. Older fields of embedded systems, wireless sensor networks, control systems, automation (including home and building automation), independently and collectively enable the Internet of things. In the consumer market, IoT technology is most synonymous with "smart home" products, including devices and appliances (lighting fixtures, thermostats, home security systems, cameras, and other home appliances) that support one or more common ecosystems, and can be controlled via devices associated with that ecosystem, such as smartphones and smart speakers. IoT is also used in healthcare systems. There are a number of concerns about the risks in the growth of IoT technologies and products, especially in the areas of privacy and security, and consequently there have been industry and government moves to address these concerns, including the development of international and local standards, guidelines, and regulatory frameworks. History The main concept of a network of smart devices was discussed as early as 1982, with a modified Coca-Cola vending machine at Carnegie Mellon University becoming the first ARPANET-connected appliance, able to report its inventory and whether newly loaded drinks were cold or not. Mark Weiser's 1991 paper on ubiquitous computing, "The Computer of
https://en.wikipedia.org/wiki/SAT%20Subject%20Test%20in%20Mathematics%20Level%202
In the U.S., the SAT Subject Test in Mathematics Level 2 (formerly known as Math II or Math IIC, the "C" representing the use of a calculator) was a one-hour multiple choice test. The questions covered a broad range of topics. Approximately 10-14% of questions focused on numbers and operations, 48-52% focused on algebra and functions, 28-32% focused on geometry (coordinate, three-dimensional, and trigonometric geometry were covered; plane geometry was not directly tested), and 8-12% focused on data analysis, statistics and probability. Compared to Mathematics 1, Mathematics 2 was more advanced. Whereas the Mathematics 1 test covered Algebra II and basic trigonometry, a pre-calculus class was good preparation for Mathematics 2. On January 19, 2021, the College Board discontinued all SAT Subject Tests, including the SAT Subject Test in Mathematics Level 2. This was effective immediately in the United States, and the tests were to be phased out by the following summer for international students. This was done as a response to changes in college admissions due to the impact of the COVID-19 pandemic on education. Format The test had 50 multiple choice questions that were to be answered in one hour. All questions had five answer choices. Students received 1 point for every correct answer, lost ¼ of a point for each incorrect answer, and received 0 points for questions left blank. Calculator use The College Board stated that a calculator "may be useful or necessary" for about 55-60% of the questions on the test. The College Board also encouraged the use of a graphing calculator over a scientific calculator, saying that the test was "developed with the expectation that most students are using graphing calculators." For the Mathematics Level 2 test, students were not permitted to use calculators that have a QWERTY format keyboard, require an electrical outlet, make noise, use paper tape, have non-traditional methods of input (such as a stylus), or are part of a communication devic
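The scoring rule in the Format section (+1 per correct answer, −¼ per incorrect answer, 0 for blanks, across 50 questions) can be expressed as a small helper. The function name is illustrative, and this computes only the raw score, not the scaled 200–800 score:

```python
from fractions import Fraction

def raw_score(correct, incorrect, blank):
    """Raw score under the guessing-penalty rule for 5-choice questions:
    +1 per correct answer, -1/4 per incorrect answer, 0 for blanks."""
    assert correct + incorrect + blank == 50, "the test had 50 questions"
    return correct - Fraction(incorrect, 4)
```

For example, 40 correct, 8 incorrect and 2 blank gives 40 − 8/4 = 38.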
https://en.wikipedia.org/wiki/Umbrella%20sampling
Umbrella sampling is a technique in computational physics and chemistry, used to improve sampling of a system (or different systems) where ergodicity is hindered by the form of the system's energy landscape. It was first suggested by Torrie and Valleau in 1977. It is a particular physical application of the more general importance sampling in statistics. Systems in which an energy barrier separates two regions of configuration space may suffer from poor sampling. In Metropolis Monte Carlo runs, the low probability of overcoming the potential barrier can leave inaccessible configurations poorly sampled—or even entirely unsampled—by the simulation. An easily visualised example occurs with a solid at its melting point: considering the state of the system with an order parameter Q, both liquid (low Q) and solid (high Q) phases are low in energy, but are separated by a free energy barrier at intermediate values of Q. This prevents the simulation from adequately sampling both phases. Umbrella sampling is a means of "bridging the gap" in this situation. The standard Boltzmann weighting for Monte Carlo sampling is replaced by a potential chosen to cancel the influence of the energy barrier present. The Markov chain generated has a distribution given by: π(rN) = w(rN) exp(−U(rN)/kBT) / ∫ w(rN) exp(−U(rN)/kBT) drN, with U the potential energy, w(rN) a function chosen to promote configurations that would otherwise be inaccessible to a Boltzmann-weighted Monte Carlo run. In the example above, w may be chosen such that w = w(Q), taking high values at intermediate Q and low values at low/high Q, facilitating barrier crossing. Values for a thermodynamic property A deduced from a sampling run performed in this manner can be transformed into canonical-ensemble values by applying the formula: ⟨A⟩ = ⟨A/w⟩π / ⟨1/w⟩π, with the subscript π indicating values from the umbrella-sampled simulation. The effect of introducing the weighting function w(rN) is equivalent to adding a biasing potential V(rN) to the potential energy of the system. If the biasing potential is
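A minimal sketch of the idea, assuming a 1-D toy double well rather than a real molecular system: a Metropolis walk on the biased potential U + V crosses the barrier freely, and the canonical average of an observable is then recovered by reweighting with 1/w = exp(V/kBT) (here in units with kBT = 1). The potentials, step size, and run length are illustrative assumptions.

```python
import math
import random

def metropolis(total_energy, n_steps, beta=1.0, step=0.5, x0=0.0, seed=0):
    """Metropolis sampling of exp(-beta * E(x)) for a single 1-D coordinate."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)
        dE = total_energy(y) - total_energy(x)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            x = y
        samples.append(x)
    return samples

# Toy double well U(x) = (x^2 - 1)^2 with a barrier at x = 0.  The umbrella
# potential V flattens the barrier (U + V = 0 wherever U <= 1) while still
# confining the walk, so both wells are visited in a single run.
U = lambda x: (x * x - 1.0) ** 2
V = lambda x: -min(U(x), 1.0)
beta = 1.0

samples = metropolis(lambda x: U(x) + V(x), n_steps=20000, beta=beta)

# Recover the canonical average <A> = <A/w> / <1/w> with w(x) = exp(-beta*V(x)),
# so 1/w = exp(beta*V); here the observable is A(x) = x^2.
inv_w = [math.exp(beta * V(x)) for x in samples]
mean_x2 = sum(x * x * iw for x, iw in zip(samples, inv_w)) / sum(inv_w)
```

Because the chosen w is bounded, the reweighting factors 1/w stay within a narrow range and the unbiased estimate has low variance, which is the practical point of picking the umbrella well.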
https://en.wikipedia.org/wiki/Digital%20signal%20controller
A digital signal controller (DSC) is a hybrid of microcontrollers and digital signal processors (DSPs). Like microcontrollers, DSCs have fast interrupt responses, offer control-oriented peripherals like PWMs and watchdog timers, and are usually programmed using the C programming language, although they can be programmed using the device's native assembly language. On the DSP side, they incorporate features found on most DSPs such as single-cycle multiply–accumulate (MAC) units, barrel shifters, and large accumulators. Not all vendors have adopted the term DSC. The term was first introduced by Microchip Technology in 2002 with the launch of their 6000 series DSCs and subsequently adopted by most, but not all, DSC vendors. For example, Infineon and Renesas refer to their DSCs as microcontrollers. DSCs are used in a wide range of applications, but the majority go into motor control, power conversion, and sensor processing applications. Currently, DSCs are being marketed as green technologies for their potential to reduce power consumption in electric motors and power supplies. In order of market share, the top three DSC vendors are Texas Instruments, Freescale, and Microchip Technology, according to market research firm Forward Concepts (2007). These three companies dominate the DSC market, with other vendors such as Infineon and Renesas taking a smaller share. DSC chips NOTE: Data is from 2012 (Microchip and TI) and the table currently only includes offerings from the top three DSC vendors. DSC software DSCs, like microcontrollers and DSPs, require software support. There are a growing number of software packages that offer the features required by both DSP applications and microcontroller applications. With this broader set of requirements, software solutions are rarer. They require: development tools, DSP libraries, optimization for DSP processing, fast interrupt handling, multi-threading, and a tiny footprint.
https://en.wikipedia.org/wiki/Osteoimmunology
Osteoimmunology (ὀστέον, osteon, from Greek, "bone"; immunitas, from Latin, "immunity"; and λόγος, logos, from Greek, "study") is a field that emerged about 40 years ago and studies the interface between the skeletal system and the immune system, comprising the "osteo-immune system". Osteoimmunology also studies the shared components and mechanisms between the two systems in vertebrates, including ligands, receptors, signaling molecules and transcription factors. Over the past decade, osteoimmunology has been investigated clinically for the treatment of bone metastases, rheumatoid arthritis (RA), osteoporosis, osteopetrosis, and periodontitis. Studies in osteoimmunology reveal relationships between molecular communication among blood cells and structural pathologies in the body. System similarities The RANKL-RANK-OPG axis (OPG stands for osteoprotegerin) is an example of an important signaling system functioning both in bone and immune cell communication. RANKL is expressed on osteoblasts and activated T cells, whereas RANK is expressed on osteoclasts and dendritic cells (DCs), both of which can be derived from myeloid progenitor cells. Surface RANKL on osteoblasts as well as secreted RANKL provide necessary signals for osteoclast precursors to differentiate into osteoclasts. RANKL expression on activated T cells leads to DC activation through binding to RANK expressed on DCs. OPG, produced by DCs, is a soluble decoy receptor for RANKL that competitively inhibits RANKL binding to RANK. Crosstalk The bone marrow cavity is important for the proper development of the immune system, and houses important stem cells for maintenance of the immune system. Within this space, as well as outside of it, cytokines produced by immune cells also have important effects on regulating bone homeostasis. Some important cytokines that are produced by the immune system, including RANKL, M-CSF, TNFα, ILs, and IFNs, affect the differentiation and activity of osteoclasts and bone resorption. Such
https://en.wikipedia.org/wiki/Methoxy%20arachidonyl%20fluorophosphonate
Methoxy arachidonyl fluorophosphonate, commonly referred to as MAFP, is an irreversible active site-directed enzyme inhibitor that inhibits nearly all serine hydrolases and serine proteases. It inhibits phospholipase A2 and fatty acid amide hydrolase with special potency, displaying IC50 values in the low-nanomolar range. In addition, it binds to the CB1 receptor in rat brain membrane preparations (IC50 = 20 nM), but does not appear to agonize or antagonize the receptor, though some related derivatives do show cannabinoid-like properties. See also DIFP – diisopropyl fluorophosphate, a related inhibitor IDFP – isopropyl dodecylfluorophosphonate, another related inhibitor with selectivity for FAAH and MAGL Activity-based probes
https://en.wikipedia.org/wiki/Memory-level%20parallelism
In computer architecture, memory-level parallelism (MLP) is the ability to have multiple memory operations pending at the same time, in particular cache misses or translation lookaside buffer (TLB) misses. In a single processor, MLP may be considered a form of instruction-level parallelism (ILP). However, ILP is often conflated with superscalar execution, the ability to execute more than one instruction at the same time; e.g. a processor such as the Intel Pentium Pro is five-way superscalar, with the ability to start executing five different microinstructions in a given cycle, but it can handle four different cache misses for up to 20 different load microinstructions at any time. It is possible to have a machine that is not superscalar but which nevertheless has high MLP. Arguably, a machine that has no ILP, which is not superscalar, which executes one instruction at a time in a non-pipelined manner, but which performs hardware prefetching (not software instruction-level prefetching) exhibits MLP (due to multiple prefetches outstanding) but not ILP. This is because there are multiple memory operations outstanding, but not instructions. Instructions are often conflated with operations. Furthermore, multiprocessor and multithreaded computer systems may be said to exhibit MLP and ILP due to parallelism between threads—but not intra-thread, single-process ILP and MLP. Often, however, the terms MLP and ILP are restricted to refer to extracting such parallelism from what appears to be non-parallel single-threaded code. See also Memory disambiguation Memory dependence prediction Hardware scout Runahead
https://en.wikipedia.org/wiki/Doxastic%20logic
Doxastic logic is a type of logic concerned with reasoning about beliefs. The term derives from the Ancient Greek δόξα (doxa, "opinion, belief"), from which the English term doxa ("popular opinion or belief") is also borrowed. Typically, a doxastic logic uses the notation Bp to mean "It is believed that p is the case", and the set B = {b1, ..., bn} denotes a set of beliefs. In doxastic logic, belief is treated as a modal operator. There is complete parallelism between a person who believes propositions and a formal system that derives propositions. Using doxastic logic, one can express the epistemic counterpart of Gödel's incompleteness theorem of metalogic, as well as Löb's theorem, and other metalogical results in terms of belief. Types of reasoners To demonstrate the properties of sets of beliefs, Raymond Smullyan defines the following types of reasoners: Accurate reasoner: An accurate reasoner never believes any false proposition. (modal axiom T) Inaccurate reasoner: An inaccurate reasoner believes at least one false proposition. Consistent reasoner: A consistent reasoner never simultaneously believes a proposition and its negation. (modal axiom D) Normal reasoner: A normal reasoner is one who, while believing p, also believes they believe p: Bp → BBp (modal axiom 4). A variation on this would be someone who, while not believing p, also believes they don't believe p: ¬Bp → B¬Bp (modal axiom 5). Peculiar reasoner: A peculiar reasoner believes proposition p while also believing they do not believe p: Bp ∧ B¬Bp. Although a peculiar reasoner may seem like a strange psychological phenomenon (see Moore's paradox), a peculiar reasoner is necessarily inaccurate but not necessarily inconsistent. Regular reasoner: A regular reasoner is one who, while believing p → q, also believes Bp → Bq. Reflexive reasoner: A reflexive reasoner is one for whom every proposition p has some proposition q such that the reasoner believes q ≡ (Bq → p). If a reflexive reasoner of type 4 [see below] believes Bp → p, they will believe p. This is a parallelism
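As a toy illustration, not drawn from the source, some of these reasoner types can be checked mechanically over an explicit finite belief set. The string encoding ("B:" for the belief operator, "~" for negation) and the restriction of the checks to non-modal propositions are simplifying assumptions made here.

```python
def believes(beliefs, x):
    return x in beliefs

def is_consistent(beliefs):
    # never simultaneously believes a proposition and its negation (axiom D)
    return not any(("~" + x) in beliefs for x in beliefs)

def is_normal(beliefs):
    # while believing p, also believes they believe p (axiom 4);
    # checked here only for non-modal propositions p, a simplification
    return all(believes(beliefs, "B:" + x)
               for x in beliefs if not x.startswith(("~", "B:")))

def is_peculiar(beliefs):
    # believes p while also believing they do not believe p
    return any(believes(beliefs, "~B:" + x)
               for x in beliefs if not x.startswith(("~", "B:")))

# A peculiar reasoner: believes p, and believes that they do not believe p.
peculiar = {"p", "~B:p"}
```

Checking the example set reflects Smullyan's remark above: the set is peculiar yet consistent, since no proposition and its negation are both believed.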
https://en.wikipedia.org/wiki/Shorthand%20for%20orchestra%20instrumentation
The shorthand for the instrumentation of a symphony orchestra (and other similar ensembles) is used to outline which instruments, and how many of each (especially wind instruments), are called for in a given piece of music. The shorthand is ordered in the same fashion as the parts of the individual instruments in the score (when read from top to bottom). General approach The orchestra is divided into five groups and specified as follows: Woodwind instruments: flutes, oboes, clarinets, saxophones (if one or more are needed), bassoons Brass instruments: horns, trumpets, trombones, tubas Percussion: timpani, snare drum, bass drum, chimes, etc. Keyboard instruments: celesta, organ, piano String instruments: harp, violins, violas, cellos, basses, frequently abbreviated to 'str', 'strs' or similar. If any soloists or a choir are called for, their parts are usually printed between the percussion/keyboards and the strings in the score. The basic order of the instruments, as seen above, is common to all of the shorthand systems. However, there is no standardized version of this shorthand; different publishers and librarians use different systems, especially for doubling/alternate/additional instruments. David Daniels, in earlier versions of his influential work that collects in print a catalog of the instrumentations of some 4,000+ pieces, made use of a shorthand for doubling/alternate/additional instruments which was less clear, but in the newer online version Daniels' approach has been refined to something more explicit, akin to the Chester Novello and Boosey & Hawkes notations below. Examples for different notations (the instrumentation of John Adams' Harmonielehre is used here as an example): Examples An example of another approach, particularly useful where there may be extensive versatility required from doubling players, is given here for The Phantom of the Opera for a 45-part orchestra, taken from the Chester/Novello Hire Library: WW1(fl,pic).WW2(fl).WW3(cl).WW4
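The basic four-number-per-group convention can be parsed mechanically. A minimal sketch, not from the source, handling only the plain woodwind and brass figures with a "-" separator and no doubling or addition annotations (the instrument names and the example code are illustrative assumptions):

```python
# Score order of the plain four-number form; saxophones are omitted because
# they appear only when explicitly called for.
WOODWINDS = ["flutes", "oboes", "clarinets", "bassoons"]
BRASS = ["horns", "trumpets", "trombones", "tubas"]

def parse_shorthand(code):
    """Parse e.g. '2.2.2.2-4.2.3.1' into a mapping of instrument counts.
    Assumes the plain form: four woodwind numbers, a separator, four brass numbers."""
    ww, br = code.split("-")
    counts = {}
    for names, part in ((WOODWINDS, ww), (BRASS, br)):
        numbers = [int(n) for n in part.split(".")]
        if len(numbers) != len(names):
            raise ValueError("expected four numbers per group")
        counts.update(zip(names, numbers))
    return counts
```

For example, parse_shorthand("2.2.2.2-4.2.3.1") yields the standard double-wind orchestra: 2 flutes, 2 oboes, 2 clarinets, 2 bassoons, 4 horns, 2 trumpets, 3 trombones and 1 tuba.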
https://en.wikipedia.org/wiki/Weaver%20syndrome
Weaver syndrome is a rare autosomal dominant genetic disorder associated with rapid growth beginning in the prenatal period and continuing through the toddler and youth years. It is characterized by advanced osseous maturation and distinctive craniofacial, skeletal and neurological abnormalities. It is similar to Sotos syndrome and is classified as an overgrowth syndrome. Its genetic cause was identified in 2011 as mutations in the EZH2 gene. Forty-eight cases had been documented and confirmed, and its prevalence is estimated to be similar to that of Sotos syndrome, around 1 in 15,000. It was first described by American physician David Weaver in 1974. Signs and symptoms Children with Weaver syndrome tend to look similar and have distinctive physical and craniofacial characteristics, which may include several, but not all, of the following features: Macrocephaly Large bifrontal diameter Flattened occiput Long philtrum Retrognathia Round face in infancy Prominent chin crease Large ears Strabismus Hypertelorism Epicanthal folds Downslanting palpebral fissures Other features may include loose skin, thin deep-set nails, thin hair, short ribs, limited elbow and knee extension, camptodactyly, and a coarse, low-pitched voice. Delayed development of motor skills such as sitting, standing, and walking is commonly exhibited in early childhood. Patients with Weaver syndrome typically have mild intellectual disability with poor coordination and balance. They also have some neurological abnormalities such as speech delay, epilepsy, hypotonia or hypertonia, and behavioral problems. Cause The cause of Weaver syndrome was identified in 2011 as autosomal dominant mutations in the EZH2 gene on chromosome 7q36. EZH2 (Enhancer of Zeste, Drosophila, homolog 2) is the second histone methyltransferase associated with human overgrowth. It encodes the catalytic component of the PRC2 protein complex (Polycomb Repressive Complex 2), which regulat
https://en.wikipedia.org/wiki/Deutschsprachige%20Mykologische%20Gesellschaft
Deutschsprachige Mykologische Gesellschaft (DMykG) e.V. (German-Speaking Mycological Society) is a recognised non-profit organisation. The society was founded in 1961 as a platform for all scientists of the German-speaking area who are interested in mycology from either a medical or a veterinary standpoint, i.e. medical mycology or veterinary mycology. Promoting science and research is a prime concern. The society is based in the city of Essen. The society currently has about 500 members and organises yearly meetings. These meetings are held for several days each year and are dubbed Myk. Working parties for "clinical mycology" as well as "mycological laboratory diagnostics" make major contributions to the work of the society. The scientific organ of the society is the internationally renowned journal Mycoses. Moreover, the society publishes a scientific magazine dubbed Mykologie Forum, which is distributed four times a year, in a circulation of about 5,000 copies, to members as well as further interested groups of physicians. Mycological quality management forms a major part of the work of the society. In this context there is a focus on the preparation and updating of guidelines. Currently, the society provides, under the umbrella of the Arbeitsgemeinschaft der Medizinisch-Wissenschaftlichen Fachgesellschaften (AWMF), six guidelines (for the electronic version see www.awmf-leitlinien.de), namely "tinea of glabrous skin", "onychomycosis", "vulvovaginal candidosis", "cutaneous candidosis", "oral candidosis", and "tinea capitis". Recently, English versions of the German-language guidelines have started to be published in Mycoses. Internationally, the Deutschsprachige Mykologische Gesellschaft cooperates with the International Society for Human and Animal Mycology (ISHAM) as well as the European Confederation of Medical Mycology (ECMM). Promotion of the careers of younger mycologists is a further major concern. In this context a prize for the promotion of mycol
https://en.wikipedia.org/wiki/Gromov%27s%20systolic%20inequality%20for%20essential%20manifolds
In the mathematical field of Riemannian geometry, M. Gromov's systolic inequality bounds the length of the shortest non-contractible loop on a Riemannian manifold in terms of the volume of the manifold. Gromov's systolic inequality was proved in 1983; it can be viewed as a generalisation, albeit non-optimal, of Loewner's torus inequality and Pu's inequality for the real projective plane. Technically, let M be an essential Riemannian manifold of dimension n; denote by sysπ1(M) the homotopy 1-systole of M, that is, the least length of a non-contractible loop on M. Then Gromov's inequality takes the form sysπ1(M)^n ≤ Cn vol(M), where Cn is a universal constant only depending on the dimension of M. Essential manifolds A closed manifold is called essential if its fundamental class defines a nonzero element in the homology of its fundamental group, or more precisely in the homology of the corresponding Eilenberg–MacLane space. Here the fundamental class is taken in homology with integer coefficients if the manifold is orientable, and with coefficients modulo 2 otherwise. Examples of essential manifolds include aspherical manifolds, real projective spaces, and lens spaces. Proofs of Gromov's inequality Gromov's original 1983 proof is about 35 pages long. It relies on a number of techniques and inequalities of global Riemannian geometry. The starting point of the proof is the embedding of X into the Banach space of Borel functions on X, equipped with the sup norm. The embedding is defined by mapping a point p of X to the real function on X given by the distance from the point p. The proof utilizes the coarea inequality, the isoperimetric inequality, the cone inequality, and the deformation theorem of Herbert Federer. Filling invariants and recent work One of the key ideas of the proof is the introduction of filling invariants, namely the filling radius and the filling volume of X. Namely, Gromov proved a sharp inequality relating the systole and the filling radius, valid for all ess
https://en.wikipedia.org/wiki/Moshe%20Vardi
Moshe Ya'akov Vardi is an Israeli mathematician and computer scientist. He is the Karen Ostrum George Distinguished Service Professor in Computational Engineering at Rice University, United States, and a faculty advisor for the Ken Kennedy Institute. His interests focus on applications of logic to computer science, including database theory, finite model theory, knowledge in multi-agent systems, computer-aided verification and reasoning, and teaching logic across the curriculum. He is an expert in model checking, constraint satisfaction and database theory, common knowledge (logic), and theoretical computer science. Vardi has authored or co-authored over 600 technical papers as well as editing several collections. He has authored the books Reasoning About Knowledge with Ronald Fagin, Joseph Halpern, and Yoram Moses, and Finite Model Theory and Its Applications with Erich Grädel, Phokion G. Kolaitis, Leonid Libkin, Maarten Marx, Joel Spencer, Yde Venema, and Scott Weinstein. He is senior editor of Communications of the ACM, after serving as its editor-in-chief for a decade. Education Vardi was an undergraduate student at Bar-Ilan University and received his Master of Science degree from the Weizmann Institute of Science. His PhD was supervised by Catriel Beeri and awarded by the Hebrew University of Jerusalem in 1981. Career and research Vardi's research interests are in logic and computation. He served as chair of the computer science department at Rice University from January 1994 until June 2002. Prior to joining Rice in 1993, he worked at IBM Research and was also a postdoctoral researcher at Stanford University. Vardi serves as an editor of several international journals and was formerly a director of the International Federation of Computational Logic Ltd. He has also co-chaired the Association for Computing Machinery (ACM) task force on job migration. Awards and honors Vardi is the recipient of three IBM Outstanding Innovation Awards, a co-winner of
https://en.wikipedia.org/wiki/Ambra%20Computer%20Corporation
Ambra Computer Corporation is a defunct wholly owned subsidiary of IBM. Created by Dr Richard Greame Ambra, it introduced a line of personal computers targeted at the home user, sold mainly through mail-order, first in Europe (1992), then in the USA (1993). Ambra had a volume production run of just a year or so; the line was discontinued in 1994 in favor of the IBM Aptiva (except for Canada, where it was not discontinued until 1996). Models 386 486 Achiever 2000 Achiever 3000 Achiever 4000 Achiever 5000 Achiever 7000 Achiever 9000 Achiever D Achiever DP Achiever S Achiever T Achiever Anthem Achiever Hurdla/Sprinta Notebook Ispirati (Canada) Positioning Ambra PCs were generally positioned at the low-end of the market, and made use of their ties with IBM in marketing materials in order to make the machines appear better quality than the host of clones, since 'real' IBM PCs were known to be expensive. In reality the machines were fairly low specification, having shadow-mask screens, minimal onboard peripherals, and using low-end processors with the minimum memory and hard disk size at each price. Television advertising for the brand in the UK used the slogan: "Take your mind for a run." Aesthetics The machines were coloured off-white, which was unusual at the time, since most machines were beige. Generally the cases were compact and offered little room for expansion. One notable aspect was the original Ambra mouse, which differed from almost all other designs in the position of its buttons. Conventional mice have the buttons on top: the user clicks by pressing down. The Ambra mouse had the buttons on the front, either side of the cable: the user clicked by pulling their finger backward, in a manner similar to squeezing a trigger. Criticisms led to Ambra changing to a more conventional design: one UK magazine review described the mouse as "looking like a torture device." Timeline See also Acquisition of the IBM PC business by Lenovo
https://en.wikipedia.org/wiki/RCA%20Spectra%2070
The RCA Spectra 70 was a line of electronic data processing (EDP) equipment manufactured by the Radio Corporation of America’s computer division beginning in April 1965. The Spectra 70 line included several CPU models, various configurations of core memory, mass-storage devices, terminal equipment, and a variety of specialized interface equipment. The system architecture and instruction set were largely compatible with the non-privileged instruction set of the IBM System/360, including use of the EBCDIC character set. While this degree of compatibility made some interchange of programs and data possible, differences in the operating system software precluded transparent movement of programs between the two systems. Competition in the mainframe market was fierce, and in 1971 the company sold the computer division and the Spectra 70 line to Sperry Rand, taking a huge write-down in the process. System overview Five models of the Spectra 70 CPU were announced around 1965, ranging from a small system (70/15) to a large-scale system (70/55). Some of the main features were: The systems were upward-compatible, allowing programs written for a smaller model to run on any larger machine in the series. Larger machines in the series were also faster, with memory access times ranging from two microseconds in the 70/15 to 0.84 microseconds in the 70/55. Memory capacities ranged from a minimum of 4,096 bytes (4 KB) in the 70/15 to a maximum of 524,288 bytes (512 KB) in the 70/55. All used the Extended Binary Coded Decimal Interchange Code (EBCDIC) of eight bits plus parity for internal data representation. The use of a standard electrical interface allowed the same peripherals to be used with any CPU model in the series. Simultaneous input and output was accomplished by the use of intelligent communication channels. Like the IBM 360, two types of channel were available (on all but the 70/15): selector channels which could address up to 256 devices (one at a time), a
https://en.wikipedia.org/wiki/Organellar%20DNA
Organellar DNA (oDNA) is DNA contained in organelles (such as mitochondria and chloroplasts), outside the nucleus of eukaryotic cells. Mitochondria contain mitochondrial DNA Plastids (e.g., chloroplasts) contain plastid DNA Inheritance of organelle DNA In animals, the traits encoded by this type of DNA generally pass from mother to offspring, rather than from the father, in a process called cytoplasmic inheritance. This is because the ovum provided by the mother is much larger than the male sperm cell and therefore contains more organelles, in which the organellar DNA is found. Although maternal inheritance is most common, there are also paternal and biparental patterns of inheritance; the latter two patterns are found most often in plants. Recombination of organelle DNA is very limited, meaning that any traits encoded by the oDNA are likely to remain the same as they are passed from generation to generation. Structure Unlike nuclear DNA, which is present as linear molecules inside the chromosomes, the entire genome of a chloroplast or mitochondrion is present on a single double-stranded circular DNA molecule, a structure very similar to a bacterial chromosome. Although the functionality and genetic structure vary significantly between different organelles and their host species, characteristic genetic patterns allow the differentiation between nuclear and organellar DNA, and a recently published machine-learning approach can classify them using only the genome sequences and multiple genome annotation tools. See also Nuclear DNA Non-Mendelian inheritance
https://en.wikipedia.org/wiki/Recurrence%20period%20density%20entropy
Recurrence period density entropy (RPDE) is a method, in the fields of dynamical systems, stochastic processes, and time series analysis, for determining the periodicity, or repetitiveness, of a signal. Overview Recurrence period density entropy is useful for characterising the extent to which a time series repeats the same sequence, and is therefore similar to linear autocorrelation and time-delayed mutual information, except that it measures repetitiveness in the phase space of the system, and is thus a more reliable measure based upon the dynamics of the underlying system that generated the signal. It has the advantage that it does not require the assumptions of linearity, Gaussianity or dynamical determinism. It has been successfully used to detect abnormalities in biomedical contexts such as speech signals. The RPDE value Hnorm is a scalar in the range zero to one. For purely periodic signals, Hnorm = 0, whereas for purely i.i.d., uniform white noise, Hnorm ≈ 1. Method description The RPDE method first requires the embedding of a time series in phase space, which, according to stochastic extensions to Takens' embedding theorems, can be carried out by forming the time-delayed vectors Xn = [xn, xn+τ, xn+2τ, ..., xn+(M−1)τ] for each value xn in the time series, where M is the embedding dimension, and τ is the embedding delay. These parameters are obtained by systematic search for the optimal set (due to lack of practical embedding parameter techniques for stochastic systems) (Stark et al. 2003). Next, around each point Xn in the phase space, an ε-neighbourhood (an M-dimensional ball with this radius) is formed, and every time the time series returns to this ball, after having left it, the time difference T between successive returns is recorded in a histogram. This histogram is normalised to sum to unity, to form an estimate of the recurrence period density function P(T). The normalised entropy of this density, Hnorm = −(1/ln Tmax) Σ P(T) ln P(T), is the RPDE value, where Tmax is the largest recurrence value (typically on the order of 1000 samples). Note that RPDE i
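The method description can be sketched in code. The following is an illustrative implementation, not the reference one: the parameter values, the simple first-return search, and the histogram details are assumptions made here.

```python
import numpy as np

def rpde(x, m=4, tau=10, eps=0.12, tmax=500):
    """Recurrence period density entropy, normalised to [0, 1] (sketch)."""
    # Time-delay embedding: X[i] = (x[i], x[i+tau], ..., x[i+(m-1)*tau]).
    n = len(x) - (m - 1) * tau
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    periods = []
    for i in range(n - 1):
        d = np.linalg.norm(X[i + 1:] - X[i], axis=1)
        outside = d >= eps              # points outside the eps-ball around X[i]
        if not outside.any():
            continue
        first_out = np.argmax(outside)  # trajectory must first leave the ball...
        back = ~outside[first_out:]     # ...and then re-enter it
        if not back.any():
            continue
        T = int(first_out + np.argmax(back)) + 1   # first-return time
        if T <= tmax:
            periods.append(T)
    if not periods:
        return float("nan")             # no recurrences found
    # Normalised histogram of return times -> recurrence period density P(T).
    hist = np.bincount(periods, minlength=tmax + 1)[1:]
    P = hist / hist.sum()
    P = P[P > 0]
    # Normalised entropy: 0 for a single-bin (periodic) density, near 1 for a flat one.
    return float(-(P * np.log(P)).sum() / np.log(tmax))
```

On a pure sine wave every first return occurs after exactly one period, so the density collapses to a single bin and Hnorm is 0; on uniform white noise the return times spread over many bins, driving Hnorm up.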
https://en.wikipedia.org/wiki/Microarray%20databases
A microarray database is a repository containing microarray gene expression data. The key uses of a microarray database are to store the measurement data, manage a searchable index, and make the data available to other applications for analysis and interpretation (either directly, or via user downloads). Microarray databases can fall into two distinct classes: A peer-reviewed, public repository that adheres to academic or industry standards and is designed to be used by many analysis applications and groups. Good examples of this are the Gene Expression Omnibus (GEO) from NCBI and ArrayExpress from EBI. A specialized repository associated primarily with the brand of a particular entity (lab, company, university, consortium, group), an application suite, a topic, or an analysis method, whether it is commercial, non-profit, or academic. These databases might have one or more of the following characteristics: A subscription or license may be needed to gain full access, The content may come primarily from a specific group (e.g. SMD, UPSC-BASE, or the Immunological Genome Project), There may be constraints on who can use the data or for what purpose data can be used, Special permission may be required to submit new data, or there may be no obvious process at all, Only certain applications may be equipped to use the data, often also associated with the same entity (for example, caArray at NCI is specialized for caBIG), Further processing or reformatting of the data may be required for standard applications or analysis, They claim to address the 'urgent need' to have a standard, centralized repository for microarray data. (See YMD, last updated in 2003, for example), There is a claim to an incremental improvement over one of the public repositories, A meta-analysis application, which incorporates studies from one or more public databases (e.g. Gemma primarily uses GEO studies; NextBio uses various sources) Some of the best-known public, curated microarray
https://en.wikipedia.org/wiki/Pharmaceutical%20microbiology
Pharmaceutical microbiology is an applied branch of microbiology. It involves the study of microorganisms associated with the manufacture of pharmaceuticals, e.g. minimizing the number of microorganisms in a process environment, excluding microorganisms and microbial byproducts like exotoxin and endotoxin from water and other starting materials, and ensuring the finished pharmaceutical product is sterile. Other aspects of pharmaceutical microbiology include the research and development of anti-infective agents, the use of microorganisms to detect mutagenic and carcinogenic activity in prospective drugs, and the use of microorganisms in the manufacture of pharmaceutical products like insulin and human growth hormone. Drug safety Drug safety is a major focus of pharmaceutical microbiology. Pathogenic bacteria, fungi (yeasts and moulds) and toxins produced by microorganisms are all possible contaminants of medicines, although stringent, regulated processes are in place to ensure the risk is minimal. Antimicrobial activity and disinfection Another major focus of pharmaceutical microbiology is to determine how a product will react in cases of contamination. For example: You have a bottle of cough medicine. Imagine you take the lid off, pour yourself a dose and forget to replace the lid. You come back to take your next dose and discover that you did indeed leave the lid off for a few hours. What happens if a microorganism "fell in" whilst the lid was off? There are tests that examine exactly this: the product is "challenged" with a known amount of specific microorganisms, such as E. coli and C. albicans, and the anti-microbial activity is monitored. Pharmaceutical microbiology is additionally involved with the validation of disinfectants, either according to U.S. AOAC or European CEN standards, to evaluate the efficacy of disinfectants in suspension, on surfaces, and through field trials. Field trials help to establish the frequency of the application of detergents and disi
https://en.wikipedia.org/wiki/Compucolor
Compucolor is a series of color microcomputers introduced by Compucolor Corporation of Norcross, Georgia. It was the first color home computer system, with built-in color graphics and floppy-based data storage, and used the Intel 8080 CPU. The first model was an upgrade kit for the company's color computer terminal, turning the Intecolor 8001 into the Compucolor 8001 by adding more RAM and a number of optional storage systems. Released in 1976, the 8001 was soon replaced by the Compucolor II in 1977, although shipments did not start until the next year. The Compucolor II was smaller, less expensive, and used the newly introduced 5.25-inch floppy disks instead of the former 8-inch models. Compucolor opened its first retail computer store in Norcross, Georgia in 1979, aptly named the "Compucolor Computer Store." The store had limited success in its six months of operation, and the store concept was abandoned. By 1983, Compucolor was out of business. Compucolor, and its forerunner, Intecolor, produced three computer designs (Intecolor 8001, Compucolor 8001 and Compucolor II) over the life of the parent company, Intelligent Systems Corporation. ISC formed in 1973 to produce color terminals. Intecolor 8001 Intelligent Systems Corporation's first product was the Intecolor 8001, an intelligent terminal based on the Intel 8080. Released some time in early 1976, it consisted of a $1,395 kit based around a 19-inch RCA delta-gun CRT and came with 4 kB of random-access memory (RAM). The monitor's three separate electron guns produced a bright and colorful picture, but had the disadvantage of requiring constant adjustment to keep the guns properly aligned. It offered a graphics display with 192 × 160 resolution and a text display of 80 × 48 characters (in single-height mode) or 80 × 24 characters (in double-height mode), in 8 primary RGB colors (see Intecolor/Compucolor 8001 character set). Connectivity was limited to an RS-232 port. Compucolor 8001 In December 1976, the newly
https://en.wikipedia.org/wiki/Lockstep%20protocol
The lockstep protocol is a partial solution to the look-ahead cheating problem in peer-to-peer multiplayer game architectures, in which a cheating client delays its own actions to await the messages of other players. A client can do so by acting as if it is suffering from high latency; the outgoing packet is forged by attaching a time stamp that is prior to the actual moment the packet is sent. To avoid this method of cheating, the lockstep protocol requires each player to first announce a "commitment" (e.g. a hash value of the action). This commitment is a representation of an action that cannot be used to infer the action itself, yet can easily be checked against a revealed action. Once all players have received the commitments, they reveal their actions, which are compared with the corresponding commitments to ensure that each revealed action is indeed the one that was committed to. Drawbacks As all players must wait for all commitments to arrive before sending their actions, the game progresses as slowly as the player with the highest latency. Although this may not be noticeable in a turn-based game, real-time online games, such as first-person shooters, require much faster reactions. This can be mitigated by placing a limit on the time in which a player can announce their action. If no action is sent within this period, other players do not announce their actions to that player and ignore any action that arrives too late. Asynchronous lockstep protocol To overcome the obvious drawback of the simple lockstep protocol, an asynchronous variant of the protocol exists wherein players advance in time free of any negotiations with other players until interaction between players occurs, at which point they enter a "lockstep mode." This mode may be defined by a certain area around a player, such as a sphere, in which the game world may be affected by the player. Such an interaction can only occur when, for example, the areas of influence surrounding two players intersect.
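The commit-then-reveal exchange described above can be sketched in a few lines of Python. This is a minimal illustration, not any game's actual implementation: the function names are invented here, and the choice of a salted SHA-256 hash is an assumption (the salt keeps other players from brute-forcing the commitment over a small space of possible actions).

```python
import hashlib
import secrets


def commit(action: str) -> tuple[str, str]:
    """Create a salted SHA-256 commitment for an action.

    Returns (digest, salt). Only the digest is broadcast in the first
    round; the salt is kept secret until the reveal round.
    """
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + action).encode()).hexdigest()
    return digest, salt


def verify(commitment: str, salt: str, action: str) -> bool:
    """Check that a revealed (salt, action) pair matches a commitment."""
    return hashlib.sha256((salt + action).encode()).hexdigest() == commitment


# Round 1: each player broadcasts only its commitment.
digest, salt = commit("move_left")

# Round 2: once all commitments have arrived, players reveal
# (salt, action); peers check each reveal against the commitment.
assert verify(digest, salt, "move_left")       # honest reveal accepted
assert not verify(digest, salt, "move_right")  # altered action rejected
```

A player who tries to choose its action after seeing the others' reveals would have to produce a (salt, action) pair hashing to its earlier commitment, which the hash function makes infeasible.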
https://en.wikipedia.org/wiki/Philosophia%20Botanica
Philosophia Botanica ("Botanical Philosophy", ed. 1, Stockholm & Amsterdam, 1751) was published by the Swedish naturalist and physician Carl Linnaeus (1707–1778), who greatly influenced the development of botanical taxonomy and systematics in the 18th and 19th centuries. It is "the first textbook of descriptive systematic botany and botanical Latin". It also contains Linnaeus's first published description of his binomial nomenclature. Philosophia Botanica represents a maturing of Linnaeus's thinking on botany and its theoretical foundations, being an elaboration of ideas first published in his Fundamenta Botanica (1736) and Critica Botanica (1737), and set out in a similar way as a series of stark and uncompromising principles (aphorisms). The book also establishes a basic botanical terminology. The following principle, §79, demonstrates the style of presentation and Linnaeus's method of introducing his ideas. A detailed analysis of the work is given in Frans Stafleu's Linnaeus and the Linnaeans, pp. 25–78. Binomial nomenclature To understand the objectives of the Philosophia Botanica it is first necessary to appreciate the state of botanical nomenclature at the time of Linnaeus. In accordance with the provisions of the present-day International Code of Nomenclature for algae, fungi, and plants, the starting point for the scientific names of plants effectively dates back to the list of species enumerated in Linnaeus's Species Plantarum, ed. 1, published 1 May 1753. The Species Plantarum was, for European scientists, a comprehensive global Flora for its day. Linnaeus had learned plant names as short descriptive phrases (polynomials) known as nomina specifica. Each time a new species was described, the diagnostic phrase-names had to be adjusted, and lists of names, especially those including synonyms (alternative names for the same plant), became extremely unwieldy. Linnaeus's solution was to associate with the generic name an additional single word, what he termed