https://en.wikipedia.org/wiki/Graph%20neural%20network
A graph neural network (GNN) is a class of artificial neural networks for processing data that can be represented as graphs. In the more general subject of "geometric deep learning", certain existing neural network architectures can be interpreted as GNNs operating on suitably defined graphs. A convolutional neural network layer, in the context of computer vision, can be seen as a GNN applied to graphs whose nodes are pixels and only adjacent pixels are connected by edges in the graph. A transformer layer, in natural language processing, can be seen as a GNN applied to complete graphs whose nodes are words or tokens in a passage of natural language text. The key design element of GNNs is the use of pairwise message passing, such that graph nodes iteratively update their representations by exchanging information with their neighbors. Since their inception, several different GNN architectures have been proposed, which implement different flavors of message passing, beginning with recursive and convolutional constructive approaches. Whether it is possible to define GNN architectures "going beyond" message passing, or whether every GNN can be built on message passing over suitably defined graphs, remains an open research question. Relevant application domains for GNNs include natural language processing, social networks, citation networks, molecular biology, chemistry, physics and NP-hard combinatorial optimization problems. Several open source libraries implementing graph neural networks are available, such as PyTorch Geometric (PyTorch), TensorFlow GNN (TensorFlow), jraph (Google JAX), and GraphNeuralNetworks.jl (Julia, Flux). Architecture The architecture of a generic GNN implements the following fundamental layers: Permutation equivariant: a permutation equivariant layer maps a representation of a graph into an updated representation of the same graph. In the literature, permutation equivariant layers are implemented via pairwise message passing between graph nodes. Int
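The neighbor-aggregation update described above can be sketched in a few lines of NumPy. This is an illustrative toy, not the API of any of the libraries listed; the mean aggregation, the ReLU nonlinearity, and the two weight matrices are assumptions made for the sketch:

```python
import numpy as np

def message_passing_layer(H, A, W_self, W_nbr):
    """One round of pairwise message passing on a graph.

    H: (n, d) node features; A: (n, n) symmetric adjacency matrix.
    Each node aggregates (here: averages) its neighbors' features,
    then combines the aggregate with its own representation.
    """
    deg = A.sum(axis=1, keepdims=True)               # node degrees
    agg = (A @ H) / np.maximum(deg, 1)               # mean over neighbors
    return np.maximum(H @ W_self + agg @ W_nbr, 0.0) # ReLU update

# Tiny 4-node path graph: 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))        # initial node representations
W_self = rng.normal(size=(8, 8))
W_nbr = rng.normal(size=(8, 8))

H1 = message_passing_layer(H, A, W_self, W_nbr)
print(H1.shape)  # (4, 8) — same graph, updated node representations
```

Because the update only reads each node's own row and a degree-normalized sum over its neighbors, relabeling the nodes (permuting the rows of H and of A consistently) permutes the output rows the same way, which is the permutation-equivariance property named in the text.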
https://en.wikipedia.org/wiki/Q-Meixner%20polynomials
In mathematics, the q-Meixner polynomials are a family of basic hypergeometric orthogonal polynomials in the basic Askey scheme. The standard references give a detailed list of their properties. Definition The polynomials are given in terms of basic hypergeometric functions by
https://en.wikipedia.org/wiki/MycoKeys
MycoKeys is a peer-reviewed open access scientific journal covering mycology. It was established in 2011 by Pensoft Publishers. The editor-in-chief is H. Thorsten Lumbsch. Abstracting and indexing The journal is abstracted and indexed in: Science Citation Index Expanded. Current Contents/Agriculture, Biology & Environmental Sciences. BIOSIS Previews. According to the Journal Citation Reports, the journal has a 2022 impact factor of 3.3.
https://en.wikipedia.org/wiki/X%20Font%20Server
The X font server (xfs) provides a standard mechanism for an X server to communicate with a font renderer, frequently one running on a remote machine. It usually runs on TCP port 7100. Current status The use of server-side fonts is currently considered deprecated in favour of client-side fonts. Such fonts are rendered by the client, not by the server, with the support of the Xft2 or Cairo libraries and the XRender extension. For the few cases in which server-side fonts are still needed, the new servers have their own integrated font renderer, so that no external one is needed. Server-side fonts can now be configured in the X server configuration files. For example, a FontPath entry in the configuration file will set the server-side fonts for Xorg. No specification on client-side fonts is given in the core protocol. Future As of October 2006, the manpage for xfs on Debian states that: FUTURE DIRECTIONS Significant further development of xfs is unlikely. One of the original motivations behind xfs was the single-threaded nature of the X server — a user's X session could seem to "freeze up" while the X server took a moment to rasterize a font. This problem with the X server (which remains single-threaded in all popular implementations to this day) has been mitigated on two fronts: machines have gotten much faster, and client-side font rendering (particularly via the Xft library) has become the norm in contemporary software. Deployment issues The choice between local filesystem font access and xfs-based font access is purely a local deployment choice; xfs does not make much sense in a single-computer scenario. See also X Window System core protocol X logical font description
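Server-side fonts are configured with FontPath entries in the Files section of the Xorg configuration. A minimal sketch follows; the directory paths and the server hostname are placeholders, not values taken from the text above:

```
Section "Files"
    # Local, filesystem-based font directories
    FontPath "/usr/share/fonts/X11/misc"
    FontPath "/usr/share/fonts/X11/Type1"
    # A remote xfs instance on its usual TCP port (7100)
    FontPath "tcp/fontserver.example.com:7100"
EndSection
```

The `tcp/host:port` form is what ties Xorg to an xfs instance; the plain directory entries are the local-filesystem alternative discussed under "Deployment issues".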
https://en.wikipedia.org/wiki/FANCB
Fanconi anemia group B protein is a protein that in humans is encoded by the FANCB gene. Function The Fanconi anemia complementation group (FANC) currently includes FANCA, FANCB, FANCC, FANCD1 (also called BRCA2), FANCD2, FANCE, FANCF, FANCG, and FANCL. Fanconi anemia is a genetically heterogeneous recessive disorder characterized by cytogenetic instability, hypersensitivity to DNA crosslinking agents, increased chromosomal breakage, and defective DNA repair. The members of the Fanconi anemia complementation group do not share sequence similarity; they are related by their assembly into a common nuclear protein complex. This gene encodes the protein for complementation group B. Alternative splicing results in two transcript variants encoding the same protein. Gene FANCB is the only gene known to cause X-linked Fanconi anemia. In female carriers of FANCB mutations (one wild-type FANCB allele and one mutant FANCB allele) there is strong selection through X-inactivation for expression of only the wild-type allele. In contrast, males have only one FANCB allele. Only male patients with Fanconi anemia have ever been linked to FANCB mutations, and they make up about 4% of cases. Mutations in FANCB are highly associated with the development of the VACTERL-H constellation of birth defects. In a cohort study of 19 children with FANCB variants, those with deletion of the FANCB gene or truncation of the FANCB protein demonstrated earlier-than-average onset of bone marrow failure and more severe congenital abnormalities compared with a large series of Fanconi anemia individuals in published reports. This reflects the indispensable role of the FANCB gene in cells. For FANCB missense variants, more variable severity is associated with the extent of residual activity. Protein The FANCB gene product is the FANCB protein. FANCB is a component of a "core complex" of nine Fanconi anemia proteins: FANCA, FANCB, FANCC, FANCE, FANCF, FANCG, FANCL, FAAP100 and FAAP20. The core complex localises
https://en.wikipedia.org/wiki/Infinite%20Energy%20%28magazine%29
Infinite Energy is a bi-monthly magazine published in New Hampshire that details theories and experiments concerning alternative energy, new science and new physics. The magazine was founded by the late Eugene Mallove, and is owned by the non-profit New Energy Foundation. It was established in 1994 as Cold Fusion magazine and changed its name in March 1995. Topics of interest include "new hydrogen physics," also called cold fusion; vacuum energy, or zero point energy; and so-called "environmental energy" which they define as the attempt to violate the Second Law of Thermodynamics, for example with a perpetual motion machine. This is done in pursuit of the founder's commitment to "unearthing new sources of energy and new paradigms in science." The magazine has also published articles and book reviews that are critical of the Big Bang theory that describes the origin of the universe. The magazine has a print run of 3,000, and is available on U.S. newsstands. The issues range in size from 48 to 100 pages.
https://en.wikipedia.org/wiki/Assistive%20Technology%20Industry%20Association
The Assistive Technology Industry Association (ATIA) is a not-for-profit membership organization of manufacturers, sellers and providers of technology-based assistive devices and/or services for people with disabilities. ATIA represents the interests of its members to business, government, education, and the many agencies that serve people with disabilities. Its mission is to serve as the collective voice of the assistive technology (AT) industry, "speaking with a common voice" so that the best products and services are delivered to people with disabilities. Founded in 1998, ATIA is governed by a 10-member Board of Directors. ATIA has over 120 individual and corporate members. Its offices are located in Chicago, Illinois. Since 1999, ATIA has held annual conferences that provide forums for education and communication to professional practitioners serving those with disabilities. The annual conferences are also open to people with disabilities, their caregivers, their family members, and members of the general public. In 2009, ATIA began providing educational online webinars to serve these groups. ATIA has also held conferences in collaboration with the Job Accommodation Network and United States Business Leadership Network (USBLN). The published result of one such conference is Roadmaps for Enhancing Employment of Persons with Disabilities through Accessible Technology. Publications ATIA produces the Assistive Technology Outcomes and Benefits Journal (ATOB) in a collaborative scholarly partnership with the Special Education Assistive Technology Center (SEAT Center) at Illinois State University. ATOB publishes articles related to the outcomes and benefits of assistive technology for persons with disabilities across the lifespan. ATOB is a peer-reviewed annual publication, first published in 2004. Industry standard setting ATIA is involved in industry standard setting through its Accessibility Interoperability Alliance (AIA) division
https://en.wikipedia.org/wiki/Veiling%20glare
Veiling glare is an imperfection of performance in optical instruments (such as cameras and telescopes) arising from incoming light that strays from the normal image-forming paths, and reaches the focal plane. The effect superimposes a form of noise onto the normal image sensed by the detector (film, digital sensor, or eye viewing through an eyepiece), resulting in a final image degraded by loss of contrast and reduced definition. Scenes In scenes where a bright object is next to a faint object, veiling glare from the bright object may hide the faint object from view, even though the instrument is otherwise capable of spatially resolving the two. Veiling glare is a limiting factor in high-dynamic-range imaging. Glare in optical instruments differs from glare in vision, even though they both follow the same physical principles, because the phenomenon arises from mechanical versus physiological features. Factors and design techniques Light strays or scatters in lenses due to many potential factors in design and operation. These factors include dirt, film, or scratches on lens surfaces; reflections from lens surfaces or their mounts; and the slightly imperfect transparency (or reflection) of real glass (or mirrors). Typical optical engineering design techniques to minimize stray light include: black coatings on internal surfaces, knife edges on mounts, antireflection lens coatings, internal baffles and stops, and tube extensions which block sources outside the field of view. Veiling glare is a performance factor tested by UL standard 2802, Testing and Certification for the Performance of Video Cameras.
https://en.wikipedia.org/wiki/Vespertilionid%20gammaherpesvirus%201
Vespertilionid gammaherpesvirus 1 (VeGHV-1) is a species of virus in the genus Percavirus, subfamily Gammaherpesvirinae, family Herpesviridae, and order Herpesvirales.
https://en.wikipedia.org/wiki/List%20of%20U.S.%20state%20horses
Twelve U.S. states have designated a horse breed as the official "state horse", two have a horse breed as their "state animal", one has an official "state pony", and one has an "honorary state equine". The first state horse was designated in Vermont in 1961. The most recent state horse designations occurred in 2023, when Virginia designated the Chincoteague Pony as its state pony, and in 2022, when Oklahoma declared the American Quarter Horse as its state horse. There have been proposals to designate a state horse in Oregon as well as in Arizona (where an ongoing campaign sought to designate the Colonial Spanish Horse as the state horse prior to the state centennial in 2012), but neither proposal has yet been successful. In one state, North Dakota, the state horse is officially designated the "honorary state equine." Two additional states have not designated a specific state horse, but have designated a horse or horse breed as their official state animal: the horse in New Jersey and the Morgan horse breed in Vermont. Some breeds, such as the American Quarter Horse in Texas and the Morgan horse in Vermont and Massachusetts, were named as the state horse because of the close connection between the history of the breed and the state. Others, including the Tennessee Walking Horse and the Missouri Fox Trotter, include the state in the official breed name. School children have lobbied for the cause of some state horses, such as the Colonial Spanish Horse being named the state horse of North Carolina due to the presence of the Spanish-descended Banker horses in the Outer Banks, while others have been brought to official status through the lobbying efforts of their breed registries. Official state horses are one of many state symbols officially designated by states. Each state has its own flag and state seal, and many states also designate other symbols, including animals, plants, and foods. Such items usually are designated because of their ties to the culture or history of that part
https://en.wikipedia.org/wiki/Sharktopus
Sharktopus is a 2010 SyFy original horror/science fiction film produced by Roger Corman, directed by Declan O'Brien, and starring Eric Roberts. It is the first film in the Sharktopus franchise. Plot Geneticist Nathan Sands and his daughter Nicole are hired by the U.S. Navy to create a new weapon; they create an intelligent shark that has the tentacles of an octopus, dubbed S-11, controlling the creature using electromagnetic pulses with a device attached to its head. During one of the test missions, S-11 discards the device before traveling to Mexican waters to find food. Sands and his daughter are then assigned to catch S-11 and travel down to Mexico, where they meet up with fishermen Andy Flynn and Santos, who work for Sands to help capture S-11. Andy, Nicole and Santos track S-11 on Andy's boat as Sands and his men follow in a yacht behind them. Several sightings occur as S-11 kills tourists and locals in the area. Fisherman Pez sends a photo of the creature to a news station, and news reporter Stacy Everheart and her cameraman Bones arrive to find the creature, enlisting Pez's help in the process. Stacy researches Sands and deduces that S-11 is a biological experiment. She, Bones and Pez record it killing a few people on a beach before heading into the ocean on Pez's boat to capture more evidence. Andy and a group of his diver friends go into a cave to find S-11, although the creature attacks them, killing everybody except Andy. He, Nicole and Santos then encounter Stacy, Bones and Pez, although Pez is killed by S-11 before it runs off. Andy, Nicole and Santos pursue it, resulting in Santos being killed. An enraged Andy radios Sands and tells him he's going to kill S-11 despite his orders. Andy and Nicole track S-11 to the mainland, where it kills a few people before Sands and his men arrive and hold Andy hostage. Nicole reprimands her father for wishing to further the experiments, although S-11 arrives and kills Sands's men before Sands himself is killed savi
https://en.wikipedia.org/wiki/Effective%20descriptive%20set%20theory
Effective descriptive set theory is the branch of descriptive set theory dealing with sets of reals having lightface definitions; that is, definitions that do not require an arbitrary real parameter (Moschovakis 1980). Thus effective descriptive set theory combines descriptive set theory with recursion theory. Constructions Effective Polish space An effective Polish space is a complete separable metric space that has a computable presentation. Such spaces are studied in both effective descriptive set theory and in constructive analysis. In particular, standard examples of Polish spaces such as the real line, the Cantor set and the Baire space are all effective Polish spaces. Arithmetical hierarchy The arithmetical hierarchy, arithmetic hierarchy or Kleene–Mostowski hierarchy classifies certain sets based on the complexity of formulas that define them. Any set that receives a classification is called "arithmetical". More formally, the arithmetical hierarchy assigns classifications to the formulas in the language of first-order arithmetic. The classifications are denoted Σ⁰ₙ and Π⁰ₙ for natural numbers n (including 0). The Greek letters here are lightface symbols, which indicates that the formulas do not contain set parameters. If a formula φ is logically equivalent to a formula with only bounded quantifiers, then φ is assigned the classifications Σ⁰₀ and Π⁰₀. The classifications Σ⁰ₙ and Π⁰ₙ are defined inductively for every natural number n using the following rules: If φ is logically equivalent to a formula of the form ∃m₁ ∃m₂ … ∃mₖ ψ, where ψ is Π⁰ₙ, then φ is assigned the classification Σ⁰ₙ₊₁. If φ is logically equivalent to a formula of the form ∀m₁ ∀m₂ … ∀mₖ ψ, where ψ is Σ⁰ₙ, then φ is assigned the classification Π⁰ₙ₊₁.
https://en.wikipedia.org/wiki/Little%27s%20law
In mathematical queueing theory, Little's law (also result, theorem, lemma, or formula) is a theorem by John Little which states that the long-term average number L of customers in a stationary system is equal to the long-term average effective arrival rate λ multiplied by the average time W that a customer spends in the system. Expressed algebraically, the law is L = λW. The relationship is not influenced by the arrival process distribution, the service distribution, the service order, or practically anything else. In most queuing systems, service time is the bottleneck that creates the queue. The result applies to any system, and particularly, it applies to systems within systems. For example, in a bank branch, the customer line might be one subsystem, and each of the tellers another subsystem, and Little's result could be applied to each one, as well as to the whole thing. The only requirements are that the system be stable and non-preemptive; this rules out transition states such as initial startup or shutdown. In some cases it is possible not only to mathematically relate the average number in the system to the average wait but even to relate the entire probability distribution (and moments) of the number in the system to the wait. History In a 1954 paper, Little's law was assumed true and used without proof. The form L = λW was first published by Philip M. Morse, who challenged readers to find a situation where the relationship did not hold. Little published his proof of the law in 1961, showing that no such situation existed. Little's proof was followed by a simpler version by Jewell and another by Eilon. Shaler Stidham published a different and more intuitive proof in 1972. Examples Finding response time Imagine an application that had no easy way to measure response time. If the mean number in the system and the throughput are known, the average response time can be found using Little's law: mean response time = mean number in system / mean throughput.
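The law and the response-time example above amount to one multiplication or one division. A short sketch with illustrative numbers (the arrival rate and times are invented for the example, not taken from the text):

```python
# Little's law: L = lambda * W. Given any two of the three
# quantities, the third follows. Numbers are illustrative.
arrival_rate = 10.0        # lambda: customers arriving per minute
avg_time_in_system = 3.0   # W: minutes each customer spends in the system
L = arrival_rate * avg_time_in_system
print(L)                   # 30.0 customers in the system on average

# The "finding response time" example rearranges the law: W = L / lambda.
mean_in_system = 30.0      # L, e.g. measured by sampling queue length
throughput = 10.0          # equals the arrival rate when the system is stable
print(mean_in_system / throughput)  # 3.0 — mean response time in minutes
```

Note that stability is what licenses using measured throughput in place of the arrival rate: in a stable system, over the long run, customers leave at the same average rate at which they arrive.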
https://en.wikipedia.org/wiki/Euler%27s%20theorem%20in%20geometry
In geometry, Euler's theorem states that the distance d between the circumcenter and incenter of a triangle is given by d² = R(R − 2r), or equivalently 1/(R − d) + 1/(R + d) = 1/r, where R and r denote the circumradius and inradius respectively (the radii of the circumscribed circle and inscribed circle respectively). The theorem is named for Leonhard Euler, who published it in 1765. However, the same result was published earlier by William Chapple in 1746. From the theorem follows the Euler inequality: R ≥ 2r, which holds with equality only in the equilateral case. Stronger version of the inequality A stronger version is R/r ≥ (a³ + b³ + c³ + abc) / (2abc), where a, b, and c are the side lengths of the triangle. Euler's theorem for the escribed circle If rₐ and dₐ denote respectively the radius of the escribed circle opposite to the vertex A and the distance between its center and the center of the circumscribed circle, then dₐ² = R(R + 2rₐ). Euler's inequality in absolute geometry Euler's inequality, in the form stating that, for all triangles inscribed in a given circle, the maximum of the radius of the inscribed circle is reached for the equilateral triangle and only for it, is valid in absolute geometry. See also Fuss' theorem for the relation among the same three variables in bicentric quadrilaterals Poncelet's closure theorem, showing that there is an infinity of triangles with the same two circles (and therefore the same R, r, and d) List of triangle inequalities
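The identity d² = R(R − 2r) can be checked numerically by computing both centers from coordinates. The sketch below uses an arbitrary 5-6-7 triangle (the side lengths are chosen for illustration only):

```python
import math

# Numeric check of Euler's theorem d^2 = R(R - 2r) on a 5-6-7 triangle.
a, b, c = 5.0, 6.0, 7.0
s = (a + b + c) / 2                          # semiperimeter
area = math.sqrt(s * (s-a) * (s-b) * (s-c))  # Heron's formula
R = a * b * c / (4 * area)                   # circumradius
r = area / s                                 # inradius

# Place A at the origin and B on the x-axis; side a is opposite A
# (so a = |BC|, b = |CA|, c = |AB|).
Ax, Ay = 0.0, 0.0
Bx, By = c, 0.0
Cx = (b*b + c*c - a*a) / (2*c)               # from the law of cosines
Cy = math.sqrt(b*b - Cx*Cx)

# Incenter: vertices averaged with weights equal to the opposite sides.
Ix = (a*Ax + b*Bx + c*Cx) / (a + b + c)
Iy = (a*Ay + b*By + c*Cy) / (a + b + c)

# Circumcenter: equidistant from all three vertices.
Ox = c / 2
Oy = (Cx*Cx + Cy*Cy - c*Cx) / (2*Cy)

d = math.hypot(Ox - Ix, Oy - Iy)
print(abs(d*d - R*(R - 2*r)) < 1e-9)  # True: the identity holds
print(R >= 2*r)                       # True: Euler's inequality
```

For this triangle R ≈ 3.572, r ≈ 1.633 and d ≈ 1.046, so R(R − 2r) ≈ 1.094 ≈ d², as the theorem requires.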
https://en.wikipedia.org/wiki/Sazonov%27s%20theorem
In mathematics, Sazonov's theorem, named after Vyacheslav Vasilievich Sazonov, is a theorem in functional analysis. It states that a bounded linear operator between two Hilbert spaces is γ-radonifying if it is a Hilbert–Schmidt operator. The result is also important in the study of stochastic processes and the Malliavin calculus, since results concerning probability measures on infinite-dimensional spaces are of central importance in these fields. Sazonov's theorem also has a converse: if the map is not Hilbert–Schmidt, then it is not γ-radonifying. Statement of the theorem Let G and H be two Hilbert spaces and let T : G → H be a bounded operator from G to H. Recall that T is said to be γ-radonifying if the push forward of the canonical Gaussian cylinder set measure on G is a bona fide measure on H. Recall also that T is said to be a Hilbert–Schmidt operator if there is an orthonormal basis {eᵢ : i ∈ I} of G such that Σᵢ ‖T eᵢ‖² < +∞. Then Sazonov's theorem is that T is γ-radonifying if it is a Hilbert–Schmidt operator. The proof uses Prokhorov's theorem. Remarks The canonical Gaussian cylinder set measure on an infinite-dimensional Hilbert space can never be a bona fide measure; equivalently, the identity function on such a space cannot be γ-radonifying.
https://en.wikipedia.org/wiki/Land%20cover
Land cover is the physical material at the surface of Earth. Land covers include grass, asphalt, trees, bare ground, water, etc. "Earth cover" is the expression used by ecologist Frederick Edward Clements; its closest modern equivalent is vegetation. The expression continues to be used by the United States Bureau of Land Management. There are two primary methods for capturing information on land cover: field survey, and analysis of remotely sensed imagery. Land change models can be built from these types of data to assess changes in land cover over time. One of the major land cover issues (as with all natural resource inventories) is that every survey defines similarly named categories in different ways. For instance, there are many definitions of "forest"—sometimes within the same organisation—that may or may not incorporate a number of different forest features (e.g., stand height, canopy cover, strip width, inclusion of grasses, and rates of growth for timber production). Areas without trees may be classified as forest cover "if the intention is to re-plant" (UK and Ireland), while areas with many trees may not be labelled as forest "if the trees are not growing fast enough" (Norway and Finland). Distinction from "land use" "Land cover" is distinct from "land use", despite the two terms often being used interchangeably. Land use is a description of how people utilize the land and of socio-economic activity. Urban and agricultural land uses are two of the most commonly known land use classes. At any one point or place, there may be multiple and alternate land uses, the specification of which may have a political dimension. The origins of the "land cover/land use" couplet and the implications of their confusion are discussed in Fisher et al. (2005). Types The following table shows land cover statistics from the Food and Agriculture Organization (FAO), using 14 classes. Mapping Land cover change detection using remote sensing and geospatial data provides baselin
https://en.wikipedia.org/wiki/Variance%20reduction
In mathematics, more specifically in the theory of Monte Carlo methods, variance reduction is a procedure used to increase the precision of the estimates obtained for a given simulation or computational effort. Every output random variable from the simulation is associated with a variance which limits the precision of the simulation results. In order to make a simulation statistically efficient, i.e., to obtain a greater precision and smaller confidence intervals for the output random variable of interest, variance reduction techniques can be used. The main variance reduction methods are: common random numbers, antithetic variates, control variates, importance sampling, stratified sampling, moment matching, conditional Monte Carlo, and quasi-random variables (in the quasi-Monte Carlo method). For simulation with black-box models, subset simulation and line sampling can also be used. Under these headings are a variety of specialized techniques; for example, particle transport simulations make extensive use of "weight windows" and "splitting/Russian roulette" techniques, which are a form of importance sampling. Crude Monte Carlo simulation Suppose one wants to compute μ = E[f(X)], with the random variable X defined on the probability space (Ω, ℱ, P). Monte Carlo does this by sampling n i.i.d. copies X₁, …, Xₙ of X and then estimating μ via the sample-mean estimator μ̂ₙ = (1/n) Σᵢ f(Xᵢ). Under further mild conditions such as Var(f(X)) = σ² < ∞, a central limit theorem will apply such that for large n, the distribution of μ̂ₙ converges to a normal distribution with mean μ and standard error σ/√n. Because the standard error only converges towards 0 at the rate 1/√n, implying one needs to increase the number of simulations (n) by a factor of 4 to halve the standard error of μ̂ₙ, variance reduction methods are often useful for obtaining more precise estimates for μ without needing very large numbers of simulations. Common Random Numbers (CRN) The common random numbers variance reduction technique is a popular and useful variance reduction technique which
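One technique from the list above, antithetic variates, can be sketched against the crude estimator in a few lines. The example estimates E[exp(U)] for U uniform on (0, 1), whose true value is e − 1 ≈ 1.71828; the target function and sample size are chosen for illustration. Pairing each draw U with its antithesis 1 − U makes the two evaluations of the monotone integrand negatively correlated, which lowers the variance of their average:

```python
import math
import random

random.seed(42)
n = 100_000  # total number of function evaluations in each estimator

# Crude Monte Carlo: sample-mean over n i.i.d. draws.
crude = sum(math.exp(random.random()) for _ in range(n)) / n

# Antithetic variates: n/2 pairs (U, 1 - U) — same total number of
# evaluations, but the paired evaluations are negatively correlated.
acc = 0.0
for _ in range(n // 2):
    u = random.random()
    acc += (math.exp(u) + math.exp(1 - u)) / 2
antithetic = acc / (n // 2)

print(round(crude, 3), round(antithetic, 3))  # both near 1.718
```

For this integrand the antithetic estimator's standard error is far below the crude estimator's σ/√n, so it typically lands much closer to e − 1 for the same computational effort.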
https://en.wikipedia.org/wiki/International%20Journal%20of%20Food%20Sciences%20and%20Nutrition
The International Journal of Food Sciences and Nutrition is a peer-reviewed scientific journal that covers food science and nutrition. It is published by Taylor & Francis. The editor-in-chief is Daniele Del Rio (University of Parma). Abstracting and indexing The journal is abstracted and indexed in BIOSIS Previews, Chemical Abstracts, Current Contents/Agriculture, Biology & Environmental Sciences, EMBASE/Excerpta Medica, Food Science & Technology Abstracts, Index Medicus/MEDLINE/PubMed, PASCAL, Scopus, and Science Citation Index Expanded. According to the Journal Citation Reports, the journal has a 2019 impact factor of 3.483.
https://en.wikipedia.org/wiki/Caspio
Caspio is an American software company headquartered in Sunnyvale, California, with offices in Ukraine, Poland and the Philippines. Caspio was founded by Frank Zamani in 2000. The company focuses on database-centric web applications. History Caspio was founded by Frank Zamani in 2000. The company initially focused on simplifying custom cloud applications and reducing development time and cost as compared to traditional software development. Caspio released the first version of its platform, Caspio Bridge, in 2001. In 2014, Caspio released a HIPAA-Compliant Edition of its low-code application development platform. Caspio also released an EU General Data Protection Regulation (GDPR) Compliance Edition of its low-code application development platform in 2016. Caspio's second European Software Development Center opened in Kraków, Poland in 2017. Caspio also opened data centers in Montreal, Canada and India in 2020.
https://en.wikipedia.org/wiki/Doppler%20cooling
Doppler cooling is a mechanism that can be used to trap and slow the motion of atoms to cool a substance. The term is sometimes used synonymously with laser cooling, though laser cooling includes other techniques. History Doppler cooling was simultaneously proposed by two groups in 1975, the first being David J. Wineland and Hans Georg Dehmelt and the second being Theodor W. Hänsch and Arthur Leonard Schawlow. It was first demonstrated by Wineland, Drullinger, and Walls in 1978 and shortly afterwards by Neuhauser, Hohenstatt, Toschek and Dehmelt. One conceptually simple form of Doppler cooling is referred to as optical molasses, since the dissipative optical force resembles the viscous drag on a body moving through molasses. Steven Chu, Claude Cohen-Tannoudji and William D. Phillips were awarded the 1997 Nobel Prize in Physics for their work in laser cooling and atom trapping. Brief explanation Doppler cooling involves light with frequency tuned slightly below an electronic transition in an atom. Because the light is detuned to the "red" (i.e. at lower frequency) of the transition, the atoms will absorb more photons if they move towards the light source, due to the Doppler effect. Consider the simplest case of 1D motion on the x axis. Let the photon be traveling in the +x direction and the atom in the −x direction. In each absorption event, the atom loses a momentum equal to the momentum of the photon. The atom, which is now in the excited state, emits a photon spontaneously but randomly along +x or −x. Momentum is returned to the atom. If the photon was emitted along +x then there is no net change; however, if the photon was emitted along −x, then the atom is moving more slowly in either −x or +x. The net result of the absorption and emission process is a reduced speed of the atom, on the condition that its initial speed is larger than the recoil velocity from scattering a single photon. If the absorption and emission are repeated many times, the mean veloc
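The absorption/emission cycle described above can be caricatured in one dimension. The sketch below is a deliberately crude toy, not a physical simulation: units are arbitrary, the "kick" stands in for the single-photon recoil velocity, and the detuning condition is reduced to "only an atom moving towards the +x beam absorbs":

```python
import random

# Toy 1-D sketch of the Doppler cooling cycle. The beam travels
# along +x; in this simplified picture only an atom moving in -x
# (towards the source, Doppler-shifted into resonance) absorbs.
random.seed(1)
kick = 0.01   # recoil velocity from one photon (arbitrary units)
v = -1.0      # atom initially moving towards the light source

for _ in range(200):
    if v < 0:
        v += kick                           # absorption: kick along +x
        v += random.choice((-kick, kick))   # spontaneous emission: random direction
print(round(v, 2))  # ≈ 0 — the atom has been slowed, not reversed
```

The absorption kicks always oppose the motion while the emission kicks average to zero, so the net effect over many cycles is the damping ("optical molasses") force described in the text; the toy also shows why cooling stalls once the speed falls to the scale of a single recoil.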
https://en.wikipedia.org/wiki/Septin
Septins are a group of GTP-binding proteins expressed in all eukaryotic cells except plants. Different septins form protein complexes with each other. These complexes can further assemble into filaments, rings and gauzes. Assembled as such, septins function in cells by localizing other proteins, either by providing a scaffold to which proteins can attach, or by forming a barrier preventing the diffusion of molecules from one compartment of the cell to another, or in the cell cortex as a barrier to the diffusion of membrane-bound proteins. Septins have been implicated in the localization of cellular processes at the site of cell division, and at the cell membrane at sites where specialized structures like cilia or flagella are attached to the cell body. In yeast cells, they compartmentalize parts of the cell and build scaffolding to provide structural support during cell division at the septum, from which they derive their name. Research in human cells suggests that septins build cages around pathogenic bacteria, immobilizing them and preventing them from invading other cells. As filament-forming proteins, septins can be considered part of the cytoskeleton. Apart from forming non-polar filaments, septins associate with cell membranes, the cell cortex, actin filaments and microtubules. Structure Septins are P-loop NTPase proteins that range in weight from 30 to 65 kDa. Septins are highly conserved between different eukaryotic species. They are composed of a variable-length proline-rich N-terminus with a basic phosphoinositide-binding motif important for membrane association, a GTP-binding domain, a highly conserved Septin Unique Element domain, and a C-terminal extension including a coiled-coil domain of varying length. Septins interact either via their respective GTP-binding domains, or via both their N- and C-termini. Different organisms express a different number of septins, and from those symmetric oligomers are formed. For example, in yeast the octameric complex fo
https://en.wikipedia.org/wiki/Tuxedo%20Computers
Tuxedo Computers GmbH (proper spelling: TUXEDO Computers) is a computer manufacturer based in Augsburg, Germany. The company specializes in desktop computers and notebooks with a pre-installed Linux operating system. The devices are manufactured in Leipzig, Germany. Tuxedo Computers equips its devices with Tuxedo OS, its own Linux distribution based on Ubuntu, or installs a selection of other distributions, as well as Microsoft Windows as an operating system in parallel with the Linux system or in a virtual machine. History Tuxedo Computers was founded on 1 February 2004 by current managing director Herbert Feiler in Bayreuth. In 2013, the company moved to Königsbrunn. In 2019, it moved again to its current headquarters in Augsburg. The name derives from the Linux mascot Tux, whose plumage resembles a tuxedo. The company emerged from an online store that specialized in the distribution of promotional items related to Linux and open-source software, as well as software boxes with Linux distributions. Because of better Linux compatibility, Tuxedo Computers originally carried only desktop computers, as notebooks often required special adaptations. Notebooks and small-form-factor desktop computers have since been added to the range. The names of the devices borrow from stars and planets, space travel, and science and technology. Tuxedo OS With Tuxedo OS (proper spelling: TUXEDO OS), Tuxedo Computers develops its own Linux distribution based on Ubuntu. The first version was released on September 29, 2022. Compared to Ubuntu, Tuxedo OS omits the Snap package management system initiated by Canonical and adds the latest Linux kernel and the latest version of KDE Plasma. In addition, Tuxedo OS uses its own software repositories, operated by hosting providers located in Germany, and refrains from phoning home. Tuxedo OS can be freely downloaded from the project page in the form of an ISO disk image file. Complementing Tuxedo OS, the company is working on tools to control ha
https://en.wikipedia.org/wiki/Subharmonic%20function
In mathematics, subharmonic and superharmonic functions are important classes of functions used extensively in partial differential equations, complex analysis and potential theory. Intuitively, subharmonic functions are related to convex functions of one variable as follows. If the graph of a convex function and a line intersect at two points, then the graph of the convex function is below the line between those points. In the same way, if the values of a subharmonic function are no larger than the values of a harmonic function on the boundary of a ball, then the values of the subharmonic function are no larger than the values of the harmonic function also inside the ball. Superharmonic functions can be defined by the same description, only replacing "no larger" with "no smaller". Alternatively, a superharmonic function is just the negative of a subharmonic function, and for this reason any property of subharmonic functions can be easily transferred to superharmonic functions. Formal definition Formally, the definition can be stated as follows. Let G be a subset of the Euclidean space ℝⁿ and let φ: G → ℝ ∪ {−∞} be an upper semi-continuous function. Then, φ is called subharmonic if for any closed ball B(x, r) of center x and radius r contained in G and every real-valued continuous function h on B(x, r) that is harmonic in the interior of B(x, r) and satisfies φ(y) ≤ h(y) for all y on the boundary ∂B(x, r), we have φ(y) ≤ h(y) for all y in B(x, r). Note that by the above, the function which is identically −∞ is subharmonic, but some authors exclude this function by definition. A function u is called superharmonic if −u is subharmonic. Properties A function is harmonic if and only if it is both subharmonic and superharmonic. If φ is C² (twice continuously differentiable) on an open set G in ℝⁿ, then φ is subharmonic if and only if one has Δφ ≥ 0 on G, where Δ is the Laplacian. The maximum of a subharmonic function cannot be achieved in the interior of its domain unless the function is constant, which is called the maximum principle. However, the minimum of a subharmonic f
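The ball-comparison definition and the C² criterion can be restated compactly in standard notation; this is a conventional restatement, with φ denoting the function and G the domain, not a formula specific to this article's sources:

```latex
% Comparison definition: \varphi is subharmonic on G if, for every closed
% ball \overline{B}(x,r) \subset G and every h continuous on \overline{B}(x,r)
% and harmonic on its interior,
\varphi \le h \ \text{on}\ \partial B(x,r)
\;\Longrightarrow\;
\varphi \le h \ \text{on}\ B(x,r).

% Equivalent differential criterion for \varphi \in C^2(G):
\varphi \ \text{is subharmonic on}\ G
\iff
\Delta \varphi \ge 0 \ \text{on}\ G,
\qquad
\Delta = \sum_{i=1}^{n} \frac{\partial^2}{\partial x_i^2}.
```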
https://en.wikipedia.org/wiki/0music
0music is the second album produced with Melomics technology. While the first one (Iamus' album) is a compilation of contemporary pieces fully composed by Iamus, 0music compiles pieces of popular genres, composed and interpreted without any human intervention by Melomics109, a computer cluster hosted at the University of Malaga. The pieces in this album, and all the production of Melomics109, are distributed under CC0 licensing, and are available in audible and editable (MIDI) formats. The album was launched during a one-day symposium held in Malaga on July 21, 2014. Track listing See also 1 the Road External links Melomics Playlist 0music in YouTube
https://en.wikipedia.org/wiki/Century%20egg
Century eggs (), also known under a wide variety of names (see infobox), are a Chinese egg-based culinary dish made by preserving duck, chicken, or quail eggs in a mixture of clay, ash, salt, quicklime, and rice hulls for several weeks to several months, depending on the processing method. Through the process, the yolk becomes a dark green to grey color, with a creamy consistency and strong flavor due to the hydrogen sulfide and ammonia present, while the white becomes a dark brown, translucent jelly with a salty flavor. The transforming agent in the century egg is an alkaline salt, which gradually raises the pH of the egg to around 9–12 during the curing process. This chemical process breaks down some of the complex, flavorless proteins and fats, producing a variety of smaller, flavorsome compounds. Some eggs have patterns near the surface of the egg white which are likened to pine branches. These patterned eggs are regarded as being of better quality than normal century eggs and are called Songhua eggs (Chinese: ), variously translated as pine flower eggs or pine-patterned eggs. History The method for creating century eggs likely came about through the need to preserve eggs in times of plenty by coating them in alkaline clay, which is similar to methods of egg preservation in some Western cultures. The clay hardens around the egg and results in the curing and creation of century eggs instead of spoiled eggs. The century egg has at least four centuries of history behind its production. Its discovery, though not verifiable, was said to have occurred around 600 years ago in Hunan during the Ming Dynasty, when a homeowner discovered duck eggs in a shallow pool of slaked lime that had been used for mortar during construction of his home two months earlier. Upon tasting the eggs, he set out to produce more – this time with the addition of salt to improve their flavor – resulting in the present recipe of the century egg. An alternate story involves a young duck farm
https://en.wikipedia.org/wiki/Cusp%20form
In number theory, a branch of mathematics, a cusp form is a particular kind of modular form with a zero constant coefficient in the Fourier series expansion. Introduction A cusp form is distinguished in the case of modular forms for the modular group by the vanishing of the constant coefficient a0 in the Fourier series expansion (see q-expansion) f(z) = a0 + a1q + a2q² + ..., where q = e^(2πiz). This Fourier expansion exists as a consequence of the presence, in the modular group's action on the upper half-plane, of the transformation z ↦ z + 1. For other groups, there may be some translation through several units, in which case the Fourier expansion is in terms of a different parameter. In all cases, though, the limit as q → 0 is the limit in the upper half-plane as the imaginary part of z → ∞. Taking the quotient by the modular group, this limit corresponds to a cusp of a modular curve (in the sense of a point added for compactification). So, the definition amounts to saying that a cusp form is a modular form that vanishes at a cusp. In the case of other groups, there may be several cusps, and the definition becomes a modular form vanishing at all cusps. This may involve several expansions. Dimension The dimensions of spaces of cusp forms are, in principle, computable via the Riemann–Roch theorem. For example, the Ramanujan tau function τ(n) arises as the sequence of Fourier coefficients of the cusp form of weight 12 for the modular group, with a1 = 1. The space of such forms has dimension 1, which means this definition is possible; and that accounts for the action of Hecke operators on the space being by scalar multiplication (Mordell's proof of Ramanujan's identities). Explicitly it is the modular discriminant Δ(z), which represents (up to a normalizing constant) the discriminant of the cubic on the right side of the Weierstrass equation of an elliptic curve, and the 24th power of the Dedekind eta function. The Fourier coefficients here are written τ(n) and called "Ramanujan's tau function", with the normalization τ(1) = 1.
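In conventional notation, the objects described above read as follows; this is a standard restatement, not taken from this article's own formulas:

```latex
% q-expansion of a modular form f for the modular group, with q = e^{2\pi i z}:
f(z) = \sum_{n=0}^{\infty} a_n q^n ,
\qquad
\text{$f$ is a cusp form} \iff a_0 = 0 .

% The weight-12 cusp form (the modular discriminant), normalized so \tau(1)=1,
% equal to the 24th power of the Dedekind eta function in this normalization:
\Delta(z) = q \prod_{n=1}^{\infty} (1 - q^n)^{24}
          = \sum_{n=1}^{\infty} \tau(n)\, q^n
          = \eta(z)^{24}.
```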
https://en.wikipedia.org/wiki/92nd%20meridian%20west
The meridian 92° west of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, North America, the Gulf of Mexico, Central America, the Pacific Ocean, the Southern Ocean, and Antarctica to the South Pole. The 92nd meridian west forms a great circle with the 88th meridian east. From Pole to Pole Starting at the North Pole and heading south to the South Pole, the 92nd meridian west passes through: {| class="wikitable plainrowheaders" ! scope="col" width="120" | Co-ordinates ! scope="col" | Country, territory or sea ! scope="col" | Notes |-valign="top" | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Arctic Ocean | style="background:#b0e0e6;" | Passing just west of Ellesmere Island, Nunavut (at ) Passing just west of Krueger Island, Nunavut (at ) |- | ! scope="row" | | Nunavut — Fjeldholmen Island |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Nansen Sound | style="background:#b0e0e6;" | |- | ! scope="row" | | Nunavut — Axel Heiberg Island |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Norwegian Bay | style="background:#b0e0e6;" | |- | ! scope="row" | | Nunavut — Devon Island |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Parry Channel | style="background:#b0e0e6;" | Barrow Strait |- | ! scope="row" | | Nunavut — Somerset Island |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Prince Regent Inlet | style="background:#b0e0e6;" | |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Gulf of Boothia | style="background:#b0e0e6;" | |- | ! scope="row" | | Nunavut — Boothia Peninsula (mainland) |- | style="background:#b0e0e6;" | ! scope="row" style="background:#b0e0e6;" | Gulf of Boothia | style="background:#b0e0e6;" | Lord Mayor Bay |- | ! scope="row" | | Nunavut — mainland |- | style="background:#b0e0e6;" | ! scope="row" style="backgrou
https://en.wikipedia.org/wiki/Abstract%20family%20of%20languages
In computer science, in particular in the field of formal language theory, an abstract family of languages is an abstract mathematical notion generalizing characteristics common to the regular languages, the context-free languages and the recursively enumerable languages, and other families of formal languages studied in the scientific literature. Formal definitions A formal language is a set L for which there exists a finite set of abstract symbols Σ such that L ⊆ Σ*, where * is the Kleene star operation. A family of languages is an ordered pair (Σ, Λ), where Σ is an infinite set of symbols; Λ is a set of formal languages; for each L in Λ there exists a finite subset Σ₁ ⊆ Σ such that L ⊆ Σ₁*; and L ≠ ∅ for some L in Λ. A trio is a family of languages closed under homomorphisms that do not introduce the empty word, inverse homomorphisms, and intersections with a regular language. A full trio, also called a cone, is a trio closed under arbitrary homomorphisms. A (full) semi-AFL is a (full) trio closed under union. A (full) AFL is a (full) semi-AFL closed under concatenation and the Kleene plus. Some families of languages The following are some simple results from the study of abstract families of languages. Within the Chomsky hierarchy, the regular languages, the context-free languages, and the recursively enumerable languages are all full AFLs. However, the context-sensitive languages and the recursive languages are AFLs, but not full AFLs, because they are not closed under arbitrary homomorphisms. The family of regular languages is contained within any cone (full trio). Other categories of abstract families are identifiable by closure under other operations such as shuffle, reversal, or substitution. Origins Seymour Ginsburg of the University of Southern California and Sheila Greibach of Harvard University presented the first AFL theory paper at the IEEE Eighth Annual Symposium on Switching and Automata Theory in 1967. Notes
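As an illustrative sketch (not from the article), the three trio operations can be made concrete for finite languages represented as Python sets of strings. The helper names and the example homomorphism are invented for illustration; the inverse image is restricted to a finite word length, since a true inverse image may be infinite.

```python
from itertools import product

def h_image(lang, h):
    """Homomorphic image: apply h letter-by-letter to every word."""
    return {"".join(h[c] for c in w) for w in lang}

def h_inverse(lang, h, sigma, max_len):
    """Inverse homomorphic image, truncated to words over sigma of
    length at most max_len (the true inverse image may be infinite)."""
    result = set()
    for n in range(max_len + 1):
        for letters in product(sigma, repeat=n):
            w = "".join(letters)
            if "".join(h[c] for c in w) in lang:
                result.add(w)
    return result

def intersect_regular(lang, is_in_regular):
    """Intersection with a regular language, given as a membership test."""
    return {w for w in lang if is_in_regular(w)}

# Example: h maps a -> 01, b -> 1 (non-erasing, so it never introduces
# the empty word).
h = {"a": "01", "b": "1"}
L = {"ab", "ba"}
print(h_image(L, h))                                      # {'011', '101'}
print(h_inverse({"011"}, h, "ab", 3))                     # {'ab'}
print(intersect_regular(L, lambda w: w.startswith("a")))  # {'ab'}
```

The membership-test encoding of the regular language is a simplification; a faithful treatment would intersect with an automaton-defined language.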
https://en.wikipedia.org/wiki/The%20Genetic%20Basis%20of%20Evolutionary%20Change
The Genetic Basis of Evolutionary Change is a book by Richard Lewontin about evolutionary genetics. Originally published by Columbia University Press in 1974, the book originated in a series of lectures, known as the "Jesup lectures", that Lewontin gave at Columbia University in 1969. In a blurb promoting the book, Columbia University Press claimed that it "will surely become one of the landmarks in twentieth-century science", for which they were criticized by some of the book's reviewers. James F. Crow, for example, argued that the Columbia employees who chose to describe the book in this way "should have their wrists slapped", adding, "...this is not the Origin of Species. It is just a thoughtful, readable, and very stimulating book." Reviews In his review of The Genetic Basis of Evolutionary Change, Donald J. Nash described it as "...a fine addition to the series of volumes on evolutionary biology published by Columbia University Press." Joseph Felsenstein also reviewed the book favorably, describing it as "...the book we always knew Dick Lewontin could write" and "...a brilliant comprehensive introductory review of the controversy over the evolutionary significance of protein polymorphisms." In his review of the book, James F. Crow described it as "a fine book", praising Lewontin for his "gift for seeing problems clearly, for marshalling the relevant evidence, and for presenting all this in an interesting way." Crow concluded that "[i]n the areas in which there has been the greatest controversy, Lewontin has presented the scientific issues fairly and objectively." However, Crow also criticized Lewontin for inserting "irrelevant social and political statements" into several parts of the book. In his review of the book for Science, Bryan Clarke described it as "...a remarkable book", adding that "[i]t will, no doubt, be necessary fare for future generations of undergraduates, and it will certainly benefit their intellectual nutrition." Influence The Genetic Basi
https://en.wikipedia.org/wiki/Respiratory%20inductance%20plethysmography
Respiratory inductance plethysmography (RIP) is a method of evaluating pulmonary ventilation by measuring the movement of the chest and abdominal wall. Accurate measurement of pulmonary ventilation or breathing often requires the use of devices such as masks or mouthpieces coupled to the airway opening. These devices are often both encumbering and invasive, and thus ill-suited for continuous or ambulatory measurements. As an alternative, RIP devices that sense respiratory excursions at the body surface can be used to measure pulmonary ventilation. According to a paper by Konno and Mead, "the chest can be looked upon as a system of two compartments with only one degree of freedom each". Therefore, any volume change of the abdomen must be equal and opposite to that of the rib cage. The paper suggests that the volume change is close to being linearly related to changes in antero-posterior (front to back of body) diameter. When a known air volume is inhaled and measured with a spirometer, a volume-motion relationship can be established as the sum of the abdominal and rib cage displacements. Therefore, according to this theory, only changes in the antero-posterior diameters of the abdomen and the rib cage are needed to estimate changes in lung volume. Several sensor methodologies based on this theory have been developed. RIP is the most frequently used, established and accurate plethysmography method for estimating lung volume from respiratory movements. RIP has been used in many clinical and academic research studies in a variety of domains including polysomnography (sleep), psychophysiology, psychiatric research, anxiety and stress research, anesthesia, cardiology and pulmonary research (asthma, COPD, dyspnea). Technology A respiratory inductance plethysmograph consists of two sinusoid wire coils insulated and placed within two 2.5 cm (about 1 inch) wide, lightweight elastic and adhesive bands. The transducer bands are placed around the rib cage under the armpits an
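The volume-motion relationship described above can be sketched numerically: given hypothetical rib-cage (RC) and abdomen (AB) band excursions and spirometer volumes from three calibration breaths, the coefficients of V ≈ a·RC + b·AB + c follow from solving a small linear system. All numbers and variable names here are illustrative assumptions, not data from any study.

```python
# Illustrative sketch: calibrating V ≈ a*RC + b*AB + c from three breaths
# measured simultaneously with a spirometer. All values are made up.

def solve3(m, v):
    """Solve a 3x3 linear system m @ x = v by Gauss-Jordan elimination."""
    a = [row[:] + [rhs] for row, rhs in zip(m, v)]
    for i in range(3):
        # partial pivoting: bring the row with the largest entry to position i
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        for r in range(3):
            if r != i:
                f = a[r][i] / a[i][i]
                a[r] = [x - f * y for x, y in zip(a[r], a[i])]
    return [a[i][3] / a[i][i] for i in range(3)]

# Hypothetical band excursions (arbitrary units) and spirometer volumes (mL)
# for three calibration breaths; volumes generated from a=400, b=500, c=0.
RC = [1.0, 2.0, 1.5]
AB = [0.5, 1.0, 2.0]
V = [650.0, 1300.0, 1600.0]

rows = [[rc, ab, 1.0] for rc, ab in zip(RC, AB)]
a, b, c = solve3(rows, V)
print(round(a), round(b), round(c))  # 400 500 0
```

In practice the fit would use many breaths and least squares rather than an exactly determined system, but the calibration idea is the same.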
https://en.wikipedia.org/wiki/Microwave%20Imaging%20Radiometer%20with%20Aperture%20Synthesis
Microwave Imaging Radiometer with Aperture Synthesis (MIRAS) is the major instrument on the Soil Moisture and Ocean Salinity satellite (SMOS). MIRAS employs a planar antenna composed of a central body (the so-called hub) and three telescoping, deployable arms, carrying 69 receivers in total on the unit. Each receiver is composed of one Lightweight Cost-Effective Front-end (LICEF) module, which detects radiation in the microwave L-band in both horizontal and vertical polarizations. The apertures of the LICEF detectors, planar in arrangement on MIRAS, point directly toward the Earth's surface as the satellite orbits. The arrangement and orientation of MIRAS make the instrument a 2-D interferometric radiometer that generates brightness temperature images, from which both geophysical variables (soil moisture and ocean salinity) are computed. The salinity measurement requires demanding performance of the instrument in terms of calibration and stability. The MIRAS instrument's prime contractor was EADS CASA Espacio, which manufactured the payload of SMOS under contract to ESA. LICEF The LICEF detector is composed of a round patch antenna element, with two pairs of probes for orthogonal linear polarisations, feeding two receiver channels in a compact lightweight package behind the antenna. It picks up thermal radiation emitted by the Earth near 1.4 GHz in the microwave L-band, amplifies it by 100 dB, and digitises it with 1-bit quantisation.
https://en.wikipedia.org/wiki/Yuri%20Prokhorov
Yuri Vasilyevich Prokhorov (; 15 December 1929 – 16 July 2013) was a Russian mathematician, active in the field of probability theory. He was a PhD student of Andrey Kolmogorov at Moscow State University, where he obtained his PhD in 1956. Prokhorov became a corresponding member of the Russian Academy of Sciences in 1966 and a full member in 1972. He was a vice-president of the International Mathematical Union (IMU). He received the Lenin Prize in 1970 and the Order of the Red Banner of Labour in 1975 and 1979. He was also an editor of the Great Soviet Encyclopedia. See also Lévy–Prokhorov metric Prokhorov's theorem
https://en.wikipedia.org/wiki/SMPTE%20424M
SMPTE 424M is a standard published by SMPTE which expands upon SMPTE 259M, SMPTE 344M, and SMPTE 292M allowing for bit-rates of 2.970 Gbit/s and 2.970/1.001 Gbit/s over a single-link coaxial cable. These bit-rates are sufficient for 1080p video at 50 or 60 frames per second. The initial 424M standard was published in 2006, with a revision published in 2012 (SMPTE ST 424:2012). This standard is part of a family of standards that define a serial digital interface (SDI); it is commonly known as 3G-SDI. Formats Within this standard, there are three formats: Level A format is the direct mapping of uncompressed 1080p (up to 60 fps) video into a serial digital interface at the nominal 3 Gbit/s. That is, one video signal, one video stream, in one cable. Level B-DL format is the mapping of dual-link HD-SDI/SMPTE 372M (i.e.: 1080p up to 60 fps) in a single serial digital interface at the nominal 3 Gbit/s. That is, one video signal, two streams, in one cable. Level B-DS format is the dual-stream carriage of two independent HD-SDI/SMPTE 292M signals (720p up to 60 fps or 1080i/1080p up to 30 fps) in a single serial digital interface at the nominal 3 Gbit/s. That is, two video signals, two video streams, in one cable.
https://en.wikipedia.org/wiki/Rhizosphere
The rhizosphere is the narrow region of soil or substrate that is directly influenced by root secretions and associated soil microorganisms, known as the root microbiome. Soil pores in the rhizosphere can contain many bacteria and other microorganisms that feed on sloughed-off plant cells, termed rhizodeposition, and on the proteins and sugars released by roots, termed root exudates. This symbiosis leads to more complex interactions, influencing plant growth and competition for resources. Much of the nutrient cycling and antibiotic-mediated disease suppression that plants require occurs immediately adjacent to roots, due to root exudates and the metabolic products of symbiotic and pathogenic communities of microorganisms. The rhizosphere also provides space for the production of allelochemicals used to control neighbours and relatives. The rhizoplane refers to the root surface, including its associated soil particles, which closely interact with each other. The plant-soil feedback loop and other physical factors occurring at the plant root-soil interface are important selective pressures on communities and growth in the rhizosphere and rhizoplane. Background The term "rhizosphere" was first used in 1904 by the German plant physiologist Lorenz Hiltner to describe how plant roots interface with the surrounding soil. The prefix comes from the Greek rhiza, meaning "root". Hiltner postulated that the rhizosphere was a region surrounding the plant roots, populated with microorganisms under some degree of control by chemicals released from the plant roots. Chemical interactions Chemical availability Plant roots may exude 20–40% of their photosynthetically fixed carbon in the form of sugars and organic acids. Plant root exudates, such as organic acids, change the chemical structure and the biological communities of the rhizosphere in comparison with the bulk or parent soil. Concentrations of organic acids and saccharides affect the ability of the biological communities to shuttle phosphorus, nitrogen, potassium
https://en.wikipedia.org/wiki/Features%20new%20to%20Windows%208
The transition from Windows 7 to Windows 8 introduced a number of new features across various aspects of the operating system. These include a greater focus on optimizing the operating system for touchscreen-based devices (such as tablets) and cloud computing. Development platform Language and standards support Windows 8 introduces the new Windows Runtime (WinRT) platform, which can be used to create a new type of application officially known as Windows Store apps and commonly called Metro-style apps. Such apps run within a secure sandbox and share data with other apps through common APIs. WinRT, being a COM-based API, allows for the use of various programming languages to code apps, including C++, C++/CX, C#, Visual Basic .NET, or HTML5 and JavaScript. Metro-style apps are packaged and distributed via APPX, a new file format for package management. Unlike desktop applications, Metro-style apps can be sideloaded, subject to licensing conditions. Windows 8.1 Update allows for sideloading apps on all Windows 8.1 Pro devices joined to an Active Directory domain. In Windows 8, up to two apps may snap to the side of a widescreen display to allow multi-tasking, forming a sidebar that separates the apps. In Windows 8.1, apps can continually be resized to the desired width. Snapped apps may occupy half of the screen. Large screens allow up to four apps to be snapped. Upon launching an app, Windows allows the user to pick which snapped view the app should open into. The term "Metro-style apps" referred to "Metro", a design language prominently used by Windows 8 and other recent Microsoft products. Reports surfaced that Microsoft employees were told to stop using the term due to potential trademark issues with an unspecified partner. A Microsoft spokesperson, however, denied these reports and stated that "Metro-style" was merely a codename for the new application platform. Windows 8 introduces APIs to support near field communication (NFC) on Windows 8 devices, allowing
https://en.wikipedia.org/wiki/Septum%20verum
Septum verum (true septum) is a region in the lower medial part of the telencephalon that separates the two cerebral hemispheres. The human septum consists of two parts: the septum pellucidum (translucent septum), a thin membrane consisting of white matter and glial cells that separates the lateral ventricles, and the lower, precommissural septum verum, which consists of nuclei and grey matter. The term is sometimes used synonymously with area septalis, to refer to the precommissural part of the lower base of the telencephalon. The septum verum contains the septal nuclei, which are usually considered part of the limbic system. Syntopy Laterally, the septum verum reaches the lower part of the lateral ventricles, with the septal nuclei forming a bulge into the medial side of the ventricles. Dorsally lies the septum pellucidum, a thin membrane of glial cells and fibres that separates the ventricles, and anteriorly lies the lamina terminalis. It continues caudally as the pre-optic area and hypothalamus. The subfornical organ (SFO) can also be found in this area, between the ventral side of the fornix and the interventricular foramina. Function The septum is considered a part of the limbic system, mediating the connection between the cortex and subcortical limbic nuclei. The septum projects fibres to the hypothalamus, hippocampus, amygdala, reticular formation and olfactory cortical areas, suggesting a role in limbic regulation. While its exact function remains controversial, the septum is considered a pleasure zone in animals: studies have shown that stimulation of the septal area can produce feelings ranging from satisfaction to euphoria, and damage can cause hyperactivity and fury. Evolutionary Significance With the exception of the nucleus septalis triangularis, all septal nuclei appear to have increased in size during the evolution of higher primates. See also Septum Pellucidum Septal Nuclei This article uses anatomical terminology; for an overview, see Anatomica
https://en.wikipedia.org/wiki/Maximum%20common%20induced%20subgraph
In graph theory and theoretical computer science, a maximum common induced subgraph of two graphs G and H is a graph that is an induced subgraph of both G and H, and that has as many vertices as possible. Finding this graph is NP-hard. In the associated decision problem, the input is two graphs G and H and a number k. The problem is to decide whether G and H have a common induced subgraph with at least k vertices. This problem is NP-complete. It is a generalization of the induced subgraph isomorphism problem, which arises when k equals the number of vertices in the smaller of G and H, so that this entire graph must appear as an induced subgraph of the other graph. Based on hardness of approximation results for the maximum independent set problem, the maximum common induced subgraph problem is also hard to approximate. This implies that, unless P = NP, there is no approximation algorithm that, in polynomial time on n-vertex graphs, always finds a solution within a factor of n^(1−ε) of optimal, for any ε > 0. One possible solution for this problem is to build a modular product graph of G and H. In this graph, the largest clique corresponds to a maximum common induced subgraph of G and H. Therefore, algorithms for finding maximum cliques can be used to find the maximum common induced subgraph. Moreover, a modified maximum-clique algorithm can be used to find a maximum common connected subgraph. The McSplit algorithm (along with its McSplit↓ variant) is a forward checking algorithm that does not use the clique encoding, but uses a compact data structure to keep track of the vertices in graph H to which each vertex in graph G may be mapped. Both versions of the McSplit algorithm outperform the clique encoding for many graph classes. A more efficient implementation of McSplit is McSplitDAL+PR, which combines a Reinforcement Learning agent with heuristic scores computed with the PageRank algorithm. Maximum common induced subgraph algorithms have a long tradition in cheminforma
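The modular-product reduction described above can be sketched in a few lines of Python. This brute-force version is only practical for tiny graphs, and the adjacency-dict encoding and function names are illustrative choices, not a standard library API.

```python
from itertools import combinations

def modular_product(g, h):
    """g, h: dicts mapping each vertex to its set of neighbours.
    Returns the vertices and edges of the modular product of g and h."""
    nodes = [(u, v) for u in g for v in h]
    edges = set()
    for (u, v), (x, y) in combinations(nodes, 2):
        if u == x or v == y:
            continue  # a mapping must pair distinct vertices with distinct vertices
        # connect iff adjacency agrees in both graphs
        if (x in g[u]) == (y in h[v]):
            edges.add(frozenset([(u, v), (x, y)]))
    return nodes, edges

def max_clique(nodes, edges):
    """Largest clique by exhaustive search, from the largest size down."""
    for size in range(len(nodes), 0, -1):
        for cand in combinations(nodes, size):
            if all(frozenset([p, q]) in edges for p, q in combinations(cand, 2)):
                return list(cand)
    return []

# G: path a-b-c;  H: triangle 1-2-3.
G = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
H = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
nodes, edges = modular_product(G, H)
mapping = max_clique(nodes, edges)
print(len(mapping))  # 2: a single edge is the largest common induced subgraph
```

Each clique vertex is a pair (u, v) mapping a vertex of G to a vertex of H, so the clique directly encodes the common induced subgraph and its isomorphism.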
https://en.wikipedia.org/wiki/Social%20cognitive%20optimization
Social cognitive optimization (SCO) is a population-based metaheuristic optimization algorithm which was developed in 2002. The algorithm is based on social cognitive theory; the key point of its ergodicity is the process of individual learning by a set of agents, each with its own memory, combined with their social learning from the knowledge points in the social sharing library. It has been used for solving continuous optimization, integer programming, and combinatorial optimization problems. It has been incorporated into the NLPSolver extension of Calc in Apache OpenOffice. Algorithm Let max f(x) be a global optimization problem, where x is a state in the problem space S. In SCO, each state is called a knowledge point, and the function f is the goodness function. In SCO, a population of cognitive agents solves the problem in parallel, with a social sharing library. Each agent holds a private memory containing one knowledge point, and the social sharing library X contains a set of knowledge points. The algorithm runs in T iterative learning cycles. Running as a Markov chain process, the system behavior in the tth cycle depends only on the system status in the (t − 1)th cycle. The process flow is as follows: [1. Initialization]: Initialize the private knowledge point x(i) in the memory of each agent i, and all knowledge points in the social sharing library X, normally at random in the problem space S. [2. Learning cycle]: At each cycle t: [2.1. Observational learning] For each agent i: [2.1.1. Model selection]: Find a high-quality model point x_M in X, normally realized using tournament selection, which returns the best knowledge point from a number of randomly selected points. [2.1.2. Quality evaluation]: Compare the private knowledge point x(i) and the model point x_M, and return the one with higher quality as the base point x_B, and the other as the reference point x_R. [2.1.3. Learning]: Combine x_B and x_R to generate a new knowledge point x'. Normally x' should be around x_B, and its distance from x_B is related to the
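The cycle above can be sketched as a simplified minimization loop. The combination operator, the parameter values, and the library-refreshment rule here are illustrative assumptions, not the published formulation (the original maximizes a goodness function; this sketch minimizes an objective).

```python
import random

def sco(f, bounds, n_agents=8, lib_size=20, n_cycles=200, tau=3, seed=0):
    """Simplified SCO-style search minimizing f over a box (illustrative)."""
    rng = random.Random(seed)
    rand_point = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    clip = lambda p: [min(max(x, lo), hi) for x, (lo, hi) in zip(p, bounds)]

    agents = [rand_point() for _ in range(n_agents)]   # private memories
    library = [rand_point() for _ in range(lib_size)]  # social sharing library

    for _ in range(n_cycles):
        new_points = []
        for x_private in agents:
            # model selection: tournament over tau random library points
            model = min(rng.sample(library, tau), key=f)
            # quality evaluation: the better point is the base, the other the reference
            base, ref = sorted([x_private, model], key=f)
            # learning: sample a new point around the base, biased away from
            # the reference (an assumed combination operator)
            new = clip([b + rng.uniform(-1, 2) * (b - r)
                        for b, r in zip(base, ref)])
            new_points.append(new)
        # library refreshment: keep the best points seen so far (elitist)
        library = sorted(library + new_points, key=f)[:lib_size]
        agents = new_points
    return min(library, key=f)

best = sco(lambda p: sum(x * x for x in p), [(-5, 5), (-5, 5)])
print(best)
```

The elitist library makes the best-so-far value non-increasing across cycles; the published algorithm's library-refreshment and learning steps differ in detail.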
https://en.wikipedia.org/wiki/Missing%20heritability%20problem
The missing heritability problem is the fact that single genetic variations cannot account for much of the heritability of diseases, behaviors, and other phenotypes. This is a problem with significant implications for medicine, since a person's susceptibility to disease may depend more on the combined effect of all the genes in the background than on the disease genes in the foreground, or the role of genes may have been severely overestimated. Discovery The missing heritability problem was named as such in 2008 (after the "missing baryon problem" in physics). The Human Genome Project led to optimistic forecasts that the large genetic contributions to many traits and diseases (which were identified by quantitative genetics and behavioral genetics in particular) would soon be mapped and pinned down to specific genes and their genetic variants by methods such as candidate-gene studies, which used small samples with limited genetic sequencing to focus on specific genes believed to be involved, examining single-nucleotide polymorphisms (SNPs). While many hits were found, they often failed to replicate in other studies. The exponential fall in genome sequencing costs led to the use of genome-wide association studies (GWASes), which could simultaneously examine all candidate genes in larger samples than the original findings; the candidate-gene hits were found to be almost always false positives, with only 2–6% replicating. In the specific case of intelligence candidate-gene hits, only one replicated; the top 25 schizophrenia candidate-genes were no more associated with schizophrenia than chance; and of 15 neuroimaging hits, none replicated. The editorial board of Behavior Genetics noted, in setting more stringent requirements for candidate-gene publications, that "the literature on candidate gene associations is full of reports that have not stood up to rigorous replication...it now seems likely that many of the published findings of the last decade are
https://en.wikipedia.org/wiki/Hyperthermia
Hyperthermia, also known simply as overheating, is a condition in which an individual's body temperature is elevated beyond normal due to failed thermoregulation. The person's body produces or absorbs more heat than it dissipates. When extreme temperature elevation occurs, it becomes a medical emergency requiring immediate treatment to prevent disability or death. Almost half a million deaths are recorded every year from hyperthermia. The most common causes include heat stroke and adverse reactions to drugs. Heat stroke is an acute temperature elevation caused by exposure to excessive heat, or combination of heat and humidity, that overwhelms the heat-regulating mechanisms of the body. The latter is a relatively rare side effect of many drugs, particularly those that affect the central nervous system. Malignant hyperthermia is a rare complication of some types of general anesthesia. Hyperthermia can also be caused by a traumatic brain injury. Hyperthermia differs from fever in that the body's temperature set point remains unchanged. The opposite is hypothermia, which occurs when the temperature drops below that required to maintain normal metabolism. The term is from Greek ὑπέρ, hyper, meaning "above", and θέρμος, thermos, meaning "heat". Classification In humans, hyperthermia is defined as a temperature greater than , depending on the reference used, that occurs without a change in the body's temperature set point. The normal human body temperature can be as high as in the late afternoon. Hyperthermia requires an elevation from the temperature that would otherwise be expected. Such elevations range from mild to extreme; body temperatures above can be life-threatening. Signs and symptoms An early stage of hyperthermia can be "heat exhaustion" (or "heat prostration" or "heat stress"), whose symptoms can include heavy sweating, rapid breathing and a fast, weak pulse. If the condition progresses to heat stroke, then hot, dry skin is typical as blood vessels
https://en.wikipedia.org/wiki/Magnetic%20anisotropy
In condensed matter physics, magnetic anisotropy describes how an object's magnetic properties can be different depending on direction. In the simplest case, there is no preferential direction for an object's magnetic moment. It will respond to an applied magnetic field in the same way, regardless of which direction the field is applied. This is known as magnetic isotropy. In contrast, magnetically anisotropic materials will be easier or harder to magnetize depending on which way the object is rotated. For most magnetically anisotropic materials, there are two easiest directions to magnetize the material, which are a 180° rotation apart. The line parallel to these directions is called the easy axis. In other words, the easy axis is an energetically favorable direction of spontaneous magnetization. Because the two opposite directions along an easy axis are usually equivalently easy to magnetize along, the actual direction of magnetization can just as easily settle into either direction, which is an example of spontaneous symmetry breaking. Magnetic anisotropy is a prerequisite for hysteresis in ferromagnets: without it, a ferromagnet is superparamagnetic. Sources The observed magnetic anisotropy in an object can happen for several different reasons. Rather than having a single cause, the overall magnetic anisotropy of a given object is often explained by a combination of these different factors: Magnetocrystalline anisotropy The atomic structure of a crystal introduces preferential directions for the magnetization. Shape anisotropy When a particle is not perfectly spherical, the demagnetizing field will not be equal for all directions, creating one or more easy axes. Magnetoelastic anisotropy Tension may alter magnetic behaviour, leading to magnetic anisotropy. Exchange anisotropy Occurs when antiferromagnetic and ferromagnetic materials interact. At the molecular level The magnetic anisotropy of a benzene ring (A), alkene (B), carbonyl (C), alkyne (D), and
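The easy-axis behavior described above is commonly quantified by the lowest-order uniaxial anisotropy energy density — a standard textbook expression, not taken from this excerpt — where θ is the angle between the magnetization and the easy axis:

```latex
E(\theta) = K_1 \sin^2\theta + K_2 \sin^4\theta
```

For $K_1 > 0$ (and $K_2$ small), $E(\theta)$ is minimized at $\theta = 0$ and $\theta = \pi$: the two equivalent, oppositely directed easy directions along the easy axis.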
https://en.wikipedia.org/wiki/Algal%20bloom
An algal bloom or algae bloom is a rapid increase or accumulation in the population of algae in freshwater or marine water systems. It is often recognized by the discoloration in the water from the algae's pigments. The term algae encompasses many types of aquatic photosynthetic organisms, both macroscopic multicellular organisms like seaweed and microscopic unicellular organisms like cyanobacteria.  Algal bloom commonly refers to the rapid growth of microscopic unicellular algae, not macroscopic algae. An example of a macroscopic algal bloom is a kelp forest. Algal blooms are the result of a nutrient, like nitrogen or phosphorus from various sources (for example fertilizer runoff or other forms of nutrient pollution), entering the aquatic system and causing excessive growth of algae. An algal bloom affects the whole ecosystem. Consequences range from the benign feeding of higher trophic levels to more harmful effects like blocking sunlight from reaching other organisms, causing a depletion of oxygen levels in the water, and, depending on the organism, secreting toxins into the water. Blooms that can injure animals or the ecology, especially those blooms where toxins are secreted by the algae, are usually called "harmful algal blooms" (HAB), and can lead to fish die-offs, cities cutting off water to residents, or states having to close fisheries. The process of the oversupply of nutrients leading to algae growth and oxygen depletion is called eutrophication. Algal and bacterial blooms have persistently contributed to mass extinctions driven by global warming in the geologic past, such as during the end-Permian extinction driven by Siberian Traps volcanism and the biotic recovery following the mass extinction. Bloom characterization The term algal bloom is defined inconsistently depending on the scientific field and can range from a "minibloom" of harmless algae to a large, harmful bloom event. Since algae is a broad term including organisms of widely varying siz
https://en.wikipedia.org/wiki/E.%20Jacquelin%20Dietz
E. Jacquelin Dietz (1951-2020) was an American statistician, interested in nonparametric and multivariate statistics and in statistics education. She was a professor at North Carolina State University until 2004, when she moved to Meredith College. At Meredith, she was head of the mathematics and computer science department for five years, from approximately 2007 to 2012, and taught statistics for 10 years. Dietz was the founding editor-in-chief of Journal of Statistics Education. Education and career Dietz graduated from Oberlin College in 1973, majoring in mathematics and psychobiology, a subject she added to her mathematics courses in order to make her studies less theoretical and more relevant. She entered graduate study at the University of Connecticut in biobehavioral science, but after taking a required statistics course switched to that subject, and completed a master's degree and a Ph.D. in 1975 and 1978 respectively. Her dissertation, supervised by Timothy John Killeen, was Bivariate Nonparametric Tests for the One-Sample Location Problem. Contributions to statistics education Dietz's first scholarly publication in statistics education was in 1989. She founded the Journal of Statistics Education in 1992, and shepherded it into becoming an official publication of the American Statistical Association beginning in 1999; she remained as its editor until 2000. Recognition Dietz was elected as a Fellow of the American Statistical Association in 1996. She was also a winner of the Founder's Award of the American Statistical Association.
https://en.wikipedia.org/wiki/Wireworld
Wireworld, alternatively WireWorld, is a cellular automaton first proposed by Brian Silverman in 1987, as part of his program Phantom Fish Tank. It subsequently became more widely known as a result of an article in the "Computer Recreations" column of Scientific American. Wireworld is particularly suited to simulating transistors, and is Turing-complete. Rules A Wireworld cell can be in one of four different states, usually numbered 0–3 in software, modeled by colors in the examples here: empty (black), electron head (blue), electron tail (red), conductor (yellow). As in all cellular automata, time proceeds in discrete steps called generations (sometimes "gens" or "ticks"). Cells behave as follows: empty → empty, electron head → electron tail, electron tail → conductor, conductor → electron head if exactly one or two of the neighbouring cells are electron heads, otherwise remains conductor. Wireworld uses what is called the Moore neighborhood, which means that in the rules above, neighbouring means one cell away (range value of one) in any direction, both orthogonal and diagonal. These simple rules can be used to construct logic gates (see below). Applications Entities built within Wireworld universes include Langton's Ant (allowing any Langton's Ant pattern to be built within Wireworld) and the Wireworld computer, a Turing-complete computer implemented as a cellular automaton. See also von Neumann's cellular automaton
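The four states and the Moore-neighborhood rule above translate directly into code. The following is a minimal sketch (the function and grid names are illustrative, not from any particular Wireworld implementation), encoding states as 0 = empty, 1 = electron head, 2 = electron tail, 3 = conductor:

```python
def step(grid):
    """Advance a Wireworld grid by one generation.

    grid maps (row, col) -> state; cells absent from the dict are empty.
    """
    new = {}
    for (r, c), state in grid.items():
        if state == 1:            # electron head -> electron tail
            new[(r, c)] = 2
        elif state == 2:          # electron tail -> conductor
            new[(r, c)] = 3
        elif state == 3:          # conductor -> head iff 1 or 2 Moore neighbours are heads
            heads = sum(
                1
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0) and grid.get((r + dr, c + dc)) == 1
            )
            new[(r, c)] = 1 if heads in (1, 2) else 3
        else:                     # empty stays empty
            new[(r, c)] = state
    return new

# A short horizontal "wire": an electron (tail at (0,0), head at (0,1))
# travels rightward along the conductor cells.
wire = {(0, 0): 2, (0, 1): 1, (0, 2): 3, (0, 3): 3}
nxt = step(wire)
# the head becomes a tail, and the conductor next to it becomes the new head
```

Iterating `step` on a conductor loop reproduces the circulating "electrons" from which Wireworld logic gates are built.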
https://en.wikipedia.org/wiki/Protocol%20composition%20logic
Protocol Composition Logic (PCL) is a formal method used for proving security properties of protocols that use symmetric-key and public-key cryptography. PCL is designed around a process calculus with actions for possible protocol steps, such as generating a random number, performing encryption and decryption, sending and receiving messages, and verifying digital signatures. Some problems with the logic have been found, implying that some currently claimed proofs cannot be carried out within the logic.
https://en.wikipedia.org/wiki/Porphyromonas%20gingivalis
Porphyromonas gingivalis belongs to the phylum Bacteroidota and is a nonmotile, Gram-negative, rod-shaped, anaerobic, pathogenic bacterium. It forms black colonies on blood agar. It is found in the oral cavity, where it is implicated in periodontal disease, as well as in the upper gastrointestinal tract, the respiratory tract and the colon. It has been isolated from women with bacterial vaginosis. Collagen degradation observed in chronic periodontal disease results in part from the collagenase enzymes of this species. It has been shown in an in vitro study that P. gingivalis can invade human gingival fibroblasts and can survive in the presence of antibiotics. P. gingivalis invades gingival epithelial cells in high numbers, in which case both bacteria and epithelial cells survive for extended periods of time. High levels of specific antibodies can be detected in patients harboring P. gingivalis. P. gingivalis infection has been linked to Alzheimer's disease and rheumatoid arthritis. It contains the enzyme peptidyl-arginine deiminase, which is involved in citrullination. Patients with rheumatoid arthritis have increased incidence of periodontal disease; antibodies against the bacterium are significantly more common in these patients. P. gingivalis is divided into K-serotypes based upon capsular antigenicity of the various types. These serotypes have driven observations ranging from bacterial cell-to-cell interactions to the serotype-dependent immune response and associated risk of pancreatic cancer. Genome The genome of P. gingivalis was described in 2003, revealing 1,990 open reading frames (i.e. protein-coding sequences), encoded by 2,343,479 bp, with an average G+C content of 48.3%. An estimated 463 genes are essential. Virulence factors Gingipain Arg-gingipain (Rgp) and lys-gingipain (Kgp) are endopeptidase enzymes secreted by P. gingivalis. These gingipains serve many functions for the organism, contributing to its survival and virulence. Arg
https://en.wikipedia.org/wiki/Amazon%20Reef
The Amazon Reef, or Amazonian Reef, is an extensive coral and sponge reef system, located in the Atlantic Ocean off the coast of French Guiana and northern Brazil. It is one of the largest known reef systems in the world, with scientists estimating its length at over , and its area as over . Publication of its discovery was released in April 2016, following an oceanographic study of the region in 2012. Evidence of a large structure near the delta of the Amazon River dated from as early as the 1950s. History In the 1970s, the biologist Rodrigo Moura completed a study on fishing on the continental shelf and wanted to expand his research by locating the reefs where he caught the fish. When Moura located the fish he caught around the Amazon Reef and in the mouth of the Amazon River, he took this as an indication that there must be biodiversity underneath, since the fish in question were coral reef fish. A few decades later, a group of students from the University of Georgia noted that Moura's article did not contain GPS coordinates and used the sound-wave data and sea floor samples from Moura's work to locate the reef. Once they believed they had located the reef, they dredged the bottom to confirm its location. The process of finding the reef took about three years before an official announcement was made about its discovery. The Amazon River is home to about 20 percent of the world's fresh water supply, placing the Amazon Reef at the mouth of the largest river in the world, where every day one fifth of the world's river water flows into the ocean from the Amazon River. Because of this, the Amazon Reef is less biologically diverse compared to other reefs of its kind. Geography and ecology The reef system has been identified as a coral and sponge reef. Scientists estimated the reef's size to be in area, and over in length, making it one of the largest reef systems in the world, comparable to the size of the island of Cyprus. Another estimate also puts the general ecoregio
https://en.wikipedia.org/wiki/Fiber-reinforced%20cementitious%20matrix
A fiber-reinforced cementitious matrix (FRCM) is a reinforcement system composed of fibers (such as steel, aramid, basalt, carbon, polyparaphenylenebenzobisoxazole, and glass) embedded in an inorganic matrix, usually made of cement or lime mortar. In the international literature, FRCMs are also called textile-reinforced concrete (TRC), textile-reinforced mortars (TRM), fabric-reinforced mortar (FRM), or inorganic matrix-grid composites (IMG). Starting from the second decade of the 21st century they have been used for the structural rehabilitation of existing buildings, in particular those made of masonry (existing and historical) or of reinforced concrete, to increase their load-bearing capacity under both vertical and horizontal loads (including seismic ones). History The efficacy of FRCM lies in combining several materials to give better mechanical properties to structural systems. A historical example that shares some features with FRCM is the combination of sun-dried clay and straw for the production of bricks in Mesopotamia, or the Roman cocciopesto. The first FRP composite materials appeared in the 1940s in aeronautical engineering. FRCM composite materials, on the other hand, saw their first applications in the early years of the 21st century. Indeed, in the second decade of the same century, FRCMs have joined the now classic FRPs in terms of importance for structural rehabilitation. This is because the inorganic matrix has shown numerous advantages over its organic counterpart (FRP), including a better response when applied to fragile substrates such as masonry and reinforced concrete, thanks to the greater compatibility of the mortar layer with such substrates. Properties FRCM composites constitute systems or kits according to the definition set out in point 2 of art. 2 of EU Regulation 305/2011. They are composed of two fundamental components: an inorganic matrix and a reinforcement. Sometimes, to impr
https://en.wikipedia.org/wiki/OpenNMS
OpenNMS is a free and open-source enterprise grade network monitoring and network management platform. It is developed and supported by a community of users and developers and by the OpenNMS Group, offering commercial services, training and support. The goal is for OpenNMS to be a truly distributed, scalable management application platform for all aspects of the FCAPS network management model while remaining 100% free and open source. Currently the focus is on Fault and Performance Management. All code associated with the project is available under the Affero General Public License. The OpenNMS Project is maintained by The Order of the Green Polo. History The OpenNMS Project was started in July, 1999 by Steve Giles, Brian Weaver and Luke Rindfuss and their company PlatformWorks. It was registered as project 4141 on SourceForge in March 2000. On September 28, 2000, PlatformWorks was acquired by Atipa, a Kansas City-based competitor to VA Linux Systems. In July 2001, Atipa changed its name to Oculan. In September 2002, Oculan decided to stop supporting the OpenNMS project. Tarus Balog, then an Oculan employee, left the company to continue to focus on the project. In September 2004, The OpenNMS Group was started by Balog, Matt Brozowski and David Hustace to provide a commercial services and support business around the project. Shortly after that, The Order of the Green Polo (OGP) was founded to manage the OpenNMS Project itself. While many members of the OGP are also employees of The OpenNMS Group, it remains a separate organization. Platform support and requirements OpenNMS is written in Java, and thus can run on any platform with support for a Java SDK version 11 or higher. Precompiled binaries are available for most Linux distributions. In addition to Java, it requires the PostgreSQL database, although work is being done to make the application database independent by leveraging the Hibernate project. Features OpenNMS describes itself as a "network m
https://en.wikipedia.org/wiki/PIPES
PIPES is the common name for piperazine-N,N′-bis(2-ethanesulfonic acid), a frequently used buffering agent in biochemistry. It is an ethanesulfonic acid buffer developed by Good et al. in the 1960s. Applications PIPES has two pKa values. One pKa (6.76 at 25 °C) is near the physiological pH, which makes it useful in cell culture work. Its effective buffering range is 6.1–7.5 at 25 °C. The second pKa value is 2.67, with a buffering range of 1.5–3.5. PIPES has been documented to minimize lipid loss when used to buffer glutaraldehyde fixation of plant and animal tissues for histology. Fungal zoospore fixation for fluorescence microscopy and electron microscopy was optimized with a combination of glutaraldehyde and formaldehyde in PIPES buffer. It has a negligible capacity to bind divalent ions. See also MOPS HEPES MES Tris Common buffer compounds used in biology Good's buffers
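The practical use of a pKa near physiological pH can be illustrated with the general Henderson–Hasselbalch relation, pH = pKa + log10([base]/[acid]) — a standard buffer equation, not specific to the PIPES article; the helper name below is hypothetical:

```python
import math

def buffer_ph(pka, base_conc, acid_conc):
    """Henderson-Hasselbalch estimate: pH = pKa + log10([A-]/[HA])."""
    return pka + math.log10(base_conc / acid_conc)

# An equimolar PIPES buffer at 25 degrees C sits at its pKa of 6.76;
# shifting the base/acid ratio moves the pH within the ~6.1-7.5 range.
ph_equimolar = buffer_ph(6.76, 0.1, 0.1)   # -> 6.76
ph_more_base = buffer_ph(6.76, 0.2, 0.1)   # slightly above 6.76
```

This is why PIPES buffers effectively roughly one pH unit on either side of each of its two pKa values.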
https://en.wikipedia.org/wiki/Memory%20architecture
Memory architecture describes the methods used to implement electronic computer data storage in a manner that combines the fastest, most reliable, most durable, and least expensive ways to store and retrieve information. Depending on the specific application, one of these requirements may need to be compromised in order to improve another. Memory architecture also explains how binary digits are converted into electric signals and stored in the memory cells, as well as the structure of a memory cell. For example, dynamic memory is commonly used for primary data storage due to its fast access speed. However, dynamic memory must be repeatedly refreshed with a surge of current dozens of times per second, or the stored data will decay and be lost. Flash memory allows for long-term storage over a period of years, but it is much slower than dynamic memory, and its storage cells wear out with frequent use. Similarly, the data bus is often designed to suit specific needs such as serial or parallel data access, and the memory may be designed to provide for parity error detection or even error correction. The earliest memory architectures are the Harvard architecture, which has two physically separate memories and data paths for program and data, and the Princeton architecture, which uses a single memory and data path for both program and data storage. Most general-purpose computers use a hybrid split-cache modified Harvard architecture that appears to an application program to be a pure Princeton architecture machine with gigabytes of virtual memory, but internally (for speed) it operates with an instruction cache physically separate from a data cache, more like the Harvard model. DSP systems usually have a specialized, high-bandwidth memory subsystem, with no support for memory protection or virtual memory management. Many digital signal processors have three physically separate memories and data paths: program storage, coefficie
https://en.wikipedia.org/wiki/Genetic%20hitchhiking
Genetic hitchhiking, also called genetic draft or the hitchhiking effect, is when an allele changes frequency not because it itself is under natural selection, but because it is near another gene that is undergoing a selective sweep and that is on the same DNA chain. When one gene goes through a selective sweep, any other nearby polymorphisms that are in linkage disequilibrium will tend to change their allele frequencies too. Selective sweeps happen when newly appeared (and hence still rare) mutations are advantageous and increase in frequency. Neutral or even slightly deleterious alleles that happen to be close by on the chromosome 'hitchhike' along with the sweep. In contrast, effects on a neutral locus due to linkage disequilibrium with newly appeared deleterious mutations are called background selection. Both genetic hitchhiking and background selection are stochastic (random) evolutionary forces, like genetic drift. History The term hitchhiking was coined in 1974 by Maynard Smith and John Haigh. Subsequently the phenomenon was studied by John H. Gillespie and others. Outcomes Hitchhiking occurs when a polymorphism is in linkage disequilibrium with a second locus that is undergoing a selective sweep. The allele that is linked to the adaptation will increase in frequency, in some cases until it becomes fixed in the population. The other allele, which is linked to the non-advantageous version, will decrease in frequency, in some cases until extinction. Overall, hitchhiking reduces the amount of genetic variation. A hitchhiker mutation (or passenger mutation in cancer biology) may itself be neutral, advantageous, or deleterious. Recombination can interrupt the process of genetic hitchhiking, ending it before the hitchhiking neutral or deleterious allele becomes fixed or goes extinct. The closer a hitchhiking polymorphism is to the gene under selection, the less opportunity there is for recombination to occur. This leads to a reduction in genetic variation near
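The core of the hitchhiking effect can be sketched with a deterministic toy model (my own illustration, not from the article): under complete linkage and no recombination, a neutral allele found only on the selected background follows exactly the same frequency trajectory as the beneficial allele it rides with:

```python
def sweep(p0, s, generations):
    """Deterministic haploid selection: frequency of a haplotype with
    relative fitness 1 + s versus 1, starting from frequency p0.

    A neutral allele in complete linkage with the selected allele shares
    exactly this trajectory -- it 'hitchhikes' to high frequency without
    itself being under selection.
    """
    p = p0
    traj = [p]
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        traj.append(p)
    return traj

# A new beneficial mutation at 1% frequency with a 5% advantage:
traj = sweep(p0=0.01, s=0.05, generations=200)
# traj rises monotonically toward fixation; the linked neutral
# polymorphism's frequency is identical at every generation.
```

Recombination would decouple the two loci partway through such a sweep, which is why, as noted above, hitchhiking is strongest for polymorphisms closest to the selected gene.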
https://en.wikipedia.org/wiki/Proofs%20of%20trigonometric%20identities
There are several equivalent ways of defining the trigonometric functions, and the proofs of the trigonometric identities between them depend on the chosen definition. The oldest and most elementary definition is based on the geometry of right triangles. The proofs given in this article use this definition, and thus apply to non-negative angles not greater than a right angle. For greater and negative angles, see Trigonometric functions. Other definitions, and therefore other proofs, are based on the Taylor series of sine and cosine, or on the differential equation to which they are solutions. Elementary trigonometric identities Definitions The six trigonometric functions are defined for every real number, except, for some of them, for angles that differ from 0 by a multiple of the right angle (90°). In a right triangle with hypotenuse h, side a opposite the angle θ, and side b adjacent to it, the six trigonometric functions of θ are, for angles smaller than the right angle: sin θ = a/h, cos θ = b/h, tan θ = a/b, cot θ = b/a, sec θ = h/b, csc θ = h/a. Ratio identities In the case of angles smaller than a right angle, the following identities are direct consequences of the above definitions through the division identity: tan θ = (a/h)/(b/h) = sin θ/cos θ. They remain valid for angles greater than 90° and for negative angles. Similarly, cot θ = cos θ/sin θ. Complementary angle identities Two angles whose sum is π/2 radians (90 degrees) are complementary. In a right triangle, the angles at the two non-right vertices are complementary, so we can exchange a and b, and change θ to π/2 − θ, obtaining: sin(π/2 − θ) = cos θ, cos(π/2 − θ) = sin θ, tan(π/2 − θ) = cot θ, cot(π/2 − θ) = tan θ, sec(π/2 − θ) = csc θ, csc(π/2 − θ) = sec θ. Pythagorean identities Identity 1: sin²θ + cos²θ = 1. The following two results follow from this and the ratio identities. To obtain the first, divide both sides of sin²θ + cos²θ = 1 by cos²θ; for the second, divide by sin²θ: tan²θ + 1 = sec²θ and 1 + cot²θ = csc²θ. Identity 2: The following accounts for all three reciprocal functions: csc θ · sec θ = tan θ + cot θ. Proof 2: Refer to the triangle above. Note that a² + b² = h² by the Pythagorean theorem. Substituting with appropriate functions, csc θ · sec θ = (h/a)(h/b) = h²/(ab) = (a² + b²)/(ab). Rearranging gives: a/b + b/a = tan θ + cot θ. Angle sum identities Sine Draw a horizontal line (the x-axis); mark an origin O. Draw a line from O at an angle above the horizontal line and a second line at an
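The geometric construction begun above arrives at the standard angle sum formulas (stated here for reference; the excerpt's derivation is cut off mid-construction):

```latex
\begin{aligned}
\sin(\alpha+\beta) &= \sin\alpha\cos\beta + \cos\alpha\sin\beta,\\
\cos(\alpha+\beta) &= \cos\alpha\cos\beta - \sin\alpha\sin\beta.
\end{aligned}
```

Setting $\beta = -\beta$ and using $\sin(-\beta) = -\sin\beta$, $\cos(-\beta) = \cos\beta$ yields the corresponding difference formulas.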
https://en.wikipedia.org/wiki/De%20Bruijn%20sequence
In combinatorial mathematics, a de Bruijn sequence of order n on a size-k alphabet A is a cyclic sequence in which every possible length-n string on A occurs exactly once as a substring (i.e., as a contiguous subsequence). Such a sequence is denoted by B(k, n) and has length k^n, which is also the number of distinct strings of length n on A. Each of these distinct strings, when taken as a substring of B(k, n), must start at a different position, because substrings starting at the same position are not distinct. Therefore, B(k, n) must have at least k^n symbols. And since B(k, n) has exactly k^n symbols, de Bruijn sequences are optimally short with respect to the property of containing every string of length n at least once. The number of distinct de Bruijn sequences B(k, n) is (k!)^(k^(n−1)) / k^n. The sequences are named after the Dutch mathematician Nicolaas Govert de Bruijn, who wrote about them in 1946. As he later wrote, the existence of de Bruijn sequences for each order, together with the above properties, had already been proved for the case of alphabets with two elements; the generalization to larger alphabets came later. Automata for recognizing these sequences are denoted as de Bruijn automata. In most applications, A = {0,1}. History The earliest known example of a de Bruijn sequence comes from Sanskrit prosody where, since the work of Pingala, each possible three-syllable pattern of long and short syllables is given a name, such as 'y' for short–long–long and 'm' for long–long–long. To remember these names, the mnemonic yamātārājabhānasalagām is used, in which each three-syllable pattern occurs starting at its name: 'yamātā' has a short–long–long pattern, 'mātārā' has a long–long–long pattern, and so on, until 'salagām' which has a short–short–long pattern. This mnemonic, equivalent to a de Bruijn sequence on binary 3-tuples, is of unknown antiquity, but is at least as old as Charles Philip Brown's 1869 book on Sanskrit prosody that mentions it and considers it "an ancient line, written by Pāṇini". In 1894, A. de
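A B(k, n) sequence can be generated by the classic "concatenate Lyndon words" construction (the FKM algorithm). A sketch in Python:

```python
def de_bruijn(k, n):
    """Return a de Bruijn sequence B(k, n) over the alphabet {0, ..., k-1}
    as a list of symbols, using the FKM (Lyndon word) construction."""
    a = [0] * k * n
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1 : p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

b = de_bruijn(2, 3)
# b == [0, 0, 0, 1, 0, 1, 1, 1]: length 2**3 = 8, and read cyclically
# it contains each of the 8 binary triples exactly once.
```

For example, wrapping `b` around and sliding a window of width 3 over it visits 000, 001, 010, 101, 011, 111, 110, 100 — every binary string of length 3.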
https://en.wikipedia.org/wiki/BRENDA%20tissue%20ontology
The BRENDA tissue ontology (BTO) represents a comprehensive structured encyclopedia. It provides terms, classifications, and definitions of tissues, organs, anatomical structures, plant parts, cell cultures, cell types, and cell lines of organisms from all taxonomic groups (animals, plants, fungi, protozoa) as enzyme sources. The information is connected to the functional data in the BRENDA ("BRaunschweig ENzyme DAtabase") enzyme information system. BTO is one of the first tissue-specific ontologies in the life sciences not restricted to a specific organism or organism group, providing user-friendly access to a wide range of tissue and cell-type information. Databases such as the Ontology Lookup Service and the MIRIAM Registry of the EBI-EMBL, the TissueDistributionDB (including the Tissue Synonym Library) of the German Cancer Research Center (DKFZ) in Heidelberg, and the BioPortal platform of the National Center for Biomedical Ontology in Stanford, USA, rely on BTO and implement the encyclopedia as an essential repository of information in their respective platforms. BTO enables users from medical research and pharmaceutical sciences to search for the occurrence and histological detection of disease-related enzymes in tissues, which play an important role in diagnosis, therapies, and drug development. In biochemistry and biotechnology, the organism-specific tissue terms linked to enzyme functional data are an important resource for understanding metabolism and regulation in the life sciences. Ontologies represent classification systems that provide controlled and structured vocabularies. They are important tools to illustrate and to link evolutionary correlations. Development of BTO started in 2003, aimed at connecting the biochemical and molecular biological enzyme data of BRENDA with a hierarchical and standardized collection of tissue-specific terms. The functional enzyme data and information in BRENDA have been manually annotated and str
https://en.wikipedia.org/wiki/Ultrahyperbolic%20equation
In the mathematical field of differential equations, the ultrahyperbolic equation is a partial differential equation (PDE) for an unknown scalar function u of 2n variables x_1, ..., x_n, y_1, ..., y_n of the form

  ∂²u/∂x_1² + ... + ∂²u/∂x_n² − ∂²u/∂y_1² − ... − ∂²u/∂y_n² = 0.

More generally, if Q is any quadratic form in 2n variables with signature (n, n), then any PDE whose principal part is Q(∂/∂x_1, ..., ∂/∂y_n)u is said to be ultrahyperbolic. Any such equation can be put in the form above by means of a change of variables. The ultrahyperbolic equation has been studied from a number of viewpoints. On the one hand, it resembles the classical wave equation. This has led to a number of developments concerning its characteristics, one of which is due to Fritz John: the John equation. In 2008, Walter Craig and Steven Weinstein proved that under a nonlocal constraint, the initial value problem is well-posed for initial data given on a codimension-one hypersurface. And later, in 2022, a research team at the University of Michigan extended the conditions for solving ultrahyperbolic wave equations to complex-time (kime), demonstrated space-kime dynamics, and showed data science applications using tensor-based linear modeling of functional magnetic resonance imaging data. The equation has also been studied from the point of view of symmetric spaces, and elliptic differential operators. In particular, the ultrahyperbolic equation satisfies an analog of the mean value theorem for harmonic functions. Notes
https://en.wikipedia.org/wiki/Invariant%20mass
The invariant mass, rest mass, intrinsic mass, proper mass, or in the case of bound systems simply mass, is the portion of the total mass of an object or system of objects that is independent of the overall motion of the system. More precisely, it is a characteristic of the system's total energy and momentum that is the same in all frames of reference related by Lorentz transformations. If a center-of-momentum frame exists for the system, then the invariant mass of a system is equal to its total mass in that "rest frame". In other reference frames, where the system's momentum is nonzero, the total mass (a.k.a. relativistic mass) of the system is greater than the invariant mass, but the invariant mass remains unchanged. Because of mass–energy equivalence, the rest energy of the system is simply the invariant mass times the speed of light squared. Similarly, the total energy of the system is its total (relativistic) mass times the speed of light squared. Systems whose four-momentum is a null vector (for example, a single photon or many photons moving in exactly the same direction) have zero invariant mass and are referred to as massless. A physical object or particle moving faster than the speed of light would have space-like four-momenta (such as the hypothesized tachyon), and these do not appear to exist. Any time-like four-momentum possesses a reference frame where the momentum (3-dimensional) is zero, which is a center of momentum frame. In this case, invariant mass is positive and is referred to as the rest mass. If objects within a system are in relative motion, then the invariant mass of the whole system will differ from the sum of the objects' rest masses. This is also equal to the total energy of the system divided by c2. See mass–energy equivalence for a discussion of definitions of mass. Since the mass of systems must be measured with a weight or mass scale in a center of momentum frame in which the entire system has zero momentum, such a scale always me
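With symbols made explicit, the relations described above read as follows, where E and p are the system's total energy and total momentum in any frame, and m is the invariant mass:

```latex
m^2 c^4 = E^2 - \lVert \mathbf{p} \rVert^2 c^2,
\qquad
E_0 = m c^2 .
```

In the center-of-momentum frame $\mathbf{p} = 0$, so $E = E_0 = mc^2$ there; for a single photon $E = \lVert \mathbf{p} \rVert c$, giving $m = 0$, consistent with the massless case above.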
https://en.wikipedia.org/wiki/Bacterial%20anaerobic%20corrosion
Bacterial anaerobic corrosion is the bacterially induced oxidation of metals. Corrosion of metals typically alters the metal to a form that is more stable. Thus, bacterial anaerobic corrosion typically occurs in conditions favorable to the corrosion of the underlying substrate. In humid, anoxic conditions the corrosion of metals occurs as a result of a redox reaction. This redox reaction generates molecular hydrogen from local hydrogen ions. This anaerobic corrosion occurs spontaneously. Anaerobic corrosion primarily occurs on metallic substrates but may also occur on concrete. Details Bacterial anaerobic corrosion typically impacts metallic substrates but may also occur in concrete. Corrosion of concrete media leads to considerable losses in industrial settings. When considering the corrosion of concrete there is significant documentation of structural degradation in concrete wastewater infrastructure where wastewater is collected or treated. Similarly, biofilms are important for bacterial anaerobic corrosion of metals in wastewater pipes. In bacterial anaerobic corrosion there is general corrosion of substrates as well as another form of corrosion known as pitting. In both general and pitting corrosion, the breakdown process occurs in aqueous conditions. Bacteria tend to form biofilms as their primary means of corroding metals, with different bacteria dominating across different settings. In municipal wastewater, Desulfovibrio desulfuricans is the main contributor to corrosion. Chemistry A base metal, such as iron (Fe), goes into aqueous solution as a positively charged cation, Fe²⁺. As the metal is oxidized under anaerobic conditions by the protons of water, H⁺ ions are reduced to form molecular H₂. This can be written in the following ways under acidic and neutral conditions respectively: Fe + 2 H⁺  →  Fe²⁺ + H₂ Fe + 2 H₂O  →  Fe(OH)₂ + H₂ Usually, a thin film of molecular hydrogen forms on the metal. Sulfate-reducing bacteria oxidize the molecular
https://en.wikipedia.org/wiki/Gadget%20%28computer%20science%29
In computational complexity theory, a gadget is a subunit of a problem instance that simulates the behavior of one of the fundamental units of a different computational problem. Gadgets are typically used to construct reductions from one computational problem to another, as part of proofs of NP-completeness or other types of computational hardness. The component design technique is a method for constructing reductions by using gadgets. The use of gadgets has been traced to a 1954 paper in graph theory by W. T. Tutte, in which Tutte provided gadgets for reducing the problem of finding a subgraph with given degree constraints to a perfect matching problem. However, the "gadget" terminology has a later origin, and does not appear in Tutte's paper. Example Many NP-completeness proofs are based on many-one reductions from 3-satisfiability, the problem of finding a satisfying assignment to a Boolean formula that is a conjunction (Boolean and) of clauses, each clause being the disjunction (Boolean or) of three terms, and each term being a Boolean variable or its negation. A reduction from this problem to a hard problem on undirected graphs, such as the Hamiltonian cycle problem or graph coloring, would typically be based on gadgets in the form of subgraphs that simulate the behavior of the variables and clauses of a given 3-satisfiability instance. These gadgets would then be glued together to form a single graph, a hard instance for the graph problem under consideration. For instance, the problem of testing 3-colorability of graphs may be proven NP-complete by a reduction from 3-satisfiability of this type. The reduction uses two special graph vertices, labeled as "Ground" and "False", that are not part of any gadget. As shown in the figure, the gadget for a variable x consists of two vertices connected in a triangle with the ground vertex; one of the two vertices of the gadget is labeled with x and the other is labeled with the negation of x. The gadget for a clause consists o
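The variable gadget just described can be sketched as a small graph-construction routine. The vertex labels (`Ground`, `not_x`) and the bare edge-set representation are illustrative choices for this sketch, not part of any standard reduction library:

```python
def variable_gadget(var, edges):
    """Triangle gadget for Boolean variable `var` in the 3-SAT -> 3-coloring
    reduction: the vertices labeled var and "not var" form a triangle with
    the shared Ground vertex, so in any proper 3-coloring they must take
    the two remaining (non-Ground) colors, read as "true" and "false"."""
    pos, neg = var, "not_" + var
    edges.add((pos, neg))
    edges.add((pos, "Ground"))
    edges.add((neg, "Ground"))
    return pos, neg

edges = set()
for v in ["x", "y", "z"]:
    variable_gadget(v, edges)
print(len(edges))  # 3 variables x 3 triangle edges each = 9
```

Gluing in the clause gadgets (omitted here, as above) would complete the reduction to a single graph.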
https://en.wikipedia.org/wiki/FRACTRAN
FRACTRAN is a Turing-complete esoteric programming language invented by the mathematician John Conway. A FRACTRAN program is an ordered list of positive fractions together with an initial positive integer input n. The program is run by updating the integer n as follows: for the first fraction f in the list for which nf is an integer, replace n by nf; repeat this rule until no fraction in the list produces an integer when multiplied by n, then halt. Conway gives the following FRACTRAN program, called PRIMEGAME, which finds successive prime numbers: 17/91, 78/85, 19/51, 23/38, 29/33, 77/29, 95/23, 77/19, 1/17, 11/13, 13/11, 15/2, 1/7, 55/1. Starting with n=2, this FRACTRAN program generates the following sequence of integers: 2, 15, 825, 725, 1925, 2275, 425, 390, 330, 290, 770, ... After 2, this sequence contains the following powers of 2: 4 = 2², 8 = 2³, 32 = 2⁵, 128 = 2⁷, 2048 = 2¹¹, ... The exponents of these powers of two are the prime numbers in increasing order: 2, 3, 5, 7, 11, etc. Understanding a FRACTRAN program A FRACTRAN program can be seen as a type of register machine where the registers are stored in prime exponents in the argument n. Using Gödel numbering, a positive integer n can encode an arbitrary number of arbitrarily large positive integer variables. The value of each variable is encoded as the exponent of a prime number in the prime factorization of the integer. For example, the integer 60 = 2² · 3 · 5 represents a register state in which one variable (which we will call v2) holds the value 2 and two other variables (v3 and v5) hold the value 1. All other variables hold the value 0. A FRACTRAN program is an ordered list of positive fractions. Each fraction represents an instruction that tests one or more variables, represented by the prime factors of its denominator. For example, the fraction 21/20 = (3 · 7)/(2² · 5) tests v2 and v5. If v2 ≥ 2 and v5 ≥ 1, then it subtracts 2 from v2 and 1 from v5 and adds 1 to v3 and 1 to v7: for example, 60 · 21/20 = 63 = 3² · 7. Since the FRACTRAN program is just a list of fractions, these test-decrement-increment instructions are the only allowed instructions in the FRACTRAN language. In addition the following restrictions apply: Each time an instruction is execut
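The update rule above is short enough to implement directly. A minimal interpreter using Python's exact-rational `fractions` module, run on PRIMEGAME (the function name and step-limit convention are choices of this sketch):

```python
from fractions import Fraction

def fractran(program, n, steps):
    """Run a FRACTRAN program for at most `steps` updates: repeatedly
    replace n by n*f for the first fraction f in the list such that
    n*f is an integer; halt when no fraction applies."""
    out = [n]
    for _ in range(steps):
        for f in program:
            if (n * f).denominator == 1:
                n = int(n * f)
                out.append(n)
                break
        else:
            break  # no fraction yields an integer: halt
    return out

# Conway's PRIMEGAME
PRIMEGAME = [Fraction(a, b) for a, b in
             [(17, 91), (78, 85), (19, 51), (23, 38), (29, 33),
              (77, 29), (95, 23), (77, 19), (1, 17), (11, 13),
              (13, 11), (15, 2), (1, 7), (55, 1)]]

print(fractran(PRIMEGAME, 2, 10))
# [2, 15, 825, 725, 1925, 2275, 425, 390, 330, 290, 770]
```

Using `Fraction` keeps all arithmetic exact, so the integrality test `(n * f).denominator == 1` is reliable for arbitrarily large n.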
https://en.wikipedia.org/wiki/List%20of%20web%20analytics%20software
This is a list of web analytics software used to collect and display data about visiting website users. Self-hosted software Free / Open source (FLOSS) This is a comparison table of web analytics software released under a free software license. Proprietary This is a comparison table of web analytics proprietary software. Hosted / Software as a service This is a comparison table of hosted web analytics software as a service.
https://en.wikipedia.org/wiki/Synovial%20sarcoma%2C%20X%20breakpoint
Synovial sarcoma, X breakpoint (SSX) refers to a group of genes rearranged in synovial sarcoma. They include: SSX1 SSX2 and SSX2B SSX3 SSX4 and SSX4B SSX5 SSX6 SSX7 SSX8 SSX9 SSX10 The group also has several associated pseudogenes, and the interacting protein SSX2IP. The translocation t(X;18) creates a fusion of the SYT gene (at 18q11) with either SSX1 or SSX2 (both at Xp11). Neither SYT nor the SSX proteins contain DNA-binding domains. Instead, they appear to be transcriptional regulators whose actions are mediated primarily through protein–protein interactions, with BRM in the case of SYT, and with Polycomb group repressors in the case of SSX.
https://en.wikipedia.org/wiki/Flag%20%28geometry%29
In (polyhedral) geometry, a flag is a sequence of faces of a polytope, each contained in the next, with exactly one face from each dimension. More formally, a flag ψ of an n-polytope is a set {F₋₁, F₀, ..., Fₙ} such that Fᵢ ≤ Fᵢ₊₁ (−1 ≤ i ≤ n − 1) and there is precisely one Fᵢ in ψ for each i (−1 ≤ i ≤ n). Since, however, the minimal face F₋₁ and the maximal face Fₙ must be in every flag, they are often omitted from the list of faces, as a shorthand. These latter two are called improper faces. For example, a flag of a polyhedron comprises one vertex, one edge incident to that vertex, and one polygonal face incident to both, plus the two improper faces. A polytope may be regarded as regular if, and only if, its symmetry group is transitive on its flags. This definition excludes chiral polytopes. Incidence geometry In the more abstract setting of incidence geometry, which is a set having a symmetric and reflexive relation called incidence defined on its elements, a flag is a set of elements that are mutually incident. This level of abstraction generalizes both the polyhedral concept given above as well as the related flag concept from linear algebra. A flag is maximal if it is not contained in a larger flag. An incidence geometry (Ω, I) has rank r if Ω can be partitioned into sets Ω₁, Ω₂, ..., Ωᵣ, such that each maximal flag of the geometry intersects each of these sets in exactly one element. In this case, the elements of set Ωᵢ are called elements of type i. Consequently, in a geometry of rank r, each maximal flag has exactly r elements. An incidence geometry of rank 2 is commonly called an incidence structure with elements of type 1 called points and elements of type 2 called blocks (or lines in some situations). More formally, an incidence structure is a triple D = (V, B, I) where V and B are any two disjoint sets and I is a binary relation between V and B, that is, I ⊆ V × B. The elements of V will be called points, those of B blocks and those of I flags. Notes
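As a concrete count, the flags of a triangle (a 2-polytope, with the two improper faces included) can be enumerated; the vertex and edge labels here are ad hoc:

```python
from itertools import product

# Faces of a triangle by dimension; the improper faces are the empty
# set (dimension -1) and the whole polygon (dimension 2).
vertices = ["A", "B", "C"]
edges = {"AB": {"A", "B"}, "BC": {"B", "C"}, "CA": {"C", "A"}}

# A flag picks one incident face per dimension: empty set, vertex,
# edge containing that vertex, whole triangle.
flags = [("empty", v, e, "triangle")
         for v, e in product(vertices, edges)
         if v in edges[e]]

print(len(flags))  # each of 3 vertices lies on 2 edges -> 6 flags
```

The count 6 equals the order of the triangle's symmetry group, consistent with regularity meaning flag-transitivity.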
https://en.wikipedia.org/wiki/Stiction
Stiction (a portmanteau of the words static and friction) is the force that needs to be overcome to enable relative motion of stationary objects in contact. Any solid objects pressing against each other (but not sliding) will require some threshold of force parallel to the surface of contact in order to overcome static adhesion. Stiction is a threshold, not a continuous force. However, what appears to be stiction can also be an effect of kinetic friction acting during rotation. In situations where two surfaces with areas below the micrometer scale come into close proximity (as in an accelerometer), they may adhere together. At this scale, electrostatic and/or van der Waals and hydrogen bonding forces become significant. The phenomenon of two such surfaces being adhered together in this manner is also called stiction. Stiction may be related to hydrogen bonding or residual contamination. Automobiles Stiction is also the threshold at which a rolling object would begin to slide over a surface rather than rolling at the expected rate (and in the case of a wheel, in the expected direction). In this case, it is called "rolling friction" or μr. This is why driver training courses teach that, if a car begins to slide sideways, the driver should avoid braking and instead try to steer in the same direction as the slide. This gives the wheels a chance to regain static contact by rolling, which gives the driver some control again. Similarly, when trying to accelerate rapidly (particularly from a standing start) an overenthusiastic driver may "squeal" the driving wheels, but this impressive display of noise and smoke is less effective than maintaining static contact with the road. Many stunt-driving techniques (such as drifting) are done by deliberately breaking and/or regaining this rolling friction. A car on a slippery surface can slide a long way with little control over orientation if the driver "locks" the wheels in stationary positions by pressing hard on the brakes. Anti-lock br
https://en.wikipedia.org/wiki/Pinhole%20camera%20model
The pinhole camera model describes the mathematical relationship between the coordinates of a point in three-dimensional space and its projection onto the image plane of an ideal pinhole camera, where the camera aperture is described as a point and no lenses are used to focus light. The model does not include, for example, geometric distortions or blurring of unfocused objects caused by lenses and finite-sized apertures. It also does not take into account that most practical cameras have only discrete image coordinates. This means that the pinhole camera model can only be used as a first-order approximation of the mapping from a 3D scene to a 2D image. Its validity depends on the quality of the camera and, in general, decreases from the center of the image to the edges as lens distortion effects increase. Some of the effects that the pinhole camera model does not take into account can be compensated for, for example by applying suitable coordinate transformations on the image coordinates; other effects are sufficiently small to be neglected if a high-quality camera is used. This means that the pinhole camera model often can be used as a reasonable description of how a camera depicts a 3D scene, for example in computer vision and computer graphics. Geometry The geometry related to the mapping of a pinhole camera is illustrated in the figure. The figure contains the following basic objects: A 3D orthogonal coordinate system with its origin at O. This is also where the camera aperture is located. The three axes of the coordinate system are referred to as X1, X2, X3. Axis X3 is pointing in the viewing direction of the camera and is referred to as the optical axis, principal axis, or principal ray. The plane which is spanned by axes X1 and X2 is the front side of the camera, or principal plane. An image plane, where the 3D world is projected through the aperture of the camera. The image plane is parallel to axes X1 and X2 and is located at distance f from the
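In this geometry, a point (x1, x2, x3) with x3 > 0 maps to image coordinates (−f·x1/x3, −f·x2/x3); the minus sign reflects the inverted image on the real image plane behind the aperture, and is commonly dropped by using a virtual image plane in front of it. A minimal sketch (the function name is a choice of this example):

```python
def project(point, f):
    """Ideal pinhole projection of a 3D point (x1, x2, x3), x3 > 0
    in front of the camera, for focal distance f.  The minus sign
    models the real (inverted) image plane behind the aperture;
    dropping it gives the usual virtual front image plane."""
    x1, x2, x3 = point
    if x3 <= 0:
        raise ValueError("point is not in front of the camera")
    return (-f * x1 / x3, -f * x2 / x3)

print(project((2.0, 1.0, 4.0), f=2.0))  # (-1.0, -0.5)
```

Note the division by x3: doubling a point's distance along the optical axis halves its image coordinates, which is the perspective foreshortening the model captures.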
https://en.wikipedia.org/wiki/Suid%20gammaherpesvirus%203
Suid gammaherpesvirus 3 (SuHV-3) is a species of virus in the genus Macavirus, subfamily Gammaherpesvirinae, family Herpesviridae, and order Herpesvirales.
https://en.wikipedia.org/wiki/World%20Toilet%20Day
World Toilet Day (WTD) is an official United Nations international observance day on 19 November to inspire action to tackle the global sanitation crisis. Worldwide, 4.2 billion people live without "safely managed sanitation" and around 673 million people practice open defecation. Sustainable Development Goal 6 aims to "Ensure availability and sustainable management of water and sanitation for all". In particular, target 6.2 is to "End open defecation and provide access to sanitation and hygiene". When the Sustainable Development Goals Report 2020 was published, United Nations Secretary-General António Guterres said, "Today, Sustainable Development Goal 6 is badly off track" and it "is hindering progress on the 2030 Agenda, the realization of human rights and the achievement of peace and security around the world". World Toilet Day exists to inform, engage and inspire people to take action toward achieving this goal. The UN General Assembly declared World Toilet Day an official UN day in 2013, after Singapore had tabled the resolution (its first resolution before the UN's General Assembly of 193 member states). Prior to that, World Toilet Day had been established unofficially by the World Toilet Organization (a Singapore-based NGO) in 2001. UN-Water is the official convener of World Toilet Day. UN-Water maintains the official World Toilet Day website and chooses a special theme for each year. In 2020 the theme was "Sustainable sanitation and climate change". In 2019 the theme was 'Leaving no one behind', which is the central theme of the Sustainable Development Goals. Themes in previous years include nature-based solutions, wastewater, toilets and jobs, and toilets and nutrition. World Toilet Day is marked by communications campaigns and other activities. Events are planned by UN entities, international organizations, local civil society organizations and volunteers to raise awareness and inspire action. Toilets are important because access to a safe functioning
https://en.wikipedia.org/wiki/Ectomesenchyme
Ectomesenchyme has properties similar to mesenchyme. The origin of the ectomesenchyme is disputed. It is either like the mesenchyme, arising from mesodermic cells, or conversely arising from neural crest cells. The neural crest is a critical group of cells that form in the cranial region during early vertebrate development. Ectomesenchyme plays a critical role in the formation of the hard and soft tissues of the head and neck such as bones, muscles, teeth and, notably, the pharyngeal arches.
https://en.wikipedia.org/wiki/J-structure
In mathematics, a J-structure is an algebraic structure over a field related to a Jordan algebra. The concept was introduced by T. A. Springer to develop a theory of Jordan algebras using linear algebraic groups and axioms taking the Jordan inversion as basic operation and Hua's identity as a basic relation. There is a classification of simple J-structures deriving from the classification of semisimple algebraic groups. Over fields of characteristic not equal to 2, the theory of J-structures is essentially the same as that of Jordan algebras. Definition Let V be a finite-dimensional vector space over a field K and j a rational map from V to itself, expressible in the form n/N with n a polynomial map from V to itself and N a polynomial in K[V]. Let H be the subset of GL(V) × GL(V) containing the pairs (g,h) such that g∘j = j∘h: it is a closed subgroup of the product, and the projection onto the first factor, the set of g which occur, is the structure group of j, denoted G(j). A J-structure is a triple (V,j,e) where V is a vector space over K, j is a birational map from V to itself and e is a non-zero element of V satisfying the following conditions. j is a homogeneous birational involution of degree −1 j is regular at e and j(e) = e if j is regular at x, e + x and e + j(x) then j(e + x) + j(e + j(x)) = e (Hua's identity) the orbit G e of e under the structure group G = G(j) is a Zariski open subset of V. The norm associated to a J-structure (V,j,e) is the numerator N of j, normalised so that N(e) = 1. The degree of the J-structure is the degree of N as a homogeneous polynomial map. The quadratic map of the structure is a map P from V to End(V) defined in terms of the differential dj at an invertible x: we put P(x) = −((dj)ₓ)⁻¹. The quadratic map turns out to be a quadratic polynomial map on V. The subgroup of the structure group G generated by the invertible quadratic maps is the inner structure group of the J-structure. It is a closed connected normal subgroup. J-structures from quadratic forms Let K have characteristic not
https://en.wikipedia.org/wiki/Code%20sanitizer
A code sanitizer is a programming tool that detects bugs in the form of undefined or suspicious behavior using compiler-inserted instrumentation code that performs checks at runtime. The class of tools was first introduced by Google's AddressSanitizer (or ASan), released in 2012, which uses directly mapped shadow memory to detect memory corruption such as buffer overflows or accesses to a dangling pointer (use-after-free). AddressSanitizer Google's ASan, introduced in 2012, uses a shadow memory scheme to detect memory bugs. It is available in: Clang (starting from version 3.1) GCC (starting from version 4.8) Xcode (starting from version 7.0) MSVC (widely available starting from version 16.9). On average, the instrumentation increases processing time by about 73% and memory usage by 240%. There is a hardware-accelerated ASan called HWAsan available for AArch64 and (in a limited fashion) x86_64. AddressSanitizer does not detect any uninitialized memory reads (but this is detected by MemorySanitizer), and only detects some use-after-return bugs. It is also not capable of detecting all arbitrary memory corruption bugs, nor all arbitrary write bugs due to integer underflows/overflows (when the integer with undefined behavior is used to calculate memory address offsets). Adjacent buffers in structs and classes are not protected from overflow, in part to prevent breaking backwards compatibility. KernelAddressSanitizer The KernelAddressSanitizer (KASan) detects dynamic memory errors in the Linux kernel. Kernel instrumentation requires a special feature in the compiler supplying the -fsanitize=kernel-address command line option, since kernels do not use the same address space as normal programs. Other sanitizers Google also produced LeakSanitizer (LSan, memory leaks), ThreadSanitizer (TSan, data races and deadlocks), MemorySanitizer (MSan, uninitialized memory), and UndefinedBehaviorSanitizer (UBSan, undefined behaviors, with fine-grained control). These tools are generally available in Clang/LL
https://en.wikipedia.org/wiki/GAI%20%28Arabidopsis%20thaliana%20gene%29
GAI or Gibberellic-Acid Insensitive is a gene in Arabidopsis thaliana which is involved in regulation of plant growth. GAI represses the pathway of gibberellin-sensitive plant growth. It does this by way of its conserved DELLA motif.
https://en.wikipedia.org/wiki/LAN%20eXtensions%20for%20Instrumentation
LAN eXtensions for Instrumentation (LXI) is a standard developed by the LXI Consortium, a consortium that maintains the LXI specification and promotes the LXI Standard. The LXI standard defines the communication protocols for instrumentation and data acquisition systems using Ethernet. Ethernet is a ubiquitous communication standard providing a versatile interface; the LXI standard describes how to use the Ethernet standards for test and measurement applications in a way that promotes simple interoperability between instruments. The LXI Consortium ensures LXI-compliant instrumentation developed by various vendors works together with no communication or setup issues. The LXI Consortium ensures that the LXI standard complements other test and measurement control systems, such as GPIB and PXI systems. Overview Proposed in 2005 by Keysight (formerly called Agilent Technologies) and VTI Instruments (formerly called VXI Technology and now part of Ametek), the LXI standard adapts the Ethernet and World Wide Web standards and applies them to test and measurement applications. The standard defines how existing standards should be used in instrumentation applications to provide a consistent feel and ensure compatibility between vendors' equipment. The LXI standard does not define a mechanical format, allowing LXI solutions to take any physical form deemed suitable for products in their intended market. LXI products can be modular, rack mounted, bench mounted or take any other physical form. LXI supports synthetic instruments and peer-to-peer networking, providing a number of unique capabilities to the test engineer. LXI products may have no front panel or display, or they may include embedded keyboards and displays. Connections to the DUT are permitted on the front or the rear to suit market demand; most devices provide DUT connectivity on the front panel, with Ethernet and power connections on the rear panel. Use of Ethernet allows the simple construction of
https://en.wikipedia.org/wiki/Zuiyo-maru%20carcass
The Zuiyō Maru carcass was a corpse caught by the Japanese fishing trawler Zuiyō Maru off the coast of New Zealand in 1977. The carcass's peculiar appearance led to speculation that it might be the remains of a sea serpent or prehistoric plesiosaur. Although several scientists insisted it was "not a fish, whale, or any other mammal", analysis of amino acids in the corpse's muscle tissue later indicated it was most likely the carcass of a basking shark. Decomposing basking shark carcasses lose most of the lower head area and the dorsal and caudal fins first, making them resemble a plesiosaur. Discovery On April 25, 1977, the Japanese trawler Zuiyō Maru, sailing east of Christchurch, New Zealand, caught a strange, unknown creature in the trawl. The crew was convinced it was an unidentified animal, but despite the potential biological significance of the curious discovery, the captain, Akira Tanaka, decided to dump the carcass into the ocean again so as not to risk spoiling the caught fish. However, before that, some photos and sketches were taken of the creature, nicknamed "Nessie" by the crew, measurements were taken, and some samples of skeleton, skin and fins were collected for further analysis by experts in Japan. The discovery resulted in immense commotion and a "plesiosaur-craze" in Japan, and the shipping company ordered all its boats to try to relocate the dumped corpse, but with no apparent success. Description The foul-smelling, decomposing corpse reportedly weighed 1,800 kg and was about 10 m long. According to the crew, the creature had a 1.5-m-long neck, four large, reddish fins, and a tail about 2.0 m long. It seemed to lack a dorsal fin on inspection, but one was visible from photographs. No internal organs remained as the chest cavity and gut had opened up from decay, but flesh and fat were somewhat intact. Proposed explanations Plesiosaur Professor Tokio Shikama from Yokohama National University was convinced the remains were of a supposedly extinct plesiosaur. Dr. Fujiro
https://en.wikipedia.org/wiki/%C3%89mile%20Cotton
Émile Clément Cotton (5 February 1872 – 14 March 1950) was a professor of mathematics at the University of Grenoble. His PhD thesis studied differential geometry in three dimensions, with the introduction of the Cotton tensor. He held the professorship from 1904 until his 1942 retirement. He was the brother of Aimé Cotton.
https://en.wikipedia.org/wiki/Skin%20test
A skin test is a medical test in which a substance is injected into the skin. Examples Casoni test Corneometry Dick test Fernandez reaction Frei test Hair perforation test Kveim test Leishmanin skin test Lepromin Patch test Schick test Skin allergy test Sweat diagnostics Sweat test Tine test Transepidermal water loss Trichoscopy
https://en.wikipedia.org/wiki/United%20States%20Board%20on%20Geographic%20Names
The United States Board on Geographic Names (BGN) is a federal body operating under the United States Secretary of the Interior. The purpose of the board is to establish and maintain uniform usage of geographic names throughout the federal government of the United States. History On January 8, 1890, Thomas Corwin Mendenhall, superintendent of the US Coast and Geodetic Survey Office, wrote to 10 noted geographers "to suggest the organization of a Board made up of representatives from the different Government services interested, to which may be referred any disputed question of geographical orthography." President Benjamin Harrison signed executive order 28 on September 4, 1890, establishing the Board on Geographical Names. "To this Board shall be referred all unsettled questions concerning geographic names. The decisions of the Board are to be accepted [by federal departments] as the standard authority for such matters." The board was given authority to resolve all unsettled questions concerning geographic names. Decisions of the board were accepted as binding by all departments and agencies of the federal government. The board has since undergone several name changes. In 1934, it was transferred to the Department of the Interior. The Advisory Committee on Antarctic Names was established in 1943 as the Special Committee on Antarctic Names (SCAN). In 1963, the Advisory Committee on Undersea Features was started for standardization of names of undersea features. Its present form derives from a 1947 law, Public Law 80-242. Operation The 1969 BGN publication Decisions on Geographic Names in the United States stated the agency's chief purpose as: The board has developed principles, policies, and procedures governing the use of domestic and foreign geographic names, including underseas. The BGN also deals with names of geographical features in Antarctica via its Advisory Committee on Antarctic Names. The Geographic Names Information System, developed by the BGN in
https://en.wikipedia.org/wiki/Sex%20chromosome
A sex chromosome (also referred to as an allosome, heterotypical chromosome, gonosome, heterochromosome, or idiochromosome) is a chromosome that differs from an ordinary autosome in form, size, and behavior. The human sex chromosomes, a typical pair of mammal allosomes, carry the genes that determine the sex of an individual created in sexual reproduction. Autosomes differ from allosomes because autosomes appear in pairs whose members have the same form but differ from other pairs in a diploid cell, whereas members of an allosome pair may differ from one another and thereby determine sex. Nettie Stevens and Edmund Beecher Wilson both independently discovered sex chromosomes in 1905. However, Stevens is credited for discovering them earlier than Wilson. Differentiation In humans, each cell nucleus contains 23 pairs of chromosomes, a total of 46 chromosomes. The first 22 pairs are called autosomes. Autosomes are homologous chromosomes i.e. chromosomes which contain the same genes (regions of DNA) in the same order along their chromosomal arms. The 23rd pair of chromosomes are called allosomes. These consist of two X chromosomes in most females, and an X chromosome and a Y chromosome in most males. Females therefore have 23 homologous chromosome pairs, while males have 22. The X and Y chromosomes have small regions of homology called pseudoautosomal regions. An X chromosome is always present as the 23rd chromosome in the ovum, while either an X or Y chromosome may be present in an individual sperm. Early in female embryonic development, in cells other than egg cells, one of the X chromosomes is randomly and permanently partially deactivated: In some cells, the X chromosome inherited from the mother deactivates; in other cells, it is the X chromosome inherited from the father. This ensures that both sexes always have exactly one functional copy of an X chromosome in each body cell. The deactivated X chromosome is silenced by repressive heterochromatin that compacts
https://en.wikipedia.org/wiki/Lycoperdon%20molle
Lycoperdon molle, commonly known as the smooth puffball or the soft puffball, is a type of puffball mushroom in the genus Lycoperdon. It was first described scientifically in 1799 by Dutch mycologist Christiaan Hendrik Persoon. The puffball is edible when the internal flesh is still white.
https://en.wikipedia.org/wiki/International%20Physics%20Olympiad
The International Physics Olympiad (IPhO) is an annual physics competition for high school students. It is one of the International Science Olympiads. The first IPhO was held in Warsaw, Poland in 1967. Each national delegation is made up of at most five student competitors plus two leaders, selected on a national level. Observers may also accompany a national team. The students compete as individuals, and must sit for intensive theoretical and laboratory examinations. For their efforts the students can be awarded gold, silver, or bronze medals or an honourable mention. The theoretical examination lasts 5 hours and consists of three questions. Usually these questions involve more than one part. The practical examination may consist of one laboratory examination of five hours, or two, which together take up the full five hours. History The idea of creating the International Physics Olympiad was conceived in Eastern Bloc countries, inspired by the 1959 established International Mathematical Olympiad. Poland seemed to offer the best conditions at the time, and so the first IPhO was held in Warsaw in 1967, organized by Czesław Ścisłowski. Some months prior to the competition, all Central European countries were invited, and the five countries Bulgaria, Czechoslovakia, Hungary, Poland and Romania participated. Each country sent a delegation of three students and one supervisor. Already in this first edition, the competition consisted of two exams, one theoretical and one experimental, and the students went on excursions while their exams were marked. The second IPhO was held in Hungary, with the additional participation of the German Democratic Republic, the Soviet Union and Yugoslavia. Subsequent editions were carried out in the following years in Czechoslovakia, the Soviet Union, Bulgaria and Romania. At that sixth IPhO in 1972, France joined the competition as the first Western country and Cuba as the first non-European country. With growing size and organiza
https://en.wikipedia.org/wiki/Mouse%20brain
The mouse brain refers to the brain of Mus musculus. Various brain atlases exist. For reasons of reproducibility, genetically characterized, stable strains like C57BL/6 were chosen to produce high-resolution images and databases. Well-known online resources include: Allen Brain Atlas Mouse Brain Library High resolution mouse brain atlas BrainMaps High-Resolution Brain Maps and Brain Atlases of Mus musculus Despite superficial differences, especially in size and weight, the mouse brain and its function can serve as a powerful animal model for the study of human brain diseases or mental disorders (see e.g. Reeler, Chakragati mouse). This is because the genes responsible for building and operating both mouse and human brain are 90% identical. Transgenic mouse lines also allow neuroscientists to specifically target the labeling of certain cell types to probe the neural basis of fundamental processes. Anatomy The cerebral cortex of a mouse has around 8–14 million neurons, while the human cerebral cortex has more than 10–15 billion. The olfactory bulb takes up about 2% of the mouse brain by volume, in contrast to about 0.01% of the human brain. Development Mouse brain development timeline See also List of animals by number of neurons
https://en.wikipedia.org/wiki/List%20of%20plant%20hybrids
This is a list of plant hybrids created intentionally or by chance and exploited commercially in agriculture or horticulture. The hybridization event mechanism is documented where known, along with the authorities who described it. Hybrids
https://en.wikipedia.org/wiki/Semi-continuity
In mathematical analysis, semicontinuity (or semi-continuity) is a property of extended real-valued functions that is weaker than continuity. An extended real-valued function f is upper (respectively, lower) semicontinuous at a point x0 if, roughly speaking, the function values for arguments near x0 are not much higher (respectively, lower) than f(x0). A function is continuous if and only if it is both upper and lower semicontinuous. If we take a continuous function and increase its value at a certain point x0 to f(x0) + c for some c > 0, then the result is upper semicontinuous; if we decrease its value to f(x0) − c then the result is lower semicontinuous. The notion of upper and lower semicontinuous function was first introduced and studied by René Baire in his thesis in 1899. Definitions Assume throughout that X is a topological space and f : X → [−∞, ∞] is a function with values in the extended real numbers [−∞, ∞] = ℝ ∪ {−∞, ∞}. Upper semicontinuity A function f is called upper semicontinuous at a point x0 if for every real y > f(x0) there exists a neighborhood U of x0 such that f(x) < y for all x ∈ U. Equivalently, f is upper semicontinuous at x0 if and only if lim sup_{x → x0} f(x) ≤ f(x0), where lim sup is the limit superior of the function f at the point x0. A function f is called upper semicontinuous if it satisfies any of the following equivalent conditions: (1) The function is upper semicontinuous at every point of its domain. (2) All sets f⁻¹([−∞, y)) = {x ∈ X : f(x) < y} with y ∈ ℝ are open in X, where [−∞, y) = {t ∈ [−∞, ∞] : t < y}. (3) All superlevel sets {x ∈ X : f(x) ≥ y} with y ∈ ℝ are closed in X. (4) The hypograph {(x, t) ∈ X × ℝ : t ≤ f(x)} is closed in X × ℝ. (5) The function f is continuous when the codomain [−∞, ∞] is given the left order topology. This is just a restatement of condition (2) since the left order topology is generated by all the intervals [−∞, y). Lower semicontinuity A function f is called lower semicontinuous at a point x0 if for every real y < f(x0) there exists a neighborhood U of x0 such that f(x) > y for all x ∈ U. Equivalently, f is lower semicontinuous at x0 if and only if lim inf_{x → x0} f(x) ≥ f(x0), where lim inf is the limit inferior of the function f at the point x0. A function is called lower semicontinuous if it satisfies any of the follo
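As a concrete numerical illustration of these definitions, the floor function is upper semicontinuous everywhere but fails to be lower semicontinuous at the integers. The sketch below is only illustrative: the helper `local_sup_inf` and its dense-sampling scheme are our own stand-ins for the actual lim sup / lim inf, not a rigorous computation.

```python
import math

def local_sup_inf(f, x0, radius):
    """Approximate sup and inf of f over a small punctured neighborhood of x0
    by dense sampling (a numeric illustration, not a proof)."""
    xs = [x0 + radius * (k / 1000.0) for k in range(-1000, 1001) if k != 0]
    vals = [f(x) for x in xs]
    return max(vals), min(vals)

# floor is upper semicontinuous at x0 = 1 but not lower semicontinuous there:
# approaching 1 from below gives values of 0, so liminf < f(1) = 1,
# while no nearby value exceeds f(1), so limsup <= f(1).
f = math.floor
x0 = 1.0
sup_near, inf_near = local_sup_inf(f, x0, 1e-6)

upper_sc = max(sup_near, f(x0)) <= f(x0)   # limsup_{x -> x0} f(x) <= f(x0)
lower_sc = min(inf_near, f(x0)) >= f(x0)   # liminf_{x -> x0} f(x) >= f(x0)
print(upper_sc, lower_sc)  # True False
```

The same check at a non-integer point (say x0 = 0.5) reports both conditions true, consistent with floor being continuous there.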
https://en.wikipedia.org/wiki/Proteins%20%28journal%29
Proteins: Structure, Function, and Bioinformatics is a monthly peer-reviewed scientific journal published by John Wiley & Sons, which was established in 1986 by Cyrus Levinthal. The journal covers research on all aspects of protein biochemistry, including computation, function, structure, design, and genetics. The editor-in-chief is Nikolay Dokholyan (Penn State College of Medicine). Publishing formats are original research reports, short communications, prediction reports, invited reviews, and topic proposals. In addition, Proteins includes a section entitled "Section Notes", describing novel protein structures. Abstracting and indexing Proteins is abstracted and indexed in: According to the Journal Citation Reports, the journal has a 2020 impact factor of 3.756.
https://en.wikipedia.org/wiki/Dynamic-maturational%20model%20of%20attachment%20and%20adaptation
The dynamic-maturational model of attachment and adaptation (DMM) is a biopsychosocial model describing the effect attachment relationships can have on human development and functioning. It is especially focused on the effects of relationships between children and parents and between reproductive couples. It developed initially from attachment theory as developed by John Bowlby and Mary Ainsworth, and incorporated many other theories into a comprehensive model of adaptation to life's many dangers. The DMM was initially created by developmental psychologist Patricia McKinsey Crittenden and her colleagues including David DiLalla, Angelika Claussen, Andrea Landini, Steve Farnfield, and Susan Spieker. A main tenet of the DMM is that exposure to danger drives neural development and adaptation to promote survival. Danger includes relationship danger. In DMM-attachment theory, when a person needs protection or comfort from danger from a person with whom they have a protective relationship, the nature of the relationship generates relation-specific self-protective strategies. These are patterns of behavior which include the underlying neural processing. The DMM protective strategies describe aspects of the parent–child relationship, romantic relationships, and to a degree, relationships between patients/clients and long-term helping professionals. History Out of the development of attachment theory, British psychiatrist John Bowlby coalesced a coherent theory and is generally credited with creating the foundation for modern attachment theory. Mary Ainsworth, an American-Canadian psychologist, started working with Bowlby in 1950. Ainsworth completed her doctoral thesis in 1940 under William Blatz, who had developed security theory, a precursor to attachment theory. Blatz believed the core nature of the relationship between a (to use his colloquial terms) mother and child involved the development of a trusted and secure relationship to function as a safe base for a child's
https://en.wikipedia.org/wiki/Test%20compression
Test compression is a technique used to reduce the time and cost of testing integrated circuits. The first ICs were tested with test vectors created by hand. It proved very difficult to get good coverage of potential faults, so design for testability (DFT) based on scan and automatic test pattern generation (ATPG) were developed to explicitly test each gate and path in a design. These techniques were very successful at creating high-quality vectors for manufacturing test, with excellent test coverage. However, as chips got bigger and more complex, the ratio of logic to be tested per pin increased dramatically, and the volume of scan test data started causing a significant increase in test time and required tester memory. This raised the cost of testing. Test compression was developed to help address this problem. When an ATPG tool generates a test for a fault, or a set of faults, only a small percentage of scan cells need to take specific values. The rest of the scan cells are "don't care" and are usually filled with random values. Loading and unloading these vectors is not a very efficient use of tester time. Test compression takes advantage of the small number of significant values to reduce test data and test time. In general, the idea is to modify the design to increase the number of internal scan chains, each of shorter length. These chains are then driven by an on-chip decompressor, usually designed to allow continuous-flow decompression, where the internal scan chains are loaded as the data is delivered to the decompressor. Many different decompression methods can be used. One common choice is a linear finite state machine, where the compressed stimuli are computed by solving linear equations corresponding to internal scan cells with specified positions in partially specified test patterns. Experimental results show that for industrial circuits with test vectors and responses with very low fill rates, ranging from 3% to 0.2%, the test compression
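The linear-decompressor idea can be sketched in a few lines of Python. Everything here is an illustrative assumption: the LFSR taps, chain length, and care-bit positions are made up, and the brute-force seed search stands in for the linear-equation solving over GF(2) that production tools actually perform.

```python
from itertools import product

def lfsr_expand(seed_bits, taps, length):
    """Expand a short seed through a Fibonacci LFSR into a long scan-chain
    fill; an illustrative stand-in for an on-chip decompressor."""
    state = list(seed_bits)
    out = []
    for _ in range(length):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

# A test cube with mostly don't-cares (None) and a few specified care bits.
test_cube = [None] * 16
test_cube[2], test_cube[7], test_cube[11] = 1, 0, 1   # 3 care bits out of 16

# Real ATPG compression solves linear equations over GF(2) for the seed;
# for an 8-bit seed we can simply search all 256 candidates.
taps = (0, 2, 3, 4)
seed = next(s for s in product((0, 1), repeat=8)
            if all(c is None or b == c
                   for b, c in zip(lfsr_expand(s, taps, 16), test_cube)))

expanded = lfsr_expand(seed, taps, 16)
print(len(seed), "seed bits encode", len(test_cube), "scan cells")
```

Because only the few care bits constrain the seed, 8 compressed bits suffice to drive a 16-cell chain here; at the sub-1% fill rates quoted above, the same mechanism yields much larger reductions.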
https://en.wikipedia.org/wiki/Lisa%20Sauermann
Lisa Sauermann (born 25 September 1992) is a mathematician from Germany known for her performance in the International Mathematical Olympiad, where in 2011 she had the single highest (and perfect) score. She won four gold medals (2008–2011) and one silver medal (2007) at the olympiad, representing Germany. Sauermann attended Martin-Andersen-Nexö-Gymnasium Dresden when she was in 12th grade. She won the Franz Ludwig Gehe Prize in 2011 and the gold medal in age group III, the 11th–12th grade competition. As a result, she won a trip to the Royal Academy of Sciences in Stockholm. To achieve this, she presented a new mathematical theorem with a proof in a work entitled "Forests with Hypergraphs". In 2011 she began studying mathematics at the University of Bonn. In 2014, she completed her bachelor's thesis on algebraic geometry under Michael Rapoport. She became a graduate student studying with Jacob Fox at Stanford University, where she obtained her PhD in 2019, receiving two prizes for her dissertation titled "Modern Methods in Extremal Combinatorics". She currently works as an assistant professor at MIT, where she lists her research interests as "extremal and probabilistic combinatorics". In 2022, she was awarded a Sloan fellowship. Her sister, Anne, two years her junior, was a successful participant in math and science Olympiads at the national level. Selected publications
https://en.wikipedia.org/wiki/KUKA%20Robot%20Language
The KUKA Robot Language, also known as KRL, is a proprietary programming language similar to Pascal and used to control KUKA robots. Features Any KRL program consists of two different files with the same name: a permanent data file, with the extension .dat, and a movement command file, with the extension .src. KRL has four basic data types: Users can also create custom data types using enumeration. Enumerations and basic data types can be used to create arrays and structures. Motion commands support several types of structures as data formats: FRAME {X 10, Y 0, Z 500, A 0, B 0, C 0} POS {X 10, Y 0, Z 500, A 0, B 0, C 0, S 6, T 21} E3POS {X 10, Y 0, Z 500, A 0, B 0, C 0, S 6, T 21, E1 0, E2 0, E3 0} E6POS {X 10, Y 0, Z 500, A 0, B 0, C 0, S 6, T 21, E1 0, E2 0, E3 0, E4 0, E5 0, E6 0} AXIS {A1 0, A2 -90, A3 90, A4 0, A5 0, A6 0} etc. Robot joints are A1–A6. External axis joints are E1–E6. A FRAME value is sufficient to specify TCP location and orientation, but to also determine a unique robot arm pose, additional information is required: S and T (Status and Turn), which are collections of flags stored as integers. See also RAPID
https://en.wikipedia.org/wiki/Poole%E2%80%93Frenkel%20effect
In solid-state physics, the Poole–Frenkel effect (also known as Frenkel–Poole emission) is a model describing the mechanism of trap-assisted electron transport in an electrical insulator. It is named after Yakov Frenkel, who published on it in 1938, extending the theory previously developed by H. H. Poole. Electrons can move slowly through an insulator by the following process. The electrons are generally trapped in localized states (loosely speaking, they are "stuck" to a single atom, and not free to move around the crystal). Occasionally, random thermal fluctuations will give an electron enough energy to leave its localized state, and move to the conduction band. Once there, the electron can move through the crystal, for a brief amount of time, before relaxing into another localized state (in other words, "sticking" to a different atom). The Poole–Frenkel effect describes how, in a large electric field, the electron does not need as much thermal energy to be promoted into the conduction band (because part of this energy comes from being pulled by the electric field), so it does not need as large a thermal fluctuation and will be able to move more frequently. On theoretical grounds, the Poole–Frenkel effect is comparable to the Schottky effect, which is the lowering of the metal–insulator energy barrier due to the electrostatic interaction with the electric field at a metal–insulator interface. However, the conductivity arising from the Poole–Frenkel effect is detected in the presence of bulk-limited conduction (when the limiting conduction process occurs in the bulk of a material), while the Schottky current is observed when the conductivity is contact-limited (when the limiting conduction mechanism occurs at the metal–insulator interface). Poole–Frenkel equation The electrical conductivity of dielectrics and semiconductors in the presence of high electric fields (more than for insulators and up to for semiconductors) increases approximately as described by the Pool
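The field-enhanced emission described above is commonly summarized by the following one-dimensional expression, stated here from the standard literature as a supplement to the text:

```latex
J \;\propto\; E \exp\!\left( \frac{-q\left(\phi_B - \sqrt{qE/(\pi\varepsilon)}\right)}{k_B T} \right)
```

where J is the current density, q the elementary charge, E the applied electric field, \phi_B the zero-field barrier height (in volts), \varepsilon the dynamic permittivity, k_B Boltzmann's constant, and T the temperature. Setting E = 0 recovers simple thermally activated (Arrhenius) conduction, and the \sqrt{E} term in the exponent is the barrier lowering responsible for the enhanced conductivity.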
https://en.wikipedia.org/wiki/List%20of%20U.S.%20state%20insects
State insects are designated by 48 individual states of the fifty United States. Some states have more than one designated insect, or have multiple categories (e.g., state insect and state butterfly, etc.). Iowa and Michigan are the two states without a designated state insect. More than half of the insects chosen are not native to North America, because of the inclusion of three European species (European honey bee, European mantis, and 7-spotted ladybird), each having been chosen by multiple states. Table See also Lists of United States state insignia
https://en.wikipedia.org/wiki/Quasisymmetric%20function
In algebra and in particular in algebraic combinatorics, a quasisymmetric function is any element in the ring of quasisymmetric functions, which is in turn a subring of the formal power series ring with a countable number of variables. This ring generalizes the ring of symmetric functions. This ring can be realized as a specific limit of the rings of quasisymmetric polynomials in n variables, as n goes to infinity. This ring serves as a universal structure in which relations between quasisymmetric polynomials can be expressed in a way independent of the number n of variables (but its elements are neither polynomials nor functions). Definitions The ring of quasisymmetric functions, denoted QSym, can be defined over any commutative ring R such as the integers. Quasisymmetric functions are power series of bounded degree in variables x1, x2, x3, … with coefficients in R, which are shift invariant in the sense that the coefficient of the monomial x1^α1 x2^α2 ⋯ xk^αk is equal to the coefficient of the monomial x_{i1}^α1 x_{i2}^α2 ⋯ x_{ik}^αk for any strictly increasing sequence of positive integers i1 < i2 < ⋯ < ik indexing the variables and any positive integer sequence (α1, α2, …, αk) of exponents. Much of the study of quasisymmetric functions is based on that of symmetric functions. A quasisymmetric function in finitely many variables is a quasisymmetric polynomial. Both symmetric and quasisymmetric polynomials may be characterized in terms of actions of the symmetric group Sn on a polynomial ring in n variables x1, …, xn. One such action of Sn permutes variables, changing a polynomial by iteratively swapping pairs of variables having consecutive indices. Those polynomials unchanged by all such swaps form the subring of symmetric polynomials. A second action of Sn conditionally permutes variables, changing a polynomial by swapping pairs of variables except in monomials containing both variables. Those polynomials unchanged by all such conditional swaps form the subring of quasisymmetric polynomials. One quasisymmetric function in four variables is the polynomi
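The defining shift invariance can be checked directly on a small example. The sketch below (the helper name is our own) builds the monomial quasisymmetric function M_(1,2) in four variables, i.e. the sum of x_i x_j^2 over i < j, as a set of exponent vectors with coefficient 1, and verifies that it is quasisymmetric but not symmetric:

```python
from itertools import combinations

def monomial_qsym(composition, n):
    """Monomial quasisymmetric function M_alpha in n variables, represented
    as the set of exponent vectors that appear with coefficient 1."""
    k = len(composition)
    monos = set()
    for idx in combinations(range(n), k):  # strictly increasing variable indices
        expo = [0] * n
        for pos, a in zip(idx, composition):
            expo[pos] = a
        monos.add(tuple(expo))
    return monos

# M_(1,2) in 4 variables: x_i * x_j^2 for all i < j (six monomials).
m12 = monomial_qsym((1, 2), 4)

# Shift invariance: x1*x2^2 and x2*x4^2 both appear with coefficient 1.
assert (1, 2, 0, 0) in m12 and (0, 1, 0, 2) in m12
# Not symmetric: the swapped exponent pattern x1^2*x2 never occurs.
assert (2, 1, 0, 0) not in m12
print(sorted(m12))
```

Applying the ordinary permuting action of S4 to m12 would have to produce (2, 1, 0, 0) from (1, 2, 0, 0), so the set is not closed under it; only the conditional action fixes it.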
https://en.wikipedia.org/wiki/Nevada%20statistical%20areas
The U.S. state of Nevada currently has 11 statistical areas that have been delineated by the Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated two combined statistical areas, three metropolitan statistical areas, and six micropolitan statistical areas in Nevada. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 11 United States statistical areas, 16 counties, and 1 independent city of the State of Nevada with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to the 2020 US Census. The core based statistical area (CBSA) as designated by the OMB.
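The OMB population thresholds quoted above can be summarized in a small classifier. This is purely illustrative (an assumed helper, not an official OMB tool; real delineation also weighs commuting ties, not population alone):

```python
def classify_cbsa(core_population):
    """Classify a core-based statistical area by the OMB population
    thresholds: >= 50,000 is metropolitan, 10,000-49,999 is micropolitan."""
    if core_population >= 50_000:
        return "metropolitan statistical area"
    if core_population >= 10_000:
        return "micropolitan statistical area"
    return "not a core-based statistical area"

print(classify_cbsa(650_000))  # metropolitan statistical area
print(classify_cbsa(22_000))   # micropolitan statistical area
print(classify_cbsa(4_000))    # not a core-based statistical area
```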
https://en.wikipedia.org/wiki/Per%20Enflo
Per H. Enflo (born 20 May 1944) is a Swedish mathematician working primarily in functional analysis, a field in which he solved problems that had been considered fundamental. Three of these problems had been open for more than forty years: the basis problem, the approximation problem, and later the invariant subspace problem for Banach spaces. In solving these problems, Enflo developed new techniques which were then used by other researchers in functional analysis and operator theory for years. Some of Enflo's research has been important also in other mathematical fields, such as number theory, and in computer science, especially computer algebra and approximation algorithms. Enflo works at Kent State University, where he holds the title of University Professor. Enflo previously held positions at the Miller Institute for Basic Research in Science at the University of California, Berkeley, Stanford University, École Polytechnique (Paris) and the Royal Institute of Technology, Stockholm. Enflo is also a concert pianist. Enflo's contributions to functional analysis and operator theory In mathematics, functional analysis is concerned with the study of vector spaces and operators acting upon them. It has its historical roots in the study of function spaces, in particular transformations of functions, such as the Fourier transform, as well as in the study of differential and integral equations. In functional analysis, an important class of vector spaces consists of the complete normed vector spaces over the real or complex numbers, which are called Banach spaces. An important example of a Banach space is a Hilbert space, where the norm arises from an inner product. Hilbert spaces are of fundamental importance in many areas, including the mathematical formulation of quantum mechanics, stochastic processes, and time-series analysis. Besides studying spaces of functions, functional analysis also studies the continuous linear operators on spaces of functions.
https://en.wikipedia.org/wiki/Online%20Armor%20Personal%20Firewall
Online Armor Personal Firewall was a firewall originally developed by the Australian company Tall Emu, until the program was sold to Emsi Software GmbH (now Emsisoft). The program provides protection on a Microsoft Windows operating system from both inbound and outbound attacks. There are three editions of this product: Online Armor Free is freeware but is licensed for personal use only and has a limited featureset. Online Armor Premium is a more comprehensive commercial firewall that includes anti-phishing and anti-spam capabilities. Overview In an independent proactive security challenge test performed by matousec.com, Online Armor Premium received a score of 99%, surpassing more well-known firewalls such as ZoneAlarm and Kaspersky Internet Security. A well-known vulnerability profiling site and company, Secunia, had not found any vulnerabilities as of March 2008 in the software, though Matousec reported a weakness on 25 March 2008 that has been repaired in the latest version. Online Armor has gained both negative and positive feedback, with some users reporting serious compatibility problems with certain programs, such as F-Secure and Ad Muncher, as noted on the product website, but also receiving praise for its free online support and for its swift response to problems. End of Support On 31 March 2015 Emsisoft announced that they had discontinued selling new licenses for Online Armor, that it would only be possible to activate new license keys until the end of May 2015, and that support for Online Armor would officially end after 31 March 2016. See also Internet Security Comparison of antivirus software Comparison of firewalls
https://en.wikipedia.org/wiki/Collabora%20Online
Collabora Online is an open source online office suite built on LibreOffice Technology, enabling web-based collaborative real-time editing of word processing documents, spreadsheets, presentations, and vector graphics. Optional apps are available for desktops, laptops, tablets, smartphones and Chromebooks. Collabora Online is developed by Collabora Productivity, a division of Collabora, which is a commercial partner of LibreOffice's parent organisation The Document Foundation (TDF). The TDF states that a majority of the LibreOffice software development is done by its commercial partners, Collabora, Red Hat, CIB, and Allotropia. Features Collabora Online can be accessed from modern web browsers without plug-ins or add-ons; documents, spreadsheets, presentations and vector graphics can be edited collaboratively. Collaborative functions include comments which other users can respond to, document version history which enables the comparison and restoring of documents, etc. Collaborative functions may also include integrated video calls or chat whilst collaboratively editing documents; features like these are possible with integrations with enterprise cloud solutions such as Nextcloud, ownCloud, Seafile, EGroupware and others. Collabora Online can be integrated with any application. Device support Client apps are not required to access Collabora Online, which only needs a web browser; however, optional apps are available for most devices that run the following operating systems: Android, ChromeOS, iOS, iPadOS, Windows, macOS and Linux. These optional apps share the same LibreOffice Technology core software with Collabora Online, which results in document fidelity between them. Development work therefore normally affects the source code of Collabora Online and all of the apps simultaneously. The apps work offline without the need for a connection to a local server or the cloud, while support for integrations with cloud storage services is still possible. The mobi
https://en.wikipedia.org/wiki/M.U.L.E.
M.U.L.E. is a 1983 multiplayer video game written for the Atari 8-bit family of home computers by Ozark Softscape. Designer Danielle Bunten Berry (credited as Dan Bunten) took advantage of the four joystick ports of the Atari 400 and 800 to allow four-player simultaneous play. M.U.L.E. was one of the first five games published in 1983 by new company Electronic Arts, alongside Axis Assassin, Archon: The Light and the Dark, Worms?, and Hard Hat Mack. Primarily a turn-based strategy game, it incorporates real-time elements where players compete directly as well as aspects that simulate economics. The game was ported to the Commodore 64, Nintendo Entertainment System, and IBM PC (as a self-booting disk). Japanese versions also exist for the PC-88, Sharp X1, and MSX2 computers. Like the subsequent models of the Atari 8-bit family, none of these systems allow four players with separate joysticks. The Commodore 64 version lets four players share joysticks, with two players using the keyboard during action portions. Gameplay Set on the fictional planet Irata (Atari backwards), the game is an exercise in supply and demand economics involving competition among four players, with computer opponents automatically filling in for any missing players. Players choose the race of their colonist, which has advantages and disadvantages that can be paired to their respective strategies. To win, players not only compete against each other to amass the largest amount of wealth, but must also cooperate for the survival of the colony. Central to the game is the acquisition and use of Multiple Use Labor Elements, or M.U.L.E.s, to develop and harvest resources from the player's real estate. Depending on how it is outfitted, a M.U.L.E. can be configured to harvest Energy, Food, Smithore (from which M.U.L.E.s are constructed), and Crystite (a valuable mineral available only at the "Tournament" level). Players must balance supply and demand of these elements, buying what they need and se
https://en.wikipedia.org/wiki/Almost%20all
In mathematics, the term "almost all" means "all but a negligible quantity". More precisely, if X is a set, "almost all elements of X" means "all elements of X but those in a negligible subset of X". The meaning of "negligible" depends on the mathematical context; for instance, it can mean finite, countable, or null. In contrast, "almost no" means "a negligible quantity"; that is, "almost no elements of X" means "a negligible quantity of elements of X". Meanings in different areas of mathematics Prevalent meaning Throughout mathematics, "almost all" is sometimes used to mean "all (elements of an infinite set) except for finitely many". This use occurs in philosophy as well. Similarly, "almost all" can mean "all (elements of an uncountable set) except for countably many". Examples: Almost all positive integers are greater than 10^12. Almost all prime numbers are odd (2 is the only exception). Almost all polyhedra are irregular (as there are only nine exceptions: the five Platonic solids and the four Kepler–Poinsot polyhedra). If P is a nonzero polynomial, then P(x) ≠ 0 for almost all x (if not all x). Meaning in measure theory When speaking about the reals, sometimes "almost all" can mean "all reals except for a null set". Similarly, if S is some set of reals, "almost all numbers in S" can mean "all numbers in S except for those in a null set". The real line can be thought of as a one-dimensional Euclidean space. In the more general case of an n-dimensional space (where n is a positive integer), these definitions can be generalised to "all points except for those in a null set" or "all points in S except for those in a null set" (this time, S is a set of points in the space). Even more generally, "almost all" is sometimes used in the sense of "almost everywhere" in measure theory, or in the closely related sense of "almost surely" in probability theory. Examples: In a measure space, such as the real line, countable sets are null. The set of rational numbers is
https://en.wikipedia.org/wiki/Tensor%20%28intrinsic%20definition%29
In mathematics, the modern component-free approach to the theory of a tensor views a tensor as an abstract object, expressing some definite type of multilinear concept. Their properties can be derived from their definitions, as linear maps or more generally; and the rules for manipulations of tensors arise as an extension of linear algebra to multilinear algebra. In differential geometry, an intrinsic geometric statement may be described by a tensor field on a manifold, and then doesn't need to make reference to coordinates at all. The same is true in general relativity, of tensor fields describing a physical property. The component-free approach is also used extensively in abstract algebra and homological algebra, where tensors arise naturally. Note: This article assumes an understanding of the tensor product of vector spaces without chosen bases. An overview of the subject can be found in the main tensor article. Definition via tensor products of vector spaces Given a finite set of vector spaces over a common field F, one may form their tensor product, an element of which is termed a tensor. A tensor on the vector space V is then defined to be an element of (i.e., a vector in) a vector space of the form V ⊗ ⋯ ⊗ V ⊗ V∗ ⊗ ⋯ ⊗ V∗, where V∗ is the dual space of V. If there are m copies of V and n copies of V∗ in our product, the tensor is said to be of type (m, n), contravariant of order m and covariant of order n, and of total order m + n. The tensors of order zero are just the scalars (elements of the field F), those of contravariant order 1 are the vectors in V, and those of covariant order 1 are the one-forms in V∗ (for this reason, the elements of the last two spaces are often called the contravariant and covariant vectors). The space of all tensors of type (m, n) is denoted T^m_n(V). Example 1. The space of type (1, 1) tensors, T^1_1(V) = V ⊗ V∗, is isomorphic in a natural way to the space of linear transformations from V to V. Example 2. A bilinear form on a real vector space V corresponds in a natural way to a type (0, 2) tensor
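The isomorphism between type (1, 1) tensors and linear transformations can be made explicit with the standard construction (assuming V is finite-dimensional): a simple tensor v ⊗ φ, with v ∈ V and φ ∈ V∗, acts on vectors by

```latex
(v \otimes \varphi)(w) \;=\; \varphi(w)\, v, \qquad w \in V,
```

and this assignment extends linearly to all of V ⊗ V∗. Conversely, writing a linear map in terms of a basis of V and its dual basis expresses it as a sum of such simple tensors, which is one standard way to see that V ⊗ V∗ and the space of linear transformations from V to V are naturally isomorphic.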
https://en.wikipedia.org/wiki/Altyn%20Dala%20Conservation%20Initiative
Altyn Dala Conservation Initiative (ADCI) is the Government of Kazakhstan-led initiative to support the conservation of steppe and semi-desert ecosystems of Kazakhstan. Background ADCI is jointly initiated by Association for the Conservation of the Biodiversity of Kazakhstan (ACBK), the Committee of Forestry and Wildlife of the Ministry of Agriculture (Kazakhstan), Frankfurt Zoological Society, Fauna and Flora International and the Royal Society for the Protection of Birds. The program includes the whole area of around 50 to 60 million hectares, which corresponds to the distribution range of the Betpak-Dala Saiga antelope community in Central Kazakhstan. The effort also hopes to reintroduce the Turkmenian kulan and Przewalski's horse, although the primary focus has been on the conservation of native saiga populations. Conservation of Kulans Currently, a program under ADCI is preparing a tiny population of endangered kulans for release into the wild. In 2017, a first batch of nine animals was released into an acclimatisation cage on the outskirts of the protected area of Altyn Dala. The creatures were carried 1200 kilometres by helicopter from Altyn-Emel National Park in the country's southeast.
https://en.wikipedia.org/wiki/Landen%27s%20transformation
Landen's transformation is a mapping of the parameters of an elliptic integral, useful for the efficient numerical evaluation of elliptic functions. It was originally due to John Landen and independently rediscovered by Carl Friedrich Gauss. Statement The incomplete elliptic integral of the first kind is where is the modular angle. Landen's transformation states that if , , , are such that and , then Landen's transformation can similarly be expressed in terms of the elliptic modulus and its complement . Complete elliptic integral In Gauss's formulation, the value of the integral is unchanged if and are replaced by their arithmetic and geometric means respectively, that is Therefore, From Landen's transformation we conclude and . Proof The transformation may be effected by integration by substitution. It is convenient to first cast the integral in an algebraic form by a substitution of , giving A further substitution of gives the desired result This latter step is facilitated by writing the radical as and the infinitesimal as so that the factor of is recognized and cancelled between the two factors. Arithmetic-geometric mean and Legendre's first integral If the transformation is iterated a number of times, then the parameters and converge very rapidly to a common value, even if they are initially of different orders of magnitude. The limiting value is called the arithmetic-geometric mean of and , . In the limit, the integrand becomes a constant, so that integration is trivial The integral may also be recognized as a multiple of Legendre's complete elliptic integral of the first kind. Putting Hence, for any , the arithmetic-geometric mean and the complete elliptic integral of the first kind are related by By performing an inverse transformation (reverse arithmetic-geometric mean iteration), that is the relationship may be written as which may be solved for the AGM of a pair of arbitrary arguments;
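Gauss's form of the transformation underlies a practical algorithm: iterating the arithmetic and geometric means converges quadratically, and the complete elliptic integral of the first kind satisfies K(k) = π / (2 M(1, k′)) with complementary modulus k′ = √(1 − k²). A minimal numerical sketch (function names and tolerance are our own choices):

```python
import math

def agm(a, b, tol=1e-15):
    """Arithmetic-geometric mean M(a, b) via the rapidly converging
    Gauss iteration (a, b) -> ((a + b)/2, sqrt(a*b))."""
    while abs(a - b) > tol * max(a, b):
        a, b = (a + b) / 2.0, math.sqrt(a * b)
    return (a + b) / 2.0

def ellipk(k):
    """Complete elliptic integral of the first kind K(k), modulus convention,
    computed as pi / (2 * M(1, k')) with k' = sqrt(1 - k^2)."""
    return math.pi / (2.0 * agm(1.0, math.sqrt(1.0 - k * k)))

print(ellipk(0.0))  # pi/2 = 1.5707963...
print(ellipk(0.8))  # K grows as the modulus approaches 1
```

Only a handful of AGM iterations reach full double precision, which is why this route is a standard way to evaluate elliptic integrals numerically.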
https://en.wikipedia.org/wiki/Perfboard
Perfboard is a material for prototyping electronic circuits. It is a thin, rigid sheet with holes pre-drilled at standard intervals across a grid, usually a square grid of 0.1 inch (2.54 mm) spacing. These holes are ringed by round or square copper pads, though bare boards are also available. Inexpensive perfboard may have pads on only one side of the board, while better quality perfboard can have pads on both sides (plate-through holes). Since each pad is electrically isolated, the builder makes all connections with either wire wrap or miniature point-to-point wiring techniques. Discrete components such as resistors, capacitors, and integrated circuits are soldered to the prototype board. The substrate is typically made of paper laminated with phenolic resin (such as FR-2) or a fiberglass-reinforced epoxy laminate (FR-4). Connections The grid system accommodates integrated circuits in DIP packages and many other types of through-hole components. Perfboard is not designed for prototyping surface mount devices. Before building a circuit on perfboard, the locations of the components and connections are typically planned in detail on paper or with software tools. Small-scale prototypes, however, are often built ad hoc, using an oversized perfboard. Software for PCB layout can often be used to generate perfboard layouts as well. In this case, the designer positions the components so all leads fall on intersections of the grid. When routing the connections, more than two copper layers can be used, as multiple overlaps are not a problem for insulated wires. Once the layout is finalized, the components are soldered in their designated locations, paying attention to the orientation of polarized parts such as electrolytic capacitors, diodes, and integrated circuits. Next, electrical connections are made as called for in the layout. One school of thought advocates making as many connections as possible without adding extra wire. This is done by bending the existing leads on resistors, capaci