https://en.wikipedia.org/wiki/Agent-based%20model
An agent-based model (ABM) is a computational model for simulating the actions and interactions of autonomous agents (either individual or collective entities such as organizations or groups) in order to understand the behavior of a system and what governs its outcomes. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to understand the stochasticity of these models. Particularly within ecology, ABMs are also called individual-based models (IBMs). A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in many scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents obeying simple rules, typically in natural systems, rather than in designing agents or solving specific practical or engineering problems. Agent-based models are a kind of microscale model that simulate the simultaneous operations and interactions of multiple agents in an attempt to re-create and predict the appearance of complex phenomena. The process is one of emergence, which some express as "the whole is greater than the sum of its parts". In other words, higher-level system properties emerge from the interactions of lower-level subsystems. Or, macro-scale state changes emerge from micro-scale agent behaviors. Or, simple behaviors (meaning rules followed by agents) generate complex behaviors (meaning state changes at the whole system level). Individual agents are typically characterized as boundedly rational, presumed to be acting in what they perceive as their own interests, such as reproduction, economic benefit, or social status, using heuristics or simple decision-making rules. ABM agents
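To make the idea concrete, the following minimal Python sketch implements a toy agent-based model; the agent count, the update rule (copy a random neighbour's opinion) and the consensus measure are invented assumptions for illustration, not taken from the article. Each agent applies a simple local rule, and a system-level pattern emerges from repeated interactions.

```python
import random

# Toy agent-based model (illustrative rules only): each agent holds an opinion
# in {0, 1} and, at every step, one agent copies the opinion of a randomly
# chosen neighbour on a ring. A macro-level property (drift toward consensus)
# emerges from purely local interactions.
random.seed(1)
N_AGENTS, STEPS = 50, 2000
opinions = [random.randint(0, 1) for _ in range(N_AGENTS)]

for _ in range(STEPS):
    i = random.randrange(N_AGENTS)
    j = (i + random.choice([-1, 1])) % N_AGENTS   # pick a ring neighbour
    opinions[i] = opinions[j]                     # simple local rule

print("fraction holding opinion 1:", sum(opinions) / N_AGENTS)
```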
https://en.wikipedia.org/wiki/B%2A
In computer science, B* (pronounced "B star") is a best-first graph search algorithm that finds the least-cost path from a given initial node to any goal node (out of one or more possible goals). First published by Hans Berliner in 1979, it is related to the A* search algorithm. Summary The algorithm stores intervals for nodes of the tree as opposed to single point-valued estimates. Then, leaf nodes of the tree can be searched until one of the top level nodes has an interval which is clearly "best." Details Interval evaluations rather than estimates Leaf nodes of a B*-tree are given evaluations that are intervals rather than single numbers. The interval is supposed to contain the true value of that node. If all intervals attached to leaf nodes satisfy this property, then B* will identify an optimal path to the goal state. Backup process To back up the intervals within the tree, a parent's upper bound is set to the maximum of the upper bounds of the children. A parent's lower bound is set to the maximum of the lower bounds of the children. Note that different children might supply these bounds. Termination of search B* systematically expands nodes in order to create "separation," which occurs when the lower bound of a direct child of the root is at least as large as the upper bound of any other direct child of the root. A tree that creates separation at the root contains a proof that the best child is at least as good as any other child. In practice, complex searches might not terminate within practical resource limits. So the algorithm is normally augmented with artificial termination criteria such as time or memory limits. When an artificial limit is hit, a heuristic judgment must be made about which move to select. Normally, the tree supplies extensive evidence, such as the intervals of the root's direct children. Expansion B* is a best-first process, which means that the tree can be traversed very efficiently, repeatedly descending to find a leaf to expand.
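A brief Python sketch of the interval back-up and the separation test described above; the dictionary-based tree representation and the example bounds are illustrative assumptions, not Berliner's formulation.

```python
# Sketch of B*-style interval bookkeeping (illustrative, not Berliner's code).
# Each leaf carries a (lower, upper) evaluation interval; a parent's upper
# bound is the maximum of its children's upper bounds, and its lower bound is
# the maximum of its children's lower bounds.

def backup(node):
    """Recompute interval bounds bottom-up."""
    if not node.get("children"):
        return node["lower"], node["upper"]
    bounds = [backup(c) for c in node["children"]]
    node["lower"] = max(lo for lo, _ in bounds)
    node["upper"] = max(up for _, up in bounds)
    return node["lower"], node["upper"]

def separated(root):
    """True when some child's lower bound >= every other child's upper bound."""
    kids = root["children"]
    return any(all(best["lower"] >= other["upper"]
                   for other in kids if other is not best)
               for best in kids)

root = {"children": [
    {"lower": 5, "upper": 9, "children": []},
    {"lower": 1, "upper": 4, "children": []},
]}
backup(root)
print(separated(root))  # True: the first child is provably at least as good
```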
https://en.wikipedia.org/wiki/Mill%20%28heraldry%29
Mills are sometimes used as a charge in heraldry, usually as a sign for agricultural or industrial endeavours. Examples See also Millrind Millwheel Heraldic charges Grinding mills
https://en.wikipedia.org/wiki/Ralph%20Lent%20Jeffery
Ralph Lent Jeffery (3 October 1889 Overton, Yarmouth County, Nova Scotia, Canada – 1975 Wolfville, Nova Scotia) was a Canadian mathematician working on analysis. He taught at several institutions including Acadia University, the University of Saskatchewan and Queen's University. Jeffery Hall at Queen's was named for him. In 1937 he was elected a Fellow of the Royal Society of Canada. In 1925 Jeffery exhibited a bounded function of two real variables, continuous in each variable, that nevertheless fails to have an integral. In 1951 Jeffery published Theory of Functions of a Real Variable, which was noted for its coverage of integration theory. Selected papers 1925: "Definite integrals containing a parameter", Annals of Mathematics 26(3): 173–180 1926: "Functions of two variables for which the double integral does not exist", American Mathematical Monthly 33(3): 142–143 1931: "The uniform approximation of a sequence of integrals", American Journal of Mathematics 53(1): 61–71 1933: "Sets of k-extent in n-dimensional space", Transactions of the American Mathematical Society 35(3): 629–647
https://en.wikipedia.org/wiki/Tester-driven%20development
In software engineering, tester-driven development, or bug-driven development, is an anti-pattern where the requirements are determined by bug reports or test results rather than, for example, the value or cost of a feature. The concept is generally invoked facetiously, and comes with the implication that high volumes of computer code are written with little regard for unit testing by the programmers. The term itself is a tongue-in-cheek reference to test-driven development, a widely used methodology in agile software practices. In test-driven development tests are used to drive the implementation towards fulfilling the requirements. Tester-driven development instead shortcuts the process by removing the determination of requirements and letting the testers (or the QA team) drive what they think the software should be through the testing (or QA) process. Projects that are developed using this anti-pattern often suffer from being extremely late. Another common problem is poor code quality. Common causes for projects ending up being run this way are often: The testing phase started too early; Incomplete requirements; Inexperienced testers; Inexperienced developers; Poor project management. Things get worse when the testers realize that they don't know what the requirements are and therefore don't know how to test any particular code changes. The onus then falls on the developers of individual changes to write their own test cases and they are happy to do so because their own tests normally pass and their performance measurements improve. Project leaders are also delighted by the rapid reduction in the number of open change requests. See also Extreme programming Extreme programming practices Feature creep Requirements management Software prototyping – creating prototypes of software applications to get feedback from users early in a project
https://en.wikipedia.org/wiki/Motion%20Picture%20Association%20film%20rating%20system
The Motion Picture Association film rating system is used in the United States and its territories to rate a motion picture's suitability for certain audiences based on its content. The system and the ratings applied to individual motion pictures are the responsibility of the Motion Picture Association (MPA), previously known as the Motion Picture Association of America (MPAA) from 1945 to 2019. The MPA rating system is a voluntary scheme that is not enforced by law; films can be exhibited without a rating, although most theaters refuse to exhibit non-rated or NC-17 rated films. Non-members of the MPA may also submit films for rating. Other media, such as television programs, music and video games, are rated by other entities such as the TV Parental Guidelines, the RIAA and the ESRB, respectively. Introduced in 1968, following the Hays Code of the classical Hollywood cinema era, the MPA rating system is one of various motion picture rating systems that are used to help parents decide what films are appropriate for their children. It is administered by the Classification & Ratings Administration (CARA), an independent division of the MPA. Ratings MPA film ratings The MPA film ratings are as follows: In 2013, the MPA ratings were visually redesigned, with the rating displayed on a left panel and the name of the rating shown above it. A larger panel on the right provides a more detailed description of the film's content and an explanation of the rating level is placed on a horizontal bar at the bottom of the rating. Content descriptors Film ratings often have accompanying brief descriptions of the specifics behind the film's content and why it received a certain rating. They are displayed in trailers, posters, and on the backside of home video releases. Film rating content descriptors are exclusively used for films rated from PG to NC-17; they are not used for G-rated films because the content in them is suitable for all audiences even if containing mild objecti
https://en.wikipedia.org/wiki/Windows%2010X
Windows 10X is a cancelled edition of Windows 10, a major release of the Microsoft Windows series of operating systems. Announced by Microsoft on October 2, 2019, it was initially developed as an operating system to support dual-screen devices, such as the unreleased Surface Neo. 10X was expected to be released in 2020, but Microsoft announced in May 2021 that the project had been cancelled. However, some features and design changes from 10X were integrated into the newer Windows 11. While the operating system was originally designed for dual-screen devices, Windows 10X shifted its target to single-screen devices in 2020 due to increasing demand for traditional computers from the COVID-19 pandemic. Features New and changed Windows 10X introduced major changes to the Windows shell, abolishing legacy components in favor of new user experiences and enhanced security, as well as some notable design changes, which were integrated into Windows 11: The taskbar is centered. It comes in three sizes: small, intended for mouse-controlled desktop computers, while medium and large are intended for touch computers. The taskbar is automatically hidden, and can be clicked/tapped to be shown. New start menu: Microsoft redesigned the start menu with a focus on productivity, with the search box now at the top instead of in the taskbar as in other editions of Windows 10, as well as a section of pinned apps which succeeds the Live Tiles of other Windows 10 editions and Windows 8. The Action Center has been renamed “Quick Settings” and given a redesign. Network/Internet controls, volume controls and power options have been moved to Quick Settings. There is also an area to check notifications and control music playing from a specific app. Window borders have been rounded. The out-of-box setup has been updated to better fit the new user interface of 10X, with a more modern design, and Cortana is no longer an integrated feature. The default UI now use
https://en.wikipedia.org/wiki/Cytochemistry
Cytochemistry is the branch of cell biology dealing with the detection of cell constituents by means of biochemical analysis and visualization techniques. This is the study of the localization of cellular components through the use of staining methods. The term is also used to describe a process of identification of the biochemical content of cells. Cytochemistry is a science of localizing chemical components of cells and cell organelles on thin histological sections by using several techniques such as enzyme localization, micro-incineration, micro-spectrophotometry, radioautography, cryo-electron microscopy, X-ray microanalysis by energy-dispersive X-ray spectroscopy, immunohistochemistry and cytochemistry, etc. Freeze Fracture Enzyme Cytochemistry Freeze fracture enzyme cytochemistry was first described in the work of Pinto da Silva in 1987. It is a technique that allows cytochemical methods to be applied to freeze-fractured cell membranes. Immunocytochemistry is used in this technique to label and visualize the cell membrane's molecules. This technique could be useful in analyzing the ultrastructure of cell membranes. By combining immunocytochemistry with the freeze fracture technique, researchers can identify and better understand the structure and distribution of molecules in a cell membrane. Origin Jean Brachet's research in Brussels, which demonstrated the localization and relative abundance of RNA and DNA in the cells of both animals and plants, opened the door to research in cytochemistry. The work by Moller and Holter in 1976 about endocytosis, which discussed the relationship between a cell's structure and function, established the need for cytochemical research. Aims Cytochemical research aims to study individual cells that may contain several cell types within a tissue. It takes a nondestructive approach to studying localization within the cell. By keeping the cell components intact, researchers are able to study the intact cell activ
https://en.wikipedia.org/wiki/Pasch%27s%20axiom
In geometry, Pasch's axiom is a statement in plane geometry, used implicitly by Euclid, which cannot be derived from the postulates as Euclid gave them. Its essential role was discovered by Moritz Pasch in 1882. Statement The axiom states that, The fact that segments AC and BC are not both intersected by the line is proved in Supplement I,1, which was written by P. Bernays. A more modern version of this axiom is as follows: (In case the third side is parallel to our line, we count an "intersection at infinity" as external.) A more informal version of the axiom is often seen: History Pasch published this axiom in 1882, and showed that Euclid's axioms were incomplete. The axiom was part of Pasch's approach to introducing the concept of order into plane geometry. Equivalences In other treatments of elementary geometry, using different sets of axioms, Pasch's axiom can be proved as a theorem; it is a consequence of the plane separation axiom when that is taken as one of the axioms. Hilbert uses Pasch's axiom in his axiomatic treatment of Euclidean geometry. Given the remaining axioms in Hilbert's system, it can be shown that Pasch's axiom is logically equivalent to the plane separation axiom. Hilbert's use of Pasch's axiom David Hilbert uses Pasch's axiom in his book Foundations of Geometry which provides an axiomatic basis for Euclidean geometry. Depending upon the edition, it is numbered either II.4 or II.5. His statement is given above. In Hilbert's treatment, this axiom appears in the section concerning axioms of order and is referred to as a plane axiom of order. Since he does not phrase the axiom in terms of the sides of a triangle (considered as lines rather than line segments) there is no need to talk about internal and external intersections of the line with the sides of the triangle ABC. Caveats Pasch's axiom is distinct from Pasch's theorem which is a statement about the order of four points on a line. However, in literature there are many in
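For reference, the axiom in its usual modern formulation (for a triangle ABC and a line in its plane that avoids the vertices) can be stated as follows; this wording is supplied here as the standard textbook form and is not a quotation of Hilbert's exact phrasing.

```latex
% Standard modern statement of Pasch's axiom (supplied for reference).
\textbf{Axiom (Pasch).} Let $A$, $B$, $C$ be three non-collinear points and let
$\ell$ be a line in the plane $ABC$ that does not pass through $A$, $B$, or $C$.
If $\ell$ meets the segment $AB$, then it also meets the segment $AC$ or the
segment $BC$ (and, as noted above, not both).
```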
https://en.wikipedia.org/wiki/The%20Biggest%20Little%20Railway%20in%20the%20World
The Biggest Little Railway in the World (BLR) was a temporary 71-mile (114 km) O gauge (1.25 inches, 32 mm) model railway from Fort William to the city of Inverness, the two largest settlements in the Scottish Highlands. It has been described as a crackpot project to run a model train the length of the Great Glen Way by an army of madcap enthusiasts, geeks, and engineers in the best spirit of eccentric Britishness. Project The project was headed by Dick Strawbridge, MBE. It was backed by a television production with the same name as the railway. The production team and security staff were also needed to assist with the project. Project management The project took months of planning. It was described as an operation of fiendish complexity. Calls were made for the 56 volunteers determined to be needed for the project. There were planning meetings at the start of each day. Some disagreements occurred but were overcome by a spirit of gusto and camaraderie. Team Engineers Claire Barratt and Hadrian Spooner, who had worked on engineering projects such as Scrapheap Challenge and Salvage Squad, also acted as part of the credited professional team. A team of 56 volunteers constructed and operated the line with help from local volunteers. Community involvement The local community also assisted the enterprise at various points, including the Inverness and District Model Railway Club providing a model station and castle for the train's arrival. Related projects The Invergarry and Fort Augustus Railway, opened by the Victorians in 1903 and closed in 1946, had connected to the main line at Spean Bridge. It was speculated that its ultimate aim was the same as the BLR's, to reach Inverness, though the attempt was abandoned. In 2009 James May attempted to beat the record for the longest OO gauge model railway. Route The route began at Corpach Double Lock near Fort William and tracked the Great Glen Way past Fort Augustus to Inverness, terminating at Inverness Castle. The track and infrastruct
https://en.wikipedia.org/wiki/NUTS%20statistical%20regions%20of%20Slovenia
In the NUTS (Nomenclature of Territorial Units for Statistics) codes of Slovenia (SI), the three levels are:

NUTS codes

SI0 Slovenia
  SI03 Eastern Slovenia (Vzhodna Slovenija)
    SI031 Mura Statistical Region (pomurska statistična regija)
    SI032 Drava Statistical Region (podravska statistična regija)
    SI033 Carinthia Statistical Region (koroška statistična regija)
    SI034 Savinja Statistical Region (savinjska statistična regija)
    SI035 Central Sava Statistical Region (zasavska statistična regija)
    SI036 Lower Sava Statistical Region (spodnjeposavska statistična regija)
    SI037 Southeast Slovenia Statistical Region (jugovzhodna Slovenija)
    SI038 Littoral–Inner Carniola Statistical Region (primorsko-notranjska statistična regija)
  SI04 Western Slovenia (Zahodna Slovenija)
    SI041 Central Slovenia Statistical Region (osrednjeslovenska statistična regija)
    SI042 Upper Carniola Statistical Region (gorenjska statistična regija)
    SI043 Gorizia Statistical Region (goriška statistična regija)
    SI044 Coastal–Karst Statistical Region (obalno-kraška statistična regija)

In the 2003 version, the codes were as follows:

SI0 Slovenia
  SI00 Slovenia
    SI001 Mura Statistical Region (pomurska statistična regija)
    SI002 Drava Statistical Region (podravska statistična regija)
    SI003 Carinthia Statistical Region (koroška statistična regija)
    SI004 Savinja Statistical Region (savinjska statistična regija)
    SI005 Central Sava Statistical Region (zasavska statistična regija)
    SI006 Lower Sava Statistical Region (spodnjeposavska statistična regija)
    SI009 Upper Carniola Statistical Region (gorenjska statistična regija)
    SI00A Inner Carniola–Karst Statistical Region (notranjsko-kraška statistična regija)
    SI00B Gorizia Statistical Region (goriška statistična regija)
    SI00C Coastal–Karst Statistical Region (obalno-kraška statistična regija)
    SI00D Southeast Slovenia Statistical Region (jugovzhodna Slovenija)
    SI00E Central Slovenia Statistical Region (osrednjeslovenska statistična regija)

Local administrative units

Below t
https://en.wikipedia.org/wiki/Time%20of%20occurrence
In forensic investigation, the time of occurrence of an event (such as time of death or time of incident) is one of the most important things to determine accurately as soon as possible. Sometimes this can only be estimated. Some indicators that investigators use are rigor mortis, livor mortis, algor mortis, clouding of the corneas, state of decomposition, presence/absence of purged fluids and level of tissue desiccation. Pathologists can estimate a time of death by analysing necrophagous Diptera. The odour from decaying flesh attracts different species as the stages of decomposition progress.
https://en.wikipedia.org/wiki/Active%20rectification
Active rectification, or synchronous rectification, is a technique for improving the efficiency of rectification by replacing diodes with actively controlled switches, usually power MOSFETs or power bipolar junction transistors (BJT). Whereas normal semiconductor diodes have a roughly fixed voltage drop of around 0.5-1 volts, active rectifiers behave as resistances, and can have arbitrarily low voltage drop. Historically, vibrator driven switches or motor-driven commutators have also been used for mechanical rectifiers and synchronous rectification. Active rectification has many applications. It is frequently used for arrays of photovoltaic panels to avoid reverse current flow that can cause overheating with partial shading while giving minimum power loss. It is also used in switched-mode power supplies (SMPS). Motivation The constant voltage drop of a standard p-n junction diode is typically between 0.7 V and 1.7 V, causing significant power loss in the diode. Electric power depends on current and voltage: the power loss rises proportional to both current and voltage. In low voltage converters (around 10 volts and less), the voltage drop of a diode (typically around 0.7 to 1 volt for a silicon diode at its rated current) has an adverse effect on efficiency. One classic solution replaces standard silicon diodes with Schottky diodes, which exhibit very low voltage drops (as low as 0.3 volts). However, even Schottky rectifiers can be significantly more lossy than the synchronous type, notably at high currents and low voltages. When addressing very low-voltage converters, such as a buck converter power supply for a computer CPU (with a voltage output around 1 volt, and many amperes of output current), Schottky rectification does not provide adequate efficiency. In such applications, active rectification becomes necessary. Description Replacing a diode with an actively controlled switching element such as a MOSFET is the heart of active rectification. MOS
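To make the efficiency argument concrete, here is a rough Python comparison of conduction losses in a low-voltage converter; the 0.7 V diode drop, the 2 mΩ on-resistance and the operating point are illustrative assumptions, not datasheet values.

```python
# Rough conduction-loss comparison: diode vs. synchronous (MOSFET) rectifier.
# All component values and the operating point are illustrative assumptions.

I_out = 20.0      # load current, amperes
V_out = 1.0       # output voltage of a low-voltage converter, volts

V_f = 0.7         # assumed forward drop of a silicon diode, volts
R_on = 0.002      # assumed MOSFET on-resistance, ohms (2 milliohms)

p_diode = V_f * I_out          # roughly fixed-drop loss: 14 W
p_mosfet = R_on * I_out ** 2   # resistive loss: 0.8 W

print(f"diode loss:  {p_diode:.1f} W ({p_diode / (V_out * I_out):.0%} of output power)")
print(f"mosfet loss: {p_mosfet:.1f} W ({p_mosfet / (V_out * I_out):.0%} of output power)")
```

With these assumed numbers the diode would dissipate a large fraction of the delivered power, which illustrates why synchronous rectification is preferred at roughly 1 V outputs.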
https://en.wikipedia.org/wiki/Aponeurosis
An aponeurosis (; plural: aponeuroses) is a flattened tendon by which muscle attaches to bone or fascia. Aponeuroses exhibit an ordered arrangement of collagen fibres, thus attaining high tensile strength in a particular direction while being vulnerable to tensional or shear forces in other directions. They have a shiny, whitish-silvery color, are histologically similar to tendons, and are very sparingly supplied with blood vessels and nerves. When dissected, aponeuroses are papery and peel off by sections. The primary regions with thick aponeuroses are in the ventral abdominal region, the dorsal lumbar region, the ventriculus in birds, and the palmar (palms) and plantar (soles) regions. Anatomy Anterior abdominal aponeuroses The anterior abdominal aponeuroses are located just superficial to the rectus abdominis muscle. It has for its borders the external oblique, pectoralis muscles, and the latissimus dorsi. Posterior lumbar aponeuroses The posterior lumbar aponeuroses are situated just on top of the epaxial muscles of the thorax, which are multifidus spinae and sacrospinalis. Palmar and plantar aponeuroses and extensor hood The palmar aponeuroses occur on the palms of the hands. The extensor hoods are aponeuroses at the back of the fingers. The plantar aponeuroses occur on the plantar aspect of the foot. They extend from the calcaneal tuberosity then diverge to connect to the bones, ligaments and the dermis of the skin around the distal part of the metatarsal bones. Anterior and posterior intercostal membranes The anterior and posterior intercostal membranes are aponeuroses located between the ribs and are continuations of the external and internal intercostal muscles, respectively. Scalp aponeuroses The epicranial aponeurosis, or galea aponeurotica, is a tough layer of dense fibrous tissue which runs from the frontalis muscle anteriorly to the occipitalis posteriorly. Pennate muscles and aponeuroses Pennate muscles, in which the muscle fibers are oriented
https://en.wikipedia.org/wiki/NOAAS%20Reuben%20Lasker
NOAAS Reuben Lasker is a National Oceanic and Atmospheric Administration (NOAA) fishery research vessel. The ship's namesake, Reuben Lasker, was a fisheries biologist who served with the Southwest Fisheries Center, National Marine Fisheries Service, and taught at the Scripps Institution of Oceanography. Construction and commissioning Reuben Lasker's construction was funded through the American Recovery and Reinvestment Act. Marinette Marine Corporation was awarded a $73.6 million contract to build her in April 2010. Reuben Lasker was laid down at the Marinette Marine Corporation shipyard in Marinette, Wisconsin, on 21 June 2011 and by February 2012 was 60% complete. Four months later, on 16 June 2012, Pamela A. Lasker, Reuben Lasker's daughter, christened the ship and Reuben Lasker was side-launched into the Menominee River. Marinette Marine delivered the ship to NOAA at Norfolk, Virginia, on 8 November 2013. After a 20-day, 5,000-nautical-mile (9,260-km) voyage from Norfolk via the Panama Canal, Reuben Lasker arrived at San Diego, California, her home port, on 29 March 2014. NOAA officially commissioned her on 2 May 2014 during a ceremony at the Navy Pier in downtown San Diego, California. Characteristics and capabilities Capable of conducting multidisciplinary oceanographic operations in support of biological, chemical, and physical process studies, Reuben Lasker was commissioned as the fifth of a class of five of the most advanced fisheries research vessels in the world, with a unique capability to conduct both fishing and oceanographic research. She is a stern trawler with fishing capabilities similar to those of commercial fishing vessels. She is rigged for longlining and trap fishing and can conduct trawling operations to depths of . Her most advanced feature is the incorporation of United States Navy-type acoustic quieting technology to enable NOAA scientists to monitor fish populations without the ship's noise altering the behavior of the fish, includin
https://en.wikipedia.org/wiki/Omega%20constant
The omega constant is a mathematical constant defined as the unique real number that satisfies the equation Ω e^Ω = 1. It is the value of W(1), where W is Lambert's W function. The name is derived from the alternate name for Lambert's W function, the omega function. The numerical value of Ω is given by Ω = 0.5671432904..., and the value of 1/Ω is 1.7632228... Properties Fixed point representation The defining identity can be expressed, for example, as ln(1/Ω) = Ω or −ln(Ω) = Ω, as well as e^(−Ω) = Ω. Computation One can calculate Ω iteratively, by starting with an initial guess Ω_0 and considering the sequence Ω_(n+1) = e^(−Ω_n). This sequence will converge to Ω as n approaches infinity. This is because Ω is an attractive fixed point of the function e^(−x). It is much more efficient to use the iteration Ω_(n+1) = (1 + Ω_n)/(1 + e^(Ω_n)), because the function f(x) = (1 + x)/(1 + e^x), in addition to having the same fixed point, also has a derivative that vanishes there. This guarantees quadratic convergence; that is, the number of correct digits is roughly doubled with each iteration. Using Halley's method, Ω can be approximated with cubic convergence (the number of correct digits is roughly tripled with each iteration). Integral representations There is an integral identity for Ω due to Victor Adamchik, as well as further relations due to Mező and to Kalugin-Jeffrey-Corless. Transcendence The constant Ω is transcendental. This can be seen as a direct consequence of the Lindemann–Weierstrass theorem. For a contradiction, suppose that Ω is algebraic. By the theorem, e^(−Ω) would then be transcendental, but e^(−Ω) = Ω, which is a contradiction. Therefore, Ω must be transcendental.
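A short Python check of the two iterations described above, using plain floating point just to illustrate the convergence behaviour:

```python
import math

# Omega constant: the unique real solution of x * exp(x) = 1.

# Naive fixed-point iteration x -> exp(-x): converges, but only linearly.
x = 1.0
for _ in range(50):
    x = math.exp(-x)

# Quadratically convergent iteration x -> (1 + x) / (1 + exp(x)).
y = 1.0
for _ in range(6):
    y = (1 + y) / (1 + math.exp(y))

print(x, y)             # both approach 0.5671432904...
print(y * math.exp(y))  # ~ 1.0, confirming the defining equation
```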
https://en.wikipedia.org/wiki/Primary%20key
In the relational model of databases, a primary key is a specific choice of a minimal set of attributes (columns) that uniquely specify a tuple (row) in a relation (table). Informally, a primary key is "which attributes identify a record," and in simple cases constitutes a single attribute: a unique ID. More formally, a primary key is a choice of candidate key (a minimal superkey); any other candidate key is an alternate key. A primary key may consist of real-world observables, in which case it is called a natural key, while an attribute created to function as a key and not used for identification outside the database is called a surrogate key. For example, for a database of people (of a given nationality), time and location of birth could be a natural key. National identification number is another example of an attribute that may be used as a natural key. Design In relational database terms, a primary key does not differ in form or function from a key that isn't primary. In practice, various motivations may determine the choice of any one key as primary over another. The designation of a primary key may indicate the "preferred" identifier for data in the table, or that the primary key is to be used for foreign key references from other tables, or it may indicate some other technical rather than semantic feature of the table. Some languages and software have special syntax features that can be used to identify a primary key as such (e.g. the PRIMARY KEY constraint in SQL). The relational model, as expressed through relational calculus and relational algebra, does not distinguish between primary keys and other kinds of keys. Primary keys were added to the SQL standard mainly as a convenience to the application programmer. Primary keys can be an integer that is incremented, a universally unique identifier (UUID) or can be generated using the Hi/Lo algorithm. Defining primary keys in SQL Primary keys are defined in the ISO SQL Standard, through the PRIMARY KEY constrain
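A minimal, runnable illustration using Python's built-in sqlite3 module; the table, columns and values are invented for the example and show a surrogate primary key alongside a natural candidate key kept unique as an alternate key.

```python
import sqlite3

# Illustrative only: a surrogate integer primary key plus a natural candidate
# key (a national identification number) enforced as a unique alternate key.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE person (
        person_id   INTEGER PRIMARY KEY,   -- surrogate key
        national_id TEXT NOT NULL UNIQUE,  -- natural / alternate key
        born_at     TEXT,
        born_in     TEXT
    )
""")
con.execute("INSERT INTO person (national_id, born_at, born_in) VALUES (?, ?, ?)",
            ("A-123", "1990-01-01T08:00", "Ljubljana"))
try:
    # Reusing an existing primary key value violates the constraint.
    con.execute("INSERT INTO person (person_id, national_id) VALUES (1, 'B-456')")
except sqlite3.IntegrityError as e:
    print("duplicate primary key rejected:", e)
```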
https://en.wikipedia.org/wiki/Frank%E2%80%93Read%20source
In materials science, a Frank–Read source is a mechanism explaining the generation of multiple dislocations in specific well-spaced slip planes in crystals when they are deformed. When a crystal is deformed, in order for slip to occur, dislocations must be generated in the material. This implies that, during deformation, dislocations must be primarily generated in these planes. Cold working of metal increases the number of dislocations by the Frank–Read mechanism. Higher dislocation density increases yield strength and causes work hardening of metals. The mechanism of dislocation generation was proposed by and named after British physicist Charles Frank and Thornton Read. History Charles Frank detailed the history of the discovery from his perspective in Proceedings of the Royal Society in 1980. In 1950 Charles Frank, who was then a research fellow in the physics department at the University of Bristol, visited the United States to participate in a conference on crystal plasticity in Pittsburgh. Frank arrived in the United States well in advance of the conference to spend time at a naval laboratory and to give a lecture at Cornell University. When, during his travels in Pennsylvania, Frank visited Pittsburgh, he received a letter from fellow scientist Jock Eshelby suggesting that he read a recent paper by Gunther Leibfried. Frank was supposed to board a train to Cornell to give his lecture at Cornell, but before departing for Cornell he went to the library at Carnegie Institute of Technology to obtain a copy of the paper. The library did not yet have the journal with Leibfried's paper, but the staff at the library believed that the journal could be in the recently arrived package from Germany. Frank decided to wait for the library to open the package, which did indeed contain the journal. Upon reading the paper he took a train to Cornell, where he was told to pass the time until 5:00, as the faculty was in meeting. Frank decided to take a walk between 3:00 a
https://en.wikipedia.org/wiki/Foveated%20rendering
Foveated rendering is a rendering technique which uses an eye tracker integrated with a virtual reality headset to reduce the rendering workload by greatly reducing the image quality in the peripheral vision (outside of the zone gazed by the fovea). A less sophisticated variant called fixed foveated rendering doesn't utilise eye tracking and instead assumes a fixed focal point. History Research into foveated rendering dates back at least to 1991. At Tech Crunch Disrupt SF 2014, Fove unveiled a headset featuring foveated rendering. This was followed by a successful kickstarter in May 2015. At CES 2016, SensoMotoric Instruments (SMI) demoed a new 250 Hz eye tracking system and a working foveated rendering solution. It resulted from a partnership with camera sensor manufacturer Omnivision who provided the camera hardware for the new system. In July 2016, Nvidia demonstrated during SIGGRAPH a new method of foveated rendering claimed to be invisible to users. In February 2017, Qualcomm announced their Snapdragon 835 Virtual Reality Development Kit (VRDK) which includes foveated rendering support called Adreno Foveation. During CES 2019 on January 7 HTC announced an upcoming virtual reality headset called Vive Pro Eye featuring eye-tracking and support for foveated rendering. In December 2019, Facebook's Oculus Quest SDK gave developers access to dynamic fixed foveated rendering, allowing the variation in level of detail to be changed on the fly via an API. On January 4, 2022, Sony announced that their follow-up to PlayStation VR will include eye tracking and foveated rendering. On June 5, 2023, Apple announced that the Apple Vision Pro extended reality headset includes dynamic foveated rendering. Use According to chief scientist Michael Abrash at Oculus, utilising foveated rendering in conjunction with sparse rendering and deep learning image reconstruction has the potential to require an order of magnitude fewer pixels to be rendered in comparison to a ful
https://en.wikipedia.org/wiki/Perfect%20Dark%20%28P2P%29
Perfect Dark is a peer-to-peer file-sharing (P2P) application from Japan designed for use with Microsoft Windows. It was launched in 2006. Its author is known by the pseudonym . Perfect Dark was developed with the intention for it to be the successor to both Winny and Share software. While Japan's Association for Copyright of Computer Software reported that in January 2014 the number of nodes connected on Perfect Dark () was less than on Share (), but more than on Winny (), Netagent in 2018 reported that Winny was the largest with 50 000 nodes, followed by Perfect Dark with 30 000 nodes and Share with 10 000. Netagent asserts that the number of nodes on Perfect Dark has fallen since 2015 while the numbers for Winny hold steady. Netagent reports that users of Perfect Dark are most likely to share books/manga. As of version 1.02 (2008), code-named "Stand Alone Complex", there is support for the program to run in English, an option that can be selected when the program is installed. Overview Perfect Dark is still being actively developed. The author does not ask that the program's users at this point become dedicated "users" of the software. Instead, the author asks them to participate in the test phase. Through this test phase, the author hopes for bug reports and discussion that will help shape Perfect Dark into a better program. DKT+DHT+DU The author implements an architecture called DKT+DHT+DU in the design of the network. These three parts compose the entire network. "DKT" stands for Distributed Keyword Table, "DHT" for Distributed Hash Table, and "DU" for Distributed Unity. "DKT" is mainly for providing effective file searching, while "DHT" and "DU" are used for effective file sharing and enhancing anonymity. Network bandwidth requirement Perfect Dark has higher bandwidth and hard drive space requirements than its predecessors Winny and Share. The minimum upload speed is 100 kbit/s. Perfect Dark requires more network bandwidth and hard disk space than Winny or
https://en.wikipedia.org/wiki/Antitropical%20distribution
Antitropical (alternatives include biantitropical or amphitropical) distribution is a type of disjunct distribution where a species or clade exists at comparable latitudes across the equator but not in the tropics. For example, a species may be found north of the Tropic of Cancer and south of the Tropic of Capricorn, but not in between. With increasing time since dispersal, the disjunct populations may be the same variety, species, or clade. How the life forms distribute themselves to the opposite hemisphere when they can't normally survive in the middle depends on the species; plants may have their seed spread through wind, animal, or other methods and then germinate upon reaching the appropriate climate, while sea life may be able to travel through the tropical regions in a larval state or by going through deep ocean currents with much colder temperatures than on the surface. For the American amphitropical distribution, dispersal has been generally agreed to be more likely than vicariance from a previous distribution including the tropics in North and South America. Known cases Plants Phacelia crenulata – scorpionweed Bowlesia incana – American Bowlesia Osmorhiza berteroi and Osmorhiza depauperata – sweet cecily species. Ruppia megacarpa Solenogyne For a list of American amphitropically distributed plants (237 vascular plants), see the tables in the open access paper Simpson et al. 2017 or their working group on figshare Animals Scylla serrata – mud crab Freshwater crayfish Ground beetle genus Bembidion Bryophytes and lichens Tetraplodon fuegianus - dung moss See also Rapoport's rule
https://en.wikipedia.org/wiki/Dihydrogen%20cation
The dihydrogen cation or hydrogen molecular ion is a cation (positive ion) with formula H2+. It consists of two hydrogen nuclei (protons) sharing a single electron. It is the simplest molecular ion. The ion can be formed from the ionization of a neutral hydrogen molecule (H2) by electron impact. It is commonly formed in molecular clouds in space by the action of cosmic rays. The dihydrogen cation is of great historical, theoretical, and experimental interest. Historically it is of interest because, having only one electron, the equations of quantum mechanics that describe its structure can be solved approximately in a relatively straightforward way, as long as the motion of the nuclei and relativistic and quantum electrodynamic effects are neglected. The first such solution was derived by Ø. Burrau in 1927, just one year after the wave theory of quantum mechanics was published. The theoretical interest arises because an accurate mathematical solution, taking into account the motion of all constituents, is feasible. The accuracy has steadily improved over more than half a century, eventually resulting in a theoretical framework allowing ultra-high-accuracy predictions for the energies of the rotational and vibrational levels in the electronic ground state, which are mostly metastable. In parallel, the experimental approach to the study of the cation has undergone a fundamental evolution with respect to earlier experimental techniques used in the 1960s and 1980s. Employing the most advanced techniques, the rotational and vibrational transitions can be investigated in extremely fine detail. The transition frequencies can be measured and the results can be compared with the theoretical predictions. This makes the dihydrogen cation another family of bound systems relevant for the determination of fundamental constants of atomic and nuclear physics, after the hydrogen atom family and the helium atom family. Physical properties Bonding in H2+ can be described as a covalent
https://en.wikipedia.org/wiki/Vehicle%20routing%20problem
The vehicle routing problem (VRP) is a combinatorial optimization and integer programming problem which asks "What is the optimal set of routes for a fleet of vehicles to traverse in order to deliver to a given set of customers?" It generalises the travelling salesman problem (TSP). It first appeared in a paper by George Dantzig and John Ramser in 1959, in which the first algorithmic approach was described and applied to petrol deliveries. Often, the context is that of delivering goods located at a central depot to customers who have placed orders for such goods. The objective of the VRP is to minimize the total route cost. In 1964, Clarke and Wright improved on Dantzig and Ramser's approach using an effective greedy algorithm called the savings algorithm. Determining the optimal solution to VRP is NP-hard, so the size of problems that can be optimally solved using mathematical programming or combinatorial optimization may be limited. Therefore, commercial solvers tend to use heuristics due to the size and frequency of real-world VRPs they need to solve. VRP has many direct applications in industry. Vendors of VRP routing tools often claim that they can offer cost savings of 5%–30%. Setting up the problem The VRP concerns the service of a delivery company. Goods are delivered from one or more depots, each of which has a given set of home vehicles operated by a set of drivers who can move on a given road network, to a set of customers. It asks for a determination of a set of routes, S, (one route for each vehicle that must start and finish at its own depot) such that all customers' requirements and operational constraints are satisfied and the global transportation cost is minimized. This cost may be monetary, distance or otherwise. The road network can be described using a graph where the arcs are roads and vertices are junctions between them. The arcs may be directed or undirected due to the possible presence of one-way streets or different costs in each dire
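A compact Python sketch of the Clarke–Wright savings heuristic mentioned above; the coordinates, demands and vehicle capacity are invented, and only the simplest end-to-end route merges are attempted, so this is a toy illustration rather than a production solver.

```python
import math
from itertools import combinations

# Clarke-Wright savings heuristic, minimal sketch (illustrative data only).
depot = (0.0, 0.0)
customers = {1: (2, 3), 2: (5, 1), 3: (6, 4), 4: (1, 6), 5: (4, 6)}
demand = {1: 4, 2: 3, 3: 5, 4: 2, 5: 4}
capacity = 10

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

d0 = {i: dist(depot, p) for i, p in customers.items()}
routes = {i: [i] for i in customers}      # start with one route per customer
route_of = {i: i for i in customers}

# Saving for serving i and j on one route: d(0,i) + d(0,j) - d(i,j).
savings = sorted(
    ((d0[i] + d0[j] - dist(customers[i], customers[j]), i, j)
     for i, j in combinations(customers, 2)),
    reverse=True)

for s, i, j in savings:
    ri, rj = route_of[i], route_of[j]
    if ri == rj:
        continue
    load = sum(demand[k] for k in routes[ri]) + sum(demand[k] for k in routes[rj])
    # Merge only when i ends its route, j starts its route, and capacity allows.
    if load <= capacity and routes[ri][-1] == i and routes[rj][0] == j:
        for k in routes[rj]:
            route_of[k] = ri
        routes[ri] += routes[rj]
        del routes[rj]

print(list(routes.values()))   # each list is one vehicle route (depot implicit)
```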
https://en.wikipedia.org/wiki/Psoralen
Psoralen (also called psoralene) is the parent compound in a family of naturally occurring organic compounds known as the linear furanocoumarins. It is structurally related to coumarin by the addition of a fused furan ring, and may be considered as a derivative of umbelliferone. Psoralen occurs naturally in the seeds of Psoralea corylifolia, as well as in the common fig, celery, parsley, West Indian satinwood, and in all citrus fruits. It is widely used in PUVA (psoralen + UVA) treatment for psoriasis, eczema, vitiligo, and cutaneous T-cell lymphoma; these applications are typically through the use of medications such as Methoxsalen. Many furanocoumarins are extremely toxic to fish, and some are deposited in streams in Indonesia to catch fish. Uses Psoralen is a mutagen, and is used for this purpose in molecular biology research. Psoralen intercalates into DNA and on exposure to ultraviolet (UVA) radiation can form monoadducts and covalent interstrand cross-links (ICL) with thymines, preferentially at 5'-TpA sites in the genome, inducing apoptosis. Psoralen plus UVA (PUVA) therapy can be used to treat hyperproliferative skin disorders like psoriasis and certain kinds of skin cancer. Unfortunately, PUVA treatment itself leads to a higher risk of skin cancer. An important use of psoralen is in PUVA treatment for skin problems such as psoriasis and, to a lesser extent, eczema and vitiligo. This takes advantage of the high UV absorbance of psoralen. The psoralen is applied first to sensitise the skin, then UVA light is applied to clean up the skin problem. Psoralens are also used in photopheresis, where they are mixed with the extracted leukocytes before UV radiation is applied. Despite the photocarcinogenic properties of psoralen, it was used as a tanning activator in sunscreens until 1996. Psoralens are used in tanning accelerators, because psoralen increases the skin's sensitivity to light. Some patients have had severe skin loss after sunbathing with psoralen-
https://en.wikipedia.org/wiki/Gas%20slug
A gas slug is a conglomerate of high-pressure gas bubbles that forms within certain volcanoes, the agitation of which is a key driving factor in Strombolian eruptions. They start out as small bubbles of gas inside volcanic magma. These accumulate into one large bubble, which starts to rise through the lava plume. Volcanic gas emissions consist mostly of water vapor, with sulfur dioxide and carbon dioxide also playing a large part in gas release. Once the accumulated slug reaches the top of the column and comes in contact with air, it bursts with a loud pop because of the lower air pressure, throwing magma into the air in the lava arc typical of a Strombolian eruption. This type of eruption is episodic, non-damaging to its source vent, and one of the slowest forms of activity, with the ability to sustain itself for thousands of years. Although the effect of gas slugs in lava is well understood, how they form is not, though recent research suggests that they can form deep beneath the surface.
https://en.wikipedia.org/wiki/Ad%20hoc%20On-Demand%20Distance%20Vector%20Routing
Ad hoc On-Demand Distance Vector (AODV) Routing is a routing protocol for mobile ad hoc networks (MANETs) and other wireless ad hoc networks. It was jointly developed by Charles Perkins (Sun Microsystems) and Elizabeth Royer (now Elizabeth Belding) (University of California, Santa Barbara) and was first published in the ACM 2nd IEEE Workshop on Mobile Computing Systems and Applications in February 1999. AODV is the routing protocol used in Zigbee – a low power, low data rate wireless ad hoc network. There are various implementations of AODV such as MAD-HOC, Kernel-AODV, AODV-UU, AODV-UCSB and AODV-UIUC. The original publication of AODV won the SIGMOBILE Test of Time Award in 2018. According to Google Scholar, this publication reached 30,000 citations at the end of 2022. AODV was published in the Internet Engineering Task Force (IETF) as Experimental RFC 3561 in 2003. See also Wireless ad hoc networks Backpressure routing Mesh networking Wireless mesh network § Routing protocols List of ad hoc routing protocols
https://en.wikipedia.org/wiki/Extended%20static%20checking
Extended static checking (ESC) is a collective name in computer science for a range of techniques for statically checking the correctness of various program constraints. ESC can be thought of as an extended form of type checking. As with type checking, ESC is performed automatically at compile time (i.e. without human intervention). This distinguishes it from more general approaches to the formal verification of software, which typically rely on human-generated proofs. Furthermore, it promotes practicality over soundness, in that it aims to dramatically reduce the number of false positives (reported errors that are not real errors, i.e. over-strictness on the part of the checker) at the cost of introducing some false negatives (real errors that ESC does not report, either because they are missed or because they are not targeted by ESC). ESC can identify a range of errors that are currently outside the scope of a type checker, including division by zero, array out of bounds, integer overflow and null dereferences. The techniques used in extended static checking come from various fields of computer science, including static program analysis, symbolic simulation, model checking, abstract interpretation, SAT solving and automated theorem proving and type checking. Extended static checking is generally performed only at an intraprocedural, rather than interprocedural, level in order to scale to large programs. Furthermore, extended static checking aims to report errors by exploiting user-supplied specifications, in the form of pre- and post-conditions, loop invariants and class invariants. Extended static checkers typically operate by propagating strongest postconditions (respectively weakest preconditions) intraprocedurally through a method starting from the precondition (respectively postcondition). At each point during this process an intermediate condition is generated that captures what is known at that program point. This is combined with the necessary conditions of the program stateme
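As a sketch of how one such verification condition might be discharged by an automated prover, the following Python snippet uses the z3-solver package (an assumed dependency); the method, its precondition and the division check are invented for illustration and are not tied to any particular ESC tool.

```python
# Sketch: discharging one verification condition with an SMT solver.
# Requires the z3-solver package; the specification below is invented.
from z3 import Int, Solver, Implies, Not, unsat

x, y = Int("x"), Int("y")

precondition = y >= 1     # user-supplied specification for the method
safety_check = y != 0     # condition needed so that  x / y  cannot fault

vc = Implies(precondition, safety_check)   # the verification condition

s = Solver()
s.add(Not(vc))            # search for a counterexample to the VC
print("verified" if s.check() == unsat else "potential division by zero")
```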
https://en.wikipedia.org/wiki/Rubroboletus%20pulcherrimus
Rubroboletus pulcherrimus—known as Boletus pulcherrimus until 2015—is a species of mushroom in the family Boletaceae. It is a large bolete from Western North America with distinguishing features that include a netted surface on the stem, a red to brown cap and stem color, and red pores that stain blue upon injury. Until 2005 this was the only bolete that has been implicated in the death of someone consuming it; a couple developed gastrointestinal symptoms in 1994 after eating this fungus with the husband succumbing. Autopsy revealed infarction of the midgut. Taxonomy American mycologists Harry D. Thiers and Roy E. Halling were aware of confusion on the west coast of North America over red-pored boletes; two species were traditionally recognised—Boletus satanas and Boletus eastwoodiae. However, they strongly suspected the type specimen of the latter species was in fact the former. In reviewing material they published a new name for the taxon, which Thiers had written about in local guidebooks as B. eastwoodiae, as they felt that name to be invalid. Hence in 1976 they formally described Boletus pulcherrimus, from the Latin pulcherrimus, meaning "very pretty". It was transferred to the genus Rubroboletus in 2015 along with several other allied reddish colored, blue-staining bolete species such as B. eastwoodiae and B. satanas. Description Colored various shades of olive- to reddish-brown, the cap may sometimes reach in diameter and is convex in shape before flattening at maturity. The cap surface may be smooth or velvety when young, but may be scaled in older specimens; the margin of the cap is curved inwards in young specimens but rolls out and flattens as it matures. The cap may reach a thickness of when mature. The adnate (attached squarely to the stem) poroid surface is bright red to dark red or red-brown and bruise dark blue or black; there are 2 to 3 pores per mm in young specimens, and in maturity they expand to about 1 or 2 per mm. In cross section, the tu
https://en.wikipedia.org/wiki/Alphonse%20Antonio%20de%20Sarasa
Alphonse Antonio de Sarasa was a Jesuit mathematician who contributed to the understanding of logarithms, particularly as areas under a hyperbola. Alphonse de Sarasa was born in 1618, in Nieuwpoort in Flanders. In 1632 he was admitted as a novice in Ghent. It was there that he worked alongside Gregoire de Saint-Vincent whose ideas he developed, exploited, and promulgated. According to Sommervogel, Alphonse de Sarasa also held academic positions in Antwerp and Brussels. In 1649 Alphonse de Sarasa published Solutio problematis a R.P. Marino Mersenne Minimo propositi. This book was in response to Marin Mersenne's pamphlet "Reflexiones Physico-mathematicae" which reviewed Saint-Vincent's Opus Geometricum and posed this challenge: Given three arbitrary magnitudes, rational or irrational, and given the logarithms of the two, to find the logarithm of the third geometrically. R.P. Burn explains that the term logarithm was used differently in the seventeenth century. Logarithms were any arithmetic progression which corresponded to a geometric progression. Burn says, in reviewing de Sarasa's popularization of de Saint-Vincent, and concurring with Moritz Cantor, that "the relationship between logarithms and the hyperbola was found by Saint-Vincent in all but name". Burn quotes de Sarasa on this point: "…the foundation of the teaching embracing logarithms are contained" in Saint-Vincent's Opus Geometricum, part 4 of Book 6, de Hyperbola. Alphonse Antonio de Sarasa died in Brussels in 1667. Works See also List of Roman Catholic scientist-clerics
https://en.wikipedia.org/wiki/Brand%20page
A brand page (also known as a page or fan page), in online social networking parlance, is a profile on a social networking website which is considered distinct from an actual user profile in that it is created and managed by at least one other registered user as a representation of a non-personal online identity. This feature is most used to represent the brands of organizations associated with, properties owned by, or general interests favored by a user of the hosting network. While also being potentially manageable by more than one registered user, pages are distinguished from groups in that pages are usually designed for the managers to direct messages and posts to subscribing users (akin to a newsletter or blog) and promote a brand, while groups are usually and historically formed for discussion purposes. History Prior to 2007, only a few websites made use of non-personal profile pages. Last.fm, established in 2002, used its music recommendation service to automatically generate "artist pages" which serve as portals for biographies, events and artist-related playlists. This approach, however, is not explicitly controlled by artists or music groups because of the automatic nature of artist pages; pages, for example, could be created from erroneous misspellings and miscredits of works which are accepted as-is by the Audioscrobbler recommendation service used by Last.fm. Furthermore, Last.fm has never advertised itself as a social networking service, despite accruing myriad social features since 2002. The most high-profile usage of this model is Facebook's Pages (formerly known as "Fan Page" until 2010) feature, launched in 2007; one could "be a fan of" a page until April 2010, when the parlance was replaced with "Like". Foursquare, a location-oriented social networking site, launched its "Brands" feature allowing for the creation of specialized brand pages in January 2010 (with Intel being the first user), but they did not become "self-serve" (controllable by i
https://en.wikipedia.org/wiki/Graphical%20path%20method
The Graphical Path Method (GPM) is a mathematically based algorithm used in project management for planning, scheduling and resource control. GPM represents logical relationships of dated objects – such as activities, milestones, and benchmarks – in a time-scaled network diagram. History The Graphical Path Method (GPM) (formerly known as the ‘Graphical Planning Method’) was originally developed from 2004 to 2006 by Dr. Gui Ponce de Leon, current Chairman of the Project Management Institute College of Scheduling (PMICOS), and was first presented at the PMICOS 2008 annual conference. It was created as an alternative to the Critical Path Method (CPM) and was designed as a graphical tool to encourage an engaging, planning-centric experience for project stakeholders. Technique To create a GPM schedule, users draw and place objects – such as activities, milestones, and benchmarks – on a time-scaled canvas. Objects are linked together to establish logical, precedence relationships. These relationships are governed by the Logic Diagramming Method (LDM), a blend of the Arrow Diagramming Method (ADM) and the Precedence Diagramming Method (PDM). In total, LDM permits 12 relationship types to account for all possible dependencies between objects. The resulting web of logically related, dated objects and their relationships forms a network diagram. Object relationships form the backbone of a GPM network. They are used to calculate a number of object attributes, including link gap and object buffer, drift, and float. As objects and their relationships are added to or modified in the schedule, GPM continuously re-calculates and updates gap for all links and float for all dated objects. Link gaps are calculated from the dates of two related activities and floats are algorithmically calculated from gaps. Differences between GPM and CPM The Critical Path Method (CPM) is the traditional mathematical algorithm used for schedule logic computation. GPM utilizes a different algorith
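The arithmetic behind a link gap on a simple finish-to-start relationship can be sketched as follows in Python; the activities, dates and the restriction to finish-to-start links are illustrative assumptions, and a real GPM engine handles all 12 LDM relationship types and derives drift and float from these gaps.

```python
from datetime import date

# Illustrative gap calculation for finish-to-start links on a time-scaled
# network (invented activities and dates; not a full GPM implementation).
activities = {                       # name: (start date, finish date)
    "excavate":   (date(2024, 3, 1),  date(2024, 3, 8)),
    "foundation": (date(2024, 3, 11), date(2024, 3, 20)),
    "framing":    (date(2024, 3, 21), date(2024, 4, 5)),
}
links = [("excavate", "foundation"), ("foundation", "framing")]

for pred, succ in links:
    gap = (activities[succ][0] - activities[pred][1]).days
    print(f"{pred} -> {succ}: gap = {gap} day(s)")
```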
https://en.wikipedia.org/wiki/RIVA%20TNT
The RIVA TNT, codenamed NV4, is a 2D, video, and 3D graphics accelerator chip for PCs that was developed by Nvidia. It was released in March 1998 and cemented Nvidia's reputation as a worthy rival within the developing consumer 3D graphics adapter industry. The first RIVA TNT based card was released on June 15, 1998, by STB Systems: Velocity 4400. RIVA is an acronym for Real-time Interactive Video and Animation accelerator. The "TNT" suffix refers to the chip's ability to work on two texels at once (TwiN Texel). Overview The TNT was designed as a follow-up to the RIVA 128 and a response to 3Dfx's introduction of the Voodoo2. It added a second pixel pipeline, practically doubling rendering speed, and used considerably faster memory. Unlike the Voodoo2 (but like the slower Matrox G200) it also added support for a 32-bit (truecolor) pixel format, 24-bit Z-buffer in 3D mode, an 8-bit stencil buffer and support for 1024×1024 pixel textures. Improved mipmapping and texture filtering techniques, including newly added support for trilinear filtering, dramatically improved quality compared to the TNT's predecessor. TNT also added support for up to 16 MiB of SDR SDRAM. Like RIVA 128, RIVA TNT is a single chip solution. The TNT shipped later than originally planned, ran quite hot, and was clocked lower than Nvidia had planned at 90 MHz instead of 110 MHz. Originally planned specifications should have placed the card ahead of Voodoo2 in theoretical performance for Direct3D applications, but at 90 MHz it did not quite match the Voodoo2. At the time, most games supported 3dfx's proprietary Glide API which gave the Voodoo2 a large advantage in speed and image quality, and some games only used the Glide API for 3D acceleration, leaving TNT users no better off than people who didn't have a 3D accelerator. Even in "OpenGL only" comparisons such as the case in Quake 2, the Voodoo2 had the upper hand as a custom "MiniGL" driver was made specifically for 3dfx cards to run the game (a
https://en.wikipedia.org/wiki/Cophonicity
Given two interacting atoms A and B, the cophonicity of the A-B atomic pair is a measure of the overlap of the A and B contributions to a specific range of vibrational frequencies. In the field of condensed matter physics, cophonicity is a metric aimed at the parametrization of the dynamical interactions in terms of the atomic types forming the A-B pair. In connection with other electronic and structural descriptors, such as the covalency metric or the distortion mode analysis from group theory, the A-B pair cophonicity is a guide to properly select either the A or the B atomic species to tune specific vibrational frequencies of a given system. The cophonicity metric was originally designed for the study of the atomic motions in transition metal dichalcogenides, but its formulation is general and can be applied to any kind of system, irrespective of the chemical composition and stoichiometry. Mathematical formulation Considering the phonon density of states (pDOS) in the first Brillouin zone, the center of mass of the atom-projected pDOS of an atom A is defined as $\bar{\omega}_A = \int_{\omega_1}^{\omega_2} \omega\, g_A(\omega)\, d\omega \,\big/ \int_{\omega_1}^{\omega_2} g_A(\omega)\, d\omega$, where $g_A(\omega)$ is the contribution of the atom A to the total phonon density of states $g(\omega)$ and $[\omega_1, \omega_2]$ is the chosen frequency range; the total density of states of the solid is $g(\omega)$, defined as $g(\omega) = \sum_X g_X(\omega)$ and obtained by summing over all atoms X of the unit cell. The integration interval $[\omega_1, \omega_2]$ is chosen in such a way that it encompasses all the phonon states relevant for the specific study. The integral at the denominator of the definition of $\bar{\omega}_A$ is the contribution of the atom A to the states in the frequency range $[\omega_1, \omega_2]$; we call such quantity the phonicity of the A atom in that specific frequency range. The phonicity of an atom then represents the amount of phonon states that such atom contributes to form; in this respect, it can be regarded as the phonon counterpart of the atomic valence, that is the number of electrons with which an atom participates to form the electronic states of the system, counted as the integral of the atom-projected electronic density of states. Consider a generic A-B atomic pair.
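A minimal numerical sketch of the quantities defined above (Python; the pDOS curves are synthetic, and taking the cophonicity as the difference of the two centers of mass is an assumed convention, since the defining formula for the pair is cut off in this excerpt):

```python
import numpy as np

def pdos_center_of_mass(omega, g_atom, omega_min, omega_max):
    """Center of mass of an atom-projected phonon DOS over [omega_min, omega_max]:
    the frequency-weighted integral divided by the atom's phonicity
    (its integrated contribution to the states in that range)."""
    mask = (omega >= omega_min) & (omega <= omega_max)
    w, g = omega[mask], g_atom[mask]
    phonicity = np.trapz(g, w)
    return np.trapz(w * g, w) / phonicity

# Illustrative (synthetic) pDOS contributions for atoms A and B.
omega = np.linspace(0.0, 10.0, 1000)          # frequency grid, e.g. THz
g_A = np.exp(-((omega - 4.0) / 1.0) ** 2)     # A contributes mostly near 4
g_B = np.exp(-((omega - 6.0) / 1.0) ** 2)     # B contributes mostly near 6

cm_A = pdos_center_of_mass(omega, g_A, 0.0, 10.0)
cm_B = pdos_center_of_mass(omega, g_B, 0.0, 10.0)

# Assumed convention: cophonicity as the difference of the two centers of
# mass; values near zero indicate strongly overlapping contributions.
print(cm_A, cm_B, cm_A - cm_B)
```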
https://en.wikipedia.org/wiki/Mycobacterium%20virus%20Packman
Mycobacterium virus Packman is a bacteriophage known to infect bacterial species of the genus Mycobacterium. It is named after the famed arcade game character Pac-Man, from the game of the same name.
https://en.wikipedia.org/wiki/Morchella%20laurentiana
Morchella laurentiana is a species of fungus in the family Morchellaceae described as new to science in 2016. It is known only from the Saint Lawrence River basin in the Canadian province of Newfoundland and Labrador. It is in the Morchella elata clade.
https://en.wikipedia.org/wiki/Bacterial%20therapy
Bacterial therapy is the therapeutic use of bacteria to treat diseases. Bacterial therapeutics are living medicines, and may be wild-type bacteria (often in the form of probiotics) or bacteria that have been genetically engineered to possess therapeutic properties and are then introduced into a patient. Other examples of living medicines include cellular therapeutics (including immunotherapeutics) and phage therapeutics. Bacterial therapeutics may act as activators of anti-tumor immunity or as delivery vehicles for treatment, diagnosis, or imaging, complementing or synergizing with existing tools and approaches. Development Development of bacterial therapeutics is an extremely active research area in the fields of synthetic biology and microbiology. Currently, there is a large focus on: 1) identifying bacteria that naturally produce therapeutic effects (for example, probiotic bacteria), and 2) genetically programming bacteria to produce therapeutic effects. Design Several aspects require consideration during the design of an engineered bacterial therapeutic. The selection of a chassis organism can be guided by the desired site of activity and pharmacokinetic properties of the chassis, as well as manufacturing feasibility. The design of genetic circuits may also be influenced by the circuit's effectors, pragmatic concerns regarding inducer compounds, and the genetic stability of regulatory circuits. Critically, the design of an engineered bacterial drug may also be constrained by considerations for the needs of patients. Optimal strain design often requires a balance between strain suitability for function in the target microenvironment and concerns for feasibility of manufacturing and clinical development. The development workflow should incorporate technologies for optimizing strain potency, predictive in vitro and in vivo assays, and quantitative pharmacology models, to maximize translational potential for patient populations. A
https://en.wikipedia.org/wiki/Dimitris%20Koukoulopoulos
Dimitris Koukoulopoulos (born 1984) is a Greek mathematician working in analytic number theory. He is a professor at the University of Montreal. In 2019, in joint work with James Maynard, he proved the Duffin–Schaeffer conjecture. He was an invited speaker at the 2022 International Congress of Mathematicians. Publications
https://en.wikipedia.org/wiki/Myogenin
Myogenin is a transcriptional activator encoded by the MYOG gene. Myogenin is a muscle-specific basic helix-loop-helix (bHLH) transcription factor involved in the coordination of skeletal muscle development (myogenesis) and repair. Myogenin is a member of the MyoD family of transcription factors, which also includes MyoD, Myf5, and MRF4. In mice, myogenin is essential for the development of functional skeletal muscle. Myogenin is required for the proper differentiation of most myogenic precursor cells during the process of myogenesis. When the DNA coding for myogenin was knocked out of the mouse genome, severe skeletal muscle defects were observed. Mice lacking both copies of myogenin (homozygous-null) suffer from perinatal lethality due to the lack of mature secondary skeletal muscle fibers throughout the body. In cell culture, myogenin can induce myogenesis in a variety of non-muscle cell types. Interactions Myogenin has been shown to interact with: MDFI, POLR2C, serum response factor, Sp1 transcription factor, and TCF3.
https://en.wikipedia.org/wiki/Korean%20wind%20chime
The Korean Wind Chime is a variety of bell traditionally hung from the exterior corners of Korean Buddhist temples, functioning as a wind chime. The bell's clapper is often in the shape of a fish, an auspicious sign in Buddhism. An elaborate gilt-bronze style of Korean wind chime with a dragon's-head finial became a distinctive type of object in later Silla / early Goryeo art. Hung from the eaves and rung by the wind, it serves to awaken practitioners of Buddhism to the external world. See also Wooden fish
https://en.wikipedia.org/wiki/Worldpay%20Group
Worldpay Group plc (formerly RBS WorldPay) was a payment processing company. It was formerly listed on the London Stock Exchange until 16 January 2018 when it was acquired by Vantiv. The combined company then took the name Worldpay, Inc. Worldpay, Inc. was acquired by FIS in July 2019 for $43 billion. History Early history WorldPay started as an online multi-currency payment system in 1997. The founder Nick Ogden partnered with National Westminster Bank to provide the financial systems and Andrew Birch of Symbiant to provide the end user payment gateway. When Royal Bank of Scotland took over National Westminster Bank, Worldpay was wholly acquired and merged with an electronic payment system called Streamline which was first released by Centre-file ltd, a wholly owned subsidiary of National Westminster Bank, in 1989. Management History RBS Ownership In 1995 the Streamline system was reabsorbed into the bank when the trading name and payroll service of Centre-file ltd were sold to Ceridian. NatWest was acquired in 2002 by Royal Bank of Scotland Group (RBS), which renamed the business RBS WorldPay and appointed Ron Kalifa as CEO. RBS expanded the business significantly by acquiring and merging a number of payment solutions companies from different countries. Over the next five years it was combined with seven leading retail payment solutions brands: Streamline, Streamline International, PaymentTrust, Netherlands-based Bibit, RiskGuardian and US-based Lynk. Divestment from RBS As a condition in the European Commission's clearance in December 2009 of state aid to RBS, Worldpay was to be sold as part of a plan to divest selected businesses from the group. On 6 August 2010, Advent International and Bain Capital agreed to acquire Worldpay for £2.025bn, including a £200m contingent consideration, and appointed Ron Kalifa, who had previously headed up the global transaction services division with RBS, as CEO. The RBS Group retained a 20% stake in the newly indepen
https://en.wikipedia.org/wiki/Phytoecdysteroid
Phytoecdysteroids are plant-derived ecdysteroids. Phytoecdysteroids are a class of chemicals that plants synthesize for defense against phytophagous (plant eating) insects. These compounds are mimics of hormones used by arthropods in the molting process known as ecdysis. When insects eat the plants with these chemicals they may prematurely molt, lose weight, or suffer other metabolic damage and die. Chemically, phytoecdysteroids are classed as triterpenoids, the group of compounds that includes triterpene saponins, phytosterols, and phytoecdysteroids. Plants, but not animals, synthesize phytoecdysteroids from mevalonic acid in the mevalonate pathway of the plant cell using acetyl-CoA as a precursor. Over 250 ecdysteroid analogs have been identified so far in plants, and it has been theorized that there are over 1,000 possible structures which might occur in nature. Many more plants have the ability to "turn on" the production of phytoecdysteroids when under stress, animal attack or other conditions. The term phytoecdysteroid can also apply to ecdysteroids found in fungi, even though fungi are not plants. Some plants or fungi that produce phytoecdysteroids include Achyranthes bidentata, Tinospora cordifolia, Pfaffia paniculata, Leuzea carthamoides, Rhaponticum uniflorum, Serratula coronata, Cordyceps, and Asparagus. See also Plant defense against herbivory
https://en.wikipedia.org/wiki/Primordial%20sandwich
The concept of the primordial sandwich was proposed by the chemist Günter Wächtershäuser to describe the possible origins of the first cell membranes, and, therefore, the first cell. According to the two main models of abiogenesis, RNA world and iron-sulfur world, prebiotic processes existed before the development of the cell membrane. The difficulty with this idea, however, is that it is almost impossible to create a complex molecule such as RNA (or even its molecular precursor, pre-RNA) directly from simple organic molecules dissolved in a global ocean (Joyce, 1991), because without some mechanism to concentrate these organic molecules, they would be too dilute to generate the necessary chemical reactions to transform them from simple organic molecules into genuine prebiotic molecules. To address this problem, Wächtershäuser proposed that concentration might occur by adsorption onto the surfaces of minerals. With the accumulation of enough amphipathic molecules (such as phospholipids), a bilayer will self-organize, and any molecules caught inside will become the contents of a liposome, and would be concentrated enough to allow chemical reactions to transform organic molecules into prebiotic molecules. Although developed for his own iron-sulfur world model, the idea of the primordial sandwich has also been adopted by some adherents of the RNA world model. See also Primordial sea Primordial soup
https://en.wikipedia.org/wiki/Disc%20permeameter
The disc permeameter is a field instrument used for measuring water infiltration into the soil, from which in situ saturated and unsaturated soil hydraulic properties can be characterized. It is mainly used to provide estimates of the hydraulic conductivity of the soil near saturation. History Conventional techniques for measuring in-situ infiltration include the use of a single or double ring infiltrometer. Single and double ring infiltrometers only measure flow under ponded (saturated) conditions, and when used in soil with distinct macropores, preferential flow will dominate the flow (see: Poiseuille's law). This does not reflect infiltration under rainfall or sprinkler irrigation. Therefore, many authors attempted to create a negative potential (tension) on the water flow. This excludes macropores from the flow process, so that only the soil matrix flow is measured. Willard Gardner and Walter Gardner developed a negative head permeameter as early as 1939. Dixon (1975) developed a closed-top ring infiltrometer to quantify macropores. Water is applied to a closed-top system, which permits the imposition of a negative head or pressure on the ponded water surface. Negative tension can be considered as simulating a positive soil air pressure, created by a negative air pressure above the ponded surface water. A simplification was made by Topp and Zebchuk (1985). The limitation of this device is that infiltration has to be started by ponding the closed-top infiltrometer (applying a positive head), which is then adjusted to a negative pressure. Little research effort continued in this area; instead, attention was given mainly to the sorptivity apparatus of Dirksen (1975), which used a ceramic plate as a base. Based on this design, Brent Clothier and Ian White (1981) developed the sorptivity tube, which can provide a constant negative potential (tension) on the soil surface. However, the sorptivity tube had many shortcomings, hence modifications to the design led to the development of th
https://en.wikipedia.org/wiki/Fixation%20%28population%20genetics%29
In population genetics, fixation is the change in a gene pool from a situation where at least two variants of a particular gene (alleles) exist in a given population to a situation where only one of the alleles remains. That is, the allele becomes fixed. In the absence of mutation or heterozygote advantage, any allele must eventually either be lost completely from the population or fixed (permanently established at 100% frequency in the population). Whether a gene will ultimately be lost or fixed is dependent on selection coefficients and chance fluctuations in allelic proportions. Fixation can refer to a gene in general or to a particular nucleotide position in the DNA chain (locus). In the process of substitution, a previously non-existent allele arises by mutation and undergoes fixation by spreading through the population by random genetic drift or positive selection. Once the frequency of the allele is at 100%, i.e. it is the only gene variant present in any member, it is said to be "fixed" in the population. Similarly, genetic differences between taxa are said to have been fixed in each species. History The earliest mention of gene fixation in published works was found in Motoo Kimura's 1962 paper "On Probability of Fixation of Mutant Genes in a Population". In the paper, Kimura uses mathematical techniques to determine the probability of fixation of mutant genes in a population. He showed that the probability of fixation depends on the initial frequency of the allele and the mean and variance of the gene frequency change per generation. Probability Under conditions of genetic drift alone, every finite set of genes or alleles has a "coalescent point" at which all descendants converge to a single ancestor (i.e. they 'coalesce'). This fact can be used to derive the rate of gene fixation of a neutral allele (that is, one not under any form of selection) for a population of varying size (provided that it is finite and nonzero). Because the effect of natural selecti
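A short sketch of the diffusion result mentioned above (Python; this is one common form of Kimura's formula, assuming additive (genic) selection with coefficient s and effective diploid population size N, so the exact parametrization may differ from other treatments):

```python
import math

def fixation_probability(p, N, s):
    """Diffusion approximation for the probability that an allele at initial
    frequency p fixes in a diploid population of effective size N under an
    additive selection coefficient s. The neutral limit (s -> 0) is simply p."""
    if abs(s) < 1e-12:
        return p
    return (1.0 - math.exp(-4.0 * N * s * p)) / (1.0 - math.exp(-4.0 * N * s))

N = 1000
p_new_mutant = 1.0 / (2 * N)        # a single new copy among 2N gene copies
print(fixation_probability(p_new_mutant, N, 0.0))    # neutral: 1/(2N) = 0.0005
print(fixation_probability(p_new_mutant, N, 0.01))   # beneficial: roughly 2s ~ 0.02
```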
https://en.wikipedia.org/wiki/Differentiable%20stack
A differentiable stack is the analogue in differential geometry of an algebraic stack in algebraic geometry. It can be described either as a stack over differentiable manifolds which admits an atlas, or as a Lie groupoid up to Morita equivalence. Differentiable stacks are particularly useful to handle spaces with singularities (i.e. orbifolds, leaf spaces, quotients), which appear naturally in differential geometry but are not differentiable manifolds. For instance, differentiable stacks have applications in foliation theory, Poisson geometry and twisted K-theory. Definition Definition 1 (via groupoid fibrations) Recall that a category fibred in groupoids (also called a groupoid fibration) consists of a category $\mathcal{X}$ together with a functor $\pi: \mathcal{X} \to \mathrm{Man}$ to the category $\mathrm{Man}$ of differentiable manifolds such that $\mathcal{X}$ is a fibred category, i.e. for any object $x$ of $\mathcal{X}$ and any arrow $V \to \pi(x)$ of $\mathrm{Man}$ there is an arrow $y \to x$ lying over $V \to \pi(x)$; for every commutative triangle $W \to V \to \pi(x)$ in $\mathrm{Man}$ and every arrows $z \to x$ over $W \to \pi(x)$ and $y \to x$ over $V \to \pi(x)$, there exists a unique arrow $z \to y$ over $W \to V$ making the triangle commute. These properties ensure that, for every object $U$ in $\mathrm{Man}$, one can define its fibre, denoted by $\mathcal{X}_U$ or $\mathcal{X}(U)$, as the subcategory of $\mathcal{X}$ made up by all objects of $\mathcal{X}$ lying over $U$ and all morphisms of $\mathcal{X}$ lying over $\mathrm{id}_U$. By construction, $\mathcal{X}(U)$ is a groupoid, thus explaining the name. A stack is a groupoid fibration satisfying further glueing properties, expressed in terms of descent. Any manifold $M$ defines its slice category $\underline{M} = \mathrm{Man}/M$, whose objects are pairs $(U, f)$ of a manifold $U$ and a smooth map $f: U \to M$; then $\underline{M}$ is a groupoid fibration which is actually also a stack. A morphism $\mathcal{X} \to \mathcal{Y}$ of groupoid fibrations is called a representable submersion if for every manifold $U$ and any morphism $\underline{U} \to \mathcal{Y}$, the fibred product $\mathcal{X} \times_{\mathcal{Y}} \underline{U}$ is representable, i.e. it is isomorphic to $\underline{V}$ (for some manifold $V$) as groupoid fibrations, and the induced smooth map $V \to U$ is a submersion. A differentiable stack is a stack $\mathcal{X}$ together with a special kind of representable submersion $\underline{X} \to \mathcal{X}$ (every submersion $V \to U$ described above is asked to be surjective), for some mani
https://en.wikipedia.org/wiki/Spigot%20algorithm
A spigot algorithm is an algorithm for computing the value of a transcendental number (such as π or e) that generates the digits of the number sequentially from left to right, providing increasing precision as the algorithm proceeds. Spigot algorithms also aim to minimize the amount of intermediate storage required. The name comes from the sense of the word "spigot" for a tap or valve controlling the flow of a liquid. Spigot algorithms can be contrasted with algorithms that store and process complete numbers to produce successively more accurate approximations to the desired transcendental. Interest in spigot algorithms was spurred in the early days of computational mathematics by extreme constraints on memory, and such an algorithm for calculating the digits of e appeared in a paper by Sale in 1968. In 1970, Abdali presented a more general algorithm to compute the sums of series in which the ratios of successive terms can be expressed as quotients of integer functions of term positions. This algorithm is applicable to many familiar series for trigonometric functions, logarithms, and transcendental numbers because these series satisfy the above condition. The name "spigot algorithm" seems to have been coined by Stanley Rabinowitz and Stan Wagon, whose algorithm for calculating the digits of π is sometimes referred to as "the spigot algorithm for π". The spigot algorithm of Rabinowitz and Wagon is bounded, in the sense that the number of terms of the infinite series that will be processed must be specified in advance. The term "streaming algorithm" indicates an approach without this restriction. This allows the calculation to run indefinitely, varying the amount of intermediate storage as the calculation progresses. A variant of the spigot approach uses an algorithm which can be used to compute a single arbitrary digit of the transcendental without computing the preceding digits: an example is the Bailey–Borwein–Plouffe formula, a digit extraction algorithm for whi
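A minimal bounded spigot in the spirit described above, computing decimal digits of e from its mixed-radix representation in the base (1/2!, 1/3!, ...) (an illustrative Python sketch, not a transcription of Sale's or Rabinowitz and Wagon's published code):

```python
def e_spigot(n_digits):
    """Bounded spigot for the decimal digits of e after the decimal point.
    The fractional part of e has every 'digit' equal to 1 in the mixed-radix
    base (1/2!, 1/3!, ...); repeatedly multiplying that representation by 10
    and extracting the integer part yields successive decimal digits."""
    n = n_digits + 10                 # a few guard terms beyond the digits wanted
    a = [1] * (n + 1)                 # a[2..n] are the mixed-radix digits
    out = []
    for _ in range(n_digits):
        carry = 0
        for i in range(n, 1, -1):     # multiply the whole fraction by 10,
            carry, a[i] = divmod(10 * a[i] + carry, i)   # carrying leftwards
        out.append(str(carry))        # the overflow past the 1/1! place is the digit
    return "2." + "".join(out)

print(e_spigot(20))   # 2.71828182845904523536
```

Only a fixed-size integer array is kept in memory, and digits appear left to right, which is the defining behaviour of the bounded spigot approach.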
https://en.wikipedia.org/wiki/Neuropeptidergic
Neuropeptidergic means "related to neuropeptides". A neuropeptidergic agent (or drug) is a chemical which functions to directly modulate the neuropeptide systems in the body or brain. An example is opioidergics. See also Adenosinergic Cannabinoidergic Cholinergic GABAergic Glutamatergic Glycinergic Histaminergic Monoaminergic Opioidergic
https://en.wikipedia.org/wiki/Anticaking%20agent
An anticaking agent is an additive placed in powdered or granulated materials, such as table salt or confectioneries, to prevent the formation of lumps (caking) and for easing packaging, transport, flowability, and consumption. Caking mechanisms depend on the nature of the material. Crystalline solids often cake by formation of liquid bridge and subsequent fusion of microcrystals. Amorphous materials can cake by glass transitions and changes in viscosity. Polymorphic phase transitions can also induce caking. Some anticaking agents function by absorbing excess moisture or by coating particles and making them water-repellent. Calcium silicate (CaSiO3), a commonly used anti-caking agent, added to e.g. table salt, absorbs both water and oil. Anticaking agents are also used in non-food items such as road salt, fertilisers, cosmetics, and detergents. Some studies suggest that anticaking agents may have a negative effect on the nutritional content of food; one such study indicated that most anti-caking agents result in the additional degradation of vitamin C added to food. Examples An anticaking agent in salt is denoted in the ingredients, for example, as "anti-caking agent (554)", which is sodium aluminosilicate. This product is present in many commercial table salts as well as dried milk, egg mixes, sugar products, flours and spices. In Europe, sodium ferrocyanide (535) and potassium ferrocyanide (536) are more common anticaking agents in table salt. "Natural" anticaking agents used in more expensive table salt include calcium carbonate and magnesium carbonate. Diatomaceous earth, mostly consisting of silicon dioxide (SiO2), may also be used as an anticaking agent in animal foods, typically mixed at 2% rate of a product dry weight. List of anticaking agents The most widely used anticaking agents include the stearates of calcium and magnesium, silica and various silicates, talc, as well as flour and starch. Ferrocyanides are used for table salt. The following a
https://en.wikipedia.org/wiki/History%20of%20combinatorics
The mathematical field of combinatorics was studied to varying degrees in numerous ancient societies. Its study in Europe dates to the work of Leonardo Fibonacci in the 13th century AD, which introduced Arabian and Indian ideas to the continent. It has continued to be studied in the modern era. Earliest records The earliest recorded use of combinatorial techniques comes from problem 79 of the Rhind papyrus, which dates to the 16th century BCE. The problem concerns a certain geometric series, and has similarities to Fibonacci's problem of counting the number of compositions of 1s and 2s that sum to a given total. In Greece, Plutarch wrote that Xenocrates of Chalcedon (396–314 BC) discovered the number of different syllables possible in the Greek language. This would have been the first attempt on record to solve a difficult problem in permutations and combinations. The claim, however, is implausible: this is one of the few mentions of combinatorics in Greece, and the number they found, 1.002 × 10^12, seems too round to be more than a guess. Later, an argument between Chrysippus (3rd century BCE) and Hipparchus (2nd century BCE) over a rather delicate enumerative problem, which was later shown to be related to Schröder–Hipparchus numbers, is mentioned. There is also evidence that in the Ostomachion, Archimedes (3rd century BCE) considered the configurations of a tiling puzzle, while some combinatorial interests may have been present in lost works of Apollonius. In India, the Bhagavati Sutra had the first mention of a combinatorics problem; the problem asked how many combinations of tastes were possible when selecting tastes in ones, twos, threes, etc. from a selection of six different tastes (sweet, pungent, astringent, sour, salt, and bitter). The Bhagavati is also the first text to mention the choose function. In the second century BC, Pingala included an enumeration problem in the Chanda Sutra (also Chandahsutra) which asked how many ways a six-sylla
https://en.wikipedia.org/wiki/Edinburgh%20Compatible%20Context%20Editor
ECCE (the Edinburgh Compatible Context Editor) is a text editor for computing systems and operating environments that support a command line interface. Its command set is original, logical and regular. It was written in the 1960s by Hamish Dewar, an experienced compiler writer, who used this skill to design a command set which could be easily parsed and coded so that complex commands could be built up, a technique similar to threaded code in the Forth environment. The current ECCE release is licensed under the BSD License, recoded into C and released by Graham Toal in 2007. History Hamish Dewar in the early 1960s recognised a need for a more powerful text editor. At the time editing files was laborious, as editors could only load into memory one code line at a time and insert, delete or replace only the whole line. Because of memory limitations (a large computer might have between 8k and 32k of memory) few editors could execute repeated commands or support macros for text processing. Dewar used his talent as a compiler author to give ECCE a much more capable command set while retaining a small footprint. From the start ECCE would endeavour to buffer as much of the file as memory allowed, while earlier editors could only buffer one line of the file at a time. ECCE became the default text editor for computers at the University of Edinburgh and remained almost unchanged for a period of almost 25 years. The editor's survival is attributed to the fact that thousands of undergraduates and postgraduates used the tool in their higher education; wherever in the world they settled, the benefits of ECCE were promoted and local implementations were created from Hamish Dewar's source code. ECCE became one of the most popular and well respected text editors of the 1970s. ECCE was originally written in Imp, a language created at the University of Edinburgh; the second implementation was coded in PDP-8 assembler and was ported to numerous other platforms. Source
https://en.wikipedia.org/wiki/Tensor%20density
In differential geometry, a tensor density or relative tensor is a generalization of the tensor field concept. A tensor density transforms as a tensor field when passing from one coordinate system to another (see tensor field), except that it is additionally multiplied or weighted by a power W of the Jacobian determinant of the coordinate transition function or its absolute value. A tensor density with a single index is called a vector density. A distinction is made among (authentic) tensor densities, pseudotensor densities, even tensor densities and odd tensor densities. Sometimes tensor densities with a negative weight W are called tensor capacity. A tensor density can also be regarded as a section of the tensor product of a tensor bundle with a density bundle. Motivation In physics and related fields, it is often useful to work with the components of an algebraic object rather than the object itself. An example would be decomposing a vector into a sum of basis vectors weighted by some coefficients such as where is a vector in 3-dimensional Euclidean space, are the usual standard basis vectors in Euclidean space. This is usually necessary for computational purposes, and can often be insightful when algebraic objects represent complex abstractions but their components have concrete interpretations. However, with this identification, one has to be careful to track changes of the underlying basis in which the quantity is expanded; it may in the course of a computation become expedient to change the basis while the vector remains fixed in physical space. More generally, if an algebraic object represents a geometric object, but is expressed in terms of a particular basis, then it is necessary to, when the basis is changed, also change the representation. Physicists will often call this representation of a geometric object a tensor if it transforms under a sequence of linear maps given a linear change of basis (although confusingly others call the underlyi
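As an illustration of the verbal definition above (shown here for a mixed rank-(1,1) density; sign and absolute-value conventions for the weight W vary between authors, so this is one common form rather than the only one), a tensor density of weight W transforms under a coordinate change $x \to \tilde x$ as

$$
\tilde T^{\mu}{}_{\nu}
= \left[\det\!\left(\frac{\partial x^{\alpha}}{\partial \tilde x^{\beta}}\right)\right]^{W}
\frac{\partial \tilde x^{\mu}}{\partial x^{\rho}}\,
\frac{\partial x^{\sigma}}{\partial \tilde x^{\nu}}\, T^{\rho}{}_{\sigma},
$$

so that W = 0 recovers the ordinary tensor transformation law, while the "even" variants mentioned above use the absolute value of the determinant instead of the determinant itself.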
https://en.wikipedia.org/wiki/Moens%E2%80%93Korteweg%20equation
In biomechanics, the Moens–Korteweg equation models the relationship between wave speed or pulse wave velocity (PWV) and the incremental elastic modulus of the arterial wall or its distensibility. The equation was derived independently by Adriaan Isebree Moens and Diederik Korteweg. It is derived from Newton's second law of motion, using some simplifying assumptions, and reads: $\mathrm{PWV} = \sqrt{\dfrac{E_{\mathrm{inc}}\, h}{2\, r\, \rho}}$. The Moens–Korteweg equation states that PWV is proportional to the square root of the incremental elastic modulus, $E_{\mathrm{inc}}$, of the vessel wall, given a constant ratio of wall thickness, h, to vessel radius, r, and blood density, ρ, assuming that the artery wall is isotropic and experiences isovolumetric change with pulse pressure.
https://en.wikipedia.org/wiki/Quantum%20dot
Quantum dots (QDs), also called semiconductor nanocrystals, are semiconductor particles a few nanometres in size, having optical and electronic properties that differ from those of larger particles as a result of quantum mechanical effects. They are a central topic in nanotechnology and materials science. When the quantum dots are illuminated by UV light, an electron in the quantum dot can be excited to a state of higher energy. In the case of a semiconducting quantum dot, this process corresponds to the transition of an electron from the valence band to the conduction band. The excited electron can drop back into the valence band, releasing its energy as light. This light emission is known as photoluminescence. The color of that light depends on the energy difference between the conduction band and the valence band, or the transition between discrete energy states when the band structure is no longer well-defined in QDs. Nanoscale semiconductor materials tightly confine either electrons or electron holes. The confinement is similar to a three-dimensional particle in a box model. The quantum dot absorption and emission features correspond to transitions between discrete quantum mechanically allowed energy levels in the box that are reminiscent of atomic spectra. For these reasons, quantum dots are sometimes referred to as artificial atoms, emphasizing their bound and discrete electronic states, like naturally occurring atoms or molecules. It was shown that the electronic wave functions in quantum dots resemble the ones in real atoms. By coupling two or more such quantum dots, an artificial molecule can be made, exhibiting hybridization even at room temperature. Precise assembly of quantum dots can form superlattices that act as artificial solid-state materials that exhibit unique optical and electronic properties. Quantum dots have properties intermediate between bulk semiconductors and discrete atoms or molecules. Their optoelectr
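A textbook estimate of this confinement effect (an idealized sketch assuming an infinite cubic potential well of side L and a carrier of effective mass m, rather than a model of any specific quantum dot) gives the allowed energies

$$
E_{n_x, n_y, n_z} = \frac{\hbar^{2}\pi^{2}}{2 m L^{2}}\left(n_x^{2} + n_y^{2} + n_z^{2}\right),
\qquad n_x, n_y, n_z = 1, 2, 3, \dots
$$

Because the level spacing scales as 1/L², shrinking the dot widens the effective gap and shifts absorption and emission toward higher energies (bluer light), which is the size dependence described above.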
https://en.wikipedia.org/wiki/B1%20domain
In molecular biology, the protein domain b1 refers to the domain b1 of protein L. Protein L is a bacterial protein with immunoglobulin (Ig) light chain-binding properties. It contains a number of homologous b1 repeats towards the N terminus. These repeats have been found to be responsible for the interaction of protein L with Ig light chains. The N-terminal domain contains five homologous B1 repeats of 72–76 amino acids each.
https://en.wikipedia.org/wiki/Raman%20microscope
The Raman microscope is a laser-based microscopic device used to perform Raman spectroscopy. The term MOLE (molecular optics laser examiner) is used to refer to the Raman-based microprobe. The technique used is named after C. V. Raman, who discovered the scattering properties in liquids. Configuration The Raman microscope begins with a standard optical microscope, and adds an excitation laser, laser rejection filters, a spectrometer or monochromator, and a sensitive optical detector such as a charge-coupled device (CCD) or photomultiplier tube (PMT). Traditionally, Raman microscopy was used to measure the Raman spectrum of a point on a sample; more recently the technique has been extended to implement Raman spectroscopy for direct chemical imaging over the whole field of view on a 3D sample. Imaging modes In direct imaging, the whole field of view is examined for scattering over a small range of wavenumbers (Raman shifts). For instance, a wavenumber characteristic for cholesterol could be used to record the distribution of cholesterol within a cell culture. The other approach is hyperspectral imaging or chemical imaging, in which thousands of Raman spectra are acquired from all over the field of view. The data can then be used to generate images showing the location and amount of different components. Taking the cell culture example, a hyperspectral image could show the distribution of cholesterol, as well as proteins, nucleic acids, and fatty acids. Sophisticated signal- and image-processing techniques can be used to ignore the presence of water, culture media, buffers, and other interference. Resolution Raman microscopy, and in particular confocal microscopy, can reach down to sub-micrometer lateral spatial resolution. Because a Raman microscope is a diffraction-limited system, its spatial resolution depends on the wavelength of light and the numerical aperture of the focusing element. In confocal Raman microscopy, the diameter of the confocal aperture is an
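For a diffraction-limited system, the lateral resolution is commonly estimated with the Rayleigh criterion (a standard optics estimate, not a specification of any particular instrument):

$$
r_{\text{lateral}} \approx \frac{0.61\,\lambda}{\mathrm{NA}}
$$

For example, a 532 nm excitation laser focused through a 0.9 NA objective gives $r \approx 0.61 \times 532\ \text{nm} / 0.9 \approx 360\ \text{nm}$, consistent with the sub-micrometer figures quoted above.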
https://en.wikipedia.org/wiki/Axiom%20of%20empty%20set
In axiomatic set theory, the axiom of empty set is a statement that asserts the existence of a set with no elements. It is an axiom of Kripke–Platek set theory and the variant of general set theory that Burgess (2005) calls "ST," and a demonstrable truth in Zermelo set theory and Zermelo–Fraenkel set theory, with or without the axiom of choice. Formal statement In the formal language of the Zermelo–Fraenkel axioms, the axiom reads: $\exists x\, \forall y\, \lnot(y \in x)$, or in words: There is a set such that no element is a member of it. Interpretation We can use the axiom of extensionality to show that there is only one empty set. Since it is unique we can name it. It is called the empty set (denoted by { } or ∅). The axiom, stated in natural language, is in essence: An empty set exists. This formula is a theorem and considered true in every version of set theory. The only controversy is over how it should be justified: by making it an axiom; by deriving it from a set-existence axiom (or logic) and the axiom of separation; by deriving it from the axiom of infinity; or some other method. In some formulations of ZF, the axiom of empty set is actually repeated in the axiom of infinity. However, there are other formulations of that axiom that do not presuppose the existence of an empty set. The ZF axioms can also be written using a constant symbol representing the empty set; then the axiom of infinity uses this symbol without requiring it to be empty, while the axiom of empty set is needed to state that it is in fact empty. Furthermore, one sometimes considers set theories in which there are no infinite sets, and then the axiom of empty set may still be required. However, any axiom of set theory or logic that implies the existence of any set will imply the existence of the empty set, if one has the axiom schema of separation. This is true, since the empty set is a subset of any set consisting of those elements that satisfy a contradictory formula. In many formulations of first-order predicate logic,
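As a one-line illustration of the separation argument sketched above: given any set $A$ whatsoever, the axiom schema of separation yields the set

$$
\varnothing = \{\, x \in A : x \neq x \,\},
$$

which has no elements because no $x$ satisfies $x \neq x$; extensionality then guarantees that this is the unique empty set.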
https://en.wikipedia.org/wiki/Video%20game%20localization
Video game localization (American English), or video game localisation (British English; see spelling differences), is the process of preparing a video game for a market outside of where it was originally published. The game's name, art assets, packaging, manuals, and cultural and legal differences are typically altered. Before localization, producers consider economic factors such as potential foreign profit. Most official localizations are done by the game's developers or a third-party translation company. Nevertheless, fan localizations are also popular. Localization is largely inconsistent between platforms, engines and companies because the practice is relatively recent. Localizers intend to create an experience like that of the original game, exercising discretion for the localization audience. A localization is considered to have failed if it is confusing or difficult to understand, as this may break the player's immersion. History Since the beginning of video game history, video games have been localized. One of the first widely popular video games, Pac-Man, was localized from Japanese. The original transliteration of the Japanese title would be "Puck-Man," but the decision was made to change the name when the game was imported to the United States out of fear that the word 'Puck' would be vandalized into an obscenity. In addition, the names of the ghosts were originally based on colors - roughly translating to "Reddie," "Pinky," "Bluey," and "Slowly." Rather than translate these names exactly, they were renamed to Blinky, Pinky, Inky, and Clyde. This choice maintained the odd-man-out style of the original names without adhering to their exact meaning. This is an early example of a change in cultural context. Early localization had one main concern. Due to the small memory size of the NES and SNES cartridges, many translated text strings were too long. Ted Woolsey, translator of Final Fantasy VI, recounts having to continually cut down the English text due to limited capacity. Early video gam
https://en.wikipedia.org/wiki/Department%20of%20Chemical%20Engineering%20and%20Biotechnology%2C%20University%20of%20Cambridge
The Department of Chemical Engineering and Biotechnology (CEB) is one of the teaching and research departments at the University of Cambridge. The department trains undergraduate students and conducts original research at the interfaces between engineering, chemistry, biology and physics. It conducts research in collaboration with industrial partners. Its research programmes encompass sustainable reaction engineering, chemical product and process design, healthcare, measurement, and materials science. It conducts biotechnology research with chemical engineering at the science-engineering interface. Notable staff Notable staff include Sabine Bahn, John Bridgwater, Silvana Cardoso, Howard Chase, Lynn Gladden, Elizabeth A. H. Hall, Ross D. King and Róisín Owens. Heads of department These include heads of the former Department of Chemical Engineering and Institute of Biotechnology, which merged to form the current department: Lisa Hall, John Dennis, Lynn Gladden, John Davidson and Peter Danckwerts. Alumni Former staff include Francis Thomas Bacon. History In 1945, the university received an endowment from Shell for a chemical engineering department and chair. The first Shell Professor was Terence Fox, appointed in 1946. The undergraduate Tripos course began in 1948. Peter Danckwerts was head of department from 1959 to 1975 and then John Davidson became Shell Professor and Head of department in 1975. He held the post until 1993 when he retired. In 2008, the Department of Chemical Engineering merged with the Institute of Biotechnology to become the Department of Chemical Engineering and Biotechnology. Until 2017, the department's main centre of activity was the Shell building on Pembroke Street on the New Museums Site, to the south of Cambridge city centre. In 2017, the department moved over to a new building on Philippa Fawcett Drive on the West Cambridge site. The building was officially opened by the chancellor of the university, David Sainsbury, Baron Sainsbury
https://en.wikipedia.org/wiki/Cannabis%20in%20South%20Africa
Cannabis in South Africa has been decriminalized for personal adult consumption in private by the Constitutional Court of South Africa. However, laws still prohibit its use outside of one's private dwelling as well as the buying and selling of cannabis. Regulations prohibiting the purchase of cannabis-containing products remain in effect, raising questions about the enforceability of the ruling. Prior to the lifting of the prohibition of cannabis in 2018, advocates pressured the government to amend laws restricting cannabis that were first established in 1922 to allow exceptions for medical use, religious practices, and other purposes. The Afrikaans term commonly used to refer to cannabis is dagga (), derived from the Khoikhoi word , which was adopted by early European colonial settlers in the Dutch Cape Colony. Cannabis is believed to have been introduced to Africa by early Arab or Indian traders. It was already in widespread use among South Africa's indigenous Khoisan and Bantu peoples before European settlement in the Cape in 1652. Additionally, it was traditionally utilized by Basotho communities to facilitate childbirth. According to author Hazel Crampton, historical Afrikaner recipes for teas and foods incorporated the use of cannabis. The plant's use was primarily associated with traditional African populations and individuals of lower economic status. Long-term research conducted by the Medical Research Council (MRC) indicates that the number of cannabis users in South Africa was 2.2 million in 2004, which increased to 3.2 million by 2008. In 2003, Interpol ranked South Africa as the world's fourth-largest cannabis producer, and the Institute for Security Studies reported that a significant portion of cannabis seized in the United Kingdom, and a third seized globally, originated from South Africa. History The first written record of the plant in South Africa is by Jan van Riebeeck, who ordered officers of the Voorman to purchase "daccha" in Natal for tra
https://en.wikipedia.org/wiki/European%20Middleware%20Initiative
The European Middleware Initiative (EMI) is a computer software platform for high performance distributed computing. It is developed and distributed directly by the EMI project. It is the base for other grid middleware distributions used by scientific research communities and distributed computing infrastructures all over the world especially in Europe, South America and Asia. EMI supports broad scientific experiments and initiatives, such as the Worldwide LHC Computing Grid (for the Large Hadron Collider). The EMI middleware is a cooperation among three general purpose grid platforms, the Advanced Resource Connector, gLite and UNICORE and the dCache storage software. Purpose The purpose of the EMI distribution is to consolidate, harmonize and support the original software platforms, evolve and extend them. Redundant or duplicate services resulting from the merging are deprecated, in favour of new services added to satisfy user requirements or specific consolidation needs, standardizing and developing common interfaces. These include the adoption of a common structure for accounting, resource information exchange or authentication and authorization. Input for the development activities is taken from users, infrastructures projects, standardization initiatives or changing technological innovations. The software products will be adapted as necessary to comply with standard open-source guidelines to facilitate the integration in mainstream operating system distributions. Collaborations A cooperation with FutureGrid, a US distributed testbed for Clouds, Grids and high-performance computing, was announced in December 2011. In January 2012, the EMI project formalized a partnership with the iMarine project to create an open data e-infrastructure for fisheries management and marine conservation. Users By 2008 the EMI software distribution provided most of the middleware components which support the execution and completion of the millions of computational jobs handled
https://en.wikipedia.org/wiki/Ram%20Prakash%20Bambah
Ram Prakash Bambah (born 17 September 1925) is an Indian mathematician working in number theory and discrete geometry. Education and career Bambah earned a bachelor's degree from Government College University, Lahore, and a master's degree from the University of the Punjab, Lahore. He then went to England for his doctoral studies, earning his Ph.D. in 1950 from St John's College, Cambridge under the supervision of Louis J. Mordell. Returning to India, he became a reader at Panjab University, Chandigarh, in 1952, and was promoted to professor there in 1957. Maintaining his position at Panjab University, he also held a position as professor at Ohio State University in the US from 1964 to 1969. He retired from Panjab University in 1993. Bambah was president of the Indian Mathematical Society in 1969, and vice chancellor of Panjab University from 1985 to 1991. Awards and honours He was elected to the Indian National Science Academy in 1955. In 1979 he was awarded the Srinivasa Ramanujan Medal, and in 1974 was elected to the Indian Academy of Sciences. In 1988 he received the Aryabhata Medal of the Indian National Science Academy and the Padma Bhushan award.
https://en.wikipedia.org/wiki/Pointed%20set
In mathematics, a pointed set (also based set or rooted set) is an ordered pair $(X, x_0)$ where $X$ is a set and $x_0$ is an element of $X$ called the base point, also spelled basepoint. Maps between pointed sets $(X, x_0)$ and $(Y, y_0)$ —called based maps, pointed maps, or point-preserving maps—are functions from $X$ to $Y$ that map one basepoint to another, i.e. maps $f : X \to Y$ such that $f(x_0) = y_0$. Based maps are usually denoted $f : (X, x_0) \to (Y, y_0)$. Pointed sets are very simple algebraic structures. In the sense of universal algebra, a pointed set is a set together with a single nullary operation which picks out the basepoint. Pointed maps are the homomorphisms of these algebraic structures. The class of all pointed sets together with the class of all based maps forms a category. Every pointed set can be converted to an ordinary set by forgetting the basepoint (the forgetful functor is faithful), but the reverse is not true. In particular, the empty set cannot be pointed, because it has no element that can be chosen as the basepoint. Categorical properties The category of pointed sets and based maps is equivalent to the category of sets and partial functions. The base point serves as a "default value" for those arguments for which the partial function is not defined. One textbook notes that "This formal completion of sets and partial maps by adding 'improper', 'infinite' elements was reinvented many times, in particular, in topology (one-point compactification) and in theoretical computer science." This category is also isomorphic to the coslice category $(1 \downarrow \mathbf{Set})$, where $1$ is (a functor that selects) a singleton set, and $\mathbf{Set}$ (the identity functor of) the category of sets. This coincides with the algebraic characterization, since the unique map extends the commutative triangles defining arrows of the coslice category to form the commutative squares defining homomorphisms of the algebras. There is a faithful functor from pointed sets to usual sets, but it is not full and these categories are not equivalent. The category of pointed sets is a poi
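A small illustrative sketch of the equivalence with partial functions (Python; the encoding of "undefined" as None and the helper names are assumptions made for this example only):

```python
# A pointed set is modelled as a pair (elements, basepoint); a based map
# sends basepoint to basepoint. The equivalence with partial functions
# treats the basepoint as the "undefined" / default value.

def to_partial(f, dst_base):
    """Turn a based map f into a partial function: undefined (None) exactly
    where f hits the target basepoint."""
    def partial(x):
        y = f(x)
        return None if y == dst_base else y
    return partial

def to_based(partial, dst_base):
    """Turn a partial function (returning None where undefined) into a based
    map by sending every undefined argument to the target basepoint."""
    def f(x):
        y = partial(x)
        return dst_base if y is None else y
    return f

# Example: pointed sets ({0,1,2,3}, 0) and ({'*', 'a'}, '*').
g = to_based(lambda n: 'a' if n == 2 else None, '*')
print([g(n) for n in [0, 1, 2, 3]])   # ['*', '*', 'a', '*']  (basepoint preserved)
h = to_partial(g, '*')
print(h(2), h(1))                     # a None
```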
https://en.wikipedia.org/wiki/Rectangular%20lattice
The rectangular lattice and rhombic lattice (or centered rectangular lattice) constitute two of the five two-dimensional Bravais lattice types. The symmetry categories of these lattices are wallpaper groups pmm and cmm respectively. The conventional translation vectors of the rectangular lattices form an angle of 90° and are of unequal lengths. Bravais lattices There are two rectangular Bravais lattices: primitive rectangular and centered rectangular (also rhombic). The primitive rectangular lattice can also be described by a centered rhombic unit cell, while the centered rectangular lattice can also be described by a primitive rhombic unit cell. Crystal classes The rectangular lattice class names, Schönflies notation, Hermann–Mauguin notation, orbifold notation, Coxeter notation, and wallpaper groups are listed in the table below.
https://en.wikipedia.org/wiki/Fuzzy%20complex
Fuzzy complexes are protein complexes where structural ambiguity or multiplicity exists and is required for biological function. Alteration, truncation or removal of conformationally ambiguous regions impacts the activity of the corresponding complex. Fuzzy complexes are generally formed by intrinsically disordered proteins. Structural multiplicity usually underlies functional multiplicity of protein complexes following a fuzzy logic. Distinct binding modes of the nucleosome are also regarded as a special case of fuzziness. Historical background For almost 50 years molecular biology was based on two dogmas: (i) equating the biological function of a protein with a unique three-dimensional structure and (ii) assuming exquisite specificity in protein complexes. Specificity/selectivity is ensured by an unambiguous set of interactions formed between the protein and its ligand (another protein, DNA, RNA or small molecule). Many protein complexes, however, contain functionally important/critical regions, which remain highly dynamic in the complex or adopt different conformations. This phenomenon is termed fuzziness. The most pertinent example is the cyclin-dependent kinase inhibitor Sic1, which binds to the Cdc4 subunit of the SCF ubiquitin ligase in a phosphorylation-dependent manner. No regular secondary structures are gained upon phosphorylation and the different phosphorylation sites interchange in the complex. Classification of fuzzy complexes Structural ambiguity in protein complexes covers a wide spectrum. In a polymorphic complex, the protein adopts two or more different conformations upon binding to the same partner, and these conformations can be resolved. Clamp, flanking and random complexes are dynamic, where ambiguous conformations interchange with each other and cannot be resolved. Interactions in fuzzy complexes are usually mediated by short motifs. Flanking regions are tolerant to sequence changes as long as the amino acid composition is maintained, for example in case of lin
https://en.wikipedia.org/wiki/ASC%20X9
The Accredited Standards Committee X9 (ASC X9, Inc.) is an ANSI (American National Standards Institute) accredited standards developing organization, responsible for developing voluntary open consensus standards for the financial services industry in the U.S. ASC X9 is the USA Technical Advisory Group (TAG) to the International Technical Committee on Financial Services ISO/TC 68 under the International Organization for Standardization (ISO), of Geneva, Switzerland, and submits X9 American National Standards to the international committee to be considered for adoption as international standards or ISO standards. Membership in ASC X9 is open to all U.S. domiciled companies and organizations in the financial services industry. Domestic Role The Accredited Standards Committee X9 develops, establishes, maintains, and promotes standards for the Financial Services Industry in the United States in order to facilitate delivery of financial services and products. Committees ASC X9 is composed of five Subcommittees based on Financial Services business sectors: X9A – Retail Payments (including mobile payments and a card not present fraud group); X9B – Checks and Back-office Operations (all things related to checks); X9C – Corporate Banking (B2B payments and the Balance Transaction Reporting Standard (BTRS) standard); X9D – Securities (stocks and bonds, CUSIP); and X9F – Data & Information Security (methods and cryptography to secure financial data). Standards Examples of X9 standards commonly used today in the US and around the world are: paper and electronic check standards (X9 owns nearly 98% of the standardized real estate on the front and back of checks; the magnetic ink character record was originally developed by X9); the credit and debit card transaction standard underlying all card transactions, first developed under X9; and numerous data security standards for financial services, including the "PIN" (personal identification number) standard in wide use today. Data E
https://en.wikipedia.org/wiki/Ubuntu%20MATE
Ubuntu MATE is a free and open-source Linux distribution and an official derivative of Ubuntu. Its main differentiation from Ubuntu is that it uses the MATE desktop environment as its default user interface (based on GNOME 2), instead of the GNOME 3 desktop environment that is the default user interface for Ubuntu. History The Ubuntu MATE project was founded by Martin Wimpress and Alan Pope and began as an unofficial derivative of Ubuntu, using an Ubuntu 14.10 base for its first release; a 14.04 LTS release followed shortly. As of February 2015, Ubuntu MATE gained the official Ubuntu flavour status from Canonical as per the release of 15.04 Beta 1. In addition to IA-32 and x86-64 which were the initial supported platforms, Ubuntu MATE also supports PowerPC and ARMv7 (on the Raspberry Pi 2 and 3 as well as the ODROID XU4). In April 2015, Ubuntu MATE announced a partnership with British computer reseller Entroware, enabling Entroware customers to purchase laptop and desktop computers with Ubuntu MATE preinstalled with full support. Several other hardware deals were announced later. In Ubuntu MATE 18.10, 32-bit support was dropped. Releases Reception In a May 2016 review Jesse Smith of DistroWatch concluded, "despite my initial problems getting Ubuntu MATE installed and running smoothly, I came away with a positive view of the distribution. The project is providing a very friendly desktop experience that requires few hardware resources by modern standards. I also want to tip my hat to the default theme used on Ubuntu MATE." Dedoimedo reviewed Ubuntu MATE in July 2018, and wrote that "[Ubuntu MATE offers] a wealth of visual and functional changes…You really have the ability to implement anything and everything, and all of it natively, from within the system's interface". Starting with the 22.04 LTS release, Ubuntu MATE included AI-generated wallpapers. These were warmly received by popular tech blogs, with OMG! Ubuntu exclaiming "I'm blown away by the quality of
https://en.wikipedia.org/wiki/Microphysiometry
Microphysiometry is the in vitro measurement of the functions and activities of life or of living matter (as organs, tissues, or cells) and of the physical and chemical phenomena involved on a very small (micrometer) scale. The term microphysiometry emerged in the scientific literature at the end of the 1980s. The primary parameters assessed in microphysiometry comprise pH and the concentration of dissolved oxygen, glucose, and lactic acid, with an emphasis on the first two. Measuring these parameters experimentally in combination with a fluidic system for cell culture maintenance and a defined application of drugs or toxins provides the quantitative output parameters extracellular acidification rates (EAR), oxygen consumption rates (OUR), and rates of glucose consumption or lactate release to characterize the metabolic situation. Due to the label-free nature of sensor-based measurements, dynamic monitoring of cells or tissues for several days or even longer is feasible. On an extended timescale, a dynamic analysis of a cell’s metabolic response to an experimental treatment can distinguish acute effects (e.g., one hour after a treatment), early effects (e.g., at 24 hours), and delayed, chronic responses (e.g., at 96 hours). As stated by Alajoki et al., "The concept is that it is possible to detect receptor activation and other physiological changes in living cells by monitoring the activity of energy metabolism". See also Organ-on-a-chip
https://en.wikipedia.org/wiki/United%20States%20Radium%20Corporation
The United States Radium Corporation was a company, most notorious for its operations between 1917 and 1926 in Orange, New Jersey, in the United States, that led to stronger worker protection laws. After initial success in developing a glow-in-the-dark radioactive paint, the company was subject to several lawsuits in the late 1920s in the wake of severe illnesses and deaths of workers (the Radium Girls) who had ingested radioactive material. The workers had been told that the paint was harmless. During World War I and World War II, the company produced luminous watches and gauges for the United States Army for use by soldiers. U.S. Radium workers, especially women who painted the dials of watches and other instruments with luminous paint, suffered serious radioactive contamination. Lawyer Edward Markley was in charge of defending the company in these cases. History The company was founded in 1914 in New York City by Dr. Sabin Arnold von Sochocky and Dr. George S. Willis, as the Radium Luminous Material Corporation. The company produced uranium from carnotite ore and eventually moved into the business of producing radioluminescent paint, and then to the application of that paint. Over the next several years, it opened facilities in Newark, Jersey City, and Orange. In August 1921, von Sochocky was forced from the presidency, and the company was renamed the United States Radium Corporation; Arthur Roeder became the president of the company. In Orange, where radium was extracted from 1917 to 1926, the U.S. Radium facility processed half a ton of ore per day. The ore was obtained from "Undark mines" in Paradox Valley, Colorado and in Utah. A notable employee from 1921 to 1923 was Victor Francis Hess, who would later receive the Nobel Prize in Physics. The company's luminescent paint, marketed as Undark, was a mixture of radium and zinc sulfide, with the radiation causing the sulfide to fluoresce. During World War I, demand for dials, watches, and aircraft instr
https://en.wikipedia.org/wiki/Restricted%20partial%20quotients
In mathematics, and more particularly in the analytic theory of regular continued fractions, an infinite regular continued fraction x is said to be restricted, or composed of restricted partial quotients, if the sequence of denominators of its partial quotients is bounded; that is, x = [a0; a1, a2, a3, …] and there is some positive integer M such that all the (integral) partial denominators ai are less than or equal to M. Periodic continued fractions A regular periodic continued fraction consists of a finite initial block of partial denominators followed by a repeating block; if ζ = [a0; a1, …, ak, ak+1, …, ak+m] with the block ak+1, …, ak+m repeating indefinitely, then ζ is a quadratic irrational number, and its representation as a regular continued fraction is periodic. Clearly any regular periodic continued fraction consists of restricted partial quotients, since none of the partial denominators can be greater than the largest of a0 through ak+m. Historically, mathematicians studied periodic continued fractions before considering the more general concept of restricted partial quotients. Restricted CFs and the Cantor set The Cantor set is a set C of measure zero from which a complete interval of real numbers can be constructed by simple addition – that is, any real number from the interval can be expressed as the sum of exactly two elements of the set C. The usual proof of the existence of the Cantor set is based on the idea of punching a "hole" in the middle of an interval, then punching holes in the remaining sub-intervals, and repeating this process ad infinitum. The process of adding one more partial quotient to a finite continued fraction is in many ways analogous to this process of "punching a hole" in an interval of real numbers. The size of the "hole" is inversely proportional to the next partial denominator chosen – if the next partial denominator is 1, the gap between successive convergents is maximized. To make the following theorems precise we will consider CF(M), the set of restricted continued fractions whose values lie in the open interval (0, 1) and who
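For illustration (a sketch added here, not part of the source article; the function names are arbitrary), the partial denominators of a rational number can be computed with the standard continued-fraction algorithm, after which checking the restriction is a simple boundedness test:

```python
from fractions import Fraction

def partial_quotients(x, n_terms=20):
    """Return up to n_terms partial denominators [a0, a1, a2, ...] of x."""
    terms = []
    x = Fraction(x)
    for _ in range(n_terms):
        a = x.numerator // x.denominator   # floor of x
        terms.append(a)
        frac = x - a
        if frac == 0:                      # expansion terminates for rational x
            break
        x = 1 / frac
    return terms

def is_restricted_by(terms, M):
    """True if every partial denominator after a0 is at most M."""
    return all(a <= M for a in terms[1:])

# Example: 13/37 = [0; 2, 1, 5, 2], so its partial quotients are bounded by M = 5.
qs = partial_quotients(Fraction(13, 37))
print(qs, is_restricted_by(qs, 5))
```

For an irrational number the expansion is infinite, so a finite computation like this can only check a prefix of the sequence, never the full restriction property.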
https://en.wikipedia.org/wiki/Critical%20radius
Critical radius is the minimum particle size from which an aggregate is thermodynamically stable. In other words, it is the lowest radius formed by atoms or molecules clustering together (in a gas, liquid or solid matrix) before a new phase inclusion (a bubble, a droplet or a solid particle) is viable and begins to grow. Formation of such stable nuclei is called nucleation. At the beginning of the nucleation process, the system finds itself in an initial phase. Afterwards, the formation of aggregates or clusters from the new phase occurs gradually and randomly at the nanoscale. Subsequently, if the process is feasible, the nucleus is formed. Notice that the formation of aggregates is conceivable only under specific conditions. When these conditions are not satisfied, a rapid creation-annihilation of aggregates takes place and nucleation and the subsequent crystal growth do not happen. In precipitation models, nucleation is generally a prelude to models of the crystal growth process. Sometimes precipitation is rate-limited by the nucleation process. An example would be when someone takes a cup of superheated water from a microwave: when it is jiggled with a spoon or against the wall of the cup, heterogeneous nucleation occurs and much of the water rapidly converts into steam. If the change in phase forms a crystalline solid in a liquid matrix, the atoms might then form a dendrite. The crystal growth continues in three dimensions, the atoms attaching themselves in certain preferred directions, usually along the axes of a crystal, forming a characteristic tree-like structure of a dendrite. Mathematical derivation The critical radius of a system can be determined from its Gibbs free energy ΔG. It has two components, the volume energy ΔGV and the surface energy ΔGS. The first one describes how probable it is to have a phase change and the second one is the amount of energy needed to create an interface. The mathematical expression of ΔG, considering spherical particl
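As a sketch of the derivation introduced above (the standard classical nucleation theory expressions for a spherical nucleus, supplied here rather than quoted from the article), the two contributions and the resulting critical radius are

\[
\Delta G(r) = \frac{4}{3}\pi r^{3}\,\Delta g_v + 4\pi r^{2}\,\sigma,
\qquad
\left.\frac{\mathrm{d}\,\Delta G}{\mathrm{d}r}\right|_{r=r^{*}} = 0
\;\Longrightarrow\;
r^{*} = -\frac{2\sigma}{\Delta g_v},
\]

where \Delta g_v < 0 is the free-energy change per unit volume of the new phase and \sigma > 0 is the interfacial energy per unit area. Clusters with r < r^{*} tend to redissolve, while clusters with r > r^{*} lower their free energy by growing.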
https://en.wikipedia.org/wiki/Spherically%20complete%20field
In mathematics, a field K with an absolute value is called spherically complete if the intersection of every decreasing sequence of balls (in the sense of the metric induced by the absolute value) is nonempty: if B1 ⊇ B2 ⊇ B3 ⊇ ⋯ is a decreasing sequence of balls in K, then ⋂n Bn ≠ ∅. The definition can be adapted also to a field K with a valuation v taking values in an arbitrary ordered abelian group: (K,v) is spherically complete if every collection of balls that is totally ordered by inclusion has a nonempty intersection. Spherically complete fields are important in nonarchimedean functional analysis, since many results analogous to theorems of classical functional analysis require the base field to be spherically complete. Examples Any locally compact field is spherically complete. This includes, in particular, the fields Qp of p-adic numbers, and any of their finite extensions. Every spherically complete field is complete. On the other hand, Cp, the completion of the algebraic closure of Qp, is not spherically complete. Any field of Hahn series is spherically complete.
https://en.wikipedia.org/wiki/Xenomorph
The Xenomorph (also known as a Xenomorph XX121 or Internecivus raptus) is a fictional endoparasitoid extraterrestrial species that serves as the title antagonist of the Alien and Alien vs. Predator franchises. The species made its debut in the film Alien (1979) and reappeared in the sequels Aliens (1986), Alien 3 (1992), and Alien Resurrection (1997). The species returns in the prequel series, first with a predecessor in Prometheus (2012) and a further evolved form in Alien: Covenant (2017). It also featured in the crossover films Alien vs. Predator (2004) and Aliens vs. Predator: Requiem (2007), with the skull and tail of one of the creatures respectively appearing briefly in Predator 2 (1990) and The Predator (2018), as a protagonist (named 6) in the video game Aliens vs. Predator (2010), and will return in the upcoming FX television series Alien (TBA). In addition, the Xenomorph appears in various literature and video game spin-offs from the franchises. The Xenomorphs' design is credited to Swiss surrealist and artist H. R. Giger, originating in a lithograph titled Necronom IV and refined for the series's first film, Alien. The practical effects for the Xenomorph's head were designed and constructed by Italian special effects designer Carlo Rambaldi. Species design and life cycle have been extensively augmented, sometimes inconsistently, throughout each film. Unlike many other extraterrestrial races in film and television science fiction (such as the Daleks and Cybermen in Doctor Who, or the Klingons and Borg in Star Trek), the Xenomorphs are not sapient; they lack a technological civilization of any kind, and are instead primal, predatory creatures with no higher goal than the preservation and propagation of their own species by any means necessary, up to and including the elimination of other lifeforms that may pose a threat to their existence. Like wasps or termites, Xenomorphs are eusocial, with a single fertile queen breeding a caste of warriors, workers, o
https://en.wikipedia.org/wiki/Transcriptome
The transcriptome is the set of all RNA transcripts, including coding and non-coding, in an individual or a population of cells. The term can also sometimes be used to refer to all RNAs, or just mRNA, depending on the particular experiment. The term transcriptome is a portmanteau of the words transcript and genome; it is associated with the process of transcript production during the biological process of transcription. The early stages of transcriptome annotations began with cDNA libraries published in the 1980s. Subsequently, the advent of high-throughput technology led to faster and more efficient ways of obtaining data about the transcriptome. Two biological techniques are used to study the transcriptome, namely DNA microarray, a hybridization-based technique, and RNA-seq, a sequence-based approach. RNA-seq is the preferred method and has been the dominant transcriptomics technique since the 2010s. Single-cell transcriptomics allows tracking of transcript changes over time within individual cells. Data obtained from the transcriptome is used in research to gain insight into processes such as cellular differentiation, carcinogenesis, transcription regulation and biomarker discovery among others. Transcriptome-obtained data also finds applications in establishing phylogenetic relationships during the process of evolution and in in vitro fertilization. The transcriptome is closely related to other -ome based biological fields of study; it is complementary to the proteome and the metabolome and encompasses the translatome, exome, meiome and thanatotranscriptome, which can be seen as ome fields studying specific types of RNA transcripts. There are quantifiable and conserved relationships between the transcriptome and other -omes, and transcriptomics data can be used effectively to predict other molecular species, such as metabolites. There are numerous publicly available transcriptome databases. Etymology and history The word transcriptome is a portmanteau of the wo
https://en.wikipedia.org/wiki/Curvaton
The curvaton is a hypothetical elementary particle associated with a scalar field in early universe cosmology. It can generate fluctuations during inflation, but does not itself drive inflation; instead, it generates curvature perturbations at late times, after the inflaton field has decayed and the decay products have redshifted away, when the curvaton is the dominant component of the energy density. It is used to generate a flat spectrum of CMB perturbations in models of inflation where the potential is otherwise too steep, or in alternatives to inflation like the pre-Big Bang scenario. The model was proposed by three groups independently, shortly after one another, in 2001: Kari Enqvist and Martin S. Sloth (Sep 2001), David Wands and David H. Lyth (Oct 2001), and Takeo Moroi and Tomo Takahashi (Oct 2001). See also Expansion of the universe Hubble's law Big Bang Cosmological constant Inflaton Cosmological perturbation theory Structure formation Kari Enqvist David Wands David H. Lyth
https://en.wikipedia.org/wiki/Cytokine%20release%20syndrome
In immunology, cytokine release syndrome (CRS) is a form of systemic inflammatory response syndrome (SIRS) that can be triggered by a variety of factors such as infections and certain drugs. It refers to cytokine storm syndromes (CSS) and occurs when large numbers of white blood cells are activated and release inflammatory cytokines, which in turn activate yet more white blood cells. CRS is also an adverse effect of some monoclonal antibody medications, as well as adoptive T-cell therapies. When occurring as a result of a medication, it is also known as an infusion reaction. The term cytokine storm is often used interchangeably with CRS but, despite the fact that they have a similar clinical phenotype, their characteristics are different. When occurring as a result of a therapy, CRS symptoms may be delayed until days or weeks after treatment. Immediate-onset CRS is a cytokine storm, although severe cases of CRS have also been called cytokine storms. Signs and symptoms Symptoms include fever that tends to fluctuate, fatigue, loss of appetite, muscle and joint pain, nausea, vomiting, diarrhea, rashes, fast breathing, rapid heartbeat, low blood pressure, seizures, headache, confusion, delirium, hallucinations, tremor, and loss of coordination. Lab tests and clinical monitoring show low blood oxygen, widened pulse pressure, increased cardiac output (early), potentially diminished cardiac output (late), high levels of nitrogen compounds in the blood, elevated D-dimer, elevated transaminases, factor I deficiency and excessive bleeding, and higher-than-normal levels of bilirubin. Cause CRS occurs when large numbers of white blood cells, including B cells, T cells, natural killer cells, macrophages, dendritic cells, and monocytes are activated and release inflammatory cytokines, which activate more white blood cells in a positive feedback loop of pathogenic inflammation. Immune cells are activated by stressed or infected cells through receptor-ligand interactions. This can oc
https://en.wikipedia.org/wiki/Collaborative%20Computational%20Project%20Q
Collaborative Computational Project Q (CCPQ) was developed in order to provide software which uses theoretical techniques to catalogue collisions between electrons, positrons or photons and atomic/molecular targets. The 'Q' stands for quantum dynamics. This project is accessible via the CCPForge website, which contains numerous other projects such as CCP2 and CCP4. The scope has increased to include atoms and molecules in strong (long-pulse and attosecond) laser fields, low-energy interactions of antihydrogen with small atoms and molecules, cold atoms, Bose–Einstein condensates and optical lattices. CCPQ gives essential information on the reactivity of various molecules, and contains two community codes, the R-matrix suite and the MCTDH wavepacket dynamics package. The project is supported by the Atomic and Molecular Physics group at Daresbury Laboratory, which supports research on core computational and scientific codes. This project is a collaboration between University College London (UCL), the University of Bath, and Queen's University Belfast. The project is led by Professor Graham Worth, who is the Chair, alongside Vice-Chairs Dr Stephen Clark and Professor Hugo van der Hart. Quantemol Ltd is also a close partner of the project. The project is a result of the previous Collaborative Computational Project 2 (CCP2), and is an improved version of that older project. CCPQ (and its predecessor CCP2) have supported various incarnations of the UK Molecular R-matrix project for almost 40 years. Applications Both academic and industrial researchers use CCPQ. One of its uses is in the field of plasma research; reliable data on electron and light interactions is essential in order to model plasma processes used on both small and large scales. Large-scale industrial processes need to investigate the implementation of new methods thoroughly, and CCPQ can be used to theoretically determine the value of new processes. CCPQ has been used to study the Hubbard models for cold atoms
https://en.wikipedia.org/wiki/HMNJ
Distal hereditary motor neuropathy, Jerash type is a hereditary motor neuropathy that in humans is associated with the HMNJ gene.
https://en.wikipedia.org/wiki/Plancherel%20theorem
In mathematics, the Plancherel theorem (sometimes called the Parseval–Plancherel identity) is a result in harmonic analysis, proven by Michel Plancherel in 1910. It states that the integral of a function's squared modulus is equal to the integral of the squared modulus of its frequency spectrum. That is, if f(x) is a function on the real line, and f̂(ξ) is its frequency spectrum, then ∫ |f(x)|² dx = ∫ |f̂(ξ)|² dξ. A more precise formulation is that if a function is in both of the Lp spaces L¹(ℝ) and L²(ℝ), then its Fourier transform is in L²(ℝ), and the Fourier transform map is an isometry with respect to the L² norm. This implies that the Fourier transform map restricted to L¹(ℝ) ∩ L²(ℝ) has a unique extension to a linear isometric map L²(ℝ) → L²(ℝ), sometimes called the Plancherel transform. This isometry is actually a unitary map. In effect, this makes it possible to speak of Fourier transforms of quadratically integrable functions. Plancherel's theorem remains valid as stated on n-dimensional Euclidean space ℝⁿ. The theorem also holds more generally in locally compact abelian groups. There is also a version of the Plancherel theorem which makes sense for non-commutative locally compact groups satisfying certain technical assumptions. This is the subject of non-commutative harmonic analysis. The unitarity of the Fourier transform is often called Parseval's theorem in science and engineering fields, based on an earlier (but less general) result that was used to prove the unitarity of the Fourier series. Due to the polarization identity, one can also apply Plancherel's theorem to the inner product of two functions. That is, if f(x) and g(x) are two L²(ℝ) functions, and P denotes the Plancherel transform, then ∫ f(x) g(x)* dx = ∫ (Pf)(ξ) (Pg)(ξ)* dξ (where * denotes complex conjugation), and if f(x) and g(x) are furthermore L¹(ℝ) functions, then Pf = f̂ and Pg = ĝ, and so ∫ f(x) g(x)* dx = ∫ f̂(ξ) ĝ(ξ)* dξ. See also Plancherel theorem for spherical functions
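As a purely illustrative aside (not from the article), the discrete analogue of this unitarity is easy to check numerically; the snippet below assumes NumPy and the unnormalized DFT convention, in which the frequency-side sum must be rescaled by 1/N:

```python
import numpy as np

# Discrete Parseval/Plancherel identity for the unnormalized DFT:
#   sum_n |f[n]|^2 == (1/N) * sum_k |F[k]|^2,  where F = fft(f).
rng = np.random.default_rng(0)
f = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
F = np.fft.fft(f)

lhs = np.sum(np.abs(f) ** 2)
rhs = np.sum(np.abs(F) ** 2) / f.size
print(np.allclose(lhs, rhs))  # True, up to floating-point rounding
```

With the orthonormal convention (numpy.fft.fft(..., norm="ortho")) the transform matrix is unitary and the two sums agree without the 1/N factor, mirroring the continuous statement.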
https://en.wikipedia.org/wiki/Ion%20implantation-induced%20nanoparticle%20formation
Ion implantation-induced nanoparticle formation is a technique for creating nanometer-sized particles for use in electronics. Ion implantation Ion implantation is a technique extensively used in the field of materials science for material modification. The effect it has on nanomaterials allows manipulation of mechanical, electronic, morphological and optical properties. One-dimensional nano-materials are an important contributor to the creation of nano-devices, e.g. field-effect transistors, nanogenerators and solar cells. They offer the potential of high integration density, lower power consumption, higher speed and super-high frequency. The effects of ion implantation vary according to multiple variables. Collision cascades may occur during implantation, and these cause interstitials and vacancies in target materials (although these defects may be mitigated through dynamic annealing). Collision modes are nuclear collision, electron collision and charge exchange. Another process is the sputtering effect, which significantly affects the morphology and shape of nano-materials.
https://en.wikipedia.org/wiki/KVCW
KVCW (channel 33) is a television station in Las Vegas, Nevada, United States, affiliated with The CW and MyNetworkTV. It is owned by Sinclair Broadcast Group alongside NBC affiliate KSNV (channel 3). Both stations share studios on Foremaster Lane in Las Vegas (making them the only major television stations whose operations are based inside the city limits), while KVCW's transmitter is located on Black Mountain, near Henderson (southwest of I-515/US 93/US 95). History Early years On April 22, 1987, the Federal Communications Commission (FCC) issued an original construction permit to 4-A Communications to build a new full-power television station, on UHF channel 33, to serve the Las Vegas market. 4-A Communications, owned by Lawrence and Teri DePaulis, became Channel 33, Inc. (which remained the station's licensee until 2015) in August 1987. The station, known as KFBT, went on the air on July 30, 1989, under a program test authority and was given a license one month later. The station's original transmitter was located in the McCullough Range southwest of Henderson. On July 20, 1990, a family ownership group headed by Daniel "Danny" Koker purchased Channel 33, Inc. Under the Kokers, KFBT was an independent station with a firmly local flavor and soon garnered much acclaim with features such as the scary B-movie showcase Saturday Fright at the Movies, hosted by Count Cool Rider, which aired at 10 p.m. Count Cool Rider was actually Danny Koker II, son of the station president and also one of the station's owners, who has since gone on to become a respected builder of custom motorcycles, as well as a regular expert on the History Channel series Pawn Stars and host of its spinoff, Counting Cars. As a WB affiliate and return to independence The station primarily broadcast older movies, sitcoms, and dramas during this era, as well as some Christian religious programs (as the senior Koker was a gospel musician) and professional wrestling (most notably World Class Champion
https://en.wikipedia.org/wiki/Phase%20portrait
In mathematics, a phase portrait is a geometric representation of the orbits of a dynamical system in the phase plane. Each set of initial conditions is represented by a different point or curve. Phase portraits are an invaluable tool in studying dynamical systems. They consist of a plot of typical trajectories in the phase space. This reveals information such as whether an attractor, a repellor or a limit cycle is present for the chosen parameter value. The concept of topological equivalence is important in classifying the behaviour of systems by specifying when two different phase portraits represent the same qualitative dynamic behavior. An attractor is a stable point, also called a "sink"; a repellor is an unstable point, also known as a "source". A phase portrait graph of a dynamical system depicts the system's trajectories (with arrows), stable steady states (with dots) and unstable steady states (with circles) in a phase space. The axes correspond to the state variables. Examples Simple pendulum, see picture (right). Simple harmonic oscillator, where the phase portrait is made up of ellipses centred at the origin, which is a fixed point. Damped harmonic motion, see animation (right). Van der Pol oscillator, see picture (bottom right). Visualizing the behavior of ordinary differential equations A phase portrait represents the directional behavior of a system of ordinary differential equations (ODEs). The phase portrait can indicate the stability of the system. The phase portrait behavior of a system of ODEs can be determined by the eigenvalues, or by the trace and determinant (trace = λ1 + λ2, determinant = λ1 × λ2), of the system. See also Phase space Phase plane
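To illustrate the eigenvalue-based classification mentioned above (an illustrative sketch, not taken from the article; degenerate cases with zero eigenvalues are ignored), the fixed point at the origin of a planar linear system x' = Ax can be classified as follows:

```python
import numpy as np

def classify_fixed_point(A):
    """Classify the origin of the linear system x' = A x from the eigenvalues of A."""
    eig = np.linalg.eigvals(np.asarray(A, dtype=float))
    re, im = eig.real, eig.imag
    if np.allclose(re, 0) and np.any(im != 0):
        return "center (closed orbits, as for the simple harmonic oscillator)"
    if np.any(im != 0):
        return "stable spiral (sink)" if np.all(re < 0) else "unstable spiral (source)"
    if np.all(re < 0):
        return "stable node (sink)"
    if np.all(re > 0):
        return "unstable node (source)"
    return "saddle point"

# Undamped oscillator x'' = -x, written as a first-order system:
print(classify_fixed_point([[0.0, 1.0], [-1.0, 0.0]]))   # center
# Damped oscillator x'' = -0.5 x' - x:
print(classify_fixed_point([[0.0, 1.0], [-1.0, -0.5]]))  # stable spiral (sink)
```

For a nonlinear system the same test is applied to the Jacobian matrix evaluated at each steady state, which is how the trace and determinant (or the eigenvalues) determine the local phase-portrait behavior.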
https://en.wikipedia.org/wiki/GeForce%20100%20series
The GeForce 100 series is a series of Tesla-based graphics processing units developed by Nvidia, first released in March 2009. Its cards are rebrands of GeForce 9 series cards, available only for OEMs. However, the GTS 150 was briefly available to consumers. Products The GeForce 100 series cards include the G100, GT 120, GT 130, GT 140 and GTS 150. The GT 120 is based on the 9500 GT with an improved thermal design, while the GT 130 is based on the 9600 GSO (which itself was a re-badged 8800 GS). The GT 140 is simply a rebadged 9600 GT. The GTS 150 is an OEM version of the GTS 250 with some slight changes. Despite being based upon previous 9 series cards, the G100, GT 120, and GT 130 utilize entirely different PCBs and slightly different clock speeds. Chipset table Discontinued support NVIDIA ceased driver support for the GeForce 100 series on April 1, 2016. Windows XP 32-bit & Media Center Edition: version 340.52 released on July 29, 2014; Download Windows XP 64-bit: version 340.52 released on July 29, 2014; Download Windows Vista, 7, 8, 8.1 32-bit: version 342.01 (WHQL) released on December 14, 2016; Download Windows Vista, 7, 8, 8.1 64-bit: version 342.01 (WHQL) released on December 14, 2016; Download Windows 10, 32-bit: version 342.01 (WHQL) released on December 14, 2016; Download Windows 10, 64-bit: version 342.01 (WHQL) released on December 14, 2016; Download See also GeForce 8 Series GeForce 9 Series GeForce 200 Series GeForce 300 Series GeForce 400 Series GeForce 500 Series GeForce 600 Series Nvidia Quadro Nvidia Tesla
https://en.wikipedia.org/wiki/Paper%20snowflake
A paper snowflake is a type of paper craft based on a snowflake that combines origami with papercutting. The designs can vary significantly once the required folds have been made. An online version of the craft is known as "Make-A-Flake", and was created by Barkley Inc. in 2008. See also Kirigami
https://en.wikipedia.org/wiki/Methyl%20fluoroacetate
Methyl fluoroacetate (MFA) is an extremely toxic methyl ester of fluoroacetic acid. It is a colorless, odorless liquid at room temperature. It is used as a laboratory chemical and as a rodenticide. Because of its extreme toxicity, MFA was studied for potential use as a chemical weapon. The general population is not likely to be exposed to methyl fluoroacetate. People who use MFA for work, however, can breathe in or have direct skin contact with the substance. History MFA was first synthesized in 1896 by the Belgian chemist Swarts by reacting methyl iodoacetate with silver fluoride. It can also be synthesized by reacting methyl chloroacetate with potassium fluoride. Because of its toxicity, MFA was studied for potential use as a chemical weapon during World War II. It was considered well suited as a water poison, since it is colorless and odorless and could therefore be used to contaminate a water supply and kill a large part of the population. By the end of the war, several countries had begun to make methyl fluoroacetate to debilitate or kill the enemy. Synthesis The synthesis of methyl fluoroacetate consists of a two-step process: Potassium fluoride and the catalyst are added into the solvent within the reactor; this is then stirred and heated. The catalyst mentioned in this step is a phase-transfer catalyst and can be dodecyl trimethyl ammonium chloride, tetrabutylammonium chloride, tetrabutylammonium bromide, or tetramethylammonium chloride. The mass ratio of the potassium fluoride and the catalyst in this step is 0.5~1 : 0.02~0.03. The solvent in this step is a mixture of dimethylformamide and acetamide with a mass ratio of 1.4~1.6 : 1. The mass ratio of the solvent and potassium fluoride is 1.1~2.0 : 0.5~1. When the reaction temperature of 100~160 °C is reached, methyl chloroacetate is continuously added into the reactor at a rate of 5~10 kg/min, with the mass ratio of methyl chloroacetate and potassium fluoride being 1:0.5~1. The reaction between these
https://en.wikipedia.org/wiki/Bird%20trapping
Bird trapping techniques to capture wild birds include a wide range of methods that have their origins in the hunting of birds for food. While hunting for food does not require birds to be caught alive, some trapping techniques capture birds without harming them and are of use in ornithology research. Wild birds may also be trapped for display in captivity in zoological gardens or for keeping as pets. Bird trapping was formerly unregulated, but to protect bird populations most countries have specific laws and regulations. Luring Birds are lured into the vicinity of traps through the use of suitable habitat patches where the birds are known to visit. A specific location may be further modified by the provision of food, the use of decoy birds, the use of calls, or owls that may induce mobbing. Male birds of some species are used as decoys during the breeding seasons to challenge and beckon other males from nearby. Larks were formerly attracted using a rotary paddle, sometimes with shiny mirrors attached, turned by a spring. The phrase "a mirror for larks" was once a common metaphor for a trap. Owls and their calls are often used to bring birds out of dense vegetation. The technique has also been used by birdwatchers. Trapping techniques Almost all traps involve the use of food, water or decoys to attract birds within range and a mechanism for restricting the movement of, injuring, or killing birds that come into range. Food, water, decoy birds and call playback may be used to bring birds to the trap. The use of chemical sprays on crops or food can have more widespread effects and is not usually included among trapping techniques, although there are some capture techniques that make use of bait with stupefying agents. The mechanism can be physical and non-lethal, like a noose that tightens around the leg, or lethal, as in deadfall traps. Lethal techniques have been used for the control of birds considered as pests or can be used in the capture of birds for food. Tra
https://en.wikipedia.org/wiki/Center%20of%20excellence
A center of excellence (COE or CoE), also called excellence center, is a team, a shared facility or an entity that provides leadership, best practices, research, support or training for a focus area. Due to its broad usage and vague legal precedent, a "center of excellence" in one context may have completely different characteristics from another. The focus area might be a technology (e.g. Java), a business concept (e.g. BPM), a skill (e.g. negotiation) or a broad area of study (e.g. women's health). A center of excellence may also be aimed at revitalizing stalled initiatives. The term may also refer to a network of institutions collaborating with each other to pursue excellence in a particular area. (e.g. the Rochester Area Colleges Center for Excellence in Math and Science). Organizations Within an organization, a center of excellence may refer to a group of people, a department or a shared facility. It may also be known as a competency center, or as a capability center, or as an excellence center. Stephen Jenner and Craig Kilford, in Management of Portfolios, mention COE as a coordinating function which ensures that change initiatives are delivered consistently and well, through standard processes and competent staff. In technology companies, the center of excellence concept is often associated with new software tools, technologies or associated business concepts such as service-oriented architecture or business intelligence. Academia In academic institutions, a center of excellence often refers to a team with a clear focus on a particular area of research; such a center may bring together faculty members from different disciplines and provide shared facilities. Australia In Australia, the Australian Research Council (ARC) funds a competitive grant program for centres of excellence which link a number of institutions within the country and internationally in a specific field of research. New centres are funded every three years and each run for seven years.
https://en.wikipedia.org/wiki/Ahmia
Ahmia is a clearnet search engine for Tor's hidden services created by Juha Nurmi. Overview Developed during the 2014 Google Summer of Code with support from the Tor Project, the open source search engine was initially built with Django and PostgreSQL. It collects .onion URLs, the special-use addresses of Tor hidden services, from the Tor network and feeds these to its index, except those containing a robots.txt file. The search engine filters out child pornography and keeps a blacklist of abusive services. The service partners with GlobaLeaks's submissions and Tor2web statistics for hidden service discovery, and as of July 2015 had indexed about 5000 sites. Ahmia is also affiliated with the Hermes Center for Transparency and Digital Rights, an organization that promotes transparency and freedom-enabling technologies. In July 2015 the site published a list of hundreds of fraudulent fake versions of web pages (including such sites as DuckDuckGo, as well as a dark web page). According to Nurmi, "someone runs a fake site on a similar address to the original one and tries to fool people with that" with the intent of scamming people (e.g. gathering bitcoin money by spoofing bitcoin addresses). The background photo of the site shows Hakaniemi in Helsinki. See also Comparison of web search engines List of search engines List of search engines by popularity
https://en.wikipedia.org/wiki/Legged%20Squad%20Support%20System
The Legged Squad Support System (LS3) was a DARPA project for a legged robot which could function autonomously as a packhorse for a squad of soldiers or marines. Like BigDog, its quadruped predecessor, the LS3 was ruggedized for military use, with the ability to operate in hot, cold, wet, and dirty environments. The LS3 was put into storage in late 2015. Specifications The Legged Squad Support System was to "Go where dismounts go, do what dismounts do, work among dismounts," carry of squad equipment, sense and negotiate terrain, maneuver nimbly, and operate quietly when required. The LS3 was approximately the shape and size of a horse. A stereo vision system, consisting of a pair of stereo cameras mounted into the 'head' of the robot, was integrated alongside a light detection and ranging (LIDAR) component in order to enable it to follow a human lead and record intelligence gathered through its camera. History The initial contract for the Legged Squad Support System was awarded to Boston Dynamics on December 3, 2009. The continued evolution of the BigDog led to the development of the LS3, also known as AlphaDog. The contract schedule called for an operational demonstration of two units with troops in 2012. DARPA continued to support the program and carried out the first outdoor exercise on the latest variation of the LS3 in February 2012, with it successfully demonstrating its full capabilities during a planned hike encompassing tough terrain. Following its initial success, an 18-month plan was unveiled, which saw DARPA complete the overall development of the system and refine its key capabilities, due to start in summer 2012. On September 10, 2012, two LS3 prototypes were demonstrated in an outdoor test. One of them had done so earlier in the year. The LS3 prototypes completed trotting and jogging mobility runs, perception visualization demonstrations, and soldier-bounded autonomy demonstrations. They were roughly "10 times quieter" than the original pl
https://en.wikipedia.org/wiki/Apoptotic%20DNA%20fragmentation
Apoptotic DNA fragmentation is a key feature of apoptosis, a type of programmed cell death. Apoptosis is characterized by the activation of endogenous endonucleases, particularly the caspase-3 activated DNase (CAD), with subsequent cleavage of nuclear DNA into internucleosomal fragments of roughly 180 base pairs (bp) and multiples thereof (360, 540 etc.). Apoptotic DNA fragmentation is used as a marker of apoptosis and for identification of apoptotic cells either via the DNA laddering assay, the TUNEL assay, or by the detection of cells with fractional DNA content ("sub G1 cells") on DNA content frequency histograms, e.g. as in the Nicoletti assay. Mechanism The enzyme responsible for apoptotic DNA fragmentation is the Caspase-Activated DNase (CAD). CAD is normally inhibited by another protein, the Inhibitor of Caspase Activated DNase (ICAD). During apoptosis, the apoptotic effector caspase, caspase-3, cleaves ICAD and thus causes CAD to become activated. CAD cleaves DNA at internucleosomal linker sites between nucleosomes, protein-containing structures that occur in chromatin at ~180-bp intervals. This is because the DNA is normally tightly wrapped around histones, the core proteins of the nucleosomes. The linker sites are the only parts of the DNA strand that are exposed and thus accessible to CAD. Degradation of nuclear DNA into nucleosomal units is one of the hallmarks of apoptotic cell death. It occurs in response to various apoptotic stimuli in a wide variety of cell types. Molecular characterization of this process identified a specific DNase (CAD, caspase-activated DNase) that cleaves chromosomal DNA in a caspase-dependent manner. CAD is synthesized with the help of ICAD (inhibitor of CAD), which works as a specific chaperone for CAD, and CAD is found complexed with ICAD in proliferating cells. When cells are induced to undergo apoptosis, caspase 3 cleaves ICAD to dissociate the CAD:ICAD complex, allowing CAD to cleave chromosomal DNA. Cells that la
https://en.wikipedia.org/wiki/Brendan%20Hassett
Brendan Edward Hassett is an American mathematician who works as a professor of mathematics at Brown University. His research interests include algebraic geometry and number theory. Education and career Hassett graduated from Yale College in 1992, and earned his doctorate in 1996 from Harvard University under the supervision of Joe Harris. After temporary positions at the Mittag-Leffler Institute, University of Chicago, and Chinese University of Hong Kong, he joined the Rice University faculty in 2000. He was promoted to full professor in 2006, chaired the department from 2009 to 2014, and was named as the Milton Brockett Porter Professor of Mathematics in 2013. In July 2015 he moved to Brown University. He is currently the Director of the Institute for Computational and Experimental Research in Mathematics. Hassett has authored the textbook Introduction to Algebraic Geometry (Cambridge University Press, 2007). In 2013, Hassett was named as a fellow of the American Mathematical Society "for contributions to higher-dimensional arithmetic geometry and birational geometry."
https://en.wikipedia.org/wiki/Syntax%20%28logic%29
In logic, syntax is anything having to do with formal languages or formal systems without regard to any interpretation or meaning given to them. Syntax is concerned with the rules used for constructing, or transforming the symbols and words of a language, as contrasted with the semantics of a language which is concerned with its meaning. The symbols, formulas, systems, theorems, proofs, and interpretations expressed in formal languages are syntactic entities whose properties may be studied without regard to any meaning they may be given, and, in fact, need not be given any. Syntax is usually associated with the rules (or grammar) governing the composition of texts in a formal language that constitute the well-formed formulas of a formal system. In computer science, the term syntax refers to the rules governing the composition of well-formed expressions in a programming language. As in mathematical logic, it is independent of semantics and interpretation. Syntactic entities Symbols A symbol is an idea, abstraction or concept, tokens of which may be marks or a metalanguage of marks which form a particular pattern. Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). A symbol or string of symbols may comprise a well-formed formula if the formulation is consistent with the formation rules of the language. Symbols of a formal language must be capable of being specified without any reference to any interpretation of them. Formal language A formal language is a syntactic entity which consists of a set of finite strings of symbols which are its words (usually called its well-formed formulas). Which strings of symbols are words is determined by the creator of the language, usually by specifying a set of formation rules. Such a language can be defined without reference to any meanings of any of its expression
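As a toy illustration of formation rules (a hypothetical mini-language invented for this example, not one discussed in the article), the following recognizer accepts exactly the well-formed formulas generated by the grammar shown in the comment:

```python
# Toy formal language:
#   wff ::= "p" | "q" | "r" | "(" "~" wff ")" | "(" wff "&" wff ")"
# Whether a string is a word (well-formed formula) depends only on these rules,
# never on what the symbols are taken to mean.

def is_wff(s):
    ok, rest = _wff(s)
    return ok and rest == ""

def _wff(s):
    if s[:1] in ("p", "q", "r"):              # atomic formulas
        return True, s[1:]
    if s[:1] == "(":
        if s[1:2] == "~":                     # negation: ( ~ wff )
            ok, rest = _wff(s[2:])
            return (ok and rest[:1] == ")"), rest[1:]
        ok, rest = _wff(s[1:])                # conjunction: ( wff & wff )
        if ok and rest[:1] == "&":
            ok2, rest2 = _wff(rest[1:])
            return (ok2 and rest2[:1] == ")"), rest2[1:]
    return False, s

print(is_wff("(p&(~q))"))   # True  -- built by the formation rules
print(is_wff("p&q"))        # False -- this grammar requires the parentheses
```

The checker manipulates strings purely syntactically, which is the point: well-formedness is decided without assigning the symbols any interpretation.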
https://en.wikipedia.org/wiki/Enteroinvasive%20Escherichia%20coli
Enteroinvasive Escherichia coli (EIEC) is a type of pathogenic bacteria whose infection causes a syndrome that is identical to shigellosis, with profuse diarrhea and high fever. EIEC are highly invasive, and they use adhesin proteins to bind to and enter intestinal cells. They produce no toxins, but severely damage the intestinal wall through mechanical cell destruction. EIEC are closely related to Shigella, as all E. coli are. Their similarity in disease phenotype comes from a homologous large virulence plasmid, pINV. They also have in common the loss of cadaverine synthesis, of ompT, and of curli formation. These features were probably acquired independently, as the two lost cadaverine synthesis in different ways. Moreover, EIEC do not form a monophyletic group within E. coli. After the E. coli strain penetrates the epithelial wall, the endocytosis vacuole is lysed, and the strain multiplies using the host cell machinery and extends to the adjacent epithelial cell. In addition, the plasmid of the strain carries genes for a type III secretion system that is used as a virulence factor. Although it is an invasive disease, the invasion usually does not pass the submucosal layer. The similar pathology to shigellosis may be because both strains of bacteria share some virulence factors. The invasion of the cells can trigger a mild form of diarrhea or dysentery, often mistaken for dysentery caused by Shigella species. The illness is characterized by the appearance of blood and mucus in the stools of infected individuals, or a condition called colitis. Dysentery caused by EIEC usually occurs within 12 to 72 hours following the ingestion of contaminated food. The illness is characterized by abdominal cramps, diarrhea, vomiting, fever, chills, and a generalized malaise. Dysentery caused by this organism is generally self-limiting with no known complications. It is currently unknown what foods may harbor EIEC, but any food contaminated with human feces from
https://en.wikipedia.org/wiki/EveryDoctor
EveryDoctor is a British grassroots advocacy group made up of doctors. The group was established in 2019, and grew during the COVID-19 pandemic in the United Kingdom. In May 2021, EveryDoctor and the Good Law Project brought legal action against the British government in relation to COVID-19 PPE contracts during the first wave of the COVID-19 pandemic, accusing the Department of Health and Social Care of unlawful procurement procedures and providing inadequate PPE supplies. See also COVID-19 contracts in the United Kingdom
https://en.wikipedia.org/wiki/Maurer%E2%80%93Cartan%20form
In mathematics, the Maurer–Cartan form for a Lie group G is a distinguished differential one-form on G that carries the basic infinitesimal information about the structure of G. It was much used by Élie Cartan as a basic ingredient of his method of moving frames, and bears his name together with that of Ludwig Maurer. As a one-form, the Maurer–Cartan form is peculiar in that it takes its values in the Lie algebra associated to the Lie group G. The Lie algebra is identified with the tangent space of G at the identity, denoted 𝔤. The Maurer–Cartan form ω is thus a one-form defined globally on G which is a linear mapping of the tangent space T_gG at each g ∈ G into 𝔤. It is given as the pushforward of a vector in T_gG along the left-translation in the group: ω_g(v) = (L_{g^{-1}})_* v for v ∈ T_gG. Motivation and interpretation A Lie group G acts on itself by multiplication under the mapping G × G → G, (g, h) ↦ gh. A question of importance to Cartan and his contemporaries was how to identify a principal homogeneous space of G. That is, a manifold P identical to the group G, but without a fixed choice of unit element. This motivation came, in part, from Felix Klein's Erlangen programme, where one was interested in a notion of symmetry on a space, where the symmetries of the space were transformations forming a Lie group. The geometries of interest were homogeneous spaces G/H, but usually without a fixed choice of origin corresponding to the coset eH. A principal homogeneous space of G is a manifold P abstractly characterized by having a free and transitive action of G on P. The Maurer–Cartan form gives an appropriate infinitesimal characterization of the principal homogeneous space. It is a one-form defined on P satisfying an integrability condition known as the Maurer–Cartan equation. Using this integrability condition, it is possible to define the exponential map of the Lie algebra and in this way obtain, locally, a group action on P. Construction Intrinsic construction Let 𝔤 = T_eG be the tangent space of a Lie group G at the identity (its Lie algebra). G acts on
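For reference (standard formulas supplied here, not quoted from the article), the definition just described and the integrability condition it satisfies can be written as

\[
\omega_g(v) = \left(L_{g^{-1}}\right)_{*} v \quad\text{for } v \in T_gG,
\qquad
\mathrm{d}\omega + \tfrac{1}{2}[\omega, \omega] = 0,
\]

where [\omega, \omega] is the bracket of 𝔤-valued one-forms. For a matrix Lie group the Maurer–Cartan form is often written \omega = g^{-1}\,\mathrm{d}g, and the equation takes the form \mathrm{d}\omega + \omega \wedge \omega = 0.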
https://en.wikipedia.org/wiki/GeForce%20700%20series
The GeForce 700 series (stylized as GEFORCE GTX 700 SERIES) is a series of graphics processing units developed by Nvidia. While mainly a refresh of the Kepler microarchitecture (GK-codenamed chips), some cards use Fermi (GF) and later cards use Maxwell (GM). GeForce 700 series cards were first released in 2013, starting with the release of the GeForce GTX Titan on February 19, 2013, followed by the GeForce GTX 780 on May 23, 2013. The first mobile GeForce 700 series chips were released in April 2013. Overview GK110 was designed and marketed with computational performance in mind. It contains 7.1 billion transistors. This model also attempts to maximise energy efficiency through the execution of as many tasks as possible in parallel according to the capabilities of its streaming processors. GK110 also brings increases in memory space and bandwidth for both the register file and the L2 cache over previous models. At the SMX level, GK110's register file space has increased to 256 KB, composed of 64K 32-bit registers, as compared to Fermi's 32K 32-bit registers totaling 128 KB. As for the L2 cache, GK110's L2 cache has grown to up to 1.5 MB, twice as large as GF110's. Both the L2 cache and register file bandwidth have also doubled. Performance in register-starved scenarios is also improved, as there are more registers available to each thread. This goes hand in hand with an increase in the total number of registers each thread can address, moving from 63 registers per thread to 255 registers per thread with GK110. With GK110, Nvidia also reworked the GPU texture cache to be used for compute. At 48 KB in size, the texture cache in compute becomes a read-only cache, specializing in unaligned memory access workloads. Furthermore, error detection capabilities have been added to make it safer for use with workloads that rely on ECC. The series also supports DirectX 12 on Windows 10. Dynamic Super Resolution (DSR) was added to Kepler GPUs with the latest Nvidia drivers. Architectu
https://en.wikipedia.org/wiki/Jacobsthal%20number
In mathematics, the Jacobsthal numbers are an integer sequence named after the German mathematician Ernst Jacobsthal. Like the related Fibonacci numbers, they are a specific type of Lucas sequence for which P = 1 and Q = −2, and are defined by a similar recurrence relation: in simple terms, the sequence starts with 0 and 1, then each following number is found by adding the number before it to twice the number before that. The first Jacobsthal numbers are: 0, 1, 1, 3, 5, 11, 21, 43, 85, 171, 341, 683, 1365, 2731, 5461, 10923, 21845, 43691, 87381, 174763, 349525, … A Jacobsthal prime is a Jacobsthal number that is also prime. The first Jacobsthal primes are: 3, 5, 11, 43, 683, 2731, 43691, 174763, 2796203, 715827883, 2932031007403, 768614336404564651, 201487636602438195784363, 845100400152152934331135470251, 56713727820156410577229101238628035243, … Jacobsthal numbers Jacobsthal numbers are defined by the recurrence relation: J0 = 0, J1 = 1, and Jn = Jn−1 + 2Jn−2 for n ≥ 2. The next Jacobsthal number is also given by the recursion formula Jn+1 = 2Jn + (−1)^n, or by Jn+1 = 2^n − Jn. (The defining recurrence Jn+1 = Jn + 2Jn−1 is also satisfied by the powers of 2.) The Jacobsthal number at a specific point in the sequence may be calculated directly using the closed-form equation: Jn = (2^n − (−1)^n)/3. The generating function for the Jacobsthal numbers is x / ((1 + x)(1 − 2x)). The sum of the reciprocals of the Jacobsthal numbers is approximately 2.7186, slightly larger than e. The Jacobsthal numbers can be extended to negative indices using the recurrence relation or the explicit formula, giving J−n = (−1)^(n+1) Jn / 2^n. Further identities relate nearby Jacobsthal numbers; for example, Jn+1 Jn−1 − Jn² = (−1)^n 2^(n−1). Jacobsthal–Lucas numbers Jacobsthal–Lucas numbers represent the complementary Lucas sequence Vn(1, −2). They satisfy the same recurrence relation as Jacobsthal numbers but have different initial values: j0 = 2, j1 = 1, and jn = jn−1 + 2jn−2 for n ≥ 2. The next Jacobsthal–Lucas number also satisfies: jn+1 = 2jn − 3(−1)^n. The Jacobsthal–Lucas number at a specific point in the sequence may be calculated directly using the closed-form equation: jn = 2^n + (−1)^n. The first Jacobsthal–Lucas numbers are: 2, 1, 5, 7, 17, 31, 65, 127, 257,
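A small illustrative script (added here, not from the article; the function names are arbitrary) that generates the sequence from the defining recurrence and cross-checks the closed form:

```python
def jacobsthal(n_terms):
    """Return the first n_terms Jacobsthal numbers via J(n) = J(n-1) + 2*J(n-2)."""
    seq = [0, 1]
    while len(seq) < n_terms:
        seq.append(seq[-1] + 2 * seq[-2])
    return seq[:n_terms]

def jacobsthal_closed_form(n):
    """Closed form J(n) = (2^n - (-1)^n) / 3, computed exactly in integer arithmetic."""
    return (2 ** n - (-1) ** n) // 3

terms = jacobsthal(12)
print(terms)  # [0, 1, 1, 3, 5, 11, 21, 43, 85, 171, 341, 683]
print(all(jacobsthal_closed_form(n) == t for n, t in enumerate(terms)))  # True
```

The same loop started from the initial values [2, 1] produces the Jacobsthal–Lucas numbers, whose closed form is 2^n + (−1)^n.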
https://en.wikipedia.org/wiki/Ordered%20geometry
Ordered geometry is a form of geometry featuring the concept of intermediacy (or "betweenness") but, like projective geometry, omitting the basic notion of measurement. Ordered geometry is a fundamental geometry forming a common framework for affine, Euclidean, absolute, and hyperbolic geometry (but not for projective geometry). History Moritz Pasch first defined a geometry without reference to measurement in 1882. His axioms were improved upon by Peano (1889), Hilbert (1899), and Veblen (1904). Euclid anticipated Pasch's approach in definition 4 of The Elements: "a straight line is a line which lies evenly with the points on itself". Primitive concepts The only primitive notions in ordered geometry are points A, B, C, ... and the ternary relation of intermediacy [ABC] which can be read as "B is between A and C". Definitions The segment AB is the set of points P such that [APB]. The interval AB is the segment AB and its end points A and B. The ray A/B (read as "the ray from A away from B") is the set of points P such that [PAB]. The line AB is the interval AB and the two rays A/B and B/A. Points on the line AB are said to be collinear. An angle consists of a point O (the vertex) and two non-collinear rays out from O (the sides). A triangle is given by three non-collinear points (called vertices) and their three segments AB, BC, and CA. If three points A, B, and C are non-collinear, then a plane ABC is the set of all points collinear with pairs of points on one or two of the sides of triangle ABC. If four points A, B, C, and D are non-coplanar, then a space (3-space) ABCD is the set of all points collinear with pairs of points selected from any of the four faces (planar regions) of the tetrahedron ABCD. Axioms of ordered geometry There exist at least two points. If A and B are distinct points, there exists a C such that [ABC]. If [ABC], then A and C are distinct (A ≠ C). If [ABC], then [CBA] but not [CAB]. If C and D are distinct points on the l
https://en.wikipedia.org/wiki/Social%20defeat
Social defeat is a concept used in the study of the physiological and behavioral effects of hostile interactions among either conspecific animals, or humans, in either a dyadic or in a group-individual context, potentially generating very significant consequences in terms of control over resources, access to mates and social positions. Background Research on social stress has accumulated a useful body of knowledge, providing perspective on the effects of detrimental social and environmental interaction on the brain. Research and experimentation suffer from many methodological difficulties: usually a lack of ecological validity (similarity with natural conditions and stressors) or are not amenable to scientific investigation (difficult to test and verify). Social psychology approaches to human aggression have developed a multitude of perspectives, based on observations of human phenomena like bullying, mobbing, physical and verbal abuse, relational and indirect aggression, etc. Despite the richness of theories developed, the body of knowledge generated has not satisfied scientific requirements of testability and verifiability. Animal studies of within-species aggression developed in 2 main branches: A) approaches based on laboratory experiments, on controlled conditions, allowing the measurement of behavioral, endocrine and neurological variables, but with the shortcoming of applying unnatural stressors (such as foot-shocks and restraint stress) in unnatural conditions (laboratory cages rarely approximate native habitats); B) approaches based on observations of animals in naturalistic settings, which avoided artificial environments and unnatural stresses, but usually not allowing the measurement of physiological effects or the manipulation of relevant variables. In real life situations, animals (including humans) have to cope with stresses generated within their own species, during their interactions with conspecifics, especially due to recurrent struggles over
https://en.wikipedia.org/wiki/Dark%20star%20%28Newtonian%20mechanics%29
A dark star is a theoretical object compatible with Newtonian mechanics that, due to its large mass, has a surface escape velocity that equals or exceeds the speed of light. Whether light is affected by gravity under Newtonian mechanics is unclear but if it were accelerated the same way as projectiles, any light emitted at the surface of a dark star would be trapped by the star's gravity, rendering it dark, hence the name. Dark stars are analogous to black holes in general relativity. Dark star theory history John Michell and dark stars During 1783 geologist John Michell wrote a letter to Henry Cavendish outlining the expected properties of dark stars, published by The Royal Society in their 1784 volume. Michell calculated that when the escape velocity at the surface of a star was equal to or greater than lightspeed, the generated light would be gravitationally trapped so that the star would not be visible to a distant astronomer. Michell's idea for calculating the number of such "invisible" stars anticipated 20th century astronomers' work: he suggested that since a certain proportion of double-star systems might be expected to contain at least one "dark" star, we could search for and catalogue as many double-star systems as possible, and identify cases where only a single circling star was visible. This would then provide a statistical baseline for calculating the amount of other unseen stellar matter that might exist in addition to the visible stars. Dark stars and gravitational shifts Michell also suggested that future astronomers might be able to identify the surface gravity of a distant star by seeing how far the star's light was shifted to the weaker end of the spectrum, a precursor of Einstein's 1911 gravity-shift argument. However, Michell cited Newton as saying that blue light was less energetic than red (Newton thought that more massive particles were associated with bigger wavelengths), so Michell's predicted spectral shifts were in the wrong directi
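To make Michell's criterion concrete (a standard Newtonian calculation restated here, not text from the article), setting the surface escape velocity of a star of mass M and radius R equal to the speed of light gives

\[
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{R}} \ge c
\quad\Longleftrightarrow\quad
R \le \frac{2GM}{c^{2}},
\]

so a sufficiently compact (or sufficiently massive) star would trap its own light. Coincidentally, the limiting radius 2GM/c² is the same expression as the Schwarzschild radius of a black hole in general relativity, although the two results rest on very different physics.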
https://en.wikipedia.org/wiki/Xandr
Xandr, Inc. (pronounced "Zander") is the advertising and analytics subsidiary of Microsoft, which operates an online platform, Community, for buying and selling consumer-centric digital advertising. In December 2021, AT&T announced that they had agreed to sell Xandr (including AppNexus and Clypd) to Microsoft for an undisclosed price. The acquisition was completed in June 2022. History Following its June 2018 acquisition of AppNexus, Xandr was formed by AT&T to construct a national TV advertising marketplace. Xandr was launched, on September 25, 2018, at its inaugural AT&T Relevance Conference, in Santa Barbara, California, and was named after its parent company founder, Alexander Graham Bell. In June 2019, Xandr rebranded its AppNexus DSP, launching Xandr Invest, to serve as its central ad-buying hub. On October 18, 2019, Xandr acquired Massachusetts-based Clypd, an audience-based sales platform for television advertising. On April 30, 2020, it was folded into WarnerMedia. Rumors and speculation spread in 2020 that AT&T was seeking to sell and offload Xandr, along with DirecTV and Crunchyroll. The reasoning is twofold: first, AT&T took on immense debt to purchase WarnerMedia, and any non-core asset sales would help pay off that debt burden. Secondly, Xandr was inherently compromised by its association with AT&T and HBO Max, as many would-be advertisers were reluctant to use what is essentially a competitor for their ad buys, and as such it might be more lucrative on its own. On May 17, 2021, it was announced that WarnerMedia would be merged with Discovery, Inc. to form a new publicly traded company known as Warner Bros. Discovery, but that the Xandr business would not be included in the transaction and would remain a division of AT&T. On December 21, 2021, AT&T announced that they had agreed to sell Xandr (including AppNexus and Clypd) to Microsoft for an undisclosed price, subject to customary closing conditions, including regulatory reviews. On June 6